Crypto-Current (039)


State of Play

§4 — Humans are neither tigers, nor bees. Regardless of ethnic variation, or ideological faction, they are neither solitary, nor collective (eusocial[1]), but social, and societies are essentially middling, or ambiguous. The concepts of the social and the individual, or the public and the private, are reciprocal, and mutually compromised. Social beings are necessarily (always, but only) partially coordinated, through transactional bonds. They have neither group mentality nor perfect autonomy. The way they get along together is an ineluctable and perennial problem, resolved through precarious, transient, meta-stable solutions. All promises of definitive fusion or fission, perfected solidarity or independence, are strictly utopian. The perpetual tension of dynamic social arrangements is an unsurpassable human reality.[2] It occupies the zone of coordination.

§4.01 — The ineradicable ambivalence of the social animal is captured by the theory of games. A tiger does not play games with a prey animal, any more than bees play games with each other. A game is a transactional integration, at once too intimate for a non-social animal, and too fractured for a consolidated collective.

§4.02 — Games, in the game-theoretical sense of the term – the one relevant here – are always played in the wild. That is to say, they cannot be exited by cheating. If knocking over the table is a move that can be made (in reality) and doing so ends the game, it wasn’t a game of any seriousness to begin with. In any game that matters, cheating is a permitted move, as soon as it is possible at all. It might be said, more precisely, that any game which effectively prevents cheating is embedded within a greater game where such prevention is actualized, as an outcome. Any regulated game is carved out of the wild, and it is the outer game – that carves – which game theory attends to. In this lies its realism, distinguishing its objects from circumscribed, ludic amusements. Games merit social attention precisely because they contain cheating as an integral option. Trust has to be internally processed, not extraneously presumed. There are no external referees.[3]

§4.03 — Due to its extreme elegance, and consequent generality of application, Prisoners’ Dilemma (PD) has achieved broad acceptance as the archetypal game. The scenario is elementary, by design. Two prisoners are held in noncommunicating cells. Each has the same, binary (or ‘Boolean’) strategic decision to make – to betray the other, or not. The entire game can therefore be represented by a 2×2 matrix. Finally, each space (or outcome) contains two numbers, representing the payoff to the players. In PD this aspect is perfectly symmetrical – the situation of each player exactly mirrors that of the other. Every payoff is a weighted negative utility – dramatized as a prospective period of jail time. All of the information on the outcome grid (or payoff table) is objective. It represents the dilemma facing each player as both, equally, would acknowledge it, without controversy, or perspectival inflection. In principle, it is accessible to both prisoners, and guides their choice of ‘move’.
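The 2×2 payoff table can be laid out concretely. A minimal sketch, in which the particular jail terms are illustrative assumptions (the text specifies only the structure, not the numbers):

```python
# Prisoners' Dilemma payoff table: (row player's payoff, column player's payoff).
# Every payoff is a negative utility, dramatized as years of jail time.
# C = cooperate (stay silent), D = defect (betray). Values are illustrative.
PAYOFFS = {
    ("C", "C"): (-1, -1),    # mutual silence: light sentences for both
    ("C", "D"): (-10, 0),    # lone cooperator takes the full term
    ("D", "C"): (0, -10),    # lone defector walks free
    ("D", "D"): (-5, -5),    # mutual betrayal: heavy sentences for both
}

def payoff(move_a, move_b):
    """Return the (player A, player B) payoff for a pair of moves."""
    return PAYOFFS[(move_a, move_b)]
```

The symmetry noted above is visible in the table: swapping the two moves exactly mirrors the payoff pair, so the grid is objective, acknowledged identically from either cell.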

§4.04 — PD has no well-coordinated solution, unless the game is multiplied – to become iterated,[4] and mnemonic. This is because there is no strictly rational alternative to defection (betrayal) in the absence of additional information, such as the kind that would be provided by the persistence of reputational positions through multiple cycles. In this respect, PD models coordination problems of the tragedy of the commons type, in which the optimization of collective interest is practically unobtainable.[5] ‘Free-rider’ problems are sub-components of the same dilemma, which indicates that it is generalizable to parasitic relations of all kinds.[6] Within all of these cases, rational individual decisions aggregate to a collective failure, expressed as systemic collapse in extreme cases, or – more typically – as a deadweight (negative sum) loss to the population as a whole.
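Both claims of the paragraph can be checked mechanically: in the one-shot game defection is the strictly rational move whatever the other player does, while iteration with memory (reputation persisting through cycles) changes the outcome. A hedged sketch, with the payoff values again invented for illustration:

```python
# One-shot PD: defection strictly dominates. Iterated PD: conditional
# cooperation (here, tit-for-tat) can sustain the cooperative outcome.
# Payoff values are illustrative assumptions.
PAYOFFS = {("C", "C"): (-1, -1), ("C", "D"): (-10, 0),
           ("D", "C"): (0, -10), ("D", "D"): (-5, -5)}

def best_response(opponent_move):
    """The move maximizing one's own payoff against a fixed opposing move."""
    return max("CD", key=lambda m: PAYOFFS[(m, opponent_move)][0])

def play(strategy_a, strategy_b, rounds=100):
    """Iterate the game; each strategy sees only the opponent's past moves."""
    history_a, history_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a, b = strategy_a(history_b), strategy_b(history_a)
        pa, pb = PAYOFFS[(a, b)]
        score_a += pa; score_b += pb
        history_a.append(a); history_b.append(b)
    return score_a, score_b

tit_for_tat = lambda opp: "C" if not opp else opp[-1]  # cooperate, then mirror
always_defect = lambda opp: "D"
```

Defection is the best response to either move, yet a population of tit-for-tat players (total loss 100 each over 100 rounds, under these numbers) vastly outperforms a population of unconditional defectors (loss 500 each): precisely the aggregation of rational individual decisions into collective failure.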

§4.05 — It bears repeating – or reiterating – because it cannot be easily over-emphasized, that Prisoners’ Dilemma has extraordinarily general application to coordination problems. It would, indeed, be quite reasonable to characterize it as the model trust crisis. When concentrated into an atom, the pure element of the game is a double chance of treachery – subjective and objective – arising from the ineliminable hazard, on both sides, of betrayal. What Bitcoin acknowledges, from the beginning, is that to escape the prison-house of distrust is no easy thing, once mere moral exhortation in the direction of altruism is theoretically shelved. The recognition of this problem as a problem is socio-political realism itself. It is at this fork in the road that almost everything is decided.

§4.06 — PD is a close analog of a number of other game theoretical dilemmas, of which the best known is ‘Chicken’ – itself based upon an abstract model of bipolar geopolitical conflict in the context of nuclear deterrence. Chicken has several variants, distinguished primarily by dramatization. In one, competitors wrestle at the edge of a cliff, and double ‘defection’ pushes both over the edge. Another version of Chicken sets two drivers accelerating towards each other in automobiles. The contestant who swerves, loses. If neither swerves, a common calamity results (equivalent to the double defection – or collective pessimal – equilibrium in PD). An important difference between classic PD and Chicken, however, is that in the latter scenario(s) the contestants are not held to be strictly non-communicating. While the final decision of each antagonist remains a black box to the other, thus preserving the core of the game-theoretical dilemma, preliminary expressions of commitment are permissible.[7] Chicken thus permits strategies that involve signaling.

§4.07 — The DSP tells us that signs are cheap. Communication of commitment, therefore, is no trivial matter. Semantically and syntactically flawless statements of exceptional rhetorical quality still commonly – and even typically – mean nothing.[8] To repeat the essential, in the ways that matter most they are easy to say (and their repetition is cheaper still). Unless a cost is credibly attached to them, their flourishes make no additional contribution. The problem of credible commitment, as it arises within the theory of games, thus closely tracks that of the contract in crypto-economics. In both cases the strength (or value) of the signal is directly proportional to a conspicuous contraction of discretionary power, corresponding to an irreversible operation. Only when it is impossible – or at least infeasible – to back down, recant, or renege, does a signal acquire game-theoretical significance. Burning bridges behind oneself signals something that no rhetorical flight is able to match. Even the importance of precedent – or reputation – in iterated PD is based on the status of the past as an irrevocable commitment. If what had been done could be taken back, like a fumbled move in a friendly game of chess, it would count for nothing. The irrevocable consumption of freedom provides the content for strategic signs.

§4.08 — Bitcoin is a game, in the strong or technical sense, because it does not control cheating through a transcendent rule (upheld by a “trusted third party”), but rather through an immanent principle (Nakamoto Consensus). Its immediate ancestry, within the game-theoretic lineage, descends from the formulation of The Byzantine Generals’ Problem, dating back to the mid-1970s.[9] As Lamport, Shostak, and Pease explain the problem (with line breaks preserved from the original):

We imagine that several divisions of the Byzantine army are camped outside an enemy city, each division commanded by its own general. The generals can communicate with one another only by messenger. After observing the enemy, they must decide upon a common plan of action. However, some of the generals may be traitors, trying to prevent the loyal generals from reaching agreement.

The generals must have an algorithm to guarantee that

A. All loyal generals decide upon the same plan of action.

The loyal generals will all do what the algorithm says they should, but the traitors may do anything they wish. The algorithm must guarantee condition A regardless of what the traitors do.

The loyal generals should not only reach agreement, but should agree upon a reasonable plan. We therefore also want to insure that

B. A small number of traitors cannot cause the loyal generals to adopt a bad plan.

§4.09 — ‘Byzantine failures’ arise when parties distributed within a communication network, containing unreliable nodes, are obstructed from reaching agreement, because they cannot confidently establish among themselves what has in fact been communicated, or from which agents messages have been received. A global perspective appears unobtainable, and local perspective is vulnerable to compromise. The extreme difficulty involved in Byzantine communications makes them a model coordination problem, of special relevance to Internet-connected agencies. Crucially, for our purposes here, and beyond, the problem follows upon an assertion of immanence (a critique), since it is defined primarily by the absence of a transcendent tribunal with global insight. None of the ‘generals’ are able to stand outside the system, call upon an authoritative criterion from beyond it, or even direct their communications around it. Their relation to each other is technically flat (or peer-to-peer). Any solution has to be drawn from out of the system itself – which is to say, from the self-organizational resources inherent to sheer multiplicity.

§4.091 — Nakamoto Consensus, in game-theoretic context, is the name for a solution to the Byzantine Generals’ Problem, based upon proof-of-work. By including proof-of-work within each message (hashed block), the generals are able to make the measure of agreement reached – i.e. computational power committed – into an intrinsic property of their communications. Agreement about the message is folded into the message. As blocks are chained, securely, in strict succession, the signal of consensus strengthens. In meeting a reiterated proof-of-work criterion, the blockchain accumulates immanent credibility. It replaces an extrinsic – and intractable – question about the reliability of communications with an intrinsic communication of reliability. Trust is made into the message.
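The folding of agreement into the message can be sketched as a toy hash-chained proof-of-work loop. This is an illustrative simplification, not the Bitcoin protocol: the block structure, difficulty level, and encoding are all assumptions made for the sketch.

```python
import hashlib

def mine(prev_hash: str, message: str, difficulty: int = 4):
    """Search for a nonce whose inclusion makes the block hash meet the
    proof-of-work criterion. The work expended is legible in the hash itself."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{prev_hash}|{message}|{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):  # the reiterated criterion
            return nonce, digest
        nonce += 1

def verify(prev_hash, message, nonce, digest, difficulty=4):
    """Anyone can check the committed work without redoing the search:
    agreement about the message is an intrinsic property of the message."""
    recomputed = hashlib.sha256(f"{prev_hash}|{message}|{nonce}".encode()).hexdigest()
    return recomputed == digest and digest.startswith("0" * difficulty)

# Chain three messages: each block's hash commits to its predecessor, so the
# signal of consensus strengthens as blocks accumulate in strict succession.
chain = [("GENESIS", None, "0" * 64)]
for msg in ["attack at dawn", "attack at noon", "attack at dusk"]:
    prev = chain[-1][2]
    nonce, digest = mine(prev, msg)
    chain.append((msg, nonce, digest))
```

Tampering with any earlier message invalidates its block, and with it every successor, which is the sense in which trust is made into the message rather than presumed outside it.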

[1] Within the terrestrial biosphere eusociality is most vividly represented by the Hymenoptera (ants, bees, wasps) and by termites. Unsurprisingly, therefore, the concept has been advanced primarily by entomologists. Suzanne Batra and E.O. Wilson have been particularly significant in advancing the concept, based on insect models in both cases. Nevertheless, truly communistic mammals do exist, instantiated by two species of mole-rat. As with eusocial insects, mole-rat societies are rigidly segmented between fertile and infertile castes (a biological precondition for equilibrium communistic organization). Eusocial species incarnate the solution to a coordination problem. The games involved (searches for evolutionarily stable strategies) have been resolved at the genetic level. That the pseudo-individuals within insect hives or colonies do not engage in competitive games with each other is precisely what eusociality means. Within (merely) social species, in contrast, genetics is under-determining, setting only general parameters for intra-social cooperative and competitive behavior. The execution of games is delegated to phenotypic performance, without access to any collective optimum state, or established strategic equilibrium. Such animals thus inherit the plasticity implied by ongoing (and uncompletable) games – which opens the sphere of culture, as a semi-autonomous domain of variation and emergent outcomes.  

[2] The same set of distinctions between the social, the a-social and the eusocial, is invoked by James A. Donald in his path-breaking study on the foundations of Natural Law,

The tripartite distinction echoes Aristotle’s classical statement, from the Politics: “Man is by nature a social animal; an individual who is unsocial naturally and not accidentally is either beneath our notice or more than human. Society is something that precedes the individual. Anyone who either cannot lead the common life or is so self-sufficient as not to need to, and therefore does not partake of society, is either a beast or a god.”

[3] We can say, more precisely, that any game overseen by an external referee is embedded within a greater game, perhaps recursively, until reaching the transcendental plane which isn’t subject to adjudication by anything beyond itself.

[4] The ubiquity of the PD coordination model does not escape Venkatesh Rao, who observes: “… the well-known Iterated Prisoner’s Dilemma (IPD) model [is] sometimes called the e. coli of social science research.” In the words of Simon Dedeo: “As a tool for the mathematical study of human behavior, [PD] is the equivalent of Galileo’s inclined plane, or Gregor Mendel’s pea plants.”

[5] Garrett Hardin’s ‘The Tragedy of the Commons’ (1968) coined the term that now seems so indispensable. Despite its compelling simplicity, there is little sign of subsequent intellectual convergence upon what this model of overexploitation through coordination failure implies. At the political level, socialists and libertarians – equally – find their analyses and prescriptions supported by it. An ideological meta-tragedy has thus confirmed its insight in the very process of failing to draw common conclusions from it. Hardin’s classic essay can be found at:

[6] Radical coordination failure in biological systems is epitomized by the parasite that kills its host. Despite the difficulty of evaluating deep historical evidence, under natural conditions in which extinction is the fate of approximately all species, it nevertheless seems reasonable to interpret this extreme pessimal equilibrium as the exceptional case. It is widely recognized that diseases tend to decline in malignancy over time, as self-destructively extravagant forms of parasitism are weeded from the biological record. Epistemological and ontological selection effects here converge, as the ‘phenomenon’ of coordination failure is edited from the realm of evidence. (That we will tend to see what works is Darwinism itself.) As with the tragedy of the commons, parasitical relations – enveloping every kind of predator-prey interaction – are vulnerable to overexploitation failures. The attendant arms races are important drivers of biological diversity, and phenotypical extravagance. The effects of competition for light among trees – roughly, trees themselves – are only the most vivid example of the way biological form is dominated by the outcome of a long history of default to non-coordination.

[7] In The Strategy of Conflict (1960) Thomas Schelling emphasizes the importance of ‘credible commitment’ to classically-structured games. His insight is satirized – with great insight – in Kubrick’s Dr. Strangelove, which builds its plot around the understanding that the limit signal of commitment is strategic automation (or automatic retaliation). The strategic irrationality of making this extreme commitment without signaling it is a central comic device of the movie.  

[8] An entire poetics could be constructed in this space. Based upon the lacuna of credible commitments in the pure linguistic realm, it would reverse the game-theoretical problem into a source of creativity, by conceiving it as a rhetorical generator. (That is an undertaking for another occasion.)

[9] The Byzantine Generals’ Problem – which is the difficulty of achieving ‘Byzantine coordination’ – was initially named ‘The Two Generals Paradox’ upon its formulation by Jim Gray (in his ‘Notes on Data Base Operating Systems’, 1978), and was then generalized – to larger multiple agent systems – by Leslie Lamport, Marshall Pease, and Robert Shostak (in 1980).

As humorously reformulated by Satoshi Nakamoto, in a post on the Cryptography mail list that scrupulously preserves the critical abstract properties of the problem: “A number of Byzantine Generals each have a computer and want to attack the King’s wi-fi by brute forcing the password, which they’ve learned is a certain number of characters in length. Once they stimulate the network to generate a packet, they must crack the password within a limited time to break in and erase the logs, otherwise they will be discovered and get in trouble. They only have enough CPU power to crack it fast enough if a majority of them attack at the same time. […] They don’t particularly care when the attack will be, just that they all agree. It has been decided that anyone who feels like it will announce a time, and whatever time is heard first will be the official attack time. The problem is that the network is not instantaneous, and if two generals announce different attack times at close to the same time, some may hear one first and others hear the other first.” The same post explains how a proof of work solution can be achieved:

It is especially important to note that the BGP formalizes the problem of coordination (in general) as synchronization. As already remarked (in Part One), it articulates a problem whose insolubility, in the context of cosmo-physical theory, coincides with general relativity, spacetime, and the renunciation of absolute succession. A solution to the BGP, therefore, is intrinsically post-relativistic. (Given the restoration of succession to the order of signs that follows from the innovation of the blockchain, the application of the ‘post-’ prefix in this case has exceptional – and reflexive – conceptual legitimacy.)

See also The Problem of Firing-Squad Synchronization, whose relevance is implicit in its name:

Crypto-Current (038)

§3.8 — Setting out on the path to a cognitive integration of Bitcoin calls for both anticipation and critical retrospection, and in fact compels them. Bitcoin drives a migration long promised by transcendental philosophy, from naïve ontology to a practical acknowledgment of the essence of being as the criterion of reality (finally indistinguishable from absolute succession, or order in-itself).[1] What emerges is nothing less than an artificial universe, founded – groundlessly – upon a spontaneously-engineered consistency. Once it is granted, practically, that no assertion of truth can be effectively sustained against a predominance of cognitive capability, all prospect of Archimedean (epistemological) leverage is subtracted. Bitcoin at once systematizes and implements this insight within its cycle of auto-production, establishing the foundations of transcendental authority through a realization of semiotic singularity. Truth is that which survives a process of elimination biased against duplicity.

§3.81 — The elegance, or economy, of Bitcoin’s virtual universe is fully consistent with a certain ontological luxuriance, encompassing a population of agents (represented by accounts), territories (wallets), objects (coins and coin-fragments), events (transactions), a consensual history (the blockchain), and – providing an ultimate criterion of reality – matter (computing power). Such tropical frondescence is also ecological. It generates niches, as zones of specialization, competition, and proliferation. As with all cases of techonomic revolution, the result is a ‘Cambrian Explosion’ of unpredictable, cross-stimulated innovation. The very meaning of ‘species’ undergoes escalation. As a side-consequence of its unprecedented ontological severity, or selectivity, Bitcoin triggers a re-population of the world.

§3.82 — Given the common principle of viral hijacking and double-spending, any DSP solution makes an immediate contribution to the field of computer security. “Trusted third parties are security holes,” Nick Szabo writes.[2] Bitcoin as critique is immediately security innovation, because immanence is self-policing. Transcendent sources of protection are vulnerabilities. It follows that Bitcoin security threats are characteristically extrinsic, applying to the edges of its commercium, where violence and fraud can be targeted at ‘people’ (IRL-IDs) and their insecure human flesh-machines. Most crudely, an individual can be menaced with a weapon (in meatspace), and told to hand over his private key. Alternatively, residual intermediaries – entrusted with the safekeeping of bitcoins – can abscond with them. Such dangers are, however, exogenous. Even when they are associated with Bitcoin in public perception, their origin lies elsewhere.

§3.83 — Bitcoin has yet to be hacked.[3] The principal security threat to Bitcoin is still conceived – as it was already at the origin – as a ‘51% attack’ in which a hostile party (or coalition) commands sufficient applied computing power to overwhelm the consensus, and subsequently re-configure the protocol to its convenience. This vulnerability is finally game-theoretical rather than narrowly technical, as are Bitcoin’s defenses against it.[4] Incentives are an integral factor. Stated with maximum crudity: Why would an attacker be motivated to destroy an asset that has already been captured? Subversion of Bitcoin requires that one first own it, at least to the degree that its devaluation becomes a self-inflicted injury. These questions are addressed a little more fully in Chapter 4 (directly following).
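The incentive argument admits a back-of-envelope formulation: whatever a majority attacker steals is denominated in the asset the attack devalues. A hedged sketch, in which every figure is an invented assumption for illustration:

```python
# Toy expected-value comparison for a would-be 51% attacker.
# All figures (stake, price, rewards, collapse fraction) are invented
# assumptions; the point is the structure, not the numbers.
def attack_vs_mine(stake_in_btc, btc_price, honest_reward, loot,
                   price_collapse=0.9):
    """Compare the attacker's position after honest mining with the position
    after a successful attack that crashes the price by `price_collapse`."""
    honest_outcome = (stake_in_btc * btc_price) + honest_reward
    post_attack_price = btc_price * (1 - price_collapse)
    attack_outcome = (stake_in_btc + loot) * post_attack_price
    return honest_outcome, attack_outcome
```

Under any assumptions in which the attacker's prior stake is large relative to the loot, devaluation dominates: the attack is a self-inflicted injury, exactly as the paragraph states.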

§3.84 — Bitcoin is nothing less than a semiotic restoration – an Occidental analog of the Confucian rectification of signs – and actually something more, because it is irreducibly innovative (on the efficient model of critique). For the first time, the securitization of a sign, as an economic token, has been understood. Meaning becomes hard currency. The immense philosophical revolution is implicit: It can be demonstrably made impractical to lie. Thus, by a negative and ‘merely technical’ route, all prior discourse on truth has been bypassed. With Bitcoin, there is now a truth engine. The consequences are not easily delimited. Even if Bitcoin remains to be definitively comprehended as the long-anticipated end of philosophy, there has never previously been a more convincing model for it. We know, from around the back, what truth is now.

§3.9 — While this book contains numerous signs representing economic values, this does not mean that it is made – even partly – out of money. The expression ‘BTC 21,000,000’ – as it appears here and in comparable texts – evidently has no monetary value whatsoever. From this alone we can confidently presume that monetary signs have some crucially distinctive characteristic, which is only very inadequately captured by any general semiotic determination such as ‘representations of economic value’. A monetary sign is something more than a sign that means ‘money’. Money, nevertheless, is made out of monetary signs.

§3.91 — In order for signs to function as money, they have not only to represent value as a signification, or to indicate it (for instance as an account code), they also have to bear it, as something else. Alongside the semiotic aspects of signification and indication – and even perhaps on occasions instead of them – monetary signs require the characteristic of commutation, collection, or allocation.[5] They involve real, rather than merely metaphorical, substitution or exchange, as a condition of possibility for expenditure. A language-user can spend time and energy emitting words, but the words themselves are not – in any rigorous sense – spent. Vocabulary is not consumed in the process of speaking or writing, because a word is not – unless merely figuratively – ‘passed’ from one party to another, but rather duplicated each time a message is communicated. Whereas a message is spread, or proliferated, money is transmitted – in accordance with the rules of double-entry book-keeping, and contrary to the dynamics of multiplication through double spending. Money, as such, is added to one wallet or account only in being deducted from another. Whenever – in contrast – money operates in the manner of a linguistic sign, it is spent without cost, and rapidly reduced to worthlessness.
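The contrast can be made concrete: broadcasting a sign multiplies its instances, whereas a monetary transfer credits one account only by debiting another, leaving the total invariant. A minimal sketch (the account names and amounts are illustrative assumptions):

```python
# A linguistic sign proliferates: sending it duplicates it at every recipient.
def broadcast(message, recipients):
    return {r: message for r in recipients}  # one message becomes many copies

# A monetary sign is conserved: double-entry transfer, total invariant.
def transfer(ledger, sender, receiver, amount):
    """Deduct from one account exactly what is added to another."""
    if ledger[sender] < amount:
        raise ValueError("insufficient funds")  # no spending what one lacks
    ledger[sender] -= amount     # debit...
    ledger[receiver] += amount   # ...matched exactly by credit
    return ledger
```

The invariance of the ledger total under `transfer`, against the multiplication of copies under `broadcast`, is precisely the allocative character that distinguishes the monetary sign.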

§3.92 — It would be convenient if the word ‘token’ were available to carry the sense of the allocative sign, and there is some indication that the word is being adopted in crypto-currency circles in this way, indifferent to potential interference (and confusion) from its previously established technical and philosophical usage.[6] In their ordinary deployment, tokens count as money. Yet precisely because they allocate more than they signify, their meaning has remained – overwhelmingly – lodged in obscurity. They circulate in immense numbers, saying little.

§3.93 — If a new semiotic settlement is to follow in the wake of the Bitcoin protocol, and its solution to the DSP, there is an alternative common term all-but destined – if not, in fact, simply destined – to be cemented into the foundations. The allocative sign is the coin. General acceptance, in this regard, requires only an increment of abstraction, accompanied by an automatic reversal. Once the crypto-currency ‘-coin’ suffix, rather than alluding to concrete specie, acquires the status of a defining model, the word ‘coin’ becomes the technically-precise bearer of a semiotic function. A coin, then, would be fully characterized as a unit of DSP-resolved currency, typically instantiated as a highly-virtualized, Internet-communicable ledger entry, reproduced on a blockchain.

§3.94 — Beside the signifier and the index – or no less beneath them – is the coin.

[1] Melanie Swan, whose writings on Bitcoin are distinguished by their extraordinary visionary sweep, describes the cryptocurrency as “in some sense … a supercomputer for reality”. (This is a ‘sense’ that she proceeds vigorously to explore.) Unfortunately, her framework of understanding is impaired at its highest level by a propensity to abundance theorizing, under the sign of cornucopian illusion. This error is unfortunately common – and even typical – among those drawing upon transhumanist inspiration. As the genealogy of Bitcoin vividly demonstrates, the primary manifestation of digital abundance is not the supersession of the commodity, but spam. Bitcoin restores robust scarcity, precisely insofar as it filters-out spam money. …

[2] See:

[3] Bitcoin is an experiment in digital security. See:

For an examination of more exotic threats, in the emerging epoch of quantum computing, see Vitalik Buterin:…

[4] See the Bitcoin paper, section 6: “The incentive may help encourage nodes to stay honest. If a greedy attacker is able to assemble more CPU power than all the honest nodes, he would have to choose between using it to defraud people by stealing back his payments, or using it to generate new coins. He ought to find it more profitable to play by the rules, such rules that favour him with more new coins than everyone else combined, than to undermine the system and the validity of his own wealth.”

In this same vein, Morgen E. Peck asks: “Why trust Bitcoin, or more specifically, why trust the technology that makes Bitcoin possible? In short, because it assumes everybody’s a crook, yet it still gets them to follow the rules. … In old security models, you tried to lock out all of the greedy, dishonest people. Bitcoin, on the other hand, welcomes everyone, fully expecting them to act in their own self-interest, and then it uses their greed to secure the network.” This is of course simply liberalism, as it was once understood. Bernard Mandeville’s The Fable of the Bees had already securely apprehended the principle. Originality lies in the implementation.

[5] The task of completing the basic triad of semiotic dimensions is a voyage into terminological torment. For signs to fold-back so far into themselves is an invitation to madness. If ‘commutation’ is vulnerable to ruinous interference from its dominant mathematical usage, ‘collective’ buckles under its hyper-density of ideological associations. To collect is to accumulate. The most primitive money, Nick Szabo suggests, consists of collectibles. Yet a social collective, in its strong ideological sense, reduces property to its zero-degree (with the full suppression of exclusive use). To invoke ‘collectivism’ in the context of monetary semiotics, then, could quite reasonably appear as a gratuitous provocation, only partially excused by the entertainment potential it releases. Its abrasiveness would most likely prove culturally unsustainable.

[6] The most firmly-established technical determination of the word ‘token’ is that locked into the logical and philosophical ‘type / token’ distinction, which has been adopted into computer science and programming. It distinguishes a type or class of thing from a thing. Wikipedia illustrates the distinction through an unimprovable sentence from Charles Sanders Peirce: “There are only 26 letters in the English alphabet and yet there are more than 26 letters in this sentence.” The counting of letter-types and letter-tokens is different. Only in the latter case does the arithmetic incline to economics, opening to factors such as production capacity and cost, batch sizes, and supply limits. (Transition to the economic register occurs via the product prototype, and its special, initial, or unique costs.) This book refers ambiguously either to the output of an authorial and editorial process, or to a unit from a print run. The digital complication of the distinction, or the meaning of an instance and its economics (encompassing the entire conceptual and practical problematic of intellectual property), is in this case especially evident. Typal property, of the kind found in IP, is reliably confounding. This coin, similarly, splits on the type / token fracture line, which divides it between its twin faces as an example of a class of coinage, and as a unit of currency. A ‘token’ in this sense has a definite relation to the idea of the non-duplicitous or allocative sign, since it isolates non-generic (or ‘numerical’) identity.
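Peirce's sentence can be checked mechanically: counting letter-types and counting letter-tokens comes apart exactly as it says.

```python
# Peirce's sentence: type-count vs token-count of its letters.
sentence = ("There are only 26 letters in the English alphabet "
            "and yet there are more than 26 letters in this sentence.")
letters = [c.lower() for c in sentence if c.isalpha()]
token_count = len(letters)       # every occurrence counts
type_count = len(set(letters))   # every distinct letter counts once
```

Only the token-count is the economically inflected one: it is the figure that would respond to questions of production runs, batch sizes, and supply limits.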

Crypto-Current (037)

§3.7 — ‘Singularity’ is a stressed sign, even in advance of its capture by theories of decentralized crypto-currency. It carries a complex of meanings that can easily appear inconsistent, and perhaps only arbitrarily concatenated (although this is not a conclusion drawn in the present work). The simplest – logico-grammatical – sense and usage of the term is fixed by contrast to plurality. ‘Singularity’ is the state of being singular (undoubled, or in any way further pluralized). This austere meaning has been overwhelmed by more exotic cosmo-physical, eschatological, and philosophical references – to the event horizon of gravitational collapse, to the ‘wall across the future’ drawn by emergent superintelligences, and to non-generic being beyond the metaphysics of unity.[1] The term is further complicated by its substantial overlap with individuation, which has itself accumulated technical semantic mass through its application to the study of complex systems. It is an essential characteristic of any complex system that it individuates (itself).

§3.71 — Bitcoin Singularity is over-determined within this cloud of associations. It is not only – as already proposed – an autonomization event, or threshold of individuation, but also a de-pluralization (through resolution of the DSP), and even a crisis (or ‘critical-point’) in the history of terrestrial intelligence, with definite invocation of Technological Singularity – for which it arguably provides an infrastructural foundation. Singularity eludes comparison. It can be designated, but not definitely signified. It marks a limit of objectification, rather than an object. Kantian transcendental realism – whose place carrier is the non-objective thing-in-itself – prepares us for it.

§3.72 — Solving the DSP upon the digital plane requires that the relevant entities – units of value – can be copied without being multiplied. Unless carefully formulated, therefore, the problem can appear simply insoluble (as a straightforward contradiction). How can digital replication be assimilated to the conservation of singularity? As seen, repeatedly, such apparent contradiction (or pseudo-paradox) is the reliable indication of an incompletely resolved diagonal problem. The solution is the scarce sign, consolidated as a concept, but also – and no less fundamentally – actualized as a technical achievement. Bitcoin realizes a diagonal function, instantiated through digitally-replicable but economically precious signs.

§3.73 — The Bitcoin singularity simulation is – among many other things – a philosophical event of extraordinary significance: the technical initiation of absolute succession. From this point, history explicitly enters the phase of synthetic ontology, or the techno-commercial production of being. Reality is re-grounded in a catalyzed – and henceforth catallactic – construction, which functions as an ultimate criterion. In all questions directed towards the veracity of signs, the blockchain is – if as yet only virtually – the terminal tribunal.[2] Intrinsic to this innovation is the necessity, or strict principle, that no superior authority is possible. Within the entire cosmos of signs, encompassing all social and cultural exchanges, it is only through the blockchain – or some adequate analog – that the extinction of duplicity is ensured.

§3.74 — Such claims can only appear hyperbolic. They correspond, as previously noted, to the objective idealism of transcendental philosophy, insofar as they dismiss all prospect of external epistemological leverage as pre-critical. Nothing can be brought to bear upon Bitcoin from without that is not manifestly inferior to it in respect to the capability for truth validation. There cannot be an intellectually compelling reason for any anthropo-philosophical criticism of Bitcoin to be believed. To be discredited, in this ultimate or transcendental milieu, is only to be effectively selected against. Such an eventuality does not depend upon a philosophical decision (in the still prevailing sense of this term), but upon abandonment through hard-forking and effective loss of consensual support. The blockchain automatically facilitates the subtraction of every cosmos – or advancing world-line – compatible with duplicity. Block validation, then, is the basic mechanism of a selective ontology.

§3.741 — It has to be expected that no less than several decades will be required for the full epochal radicality of this transition to be appreciated, at an even approximately adequate scale. The current (Perez) ‘Great Surge of Development’ and its installation of blockchain-based distributed systems sets the pace of cultural assimilation. In accordance with rhythmic historical precedent, the ‘wild exaggerations’ of the germinal phase become the conventional wisdom of the mature techonomic order.

[1] ‘Singularity’ has been an over-invested term, even prior to its inflation by intellectual fashion. In its philosophical usage, it refers to non-generic being. This acceptation has twin lineages, within Anglophone and Continental traditions, but these converge upon a (single) conceptual core. Both draw – critically – upon the Leibnizean principle of the identity of indiscernibles, which proposes that no two things can be different if their complete (or maximally-elaborated) logical definitions do not differ in any respect. This is a principle that succumbs to the general crisis of logicism, as it unfolded within the early 20th century, most decisively in the work of Gödel and Turing. It draws upon the notion of a comprehensive definition, which falls prey to criticism based on the discovery of irresolvable incompleteness, or non-computable numbers. Cantor’s late 19th century demonstration of diagonal argument anticipates the crisis of logical comprehension, in its most abstract features. The singular begins where the project of definition encounters a rigorously-insurmountable limit. ‘This is not that’ does not even begin to tell us what ‘this’ is, or to provide its name. Determination-through-negation stalls at the threshold of singularity, where logical signifiers are supplanted by diagonal indicators, pointing into the rigorously incomprehensible. The numerical identity of the thing designates a logically-intractable excess, conserved (diagonally) even after infinitely exhaustive qualitative determination. Transcendental numbers provide the pattern. Individuation is ontologically basic (or transcendental), even if it is typically missed, or misidentified, as ego or object. Real selves and things, in themselves, are singularities. Reality disintegrates into them. No universe can encompass singularities. It is rather that any apparent universe is fragmented by them. (Black holes are not cosmic furnishings, but doors.) 
Singularities are transcendental by definition, since they elude all super-ordinate jurisdiction, or domain-subordination. Laws ‘break-down’ at their boundaries. They are thus elements of absolute multiplicity, or difference without genre. Historical or ‘Technological’ Singularity is – if only superficially – another thing entirely. Vernor Vinge describes it (perfectly) as a “wall across the future”. Historical structures can be extended up to, but not into it. This ‘Singularity’ sets the absolute limit of all projections. Like a black hole, it is epistemologically-impermeable. John Smart has suggested that it might even – sensu stricto – be a black hole, achieved as an engineered techno-compression catastrophe. According to this forecast, communication time-lag minimization drives implosion. Only collapsed matter is fast enough for the future. Translated into the terminology of Bitcoin, optimization of the block discovery rate tends to singularity. Because impending terrestrial Singularity is a thing, and not a signification, it overspills every attempt at comprehensive definition. This stream of references is therefore far from exhaustive. In particular, there is an additional noteworthy employment of ‘singularity’ within the sphere of aesthetic production, designating the threshold at which the name of an artist acquires distinctive consistency, and thus efficient dehumanization. With a sufficiently abstract sense of the ‘artist’, this usage curves back into the main current. Singularity is the ultimate agent, or it is nothing.

[2] Once again, the obvious reference is found here:

Crypto-Current (036)

§3.6 — The duplicity of the sign has numerous variations, and double spending (narrowly conceived) is by no means the only one with direct relevance to Bitcoin. A digital monetary system is intrinsically open to fraudulence (manipulative duplicity) at every scale, since not only its currency units, but also its associated websites, exchanges, and institutions – up to the level of an entire implemented protocol or commercium – are vulnerable, as a matter of first-order principle, to cloning. In this (widened) sense, the DSP is the indicator of a fully scale-free vulnerability.

§3.61 — Bitcoin, as a whole, is replicable open source software. It has no secure uniqueness, beside that – by no means inconsiderable – of coming first. The fact that the distinctive identity of Bitcoin inheres solely in its originality – which is to say its historical privilege – is already an invitation to clone invasion, at multiple levels. Since the avenue of monetary counterfeiting is blocked by the Bitcoin DSP solution, digital duplicity is displaced, and in fact up-scaled. From the corruption of currency units, it is redirected into the corruption of currency institutions and systems. Fraudulent entities proliferate at the edge of the Bitcoin system, from fly-by-night scam sites to entire exchange businesses (whose structural corruption is as likely due to the unconscious consequences of defective design, as to malicious criminal intention).  

§3.62 — Among these dubious displacements of the DSP, the propagation of more-or-less Bitcoin-like currencies has a special place. The topic of altcoins is particularly engaging, and easily merits a dedicated work on its own account. As a deposit for creative techonomic endeavor, these variant cryptocurrencies are perhaps unsurpassed. Yet, when approached on the grimmest and most narrowly-critical track, they appear as deviant paths off the Bitcoin blockchain,[1] and – worse still – as a recrudescence of the DSP, amplified to the level of entire currency systems.

§3.63 — It is unnecessary to make too much of the fact that no less than three different altcoins have been brazenly named ‘Scamcoin’.[2] ‘Hammer of the altcoins’ Daniel Krawisz argues that they are all scams,[3] comparing them to cargo cults, for which there is an expectation of “similar results through blind imitation”. According to this argument, the proliferation of altcoins is a pathological phenomenon, to be denounced as an impediment to the emergence of Bitcoin’s natural monopoly (since, due to network effects, “one would always expect a single currency to overcome all its competitors”[4]). Because they sap network-effects, however feebly, altcoins are a parasitic drain, interfering with the ability of Bitcoin to rapidly reap the full consequences from its first-mover advantage. Krawisz writes: “…once Bitcoin exists, then there is no additional value, from a monetary standpoint, of creating knock-offs. … What makes Bitcoin great cannot easily be duplicated. … Altcoins can only be explained if we believe the purpose of cryptocurrencies is to make money rather than to become money.” 

§3.64 — Between Bitcoin and a close-clone altcoin, the difference that matters is invisible to even the most painstaking inspection of code. To avoid distraction, it is advisable to suspend all such comparison, and to assume – instead – perfect duplication. Bitcoin – as an event or real singularity – has no exclusive essence that can be separated from its history. It is merely an instantiation of its own code, even if the first one. Its currency potential is a matter of momentum, exhausted by its path dependency (or “history and community” as Krawisz puts it). Only the workings of nonlinear network effects, based upon its ‘first-mover’ or ‘incumbency’ advantage – rather than any determinable differences in kind – distance Bitcoin, in principle, from its proximate competitors.

§3.65 — Bitcoin does not defeat forgery by being difficult to forge, but rather – absolutely – the opposite. It abandons such terrain in advance, on the implicit assumption that all original identity is indefensible in the digital epoch. Synthetic being, alone, can secure itself. Once again, and not for the last time in this exploration, we are returned to the rift – the abyss. Bitcoin’s integrity is groundless. Every imaginable redoubt of essential uniqueness is denied to it in principle (or a priori). It can be based upon nothing other than the circuitry of auto-production, whose only ‘foundation’ lies within itself.

[1] There are three basic ways an altcoin can relate (‘cladistically’) to the Bitcoin blockchain. Competitor currencies, in particular, typically represent a separate lineage, initiated by cloning and minor modification of the Bitcoin protocol. Only slightly more speculatively – which is to say experimentally – they can be produced by ‘hard fork’ (speciation) events within the blockchain. Within the emerging digital ecology, hard forks can be expected to make an important contribution to basic political-economic conceptuality. Perhaps the most interesting possibility, with regards to evolutionary complexity, is provided by the option of attachment as side-chains. In this situation, a comparatively high degree of intricate, symbiotic co-evolution is built into the pattern of rising diversity from the beginning. See:

Bitcoin’s first hard fork occurred in August 2017, with the split of Bitcoin Cash. The break divided the crypto-currency between a major lineage prioritizing the security of a store of value, and a minor lineage prioritizing convenience as a means of payment. Monetary theory had become a process of experimental dissociation. Fitz Tepper at TechCrunch commented helpfully: “Essentially, like everything else in crypto, no one knows what’s about to happen next.”

In discussing the relation between the archaic RNA world and its obscure predecessor – perhaps Cairns-Smith-type information-preserving clays – as a highly-abstract analogy to the potential transition to a post-DNA hegemonic information medium, Hans Moravec coined the term replicator usurpation. In this regard, the comparison of blockchains to genomes is of evident relevance. Both are characterized by the massive redundancy that comes from ubiquitous copying. Every cell or node contains a full version of the record. Additionally, speciation functions comparably in both cases. Variants stemming from any given speciation event (or hard fork) share a lineage. Diversity has a cladistic structure, or fragmentation record, registered in the conservation of common code.

Fred Ehrsam notes that: “Forking is a … critical evolutionary mechanism for blockchains. Just like mutations to DNA in biological organisms allow for evolution through natural selection, forking lets us run multiple experiments in parallel where the strongest versions survive.”


The topic of forking, amid other types of crypto-currency proliferation and diversification, will return in relation to the concept of inflation in a post-macroeconomic world. The emerging monetary schematics suggest spontaneous (decentralized) market regulation of the price of money via the propagation of difference rather than the amplification of the same. Speciation replaces printing as a deflation-control.

[2] The original ‘ScamCoin’ was released in January 2014. It was succeeded by two further altcoins bearing the same name.

[3] See:

[4] The argument for natural cryptocurrency monopoly, in its most abstract features, is a strict analog of the proposal advanced among certain influential voices engaged with the prospect of AI-Singularity, that such an event would be expected to install an effectively-unchallenged ‘Singleton’. In both cases, the argument identifies a point of criticality (or singularity) at which first-mover advantage is amplified explosively, by powerful positive feedback, leading rapidly – or at least with exponential cybernetic inevitability – to total domination. An articulate defense of this idea has been presented by Nick Bostrom:

Crypto-Current (035)

§3.44 — From the perspective of the miner, bitcoins are immanent remuneration for primary production, or resource extraction. They function as digital gold. As the simulation of a finite resource, it is natural that their production rate should exhibit declining marginal returns. Each increment of mining effort confronts an increasingly challenging environment, under conditions of steady depletion. For Bitcoin, as for gold, economic dynamics automatically counter-balance industrial exertion, as prices adjust in response to supply constraints. This process of continuously revised bitcoin price discovery cannot be determined within the protocol, but occurs at its edges, where economic agents trade into, and out of, bitcoins – synthesizing the Bitcoin commercium with its outside.

§3.45 — Within the protocol, adjustments are restricted to supply modifications, modeling the depletion of an abstract resource that is advanced as a general commodity (i.e. money). Bitcoin splits its schedule of decreasing returns in two, separating its measures of reward and difficulty. This double contraction – while clearly redundant from the viewpoint of austere abstract theory – enables a superior degree of flexible calibration, in response to a dynamic environment, volatilized above all by rapid improvements in computational engineering (and product delivery). By dividing bitcoin output compression between two interlocking processes, the protocol is able to stabilize the rate of block validation in terms of an ‘objective’ (external) time metric. The difference between these two modes of nominal reward restriction reflects a schism in time, between the intrinsic, intensive, absolute succession of the blockchain, and the extrinsic, geometricized order of pre-existing (globalized) chronological convention. Integrated reward is a complex chrono-synthesis, occurring at the boundary where Bitcoin’s artificial time – proceeding only by successive envelopment (of blocks into the chain) – meets the social-chronometric time of measurable periods. ‘Ten minutes’ means nothing on the blockchain (in itself), until connected by an interlock that docks it to a chronometer.

§3.46 — Are not all blocks time-stamped? it might be objected. To avoid confusion at this point, it is critical to once again recall the difference between the ordinal and the cardinal, succession and duration. Time-stamps are ordinal indicators, without any intrinsic cardinality, and with merely suggestive cardinal allusion. They implement an ordering convention. Metric regulation of periods is an entirely distinct function. ‘Chain’ means series (and nothing besides).

§3.47 — The bitcoin reward rate halves, stepwise, in four-year phases, on an asymptotic progression towards the limit of BTC 21,000,000 – the protocol’s horizon of zero-return. Taken in isolation, this exponential decline looks smoothly Zenonian (asymptotic), or infinitesimalizing, until arbitrarily terminated at a set point of negligible output. It is scheduled to pass through 34 ‘reward eras’, in the last of which – with block 6,930,000 – BTC issuance reaches zero. Due to the power of exponential process, 99% of all bitcoins are issued by Era-7 (during which 164,062.5 bitcoins are added to the supply).* The end of Bitcoin’s mining epoch is anticipated in the year 2140. After this point, at a date so distant that it belongs to the genre of science fiction, continuation of the system requires that mining-based block validation incentives are fully replaced by transaction fees. Evidently, the transition process cannot be expected to await its arrival at this remote terminus, which marks a point of completion, rather than inauguration.
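The schedule’s arithmetic can be sketched directly. The following is an illustrative reconstruction from the figures above (210,000 blocks per era, an initial reward of 50 BTC, integer halving in satoshi units), not a transcription of the protocol’s source code:

```python
# Bitcoin reward eras: the block subsidy halves every 210,000 blocks.
# Computed in satoshis (1 BTC = 100,000,000 satoshis), as the protocol
# does, so that the reward truncates to zero in the 34th era.
BLOCKS_PER_ERA = 210_000
SATOSHI = 100_000_000

def era_subsidy(era):
    """Per-block subsidy (in satoshis) during a given reward era (1-indexed)."""
    return (50 * SATOSHI) >> (era - 1)

# Total issuance over the 33 paying eras: just under 21,000,000 BTC.
total = sum(era_subsidy(e) * BLOCKS_PER_ERA for e in range(1, 34))

print(total / SATOSHI)         # just under 21,000,000
print(33 * BLOCKS_PER_ERA)     # block 6,930,000: issuance reaches zero
```

Era-7’s contribution (0.78125 BTC per block, over 210,000 blocks) is exactly the 164,062.5 BTC cited above; integer truncation is why the limit is approached but never attained.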

§3.48 — The reward schedule is further tightened by increasing difficulty of the hashing problem. Rather than executing a pre-programmed deceleration, Bitcoin’s rising difficulty responds dynamically to technological acceleration, and balances against it, thus holding the block validation rate roughly constant. Even as the reward rate tumbles – when denominated in BTC – the block processing rate is approximately stabilized, at a rate of one block every ten minutes, regardless of the scope and intensity of mining activity.
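The stabilization described here is a feedback rule. A hedged sketch of the retarget calculation, assuming Bitcoin’s published parameters (a 2,016-block window, ten-minute target spacing, and a fourfold clamp on any single adjustment):

```python
# Difficulty retarget: every 2,016 blocks, rescale the proof-of-work
# target so the observed interval matches the expected two weeks
# (2,016 blocks x 10 minutes). Adjustment is clamped to a factor of 4.
RETARGET_BLOCKS = 2_016
TARGET_SPACING = 10 * 60                         # seconds per block
EXPECTED_SPAN = RETARGET_BLOCKS * TARGET_SPACING

def retarget(old_target, actual_span_seconds):
    """Return the new target (a higher target means easier blocks)."""
    span = max(EXPECTED_SPAN // 4,
               min(actual_span_seconds, EXPECTED_SPAN * 4))
    return old_target * span // EXPECTED_SPAN

# If hashpower doubles, blocks arrive in half the expected time,
# and the target halves (difficulty doubles) at the next retarget.
print(retarget(1 << 220, EXPECTED_SPAN // 2) == (1 << 220) // 2)
```

The rule has no reference to absolute hashrate; it responds only to the drift of block arrival against chronometric time, which is what holds the ten-minute cadence roughly constant through arbitrary technological escalation.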

§3.49 — ‘Difficulty’ modification is a synchronization. The Zenonian time of intensive compression that determines the BTC reward-rate is – taken on its own – wholly autonomous, or artificial. As already noted, its chronometric ‘ticks’ are block validation events, registered in serial block numbers (and their ‘epochs’). They have no intrinsic reference to the units of ordinary time. It is only with the stabilization of the block-processing rate that the time of Bitcoin is made historically convertible, or calendrically intelligible, through the latching of block numbers to confirmed or predicted dates. This is a supplementary, synthetic operation, which coincides with the protocol’s anthropomorphic adoption. The time of the blockchain is intrinsic, and absolute, but its history is a frontier, where it engages ‘us’. As the blockchain is installed, and thus dated, an artificial time in-itself – consisting only of absolute succession – is packaged as phenomenon.

§3.5 — It can easily be seen that bitcoin mining is an arms race, of the ‘Red Queen’ type.** Since the total bitcoin production rate has zero (supply) elasticity, local advances in production can only be achieved at the expense of competitors. In consequence, inefficient miners are driven out of the market (as their costs – especially electricity bills – exceed the value of their coin production). This brutal ecology has forced rapid technological escalation, as miners upgrade their operations with increasingly specialized mining ‘rigs’. In the course of this process, the standard CPUs initially envisaged as the basic engines of bitcoin mining have been marginalized by dedicated hashing hardware, from high-performance graphics processing units (GPUs) – originally designed for application to computer games – through field-programmable gate arrays (FPGAs), to application-specific integrated circuits (ASICs). Bitcoin has thus stimulated the emergence of a new information technology industrial sub-sector.

§3.51 — With the completion of this production cycle, Bitcoin Singularity is established in a double sense (we will soon add others). An unprecedented event has occurred, upon a threshold that can only be crossed once, and an innovation in autonomization attains actuality, establishing the law for itself. Bitcoin provides the first historical example of industrial government. It is ruled in the same way that it is produced, without oversight. At the limit, its miners are paid for the production of reality – effectively incentivized to manifest the univocity of being as absolute time.***

* For a more detailed description of the Bitcoin reward schedule, see.
For a compact, chronometric representation of mining difficulty, see.

** The Red Queen dilemma, as formulated by Lewis Carroll in Through the Looking-Glass, is that “it takes all the running you can do, to keep in the same place.” Daniel Krawisz makes another comparison: “When a person upgrades their mining computer, they mine at a faster rate and therefore earn more bitcoins. However, when everyone upgrades, the mining does not become more efficient as a whole. There is only supposed to be one new block every ten minutes regardless of how hard the network is working. Instead, the network updates the difficulty to require more stringent conditions for future blocks. All miners may work harder, but none is better off. It is rather like a forest, in which every tree tries to grow as tall as possible so as to capture more light than its fellows, with the end result that most of the solar energy is used to grow long, dead trunks.”

*** The doctrine of the univocity of being is derived from Duns Scotus, and passes into modernity by way of its implicit contribution to Spinozistic ontology, as re-activated by Deleuze. It can be formulated in various ways. Most basically, the meaning of ‘being’ is insensitive to its application, and unaffected by differences of kind. Thus, the being of any being is no different from that of any other. God is not a flake of dandruff (and differs very significantly in kind from one, whether the distinction is entertained from a theist or atheist perspective), but the being of God has no difference whatsoever from that of a flake of dandruff – and even if God is held not to exist, the being denied him is the same as that of any existent thing. In other (more ‘Heideggerian’) words: ‘Being’ is not susceptible to ontic qualification. In its pure conception, therefore, what is said by ‘univocity of being’ is exactly equivalent to ontological difference.

Crypto-Current (034)

§3.4 — The Bitcoin DSP-solution unshackles (digital) proliferation from duplicity, in the production of replicable singularity. As with every diagonal construction, this outcome is pseudo-paradoxical, since it reformulates an apparent contradiction. From the latent matrix of abundant signs and scarce things, it extracts the scarce sign. Through this procedure, crypto-currency is implemented as critique. It coins a diagonal concept, not as impractical-contemplative ‘theory’, but as working code.

§3.41 — Duplicity – or the DSP – is primarily registered as a monetary problem, in the guise of counterfeit currency, and secondarily as a problem of identity authentication, responding to impersonation. On the Internet, however, another manifestation of the same basic syndrome has been far more prevalent, socially advanced, and technically provocative. The critical driver, on the path to a cryptographic solution to the DSP, has been spam.

§3.42 — ‘Spam’ is narrowly defined as a species of advertising adapted to the conditions of near cost-free electronic communication. Its first large-scale manifestation was ‘unsolicited bulk email’ (UBE), a sub-category of the more general phenomenon of the ‘electronic spam’ which exploits the receptivity of instant messaging systems, newsgroups, mobile phones, social media, blog comments, and online games, among others. While advertising is the principal motivation for this massive duplication of unwanted – and typically only vaguely directed – communications, spam procedures (and supportive technologies) can also be employed for DoS (denial-of-service) attacks, which are designed to overwhelm a specifically-targeted recipient with an inundation of messages. At a sufficiently abstract level of apprehension, no strong boundary of principle differentiates advertising spam from a denial-of-service (DoS) attack, except that the former is generally divergent (one-to-many) and the latter convergent (many-to-one). The residual distinction is motivational. The injury (cost) to the recipient that is an inevitable side effect of spam promotion (‘collateral damage’) is a primary objective for the DoS assailant.

§3.421 — ‘Spam’ – abstractly conceived – spontaneously expresses the consequences of extreme information economy, or radical dematerialization, and is thus emblematic of electro-digital semiotic crisis. It follows the Law of the WORM – write once read many (times) – into a near-costless replication explosion. Unsurprisingly, any recipient of electronic communications is vulnerable to spam harassment, generating a problem that tends to ubiquity. The arms race between spammers and spam filters is recognizable from that characterizing the cross-excitation of infections and immune systems in the biological sphere. Cheap sign contagion is the common syndrome. As the various Turing Test-type defenses attest, any effective obstacle to the automation of spam production increases its cost. The time taken to ‘prove you are human’ adds friction at the point of terminal message delivery, where it cannot be easily eliminated – pre-emptively – by the spammer. Such ad hoc defenses necessarily aim to raise messaging cost, in order to restore the signal of commitment that digitization has erased.

§3.422 — The difference between a solution to the DSP and a spam filter turns out to be somewhere between subtle and non-existent. Both respond to the destructive consequences of semiotic economy – cheap signs – as these climax within networked, digital electronics.* The critical step in this respect was taken by Adam Back in 1997, with hashcash, a proof-of-work based messenger credentials system. As Back describes the innovation: “Hashcash was originally proposed as a mechanism to throttle systematic abuse of un-metered internet resources such as email, and anonymous remailers in May 1997. … The hashcash CPU cost-function computes a token which can be used as a proof-of-work.”**
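Back’s cost-function can be sketched in miniature. The following illustrates a proof-of-work of the hashcash type, with SHA-256 standing in for the original hash and a toy difficulty parameter; it is not the hashcash stamp format itself:

```python
import hashlib

# Hashcash-style proof-of-work: find a nonce such that the hash of
# (message + nonce) begins with a required number of zero bits.
# Minting the token costs many trials; verifying it costs one hash.
def leading_zero_bits(digest):
    bits = bin(int.from_bytes(digest, "big"))[2:].zfill(len(digest) * 8)
    return len(bits) - len(bits.lstrip("0"))

def mint(message, difficulty=12):
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{message}:{nonce}".encode()).digest()
        if leading_zero_bits(digest) >= difficulty:
            return nonce          # the proof-of-work token
        nonce += 1

def verify(message, nonce, difficulty=12):
    digest = hashlib.sha256(f"{message}:{nonce}".encode()).digest()
    return leading_zero_bits(digest) >= difficulty

nonce = mint("hello")
print(verify("hello", nonce))     # True
```

The asymmetry is the whole point: each additional zero bit doubles the expected minting cost, while verification remains a single hash. The ‘signal of commitment’ is the statistically attested expenditure, not any content of the message.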

§3.423 — Rather than offering another piecemeal response to some particular spam problem, Back’s solution looks more like an attempt to fix the Internet, or even more than this. Hashcash tackles the spam problem at its source (cheap signs). Rather than defensively fending off ever more cunning spam intrusions, it enables a positive signal that someone has taken the trouble to communicate this, with the ‘trouble’ being attested by proof-of-work certification. This solution can be seen as a basic filter. It works as an admission pass, rather than a policing operation. The cost of duplicity is raised at the root, which involves the DSP being grasped as the root.

§3.424 — The very name ‘hashcash’ attests to the realization that proof-of-work certification is self-monetizing. Evidence of effort – when this is pre-formatted as a signal of commitment – has intrinsic potential value, independent of its application. A currency is initiated automatically, and all that remains is the process of price-discovery. Bitcoin provides a framework within which this process can occur.

§3.43 — However tempting it might be to construe proof-of-work as an algorithmic reprise of the labor theory of value (an LTV 2.0),*** it is not from political economy that Bitcoin derives its sense of ‘work’ – unless by extraordinary circuitousness – but from computer science. The work to be proven, in the validation of a block and associated currency issuance, is performed by a CPU in the course of a mathematical puzzle-solving exercise, and demonstrated through successful execution of a computational task. It is the final measure – beyond which no appeal is possible – of the contribution made by any node to the running of the network. Such work is probabilistic, rather than deterministic. There is no application of computational effort that can strictly guarantee reward. The work required of the miner is persistence in pursuit of a low-probability outcome, through repeated trials. It is both structurally and genetically related to a process of stubborn cryptographic attack – ‘hacking’ in its colloquial, though not traditional, sense – and also to a grueling search for success in a lottery-type game of chance.

§3.431 — Proof-of-work is accomplishment at a test, which can then be employed as a key. In the case of Bitcoin, it simultaneously ‘unlocks’ new bitcoins and casts a ‘vote’ that counts towards the consensual updating of the blockchain. Incentive and service are nondecomposably married. Optimal functionality is achieved by making the content of the test entirely meaningless. It serves as a demonstration of brute force (trial-and-error) computation, inherently resistant to rationalization, and thus irreducibly arduous. It is not a test of cognitive achievement, in any general or sophisticated sense, but solely of computational effort. Its sole ‘significance’ is its difficulty. Despite the obvious risk of anthropomorphism, it might even be described as an ordeal, or – less dramatically – as a trial, unambiguously demonstrating commitment.

§3.432 — Would it not be preferable to have this ‘work’ also (i.e. simultaneously) applied to a problem of intrinsic value?**** In its most positive formulation, this question has been a stimulus to altcoin differentiation. Anything else that mining might do, besides sheer block validation, seems to indicate an unexploited seam of surplus value. Such suggestions are strictly analogous to a recommendation that gold prospecting be bound to valuable activities of some other kind (such as fossil hunting). On the basis of fundamental economic principle, they merit the most vigilant suspicion, since they amount to a deliberate confusion of cost calculations, promoted in the name of a superior – or at least supplementary – utility. Yet however much the costs of mining are strategically muddied – and in fact, in some complex fashion, cross-subsidized – they still need to be unmuddied, to exhibit an economically-intelligible commitment. Mining investment is a signal, which cannot be dissolved into extraneous purposes without destruction of critical information. To whatever extent bitcoin miners are generating bitcoins by accident, is also the degree to which their contribution to Nakamoto Consensus, or block validation, is devalued. The perfect pointlessness of bitcoin generation procedures – for anything other than Bitcoin system consolidation (as remunerated in bitcoins) – is a feature, and not a bug. Cybernetic closure, or self-reference, is its own reward, and it is only as such that it acquires distinctive monetary characteristics. As always within the terrain of auto-production, this is the inescapable abyss, or principle of immanence. The self-propagating circuit has no ground beyond itself, and can only be impaired by the attempt to provide one.

* ‘Spam’ invites a very general definition as the spontaneous expression of digital economics (or near-zero cost communications). The Wikipedia article on email spam makes the point well: “Since the expense of the spam is borne mostly by the recipient, it is effectively postage due advertising.” The Internet Society attributes the term to the celebrated Monty Python comedy sketch depicting the widely-derided tinned meat as “ubiquitous and unavoidable”. Estimates of the cost of email spam vary wildly, with the high-end figures reaching over US$100 million annually, for US businesses alone, by the early years of the 21st century. Global spam volume in 2011 is thought to have exceeded 7 trillion messages. The illusion of costlessness is illustrated starkly by the phenomenon of spam, through the revelation of an unanticipated trade-off. Whatever is free is abused. If an activity with discernible externalities can be pursued without definite commitment, it tends to produce a tragedy of the commons (see Chapter 3). Microscopic private utilities within a zero-cost matrix generate an explosion of informational pollution. Abundance theories, therefore, have special cause to be intellectually disturbed by the phenomenon. Spam is a toxic Cornucopia.

** See: ‘Hashcash – A Denial of Service Counter-Measure’ (2002).

*** Proof-of-work as a foundation for commercial value echoes a theme that has reverberated through the tradition of political economy. It leads, by a scarcely-resistible digression, onto an associated track of exceptional historical richness, which is the analysis of value-creation as work, or labor time. From Smith to Marx, this has been a conceptual commitment that closely coincides with classicism in economic theory. Subsequently, the power of the marginalist – and especially Austrian – analysis has tended to entirely overshadow the intellectual labors of the objective value theorists, and even to topple them into derision. Marginalism threw its political-economic precursors into eclipse due to the evident superiority of its transcendental foundations, even if this articulation of its success found no corresponding acceptance within professionalized economic study. From the critical perspective, the objectification of the (subjective) negative utility of effort can only appear metaphysical. Its historical supersession, in this regard, follows predictably from its essential – and rigorously intelligible – error.
It would be unfortunate, nevertheless, if a type of Whig-historical triumphalism were to obliterate all understanding of the genuine theoretical insight now entombed with the Labor Theory of Value (or LTV). Most basically, the LTV already incorporates an important critical-subjectivist insight. This can be stated in different ways. Conceived in terms of power, it is the recognition that money does not primarily overcode inanimate resources, but rather represents a distributed system of command (though one rendered inconspicuous by its intrinsic exit-option). Wealth is bound by exchange equivalences to static assets only because it quantifies a capacity to direct activity. It crystallizes compliance. The ‘normal’ economic evolution in the direction of services makes this reality explicit, from the side of consumption. An analogous subjectivization is recognizable in regard to utility (value). Commercial incentives – including those at work in the labor market – can be theoretically systematized as an economization of effort. The value of a possession is the incarnated advantage of no longer having to struggle to obtain it. The critical reversal here is blatant, and crucial. Within the Marxist tradition it is understood as the disillusioning of a fetishization. It is not the thing, but the difficulty of its acquisition, that establishes the foundation of its value. This is an insight, of course, whose foundations were solidly established by the earlier classical economists, Ricardo most notably. The LTV, it can be seen, is a critical relief from naïve objectification, even if it is also – in well-understood respects – a perpetuation of it.
When pursued in detail, however, whether as a matter of economic theory or industrial practicality, fixing the relation between time and work has proved daunting. Quantification of work on the basis of standard time units exhibits an obvious dependency upon chronometric technology. This, in itself, suffices to identify the topic as distinctively modern. Beside comparatively accurate time-measurement, the practice of compensated labor time also requires some adequate degree of work monitoring. It is necessary to know both how much time is spent working, and that this time is spent working. In practice, these requirements have been understood as demands for oversight, regardless of the ideological characteristics of the industrial regime in question. Solutions to the informational problems of work monitoring have been institutionalized within the factory organization of production, integrally, originally, and necessarily. Such systematization of proof-of-work within an anthropological context reaches its most remarkable expression within the methods of Taylorism, which from a theoretical perspective is only an elaborate social hack. The time-and-motion analysis required for ‘scientific management’ is stubbornly intractable to political-economic abstraction (of a kind sufficient for rigorous conversion into monetary quantities), and no practical advance of conceptual significance has occurred subsequently. Immanent proof-of-work, despite its supposed manifestation in the commodity – as exchange value – eluded both ‘bourgeois’ political economy and its socialist critics. The production of measurable (human) labor time has proceeded alongside its theoretical analysis, within a semi-parallel, partially interactive, historical dynamic. This is investigated, within the tradition of Marxist historical sociology, by E.P. Thompson, in his essay on ‘Time, Work-Discipline and Industrial Capitalism’. 
He is meticulous in noting that “a severe restructuring of working habits” has been practically inseparable from the relevant “changes in the inward notation of time”. That theorization has not proceeded in this case from the inside out, is the critical historical materialist insight. Labor time was a distributed, experimental, piecemeal process, before it was a political-economic conception.
Karl Marx’s maxim for socialist compensation “to each according to his work” achieves an ironic actualization in the Bitcoin reward system. All power to distributed hashing capability!

**** In the words of Nick Szabo: “The secret to Bitcoin’s success is certainly not its computational efficiency or its scalability in the consumption of resources. Specialized Bitcoin hardware is designed by highly paid experts to perform only one particular function – to repetitively solve a very specific and intentionally very expensive kind of computational puzzle. That puzzle is called a proof-of-work, because the sole output of the computation is just a proof that the computer did a costly computation.”
“Very smart, but also very wasteful,” is the way one exemplary critic describes the proof-of-work concept. “All this computer time is burned to no other purpose. It does no useful work – and there is debate about whether it inherently can’t do useful work – and so a lot of money is spent on these lottery tickets. At first, existing computers were used, and the main cost was electricity. Over time, special purpose computers (dedicated processors or ASICs) became the only effective tools for the mining problem, and now the cost of these special processors is the main cost, and electricity the secondary one. … What this means is that the cost of operating Bitcoin is mostly going to the companies selling ASICs, and to a lesser extent the power companies.”
Resonantly: “In mid January 2014, statistics maintained at showed that ongoing support of Bitcoin operations required a continuous hash rate of around 18 million GH/sec. During the course of one day, that much hashing power produced 1.5 trillion trial blocks that were generated and rejected by Bitcoin miners looking for one [of] the magic 144 blocks that would net them $2.2 million USD. Almost all Bitcoin computations do not go towards curing cancer by modeling DNA or to searching for radio signals from E.T.; instead, they are totally wasted computations.”
Crypto-currency pioneer ‘Wei Dai’ is emphatic about the importance of teleological purification to efficient proof-of-work schemes: “The only conditions are that it must be easy to determine how much computing effort it took to solve the problem and the solution must otherwise have no value, either practical or intellectual.” [My emphasis.]

Crypto-Current (033)

§3.3 — Grasped abstractly, the most powerful functional innovation of the Bitcoin protocol is the binding of currency issuance to the servicing of system integrity, which twists the process into a consistent circuit. It is this loop that enables the protocol to achieve autonomy, or – in a reflexive articulation – self-reliance. Because industrial incentives cover all regulatory requirements, self-reproduction is embedded within the process of bitcoin production. The protocol makes it impossible to produce bitcoins without automatically policing Bitcoin. Primary wealth extraction cannot take place without verifying transactions – through the validation of blocks – and thus tending the system as a whole, consistently and comprehensively (as if with an invisible hand). Stated succinctly, Bitcoin instantiates immanent economic government.

§3.31 — This auto-productive economic security circuit is evidence for the fundamental integrity of the Bitcoin blockchain. Currency and distributed public ledger are a single functional system, with neither making coherent operational sense without the other. This is a point made with exceptional cogency by Bitcoin commentator ‘Joe Coin’:

Given the crucial requirement to preserve decentralization, the problem Satoshi had to solve while designing Bitcoin was how to incentivize network participants to expend resources transmitting, validating, and storing transactions. The first step in solving that is the simple acknowledgement that it must provide them something of economic value in return. … The incentive had to be created and exist entirely within the network itself … any instance of a blockchain and its underlying tokens are inextricably bound together. The token provides the fuel for the blockchain to operate, and the blockchain provides consensus on who owns which tokens. No amount of engineering can separate them.*

§3.32 — The threshold crossed here is both subtle and immense. Retrospectively, it will have been almost nothing, since the techonomic circuitry it invokes was – now demonstrably – already the operational principle of modern civilization (capitalism). It is only through Bitcoin, however, that the essential techno-commercial integrity of capitalism is brought into crisp focus, and extracted from speculative debate. When the machine is theoretically apprehended, ‘holistically’, as a real individual – or, far more consequentially, implemented as such – neither its technical nor its economic ‘aspects’ can be diverted into transcendence, or contingency, as extraneous, mutually-independent factors. Incentives are inherent to the machinery.** In a sense more complex – and involving – than anything the harsh paradox of the term immediately communicates, Bitcoin is a purposive mechanism. The conclusive action of the Bitcoin system – block validation – which seals each cycle of its reproduction, is a non-decomposable teleo-mechanical step (a diagonal escalation, or transcendental synthesis). It is industrialism, the mechanizing market, distilled to a previously unrealizable quintessence.

§3.33 — ‘Capital’ means – simultaneously and indissolubly – technological assets (machine-stock) and comparatively illiquid money (investment). Between these twin aspects there is only formal (and not real) difference. Their real integrity is demonstrated by techonomic machinery. The economic analysis of capital is diverted through technology, since wealth cannot be grasped substantially except in its cycle through productive apparatus, but technological analysis is drawn, reciprocally, into economics by the integration of rewards into the machine. At the level of philosophical reflection, under the cognitive conditions inherited from its mainstream European traditions, such techonomic integrity is difficult to hold together. To fuse mechanical causes with behavioral incentives in a techno-strategic assembly is to meld registers that have been determined as mutually inconsistent since antiquity.

§3.34 — Techonomic apprehension runs into a direct collision with the commanding dualism of the modern mentality, by insisting upon a re-animation of the compact between efficient and finalistic action. According to the complacent tenets of the new (or ‘enlightened’) cultural settlement, based upon the drastic demotion of scholasticism and its displacement by a substitute theo-scientific division of labor,*** the bridge from mechanism (cause-effect) to teleology (means-end) had been definitively dismantled. Each was henceforth to be compartmentalized within a distinct, wholly independent dimension. Their sole residual relation was orthogonal (or demarcated). The realms of directed liberty, and of instructed mechanism, were to be perfectly isolated from each other, and mutually withdrawn beyond all possibility of reciprocal interference. In this arrangement was to be founded the modern peace, of no lesser consequence than that of Westphalia, and something close to a genuine social contract. Through it, an amoral techno-science was co-produced beside an agnostic politics. Two complementary templates for expertise arose, each pledged to silence in the house of the other. This compact has been at once the condition for the gestation of an autonomous industrial power, and – on exactly the same grounds – an obstacle to its cognitive digestion. With the surfacing of the concealed techonomic entity, it buckles, loses coherence, sheds explanatory credibility, and undergoes accelerating social desanctification. Modernity’s axial, though predominantly inexplicit, concept of the mechanical instrument – whose self-contradiction had been concealed as if within a collapsed dimension – escapes its bonds and re-emerges to break the basic categories of Occidental thought. That is where we are now.

§3.35 — The intellectual crisis stimulated with ever-increasing intensity by techonomic escalation (that is, by capitalism, or efficient critique), has fertilized a luxuriant foliage of ‘deconstruction’. Yet, the untenability of orthogonal conceptuality does not necessitate a subsidence into cognitive dilemma, or aporia. Even when the problem is restricted within the narrow bounds of its philosophical formalization, it opens a positive path – pursued since the inception of the process – into diagonal action, or individuation. It is surely implausible to decry as ‘unthinkable’ what has been demonstrably operationalized. Bitcoin attests to such a process with each cycle of block validation and Nakamoto Consensus. The process demands something structurally and functionally indistinguishable from transcendental philosophy, insofar as it is to be constituted – even very approximately – as a coherent object of thought. What it makes of this ‘philosophy’, however – as it pushes through upgrades into successively ultra-radicalized immanentizations – is rarely self-advertized as such. What it apparently offers, instead, is ‘technology’ – a term that is a near-exact synonym for ‘instrumental mechanism’, and one that undergoes comparable internal schism, across the same conceptual rift.

§3.36 — In any approach to the techonomic entity – plotted as if from outside – the notions of emergence (or individuation), diagonal process, teleo-mechanical causality, integral nonlinearity, and transcendental escalation begin to exhibit a general inter-substitutability. All of these things, among many others, are convertible by simple transforms into immanentization, or the real operation of critique. An efficient side-lining of pseudo-transcendence – achieved by way of a dynamic flattening – is the reliable signature of the trend. The solution to the DSP is a diagonalization.

* Source. The importance of this argument is almost impossible to over-estimate.

** A (2014/10/29) tweet by Balaji S. Srinivasan describes the diagonal succinctly: “Bitcoin allows algorithms to act on incentives.”

*** That which is settled by the formalization of techno-political compartmentalization is, of course, the great war of religion that inaugurates European – and thus global – modernity. In a way still stronger than that outlined by Max Weber in his The Protestant Ethic and the Spirit of Capitalism (1905), self-propelling industrialization coincides with a break from the Catholic civilization of the West. The consolidation of an immanent techonomic principle (‘growth’ or positive cybernetic nonlinearity) presupposes a drastic contraction of the sphere of ecclesiastical cultural authority. Capitalism is that, by essence, which is not answerable to anything beyond itself. Its incremental actualization, therefore, presupposes social fractures, from which superordinate moral agency has receded. Among the major civilizations of the world, only Europe – under the impact of Reformation – realized this condition during the early modern period. A broken religion is a basic requirement of modernity, which Protestantism pioneered uniquely. (The work of David Landes explores this catalytic dissociation in detail.) Modern social institutions thus formalize and entrench a disconnection between what is and should be. Science is freed, in principle, to tell ugly truths. Engineers are freed to devise machines whose purposes the uncontaminated dynamic of capital accumulation alone dictates. Modernization calls for nothing other than this. The division of labor, or authority, between (traditional) religious doctrine and (modern) techno-scientific investigation is philosophically consolidated into the distinct spheres of practical and theoretical reasoning (to employ the Kantian vocabulary, as concretely instantiated in the topical differentiation between the first two Critiques). In very recent times, this enduring demarcation is faithfully reproduced – without notable modification – by Stephen Jay Gould’s conception of Non-Overlapping Magisteria (NOMA), which divide religion and science, values and facts, in the same way, and with the same crypto-political emphasis upon jurisdictions. Given the historical status of this argument, as a near-perfect restatement of the original critical settlement, laid down in the final decades of the 18th century, it is surely extraordinary that Kant is nowhere mentioned in Gould’s essay.

Crypto-Current (032)

§3.2 — The Bitcoin paper consists of twelve short sections, including an introduction and conclusion. It is compressed to a minimal summary at this point, although discussed in pieces throughout the book, and rehearsed at slightly greater length in the first appendix. The emphasis here is critical, oriented – as is the paper itself – to the dissolution of the DSP, and thus the construction of a plane of transactional immanence, from which all transcendent elements (or “trusted third parties”) have been evacuated. The transcendental argument of the Bitcoin paper runs as follows:

§3.21 — The “trust based model” is expensive, socially frictional, and vulnerable to fraud. To overcome these problems, Bitcoin proposes the substitution of “cryptographic proof” for “trust” (which is to be obsolesced by irreversible crypto-commitments). The elimination of trust-based mediations reduces transaction costs. The system remains resilient in the absence of oversight, so long as a predominance of applied “CPU power” is controlled by “honest nodes”.

§3.22 — An “electronic coin” is defined “as a chain of digital signatures”, which is equivalent to “a chain of ownership” (this is described later, in the conclusion, as the “usual framework” for crypto-currency construction). The elimination of the need for a “trusted third party” (or “mint”) requires that transactions be “publicly announced” within a system that enables “participants to agree on a single history of the order in which they were received”.
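The ‘chain of digital signatures’ structure can be miniaturized as follows (a toy sketch: since the Python standard library has no elliptic-curve signing, a hash over the previous link plus the payee’s key stands in for a real ECDSA signature; only the chaining structure is illustrated, and all names are hypothetical):

```python
import hashlib

def transfer(prev_link: str, payee_pubkey: str) -> str:
    """Append one link to the coin's chain of ownership.

    In Bitcoin proper the payer signs a hash of the previous transaction
    and the next owner's public key; here a bare hash stands in for the
    signature, preserving the chained structure alone.
    """
    return hashlib.sha256((prev_link + payee_pubkey).encode()).hexdigest()

# The coin *is* this chain: each transfer envelops the one before it.
coin = ["genesis"]
for owner in ["alice_pub", "bob_pub", "carol_pub"]:
    coin.append(transfer(coin[-1], owner))
```

Each link commits to the entire prior history of the coin, which is why a ‘chain of digital signatures’ and a ‘chain of ownership’ are equivalent descriptions.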

§3.23 — Bitcoin’s synthetic history draws upon established procedures for digital time validation, using a timestamp server to chain its hashed blocks in succession. “Each timestamp includes the previous timestamp in its hash,” constructing an artificial history as a robust series of envelopments – or ordered swallowings – “with each additional timestamp reinforcing the ones before it.”
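The envelopment structure admits a compact sketch (hypothetical `stamp` and `verify` helpers; real Bitcoin headers carry further fields, such as the nonce and Merkle root, and use double SHA-256):

```python
import hashlib
import json

def stamp(prev_hash: str, items: list) -> dict:
    """A timestamp 'envelops' its predecessor by hashing over prev_hash."""
    block = {"prev": prev_hash, "items": items}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def verify(chain: list) -> bool:
    """Each block must hash correctly and point at its predecessor."""
    for i, block in enumerate(chain):
        body = {"prev": block["prev"], "items": block["items"]}
        if hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True
```

Tampering with any early block invalidates every hash downstream of it, which is the precise sense in which “each additional timestamp reinforc[es] the ones before it.”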

§3.24 — The timestamped blocks are secured against tampering by proof-of-work locked hashes.* Such irreversibility is at once a deployment of cryptographic asymmetry, a consummation of contractual integrity, and a realization of (time-like) successive order. Notably, it is isomorphic with a thermodynamic – or statistical mechanical – gradient.

§3.25 — The network reproduces itself through a six-step block creation cycle. Since nodes “always consider the longest chain to be the correct one”, synthetic history, as an ordinal-quantitative variable, functions as a (selective) ontological criterion. Accepted blocks provide the building material for the subsequent cycle of network reproduction.
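The selection rule itself is almost embarrassingly simple (a sketch in which chain length stands proxy for accumulated proof-of-work, as it does at fixed difficulty; Bitcoin proper weighs competing chains by total work):

```python
def select_canonical(chains: list) -> list:
    """Nodes treat the longest valid chain as the real history.

    The ordinal quantity (length) functions as the ontological criterion:
    whatever the longest chain records is what happened.
    """
    return max(chains, key=len)
```

A shorter fork is not refuted; it is simply abandoned, its blocks dropping out of synthetic history.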

§3.26 — Bitcoin builds incentives into its infrastructure. Nodes are automatically compensated for the work they perform maintaining the network through the issuance of new coins. The system thus attains techonomic closure. The horizonal finitude of the Bitcoin money supply necessitates an eventual transition to payments based on transaction fees. Well-organized incentives also fulfill a security function, by motivating potential attackers to support rather than subvert the network.

§3.27 — Blocks can be compressed to economize on memory demand by pruning Merkle Trees. Moore’s Law is invoked as a realistic projection of exponential decline in digital memory price over time, moderating the requirement for information parsimony.
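Merkle hashing can be sketched briefly (simplified: Bitcoin hashes binary transaction data twice with SHA-256, where this illustration hashes strings once; the last hash is duplicated on odd levels, as in Bitcoin):

```python
import hashlib

def merkle_root(leaves: list) -> str:
    """Pairwise-hash transaction hashes up to a single root.

    Once the root is fixed in a block header, interior branches below
    spent transactions can be pruned without breaking verifiability.
    """
    level = [hashlib.sha256(tx.encode()).hexdigest() for tx in leaves]
    while len(level) > 1:
        if len(level) % 2:          # duplicate the last hash on odd levels
            level.append(level[-1])
        level = [hashlib.sha256((a + b).encode()).hexdigest()
                 for a, b in zip(level[::2], level[1::2])]
    return level[0]
```

Since only the root is committed in the header, memory demand grows with the tree's depth rather than its breadth once old branches are stubbed off.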

§3.28 — Further economy is offered by a payment verification short-cut (involving a modest sacrifice of security in exchange for added convenience).

§3.29 — Bitcoin transactions contain multiple inputs and outputs, to facilitate the integration and disintegration of coins during transfers.

§3.291 — Bitcoin radically adjusts the structure of transaction privacy. Rather than drawing a curtain of obscurity between a transaction and the world, in the traditional fashion, it nakedly exposes the transaction to public scrutiny. The new line of concealment is drawn between the transactional agents and their off-line identities, at the precise boundary of the commercium, therefore, and no longer within it. Secure masks are proposed as the new basis of privacy protection, coinciding with the anonymity of public keys.

§3.292 — The prospect of a successful attack upon the blockchain diminishes exponentially with the addition of “honest” blocks. An attacker therefore has a window of opportunity, which closes at a rate based on the block-processing capacity of the network.
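The closing window can be quantified. On the gambler's-ruin analysis of the Bitcoin paper (section 11), the probability that an attacker commanding a fraction q of hash power ever erases a deficit of z blocks is (q/p) to the power z, with p = 1 - q (this sketch omits the paper's further Poisson weighting for the confirmation-wait scenario):

```python
def attacker_success_probability(q: float, z: int) -> float:
    """Chance an attacker with hash-power share q overtakes the honest
    chain from z blocks behind (gambler's-ruin bound, Bitcoin paper s.11).

    For q < p the probability decays exponentially in z, which is why a
    handful of confirmations suffices in practice.
    """
    p = 1.0 - q
    if q >= p:
        return 1.0  # a majority attacker always catches up eventually
    return (q / p) ** z
```

At q = 0.1, six confirmations already push the attacker's chances below one in half a million.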

§3.293 — The conclusion, summarizing the entire argument, is a masterpiece of lucid intelligence. (It is reproduced in its entirety in Appendix-1.)

* Adam Back’s Hashcash system (1997) provides the model. The use of a proof-of-work test – earning a Hashcash stamp – to eliminate spam by pre-emptive vetting of costless messages contributes a solution of equal efficacy against DoS (denial-of-service) attacks. See subsequent discussion in this chapter.

Crypto-Current (031)

§3.1 — In its attachment to the principle of pure economic theory, fastidiously intolerant of even nominal political compromise, Bitcoin is an experiment in Austrianism. When allowance is made for its abstraction from metallic coinage, it is Mises as operational code. While the fact that Bitcoin is happening is radically novel, necessarily, because it can only now take place – in the age of public key cryptography and proof-of-work credentials – what is happening is not new at all, or at least, the monetary model that Bitcoin implements in software is not. In the words of Pierre Rochard: “As Bitcoin adoption increases we will finally be able to ‘empirically validate’ what Austrians have been arguing for decades: 100% reserve banking with a scarce medium of exchange prevents speculative manias, financial crises, and economic depressions”.*

§3.11 — Yet, while the offense to hard-money economic philosophies presented by inflationary fiat currency – which has nourished Austrian criticism since the 1930s – continues to feed support into the Bitcoin project, its central role has been displaced by, and subsumed into, the formulation of the DSP. Discretionary state money-printing is only one special case of the far more general economic incredibility of signs. Technological, rather than political-economic dynamics, have played the decisive role in bringing this problem to its point of productive crisis.

§3.12 — Even if digital dematerialization is only ever an approximation, its economic consequences are concrete, and drastic. Since the ‘materiality’ of any product tends to operate inertially, dampening proliferation, the attenuation of materiality corresponds to a process of acceleration. Exponential decline in information costs, as captured by Moore’s Law,** implies informational explosion. The trend corresponds to a second (and numerically tractable) sense of the ‘Californian Ideology’ war-cry: “information wants to be free”.*** If the concept of ‘liberty’ is irreducibly hazy and controversial, while also prone to irresolvable metaphysical complications, that of cost suppression is definite and quite precisely accountable. Evidently, the preservation of scarcity under conditions of digital instantiation is a peculiar challenge, for the obvious reason that electronics enables the replication of perfect copies at near-zero cost. Prior to the theorization of this problem in monetary terms, it had been noisily exhibited by disputes over the digital ‘piracy’ of media products, corresponding to an unprecedented practical crisis in the regime of intellectual property.

§3.13 — The final (or near-final) subtraction of substantial expense from money production is conceptually clarifying. It prompts – or sharpens – the demand for a solution to the central problem that has haunted money since its beginnings. Once the proliferation of signs is freed from all serious inhibition, semiotic tokens of scarcity are catapulted into a climax state of vulnerability, and the DSP is exposed with unprecedented starkness.**** It is here, at the furthest antipodes from metallic commodity money, that a peculiar folding – into simulation – restores the gold model to a central position in monetary theory, and, more consequentially, money production.***** It is precisely because Bitcoin no longer represents gold, however indirectly, that it is able to simulate gold, with such extreme (abstract) fidelity that it can be said – persuasively – to exceed gold in its most relevant monetary features, including even that of scarcity, alongside communicability, divisibility, and verifiability. As a simulation, Bitcoin necessarily produces an artificial substantiality in the course of its solution to the DSP, and ultimately as its solution. The critique of duplicity is indistinguishable from an ontological experiment.

§3.14 — The DSP originates from a ‘fact’ so basic that it crosses from the order of (empirical) actuality into that of (transcendental) principle: Signs are cheap. To substitute a sign for a thing, a signification for a demonstration, is an economization. It is commonly said ‘that is easy to say’, and – relatively speaking – it is. At the first-order level of cynical amorality – or of pure game-theoretic rationality – it pays to break promises, which cost so little to make, and yet may be arbitrarily expensive to keep. This alone suffices to suggest why there cannot be signs without an implicit problem of trust. The consequences are double-edged. Economization of any kind – getting the same for less, or more for less – is positively adaptive (or selectively promoted) to such an extent that evolutionary processes are indistinguishable from the formation and transformation of codes. Inherent to the economy of code, however, is a vulnerability to exploitative messages, which seize upon the exorbitant efficiency of the sign as a resource (or meta-resource) to be appropriated. Genetic code invites virus. Zoosemiotics invite mimicry.****** Linguistic expression invites deceit. Money invites the DSP. The sign is co-emergent with duplicity.

§3.15 — Bitcoin’s solution to the DSP is the blockchain, or ‘public ledger’ – a decentralized record of transactions which selects-out all non-original (or duplicitous) payments. Only the first instance of any bitcoin deduction from an account is validated, and preserved. All duplicate payments – cases of double spending – are edited out of the blockchained reality-record, automatically, through rejection of those inconsistent blocks in which such defects occur. Simply by protecting itself against splits – or forks – the blockchain constitutes a consistent plane of Being, upon which any particular being can be what it is, and nothing else instead, or besides. Positive absence of duplicity is thus an efficient ontological criterion, or selective principle. The blockchain is pre-determined to construct reality in such a way that fraudulence will not have taken place. That alone remains real which is consistent with the integrity of identity-money, or potential value.******* Only the non-duplicitous will have really occurred, as perpetually re-evidenced by the synthetic past that is reproduced on the blockchain, as a consistent artificial memory, endorsed by Nakamoto Consensus, beyond which no superior tribunal can in reality exist. The blockchain is demonstrably capable of making itself real. In this way it departs from all merely conceptual or ideological assertions of ontological grounding, while implicitly dispensing with the political superstructures through which such assertions are concretely propagated. The reality criterion it introduces takes the form of an automatic – which is to say non-negotiable – law. The force of this law is derived from what can be, rather than – directly – from what is, or what ought to be. There is no double spending on the blockchain because there cannot be.********
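The first-spend-wins criterion reduces to a set-membership test (a deliberately minimal sketch, filtering at transaction granularity, where Bitcoin proper rejects any whole block containing a duplicate; all names are illustrative):

```python
def validate_block(spent: set, block: list) -> list:
    """Accept only first spends; duplicates never enter the ledger.

    `spent` accumulates every coin already deducted. A transaction reusing
    one is duplicitous, and is edited out of the reality-record.
    """
    accepted = []
    for coin_id, payee in block:
        if coin_id not in spent:
            spent.add(coin_id)
            accepted.append((coin_id, payee))
    return accepted
```

The filter never adjudicates intentions; it simply makes the second spend impossible, which is the sense in which the law is automatic rather than normative.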

* See.

** While, strictly, Moore’s Law (initially proposed in 1965) concerns only transistor-density, it has come to serve as a general proxy for exponential trends in technology, and especially in electronics. The centrality of integrated circuits to the entire info-tech ecology ensures that Moore’s Law, even in its narrowest sense, projects a development curve of huge – and expanding – scope. In large part due to this, it is a predictive principle that lends itself to abstraction and generalization. (Ray Kurzweil’s ‘Law of Accelerating Returns’ or ‘LOAR’ is exemplary in this regard.) Under the name of Moore’s Law, the self-exciting circuit is established as the central model of techonomic process. It thus provides a kind of theoretical shorthand, enabling the widespread promotion of schematics for an ultramodernist meta-sociology, based upon the doubling-period, with accelerating variation as the sole constant. The nonlinearities propelling it include its own feedback into the processes it describes, as a ‘road-map’ – or, more accurately, a schedule – setting the pace of improvement in relevant technological specifications. Exponential technological improvement is normalized, and accepted as a benchmark. Acknowledgement of the trend becomes a causal factor in its own perpetuation. (Theory-practice orthogonalism is diagonalized.) In its loosest invocation, it corresponds approximately to run-away techno-commercial deflation. Macroeconomic capture of industrial deflation is the principal political-economic story of the Keynesian epoch. Capitalistic surplus is ‘nationalized’ through currency issuance. The imperative to ‘fight deflation’ – inspired by Great Depression mythology – lends this process of systematic appropriation a perverse moral dignity. Automatic valorization of money – through capital (or ‘total factor productivity’) improvement – is compensated by centralized monetary management, benchmarked to price stability. 
Within this epoch, therefore, Moore’s Law describes a process of systematic economic expropriation, by monetary authorities, of those gains from advances in industrial productivity that would otherwise be distributed spontaneously to consumers (by falling prices, i.e. deflation). Electronic money reverses this tendency.
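The 'doubling-period' invoked above reduces to a single piece of arithmetic. A minimal sketch (function name and figures are illustrative assumptions, not drawn from the text):

```python
# Moore's Law as a proxy for exponential techonomic trend-lines:
# quantity after a given span, doubling every `doubling_period` years.

def projected(initial, years, doubling_period=2.0):
    """Project exponential growth with a fixed doubling period."""
    return initial * 2 ** (years / doubling_period)

# Ten years at a two-year doubling period multiplies the base by 32:
assert projected(1000, 10) == 32000.0
```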

*** According to Wikipedia, the slogan is attributable to Stewart Brand, uttered in a remark at the first Hackers Conference, in 1984. Whatever the utopian suggestion that might have been heard in this slogan, it would eventually be drowned out by the dark counterclaim: It is the destiny of any open near-zero-cost communication system to be spammed into dysfunction.

**** The commercial value of any transaction depends upon its exclusivity, which opens directly into questions of identity. The idea of a ‘digital signature’ – a very closely-related pseudo-paradox – binds identity and value to the suppression of fraudulent duplication. To repeat Satoshi Nakamoto’s critical formulation (Bitcoin #2): “We define an electronic coin as a chain of digital signatures.”
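Nakamoto's quoted definition can be rendered as a minimal sketch. A SHA-256 digest here stands in for a real asymmetric signature (an assumption made for brevity; Bitcoin uses ECDSA): each transfer commits to the entire prior chain plus the next owner's identity.

```python
# "We define an electronic coin as a chain of digital signatures."
# A hash stands in for a real signature in this illustrative sketch.

import hashlib

def transfer(chain, next_owner):
    """Append one link: a digest over the previous link and the new owner."""
    prev = chain[-1] if chain else "genesis"
    link = hashlib.sha256(f"{prev}:{next_owner}".encode()).hexdigest()
    return chain + [link]

coin = []
coin = transfer(coin, "alice")
coin = transfer(coin, "bob")
# Tampering with any earlier link invalidates every later one:
# identity and value are bound to the suppression of duplication.
assert len(coin) == 2 and coin[0] != coin[1]
```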

***** In the words of the Bitcoin paper (#6): “The steady addition of a constant of amount of new coins is analogous to gold miners expending resources to add gold to circulation. In our case, it is CPU time and electricity that is expended.”
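The expenditure Nakamoto describes can be sketched as a toy proof-of-work loop (names and difficulty figures are illustrative assumptions): 'mining' burns CPU time searching for a nonce whose block hash falls below a target, the electronic analogue of expending resources to add gold to circulation.

```python
# Illustrative proof-of-work sketch: work is expended finding the nonce,
# while verifying the result costs only a single hash.

import hashlib

def mine(block_data, difficulty_bits=16):
    """Search for a nonce whose SHA-256 block hash is below the target."""
    target = 2 ** (256 - difficulty_bits)   # smaller target = more work
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

nonce = mine("block-1")
# Verification is cheap: one hash suffices to check the expended work.
check = hashlib.sha256(f"block-1:{nonce}".encode()).digest()
assert int.from_bytes(check, "big") < 2 ** 240
```

The asymmetry between search and verification is the point: cost is demonstrated, not merely asserted.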

****** Among the most striking examples of specifically zoosemiotic parasitism are instances of Batesian mimicry (named after the naturalist Henry Walter Bates, 1825-92). Typically, these involve the adoption by a non-toxic species of markings indicating toxicity, and thus an evolutionary strategy of free-riding upon acquired, and broadcast, unpalatability. Bates discovered the phenomenon after noticing the remarkable similarity of coloration in certain non-related butterfly species. The semiotic convergence, he theorized, was driven by adaptive imitation. Signs ‘backed’ by poisons were easy to imitate, and thus allowed species to advantage themselves of the signal, while economizing on the original bio-chemical ‘message’. Such fraudulence, naturally, has its costs to the original, toxic species, which now find the signal communicated by their markings diluted. A process of semiotic inflation begins to work itself out.

******* The language of ‘potential’ is rejected in the name of contingency by a recent variety of transcendental philosophy associated in particular with Quentin Meillassoux and (in its financial application) Elie Ayache. For these thinkers, the projection of possibilities – or probabilities – is itself a transcendent illusion, constituting a metaphysics that is subject to critique. We are unable to follow Ayache into an employment of critique that ventures without discernible hesitation into the hyperbolic, in that it construes market pricing as simply incalculable (and even, on the inverse face – where it is theoretically captured as a stroke of ‘writing’ – as something close to a divine power). Pricing discovers nothing within the Ayache account, unless its own status as a truly sovereign decision, coincident with the genesis of being (the ‘event’). ‘Potential’ is used here in its physical sense – potential energy and ‘potential difference’ (voltage) – which is to say, real tension, or capacity (for work). Insofar as the concept of disequilibrium is ‘flattened’ by that of contingency, the consequence is massive information destruction. Potentials exist (virtually) prior to their probabilistic formalization. They are not epistemological productions. Followers of Elie Ayache, who can be expected to balk at this modal vocabulary, are also unlikely to find their concerns assuaged by the mere assertion that it is only derivatively related to probabilistic models, while primarily referring to something else entirely, namely free energy, or productive capability, as designated (and quantified) prior to its actual employment or consumption. Statistical mechanics – even in its abstract conception and its far-from-equilibrium application – provides the bridge between the science of probabilities and the capacity to do work. Crudely stated, abstract industrialism is here counter-posed to hyperbolic financialism, under the (post-duplicitous) sign of the machine. 
The industrialization of money, driven by Bitcoin, demonstrates a deep teleology very different to that manifested in the evolution of financial assets through ever higher sublimities of derivation.

******** Just as, for Kant, the causal consistency of nature is a matter of transcendental necessity rather than empirical fact, so the absence of double spending on the blockchain ‘follows’ inevitably from what the blockchain is. To understand the blockchain is already to know (as a matter of transcendental principle) that the DSP is thereby resolved.