Crypto-Current (041)

§4.2 — War games are built into the fabric of the Internet. This is at once a matter of uncontested genealogy, and of an as-yet only very partially explored transcendental-strategic landscape.[1] As we have seen, according to one (comparatively mathematicized) formal meta-description, Bitcoin arose as the solution to such a game – the Byzantine Generals’ Problem. This immediate context is so closely tied to the achievement of the Bitcoin Protocol, by those most closely associated with its formulation, that it has been widely adopted as a definition.[2] Yet even if the solution to Byzantine coordination establishes the game theoretical significance of Bitcoin, it does not exhaust it, even remotely.

§4.21 — Bitcoin is both less than, and more than, a mathematical theorem, because it remains a game in process, and also a meta-game. There is an irreducible informality to Nakamoto Consensus, insofar as it remains open, or unsettled, at multiple levels. As a concrete procedure, it effectively invokes a sociotechnical process of uncertain destiny within its demonstration, making it ill-suited to the purposes of mathematical proof.[3] If the mining procedure – rather than the reward criterion – could be fully specified in advance, and thus support predictive deductions, it would do no work. Incentivization – in every case – presumes non-deducible outcomes. Bitcoin, like all incentive systems, is a synthesizer. It produces a social process, as an event, and an arena (or agora), and thus advances experimental game theory, through an artificial environment especially conducive to the emergence of spontaneous (‘trustless’) coordination. Concretely, this space is a hothouse for business innovation, which constitutes the leading – and perhaps still ‘bleeding’ – edge of microeconomics, where generalized theory and practical enterprise have yet to dissociate. The boundaries of the Protocol, while strictly defined in certain respects, are profoundly unsettled in many others, and there is no strongly economical way to settle them. ‘Where does it end?’ is a question that has to be explored historically, without conceptual short-cuts, by an irreducible synthetic process. It is thus roughly modeled by the Bitcoin mining procedure, where the ineluctable necessity of trial-and-error – or uncompressible method – precludes all possibility of rapid philosophical (i.e. purely conceptual) resolution. Bitcoin is a game, and is like history, in that it cannot be worked out without being actually played – or hashed.

§4.22 — Real games are far-from-equilibrium processes that approach formality without actualizing it. They consume freedom – by contracting discretion – with every move that is made, and prolong themselves by reproducing it, in a circuit. Only insofar as this holds do they include incentives, as an irreducible teleological element. The open-ended mechanization of purposes is the diagonal along which they proceed. When apprehended at sufficient scale, this process is equivalent to industrialization. With the arrival of Bitcoin, money is – for the first time – subsumed into industrial revolution. A great historical circuit is cybernetically closed (which does not mean finished, but something closer to the opposite, i.e. initiated). Techonomic fusion – the singularity guiding modernity’s convergent wave – can for the first time be retrospectively identified. On Halloween 2008, the end began. What modernity has been from the start was then sealed.

§4.23 — Friedrich Nietzsche’s On the Genealogy of Morals dedicates itself to describing how man became “an animal with the right to make promises”. The story has turned out to be even longer and more intricate than his work anticipated, but the quasi-paradox there explored, knotted into the concept of debt, retains its pertinence into our time. How is a free commitment possible? Bitcoin attends explicitly to the same problem. “Transactions that are computationally impractical to reverse” – of the kind Bitcoin facilitates – constitute voluntarily-adopted mechanized commitments, immunized against all vicissitudes of will. Since algorithmic irreversibility enables an inability (or disables an ability), there is much here that seems self-contradictory upon superficial consideration.[4] Yet such a facility – or, indeed, power – of self-limitation is already fully implicit in the word ‘bond’, and in any serious sense of commitment. A contract is an expenditure of liberty. The motto on the coat of arms of the London Stock Exchange, Dictum Meum Pactum (‘My Word is My Bond’), extends the principle – by etymological suggestion – to the most elementary cases of formalized social association (‘pacts’). Society is a game, which arises from its ragged edges. The deal describes the frontier.

§4.24 — During a ‘Fireside Chat’ on ‘Bitcoin and the Future of Payments Technology’,[5] Larry Summers makes exactly the same point:

This is an area that I think is rich with irony. … the single most important development in the history of the common law corporation was when the legal principle that it could be sued was established. And you might ask: why was it good to be sued? Well, because if you can’t be sued you can’t enter into a binding contract, and only when you could enter into a binding contract could you carry on commerce in a major way.

§4.25 — Bitcoin subtracts the option to defect (or double spend). The protocol sets the rules of a new game, in which the violation of contract ceases to be a permissible ‘move’. By automatizing this constraint, and thus withdrawing it simultaneously from the realms of contractual agency and regulatory oversight, Bitcoin instantiates algorithmic governance in its own, specific domain. Human discretion is displaced to the boundary of the Bitcoin commercium, and into the zones of meta-decision (for economic agents and authorities respectively) whether to enter or permit Bitcoin. These dilemmas introduce a knot of complex and typically highly-recursive games that can be grouped under the umbrella term ‘Bitcoin politics’.

[1] A ‘transcendental-strategic landscape’ – constituted by an absence of transcendent legality – corresponds to the concrete problem of anarchism, in the sense this term is understood by realist international relations theory (IRT), and realist strategic analyses more widely. That is to say, it poses issues of security without any possibility of appeal to superordinate authorities (or authoritative referees). Hobbesian political theory, in which “the war of all against all” is exposed by a secular ‘Death of God’, establishes itself upon a negative foundation. Leviathan begins from that which cannot be relied upon. Whether domestically, or internationally, the transcendental (i.e. ultimate) theater in which powers meet is defined by the subtraction of any original commanding unity. Security is thus theoretically constituted as a problem, corresponding to a primordial lacuna. Since it is not given, it has to be positively produced, and it is in the identification of this practical conundrum that IRT isolates its proper object of study. On the Internet, as in the international arena, it is only upon such a cleared, immanent plane, that a true game can take place. It cannot be sufficiently stressed that the conflictual field is not – as its critics have over the centuries necessarily insisted – a positive presupposition, but rather a mere default, assuming only original diversity under the conditions of an absent integral authority. Despite its manifest tendency to decay into a Utopian projection, the perpetually-regenerated credibility of anarchism is founded not upon its transcendent aspiration, but upon its transcendental problematic. Given only war, how is coordination possible?

[2] While the definition of Bitcoin as a solution to the Byzantine Generals’ Problem remains controversial, the principal objections to this description can reasonably be described as arcane. As Oleg Andreev notes, in a brief but valuable discussion of the proof-of-work solution, any actual production of communications integrity is compromised in its logical purity by practical limits (bounded by cryptographic intractability). In other words, precisely because it is transcendental, Nakamoto Consensus cannot be transcended even by its own proof. The limit is set by the working machine. This is a matter of extreme generality. While persistently – and even essentially – tempted by Platonism as a heuristic, mathematical procedures require instantiations which are transcended only in conceptual principle, which is to say: hypostatically, through appeal to transcendent grounds whose authority is purely ceremonial. Compelling demonstration already returns the problem to the machine, and its real potentials for effective execution. Operationalizations are not, in reality, superseded, or subordinated, but only (at most, and typically) bracketed, or abbreviated, and thus – again, in reality – assumed. The credibility of the Idea refers to potential demonstration. The keystone of proof says nothing else. Untested trust is an oxymoron. It would be a grave error – though an all-too common one – to seek an epistemological demotion of ‘credibility’ to the psychological category of ‘mere opinion’ while admitting this. Credibility is basic. Without it, no truth has been earned. This is the meaning of deduction in its critical and realistic sense. What lies beyond is metaphysics, enthroned upon arbitrary assertion. Irrespective of any extravagantly-promised protections, there is no confidence – no security – to be derived from that. However much Bitcoin has to appear as an Idea, therefore, it is irreducible to one. 
It cannot be expected that this stubborn factuality is susceptible to comprehensive dissolution into the form of the concept, still less that it will be fully factored into a security analysis. On the contrary, realism predicts its chronic idealization (i.e. misidentification). In this respect, philosophy is a security hole (proposing answers in place of solutions, or dispelling threats only in ideality), if not – in its institutional form – a particularly serious one. … Since insecurity has no adequate idea, it cannot be speculatively resolved. This point of elementary realism calibrates the appropriate level for confidence in philosophy (and does so in actuality, not only in principle). Philosophy is not seriously entrusted with keeping anything safe. Its invitation to live dangerously is – in this respect – a sensible concession to the inevitable. The untested or – still worse – untestable model need not be about danger to be dangerous. Armchairs are places where things can go wrong without limit. … The Byzantine Generals do not secure themselves through a speculative philosophy, but through a robust procedure. Did they have a ‘good plan’ before testing it? (It could, at most, only appear so.) … Security concerns only risk, which is never merely a conceptual possibility, but always a matter of discovery. The fact that Bitcoin appears to be a ‘sound idea’ is not finally separable from its concretely-elaborated existence as the most rigorously-tested trust mechanism in the history of the earth. …

Ian Grigg argues that the classic coordination problem has been displaced, into the far more protean quandaries of a ‘dynamic membership set’. Critical Bitcoin security challenges, most specifically that of the Sybil attack (based upon identity proliferation), entirely exceed the horizon of the BGP. “If Bitcoin solved the Byzantine Generals Problem, it did it by shifting the goal posts.”
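
Grigg’s point about the dynamic membership set can be made concrete with a toy comparison (the numbers are purely illustrative, and this is a sketch of the incentive logic, not a protocol model): under one-identity-one-vote, identities cost nothing to mint, so a majority costs nothing to fake; under work-weighted voting of the Nakamoto kind, the attacker’s share is capped by hashpower, however many identities it is spread across.

```python
# Toy Sybil scenario (illustrative numbers, not a protocol simulation).
honest_identities = ["h1", "h2", "h3"]              # one identity each
sybil_identities = [f"sybil{i}" for i in range(5)]  # minted at no cost

# One-identity-one-vote: the attacker simply outnumbers the honest set.
attacker_wins_by_headcount = len(sybil_identities) > len(honest_identities)

# Work-weighted voting: identities drop out of the count; only hashpower matters.
honest_hashpower = 70
attacker_hashpower = 30  # unchanged by identity proliferation
attacker_share = attacker_hashpower / (honest_hashpower + attacker_hashpower)
attacker_wins_by_work = attacker_share > 0.5
```

Proof-of-work does not answer the Sybil problem so much as dissolve it, by making the voting weight a physical cost rather than a nominal identity.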

[3] A machine with integral incentives necessarily combines formal – or formalizable – and informal elements. To a still-imperfect approximation, but with definite teleological inclination, Bitcoin is politically closed, while commercially and industrially open. In this respect it echoes – and even escalates – the ideal of the arch-liberal (capitalist) social order. The mining objective is exactly specified. The criterion for mining success, compensated automatically in bitcoins, is a hash of the current (problem) block – generated by varying a nonce – that begins with a definite number of zeroes (a figure adjusted for difficulty). Despite this extreme formality, the mining procedure involves both chance and – more importantly – innovation. Bitcoin hashing is formally constrained to trial-and-error methods, with probabilistic outcome. In the words of the Bitcoin wiki: “The probability of calculating a hash that starts with many zeros is very low, therefore many attempts must be made.” Everything beyond the product specifications (the puzzle solution) is left open. In particular, the production techniques are left undetermined, and thus open to industrial innovation. See:
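
The trial-and-error structure of the procedure can be sketched in a few lines of Python. This is a toy, not the Bitcoin implementation: it applies a single SHA-256 pass to arbitrary bytes and counts leading zero hex digits, where Bitcoin double-hashes a structured block header against a full numerical difficulty target.

```python
import hashlib

def mine(block_data: bytes, zero_digits: int) -> int:
    """Search nonces until the block hash begins with the required number
    of zero hex digits. The criterion specifies only the product; the
    search itself is bare trial and error."""
    target = "0" * zero_digits
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

nonce = mine(b"example block", 4)
digest = hashlib.sha256(b"example block" + str(nonce).encode()).hexdigest()
```

Nothing in the criterion dictates how nonces are to be enumerated, ordered, or parallelized; everything upstream of the final hash check is exactly the space left open to industrial innovation.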

Similarly, and even more markedly, the commercial opportunities opened by the protocol are uncircumscribed. The ‘value’ of the Bitcoin currency, in the broadest sense, is settled dynamically outside the blockchain, through a radically decentralized and uncomputably complex dynamic of exchange. (The exchange process – catallaxy – is the computation.) The protocol sets the total stock of bitcoins, without predetermining their distribution (between agents) or price (when denominated in any other financial medium). The value of the currency cannot be derived from the rules determining its quantity. It is synthetic. Bitcoin’s productivity lies in what it leaves open, even as its integrity is secured by what it closes.

[4] Self-binding is a classical problem, epitomized by the strategy adopted by Odysseus in his passage past the Sirens. Anticipating an irresistible seduction, he commits to a decision which he then – by crude socio-technical means – renders irreversible. Within game theory, the same problem is a central preoccupation. It is admirably summarized by Scott Alexander: “… it sounds weird to insist on a right to waive your rights. Isn’t that more of an anti-right, so to speak? But … read your Schelling. In multiplayer games, the ability to limit your options can provide a decisive advantage. If you’re playing Chicken, the winning strategy is to conspicuously break your steering wheel so your opponent knows you can’t turn even if you want to. If you’re playing global thermonuclear war, the winning strategy is to conspicuously remove your ability not to retaliate, using something like the Dead Hand system. Waiving your right to steer, waiving your right not to nuke, these are winning strategies; whoever can’t do them has been artificially handicapped.”
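
The Schelling logic quoted here can be checked mechanically in a toy payoff table for Chicken (the payoff numbers are illustrative; the game is symmetric, so a single best-response function serves either player):

```python
# Chicken: strategies 0 = swerve, 1 = straight; entries are (row, column) payoffs.
PAYOFFS = {
    (0, 0): (0, 0),       # both swerve
    (0, 1): (-1, 1),      # I swerve, opponent wins
    (1, 0): (1, -1),      # I win
    (1, 1): (-10, -10),   # crash
}

def best_response(opponent_move: int, my_options: set) -> int:
    """Choose the available move maximizing own payoff (row perspective;
    by symmetry this serves either player)."""
    return max(my_options, key=lambda m: PAYOFFS[(m, opponent_move)][0])

# Facing a player visibly committed to 'straight', the flexible player swerves:
flexible = best_response(1, {0, 1})           # -> 0 (swerve)
# The wheel-breaker, reduced to the single option 'straight', collects the win:
committed_payoff = PAYOFFS[(1, flexible)][0]  # -> 1
```

Deleting the ‘swerve’ option converts a symmetric standoff into a victory for the committed player, which is precisely the advantage the quoted passage attributes to conspicuous self-binding.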

[5] The quote is extracted from this video record:

Crypto-Current (040)

§4.1 — Explicitly acknowledged ‘network problems’ long pre-exist the electronic era, by centuries, if not millennia. They constitute a guiding thread within what has been called ‘the tradition of spontaneous order’.[1] Order is spontaneous if it solves a coordination problem without appeal to any organizational element at a higher – or super-social – level. Spontaneous order, social self-organization, or immanent social process, is thus counter-posed to transcendent social design (from above, or beyond), and to its corresponding theoretical justifications, which amount to a social metaphysics, typically serving concrete functions as guiding ideologies of super-ordinate control. As long recognized by its opponents, the assertive recognition of this theoretical conundrum cannot be practically dissociated from an implicit political stance.[2]

§4.11 — In the absence of superior guidance, solutions to coordination problems have to emerge, out of – and as – games. This is only to say that cooperation between the agents involved cannot be presupposed, but has to arise from their interaction, if it is – indeed – to appear at all. To assume altruism or solidarity in such cases is a failure to think the problem at all. Coordination is the explanandum. The collectivist dogma is not an answer, therefore, but an alternative to the question. An answer is a trust machine, for which Bitcoin is the model. It is strictly complementary to a minimal presupposition, that of trustlessness (for which it is the solution).

§4.12 — Byzantine generalization extends to the very limit of network communication problems, to the difficulties of establishing coordination within radically dispersed (and thus zero-trust) multiplicities, encompassing non-idealized societies / populations of every variety. It is not, of course, that the concrete existence of trust is simply denied, only that it is rigorously thematized, as a social product requiring explanation – and what counts as an ‘explanation’ cannot, whether overtly or covertly, merely presuppose what it is called upon to explain. (This demand, as we have seen, is already totally – even ultimately – controversial.) If there is trust, there has to be a trust engine, conceived without pre-existent bonds of trust as a part. At the most abstract level, therefore, this is a topic that would have been familiar to the thinkers of the Scottish Enlightenment, as to all those participating productively within the theoretical tradition of spontaneous order. It is exactly this same problem of decentralized coordination (in the absence of any transcendent contribution provided by assumed altruism or common purpose) that has been the essential guideline for realistic social analysis within the ‘whig’ lineage of descriptive liberalism, exemplified most famously by Adam Smith’s figure of the ‘invisible hand’.[3]

§4.13 — Evolutionary biology, as a science of emergent networks, has engaged very similar problems from its inception, often with identical tools. This is especially evident in the modeling of ecological equilibrium within large, long-term biological system dynamics, in which the absorption of extinctions defines the ‘true network’ (by the absence of indispensable nodes). A more recent attempt to formalize such coordination is found in the game theory of John von Neumann, which has itself been effectively applied to biological networks at a variety of scales.[4] The latest – and still rapidly self-transforming – incarnation of this tradition can be seen in the science of ‘complex adaptive systems’ as exemplified by the research programs of the Santa Fe Institute.[5] In each case, the defining theoretical object is emergent coordination, in which no appeal to any centralized, superordinate, or orchestrating principle is made, unless this can be identified with the system itself, as such. The target of such researches is transcendental order, as captured by the immanent rules of distribution (which are ultimately indistinguishable from the real correlate of mathematics).

§4.14 — Games, strictly understood, therefore, arise under the minimalistic assumptions tolerable to an analytical anarchism,[6] that is: after the methodical subtraction of all presumed coordination, or the conversion of such presuppositions into formal theoretical problems. The implicit critical impulse driving the construction of such research programs is evident. That which might have been asserted, as a transcendent principle, or metaphysical dogma, is to be instead explained, as an immanent, emergent, or ‘evolutionary’ outcome. Whether explicitly understood in such terms, or not, every such enterprise is a regional application of critical philosophy. Spontaneous order is the correlate of critique. The solution, however, cannot correspond to a philosophical thesis (of any traditional type), or even to a ‘machinic proposition’ – an engineering diagram, simulation, or protocol – but can only emerge as the synthetic product of such a proposition, when executed. The notion that coordination problems (of significant complexity) can be anticipated in detail by a process of pure ratiocination is a philosophical disease, of recognizably pre-critical type. The idea of the diagonal is not the diagonal.

§4.15 — If the discovery of spontaneous order as a problem corresponds to the execution of critique, it can be formalized through a diagonalization matrix. When the pre-critical opposition of centralized coordination to uncoordinated dispersion is tabulated, it draws a graph isomorphic with the Kantian schema. At the level of the abstract formalism, this latter is echoed with such extreme fidelity that we can borrow from it freely, while switching its terms through a simple hash. Substitute centralization for analysis, decentralization for synthesis, order (coordination) for the a priori, and disorder for the a posteriori. As with the Kantian original, the first diagonal term fails – there is no centralized disorder, any more than there are analytic a posteriori judgments. Centralization is an intrinsically anomalous distribution, necessarily threatened by the prospect of a fall into disorder. Its complementary conception, ‘simple anarchy’, is no less invulnerable to theoretical dismissal. The previously acknowledged terms, centralized order, and decentralized disorder (like the analytic a priori, and synthetic a posteriori) are therefore preserved. Common sense is not abolished, at least, not initially. In the exact formal place of Kant’s invention/discovery of the synthetic a priori, the critique of coordination, too, generates a viable diagonal product – decentralized order. (Quod erat demonstrandum). This is the Great Oblique worked by all realistic social theory since the inception of the modern epoch.

§4.16 — In Cyberspace, the tradition of spontaneous order has been massively accelerated. Classical coordination problems have been reformulated as experimental fields, opened for exploration by the emergence of increasingly-powerful computer simulation tools, and consolidated as practical solutions through the implementation of cryptographic systems and P2P platforms. Philosophical reflection has, to a very considerable extent, been side-lined by technical applications reinforced by far superior criteria of evidence, which is to say: practical demonstration. In the age of electronic information technology, networks can be tested as products, and it is possible to ask of a complex idealization, as never before, does it work? This transition, through computer simulation, from explanation to implementation, registers a process of technological envelopment without definite limits. It corresponds to an interminable tide of disintermediation that established institutions of intellectual authority have not yet begun to fear enough.

§4.17 — While institutionalized philosophy has tended to lag the network trend, rather than pioneer it, something that might reasonably be called ‘abstract network theory’ has nevertheless arisen to become a guiding philosophical theme since the mid-20th century. Among those modes of philosophical writing that have been most influential upon the wider humanities and social sciences, this attention to the distinctive characteristics of networks has been especially notable.[7] Appropriately enough, this ‘discourse’ – or dispersed conversation – has no uncontroversial center of authority, stable common terminology, shared principles, or consistent ideology. Its rhetoric tends to systematically confuse advocacy with criticism, and both with analysis, as it redistributes partial insight around a number of relatively tightly-interlocking social circuits (which are overwhelmingly dominated by a familiar set of theoretically-superfluous moral-political axioms).[8] Yet, however deeply regrettable this concrete situation might be considered from the perspective of austere theory, it cannot be simply wished away. Intellectual production itself occurs within networks, and those with greatest promotional capability are among those least optimized for pure intellection.

§4.18 — The cultural systems in which the philosophical (and sub-philosophical) formalization of radically decentralized – or ‘true’ – networks has emerged, through a multi-decade self-reflexive process, are eccentrically articulated, very partially self-aware, and only weakly integrated. Yet even in these noisy and deeply compromised circles, cross-cut by vociferous extraneous agendas, and subjected only very weakly to a hard reality criterion, convergence upon the rigorous conception of a model network has been inexorable. An especially influential example is the rhizome of Gilles Deleuze and Félix Guattari, which provides philosophy with its most rigorously-systematized account of acentric and anorganic order.[9]

§4.19 — A network, in this sense, has no indispensable nodes, entitling it to the adjective ‘robust’. Once again, the Internet – at least in its idealized conception – exemplifies such a system. It is typical, at this point, to recall the origins of this ‘network of networks’ in a military communications project (ARPANET), guided by the imperative of ‘survivability’ realized through radical decentralization.[10] As will be repeatedly noted, on the crypto-current the security criterion is primary, providing system dispersion with its principle. This suggests, as an ideological generalization, that there is no basic ‘trade-off’ between liberty and security. Rather, it is through security (alone) that liberty establishes its reality. The term “crypto-anarchy” condenses this conclusion into a program.

§4.191 — Such invocations of the strategic investment in distribution and redundancy, however predictable, remain conceptually productive. They are especially valuable as a corrective to modes of discourse – typical among contemporary humanistic studies – which tend to haze the harsh selective or eliminative function of critique into a vapid metaphor. It is the military ancestry of the Internet that is tacitly referenced in the celebrated maxim of John Gilmore (1993): “The Net interprets censorship as damage and routes around it.” In order to save command-control, it was necessary to fundamentally subvert it.

[1] Norman Barry’s ‘The Tradition of Spontaneous Order’ (1982) lays out the intellectual history of the idea, within a primarily legal and economic context:

When abstracted beyond its socio-historical exemplification, and apprehended as a cosmic-philosophical principle, spontaneous order refers to immanent coordination within dispersed collectivities, making it approximately synonymous with ‘the science of multiplicities’ in general. It designates the systematicity of the system, which can be alternatively formulated as an irreducibility without transcendence, or an immanent non-locality. Whether philosophical, or colloquial, the association of spontaneity with notions of vitality is ultimately arbitrary. Spontaneity is not life, unless ‘life’ is conceived as the irreducible operation of the multiple as such.

[2] Public Choice Theory serves as a reliable proxy for minimalistic, game-theoretical axioms in their directly political application. In her book All You Can Eat: Greed, Lust and the New Capitalism (2001), Linda McQuaig cites a short fable by Amartya Sen, designed to satirize the Public Choice approach: “‘Can you direct me to the railway station?’ asks the stranger. ‘Certainly,’ says the local, pointing in the opposite direction, towards the post office, ‘and would you post this letter for me on your way?’ ‘Certainly,’ says the stranger, resolving to open it to see if it contains anything worth stealing.” Notably, in contenting itself with a satirization of disciplined moral skepticism, this line of criticism is brazenly unembarrassed about its dogmatic structure. Its rhetorical brilliance diverts from its substantive socio-historical meaning, which is the acknowledgment that solidarity is premised on the absence of a question. In other words, it demands, finally, the docile acceptance of an inarticulate presupposition. In this demand is to be found its irreparable weakness, shared by all cultural commitments – typically religious in nature – that are grievously wounded by the very fact of coming into question. A comparable defense of transcendent altruism founded upon the pre-delimitation of critique will be seen in David Graeber’s ‘everyday communism’. There, too, fatal injury coincides with the beginning of skeptical thought. Since, in this framework of collectivist moral norms, the investigation is the real crime, an intrinsic inclination to totalitarianism proves difficult to control.

There is one, and only one, coherent rejoinder available to the Left on this point. The table has to be reversed, in order for autonomous individuated agency to occupy the role of the explanandum, with its production and reproduction within the collectivity determined as the critical problem. Naturally, the production of individuation within complex adaptive systems would not satisfy this demand, since the presupposition of any such analysis is uncoordinated multiplicity. The Left is compelled to maintain that war is not ‘the father of all’. It is darkly amusing, then, that we continue to argue about it.

[3] In Smith’s most widely-cited words, taken from the Wealth of Nations, the repudiation of assumed altruism is explicit: “It is not from the benevolence of the butcher, the brewer, or the baker that we expect our dinner, but from their regard to their own self-interest. We address ourselves not to their humanity but to their self-love, and never talk to them of our own necessities, but of their advantages.” It is only through this theoretical self-restraint that the emergence of social cooperation is critically explained, rather than dogmatically asserted. It is important to note, however, that the recognition of the problem is not in itself a solution. The solution is not a philosophical reservation, but a machine. In Smith’s case, of course, it is the market mechanism, understood as a catallactic generator, or engine of spontaneous order, that represents the general (abstract) form of social solutions to novel coordination problems, in a way that remains mainstream within economics up to the present day. When grasped within this framework, it is at least tempting – if perhaps not strictly compelling – to suggest that any purported successor to the market, capable of usurping the role of the market, could only itself be a new iteration of the market. The most prevalent contemporary radical (left) alternative requires that coordination problems, rather than being solved, are instead terminally dispelled. Among those unfashionable vestiges of the Marxist tradition which emphasize the inheritance of capitalistic management practices by post-capitalistic social regimes (as exemplified by Lenin’s Taylorism), there is at least the promise of collective action solutions taking the form of legacy cultural resources, although without solid prospect of further innovation in a post-competitive milieu.

[4] The formal application of game theory to evolution dates back to the early 1970s, pioneered by John Maynard Smith and George R. Price, through the theorization of evolutionarily stable strategies (see ‘The Logic of Animal Conflict’, 1973). Since the notion of ‘strategy’ has an ineliminable telic inflection, this mode of evolutionary formalization can be philosophically-conceived as a diagonalization of biology. A video interview with Maynard Smith on the topic can be found here:

The preparatory theoretical work of W. D. Hamilton is especially notable in this regard, since it is explicitly focused on the application of Darwinian intellectual tools to subtract the metaphysical pre-supposition of altruism. The rhetorical provocation of selfishness, as seen most prominently in the Hamilton-based neo-Darwinian synthesis of Richard Dawkins, testifies to this project. The ‘selfish gene’ poses altruism as a problem, which Hamilton had already productively addressed. This is how transcendental critique reaches social biology, whose compressed name – socio-biology – was, from its inception, a political scandal. Collectivist metaphysics was not slow to recognize the profound intellectual menace – at once ancient and scientifically-renovated – that it faced.

[5] The Santa Fe Institute gathers cross-disciplinary researchers for the study of far-from-equilibrium systems and their emergent behavior, characterized by path dependency, sub-optimality, and massive informational imperfections. Of special relevance to the discussion here has been the pioneering of ‘Complexity Economics’, associated in particular with the work of Brian Arthur, which provides a disciplined critique of equilibrium models in neoclassical economics. Markets approach equilibrium only in the way that all working machines approach an entropy maximum, without this limit being ever – or even approximately – actualized.

The concrete socio-historical application of such thinking is found in Manuel DeLanda’s (2000) A Thousand Years of Nonlinear History (see bibliography).

[6] Between ‘descriptive liberalism’ and ‘analytical anarchism’ – despite the apparent terminological escalation – there is no distinction of serious theoretical significance. ‘Order out of chaos’ (or at ‘the edge of chaos’) is the consistent explanatory horizon.

[7] The most influential proximal ancestor of current network theory (by far) has been connectionism. This intellectual movement, characterized by an extreme interdisciplinarity, introduced the problems of electronic engineering and distributed information systems into the study of biological, psychological, and social phenomena. Its name was coined by neural net-theorist Donald Hebb, in the 1940s. Rising to prominence from the early 1980s, connectionism has generalized the study of neural networks (or parallel distributed processing) in suggestive coincidence with the rise of the Internet. Its research orientation has been guided by the proposition that complex adaptive behavior is better explained by the pattern of connections between simple units than by complex properties inherent within concentrated entities of any sort. At a larger scale, connectionism updates the diffuse multi-century current of empirical associationism, accommodating it to the prospects of technological actualization within electronic networks. It is thus marked by a comparative disregard for elaborate symbolic structures, and for highly-organized internal states more generally. The atomic elements of connectionist analysis are linkage modules, supporting emergent systemic behavior. Like cybernetics before it, connectionism outgrew its own disciplinary contour, and dissolved into the sophisticated common sense of the world. (Generalized dominion can be difficult to distinguish from disappearance.) Subsequent attempts to specify a definite ‘theory of networks’ have been programmatically vaguer, resorting typically to some notional gestures of obeisance in the direction of mathematical graph theory. Much of this work has been seduced by literary temptations, bringing its enduring theoretical productivity into serious question. 
Most damagingly, in respect to its own capacity for conceptual genesis, the primary analytical discipline of connectionism – programmatic dismantling of mysterious essences into distributions – has been increasingly neglected, and replaced by an arcane cult of whole ‘objects’ withdrawn from the menace of disintegration.

[8] Axioms are independent, formally-articulated assumptions. Since each axiom is a basic presupposition which cannot (therefore) either be derived from, or support, any other, a set of axioms is an irreducible multiplicity. Consequently, an axiomatic has a characteristic mode of variation, based upon composition from wholly independent parts, which can be added or subtracted with comparative freedom. Because axioms are bedrock elements, their selection demands an irreducible experimentalism. (As a matter of logical necessity, the systems they compose can never determine the characteristics of a missing axiom.) Within a social context, the pursuit of minimal axiomatic systems is ideologically charged, since it corresponds to a contraction of public purposes. The historical disintensification of capitalism has proceeded, as Deleuze & Guattari note, by way of an axiomatic proliferation. Addition of axioms is the way capital has been socially compromised. From a Francophone perspective, this tendency appears as a resilient teleological structure. When the same prediction is extended to Anglophone cultures, in which the recession of classical liberalism remains far more seriously contested, less confident conclusions are advisable.  

[9] Rigorous transcendental formulation of the model network – with all the conceptual ironies and traps this involves – finds its highwater-mark in Deleuze & Guattari’s essay ‘Rhizome’ (which introduces the second volume of Capitalism & Schizophrenia, A Thousand Plateaus). A ‘rhizome’ acquires facile (contrastive) definition through its distinction from the ‘arborescent’ schema of hierarchical sub-division. It proposes an absolute horizontality, approached through dimensional collapse, which opposes the rhizome to the tree, reticulation to hierarchical structure, and contagion to heredity. “In truth, it is not enough to say, ‘Long live the multiple’, difficult as it is to raise that cry. The multiple must be made, not by always adding a higher dimension, but rather in the simplest ways, by dint of sobriety, with the number of dimensions one already has available – always n – 1. (the only way the one belongs to the multiple: always subtracted). Subtract the unique from the multiplicity to be constituted; write at n – 1 dimensions. A system of this kind could be called a rhizome” (ATP 6). When analytically decomposed, the rhizome exhibits a number of consistent, distinctive features. “1 and 2. Principles of connection and heterogeneity: any point of a rhizome can be connected to anything other, and must be. … 3. Principle of multiplicity: it is only when the multiple is effectively treated as a substantive, ‘multiplicity’, that it ceases to have any relation to the One as subject or object, natural or spiritual reality, image and world. Multiplicities are rhizomatic, and expose arborescent pseudomultiplicities for what they are. … 4. Principle of asignifying rupture: against the oversignifying breaks separating structures or cutting across a single structure. A rhizome may be broken, shattered at a given spot, but it will start up again on one of its old lines, or on new lines. 
You can never get rid of ants because they form an animal rhizome that can rebound time and again after most of it has been destroyed. … 5 and 6. Principle of cartography and decalcomania: a rhizome is not amenable to any structural or generative model. … [The map] is itself a part of the rhizome. …” (ATP 7-13). Manuel DeLanda proposes the term ‘meshworks’ for such systems of flat, heterogeneous, interconnectivity, which he opposes to (comparatively rigid and homogeneous) ‘hierarchies’.

[10] A network, in what is by now the overwhelmingly dominant sense of this term, is by definition robust. It is constructed in such a way as to tolerate massive disruption. Thus the perennial relevance of the military roots of the Internet. In its practical application, communications resilience has been inseparable from resistance to censorship. Like the basic Internet protocols, Tor (‘the onion router’) is descended from a military research program, initiated in the mid-1990s by the United States Naval Research Laboratory and subsequently pursued by DARPA. The regularity with which critical elements of state security apparatus are deflected into crypto-anarchist streams, and inversely, suggests a principle of strategic reversibility, in conformity (on one side) with the ‘poacher-turned-gamekeeper’ phenomenon. Given the prominence of treachery within all rigorously-constructed games, it should not be surprising to encounter this extreme fluidity of partisan identities, in which allies and enemies switch positions. The figure of ‘the hacker’ is notably liminal in this regard, representing – simultaneously – an ultramodern incarnation of the outlaw, and a skill-set indispensable to effective formations of contemporary power. At its highest levels, strategy is a matter of ‘turning’. This is the diplomacy intrinsic to strategy (and thus to war), once the latter is liberated from the strait-jacket of its political – or Clausewitzean – determination, and released into a conceptual space bounded only by a transcendental horizon. Friend and foe are defined within the great game, rather than outside, by its transcendent (political) frame. There can be no assumed parties at the transcendental level of the game, where not only every alignment, but any constitution of agency is itself a ‘move’. Agencies are pieces, consequent to strategic decisions.
However tempting it might be to dismiss such considerations as arcane, they are already being condensed by commercial crypto-systems as practical problems. In regards to their legal form, modern business corporations have been synthetic agencies for well over a century. In the emerging epoch of Digital Autonomous Organizations (‘DAOs’), whose economic and military implications are yet scarcely glimpsed, this status advances from a matter of legal definition to one of software implementation. It consummates the historical triumph of code. At the end, as – invisibly – from the beginning, the transcendental subject is fully immanent to a plane of strategic production. 

Crypto-Current (035)

§3.44 — From the perspective of the miner, bitcoins are immanent remuneration for primary production, or resource extraction. They function as digital gold. As the simulation of a finite resource, it is natural that their production rate should exhibit declining marginal returns. Each increment of mining effort confronts an increasingly challenging environment, under conditions of steady depletion. For Bitcoin, as for gold, economic dynamics automatically counter-balance industrial exertion, as prices adjust in response to supply constraints. This process of continuously revised bitcoin price discovery cannot be determined within the protocol, but occurs at its edges, where economic agents trade into, and out of, bitcoins – synthesizing the Bitcoin commercium with its outside.

§3.45 — Within the protocol, adjustments are restricted to supply modifications, modeling the depletion of an abstract resource that is advanced as a general commodity (i.e. money). Bitcoin splits its schedule of decreasing returns in two, separating its measures of reward and difficulty. This double contraction – while clearly redundant from the viewpoint of austere abstract theory – enables a superior degree of flexible calibration, in response to a dynamic environment, volatilized above all by rapid improvements in computational engineering (and product delivery). By dividing bitcoin output compression between two interlocking processes, the protocol is able to stabilize the rate of block validation in terms of an ‘objective’ (external) time metric. The difference between these two modes of nominal reward restriction reflects a schism in time, between the intrinsic, intensive, absolute succession of the blockchain, and the extrinsic, geometricized order of pre-existing (globalized) chronological convention. Integrated reward is a complex chrono-synthesis, occurring at the boundary where Bitcoin’s artificial time – proceeding only by successive envelopment (of blocks into the chain) – meets the social-chronometric time of measurable periods. ‘Ten minutes’ means nothing on the blockchain (in itself), until connected by an interlock that docks it to a chronometer.

§3.46 — Are not all blocks time-stamped? it might be objected. To avoid confusion at this point, it is critical to once again recall the difference between the ordinal and the cardinal, succession and duration. Time-stamps are ordinal indicators, without any intrinsic cardinality, and with merely suggestive cardinal allusion. They implement an ordering convention. Metric regulation of periods is an entirely distinct function. ‘Chain’ means series (and nothing besides).

§3.47 — The bitcoin reward rate halves, stepwise, in four-year phases, on an asymptotic progression towards the limit of BTC 21,000,000 – the protocol’s horizon of zero-return. Taken in isolation, this exponential decline looks smoothly Zenonian (asymptotic), or infinitesimalizing, until arbitrarily terminated at a set point of negligible output. It is scheduled to pass through 34 ‘reward eras’, in the last of which – with block 6,930,000 – BTC issuance reaches zero. Due to the power of exponential process, 99% of all bitcoins are issued by Era-7 (during which 164,062.5 bitcoins are added to the supply).* The end of Bitcoin’s mining epoch is anticipated in the year 2140. After this point, at a date so distant that it belongs to the genre of science fiction, continuation of the system requires that mining-based block validation incentives are fully replaced by transaction fees. Evidently, the transition process cannot be expected to await its arrival at this remote terminus, which marks a point of completion, rather than inauguration.
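The arithmetic of this schedule can be checked with a minimal sketch (an illustrative Python reconstruction, not protocol source code; the 210,000-block halving interval and initial 50 BTC subsidy are the protocol’s own constants, while the function names are ours):

```python
# Illustrative reconstruction of Bitcoin's issuance schedule. Amounts are
# in satoshis (1 BTC = 100,000,000 satoshis); the integer right-shift
# reproduces the protocol's truncating halvings.

HALVING_INTERVAL = 210_000           # blocks per 'reward era' (~4 years)
INITIAL_SUBSIDY = 50 * 100_000_000   # Era-1 block reward, in satoshis

def era_subsidy(era: int) -> int:
    """Block reward (satoshis) during a given reward era (1-indexed)."""
    return INITIAL_SUBSIDY >> (era - 1)

def issued_through(era: int) -> int:
    """Cumulative satoshis issued by the end of a given era."""
    return sum(era_subsidy(e) * HALVING_INTERVAL for e in range(1, era + 1))

# Era 34 is the first zero-subsidy era; it begins at block 33 x 210,000.
assert era_subsidy(34) == 0 and 33 * HALVING_INTERVAL == 6_930_000
# Era-7 adds 164,062.5 BTC, carrying cumulative issuance past 99%.
assert era_subsidy(7) * HALVING_INTERVAL == 16_406_250_000_000
assert issued_through(7) > 0.99 * 21_000_000 * 100_000_000
# The asymptotic limit is never quite attained: total issuance < BTC 21m.
assert issued_through(33) < 21_000_000 * 100_000_000
```

The truncating right-shift is why the ‘set point of negligible output’ is reached at all: once the halved subsidy falls below one satoshi, it is rounded to zero rather than infinitesimalized forever.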

§3.48 — The reward schedule is further tightened by increasing difficulty of the hashing problem. Rather than executing a pre-programmed deceleration, Bitcoin’s rising difficulty responds dynamically to technological acceleration, and balances against it, thus holding the block validation rate roughly constant. Even as the reward rate tumbles – when denominated in BTC – the block processing rate is approximately stabilized, at a rate of one block every ten minutes, regardless of the scope and intensity of mining activity.
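The balancing mechanism admits a compact rendering (a simplified Python sketch of the retargeting rule; the two-week window, 2016-block interval, and four-fold clamp follow the protocol, while the variable names are ours):

```python
# Every 2016 blocks, the proof-of-work target is rescaled by the ratio of
# the time those blocks actually took to the two weeks they were meant to
# take. A larger target is an easier puzzle, so faster blocks shrink it.

TARGET_TIMESPAN = 2016 * 600  # two weeks, in seconds (600s per block)

def retarget(old_target: int, actual_timespan: int) -> int:
    # Clamp the measured timespan to a factor of 4, damping extreme swings.
    actual_timespan = max(TARGET_TIMESPAN // 4,
                          min(actual_timespan, TARGET_TIMESPAN * 4))
    return old_target * actual_timespan // TARGET_TIMESPAN

old = 1 << 220
# Hashpower doubled (blocks arrived in one week): the target halves.
assert retarget(old, TARGET_TIMESPAN // 2) == old // 2
# Blocks arrived on schedule: the target is unchanged.
assert retarget(old, TARGET_TIMESPAN) == old
# However severe the slowdown, the easing is capped at 4x per adjustment.
assert retarget(old, 100 * TARGET_TIMESPAN) == 4 * old
```

The rule is purely reactive: it encodes no model of mining technology, yet suffices to hold the block rate near its ten-minute chronometric anchor under any scope or intensity of mining activity.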

§3.49 — ‘Difficulty’ modification is a synchronization. The Zenonian time of intensive compression that determines the BTC reward-rate is – taken on its own – wholly autonomous, or artificial. As already noted, its chronometric ‘ticks’ are block validation events, registered in serial block numbers (and their ‘epochs’). They have no intrinsic reference to the units of ordinary time. It is only with the stabilization of the block-processing rate that the time of Bitcoin is made historically convertible, or calendrically intelligible, through the latching of block numbers to confirmed or predicted dates. This is a supplementary, synthetic operation, which coincides with the protocol’s anthropomorphic adoption. The time of the blockchain is intrinsic, and absolute, but its history is a frontier, where it engages ‘us’. As the blockchain is installed, and thus dated, an artificial time in-itself – consisting only of absolute succession – is packaged as phenomenon.

§3.5 — It can easily be seen that bitcoin mining is an arms race, of the ‘Red Queen’ type.** Since the total bitcoin production rate has zero (supply) elasticity, local advances in production can only be achieved at the expense of competitors. In consequence, inefficient miners are driven out of the market (as their costs – especially electricity bills – exceed the value of their coin production). This brutal ecology has forced rapid technological escalation, as miners upgrade their operations with increasingly specialized mining ‘rigs’. In the course of this process, the standard CPUs initially envisaged as the basic engines of bitcoin mining have been marginalized by dedicated hashing hardware, from high-performance graphics processing units (GPUs) – originally designed for application to computer games – through field-programmable gate arrays (FPGAs), to application-specific integrated circuits (ASICs). Bitcoin has thus stimulated the emergence of a new information technology industrial sub-sector.

§3.51 — With the completion of this production cycle, Bitcoin Singularity is established in a double sense (we will soon add others). An unprecedented event has occurred, upon a threshold that can only be crossed once, and an innovation in autonomization attains actuality, establishing the law for itself. Bitcoin provides the first historical example of industrial government. It is ruled in the same way that it is produced, without oversight. At the limit, its miners are paid for the production of reality – effectively incentivized to manifest the univocity of being as absolute time.***

* For a more detailed description of the Bitcoin reward schedule, see.
For a compact, chronometric representation of mining difficulty, see.

** The Red Queen dilemma, as formulated by Lewis Carroll in Through the Looking-Glass, is that “it takes all the running you can do, to keep in the same place.” Daniel Krawisz makes another comparison: “When a person upgrades their mining computer, they mine at a faster rate and therefore earn more bitcoins. However, when everyone upgrades, the mining does not become more efficient as a whole. There is only supposed to be one new block every ten minutes regardless of how hard the network is working. Instead, the network updates the difficulty to require more stringent conditions for future blocks. All miners may work harder, but none is better off. It is rather like a forest, in which every tree tries to grow as tall as possible so as to capture more light than its fellows, with the end result that most of the solar energy is used to grow long, dead trunks.”

*** The doctrine of the univocity of being is derived from Duns Scotus, and passes into modernity by way of its implicit contribution to Spinozistic ontology, as re-activated by Deleuze. It can be formulated in various ways. Most basically, the meaning of ‘being’ is insensitive to its application, and unaffected by differences of kind. Thus, the being of any being is no different from that of any other. God is not a flake of dandruff (and differs very significantly in kind from one, whether the distinction is entertained from a theist or atheist perspective), but the being of God has no difference whatsoever from that of a flake of dandruff – and even if God is held not to exist, the being denied him is the same as that of any existent thing. In other (more ‘Heideggerian’) words: ‘Being’ is not susceptible to ontic qualification. In its pure conception, therefore, what is said by ‘univocity of being’ is exactly equivalent to ontological difference.

Crypto-Current (034)

§3.4 — The Bitcoin DSP-solution unshackles (digital) proliferation from duplicity, in the production of replicable singularity. As with every diagonal construction, this outcome is pseudo-paradoxical, since it reformulates an apparent contradiction. From the latent matrix of abundant signs and scarce things, it extracts the scarce sign. Through this procedure, crypto-currency is implemented as critique. It coins a diagonal concept, not as impractical-contemplative ‘theory’, but as working code.

§3.41 — Duplicity – or the DSP – is primarily registered as a monetary problem, in the guise of counterfeit currency, and secondarily as a problem of identity authentication, responding to impersonation. On the Internet, however, another manifestation of the same basic syndrome has been far more prevalent, socially advanced, and technically provocative. The critical driver, on the path to a cryptographic solution to the DSP, has been spam.

§3.42 — ‘Spam’ is narrowly defined as a species of advertising adapted to the conditions of near cost-free electronic communication. Its first large-scale manifestation was ‘unsolicited bulk email’ (UBE), a sub-category of the more general phenomenon of ‘electronic spam’, which exploits the receptivity of instant messaging systems, newsgroups, mobile phones, social media, blog comments, and online games, among others. While advertising is the principal motivation for this massive duplication of unwanted – and typically only vaguely directed – communications, spam procedures (and supportive technologies) can also be employed for DoS (denial-of-service) attacks, which are designed to overwhelm a specifically-targeted recipient with an inundation of messages. At a sufficiently abstract level of apprehension, no strong boundary of principle differentiates advertising spam from a DoS attack, except that the former is generally divergent (one-to-many) and the latter convergent (many-to-one). The residual distinction is motivational. The injury (cost) to the recipient that is an inevitable side effect of spam promotion (‘collateral damage’) is a primary objective for the DoS assailant.

§3.421 — ‘Spam’ – abstractly conceived – spontaneously expresses the consequences of extreme information economy, or radical dematerialization, and is thus emblematic of electro-digital semiotic crisis. It follows the Law of the WORM – write once read many (times) – into a near-costless replication explosion. Unsurprisingly, any recipient of electronic communications is vulnerable to spam harassment, generating a problem that tends to ubiquity. The arms race between spammers and spam filters is recognizably akin to that characterizing the cross-excitation of infections and immune systems in the biological sphere. Cheap sign contagion is the common syndrome. As the various Turing Test-type defenses attest, any effective obstacle to the automation of spam production increases its cost. The time taken to ‘prove you are human’ adds friction at the point of terminal message delivery, where it cannot be easily eliminated – pre-emptively – by the spammer. Such ad hoc defenses necessarily aim to raise messaging cost, in order to restore the signal of commitment that digitization has erased.

§3.422 — The difference between a solution to the DSP and a spam filter turns out to be somewhere between subtle and non-existent. Both respond to the destructive consequences of semiotic economy – cheap signs – as these climax within networked, digital electronics.* The critical step in this respect was taken by Adam Back in 1997, with hashcash, a proof-of-work based messenger credentials system. As Back describes the innovation: “Hashcash was originally proposed as a mechanism to throttle systematic abuse of un-metered internet resources such as email, and anonymous remailers in May 1997. … The hashcash CPU cost-function computes a token which can be used as a proof-of-work.”**
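The principle admits a toy sketch (illustrative Python, not Back’s exact X-Hashcash stamp format): minting a token demands brute-force search, while checking it costs a single hash – the asymmetry on which the whole anti-spam economy rests.

```python
import hashlib

def mint(resource: str, bits: int = 16) -> str:
    """Grind a counter until the stamp's SHA-256 hash has `bits` leading
    zero bits -- the costly, proof-of-work side of the exchange."""
    counter = 0
    while True:
        stamp = f"{resource}:{counter}"
        digest = hashlib.sha256(stamp.encode()).digest()
        if int.from_bytes(digest, "big") >> (256 - bits) == 0:
            return stamp
        counter += 1

def check(stamp: str, bits: int = 16) -> bool:
    """Verification is asymmetrically cheap: one hash, no search."""
    digest = hashlib.sha256(stamp.encode()).digest()
    return int.from_bytes(digest, "big") >> (256 - bits) == 0

# A stamp 'addressed' to a recipient; the resource string binds the work
# to one message, so tokens cannot be stockpiled and replayed elsewhere.
stamp = mint("alice@example.com", bits=12)
assert check(stamp, bits=12)
```

Each additional zero bit demanded doubles the expected search, which is how the cost of the commitment signal – negligible for one correspondent, ruinous at spam volumes – is tuned.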

§3.423 — Rather than offering another piecemeal response to some particular spam problem, Back’s solution looks more like an attempt to fix the Internet, or even more than this. Hashcash tackles the spam problem at its source (cheap signs). Rather than defensively fending off ever more cunning spam intrusions, it enables a positive signal that someone has taken the trouble to communicate this, with the ‘trouble’ being attested by proof-of-work certification. This solution can be seen as a basic filter. It works as an admission pass, rather than a policing operation. The cost of duplicity is raised at the root, which involves the DSP being grasped as the root.

§3.424 — The very name ‘hashcash’ attests to the realization that proof-of-work certification is self-monetizing. Evidence of effort – when this is pre-formatted as a signal of commitment – has intrinsic potential value, independent of its application. A currency is initiated automatically, and all that remains is the process of price-discovery. Bitcoin provides a framework within which this process can occur.

§3.43 — However tempting it might be to construe proof-of-work as an algorithmic reprise of the labor theory of value (an LTV 2.0),*** it is not from political economy that Bitcoin derives its sense of ‘work’ – unless by extraordinary circuitousness – but from computer science. The work to be proven, in the validation of a block and associated currency issuance, is performed by a CPU in the course of a mathematical puzzle-solving exercise, and demonstrated through successful execution of a computational task. It is the final measure – beyond which no appeal is possible – of the contribution made by any node to the running of the network. Such work is probabilistic, rather than deterministic. There is no application of computational effort that can strictly guarantee reward. The work required of the miner is persistence in pursuit of a low-probability outcome, through repeated trials. It is both structurally and genetically related to a process of stubborn cryptographic attack – ‘hacking’ in its colloquial, though not traditional, sense – and also to a grueling search for success in a lottery-type game of chance.
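The probabilistic character of this work can be made concrete (a Python simulation under assumed toy parameters, standing in for real hashing): each attempt is an independent Bernoulli trial, so the trials-to-success count is geometrically distributed – expected 2^k attempts for a k-bit puzzle, with no quantity of effort ever strictly guaranteeing reward.

```python
import random

random.seed(7)  # reproducible illustration
K = 10          # toy difficulty: success probability 2**-10 per trial

def trials_until_success() -> int:
    """Count attempts until one 'hash' lands below the target
    (modeled here as K random bits all coming up zero)."""
    n = 1
    while random.getrandbits(K) != 0:
        n += 1
    return n

runs = [trials_until_success() for _ in range(1000)]
mean = sum(runs) / len(runs)
# The sample mean hovers near the theoretical expectation of 2**K = 1024,
# while individual searches scatter wildly: persistence, not guarantee.
assert 0.8 * 2**K < mean < 1.2 * 2**K
```

The lottery analogy in the text is thus exact in form: reward attaches to repeated low-probability trials, and only the aggregate rate – not any individual outcome – is predictable.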

§3.431 — Proof-of-work is accomplishment at a test, which can then be employed as a key. In the case of Bitcoin, it simultaneously ‘unlocks’ new bitcoins and casts a ‘vote’ that counts towards the consensual updating of the blockchain. Incentive and service are nondecomposably married. Optimal functionality is achieved by making the content of the test entirely meaningless. It serves as a demonstration of brute force (trial-and-error) computation, inherently resistant to rationalization, and thus irreducibly arduous. It is not a test of cognitive achievement, in any general or sophisticated sense, but solely of computational effort. Its sole ‘significance’ is its difficulty. Despite the obvious risk of anthropomorphism, it might even be described as an ordeal, or – less dramatically – as a trial, unambiguously demonstrating commitment.

§3.432 — Would it not be preferable to have this ‘work’ also (i.e. simultaneously) applied to a problem of intrinsic value?**** In its most positive formulation, this question has been a stimulus to altcoin differentiation. Anything else that mining might do, beside sheer block validation, seems to indicate an unexploited seam of surplus value. Such suggestions are strictly analogous to a recommendation that gold prospecting be bound to valuable activities of some other kind (such as fossil hunting). On the basis of fundamental economic principle, they merit the most vigilant suspicion, since they amount to a deliberate confusion of cost calculations, promoted in the name of a superior – or at least supplementary – utility. Yet however much the costs of mining are strategically muddied – and in fact, in some complex fashion, cross-subsidized – they still need to be unmuddied, to exhibit an economically-intelligible commitment. Mining investment is a signal, which cannot be dissolved into extraneous purposes without destruction of critical information. To whatever extent bitcoin miners are generating bitcoins by accident, to that degree their contribution to Nakamoto Consensus, or block validation, is devalued. The perfect pointlessness of bitcoin generation procedures – for anything other than Bitcoin system consolidation (as remunerated in bitcoins) – is a feature, and not a bug. Cybernetic closure, or self-reference, is its own reward, and it is only as such that it acquires distinctive monetary characteristics. As always within the terrain of auto-production, this is the inescapable abyss, or principle of immanence. The self-propagating circuit has no ground beyond itself, and can only be impaired by the attempt to provide one.

* ‘Spam’ invites a very general definition as the spontaneous expression of digital economics (or near-zero cost communications). The Wikipedia article on email spam makes the point well: “Since the expense of the spam is borne mostly by the recipient, it is effectively postage due advertising.” The Internet Society attributes the term to the celebrated Monty Python comedy sketch depicting the widely-derided tinned meat as “ubiquitous and unavoidable”. Estimates of the cost of email spam vary wildly, with the high-end figures reaching over US$100 million annually, for US businesses alone, by the early years of the 21st century. Global spam volume in 2011 is thought to have exceeded 7 trillion messages. The illusion of costlessness is illustrated starkly by the phenomenon of spam, through the revelation of an unanticipated trade-off. Whatever is free is abused. If an activity with discernible externalities can be pursued without definite commitment, it tends to produce a tragedy of the commons (see Chapter 3). Microscopic private utilities within a zero-cost matrix generate an explosion of informational pollution. Abundance theories, therefore, have special cause to be intellectually disturbed by the phenomenon. Spam is a toxic Cornucopia.

** See: ‘Hashcash – A Denial of Service Counter-Measure’ (2002).

*** Proof-of-work as a foundation for commercial value echoes a theme that has reverberated through the tradition of political economy. It leads, by scarcely-resistible digression, onto an associated track of exceptional historical richness, which is the analysis of value-creation as work, or labor time. From Smith to Marx, this has been a conceptual commitment that closely coincides with classicism in economic theory. Subsequently, the power of the marginalist – and especially Austrian – analysis has tended to entirely overshadow the intellectual labors of the objective value theorists, and even to topple them into derision. Marginalism threw its political-economic precursors into eclipse due to the evident superiority of its transcendental foundations, even if this articulation of its success found no corresponding acceptance within professionalized economic study. From the critical perspective, the objectification of the (subjective) negative utility of effort can only appear metaphysical. Its historical supersession, in this regard, follows predictably from its essential – and rigorously intelligible – error.
It would be unfortunate, nevertheless, if a type of Whig-historical triumphalism were to obliterate all understanding of the genuine theoretical insight now entombed with the Labor Theory of Value (or LTV). Most basically, the LTV already incorporates an important critical-subjectivist insight. This can be stated in different ways. Conceived in terms of power, it is the recognition that money does not primarily overcode inanimate resources, but rather represents a distributed system of command (though one rendered inconspicuous by its intrinsic exit-option). Wealth is bound by exchange equivalences to static assets only because it quantifies a capacity to direct activity. It crystallizes compliance. The ‘normal’ economic evolution in the direction of services makes this reality explicit, from the side of consumption. An analogous subjectivization is recognizable in regards to utility (value). Commercial incentives – including those at work in the labor market – can be theoretically systematized as an economization of effort. The value of a possession is the incarnated advantage of no longer having to struggle to obtain it. The critical reversal here is blatant, and crucial. Within the Marxist tradition it is understood as the disillusioning of a fetishization. It is not the thing, but the difficulty of its acquisition, that establishes the foundation of its value. This is an insight, of course, whose foundations were solidly established by the earlier classical economists, Ricardo most notably. The LTV, it can be seen, is a critical relief from naïve objectification, even if it is also – in well-understood respects – a perpetuation of it.
When pursued in detail, however, whether as a matter of economic theory or industrial practicality, fixing the relation between time and work has proved daunting. Quantification of work on the basis of standard time units exhibits an obvious dependency upon chronometric technology. This, in itself, suffices to identify the topic as distinctively modern. Beside comparatively accurate time-measurement, the practice of compensated labor time also requires some adequate degree of work monitoring. It is necessary to know both how much time is spent working, and that this time is spent working. In practice, these requirements have been understood as demands for oversight, regardless of the ideological characteristics of the industrial regime in question. Solutions to the informational problems of work monitoring have been institutionalized within the factory organization of production, integrally, originally, and necessarily. Such systematization of proof-of-work within an anthropological context reaches its most remarkable expression within the methods of Taylorism, which from a theoretical perspective is only an elaborate social hack. The time-and-motion analysis required for ‘scientific management’ is stubbornly intractable to political-economic abstraction (of a kind sufficient for rigorous conversion into monetary quantities), and no practical advance of conceptual significance has occurred subsequently. Immanent proof-of-work, despite its supposed manifestation in the commodity – as exchange value – eluded both ‘bourgeois’ political economy and its socialist critics. The production of measurable (human) labor time has proceeded alongside its theoretical analysis, within a semi-parallel, partially interactive, historical dynamic. This is investigated, within the tradition of Marxist historical sociology, by E.P. Thompson, in his essay on ‘Time, Work-Discipline and Industrial Capitalism’. 
He is meticulous in noting that “a severe restructuring of working habits” has been practically inseparable from the relevant “changes in the inward notation of time”. That theorization has not proceeded in this case from the inside out, is the critical historical materialist insight. Labor time was a distributed, experimental, piecemeal process, before it was a political-economic conception.
Karl Marx’s maxim for socialist compensation, “to each according to his work”, achieves an ironic actualization in the Bitcoin reward system. All power to distributed hashing capability!

**** In the words of Nick Szabo: “The secret to Bitcoin’s success is certainly not its computational efficiency or its scalability in the consumption of resources. Specialized Bitcoin hardware is designed by highly paid experts to perform only one particular function – to repetitively solve a very specific and intentionally very expensive kind of computational puzzle. That puzzle is called a proof-of-work, because the sole output of the computation is just a proof that the computer did a costly computation.”
“Very smart, but also very wasteful,” is the way one exemplary critic describes the proof-of-work concept. “All this computer time is burned to no other purpose. It does no useful work – and there is debate about whether it inherently can’t do useful work – and so a lot of money is spent on these lottery tickets. At first, existing computers were used, and the main cost was electricity. Over time, special purpose computers (dedicated processors or ASICs) became the only effective tools for the mining problem, and now the cost of these special processors is the main cost, and electricity the secondary one. … What this means is that the cost of operating Bitcoin is mostly going to the companies selling ASICs, and to a lesser extent the power companies.”
Resonantly: “In mid January 2014, statistics maintained at showed that ongoing support of Bitcoin operations required a continuous hash rate of around 18 million GH/sec. During the course of one day, that much hashing power produced 1.5 trillion trial blocks that were generated and rejected by Bitcoin miners looking for one [of] the magic 144 blocks that would net them $2.2 million USD. Almost all Bitcoin computations do not go towards curing cancer by modeling DNA or to searching for radio signals from E.T.; instead, they are totally wasted computations.”
Crypto-currency pioneer ‘Wei Dai’ is emphatic about the importance of teleological purification to efficient proof-of-work schemes: “The only conditions are that it must be easy to determine how much computing effort it took to solve the problem and the solution must otherwise have no value, either practical or intellectual.” [My emphasis.]

Crypto-Current (033)

§3.3 — Grasped abstractly, the most powerful functional innovation of the Bitcoin protocol is the binding of currency issuance to the servicing of system integrity, which twists the process into a consistent circuit. It is this loop that enables the protocol to achieve autonomy, or – in a reflexive articulation – self-reliance. Because industrial incentives cover all regulatory requirements, self-reproduction is embedded within the process of bitcoin production. The protocol makes it impossible to produce bitcoins without automatically policing Bitcoin. Primary wealth extraction cannot take place without verifying transactions – through the validation of blocks – and thus tending the system as a whole, consistently and comprehensively (as if with an invisible hand). Stated succinctly, Bitcoin instantiates immanent economic government.

§3.31 — This auto-productive economic security circuit is evidence for the fundamental integrity of the Bitcoin blockchain. Currency and distributed public ledger are a single functional system, with neither making coherent operational sense without the other. This is a point made with exceptional cogency by Bitcoin commentator ‘Joe Coin’:

Given the crucial requirement to preserve decentralization, the problem Satoshi had to solve while designing Bitcoin was how to incentivize network participants to expend resources transmitting, validating, and storing transactions. The first step in solving that is the simple acknowledgement that it must provide them something of economic value in return. … The incentive had to be created and exist entirely within the network itself … any instance of a blockchain and its underlying tokens are inextricably bound together. The token provides the fuel for the blockchain to operate, and the blockchain provides consensus on who owns which tokens. No amount of engineering can separate them.*

§3.32 — The threshold crossed here is both subtle and immense. Retrospectively, it will have been almost nothing, since the techonomic circuitry it invokes was – now demonstrably – already the operational principle of modern civilization (capitalism). It is only through Bitcoin, however, that the essential techno-commercial integrity of capitalism is brought into crisp focus, and extracted from speculative debate. When the machine is theoretically apprehended, ‘holistically’, as a real individual – or, far more consequentially, implemented as such – neither its technical nor its economic ‘aspects’ can be diverted into transcendence, or contingency, as extraneous, mutually-independent factors. Incentives are inherent to the machinery.** In a sense more complex – and involving – than anything the harsh paradox of the term immediately communicates, Bitcoin is a purposive mechanism. The conclusive action of the Bitcoin system – block validation – which seals each cycle of its reproduction, is a non-decomposable teleo-mechanical step (a diagonal escalation, or transcendental synthesis). It is industrialism, the mechanizing market, distilled to a previously unrealizable quintessence.

§3.33 — ‘Capital’ means – simultaneously and indissolubly – technological assets (machine-stock) and comparatively illiquid money (investment). Between these twin aspects there is only formal (and not real) difference. Their real integrity is demonstrated by techonomic machinery. The economic analysis of capital is diverted through technology, since wealth cannot be grasped substantially except in its cycle through productive apparatus, but technological analysis is drawn, reciprocally, into economics by the integration of rewards into the machine. At the level of philosophical reflection, under the cognitive conditions inherited from its mainstream European traditions, such techonomic integrity is difficult to hold together. To fuse mechanical causes with behavioral incentives in a techno-strategic assembly is to meld registers that have been determined as mutually inconsistent since antiquity.

§3.34 — Techonomic apprehension runs into a direct collision with the commanding dualism of the modern mentality, by insisting upon a re-animation of the compact between efficient and finalistic action. According to the complacent tenets of the new (or ‘enlightened’) cultural settlement, based upon the drastic demotion of scholasticism and its displacement by a substitute theo-scientific division of labor,*** the bridge from mechanism (cause-effect) to teleology (means-end) had been definitively dismantled. Each was henceforth to be compartmentalized within a distinct, wholly independent dimension. Their sole residual relation was orthogonal (or demarcated). The realms of directed liberty, and of instructed mechanism, were to be perfectly isolated from each other, and mutually withdrawn beyond all possibility of reciprocal interference. In this arrangement was to be founded the modern peace, of no lesser consequence than that of Westphalia, and something close to a genuine social contract. Through it, an amoral techno-science was co-produced beside an agnostic politics. Two complementary templates for expertise arose, each pledged to silence in the house of the other. This compact has been at once the condition for the gestation of an autonomous industrial power, and – on exactly the same grounds – an obstacle to its cognitive digestion. With the surfacing of the concealed techonomic entity, it buckles, loses coherence, sheds explanatory credibility, and undergoes accelerating social desanctification. Modernity’s axial, though predominantly inexplicit, concept of the mechanical instrument – whose self-contradiction had been concealed as if within a collapsed dimension – escapes its bonds and re-emerges to break the basic categories of Occidental thought. That is where we are now.

§3.35 — The intellectual crisis stimulated with ever-increasing intensity by techonomic escalation (that is, by capitalism, or efficient critique), has fertilized a luxuriant foliage of ‘deconstruction’. Yet, the untenability of orthogonal conceptuality does not necessitate a subsidence into cognitive dilemma, or aporia. Even when the problem is restricted within the narrow bounds of its philosophical formalization, it opens a positive path – pursued since the inception of the process – into diagonal action, or individuation. It is surely implausible to decry as ‘unthinkable’ what has been demonstrably operationalized. Bitcoin attests to such a process with each cycle of block validation and Nakamoto Consensus. The process demands something structurally and functionally indistinguishable from transcendental philosophy, insofar as it is to be constituted – even very approximately – as a coherent object of thought. What it makes of this ‘philosophy’, however – as it pushes through upgrades into successively ultra-radicalized immanentizations – is rarely self-advertised as such. What it apparently offers, instead, is ‘technology’ – a term that is a near-exact synonym for ‘instrumental mechanism’, and one that undergoes comparable internal schism, across the same conceptual rift.

§3.36 — In any approach to the techonomic entity – plotted as if from outside – the notions of emergence (or individuation), diagonal process, teleo-mechanical causality, integral nonlinearity, and transcendental escalation begin to exhibit a general inter-substitutability. All of these things, among many others, are convertible by simple transforms into immanentization, or the real operation of critique. An efficient side-lining of pseudo-transcendence – achieved by way of a dynamic flattening – is the reliable signature of the trend. The solution to the DSP is a diagonalization.

* Source. The importance of this argument is almost impossible to over-estimate.

** A (2014/10/29) tweet by Balaji S. Srinivasan describes the diagonal succinctly: “Bitcoin allows algorithms to act on incentives.”

*** That which is settled by the formalization of techno-political compartmentalization is, of course, the great war of religion that inaugurates European – and thus global – modernity. In a way still stronger than that outlined by Max Weber in his The Protestant Ethic and the Spirit of Capitalism (1905), self-propelling industrialization coincides with a break from the Catholic civilization of the West. The consolidation of an immanent techonomic principle (‘growth’ or positive cybernetic nonlinearity) presupposes a drastic contraction of the sphere of ecclesiastical cultural authority. Capitalism is that, by essence, which is not answerable to anything beyond itself. Its incremental actualization, therefore, presupposes social fractures, from which superordinate moral agency has receded. Among the major civilizations of the world, only Europe – under the impact of Reformation – realized this condition during the early modern period. A broken religion is a basic requirement of modernity, which Protestantism pioneered uniquely. (The work of David Landes explores this catalytic dissociation in detail.) Modern social institutions thus formalize and entrench a disconnection between what is and should be. Science is freed, in principle, to tell ugly truths. Engineers are freed to devise machines whose purposes the uncontaminated dynamic of capital accumulation alone dictates. Modernization calls for nothing other than this. The division of labor, or authority, between (traditional) religious doctrine and (modern) techno-scientific investigation is philosophically consolidated into the distinct spheres of practical and theoretical reasoning (to employ the Kantian vocabulary, as concretely instantiated in the topical differentiation between the first two Critiques). In very recent times, this enduring demarcation is faithfully reproduced – without notable modification – by Stephen J. 
Gould’s conception of Non-Overlapping Magisteria (NOMA), which divide religion and science, values and facts, in the same way, and with the same crypto-political emphasis upon jurisdictions. Given the historical status of this argument, as a near-perfect restatement of the original critical settlement, laid down in the final decades of the 18th century, it is surely extraordinary that Kant is nowhere mentioned in Gould’s essay.

Crypto-Current (032)

§3.2 — The Bitcoin paper consists of twelve short sections, including an introduction and conclusion. It is compressed to a minimal summary at this point, although discussed in pieces throughout the book, and rehearsed at slightly greater length in the first appendix. The emphasis here is critical, oriented – as is the paper itself – to the dissolution of the DSP, and thus the construction of a plane of transactional immanence, from which all transcendent elements (or “trusted third parties”) have been evacuated. The transcendental argument of the Bitcoin paper runs as follows:

§3.21 — The “trust based model” is expensive, socially frictional, and vulnerable to fraud. To overcome these problems, Bitcoin proposes the substitution of “cryptographic proof” for “trust” (which is to be obsolesced by irreversible crypto-commitments). The elimination of trust-based mediations reduces transaction costs. The system remains resilient in the absence of oversight, so long as a predominance of applied “CPU power” is controlled by “honest nodes”.

§3.22 — An “electronic coin” is defined “as a chain of digital signatures”, which is equivalent to “a chain of ownership” (this is described later, in the conclusion, as the “usual framework” for crypto-currency construction). The elimination of the need for a “trusted third party” (or “mint”) requires that transactions be “publicly announced” within a system that enables “participants to agree on a single history of the order in which they were received”.
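The “chain of digital signatures” can be sketched minimally in code. The sketch below is structural only: real Bitcoin transactions carry ECDSA signatures over transaction data, whereas here the signature is an opaque placeholder string, and the field names (`prev_tx_hash`, `payee_pubkey`) are illustrative inventions, not protocol identifiers. What the sketch does show accurately is the linkage that makes the coin identical with its chain of ownership: each transfer commits to the hash of its predecessor.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class Transfer:
    prev_tx_hash: str   # hash of the transaction that delivered the coin to the signer
    payee_pubkey: str   # the next owner's public key
    signature: str      # current owner's signature (abstracted to a placeholder here)

def tx_hash(t: Transfer) -> str:
    # The 'coin' is nothing over and above this chain of hashes.
    return hashlib.sha256(
        (t.prev_tx_hash + t.payee_pubkey + t.signature).encode()
    ).hexdigest()

def verify_chain(chain: list[Transfer]) -> bool:
    # Each transfer must commit to the hash of its predecessor:
    # the chain of signatures is the chain of ownership.
    for prev, curr in zip(chain, chain[1:]):
        if curr.prev_tx_hash != tx_hash(prev):
            return False
    return True

genesis = Transfer("0" * 64, "alice_pub", "coinbase")
to_bob = Transfer(tx_hash(genesis), "bob_pub", "sig_by_alice")
assert verify_chain([genesis, to_bob])
```

Any substitution of an earlier link breaks every later commitment, which is why the definition of the coin and the record of its history coincide.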

§3.23 — Bitcoin’s synthetic history draws upon established procedures for digital time validation, using a timestamp server to chain its hashed blocks in succession. “Each timestamp includes the previous timestamp in its hash,” constructing an artificial history as a robust series of envelopments – or ordered swallowings – “with each additional timestamp reinforcing the ones before it.”
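The series of envelopments can be made concrete in a few lines. This is a deliberately simplified sketch (integer timestamps, a notional all-zero genesis stamp, string block data), not the wire format of the protocol, but it exhibits the property the paper relies on: since each stamp includes its predecessor in its hash, altering any early block changes every subsequent stamp.

```python
import hashlib

def stamp(prev_stamp: str, block_hash: str, t: int) -> str:
    # "Each timestamp includes the previous timestamp in its hash."
    return hashlib.sha256(f"{prev_stamp}|{block_hash}|{t}".encode()).hexdigest()

def build(blocks: list[str]) -> str:
    s = "0" * 64                                   # notional genesis stamp
    for t, data in enumerate(blocks):
        s = stamp(s, hashlib.sha256(data.encode()).hexdigest(), t)
    return s                                       # envelops every predecessor

final = build(["block-1", "block-2", "block-3"])
altered = build(["BLOCK-1", "block-2", "block-3"])
assert final != altered   # tampering with any early block changes every later stamp
```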

§3.24 — The timestamped blocks are secured against tampering by proof-of-work locked hashes.* Such irreversibility is at once a deployment of cryptographic asymmetry, a consummation of contractual integrity, and a realization of (time-like) successive order. Notably, it is isomorphic with a thermodynamic – or statistical mechanical – gradient.
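The asymmetry invoked here – cheap verification, expensive production – is the whole of the proof-of-work mechanism, and can be sketched directly. The sketch below uses a leading-zero-bits target over single SHA-256 for brevity (Bitcoin itself uses double SHA-256 against a difficulty-adjusted target), and a toy difficulty of 16 bits, implying roughly 65,000 expected attempts.

```python
import hashlib

def proof_of_work(header: str, difficulty_bits: int) -> int:
    """Find a nonce such that SHA-256(header + nonce) falls below a target
    with `difficulty_bits` leading zero bits: costly (on average) to produce,
    verifiable with a single hash."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{header}{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

nonce = proof_of_work("block-header", 16)   # ~65,536 expected attempts
digest = hashlib.sha256(f"block-header{nonce}".encode()).digest()
assert int.from_bytes(digest, "big") < 1 << (256 - 16)
```

The irreversibility of the chain follows from this gradient: undoing a block means redoing its work, and the work of everything stacked upon it – the thermodynamic analogy is not merely decorative.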

§3.25 — The network reproduces itself through a six-step block creation cycle. Since nodes “always consider the longest chain to be the correct one”, synthetic history, as an ordinal-quantitative variable, functions as a (selective) ontological criterion. Accepted blocks provide the building material for the subsequent cycle of network reproduction.
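The selection criterion admits an almost trivially short sketch, under one simplifying assumption: constant difficulty, so that chain length stands in for total accumulated work (real nodes compare cumulative work, not raw length).

```python
# Minimal sketch of the §3.25 selection rule, assuming constant difficulty,
# so that chain length is a proxy for total accumulated proof-of-work.
def canonical(chains: list[list[str]]) -> list[str]:
    # The longest valid chain is, for the network, what has happened.
    return max(chains, key=len)

honest = ["b0", "b1", "b2", "b3"]
attacker = ["b0", "x1", "x2"]
assert canonical([honest, attacker]) is honest
```

The ontological point is carried by the `max`: the rule does not weigh testimony, it measures work, and whichever history embodies more of it simply is the history.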

§3.26 — Bitcoin builds incentives into its infrastructure. Nodes are automatically compensated for the work they perform maintaining the network through the issuance of new coins. The system thus attains techonomic closure. The horizonal finitude of the Bitcoin money supply necessitates an eventual transition to payments based on transaction fees. Well-organized incentives also fulfill a security function, by motivating potential attackers to support rather than subvert the network.
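The horizonal finitude mentioned here can be checked by direct computation: a 50 BTC reward, halving every 210,000 blocks, sums as a geometric series to just under 21 million coins. The sketch below cuts the series off below one satoshi; it ignores the protocol's integer (satoshi-level) rounding at each halving, which lowers the true cap very slightly further.

```python
# Check the supply horizon: 210,000 blocks per reward era, reward halving
# from 50 BTC, terminating once the reward rounds below one satoshi (1e-8).
def total_supply() -> float:
    reward, total = 50.0, 0.0
    while reward >= 1e-8:
        total += 210_000 * reward
        reward /= 2
    return total

assert 20_999_999 < total_supply() < 21_000_000
```

Once the series is exhausted, issuance can no longer fund security, which is why the transition to transaction fees is a necessity rather than an option.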

§3.27 — Blocks can be compressed to economize on memory demand by pruning spent transactions from their Merkle trees, leaving the root hash – and thus the block’s place in the chain – undisturbed. Moore’s Law is invoked as a realistic projection of exponential decline in digital memory price over time, moderating the requirement for information parsimony.
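The structure that makes pruning safe is the Merkle root: transactions are hashed pairwise up a tree, so a block commits to all of them through a single 32-byte value, and interior transactions can later be discarded while their branch hashes preserve the commitment. A minimal sketch, using Bitcoin's double SHA-256 and its convention of duplicating the last node at odd-sized levels:

```python
import hashlib

def h(x: bytes) -> bytes:
    # Bitcoin hashes with double SHA-256.
    return hashlib.sha256(hashlib.sha256(x).digest()).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                  # duplicate the last node when odd
            level.append(level[-1])
        level = [h(a + b) for a, b in zip(level[::2], level[1::2])]
    return level[0]

root = merkle_root([b"tx-1", b"tx-2", b"tx-3"])
assert len(root) == 32
```

Since any change to any leaf changes the root, the root alone suffices to anchor the block; spent transactions beneath it can be pruned without loosening the chain.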

§3.28 — Further economy is offered by a payment verification short-cut (involving a modest sacrifice of security in exchange for added convenience).

§3.29 — Bitcoin transactions contain multiple inputs and outputs, to facilitate the integration and disintegration of coins during transfers.
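The integration and disintegration of coins reduces to a conservation rule over transaction amounts, sketched here with integer values and an explicit fee term (the function name and shape are illustrative, not protocol code):

```python
# §3.29 sketched: a transaction gathers input coins and splits them into
# new outputs (payment plus change); value is conserved minus any fee.
def valid_amounts(inputs: list[int], outputs: list[int], fee: int = 0) -> bool:
    return sum(inputs) == sum(outputs) + fee

assert valid_amounts([30, 25], [50, 5])      # combine two coins: pay 50, keep 5 change
assert not valid_amounts([30], [50])         # value cannot be created in transfer
```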

§3.291 — Bitcoin radically adjusts the structure of transaction privacy. Rather than drawing a curtain of obscurity between a transaction and the world, in the traditional fashion, it nakedly exposes the transaction to public scrutiny. The new line of concealment is drawn between the transactional agents and their off-line identities, at the precise boundary of the commercium, therefore, and no longer within it. Secure masks are proposed as the new basis of privacy protection, coinciding with the anonymity of public keys.
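The relocation of the line of concealment can be sketched as follows. The sketch is a toy: where Bitcoin derives an address from an ECDSA public key, the stand-in below hashes fresh random bytes, which preserves only the relevant property – a new pseudonymous mask per transaction, with the transaction itself left fully public.

```python
import hashlib
import secrets

def fresh_mask() -> str:
    # Placeholder for a real keypair: in Bitcoin, an address derived from a
    # newly generated public key. Only the mask appears on the public record.
    pubkey = secrets.token_bytes(32)
    return hashlib.sha256(pubkey).hexdigest()[:16]

# The transaction is nakedly exposed; privacy lives entirely in the
# unpublished mapping from mask to off-line identity.
tx_record = {"from": fresh_mask(), "to": fresh_mask(), "amount": 5}
```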

§3.292 — The prospect of a successful attack upon the blockchain diminishes exponentially with the addition of “honest” blocks. An attacker therefore has a window of opportunity, which closes at a rate based on the block-processing capacity of the network.
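The exponential closure of this window is calculated in section 11 of the Bitcoin paper itself, by combining a Poisson estimate of the attacker's progress with a gambler's-ruin catch-up probability. A direct transcription:

```python
import math

def attacker_success(q: float, z: int) -> float:
    """Probability that an attacker controlling fraction q of the hash power
    ever overtakes the honest chain from z blocks behind (Bitcoin paper, s.11):
    P = 1 - sum_{k=0..z} Poisson(k; lam) * (1 - (q/p)^(z-k)), lam = z*q/p."""
    p = 1.0 - q
    lam = z * (q / p)
    s = 1.0
    for k in range(z + 1):
        poisson = math.exp(-lam) * lam ** k / math.factorial(k)
        s -= poisson * (1.0 - (q / p) ** (z - k))
    return s
```

With q = 0.1 the probability is 1 at z = 0 and falls below a tenth of a percent by z = 5, matching the table in the paper; each additional “honest” block multiplies the attacker's task.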

§3.293 — The conclusion, summarizing the entire argument, is a masterpiece of lucid intelligence. (It is reproduced in its entirety in Appendix-1.)

* Adam Back’s Hashcash system (1997) provides the model. The use of a proof-of-work test – earning a Hashcash stamp – to eliminate spam by pre-emptive vetting of costless messages contributes a solution of equal efficacy against DoS (denial-of-service) attacks. See subsequent discussion in this chapter.

Crypto-Current (031)

§3.1 — In its attachment to the principle of pure economic theory, fastidiously intolerant of even nominal political compromise, Bitcoin is an experiment in Austrianism. When allowance is made for its abstraction from metallic coinage, it is Mises as operational code. While the fact that Bitcoin is happening is radically novel, necessarily, because it can only now take place – in the age of public key cryptography and proof-of-work credentials – what is happening is not new at all, or at least, the monetary model that Bitcoin implements in software is not. In the words of Pierre Rochard: “As Bitcoin adoption increases we will finally be able to ‘empirically validate’ what Austrians have been arguing for decades: 100% reserve banking with a scarce medium of exchange prevents speculative manias, financial crises, and economic depressions”.*

§3.11 — Yet, while the offense to hard-money economic philosophies presented by inflationary fiat currency – which has nourished Austrian criticism since the 1930s – continues to feed support into the Bitcoin project, its central role has been displaced by, and subsumed into, the formulation of the DSP. Discretionary state money-printing is only one special case of the far more general economic incredibility of signs. Technological, rather than political-economic, dynamics have played the decisive role in bringing this problem to its point of productive crisis.

§3.12 — Even if digital dematerialization is only ever an approximation, its economic consequences are concrete, and drastic. Since the ‘materiality’ of any product tends to operate inertially, dampening proliferation, the attenuation of materiality corresponds to a process of acceleration. Exponential decline in information costs, as captured by Moore’s Law,** implies informational explosion. The trend corresponds to a second (and numerically tractable) sense of the ‘Californian Ideology’ war-cry: “information wants to be free”.*** If the concept of ‘liberty’ is irreducibly hazy and controversial, while also prone to irresolvable metaphysical complications, that of cost suppression is definite and quite precisely accountable. Evidently, the preservation of scarcity under conditions of digital instantiation is a peculiar challenge, for the obvious reason that electronics enables the replication of perfect copies at near-zero cost. Prior to the theorization of this problem in monetary terms, it had been noisily exhibited by disputes over the digital ‘piracy’ of media products, corresponding to an unprecedented practical crisis in the regime of intellectual property.

§3.13 — The final (or near-final) subtraction of substantial expense from money production is conceptually clarifying. It prompts – or sharpens – the demand for a solution to the central problem that has haunted money since its beginnings. Once the proliferation of signs is freed from all serious inhibition, semiotic tokens of scarcity are catapulted into a climax state of vulnerability, and the DSP is exposed with unprecedented starkness.**** It is here, at the furthest antipodes from metallic commodity money, that a peculiar folding – into simulation – restores the gold model to a central position in monetary theory, and, more consequentially, money production.***** It is precisely because Bitcoin no longer represents gold, however indirectly, that it is able to simulate gold, with such extreme (abstract) fidelity that it can be said – persuasively – to exceed gold in its most relevant monetary features, including even that of scarcity, alongside communicability, divisibility, and verifiability. As a simulation, Bitcoin necessarily produces an artificial substantiality in the course of its solution to the DSP, and ultimately as its solution. The critique of duplicity is indistinguishable from an ontological experiment.

§3.14 — The DSP originates from a ‘fact’ so basic that it crosses from the order of (empirical) actuality into that of (transcendental) principle: Signs are cheap. To substitute a sign for a thing, a signification for a demonstration, is an economization. It is commonly said ‘that is easy to say’, and – relatively speaking – it is. At the first-order level of cynical amorality – or of pure game-theoretic rationality – it pays to break promises, which cost so little to make, and yet may be arbitrarily expensive to keep. This alone suffices to suggest why there cannot be signs without an implicit problem of trust. The consequences are double-edged. Economization of any kind – getting the same for less, or more for less – is positively adaptive (or selectively promoted) to such an extent that evolutionary processes are indistinguishable from the formation and transformation of codes. Inherent to the economy of code, however, is a vulnerability to exploitative messages, which seize upon the exorbitant efficiency of the sign as a resource (or meta-resource) to be appropriated. Genetic code invites virus. Zoosemiotics invite mimicry.****** Linguistic expression invites deceit. Money invites the DSP. The sign is co-emergent with duplicity.

§3.15 — Bitcoin’s solution to the DSP is the blockchain, or ‘public ledger’ – a decentralized record of transactions which selects-out all non-original (or duplicitous) payments. Only the first instance of any bitcoin deduction from an account is validated, and preserved. All duplicate payments – cases of double spending – are edited out of the blockchained reality-record, automatically, through rejection of those inconsistent blocks in which such defects occur. Simply by protecting itself against splits – or forks – the blockchain constitutes a consistent plane of Being, upon which any particular being can be what it is, and nothing else instead, or besides. Positive absence of duplicity is thus an efficient ontological criterion, or selective principle. The blockchain is pre-determined to construct reality in such a way that fraudulence will not have taken place. That alone remains real which is consistent with the integrity of identity-money, or potential value.******* Only the non-duplicitous will have really occurred, as perpetually re-evidenced by the synthetic past that is reproduced on the blockchain, as a consistent artificial memory, endorsed by Nakamoto Consensus, beyond which no superior tribunal can in reality exist. The blockchain is demonstrably capable of making itself real. In this way it departs from all merely conceptual or ideological assertions of ontological grounding, while implicitly dispensing with the political superstructures through which such assertions are concretely propagated. The reality criterion it introduces takes the form of an automatic – which is to say non-negotiable – law. The force of this law is derived from what can be, rather than – directly – from what is, or what ought to be. There is no double spending on the blockchain because there cannot be.********
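The selective principle of §3.15 – first instance validated, duplicates edited out of the record – can be sketched as a validation rule over a set of already-spent coin references. This is an abstraction of the protocol's UTXO logic, with illustrative names, not an implementation of it:

```python
# First-spend-wins: a block is accepted only if none of the coins it spends
# has already been spent, whether earlier in the chain or within the block.
def validate_block(block_txs: list[dict], spent: set[str]) -> bool:
    seen: set[str] = set()
    for tx in block_txs:
        for coin in tx["inputs"]:
            if coin in spent or coin in seen:
                return False        # a double spend: the whole block is rejected
            seen.add(coin)
    spent |= seen                   # only the accepted block becomes real
    return True

spent: set[str] = set()
assert validate_block([{"inputs": ["coin-A"]}], spent)
assert not validate_block([{"inputs": ["coin-A"]}], spent)  # the duplicate never happened
```

Nothing here consults an authority: the duplicate spend is not punished but simply excluded from the record, which is the sense in which absence of duplicity operates as an ontological criterion.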

* See.

** While, strictly, Moore’s Law (initially proposed in 1965) concerns only transistor-density, it has come to serve as a general proxy for exponential trends in technology, and especially in electronics. The centrality of integrated circuits to the entire info-tech ecology ensures that Moore’s Law, even in its narrowest sense, projects a development curve of huge – and expanding – scope. In large part due to this, it is a predictive principle that lends itself to abstraction and generalization. (Ray Kurzweil’s ‘Law of Accelerating Returns’ or ‘LOAR’ is exemplary in this regard.) Under the name of Moore’s Law, the self-exciting circuit is established as the central model of techonomic process. It thus provides a kind of theoretical shorthand, enabling the widespread promotion of schematics for an ultramodernist meta-sociology, based upon the doubling-period, with accelerating variation as the sole constant. The nonlinearities propelling it include its own feedback into the processes it describes, as a ‘road-map’ – or, more accurately, a schedule – setting the pace of improvement in relevant technological specifications. Exponential technological improvement is normalized, and accepted as a benchmark. Acknowledgement of the trend becomes a causal factor in its own perpetuation. (Theory-practice orthogonalism is diagonalized.) In its loosest invocation, it corresponds approximately to run-away techno-commercial deflation. Macroeconomic capture of industrial deflation is the principal political-economic story of the Keynesian epoch. Capitalistic surplus is ‘nationalized’ through currency issuance. The imperative to ‘fight deflation’ – inspired by Great Depression mythology – lends this process of systematic appropriation a perverse moral dignity. Automatic valorization of money – through capital (or ‘total factor productivity’) improvement – is compensated by centralized monetary management, benchmarked to price stability. 
Within this epoch, therefore, Moore’s Law describes a process of systematic economic expropriation, by monetary authorities, of those gains from advances in industrial productivity that would otherwise be distributed spontaneously to consumers (by falling prices, i.e. deflation). Electronic money reverses this tendency.

*** According to Wikipedia, the slogan is attributable to Stewart Brand, uttered in a remark at the first Hackers Conference, in 1984. Whatever utopian suggestion might have been heard in this slogan, it would eventually be drowned out by the dark counter-claim: It is the destiny of any open near-zero-cost communication system to be spammed into dysfunction.

**** The commercial value of any transaction depends upon its exclusivity, which opens directly into questions of identity. The idea of a ‘digital signature’ – a very closely-related pseudo-paradox – binds identity and value to the suppression of fraudulent duplication. To repeat Satoshi Nakamoto’s critical formulation (Bitcoin #2): “We define an electronic coin as a chain of digital signatures.”

***** In the words of the Bitcoin paper (#6): “The steady addition of a constant of amount of new coins is analogous to gold miners expending resources to add gold to circulation. In our case, it is CPU time and electricity that is expended.”

****** Among the most striking examples of specifically zoosemiotic parasitism are instances of Batesian mimicry (named after the naturalist Henry Walter Bates, 1825-92). Typically, these involve the adoption by a non-toxic species of markings indicating toxicity, and thus an evolutionary strategy of free-riding upon acquired, and broadcast, unpalatability. Bates discovered the phenomenon, after noticing the remarkable similarity of coloration in certain non-related butterfly species. The semiotic convergence, he theorized, was driven by adaptive imitation. Signs ‘backed’ by poisons were easy to imitate, and thus allowed species to advantage themselves of the signal, while economizing on the original bio-chemical ‘message’. Such fraudulence, naturally, has its costs to the original, toxic species, who now find the signal communicated by their markings diluted. A process of semiotic inflation begins to work itself out.

******* The language of ‘potential’ is rejected in the name of contingency by a recent variety of transcendental philosophy associated in particular with Quentin Meillassoux and (in its financial application) Elie Ayache. For these thinkers, the projection of possibilities – or probabilities – is itself a transcendent illusion, constituting a metaphysics that is subject to critique. We are unable to follow Ayache into an employment of critique that ventures without discernible hesitation into the hyperbolic, in that it construes market pricing as simply incalculable (and even, on the inverse face – where it is theoretically captured as a stroke of ‘writing’ – as something close to a divine power). Pricing discovers nothing within the Ayache account, unless its own status as a truly sovereign decision, coincident with the genesis of being (the ‘event’). ‘Potential’ is used here in its physical sense – potential energy and ‘potential difference’ (voltage) – which is to say, real tension, or capacity (for work). Insofar as the concept of disequilibrium is ‘flattened’ by that of contingency, the consequence is massive information destruction. Potentials exist (virtually) prior to their probabilistic formalization. They are not epistemological productions. Followers of Elie Ayache, who can be expected to balk at this modal vocabulary, are also unlikely to find their concerns assuaged by the mere assertion that it is only derivatively related to probabilistic models, while primarily referring to something else entirely, namely free energy, or productive capability, as designated (and quantified) prior to its actual employment or consumption. Statistical mechanics – even in its abstract conception and its far-from-equilibrium application – provides the bridge between the science of probabilities and the capacity to do work. Crudely stated, abstract industrialism is here counter-posed to hyperbolic financialism, under the (post-duplicitous) sign of the machine. 
The industrialization of money, driven by Bitcoin, demonstrates a deep teleology very different to that manifested in the evolution of financial assets through ever higher sublimities of derivation.

******** Just as, for Kant, the causal consistency of nature is a matter of transcendental necessity rather than empirical fact, so the absence of double spending on the blockchain ‘follows’ inevitably from what the blockchain is. To understand the blockchain is already to know (as a matter of transcendental principle) that the DSP is thereby resolved.

Crypto-Current (030)

Bitcoin and its Doubles

§3.00 — The abstract to the 2008 Bitcoin paper reads (in full):

A purely peer-to-peer version of electronic cash would allow online payments to be sent directly from one party to another without going through a financial institution. Digital signatures provide part of the solution, but the main benefits are lost if a trusted third party is still required to prevent double-spending. We propose a solution to the double-spending problem using a peer-to-peer network. The network timestamps transactions by hashing them into an ongoing chain of hash-based proof-of-work, forming a record that cannot be changed without redoing the proof-of-work. The longest chain not only serves as proof of the sequence of events witnessed, but proof that it came from the largest pool of CPU power. As long as a majority of CPU power is controlled by nodes that are not cooperating to attack the network, they’ll generate the longest chain and outpace attackers. The network itself requires minimal structure. Messages are broadcast on a best effort basis, and nodes can leave and rejoin the network at will, accepting the longest proof-of-work chain as proof of what happened while they were gone.

§3.01 — Bitcoin coincides with a proposed subtraction. By dissolving the function hitherto attributed to a “trusted third party”* it realizes a flat network, in which all connections are P2P relations. Since the legitimating role of the third party – the extrinsic or transcendent element – is authentication of the originality of transactions, the network cannot be scoured of transcendence without “a solution to the double-spending problem”. The complicity of these twin goals is perfectly explicit: “What is needed is an electronic payment system based on cryptographic proof instead of trust, allowing any two willing parties to transact directly with each other without the need for a trusted third party.” Conceived as a project of political economy, this is what Bitcoin is.

§3.02 — The self-comprehension of Bitcoin, then, as already announced in the second sentence of the abstract, begins with the double spending problem – a concept so basic to all subsequent discussion that it demands abbreviation (to the ‘DSP’). Once again, a sub-division of the topic is in order. The DSP is (1) a highly-specific technical obstacle to the realization of any ‘trustless’ or decentralized digital currency, (2) a problem of extreme generality relevant to all monetary systems, roughly equivalent to fraudulence, (3) a re-formatting of the basic economic problem of scarcity for the epoch of Internet-based commerce, and finally – in its widest extension – (4) a crucial philosophical clue leading directly into the nature of the sign, and even re-founding rigorous semiotics (in the groundlessness of cybernetic self-reference).
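
Sense (1) admits a compact mechanical statement: given a settled transaction order, a double spend is trivially detectable, which is why the whole difficulty reduces to establishing that order without a trusted arbiter. The following is a toy sketch with invented names (an account-style ledger; ownership checks are omitted, and this is not Bitcoin’s actual UTXO model):

```python
def validate(ledger: list) -> list:
    """Accept each (coin, sender, receiver) transfer only if that sender
    has not already signed the same coin away. Given a univocal order,
    the first spend wins and every duplicate is rejected."""
    spent = set()
    accepted = []
    for coin, sender, receiver in ledger:
        if (coin, sender) in spent:
            continue  # double spend: the same coin signed away twice
        spent.add((coin, sender))
        accepted.append((coin, sender, receiver))
    return accepted

# "A" attempts to spend coin c1 twice; only the first transfer stands,
# while the recipient "B" remains free to spend the coin onward.
txs = [("c1", "A", "B"), ("c1", "A", "C"), ("c1", "B", "D")]
settled = validate(txs)
```

The detection logic is trivial; everything hard in Bitcoin lies in producing the univocal sequence that the loop above simply presupposes.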

§3.03 — Approaching Bitcoin as a solution to the DSP, in its narrowest and most functionally-critical sense, does not necessarily exhaust the significance of the protocol, still less its ripples of implication, but it undeniably comes close to capturing these in their essentials. It is in order to solve the DSP that Bitcoin innovates the blockchain, establishing – first in theory, then in implemented fact – the characteristic decentralization that defines it. Even features that are not, in principle, necessary to this solution, were in fact generated as more-or-less direct consequences of the approach that was selected to tackle it. For instance, while a simulation of metallism (and resultant rigid deflationary bias) is not strictly required for a blockchain-based digital currency, it follows as a matter of course from the way Bitcoin formalizes and resolves the DSP.

§3.04 — A prototypical form of the DSP afflicts even the very ‘hardest’ types of traditional money. Precious metal coinage can be debased through surreptitious adjustment of purity and magnitude – adulterated through admixture of inferior metals, reduced in size (by policy decision, executed through the mint), or attenuated through ‘clipping’ and ‘sweating’ (widespread practices of petty monetary fraudulence). The guiding strategy – or merely opportunistic tactic – in each case is that a certain quantity of gold or silver can be spent twice, traded as a sign, while economized as a substance. The fraudulent agent, whether government or private coin-clipper, exploits the door to duplicity inherent in the monetary character of the coin, according to which it operates as a sign of itself. The value of the coin has a double registry – the inscribed denomination, and the test of the weighing scale. Insofar as these two aspects of its worth can be prised apart, in some way that eludes convenient detection, precious substance can delegate its semiotic ghost to sustain the initial incarnation of its value, while taking on a second identity in another account. Insofar as money is a sign, the DSP shadows it, from its most primitive origin.

§3.05 — Paper money represents an immense semiotic liberation, and thus a corresponding accentuation of the DSP. The relation between inscribed denomination and metallic backing, no longer Janus-faced and intimate, yawns open. The distance is now bridged by an explicit promise, made in the name of a trusted authority, or monetary mediator. The immediate substance of the monetary sign – inked paper – is approximately worthless. The ‘paper’ of cash-money does not reference a commodity-substance, but a promissory vehicle. Value has been relegated to a word, in its contractual sense, historically consolidated through an evolution from the ownership titles (or ‘warehouse receipts’) issued by goldsmiths. Unsurprisingly, the ‘golden’ age of counterfeiting now begins. On the side of the legitimate monetary institutions and authorities, more exotic possibilities arise.

§3.06 — Even if fractional reserve banking – the principal financial business opportunity within a paper money economy – cannot be theoretically assimilated without reservation to the DSP, the resonances are, at the very least, uncanny. If not exactly spent twice, deposits are multiplied by lending.** If this function is formally reversed, in the customary manner, reserves (assets) are required only to cover a determinate fraction of deposits (liabilities), which allows the latter to be inflated at a rate reciprocal to the reserve ratio. (A reserve requirement of 10% permits a ten-fold credit expansion from the ‘monetary base’.) Political permission for the multiplication of deposits can, therefore, be directly inferred from mandatory reserve ratios (through simple inversion). The normalization of the practice marks a radical discontinuity in monetary history. From this point onwards, standard banking activity becomes the predominant source of currency issuance, and cash is fused inextricably with debt through the root mechanism of credit creation.
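
The arithmetic of the reserve ratio can be made explicit. What follows is a minimal sketch of the textbook deposit–loan–redeposit iteration (a stylized multiplier model, not a description of how modern banks actually operate):

```python
def credit_expansion(monetary_base: float, reserve_ratio: float,
                     rounds: int = 1000) -> float:
    """Iterate deposit -> loan -> redeposit until the expansion converges.
    The limit of the geometric series is base / reserve_ratio."""
    deposits = 0.0
    inflow = monetary_base
    for _ in range(rounds):
        deposits += inflow
        inflow *= (1 - reserve_ratio)  # the loanable (re-deposited) fraction
    return deposits

# A 10% reserve requirement permits roughly a ten-fold expansion:
total = credit_expansion(100.0, 0.10)
print(round(total))  # → 1000 (i.e. 100 / 0.10)
```

The ‘simple inversion’ remarked above is just this limit: total deposits converge to the monetary base divided by the reserve ratio.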

§3.07 — The rise of banking is inseparable from an eclipse of cash. Even in the popular imagination, money loses all association with a hoarded commodity, as it is re-embedded in an account, where it exists solely as a ledger entry. Henceforth, the reliability of money as a store of value is seen to rest upon nothing more substantial than the integrity of institutionalized accounting procedures, which would subsequently – and in turn – be made conditional upon higher-level (political-administrative) monetary management. Since the inception of the electronic age, the digital transcription of financial ledgers has accelerated the trend, fostering explicit, widely-publicized dreams of ‘the cashless economy’. Cash-money becomes an increasingly marginal sub-component of credit flow, progressively ghettoized among atypically frictional, trivial, or disreputable money transfers. From the perspective of financial macro-management, its final abolition would be a consummation.

§3.08 — Fractional reserve banking partially anticipates macroeconomic governance in the discretion it affords to money creation, but – in itself – it offers only the faintest glimpse of the new world that is arising. The systematic incorporation of Keynesian ‘animal spirits’ into the realm of government policy objectives, beginning – very tentatively – in the 1930s, and then ascending to dominance in the post-war world, completes the politicization of the economic sign. Money is now invested with mass psychological meaning, identified with a technocratically-accessible dimension of collective arousal, and economic sentiment becomes an explicit object of administrative manipulation, through the money supply. The profundity of this development is easily under-estimated. In the era of macroeconomics, monetary policy is seamlessly fused with psychological operations, oriented to strategic public mood alteration or ‘demand management’, orchestrated with reference to an array of guiding concepts which are overtly attitudinal: ‘wealth effects’, ‘money illusion’, and ‘wage stickiness’ prominent among them. It is now the psycho-social propensities to save or spend that are to be theoretically reconstructed by the academic-administrative economic complex, with integral cynicism, on the functional analogy of pharmacological medicine. Economic and clinical therapeutics become increasingly hard to distinguish in principle, as they are differentiated only by their specific techniques of psychological intervention, and by the scales of their domains. In each case the (individual or collective) patient, vulnerable to ‘depression’, is subjected to expert treatment through the measured application of artificial ‘stimulus’. Feedback is provided by economic sentiment polling, designed to gauge business and consumer confidence. There is nothing metaphorical about any of this, except insofar as euphemism is called upon in the public presentation of monetary and fiscal objectives.
Macroeconomic policy is – quite simply, and exactly – mass mind-control. As it is normalized, it sees ever less need to disguise the fact.

§3.09 — To remark upon a ‘double-spending problem’ at all in such a world, in which the very notion of intrinsic monetary integrity has been dissolved – with minimal remainder – into the politicized economy, replaced by the technopharmaceutical administration of financial dope, might easily seem comically (and no less tragically) Quixotic.*** It requires only the slightest deepening and darkening of perspective to see money, in itself, as a lost cause. No small part of the initial, catalytic excitement generated by Bitcoin is explained by this background of relentless hard money defeat.

* In his introduction to ‘The Dawn of Trustworthy Computing’, Nick Szabo describes the role of the ‘trusted third-party’ within the world’s existing electronic information infrastructure: “When we currently use a smart phone or a laptop on a cell network or the Internet, the other end of these interactions typically run on other solo computers, such as web servers. Practically all of these machines have architectures that were designed to be controlled by a single person or a hierarchy of people who know and trust each other. From the point of view of a remote web or app user, these architectures are based on full trust in an unknown ‘root’ administrator, who can control everything that happens on the server: they can read, alter, delete, or block any data on that computer at will. Even data sent encrypted over a network is eventually unencrypted and ends up on a computer controlled in this total way. With current web services we are fully trusting, in other words we are fully vulnerable to, the computer, or more specifically the people who have access to that computer, both insiders and hackers, to faithfully execute our orders, secure our payments, and so on. If somebody on the other end wants to ignore or falsify what you’ve instructed the web server to do, no strong security is stopping them, only fallible and expensive human institutions which often stop at national borders.”

** The influential Rothbardian tradition of libertarianism consistently denounces fractional reserve banking as systematic financial fraudulence, structurally indistinguishable from counterfeiting (or fake money production). Within this political-economic context, the attempt to differentiate the DSP from a ‘double lending problem’ (DLP) might easily appear unnecessarily fastidious.

*** The understanding of the market order as a Quixotic cause, in all its anachronism, is captured by the description of Ludwig von Mises as ‘The Last Knight of Liberalism’ in the title of Jörg Guido Hülsmann’s intellectual biography. The Austrian perspective, within which Mises appears so obviously to be a defender of the capitalistic principle in a post-capitalist world, is itself reflexively captured by the Quixotic framing it explores, and thus rendered scarcely legible by its own untimeliness and social peculiarity. The oddity of our world is captured by the prevalence of a political-economic denunciation that targets ‘neoliberalism’ – in which Mises is implicitly entangled as a triumphant voice. Not only utter defeat, then, but a subsequent ‘restoration’ as a representative of that by which one has been defeated. This is the dialectic as dark humor.

Crypto-Current (029)

§2.8 — Productive perpetuation of the critical tradition sets, as a preliminary task, discrimination between the necessities of transcendental philosophy and its contingencies.* Prominent among these latter is the temptation to philosophical anthropology, characterized most significantly by the identification of the human subject as the primary locus of time-synthesis. In this regard, the Bitcoinization of transcendental philosophy is direct, and drastic.

§2.81 — The time of the blockchain is absolute, non-geometric, synthetic, and intensive.** It produces a univocal order (sequence), and in the end does only this. Sequential ambivalence would make the double-spending problem intractable. Bitcoin teaches that a DSP solution cannot be less than absolute time. Bitcoin’s engine of selection is priority, primacy, or ordinal privilege – being first in line, or first past the post. Bitcoin mining is a race. Insofar as the winner of the race can be decided automatically – without controversy or irreducible relativistic complication – sequential decidability is established in general. Philosophical modernization and the production of secure money are, at this precise point, indistinguishable, not only logically, but also ontologically, or numerically, through the singularity of their occurrence.
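
The ‘race’ admits a deliberately crude caricature: among competing histories, selection is purely ordinal. (A toy sketch only: real Nakamoto Consensus selects by accumulated proof-of-work rather than raw length, and ties resolve only as the next block extends one branch first.)

```python
def select_chain(chains: list) -> list:
    """Caricature of Nakamoto-style selection: the 'longest' chain wins.
    Length stands in here for accumulated work (a simplification)."""
    return max(chains, key=len)

# Two histories diverge from a shared genesis block "g"; the race
# between them yields a univocal sequence: the longer branch.
chain_a = ["g", "a1", "a2"]
chain_b = ["g", "b1", "b2", "b3"]
winner = select_chain([chain_a, chain_b])
```

The selection criterion does no deliberating and hears no appeal; it merely registers which branch got there first, which is the whole point of the passage above.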

§2.82 — The most modest plausible interpretation of Bitcoin is that its tacit perspective replaces (a lost) absolute time. A stronger proposal is that absolute time is, with the blockchain, inaugurated. To articulate the thesis (more informatively) in reverse: The philosophy of absolute time anticipates the blockchain. In still other words, it retro-chronically depends upon it. Only in the blockchain does a geometrically-irreducible arithmetic series find instantiation. Primordial time synthesis is henceforth something the technosphere knows how to do.

§2.83 — By the strictest conceivable (i.e. transcendental) principle, nothing beyond the blockchain has authority in relation to the blockchain, or could have. Were this not the case, a ‘trusted third party’, or organ of transcendent oversight, would remain operative, such that – reciprocally – the minimum conditions for the realization of Bitcoin would remain inaccessible. In other words, the Bitcoin protocol is transcendental because it is essentially beyond appeal. The idea of a superior tribunal is immanently nullified by it. Furthermore, not only is the Bitcoin blockchain transcendental, and thus unsurpassable, but also the model of the transcendental installed by the blockchain is itself unsurpassable. ‘The buck stops here’ in an ultimate definition. A certain ‘end of philosophy’ is thus reached. To argue otherwise is once again to propose an actual, or merely possible, court of appeal where there cannot, in principle, be one. There is nowhere to take a case against the blockchain and its statement of reality unless to a manifestly – i.e. effectively – inferior authority. All stubborn metaphysical commitments to the contrary case lack a realizable criterion, and can only regress to politics as a proxy. They might – and in fact will – be entertained, but no one will seriously bet upon them. Their enforcement requires escalating coercion, destined to reach levels that can only eventually prove impractical.

* This task is ‘preliminary’ because it is crudely (or pre-critically) formulated. Rigorization of critique coincides with the swapping-out of categorical understanding for diagonal conceptuality. ‘Necessity’ and ‘contingency’ – which apply solely to comprehensible objects – are patently ill-suited to the conditions of possibility for objectivity as such. An adequate formula could only be problematic, pseudo-paradoxical, and cryptic. It is expressed as a contingent necessity, or necessary contingency, compressed into a critical conception of chance. The philosophical work of singularity – as transcendental event – has obvious and extreme relevance. The religious evocations of an absolute occurrence do nothing in themselves to dispel its cognitive compulsion. Critique has to be some such thing, if it is in reality anything at all. As a matter of indissoluble principle, no mere datum of intellectual history can provide philosophical direction. Comparably, critique gropes towards self-apprehension as an essential accident (with accidental essence).

** To briefly recapitulate these characteristics, in order, the time of Bitcoin is:
(a) Absolute, by definition, since time-relativity is essentially identical to a double-spending predicament.
(b) Non-geometric, since succession cannot be adequately modeled in space, unless, tautologically, by designating a ‘time-line’ or ‘time-axis’ whose temporal intelligibility is strictly derivative. On any line, time-gradient can be referenced, but not described.
(c) Synthetic (rather than analytical), because innovation is intractable to anticipation, having the cryptographic structure of a trap-door function. Discovery process is irreducible.
(d) Intensive, because the order of succession is serial envelopment.

Crypto-Current (028)

§2.7 — Immanence is a selective principle (a criterion). Only consistency survives. Resolution of the double-spending problem means exactly this. When conceived lucidly, Bitcoin is simply critique. In other words, a formally-specified machine for dispelling metaphysics exists – and is running – now, under conditions promoting its intensive accumulation. In this regard, Bitcoin is the inheritor of Nietzschean ‘European Nihilism’ – or materialized critique in its unfolded, historical expression.

§2.71 — Negatively apprehended, nihilism corresponds to a ‘loss’ of transcendence. Some proposed – or (more commonly) merely accepted – higher order, culturally sustained by nothing of any greater security than a dogmatic metaphysics, slides into the abyss. It cannot be effectively defended. This is the most readily popularized narrative, adapted specifically to the dilapidation of Christian monotheism – the notorious ‘Death of God’. According to this construction, nihilism is a specifically world-historic mode of mourning. It corresponds to a disappearance of meaning, through loss of a referent previously revered as an indispensable exterior support (a vulgar God, or god-like powers, as attributed to the agencies of a state, or any other ‘trusted third party’ of sufficient dignity, such as a central bank). In this sense, nihilism abbreviates the collapse of transcendence, or the work of critique. Negativity is redoubled, first in the disjunction that determines ‘the beyond’ (transcendence), and then in its subtraction. Hence the cultural dull grief of a self-cancelation that can appear as less than nothing, such as that manifested in the stereotypical passage from theism to tedium. Yet the ‘meaning’ of nihilism is not exhausted by its depressive connotation. In its positive sense, nihilism closes a circuit. Rather than a registry of loss, it is a principle of sufficiency – even of ‘liberation’.* Certainly, and strictly, it is a production of independence, or autonomization, marked by a completion – or closure – that appears premature when referred to a bypassed element no longer presumed indispensable. The residual negativity of nihilism is then confined to the elimination of a dependency. It characterizes a relatively compact process that does not call upon anything beyond itself. Once again, the monetary example is to be preferred over the linguistic one. There is no backing. 
The remarked ‘loss’ of a trusted support is not distinguishable in reality from the discovery of an economical potency. The machine works without it.

§2.72 — Algorithmic governance subtracts discretion. It economizes government, in at least two senses. Government extravagance is formalized at the highest level of philosophical principle, and systematically eliminated through application of an economic criterion. The political element is determined practically – which is to say surgically – as superfluous cost. Antagonism, relative to an extant structure of authority, is intrinsic to the process, and essential to its positive nihilism. The point of critique is to kill stuff.

§2.73 — Bitcoin instantiates spontaneous (or apolitical) consensus, without authoritative central representation, escalating the intrinsic trend of the Internet. It manifests an aboriginal coordination between the elements of a multiplicity under conditions of simultaneity (or zero-communication). This is, of course, nothing more than an exceptional approximation to the ideal of a distributed system. But distributed systems do not spring into actuality from out of their ideal form. They have to be built. They have to, and will, be built, once their conditional ignition threshold is crossed. At the historical – i.e. ‘anthropomorphic’ – level, this inevitability is nothing other than Modernity, apprehended through its teleological structure, or defining gradient. That is why there is perhaps no pattern that more reliably characterizes the culture of Modernity than the rhythmic re-ignition of spontaneous order as a theoretical (and ideological) topic. The history of nihilism can be told entirely in such terms. There is always implicit reference to a subtracted overseer, whose removal defines the intensification of the process. “The death of God” provides the cultural allegory. Practical abolition of the State is set – from the beginning – as the horizon. A machine without metaphysics is anticipated by critique – but that takes time.

* Doublings, twinnings, and ambivalences are everywhere here. Christianity is at once that which falls into decrepitude, catalyzing the process of European nihilism, and the anticipatory dramatization of the death of God. Messianic religion is accomplished through its sacrifice. The response, within the Nietzschean text, couples morbid diagnosis of decadence to themes of exorbitant sufficiency. Decline (Untergang) glitches into an ‘overcoming’ – which is equally a shedding – through a selection mechanism summarized as eternal recurrence. “What is falling, that one should also push,” spake Zarathustra. Experiment in what one can do without. This is the undercurrent of austerity. In the libertarian traditions that preserve the basic orientation of classical liberalism, such a conversion of the negative is insistent. Negative rights, negative freedoms, and independence with emphasis upon the negative prefix are the whole of an economized positive program. Strip out all superfluous axioms. Do without them. Between the elimination of metaphysics, and the positive modern philosophical program, there is no difference.