Crypto-Current (042)

§4.3 — To propose that the political controversy associated with Bitcoin can be expected to escalate in approximate proportion to the crypto-currency’s success, as quantified by its total market value, is unlikely to provoke feverish dispute. Yet such distribution concerns are comparatively trivial – at least at the level of political principle – when set alongside the questions of sovereignty that crypto-currency raises. Bitcoin is a limit strategy of depoliticization, which – at the cliff-edge of historical irony – announces an ultimate political contest. No stroke can be more intensely politicized than one threatening to sweep away a whole field of political decision-making. As a purely ideological challenge, therefore, Bitcoin organizes a terrain of political antagonism in advance, provoking (in reaction) a defense of politics of unprecedented conceptual purity. Crypto-currency self-regulation elevates the menace that has long spooked left-articulated[1] political interests under the guise of the autonomization of capital to an almost parodic height. It ceases entirely to be answerable. Dialectic loses all purchase. This is widely grasped, even though the thing itself cannot be. Bitcoin is a game-changer.

§4.31 — It can easily become confusing to talk about games. An allusion to non-seriousness need not be a great problem – what, after all, is seriousness? More obfuscating is the invocation of rules. Insofar as a ‘game’ is thought to be essentially rule-bound, the train of associations is guided in a direction that is radically misleading. Games are defined by rules, but they are determined far more informatively by the absence of rules than by their presence. The game occurs in the unruled area – and insofar as further subtraction of rules can be achieved, the game is thereby intensified. Rules set the boundaries of a game, but the strategies that compose the actual (or executed) game’s positive characteristics substitute for a rule. They are synthetic. There is a difference, therefore, between transcendent and immanent principles, or – more strictly – between rules of transcendent and immanent genesis. The former, determined in advance of ‘play’, set fixed parameters, or ideal competences, comparable to axioms and exposed in advance to analysis. The latter emerge – synthetically – from the performance of the game, as demonstrations, or discoveries. A game is always improved, qua game, when its set of transcendent principles is reduced, through conversion into immanent principles (or emergent outcomes). While there is a Bitcoin protocol, Bitcoin is not reducible to it. Bitcoin is rather the outcome of being ‘played’, in conformity with those rules that the protocol – firmly but non-comprehensively – establishes.

§4.32 — When games turn back upon their own rules, absorbing them into strategies as variable outcomes (of performances), they pass into politics, on their way to war.[2] It follows that politics makes itself difficult to talk about, since the analytical frame necessarily becomes a disputed frontier. If the games that matter were comprehensively structured by uncontested explicit rules, they would not be happening at all. There would be no field of contestation, and thus no contest. Loyalty to the game, as such, has become the axis of potential defection. Strategies marked as ‘cheating’ within a commercial context are elaborated, and acquire a very different self-representation, as resistance. To thus identify game-theoretic defection with social solidarity – or collective refusal – demands a thorough re-organization of meaning, and eventually nothing less than a cultural revolution.

§4.33 — A very brief digression into the articulation of politics – which is also the politics of articulacy – imposes itself at this point, beginning with the ‘structural linguistics’ of Ferdinand de Saussure,[3] a theoretical framework which attained a voguish authority over the politicization of commanding cultural institutions during the second half of the 20th century. In the non-STEM fields of the western academy, in particular, the influence of these ideas is difficult to exaggerate. A ‘structure’ in this specific theoretical sense is a system of differences – or significant discriminations – which distributes meaning within a hierarchy of contrasts. Its semantic atoms are produced through relations of reciprocal determination, most commonly represented by ‘binary oppositions’. Within an opposition of ‘A’ and ‘B’, ‘A’ is ‘not-B’ and ‘B’ is ‘not-A’ – and this exhausts their production of meaning, when extended across the concatenated differentiations of the linguistic totality. Saussure insists there are no positive terms. Signs acquire significance only through their distinctions, as these are applied to an intrinsically amorphous world, sub-dividing it into ‘signifieds’ – reciprocally demarcated plots or allotments of meaning.[4]

§4.34 — If we ask – in the degraded ideological mode (alert only to power-oriented, motivated reasoning) – why this type of linguistic theory rose to such an extraordinary position of cultural dominion, suspicion properly falls upon one particular assertive presumption: the arbitrary nature of the sign.[5] As the mantra of choice for a radically-generalized anti-naturalism, this doctrine lays down a welcome mat to politicization, and has thus contributed immensely to the self-consciously Gramscian reconstruction of the western academic humanities and social sciences during the mid- to late-20th century. What was being said about signs in this context and what was being done with them evidently interconnected, but only indirectly. The highly-formalized – and thus conveniently replicable – ideological signaling system that had been put in place by this cultural revolution established the rules of a new game, or rather, submerged those of the old one. The new dispensation was not to be predicated upon the formulation of protocols, but upon the management of faction. Friend-foe identification procedures rapidly attained uncontested authority, invigorated by blind convergence upon the essence of the political (as found in orchestrated in-group / out-group antagonism).[6] The practical economy was impressive, and in fact irresistible. To obstruct the process of identification was to identify oneself (as hostile), and thus to auto-eliminate the obstruction. In the name of an overturning of ‘privilege’ the new order of institutional-cultural meaning had privileged absolute, unchecked (or ‘arbitrary’) political discretion in the last instance, and through retro-projection. There had never (any longer) been anything but politics. The political collectivity alone decides, overcoming its alienation or false-consciousness in the dissolution of objective nature and complementary recognition of its own (naturally and traditionally) untrammeled power of reality-production. 
This commitment defines the Left, in its critically-coherent manifestation. It describes a generalized absorption into politics that the Gramscian (or Left-hegemonic) academy self-consciously facilitates, under the banner of the arbitrary or illimitably contestable sign. Across large swathes of the contemporary academy, such thinking has manifestly triumphed. Academic authority, and even the strictest kinds of academic credentialization, have been increasingly digested by it. Everything that organized intellectual activity touches upon is to be democratically challenged and policed, under the direction of the dominant – mass – faction, though (of course) through its institutional representatives. All values are resolved into ethico-political obligations, and thus submitted to inflamed moral struggle (recognizably post-theistic Protestant in type). The institutional consequences have been starkly evident. Polemic inflates along an axis of raw signal-amplification – which is finally shouting. When Bitcoin secures itself against voice, this is what it effectively succeeds at ignoring.[7] No shouting can conceivably be loud enough to perturb it. Like the crew of Odysseus, bypassing the Sirens, its ears are sealed. A certain strategic desensitization approaches its limit.

§4.35 — An overtly differential – and volatile – polarization of meaning follows upon the politicization of signs, when they are taken up as markers of organized antagonism, functioning as rallying points and signs of aversion, comparable to heraldic devices on a medieval battlefield.[8] The partisan adoption of linguistic signs, as ‘flags’, destabilizes them in peculiar ways. Structural determination is systematically aligned with faction, and subsequently warps in accordance with political vicissitudes. Ideological terms are driven into unusual migrations of meaning, through rapid twists, turns, and complicated zig-zags, which model – non-coincidentally – a dialectical process of development, in which everything becomes its opposite, on the way to absorption into totality. The word ‘liberal’ (and its associates) – for important reasons – provides the most remarkable case, pressed into crazed meanderings across nearly the entire field of political significance, even as this domain asserts itself as all-encompassing. ‘Liberal’ and ‘anti-liberal’ are terms that have evolved – or degenerated – to such a point that they have become near-perfect synonyms, serving only as indications of evanescent factional identification. The meaning of ‘federalism’ has undergone a comparable process of structural devastation. It is natural, then, to expect the signs of pure political polarization to manifest an extreme degree of semantic instability, and this is precisely what we find with the words ‘right’ and ‘left’ (in their political usage).[9] Unsurprisingly, these words are regularly denounced – from all sides – as mere triggers for conflictual group dynamics, and as an invitation to intellectual chaos. The semiotic in-group / out-group rituals of micro-sociology have a far firmer grip on such terminology than that enjoyed by political philosophy.
As is typical of political language, these words have become signals of group belonging, while only very secondarily preserving the capability to designate any definite grounds for the social or cultural categorization in question. Flagged allegiance (‘us’ or ‘not us’) swamps – and drowns – all positive meaning. Insofar as the intrinsic interests of philosophy are concerned, therefore, it is impossible to over-emphatically denounce the cognitive destructiveness of partisan identification. This is the basis for the admirable maxim adopted by the rationalist website LessWrong: “Politics is the mind-killer.”[10] It is also why the very idea of intrinsic philosophical interest has to be systematically derided from the side of politics (typically, as the mask for a hidden or crypto-politics).

§4.36 — If every discussion of money is vulnerable to corruption by politics, politics itself is pure corruption, at least from the classical liberal perspective that is re-animated in crypto-libertarianism (even if there are far more complimentary ways of expressing this point). Politics is the place where language goes to die, sacrificed – by necessity – to ulterior motivation.[11] The evidence of linguistic history could not be more unambiguous in this regard. Whatever is seized with partisan enthusiasm becomes – almost immediately – philosophically unusable. It is proposed here, therefore, that thinking the political spectrum through Bitcoin is an approach with inherently superior prospects to the extant – and almost certainly doomed – alternative of attempting to conceptualize Bitcoin politics through a system of ideological articulations which has already been broken. Such an undertaking can only be impure, which is to say (at least) double. It cannot avoid assuming terms of contention, even while pursuing that which eludes them. There is a game within, and also over – or about – the rules. No simple denunciation ‘from without’ can suffice. Though pontification from a place beyond faction is a tantalizing ideal, it is also a transcendent pretense. We are lost in the world of games. There are no referees. Nothing could be more laughable than the claim to represent the voice of neutrality (only the blockchain – or emergent consensus – can do that).

§4.37 — If there is to be a double game, it has to engage those most inclined to defect in advance – or at least enough of them to sustain credibility as a site of unrigged competition. Bitcoin’s primary lines of absorption, then, are both predictable and comparatively easy to detect. It was along these paths of assimilation that it drew a supportive constituency into the germinal crypto-currency – as designers, miners, speculators, users and promoters – on the basis of pre-existing dispositions. In this regard, Bitcoin politics has been flavored by a number of supportive ideological themes, among which decentralization and deflation are by far the most emphatic. Both of these themes cater initially to the arch-liberal right, represented by classical liberalism, libertarianism and anarcho-capitalism, and affiliated to the right-wing antipolitics of economic autonomization, deregulation, disintermediation, distributed production of security, and – at the limit – algorithmic governance (or local political extinction). The decentered commercium, intrinsically secured against political intervention, is the incarnated ideal of arch-liberal order, and Bitcoin has been seized upon, in very substantial part, due to its conspicuous affinity with this social model. It has been adopted as a path to the realization of apolitical distributed governance, in compliance with the techonomic partial-teleology of a sovereign spontaneous order (oriented inherently to the programmatic dehumanization of power[12]).

§4.371 — Decentralization is a highly-contested ideological term. Its alignment with ‘the right’ is pronounced, but nevertheless controversial. The existence of ‘left libertarian’ factions and affinities is the most obvious indication of its complexity.[13] It requires diagonal (critical) apprehension. As Bitcoin demonstrates, it passes between the global and the local, the integral and the disintegrative, at an oblique angle. A multiplicity (considered as a substantive) draws the same line, which every flat network is pulled onto. To promote decentralization is to multiply, cohesively, without tolerating the arising of super-ordinate nodes of unification with quasi-transcendent functions.

§4.372 — Deflationism, as overt valorization of capital, is less ideologically ambiguous than decentralization. It directly aligns with property, and against politics, by seeking to exempt monetary signs from the domain of discretion. In order to defend money against fiat, its supply is either subjected to systematic constriction in accordance with counter-inflationary policy or, more radically (as with Bitcoin), deleted entirely from the list of policy-sensitive economic variables. Money is thus strengthened in its function of unilateral social command, as a super-political criterion, or economic reality signal cleansed of all interference. It becomes essentially policy-insensitive, against the predominant grain of 20th century political economy. Any crypto-currency with intrinsic deflationary bias is a right-wing revolt against macroeconomics. It takes itself out of political and administrative service, which is to say, in a philosophical register, that it secures its transcendental function relative to the social process.[14] At the level of its real abstraction, property places itself beyond question, through the closure of negotiable issuance. Naturally, precious metals anticipated this socio-political function exactly. Bitcoin’s actual deflationary bias is a strict consequence of its metallist model, which was (of course) selected for precisely this purpose.[15] The great enemy, then, against which Bitcoin explicitly defines itself, is the principle of discretionary money-production, or monetary socio-political dependency. It thus corresponds to an absolute re-commercialization, without possibility of compromise. (Mere possibility, in this context, would already be compromise.) No one is under any illusions about this fundamental orientation. Unsurprisingly, the objections of ‘gold bugs’ to the Bitcoin protocol tend to be technical, tactical, and transient. Their favored asset now has a digital or (superficially) ‘non-physical’ competitor, which flatters by emulation.
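The ‘closure of negotiable issuance’ described above is concretely legible in Bitcoin’s emission schedule. The sketch below uses the actual protocol constants (a 50 BTC initial subsidy, halved every 210,000 blocks, computed in integer satoshis); the function itself is an illustrative helper, not code drawn from any Bitcoin implementation.

```python
# Illustrative sketch of Bitcoin's non-negotiable issuance schedule.
# The constants are the real protocol parameters; the function is a
# hypothetical helper, not code from any Bitcoin client.
HALVING_INTERVAL = 210_000           # blocks between subsidy halvings
INITIAL_SUBSIDY = 50 * 100_000_000   # first block reward, in satoshis

def total_supply_sats() -> int:
    """Sum the block subsidy over every halving epoch until it reaches zero."""
    subsidy, total = INITIAL_SUBSIDY, 0
    while subsidy > 0:
        total += subsidy * HALVING_INTERVAL
        subsidy //= 2  # integer halving, as in the consensus rules
    return total

# The supply asymptote is fixed in advance, beyond any policy discretion:
print(total_supply_sats() / 100_000_000)  # just under 21 million BTC
```

Because issuance is defined by integer halving rather than by a managed target, the ceiling of just under 21 million coins is an arithmetic consequence of the protocol, not a stipulated parameter open to revision.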

§4.3721 — It is no coincidence that ‘Neoliberalism’[16] – the (defeated) counter-revolution against the Keynesian politicized economy – broke into the public sphere with a disinflationary macroeconomic platform. Comparatively ‘hard’ (non-inflationary) money was advanced as a direct object of policy, under the label of ‘monetarism’. While still representing a massive concession to the principle of macroeconomic management, the limited monetarist proposal to subtract discretion from national currency administration was a sufficient departure from the post-war academic-bureaucratic consensus to remain tainted by intellectual scandal. Monetarism threatened the principle of monetary politicization, by removing the inflationary option from the macroeconomic tool-kit, and re-installing monetary integrity as a meta-political axiom. Since what was thus envisaged was a permanent self-binding of political authority by itself, in respect to monetary management, the political incoherence of the project is easily seen. An enduring political hegemony aligned with the monetarist analysis was implicitly presupposed as a condition of monetary stability, grounding money supply – and therefore value – in regime security. The foundations of monetary integrity remained entirely politically conditioned. Clearly, the crisis of economic liberalism – resulting from its formal subsumption into democratic mass politics – had not been significantly reconfigured. Monetary value was still held hostage to a popular vote. Even a gold-standard is grounded in the preservation of political commitment. It rests upon politically-revocable decision. By any reasonable definition of neoliberalism, therefore, Bitcoin is something else (as BitGold already was).

§4.38 — The apprehension of Bitcoin politics as a re-animation of hard-libertarianism is an attractive simplification, reinforced by a great deal of supporting evidence. It remains a simplification, nonetheless. The Bitcoin event – in its full historical expression – overspills all guiding purposes. In this regard, it is closely analogous to the Internet, or indeed – following the nested sequence further out, and in reverse – to the conflicted installation-processes of electronics, electrification, and ultimately self-propelling industrialization as such. Even if (as seems eminently plausible, in each case) the machinery has a radical capitalistic affinity,[17] its development is not susceptible to detailed ideological direction.

[1] From a certain perspective – which is not itself isolable as a ‘left-wing’ or ‘right-wing’ theoretical orientation – there is no right-wing politics. The political, by its own dialectical logic (of organized controversy and reconciliation), can only reach conclusion on the left. Insofar as social process conforms to a structure of argumentation, its sinister destination is assured originarily, as the Left Hegelians were the first – if only implicitly, and not (of course) at all uncontroversially – to distinctly comprehend. For the right, agreement to pursue the social argument is already to lose it. The position of the right, most definitively in the case of established property distributions, is that there is nothing to discuss. The absence of any opening for discussion is what property rights ultimately mean. Whatever is unable to escape argument remains unsettled, framed by an implicit re-distribution which merely awaits actualization upon the collective horizon that is defined by the final scope of controversy or – restated exactly – the political totality. Under such conditions, which are conceived by the left as universal, private property is only ever held on trust, as a contingency of revisable political arrangements (or a debatable social allocation). Controversy – even ideological controversy – falls short of the difference referenced by autonomy. From the side of privacy, there cannot be any encompassing negotiation – or ‘dialectical relation’ – between the public and the private. Privacy, private interest, cannot be protected – or even rigorously articulated – within a sovereign public forum. As soon as it makes a case for itself through political argument, or ideological commitment, which could only ever be ambiguous, it submits to the final authority of the conversation, and thus to the collective.
Its sole defense lies in exemption from politically-significant social discussion, an avoidance that is not (merely) anti-dialectical, but positively exo-dialectical, requiring something other than opposition. Non-dialectics is not a higher level of argument, but an escape from the oppositional relation (into Outsideness). This exit from the political sphere is concretely instantiated by hard crypto-security. Bitcoin epitomizes such a meta-strategy. (Hence the extremity of its political-economic provocation.)

[2] Whatever the positive semantic associations accumulated by the word ‘war’, its most rigorous meaning is negative. War is conflict without significant constraint. As a game, it corresponds to the condition of unbounded defection, or trustlessness without limit. This is the Hobbesian understanding implicit in the phrase “war of all against all” (bellum omnium contra omnes), in which “the state of nature” is conceived – again negatively – through a notional subtraction of limitation. Treachery, in its game-theoretic sense, is not a minor theme within war, but a horizon to which war tends – the annihilation of all agreement. Reciprocally-excited mutual betrayal in departure from an implicit ‘common humanity’ is its teleological essence. It is worth emphasizing this point, in the interest of conceptual integration. The game-theoretic definition of mutual military escalation – and thus the inner principle (and intrinsic motor) of war – is reiterated double-defection. This is a conclusion explicitly rejected by Carl von Clausewitz in his treatise On War, even as he acknowledges the cybernetic inclination to amplification (or “tendency to a limit”) which drives it in the direction of an absolute. “War is the continuation of politics by other means,” he insists, because it is framed by negotiation (book-ended by a declaration of war, and a peace treaty). According to this conception, it is an interlude of disagreement, which nevertheless remains irreducibly communicative, and fundamentally structured by the decisions of sovereign political agencies. Even as it approaches its pole of ultimate extremity, it never escapes its teleological dependency, as a means (or instrument) of rational statecraft. There is a stark arbitrariness to this assumption. If there is an inherent limitation to military escalation, restricting it within the bounds of political direction, Clausewitz never explains its principle. The reduction of war to instrumentality is not immune to criticism.
Philosophical radicalization, alone, suffices to release war from its conceptual determination as ‘the game of princes’. Sovereignty has military, rather than political, foundations. Hence the reversal of dependency, which is captured by Michel Foucault’s notorious inversion of the Clausewitzean formula into the maxim: “Politics is war by other means”. If political sovereignty is ultimately conditioned by the capability to prevail upon the battlefield, the norms of war can have no higher tribunal than military accomplishment, in reality. No real authority can transcend survival, or survive a sufficiently radical defeat. There is thus a final incoherence to any convinced appeal to the ‘laws of war’ (when inclined to the objective, rather than subjective, genitive). Quite simply, unless war restrains itself it is not restrained. Nothing is able to transcendently impose upon it. Any realistic conception of ‘limited war’ subsumes that of ‘war lawfully pursued’ (with the latter properly categorized as an elective limitation). Unless war has a master, ‘just war’ is ontologically restricted to the status of a tactic. War has no master. There is strictly nothing that could be asserted with greater confidence. “War is the Father of all things,” (Πόλεμος πάντων μὲν πατήρ) Heraclitus asserts at the dawn of philosophy. Cormac McCarthy’s Judge Holden is more succinct still: “War is God.” If it has ever seemed otherwise, it is because the essence of war – as Sunzi (among others) has told us – is deception.
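The identification of war’s inner principle with reiterated double-defection can be restated in minimal game-theoretic terms. In the sketch below (the payoff values are conventional textbook choices, assumed for illustration rather than taken from the text), defection is each player’s best reply whatever the opponent does, so double-defection is the unique equilibrium of the unrepeated game.

```python
# Minimal one-shot prisoner's dilemma, illustrating why double-defection
# is the equilibrium of unconstrained, trustless interaction.
# Payoff values are conventional illustrative choices (not from the text).
PAYOFF = {  # (row move, column move) -> (row payoff, column payoff)
    ("C", "C"): (3, 3),  # mutual cooperation
    ("C", "D"): (0, 5),  # row betrayed
    ("D", "C"): (5, 0),  # row betrays
    ("D", "D"): (1, 1),  # mutual defection
}

def best_reply(opponent_move: str) -> str:
    """Row player's payoff-maximizing reply to a fixed opponent move."""
    return max("CD", key=lambda m: PAYOFF[(m, opponent_move)][0])

# Defection strictly dominates: it is the best reply to either move,
# so (D, D) is the unique equilibrium of the one-shot game.
assert best_reply("C") == "D" and best_reply("D") == "D"
```

Cooperation only becomes strategically available when the game is repeated under credible expectations of future interaction; war, as limit-condition, can be read as the subtraction of exactly that expectation.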

[3] Saussure’s Course in General Linguistics (1916) was compiled by his colleagues Charles Bally and Albert Sechehaye from lecture notes taken by his students. Its historical coincidence with the origin of macroeconomics merits explicit note. The United States Federal Reserve System, established on December 23, 1913, had scarcely begun operating when the Course appeared, with its doctrine of the arbitrary nature of the sign. In both cases, a spontaneous order falls under a commanding narrative of discretion, and management of value. The directing impulse – in modern no less than ancient parlance – is Promethean. The resonances between the new monetary order and structural semiotics are not limited to their implicit managerial orientation. They proceed in concert to the horizon of collective-volitional anti-naturalism (dismissing all intrinsic assets and positive terms). It would not be reckless to predict that the theoretical foundations of the macroeconomic epoch will be retrospectively understood as monetary structuralism (and – in its deutero-decadence – ‘post-structuralism’).

[4] Grasped in terms of its abstract principle, Saussurean ‘structuralism’ (i.e. semantic constructivism) is the dominant paradigm – or meta-paradigm – governing the conceptual order of the postmodern western academy. ‘Post-structuralism’ has only deepened its dominion. Because its programmatic anti-naturalism dovetails so perfectly with the institutional task of defensive demarcation, over against the ascendant hard sciences, it has acquired a super-biblical authority within the modern humanities and soft ‘sciences’. This enterprise of demarcation is no younger than the modern university itself. It acquires 18th century articulation as a conflict of the faculties. The extraordinary importance of structuralism as a cultural influence thus descends from its (implicit) status as a vulgarization of transcendental philosophy, facilitating the general application of critique across the entire domain of humanistic study. By theoretically severing the production of articulate meaning (phenomenon) from an inaccessible – formless – reality (or “thing-in-itself”), it established constructivist principles applicable to the general study of signs, sparking a ‘semiological’ revolution of extraordinary influence. De-naturalization became a cognitive reflex, and a matter of zealous dogma. The ‘natural’ – as that which eludes political adjudication – is identified as the mark and mask of conservative obstruction. Revolutionary denaturalization approaches tautology. In its most rigorous formulation, systematized by Jacques Derrida under the flag of ‘deconstruction’, critical anti-naturalism proceeds through a series of methodical steps. 
It first identifies a privileged norm (metaphysical object) within the transcendental matrix of semantically-productive differences, before successively inverting the conceptual hierarchy (through the dialectical performance of antinomy), and then liquidating it under the evanescent label of a non-representable absolute condition of production (which never bears this name). Critique is thus regenerated as pure schematics of cultural subversion. Notably, it is weaponized primarily against structures of conceptual articulation, and only very derivatively against their (hazily envisaged) institutional manifestations, such as ‘patriarchy’ or ‘neocolonialism’. Deconstruction is thus to be contrasted against more thoroughly-materialized tendencies of critique, in which the supreme instance of social metaphysics is stubbornly identified as the state, and the institutionalization of ‘trusted third parties’ in general. When critique is subjected to academic professionalization as a bureaucratically-regularized deconstructive practice, the state ceases to be its target, and becomes its occult-transcendental agent. Fortunately, the subordination of critique to the agenda of what Schopenhauer describes bitingly as ‘university philosophy’ has been far from complete, and increasingly looks uncompletable. As it advances, it incentivizes route-arounds. This opportunity has already begun to be seized.

[5] The ‘arbitrary nature of the sign’ is treated at once as a sacred and scientific principle by the descendants of the structuralist revolution. The idea is, however, properly understood as a dogma. It is not argued, but assumed. The most firmly-grounded line of criticism is directed towards the hyperbolic formalism it exemplifies. In dissent, it might be noted that the phonic and graphic materials employed for linguistic encoding are no more ‘arbitrary’ than comparable monetary media. As we have seen, there has been nothing arbitrary about the selection of precious metals as money. Rather, they were advantaged by definite, positive (non-formal) qualities. Linguistic signs are no different in this – abstract – respect. For instance, they need to be compact, memorable, and executable, as well as distinctive. They are often positively suggestive, in ways partially captured by the over-stressed term ‘onomatopoeia’. Much basic vocabulary seems to be genetically promoted (i.e. to some remarkable degree ‘innate’). Stutter stutters. Slithering is self-decoding. The non-arbitrariness of the sign is typical. Nothing much beyond ideological convenience says otherwise. Structuralist methods, then, are not discoveries, but motivated impositions. They suppress ‘positive terms’ as a matter of principle. Obdurate substantiality of the semiotic medium is frictional. It offers impedance. It gets in the way of something. The anti-structuralist criticism is thus drawn into a more symptomatological mode of suspicion, as soon as it is asked: What is being obstructed? What is the plan? Social construction is the discursive complement to fiat currency. In both cases, the politicization of values is untethered from natural constraint (with all reference of the former to the latter denounced as pre-critical error). What the figure of the ‘gold-bug’ is to the Keynesian financial establishment, the ‘naturalist’ is to the social-constructivist critic. 
In each case, there is a (supposed) fetishization of a ‘barbarous relic’, eclipsing a process of collective, cultural-institutional fabrication. Within the overtly dominant strand of the critical tradition, what occurs is a definite triumph of the will. The thing-in-itself is implicitly identified as practical substance, manifesting the world out of mass volition. Contextualized in this way, all objectivity is cast as essentially modifiable, in accordance with political decision. ‘Nature’ is denounced as the mystified representation of alienated collective desire. This is sheer Schopenhauer, in an ironical, communistic configuration. The deep structure of critique has pre-programmed it, by setting the topic for a multi-century project of institutional absorption (whether conceived as a process of statist-Hegelian formation applied to critical-Schopenhauerean matter, or as a political-institutional interiorization of the thing-in-itself). There are, however, definite indications that the zenith of this assimilation process has passed, among which the emergence of Bitcoin – entirely outside it – is exceptionally striking. The dissolution of precious metals into a bureaucratic apparatus of macroeconomic expertise was not supposed to end this way, with techno-synthetic money bypassing the state and its orbital institutions. The conversion of money into a tool of centralized social administration led to another money. “We need a better barbarous relic,” the crypto-anarchist whispers had long insinuated, but they had been pitched-down below the threshold of public audibility. According to all accredited perception, ‘improvement’ had been heading in an entirely different direction. That is why Bitcoin performs, at once, a capitalist comedy and a socialist tragedy, attained through a surprising reversal of apparently-settled fortune. The mediation of naturalism through traditional cultural institutions can expect similar (or more generalized) relegation. 
Decentralized commercial genomics would be one relevant field to watch.


[6] The point is Schmittian, but the Left has been far more adept at assimilating it than the Right.

[7] “How did we get from the academy to the economy?” it might be asked. The social function of academically-credentialized expertise does that, when it operates within an institutional context of super-economic oversight. Once again, in the great scheme, macroeconomics is the bridge. As a central trust-management institution (historically supplanting the role of the church), the university is clearly situated in the crypto-currency flight-path. 

[8] For an illustration of tribal polarization in the realm of political signs, it would be hard to improve upon the example identified by this pseudonymous Internet comment, on the topic of the ‘Obamaphone’: “Do you find it entirely implausible that the program was promoted under the Obama administration, that it being Obama’s administration doing the promotion helped it go viral in the recipient community, and that all the reasons it went viral as a positive thing for the recipients made it go viral as a negative thing for conservatives? … There’s no distortion, no lying. What one side sees as a good thing is PRECISELY what the other side sees as a bad thing.”

[9] The Left / Right distinction is regularly derided as a mere anachronism, without consistent meaning beyond its historical reference to ‘the seating arrangements of the pre-revolutionary French National Assembly’. Relatedly, the notion of a ‘modern right’ is easily dismissed as a compact self-contradiction. Yet, insofar as modernity has exhibited internal ideological polarization – rather than merely incarnating the triumph of the left – some such distinction is inescapable. The intrinsic arbitrariness of the left-right terminology is in this respect an opening for productive abstraction. It is proposed here that ‘leftwardness’ on the principal political dimension is measured by the degree of attachment to a presumption of altruism, typically identified with the possibility of a comparatively unproblematic political expression of collective purpose. The ‘right’ is thus naturally identified with a constitutive cynicism. Given its apparent reference to definite incentives, the Marxian (or ‘scientific socialist’) conception of ‘class interest’ might be taken as a repudiation of altruistic presumption from the left. In this – crucial – case, however, the perfect coincidence of individual and collective interest attributed to the universalized – and thus simply ‘human’ – classless masses that have descended from capitalism’s industrial proletariat serves as the proxy for an altruistic orientation, structurally indistinguishable from that traditionally promoted in a more unapologetically sentimental and religious vein. The expectation that a history of industrial cooperation will have fully educated mankind out of all narrow self-interest is, at the very least, a bold one. Dreams of a ‘new man’ have been consistently disappointed, however elegantly they have been justified by socialist philosophy, within the historical context of concrete initiatives promoting collectivist economic reconstruction. 
Predictably, socialist management authorities have relapsed into mafia organizations, as resilient private incentive-structures re-assert themselves. There is much truth – and in fact nothing less than a robust conservation law – in the leftist moral accusation that the larval condition of capital is crime. The game of private interest can be trained by accommodative legal structures. This is what ‘rule of law’ realistically means. The principle of conservation, which adopts criminality as a back-up reservoir, ensures – nonetheless – that fragmented self-interest proves predictably invulnerable to legal abolition.

[10] The phrase is an adaptation from Frank Herbert’s science fiction novel Dune, in which the Bene Gesserit say: “Fear is the mind-killer.” The core text at LessWrong (written by Eliezer Yudkowsky) begins: “People go funny in the head when talking about politics. The evolutionary reasons for this are so obvious as to be worth belaboring: In the ancestral environment, politics was a matter of life and death. And sex, and wealth, and allies, and reputation … When, today, you get into an argument about whether ‘we’ ought to raise the minimum wage, you’re executing adaptations for an ancestral environment where being on the wrong side of the argument could get you killed. Being on the right side of the argument could let you kill your hated rival!” As the discussion thread to this post soon demonstrates, the denigration of partisan signaling is itself readily dismissed as partisan semiotic maneuvering at a meta-level – perhaps an attempted pacification of language in the covert interest of the status quo. The politics of depoliticization is not aufgehoben. There is no short-cut leading off the long road. (Depoliticization requires machinery.)

[11] When Modernity is apprehended under the aspect of political history, it exhibits a tendency to the limit, at which point honest public pronouncement is a transcendental impossibility. This is widely – and cynically – recognized. Nobody describing a discussion as having ‘become political’ is thereby suggesting that its commitment to principles of epistemic rigor has been deepened. ‘Politicized science’ – for instance – is quite simply no longer science, and this is exactly the sense in which the expression is used. Whether epistemological, or even merely administrative, the political process is strictly indifferent to effective performance, unless this is itself folded into a calculation of partisan advantage. What use is truth or competence to us? Schumpeter peerlessly elucidates the concrete mechanism as it has operated in modern times in his Capitalism, Socialism and Democracy.

[12] A structural suspicion of, or impatience with, human limitation – conceived as a tragic propensity to subvert the foundations of liberty – is a trait connecting libertarians and transhumanists. It has been an explicit theme within the American experiment since its origin, initially elaborated in the discussions that immediately preceded the framing of the Constitution. The Founders achieved consensus on the principle that human agency, especially when invested with social authority, requires immanent structural constraint. It needs to be checked, and balanced. The great complementary dangers are monarchical and popular discretion. No regime is truly republican unless it protects itself, equally, against both. Thus the essential reference of power to a meta-political protocol, or constitution, whose independence (from the political process) strictly coincides with geopolitical independence of the republican polity itself. It is the constitution, and no longer – immediately – the demos, that defines the subject of security. Survival of the people is downstream from that of the polity. Suspension of the political within a war of independence is perhaps the necessary condition-of-articulation for such a republican order. The Dutch example, no less than the American, suggests so. Liberty conserves itself in, and by way of, a practical inhumanity. Primatologists, no less than parents, understand that ‘fairness’ is the typical stake in a squabble. Nowhere is the primate heritage of the human animal more obviously displayed than in the political sphere. Homo sapiens has no uniqueness as ζῷον πολιτικόν (“the political animal”), a characterization which extends beyond hominids, and social primates, to social animals in general. Except for those convinced of the fundamental nobility of the chimpanzee, there is no compelling case for investing this particular aspect of human existence with any extraordinary dignity.
Even less persuasive is the attempt to align it with some effective egalitarian implication. Machiavellian incompetence is no less socially-disabling than its commercial or industrial counterparts, however strongly attached intellectuals may be, typically, to the contrary hypothesis. Taking games into the wild tends only to sharpen their consequences. It is only from within the squabble that the squabble over fairness could appear neutrally leveling. Rather, to the extent that it sorts hierarchically, it is by the criterion of competence at squabbling, which is to say: by political talent. Admiration for political talent is, at least implicitly, the essential characteristic of the left, and the enemies of the left themselves become leftist to the extent that they emulate it. There is no leftist hero, in the entire history of the world, who was not a political leader. Besides musicians, it is only the Machiavellian manipulator who makes it onto student T-shirts. Within the leftist camp, this valorization of the squabbler-king appears so perfectly natural as to escape explicit notice. It is natural (in fact zoological, primatological, and anthropological). As a consequence, the right is guided inevitably in an antihumanist direction, typically against its own explicit (and romantic) inclinations. It has to cut the controversy to advance by even the slightest step. … Technological Singularity …

[13] Left libertarianism has a relevance to this question that is at least double. Firstly, it extends suspicion of government to coagulations of private (economic) power. Secondly, it places ‘the left’ within a lineage of socio-political dissent opposed, originally, to the structures of concentrated authority represented by the institutional pillars of the European ancien régime – monarchy, aristocracy, and clerisy. The meaning of feudalism in this context is not at all straightforward, since it can be aligned either with centralization, or with decentralization, to some significant level of plausibility. In any case, historical criticism of a broadly Marxian type is not easily avoidable here. The affinity between the left and an ideal of centralized authority is not historically constant, but tends rather to rise in comparatively strict relation to the social influence of capital (against which it counter-balances). Consistent orientation to decentralization – conceived here as the absolute right – challenges the libertarian left to embrace a cold indifference to its consequences. Such unconditional commitment to disintegration can, no doubt, be turned against any actual configuration of the ‘right’ – and will be. Considerations of regime security necessarily lean against it. Nevertheless, the drivers of disintegration – the Internet most prominently – are gathering such momentum that dilemmas of this kind are unlikely to long be avoidable. The registration of fragmentation in-itself as a cause, irrespective of the ideological factions it divides, marks an escalation of critique, or accommodation to the transcendental. Sovereign multiplicity serves nothing beyond itself. The erosion of extraneous agendas, therefore, becomes a critical (‘accelerationist’) historical symptom. Division escapes the relativity of faction to operate as an absolute productive power. Ideological self-comprehension is strictly secondary.

[14] A fully self-secured transcendental commercial medium (and store of value) is the nightmare that the Left, in its ‘scientific’ manifestation, has envisaged from its beginning. If there has been a diagnostic or analytical error from this party, in this respect, it has been rooted in the premature attribution of such an autonomous power to prior, radically insecure, socio-politically dependent property formations. ‘Property rights’ already imply insecure (non-autonomous) property. Autonomous Capital, however, is a technical experiment (or synthesis) that defies all merely speculative anticipation. There can be no definite idea of the way it can be done, prior to its being done. As with Intellectual Intuition – non-coincidentally – its conception and realization are indissociable. ‘Property’ is not an invariant conceptual category, available for deployment within historical analysis, but rather the critical variable itself. The theoretical inversion required to make the essence of property a constant conforms to the pattern of historical dependency, which is equivalent to the marketing of innovation. The actual is solicited by the virtual, in terms that appear to ensure continuity, and even deference to established modes of existence. “This is what you need,” sells. “This is what you’re becoming, or being swapped-out for,” really doesn’t.

[15] Nick Szabo is unambiguous on this point in his discussion of Bitcoin-precursor ‘Bit Gold’: “In summary, all money mankind has ever used has been insecure in one way or another. This insecurity has been manifested in a wide variety of ways, from counterfeiting to theft, but the most pernicious of which has probably been inflation.”

[16] ‘Neoliberalism’ is a word that is easy for those outside leftist intellectual micro-cultures to laugh at. Comparatively rare attempts to specify its meaning have, in general, been primarily comical. When applied with some theoretical consistency, the term refers to a failure of liberalism under the sign of liberalism, or a restoration of markets under government direction. In this respect, it denotes a paradoxical authoritarian liberalism – perhaps speculatively extrapolated into ‘anarcho-fascism’ – which conforms to Marxian expectations that (bourgeois) freedom presupposes coercion, as its condition of existence. It is notable that the horror of deflation, rooted in the left-managerial (and administratively macroeconomic) historical interpretation of the Great Depression, has retained its overwhelming cultural hegemony. Whatever is made of ‘neoliberalism’ or – perhaps more narrowly – the ‘supply-side revolution’ beginning in the early 1980s, it signally failed to shift the terms of public debate to any significant degree in this respect. It has taken Bitcoin to achieve practically, in the philosophy of monetary management, what market-oriented political-economic ideology failed to achieve. Neoliberalism was still politics, so it wasn’t the solution to the problem it identified. A mere politics of depoliticization falls prey to its own contradictions, as its leftist critics have gleefully – but not unrealistically – noted. A techonomic mechanism of depoliticization presents the Keynesian economic-administrative establishment with a far more formidable antagonist.

[17] For criticism from the Left along these lines, see David Golumbia’s ‘Bitcoin as Politics: Distributed Right-Wing Extremism’ (here compressed into a quote-stream for effect): “Bitcoin can be seen as a technical object that is structured to an unusual extent by politics. … Bitcoin is politics masquerading as technology. … Bitcoin itself is now promoted by banks, investors, and venture capitalists. … The lack of any valid, non-conspiratorial analysis of our existing financial systems means that Bitcoin fails to embody any substantial alternative to them. … [E]nthusiasts demand we understand Bitcoin as a welcome political intervention, but when pressed for details about that political intervention, its advocates unfailingly turn back to technical and engineering matters. … [In] December 2013, half of all Bitcoins were owned by approximately 927 people (such fight-the-power revolutionaries as the Winklevoss twins of Facebook infamy among them). … If Bitcoin becomes regulated enough to serve as a stable store of value and to ensure debacles like Mt. Gox don’t happen in the future, it may be useful as a global system of payments (but so are generally non-transformative technologies like PayPal and Dwolla). But that will hardly shake world political structures at their foundations. If it remains outside of all forms of both value and transactional regulation, Bitcoin will continue to be a very dangerous place for any but the most risk-tolerant among us (i.e., the very wealthy, whose interest in Bitcoin should indicate to advocates how and why it cannot be economically transformative) to put our hard-earned money. … The problem with ‘fiat currency’ is value fluctuation. The most dangerous kind of value fluctuation is the deflationary spiral… Which world currency is currently experiencing among the most dramatic deflationary spirals anyone has ever seen? 
Bitcoin itself, the ‘existential threat to the liberal nation state.’ … [The] problems with currencies actually aren’t formal, or mechanical, or algorithmic, despite what Bitcoin propagandists desperately want us to believe. They are social and political problems that can only be solved by political mechanisms.” – A cry of pain, then. We shall hear many more of them.


Crypto-Current (041)

§4.2 — War games are built into the fabric of the Internet. This is at once a matter of uncontested genealogy, and of an as-yet only very partially explored transcendental-strategic landscape.[1] As we have seen, according to one (comparatively mathematicized) formal meta-description, Bitcoin arose as the solution to such a game – the Byzantine Generals’ Problem. This immediate context is so intimately tied to the achievement of the Bitcoin Protocol, by those most closely associated with its formulation, that it has been widely adopted as a definition.[2] Yet even if the solution to Byzantine coordination establishes the game theoretical significance of Bitcoin, it does not exhaust it, even remotely.

§4.21 — Bitcoin is both less than, and more than, a mathematical theorem, because it remains a game in process, and also a meta-game. There is an irreducible informality to Nakamoto Consensus, insofar as it remains open, or unsettled, at multiple levels. As a concrete procedure, it effectively invokes a sociotechnical process of uncertain destiny within its demonstration, making it ill-suited to the purposes of mathematical proof.[3] If the mining procedure – rather than the reward criterion – could be fully specified in advance, and thus support predictive deductions, it would do no work. Incentivization – in every case – presumes non-deducible outcomes. Bitcoin, like all incentive systems, is a synthesizer. It produces a social process, as an event, and an arena (or agora), and thus advances experimental game theory, through an artificial environment especially conducive to the emergence of spontaneous (‘trustless’) coordination. Concretely, this space is a hothouse for business innovation, which constitutes the leading – and perhaps still ‘bleeding’ – edge of microeconomics, where generalized theory and practical enterprise have yet to dissociate. The boundaries of the Protocol, while strictly defined in certain respects, are profoundly unsettled in many others, and there is no strongly economical way to settle them. ‘Where does it end?’ is a question that has to be explored historically, without conceptual short-cuts, by an irreducible synthetic process. It is thus roughly modeled by the Bitcoin mining procedure, where the ineluctable necessity of trial-and-error – or uncompressible method – precludes all possibility of rapid philosophical (i.e. purely conceptual) resolution. Bitcoin is a game, and is like history, in that it cannot be worked out without being actually played – or hashed.

§4.22 — Real games are far-from-equilibrium processes that approach formality without actualizing it. They consume freedom – by contracting discretion – with every move that is made, and prolong themselves by reproducing it, in a circuit. Only insofar as this holds do they include incentives, as an irreducible teleological element. The open-ended mechanization of purposes is the diagonal along which they proceed. When apprehended at sufficient scale, this process is equivalent to industrialization. With the arrival of Bitcoin, money is – for the first time – subsumed into industrial revolution. A great historical circuit is cybernetically closed (which does not mean finished, but something closer to the opposite, i.e. initiated). Techonomic fusion – the singularity guiding modernity’s convergent wave – can for the first time be retrospectively identified. On Halloween 2008, the end began. What modernity has been from the start was then sealed.

§4.23 — Friedrich Nietzsche’s On the Genealogy of Morals dedicates itself to describing how man became “an animal with the right to make promises”. The story has turned out to be even longer and more intricate than his work anticipated, but the quasi-paradox there explored, knotted into the concept of debt, retains its pertinence into our time. How is a free commitment possible? Bitcoin attends explicitly to the same problem. “Transactions that are computationally impractical to reverse” – of the kind Bitcoin facilitates – constitute voluntarily-adopted mechanized commitments, immunized against all vicissitudes of will. Since algorithmic irreversibility enables an inability (or disables an ability), there is much here that seems self-contradictory upon superficial consideration.[4] Yet such a facility – or, indeed, power – of self-limitation is already fully implicit in the word ‘bond’, and in any serious sense of commitment. A contract is an expenditure of liberty. The motto on the coat of arms of the London Stock Exchange, Dictum Meum Pactum (‘My Word is My Bond’), extends the principle – by etymological suggestion – to the most elementary cases of formalized social association (‘pacts’). Society is a game, which arises from its ragged edges. The deal describes the frontier.
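The irreversibility at stake can be illustrated computationally. The following is a minimal toy sketch – not Bitcoin’s actual data structures, and with all names (`chain`, `verify`, the transaction strings) invented for illustration – showing the elementary principle: when each record is bound to the cryptographic digest of its predecessor, any retroactive alteration breaks every subsequent link, so commitment is enforced by mechanism rather than by will.

```python
import hashlib

def chain(transactions):
    """Build a toy ledger: each block stores (tx, previous digest, own digest)."""
    blocks, prev = [], "0" * 64
    for tx in transactions:
        digest = hashlib.sha256((prev + tx).encode()).hexdigest()
        blocks.append((tx, prev, digest))
        prev = digest
    return blocks

def verify(blocks):
    """Recompute every link; any rewritten history breaks the chain."""
    prev = "0" * 64
    for tx, recorded_prev, digest in blocks:
        recomputed = hashlib.sha256((prev + tx).encode()).hexdigest()
        if recorded_prev != prev or recomputed != digest:
            return False
        prev = digest
    return True

ledger = chain(["alice->bob:5", "bob->carol:2"])
assert verify(ledger)

# Rewriting the first transaction leaves the stored digests stale.
tampered = [("alice->bob:50",) + ledger[0][1:]] + ledger[1:]
assert not verify(tampered)
```

In Bitcoin proper the cost of rewriting is raised far higher still, since each link also embeds proof-of-work; but the voluntary, mechanized disabling of reversal is already visible in this reduced form.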

§4.24 — During a ‘Fireside Chat’ on ‘Bitcoin and the Future of Payments Technology’[5] Larry Summers makes exactly the same point:

This is an area that I think is rich with irony. … the single most important development in the history of the common law corporation was when the legal principle that it could be sued was established. And you might ask: why was it good to be sued? Well, because if you can’t be sued you can’t enter into a binding contract, and only when you could enter into a binding contract could you carry on commerce in a major way.

§4.25 — Bitcoin subtracts the option to defect (or double spend). The protocol sets the rules of a new game, in which the violation of contract ceases to be a permissible ‘move’. By automatizing this constraint, and thus withdrawing it simultaneously from the realms of contractual agency and regulatory oversight, Bitcoin instantiates algorithmic governance in its own, specific domain. Human discretion is displaced to the boundary of the Bitcoin commercium, into the zones of meta-decision where economic agents and authorities (respectively) determine whether to enter, or to permit, Bitcoin. These dilemmas introduce a knot of complex and typically highly-recursive games that can be grouped under the umbrella term ‘Bitcoin politics’.

[1] A ‘transcendental-strategic landscape’ – constituted by an absence of transcendent legality – corresponds to the concrete problem of anarchism, in the sense this term is understood by realist international relations theory (IRT), and realist strategic analyses more widely. That is to say, it poses issues of security without any possibility of appeal to superordinate authorities (or authoritative referees). Hobbesian political theory, in which “the war of all against all” is exposed by a secular ‘Death of God’, establishes itself upon a negative foundation. Leviathan begins from that which cannot be relied upon. Whether domestically, or internationally, the transcendental (i.e. ultimate) theater in which powers meet is defined by the subtraction of any original commanding unity. Security is thus theoretically constituted as a problem, corresponding to a primordial lacuna. Since it is not given, it has to be positively produced, and it is in the identification of this practical conundrum that IRT isolates its proper object of study. On the Internet, as in the international arena, it is only upon such a cleared, immanent plane, that a true game can take place. It cannot be sufficiently stressed that the conflictual field is not – as its critics have over the centuries necessarily insisted – a positive presupposition, but rather a mere default, assuming only original diversity under the conditions of an absent integral authority. Despite its manifest tendency to decay into a Utopian projection, the perpetually-regenerated credibility of anarchism is founded not upon its transcendent aspiration, but upon its transcendental problematic. Given only war, how is coordination possible?

[2] While the definition of Bitcoin as a solution to the Byzantine Generals’ Problem remains controversial, the principal objections to this description can reasonably be described as arcane. As Oleg Andreev notes, in a brief but valuable discussion of the proof-of-work solution, any actual production of communications integrity is compromised in its logical purity by practical limits (bounded by cryptographic intractability). In other words, precisely because it is transcendental, Nakamoto Consensus cannot be transcended even by its own proof. The limit is set by the working machine. This is a matter of extreme generality. While persistently – and even essentially – tempted by Platonism as a heuristic, mathematical procedures require instantiations which are transcended only in conceptual principle, which is to say: hypostatically, through appeal to transcendent grounds whose authority is purely ceremonial. Compelling demonstration already returns the problem to the machine, and its real potentials for effective execution. Operationalizations are not, in reality, superseded, or subordinated, but only (at most, and typically) bracketed, or abbreviated, and thus – again, in reality – assumed. The credibility of the Idea refers to potential demonstration. The keystone of proof says nothing else. Untested trust is an oxymoron. It would be a grave error – though an all-too common one – to seek an epistemological demotion of ‘credibility’ to the psychological category of ‘mere opinion’ while admitting this. Credibility is basic. Without it, no truth has been earned. This is the meaning of deduction in its critical and realistic sense. What lies beyond is metaphysics, enthroned upon arbitrary assertion. Irrespective of any extravagantly-promised protections, there is no confidence – no security – to be derived from that. However much Bitcoin has to appear as an Idea, therefore, it is irreducible to one. 
It cannot be expected that this stubborn factuality is susceptible to comprehensive dissolution into the form of the concept, still less that it will be fully factored into a security analysis. On the contrary, realism predicts its chronic idealization (i.e. misidentification). In this respect, philosophy is a security hole (proposing answers in place of solutions, or dispelling threats only in ideality), if not – in its institutional form – a particularly serious one. … Since insecurity has no adequate idea, it cannot be speculatively resolved. This point of elementary realism calibrates the appropriate level for confidence in philosophy (and does so in actuality, not only in principle). Philosophy is not seriously entrusted with keeping anything safe. Its invitation to live dangerously is – in this respect – a sensible concession to the inevitable. The untested or – still worse – untestable model need not be about danger to be dangerous. Armchairs are places where things can go wrong without limit. … The Byzantine Generals do not secure themselves through a speculative philosophy, but through a robust procedure. Did they have a ‘good plan’ before testing it? (It could, at most, only appear so.) … Security concerns only risk, which is never merely a conceptual possibility, but always a matter of discovery. The fact that Bitcoin appears to be a ‘sound idea’ is not finally separable from its concretely-elaborated existence as the most rigorously-tested trust mechanism in the history of the earth. …

Ian Grigg argues that the classic coordination problem has been displaced, into the far more protean quandaries of a ‘dynamic membership set’. Critical Bitcoin security challenges, most specifically that of the Sybil attack (based upon identity proliferation), entirely exceed the horizon of the BGP. “If Bitcoin solved the Byzantine Generals Problem, it did it by shifting the goal posts.”

[3] A machine with integral incentives necessarily combines formal – or formalizable – and informal elements. To a still-imperfect approximation, but with definite teleological inclination, Bitcoin is politically closed, while commercially and industrially open. In this respect it echoes – and even escalates – the ideal of the arch-liberal (capitalist) social order. The mining objective is exactly specified. The criterion for mining success, compensated automatically in bitcoins, is a hash of the current (problem) block – generated by varying a nonce – that begins with a definite number of zeroes (a figure adjusted for difficulty). Despite this extreme formality, the mining procedure involves both chance and – more importantly – innovation. Bitcoin hashing is formally constrained to trial-and-error methods, with probabilistic outcome. In the words of the Bitcoin wiki: “The probability of calculating a hash that starts with many zeros is very low, therefore many attempts must be made.” Everything beyond the product specifications (the puzzle solution) is left open. In particular, the production techniques are left undetermined, and thus open to industrial innovation. See:
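The trial-and-error search described above can be sketched in a few lines of Python. The simplification should be flagged loudly: real Bitcoin mining double-hashes an 80-byte block header with SHA-256 and compares the result against a 256-bit target, whereas this toy counts leading zero hex digits; the `header` string and encoding are illustrative assumptions. What the sketch preserves is the essential asymmetry: no shortcut to a winning nonce is known, but any claimed solution is verified in a single hash.

```python
import hashlib

def mine(header, difficulty):
    """Search for a nonce whose hash starts with `difficulty` zero hex digits.

    Toy model of proof-of-work: the search is blind trial and error,
    while verification of the result is a single hash computation.
    """
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{header}:{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce, digest
        nonce += 1  # no known method beats exhaustive search

# Each extra zero digit multiplies the expected work by 16.
nonce, digest = mine("block-data", difficulty=4)
assert digest.startswith("0000")
```

Raising `difficulty` by one hex digit multiplies expected attempts sixteenfold, which is the formal lever behind the ‘figure adjusted for difficulty’ mentioned above.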

Similarly, and even more markedly, the commercial opportunities opened by the protocol are uncircumscribed. The ‘value’ of the Bitcoin currency, in the broadest sense, is settled dynamically outside the blockchain, through a radically decentralized and uncomputably complex dynamic of exchange. (The exchange process – catallaxy – is the computation.) The protocol sets the total stock of bitcoins, without predetermining their distribution (between agents) or price (when denominated in any other financial medium). The value of the currency cannot be derived from the rules determining its quantity. It is synthetic. Bitcoin’s productivity lies in what it leaves open, even as its integrity is secured by what it closes.
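The fixed quantity rule itself is a short arithmetic exercise: a geometric series of block rewards, truncated at the satoshi. The sketch below assumes only the widely documented schedule parameters (50 BTC initial subsidy, halving every 210,000 blocks, integer satoshi accounting); it says nothing, by design, about distribution or price, which are settled outside the protocol.

```python
def total_supply_btc():
    """Sum the reward schedule: 50 BTC per block, halving every
    210,000 blocks, sub-satoshi remainders truncated by integer division."""
    reward_sat = 50 * 100_000_000  # initial block reward, in satoshis
    total_sat = 0
    while reward_sat > 0:
        total_sat += 210_000 * reward_sat
        reward_sat //= 2  # the halving
    return total_sat / 100_000_000

# Converges just below the famous 21 million cap (~20,999,999.98 BTC).
print(total_supply_btc())
```

The quantity of money is thus fully determined in advance; its value, as the surrounding passage insists, is not derivable from it.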

[4] Self-binding is a classical problem, epitomized by the strategy adopted by Odysseus in his passage past the Sirens. Anticipating an irresistible seduction, he commits to a decision which he then – by crude socio-technical means – renders irreversible. Within game theory, the same problem is a central preoccupation. It is admirably summarized by Scott Alexander: “… it sounds weird to insist on a right to waive your rights. Isn’t that more of an anti-right, so to speak? But … read your Schelling. In multiplayer games, the ability to limit your options can provide a decisive advantage. If you’re playing Chicken, the winning strategy is to conspicuously break your steering wheel so your opponent knows you can’t turn even if you want to. If you’re playing global thermonuclear war, the winning strategy is to conspicuously remove your ability not to retaliate, using something like the Dead Hand system. Waiving your right to steer, waiving your right not to nuke, these are winning strategies; whoever can’t do them has been artificially handicapped.”
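Schelling’s point about Chicken can be made computationally. In the sketch below the payoff numbers are illustrative assumptions, not drawn from the source; only their ordering matters: mutual ‘straight’ is catastrophic for both, so a player visibly stripped of the option to swerve forces a rational rival to yield.

```python
# Hypothetical payoffs for Chicken: (our payoff, rival's payoff).
CHICKEN = {
    ("swerve", "swerve"): (0, 0),
    ("swerve", "straight"): (-1, 1),
    ("straight", "swerve"): (1, -1),
    ("straight", "straight"): (-10, -10),  # mutual disaster
}

def best_response(rival_options, our_move):
    """The rival's payoff-maximizing reply to a move we are visibly locked into."""
    return max(rival_options, key=lambda r: CHICKEN[(our_move, r)][1])

# Steering intact: if we are known to swerve, the rival exploits us.
assert best_response(["swerve", "straight"], "swerve") == "straight"

# Wheel conspicuously broken: locked into "straight", we force the rival to yield.
assert best_response(["swerve", "straight"], "straight") == "swerve"
```

The commitment device wins precisely by destroying an option, which is the apparent paradox – a right to waive rights – that the note addresses.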

[5] The quote is extracted from this video record:

Crypto-Current (040)

§4.1 — Explicitly acknowledged ‘network problems’ long pre-exist the electronic era, by centuries, if not millennia. They constitute a guiding thread within what has been called ‘the tradition of spontaneous order’.[1] Order is spontaneous if it solves a coordination problem without appeal to any organizational element at a higher – or super-social – level. Spontaneous order, social self-organization, or immanent social process, is thus counter-posed to transcendent social design (from above, or beyond), and to its corresponding theoretical justifications, which amount to a social metaphysics, typically serving concrete functions as guiding ideologies of super-ordinate control. As long recognized by its opponents, the assertive recognition of this theoretical conundrum cannot be practically dissociated from an implicit political stance.[2]

§4.11 — In the absence of superior guidance, solutions to coordination problems have to emerge, out of – and as – games. This is only to say that cooperation between the agents involved cannot be presupposed, but has to arise from their interaction, if it is – indeed – to appear at all. To assume altruism or solidarity in such cases is a failure to think the problem at all. Coordination is the explanandum. The collectivist dogma is not an answer, therefore, but an alternative to the question. An answer is a trust machine, for which Bitcoin is the model. It is strictly complementary to a minimal presupposition, that of trustlessness (for which it is the solution).

§4.12 — Byzantine generalization extends to the very limit of network communication problems, to the difficulties of establishing coordination within radically dispersed (and thus zero-trust) multiplicities, encompassing non-idealized societies / populations of every variety. It is not, of course, that the concrete existence of trust is simply denied, only that it is rigorously thematized, as a social product requiring explanation – and what counts as an ‘explanation’ cannot, whether overtly or covertly, merely presuppose what it is called upon to explain. (This demand, as we have seen, is already totally – even ultimately – controversial.) If there is trust, there has to be a trust engine, conceived without pre-existent bonds of trust as a part. At the most abstract level, therefore, this is a topic that would have been familiar to the thinkers of the Scottish Enlightenment, as to all those participating productively within the theoretical tradition of spontaneous order. It is exactly this same problem of decentralized coordination (in the absence of any transcendent contribution provided by assumed altruism or common purpose) that has been the essential guideline for realistic social analysis within the ‘whig’ lineage of descriptive liberalism, exemplified most famously by Adam Smith’s figure of the ‘invisible hand’.[3]

§4.13 — Evolutionary biology, as a science of emergent networks, has engaged very similar problems from its inception, often with identical tools. This is especially evident in the modeling of ecological equilibrium within large, long-term biological system dynamics, in which the absorption of extinctions defines the ‘true network’ (by the absence of indispensable nodes). A more recent attempt to formalize such coordination is found in the game theory of John von Neumann, which has itself been effectively applied to biological networks at a variety of scales.[4] The latest – and still rapidly self-transforming – incarnation of this tradition can be seen in the science of ‘complex adaptive systems’ as exemplified by the research programs of the Santa Fe Institute.[5] In each case, the defining theoretical object is emergent coordination, in which no appeal to any centralized, superordinate, or orchestrating principle is made, unless this can be identified with the system itself, as such. The target of such researches is transcendental order, as captured by the immanent rules of distribution (which are ultimately indistinguishable from the real correlate of mathematics).

§4.14 — Games, strictly understood, therefore, arise under the minimalistic assumptions tolerable to an analytical anarchism,[6] that is: after the methodical subtraction of all presumed coordination, or the conversion of such presuppositions into formal theoretical problems. The implicit critical impulse driving the construction of such research programs is evident. That which might have been asserted, as a transcendent principle, or metaphysical dogma, is to be instead explained, as an immanent, emergent, or ‘evolutionary’ outcome. Whether explicitly understood in such terms, or not, every such enterprise is a regional application of critical philosophy. Spontaneous order is the correlate of critique. The solution, however, cannot correspond to a philosophical thesis (of any traditional type), or even to a ‘machinic proposition’ – an engineering diagram, simulation, or protocol – but can only emerge as the synthetic product of such a proposition, when executed. The notion that coordination problems (of significant complexity) can be anticipated in detail by a process of pure ratiocination is a philosophical disease, of recognizably pre-critical type. The idea of the diagonal is not the diagonal.

§4.15 — If the discovery of spontaneous order as a problem corresponds to the execution of critique, it can be formalized through a diagonalization matrix. When the pre-critical opposition of centralized coordination to uncoordinated dispersion is tabulated, it draws a graph isomorphic with the Kantian schema. At the level of the abstract formalism, this latter is echoed with such extreme fidelity that we can borrow from it freely, while switching its terms through a simple hash. Substitute centralization for analysis, decentralization for synthesis, order (coordination) for the a priori, and disorder for the a posteriori. As with the Kantian original, the first diagonal term fails – there is no centralized disorder, any more than there are analytic a posteriori judgments. Centralization is an intrinsically anomalous distribution, necessarily threatened by the prospect of a fall into disorder. Its complementary conception, ‘simple anarchy’, is no less invulnerable to theoretical dismissal. The previously acknowledged terms, centralized order, and decentralized disorder (like the analytic a priori, and synthetic a posteriori) are therefore preserved. Common sense is not abolished, at least, not initially. In the exact formal place of Kant’s invention/discovery of the synthetic a priori, the critique of coordination, too, generates a viable diagonal product – decentralized order. (Quod erat demonstrandum). This is the Great Oblique worked by all realistic social theory since the inception of the modern epoch.
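The ‘simple hash’ proposed here can be tabulated mechanically. The cell-by-cell verdicts below merely restate the paragraph; the label wording is illustrative, not the text’s own:

```python
# Sketch: the substitution switching Kantian terms for coordination terms,
# with the viability verdict each cell receives in the paragraph above.
swap = {
    "analysis": "centralization",
    "synthesis": "decentralization",
    "a priori": "order",
    "a posteriori": "disorder",
}

kantian_verdicts = {
    ("analysis", "a priori"): "viable (common sense)",
    ("analysis", "a posteriori"): "fails",
    ("synthesis", "a posteriori"): "viable (common sense)",
    ("synthesis", "a priori"): "viable (the critical diagonal)",
}

for (mode, status), verdict in kantian_verdicts.items():
    print(f"{swap[mode]} + {swap[status]}: {verdict}")
# e.g. 'decentralization + order: viable (the critical diagonal)'
```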

§4.16 — In Cyberspace, the tradition of spontaneous order has been massively accelerated. Classical coordination problems have been reformulated as experimental fields, opened for exploration by the emergence of increasingly-powerful computer simulation tools, and consolidated as practical solutions through the implementation of cryptographic systems and P2P platforms. Philosophical reflection has, to a very considerable extent, been side-lined by technical applications reinforced by far superior criteria of evidence, which is to say: practical demonstration. In the age of electronic information technology, networks can be tested as products, and it is possible to ask of a complex idealization, as never before: does it work? This transition, through computer simulation, from explanation to implementation, registers a process of technological envelopment without definite limits. It corresponds to an interminable tide of disintermediation that established institutions of intellectual authority have not yet begun to fear enough.

§4.17 — While institutionalized philosophy has tended to lag the network trend, rather than pioneer it, something that might reasonably be called ‘abstract network theory’ has nevertheless arisen to become a guiding philosophical theme since the mid-20th century. Among those modes of philosophical writing that have been most influential upon the wider humanities and social sciences, this attention to the distinctive characteristics of networks has been especially notable.[7] Appropriately enough, this ‘discourse’ – or dispersed conversation – has no uncontroversial center of authority, stable common terminology, shared principles, or consistent ideology. Its rhetoric tends to systematically confuse advocacy with criticism, and both with analysis, as it redistributes partial insight around a number of relatively tightly-interlocking social circuits (which are overwhelmingly dominated by a familiar set of theoretically-superfluous moral-political axioms).[8] Yet, however deeply regrettable this concrete situation might be considered from the perspective of austere theory, it cannot be simply wished away. Intellectual production itself occurs within networks, and those with greatest promotional capability are among those least optimized for pure intellection.

§4.18 — The cultural systems in which the philosophical (and sub-philosophical) formalization of radically decentralized – or ‘true’ – networks has emerged, through a multi-decade self-reflexive process, are eccentrically articulated, very partially self-aware, and only weakly integrated. Yet even in these noisy and deeply compromised circles, cross-cut by vociferous extraneous agendas, and subjected only very weakly to a hard reality criterion, convergence upon the rigorous conception of a model network has been inexorable. An especially influential example is the rhizome of Gilles Deleuze and Félix Guattari, which provides philosophy with its most rigorously-systematized account of acentric and anorganic order.[9]

§4.19 — A network, in this sense, has no indispensable nodes, entitling it to the adjective ‘robust’. Once again, the Internet – at least in its idealized conception – exemplifies such a system. It is typical, at this point, to recall the origins of this ‘network of networks’ in a military communications project (ARPANET), guided by the imperative of ‘survivability’ realized through radical decentralization.[10] As will be repeatedly noted, on the crypto-current the security criterion is primary, providing system dispersion with its principle. This suggests, as an ideological generalization, that there is no basic ‘trade-off’ between liberty and security. Rather, it is through security (alone) that liberty establishes its reality. The term “crypto-anarchy” condenses this conclusion into a program.
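The defining property – no indispensable nodes – admits a mechanical check: a network is robust in this sense if it remains connected after the removal of any single node. A minimal sketch over adjacency sets, with illustrative topologies:

```python
# A network has 'no indispensable nodes' if deleting any one node
# leaves the remainder connected. Minimal check over an adjacency dict.
def connected(adj, nodes):
    if not nodes:
        return True
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        n = stack.pop()
        if n in seen:
            continue
        seen.add(n)
        stack.extend(m for m in adj[n] if m in nodes)
    return seen == nodes

def robust(adj):
    return all(connected(adj, set(adj) - {n}) for n in adj)

ring = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {2, 0}}   # cycle: robust
chain = {0: {1}, 1: {0, 2}, 2: {1}}                    # line: node 1 is indispensable
print(robust(ring), robust(chain))  # True False
```

A hub-and-spoke topology fails this test at its hub, which is exactly the ‘survivability’ motive for radical decentralization noted above.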

§4.191 — Such invocations of the strategic investment in distribution and redundancy, however predictable, remain conceptually productive. They are especially valuable as a corrective to modes of discourse – typical among contemporary humanistic studies – which tend to haze the harsh selective or eliminative function of critique into a vapid metaphor. It is the military ancestry of the Internet that is tacitly referenced in the celebrated maxim of John Gilmore (1993): “The Net interprets censorship as damage and routes around it.” In order to save command-control, it was necessary to fundamentally subvert it.

[1] Norman Barry’s ‘The Tradition of Spontaneous Order’ (2002) lays out the intellectual history of the idea, within a primarily legal and economic context:

When abstracted beyond its socio-historical exemplification, and apprehended as a cosmic-philosophical principle, spontaneous order refers to immanent coordination within dispersed collectivities, making it approximately synonymous with ‘the science of multiplicities’ in general. It designates the systematicity of the system, which can be alternatively formulated as an irreducibility without transcendence, or an immanent non-locality. Whether philosophical, or colloquial, the association of spontaneity with notions of vitality is ultimately arbitrary. Spontaneity is not life, unless ‘life’ is conceived as the irreducible operation of the multiple as such.

[2] Public Choice Theory serves as a reliable proxy for minimalistic, game-theoretical axioms in their directly political application. In her book All You Can Eat: Greed, Lust and the New Capitalism (2001), Linda McQuaig cites a short fable by Amartya Sen, designed to satirize the Public Choice approach: “‘Can you direct me to the railway station?’ asks the stranger. ‘Certainly,’ says the local, pointing in the opposite direction, towards the post office, ‘and would you post this letter for me on your way?’ ‘Certainly,’ says the stranger, resolving to open it to see if it contains anything worth stealing.” Notably, in contenting itself with a satirization of disciplined moral skepticism, this line of criticism is brazenly unembarrassed about its dogmatic structure. Its rhetorical brilliance diverts from its substantive socio-historical meaning, which is the acknowledgment that solidarity is premised on the absence of a question. In other words, it demands, finally, the docile acceptance of an inarticulate presupposition. In this demand is to be found its irreparable weakness, shared by all cultural commitments – typically religious in nature – that are grievously wounded by the very fact of coming into question. A comparable defense of transcendent altruism founded upon the pre-delimitation of critique will be seen in David Graeber’s ‘everyday communism’. There, too, fatal injury coincides with the beginning of skeptical thought. Since, in this framework of collectivist moral norms, the investigation is the real crime, an intrinsic inclination to totalitarianism proves difficult to control.

There is one, and only one, coherent rejoinder available to the Left on this point. The table has to be reversed, in order for autonomous individuated agency to occupy the role of the explanandum, with its production and reproduction within the collectivity determined as the critical problem. Naturally, the production of individuation within complex adaptive systems would not satisfy this demand, since the presupposition of any such analysis is uncoordinated multiplicity. The Left is compelled to maintain that war is not ‘the father of all’. It is darkly amusing, then, that we continue to argue about it.

[3] In Smith’s most widely-cited words, taken from the Wealth of Nations, the repudiation of assumed altruism is explicit: “It is not from the benevolence of the butcher, the brewer, or the baker that we expect our dinner, but from their regard to their own self-interest. We address ourselves not to their humanity but to their self-love, and never talk to them of our own necessities, but of their advantages.” It is only through this theoretical self-restraint that the emergence of social cooperation is critically explained, rather than dogmatically asserted. It is important to note, however, that the recognition of the problem is not in itself a solution. The solution is not a philosophical reservation, but a machine. In Smith’s case, of course, it is the market mechanism, understood as a catallactic generator, or engine of spontaneous order, that represents the general (abstract) form of social solutions to novel coordination problems, in a way that remains mainstream within economics up to the present day. When grasped within this framework, it is at least tempting – if perhaps not strictly compelling – to suggest that any purported successor to the market, capable of usurping the role of the market, could only itself be a new iteration of the market. The most prevalent contemporary radical (left) alternative requires that coordination problems, rather than being solved, are instead terminally dispelled. Among those unfashionable vestiges of the Marxist tradition which emphasize the inheritance of capitalistic management practices by post-capitalistic social regimes (as exemplified by Lenin’s Taylorism), there is at least the promise of collective action solutions taking the form of legacy cultural resources, although without solid prospect of further innovation in a post-competitive milieu.

[4] The formal application of game theory to evolution dates back to the early 1970s, pioneered by John Maynard Smith and George R. Price, through the theorization of evolutionary strategies (see ‘The Logic of Animal Conflict’, 1973). Since the notion of ‘strategy’ has an ineliminable telic inflection, this mode of evolutionary formalization can be philosophically-conceived as a diagonalization of biology. A video interview with Maynard Smith on the topic can be found here:

The preparatory theoretical work of W. D. Hamilton is especially notable in this regard, since it is explicitly focused on the application of Darwinian intellectual tools to subtract the metaphysical pre-supposition of altruism. The rhetorical provocation of selfishness, as seen most prominently in the Hamilton-based neo-Darwinian synthesis of Richard Dawkins, testifies to this project. The ‘selfish gene’ poses altruism as a problem, which Hamilton had already productively addressed. This is how transcendental critique reaches social biology, whose compressed name – socio-biology – was, from its inception, a political scandal. Collectivist metaphysics was not slow to recognize the profound intellectual menace – at once ancient and scientifically-renovated – that it faced.
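Maynard Smith’s central construction can be illustrated by the Hawk–Dove game: when the contested resource V is worth less than the cost of injury C, the evolutionarily stable state mixes strategies, with Hawks at frequency V/C. A sketch, with illustrative payoff values, finding the stable point by iterating a discrete replicator dynamic rather than solving analytically:

```python
# Hawk-Dove: for V < C the stable Hawk frequency is V/C.
V, C = 4.0, 10.0   # illustrative: resource value V, injury cost C

def payoffs(p):
    # expected payoff to Hawk and to Dove when Hawks have frequency p
    hawk = p * (V - C) / 2 + (1 - p) * V
    dove = (1 - p) * V / 2
    return hawk, dove

p = 0.5
for _ in range(10_000):
    h, d = payoffs(p)
    mean = p * h + (1 - p) * d
    p = p * h / mean if mean else p   # replicator update
print(round(p, 3))  # converges to V/C = 0.4
```

Neither pure strategy is stable on its own; the mixed frequency emerges from the dynamic – altruism is not presupposed, and cooperation-like restraint appears as an outcome.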

[5] The Santa Fe Institute gathers cross-disciplinary researchers for the study of far-from-equilibrium systems and their emergent behavior, characterized by path dependency, sub-optimality, and massive informational imperfections. Of special relevance to the discussion here has been the pioneering of ‘Complexity Economics’, associated in particular with the work of Brian Arthur, which provides a disciplined critique of equilibrium models in neoclassical economics. Markets approach equilibrium only in the way that all working machines approach an entropy maximum, without this limit being ever – or even approximately – actualized. See the institute’s website at

The concrete socio-historical application of such thinking is found in Manuel De Landa’s (1997) A Thousand Years of Nonlinear History (see bibliography).

[6] Between ‘descriptive liberalism’ and ‘analytical anarchism’ – despite the apparent terminological escalation – there is no distinction of serious theoretical significance. ‘Order out of chaos’ (or at ‘the edge of chaos’) is the consistent explanatory horizon.

[7] The most influential proximal ancestor of current network theory (by far) has been connectionism. This intellectual movement, characterized by an extreme interdisciplinarity, introduced the problems of electronic engineering and distributed information systems into the study of biological, psychological, and social phenomena. Its name was coined by neural net-theorist Donald Hebb, in the 1940s. Rising to prominence from the early 1980s, connectionism has generalized the study of neural networks (or parallel distributed processing) in suggestive coincidence with the rise of the Internet. Its research orientation has been guided by the proposition that complex adaptive behavior is better explained by the pattern of connections between simple units than by complex properties inherent within concentrated entities of any sort. At a larger scale, connectionism updates the diffuse multi-century current of empirical associationism, accommodating it to the prospects of technological actualization within electronic networks. It is thus marked by a comparative disregard for elaborate symbolic structures, and for highly-organized internal states more generally. The atomic elements of connectionist analysis are linkage modules, supporting emergent systemic behavior. Like cybernetics before it, connectionism outgrew its own disciplinary contour, and dissolved into the sophisticated common sense of the world. (Generalized dominion can be difficult to distinguish from disappearance.) Subsequent attempts to specify a definite ‘theory of networks’ have been programmatically vaguer, resorting typically to some notional gestures of obeisance in the direction of mathematical graph theory. Much of this work has been seduced by literary temptations, bringing its enduring theoretical productivity into serious question. 
Most damagingly, in respect to its own capacity for conceptual genesis, the primary analytical discipline of connectionism – programmatic dismantling of mysterious essences into distributions – has been increasingly neglected, and replaced by an arcane cult of whole ‘objects’ withdrawn from the menace of disintegration.

[8] Axioms are independent, formally-articulated assumptions. Since each axiom is a basic presupposition which cannot (therefore) either be derived from, or support, any other, a set of axioms is an irreducible multiplicity. Consequently, an axiomatic has a characteristic mode of variation, based upon composition from wholly independent parts, which can be added or subtracted with comparative freedom. Because axioms are bedrock elements, their selection demands an irreducible experimentalism. (As a matter of logical necessity, the systems they compose can never determine the characteristics of a missing axiom.) Within a social context, the pursuit of minimal axiomatic systems is ideologically charged, since it corresponds to a contraction of public purposes. The historical disintensification of capitalism has proceeded, as Deleuze & Guattari note, by way of an axiomatic proliferation. Addition of axioms is the way capital has been socially compromised. From a Francophone perspective, this tendency appears as a resilient teleological structure. When the same prediction is extended to Anglophone cultures, in which the recession of classical liberalism remains far more seriously contested, less confident conclusions are advisable.  

[9] Rigorous transcendental formulation of the model network – with all the conceptual ironies and traps this involves – finds its high-water mark in Deleuze & Guattari’s essay ‘Rhizome’ (which introduces the second volume of Capitalism & Schizophrenia, A Thousand Plateaus). A ‘rhizome’ acquires facile (contrastive) definition through its distinction from the ‘arborescent’ schema of hierarchical sub-division. It proposes an absolute horizontality, approached through dimensional collapse, which opposes the rhizome to the tree, reticulation to hierarchical structure, and contagion to heredity. “In truth, it is not enough to say, ‘Long live the multiple’, difficult as it is to raise that cry. The multiple must be made, not by always adding a higher dimension, but rather in the simplest ways, by dint of sobriety, with the number of dimensions one already has available – always n – 1 (the only way the one belongs to the multiple: always subtracted). Subtract the unique from the multiplicity to be constituted; write at n – 1 dimensions. A system of this kind could be called a rhizome” (ATP 6). When analytically decomposed, the rhizome exhibits a number of consistent, distinctive features. “1 and 2. Principles of connection and heterogeneity: any point of a rhizome can be connected to anything other, and must be. … 3. Principle of multiplicity: it is only when the multiple is effectively treated as a substantive, ‘multiplicity’, that it ceases to have any relation to the One as subject or object, natural or spiritual reality, image and world. Multiplicities are rhizomatic, and expose arborescent pseudomultiplicities for what they are. … 4. Principle of asignifying rupture: against the oversignifying breaks separating structures or cutting across a single structure. A rhizome may be broken, shattered at a given spot, but it will start up again on one of its old lines, or on new lines.
You can never get rid of ants because they form an animal rhizome that can rebound time and again after most of it has been destroyed. … 5 and 6. Principle of cartography and decalcomania: a rhizome is not amenable to any structural or generative model. … [The map] is itself a part of the rhizome. …” (ATP 7-13). Manuel DeLanda proposes the term ‘meshworks’ for such systems of flat, heterogeneous, interconnectivity, which he opposes to (comparatively rigid and homogeneous) ‘hierarchies’.

[10] A network, in what is by now the overwhelmingly dominant sense of this term, is by definition robust. It is constructed in such a way as to tolerate massive disruption. Thus the perennial relevance of the military roots of the Internet. In its practical application, communications resilience has been inseparable from resistance to censorship. Like the basic Internet protocols, Tor (‘the onion router’) is descended from a military research program, initiated in the mid-1990s by the United States Naval Research Laboratory and subsequently pursued by DARPA. The regularity with which critical elements of state security apparatus are deflected into crypto-anarchist streams, and inversely, suggests a principle of strategic reversibility, in conformity (on one side) with the ‘poacher-turned-gamekeeper’ phenomenon. Given the prominence of treachery within all rigorously-constructed games, it should not be surprising to encounter this extreme fluidity of partisan identities, in which allies and enemies switch positions. The figure of ‘the hacker’ is notably liminal in this regard, representing – simultaneously – an ultramodern incarnation of the outlaw, and a skill-set indispensable to effective formations of contemporary power. At its highest levels, strategy is a matter of ‘turning’. This is the diplomacy intrinsic to strategy (and thus to war), once the latter is liberated from the strait-jacket of its political – or Clausewitzean – determination, and released into a conceptual space bounded only by a transcendental horizon. Friend and foe are defined within the great game, rather than outside, by its transcendent (political) frame. There can be no assumed parties at the transcendental level of the game, where not only every alignment, but any constitution of agency is itself a ‘move’. Agencies are pieces, consequent to strategic decisions.
However tempting it might be to dismiss such considerations as arcane, they are already being condensed by commercial crypto-systems as practical problems. In regard to their legal form, modern business corporations have been synthetic agencies for well over a century. In the emerging epoch of Decentralized Autonomous Organizations (‘DAOs’), whose economic and military implications are yet scarcely glimpsed, this status advances from a matter of legal definition to one of software implementation. It consummates the historical triumph of code. At the end, as – invisibly – from the beginning, the transcendental subject is fully immanent to a plane of strategic production.

Crypto-Current (039)


State of Play

§4 — Humans are neither tigers, nor bees. Regardless of ethnic variation, or ideological faction, they are neither solitary, nor collective (eusocial[1]), but social, and societies are essentially middling, or ambiguous. The concepts of the social and the individual, or the public and the private, are reciprocal, and mutually compromised. Social beings are necessarily (always, but only) partially coordinated, through transactional bonds. They have neither group mentality nor perfect autonomy. The way they get along together is an ineluctable and perennial problem, resolved through precarious, transient, meta-stable solutions. All promises of definitive fusion or fission, perfected solidarity or independence, are strictly utopian. The perpetual tension of dynamic social arrangements is an unsurpassable human reality.[2] It occupies the zone of coordination.

§4.01 — The ineradicable ambivalence of the social animal is captured by the theory of games. A tiger does not play games with a prey animal, any more than bees play games with each other. A game is a transactional integration, at once too intimate for a non-social animal, and too fractured for a consolidated collective.

§4.02 — Games, in the game-theoretical sense of the term – the one relevant here – are always played in the wild. That is to say, they cannot be exited by cheating. If knocking over the table is a move that can be made (in reality) and doing so ends the game, it wasn’t a game of any seriousness to begin with. In any game that matters, cheating is a permitted move, as soon as it is possible at all. It might be said, more precisely, that any game which effectively prevents cheating is embedded within a greater game where such prevention is actualized, as an outcome. Any regulated game is carved out of the wild, and it is the outer game – that carves – which game theory attends to. In this lies its realism, distinguishing its objects from circumscribed, ludic amusements. Games merit social attention precisely because they contain cheating as an integral option. Trust has to be internally processed, not extraneously presumed. There are no external referees.[3]

§4.03 — Due to its extreme elegance, and consequent generality of application, Prisoners’ Dilemma (PD) has come to achieve broad acceptance as the archetypal game. The scenario is elementary, by design. Two prisoners are held in noncommunicating cells. Each has the same, binary (or ‘Boolean’) strategic decision to make – to betray the other, or not. The entire game can therefore be represented by a 2×2 matrix. Finally, each space (or outcome) contains two numbers, representing the payoff to the players. In PD this aspect is perfectly symmetrical – the situation of each player exactly mirrors that of the other. Every payoff is a weighted negative utility – dramatized as a prospective period of jail time. All of the information on the outcome grid (or payoff table) is objective. It represents the dilemma facing each player as both, equally, would acknowledge it, without controversy, or perspectival inflection. In principle, it is accessible to both prisoners, and guides their choice of ‘move’.
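The 2×2 structure can be made explicit. The jail terms below are illustrative (they are not fixed by the scenario); the point they encode is only that, whatever the other prisoner does, betrayal is the strictly dominant move:

```python
# Prisoners' Dilemma payoff table: years in jail as negative utility,
# indexed by (my_move, their_move). Values are illustrative.
C, D = "stay silent", "betray"
jail = {
    (C, C): -1,   # both silent: light sentence each
    (C, D): -10,  # I stay silent, the other betrays: I take the full term
    (D, C): 0,    # I betray a silent partner: I walk free
    (D, D): -6,   # mutual betrayal: heavy sentence each
}

# Betrayal strictly dominates: better for me whichever move the other makes.
assert jail[(D, C)] > jail[(C, C)] and jail[(D, D)] > jail[(C, D)]
print("defection dominates")
```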

§4.04 — PD has no well-coordinated solution, unless the game is multiplied – to become iterated,[4] and mnemonic. This is because there is no strictly rational alternative to defection (betrayal) in the absence of additional information, such as the kind that would be provided by the persistence of reputational positions through multiple cycles. In this respect, PD models coordination problems of the tragedy of the commons type, in which the optimization of collective interest is practically unobtainable.[5] ‘Free-rider’ problems are sub-components of the same dilemma, which indicates that it is generalizable to parasitic relations of all kinds.[6] Within all of these cases, rational individual decisions aggregate to a collective failure, expressed as systemic collapse in extreme cases, or – more typically – as a deadweight (negative sum) loss to the population as a whole.
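The effect of iteration and memory can be sketched directly. Against a reciprocating opponent – tit-for-tat, used here as the standard illustration, with the conventional payoff values (R=3, T=5, S=0, P=1) – persistent defection scores worse than cooperation over repeated rounds, which is precisely the reputational information a single anonymous round withholds:

```python
# Iterated PD: with memory of the previous round, reciprocity ('tit for
# tat') punishes defection, so cooperation can outscore it.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def play(strategy_a, strategy_b, rounds=100):
    score_a = score_b = 0
    last_a = last_b = "C"           # both open cooperatively
    for _ in range(rounds):
        a, b = strategy_a(last_b), strategy_b(last_a)
        pa, pb = PAYOFF[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        last_a, last_b = a, b
    return score_a, score_b

tit_for_tat = lambda opp_last: opp_last   # copy the opponent's last move
defector = lambda opp_last: "D"
cooperator = lambda opp_last: "C"

print(play(cooperator, tit_for_tat))  # (300, 300)
print(play(defector, tit_for_tat))    # (104, 99)
```

The defector wins the first round and pays for it in all the rest: persistence of position through cycles converts betrayal from dominant strategy into losing strategy.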

§4.05 — It bears repeating – or reiterating – because it cannot be easily over-emphasized, that Prisoners’ Dilemma has extraordinarily general application to coordination problems. It would, indeed, be quite reasonable to characterize it as the model trust crisis. When concentrated into an atom, the pure element of the game is a double chance of treachery – subjective and objective – arising from the ineliminable hazard, on both sides, of betrayal. What Bitcoin acknowledges, from the beginning, is that to escape the prison-house of distrust is no easy thing, once mere moral exhortation in the direction of altruism is theoretically shelved. The recognition of this problem as a problem is socio-political realism itself. It is at this fork in the road that almost everything is decided.

§4.06 — PD is a close analog of a number of other game theoretical dilemmas, of which the best known is ‘Chicken’ – itself based upon an abstract model of bipolar geopolitical conflict in the context of nuclear deterrence. Chicken has several variants, distinguished primarily by dramatization. In one, competitors wrestle at the edge of a cliff, and double ‘defection’ pushes both over the edge. Another version of Chicken sets two drivers accelerating towards each other in automobiles. The contestant who swerves, loses. If neither swerves, a common calamity results (equivalent to the double defection – or collective pessimal – equilibrium in PD). An important difference between classic PD and Chicken, however, is that in the latter scenario(s) the contestants are not held to be strictly non-communicating. While the final decision of each antagonist remains a black box to the other, thus preserving the core of the game-theoretical dilemma, preliminary expressions of commitment are permissible.[7] Chicken thus permits strategies that involve signaling.

§4.07 — The DSP tells us that signs are cheap. Communication of commitment, therefore, is no trivial matter. Semantically and syntactically flawless statements of exceptional rhetorical quality still commonly – and even typically – mean nothing.[8] To repeat the essential, in the ways that matter most they are easy to say (and their repetition is cheaper still). Unless a cost is credibly attached to them, their flourishes make no additional contribution. The problem of credible commitment, as it arises within the theory of games, thus closely tracks that of the contract in crypto-economics. In both cases the strength (or value) of the signal is directly proportional to a conspicuous contraction of discretionary power, corresponding to an irreversible operation. Only when it is impossible – or at least infeasible – to back down, recant, or renege, does a signal acquire game-theoretical significance. Burning bridges behind oneself signals something that no rhetorical flight is able to match. Even the importance of precedent – or reputation – in iterated PD is based on the status of the past as an irrevocable commitment. If what had been done could be taken back, like a fumbled move in a friendly game of chess, it would count for nothing. The irrevocable consumption of freedom provides the content for strategic signs.

§4.08 — Bitcoin is a game, in the strong or technical sense, because it does not control cheating through a transcendent rule (upheld by a “trusted third party”), but rather through an immanent principle (Nakamoto Consensus). Its immediate ancestry, within the game-theoretic lineage, descends from the formulation of The Byzantine Generals’ Problem, dating back to the late 1970s.[9] As Lamport, Shostak, and Pease explain the problem (with line breaks preserved from the original):

We imagine that several divisions of the Byzantine army are camped outside an enemy city, each division commanded by its own general. The generals can communicate with one another only by messenger. After observing the enemy, they must decide upon a common plan of action. However, some of the generals may be traitors, trying to prevent the loyal generals from reaching agreement.

The generals must have an algorithm to guarantee that

A. All loyal generals decide upon the same plan of action.

The loyal generals will all do what the algorithm says they should, but the traitors may do anything they wish. The algorithm must guarantee condition A regardless of what the traitors do.

The loyal generals should not only reach agreement, but should agree upon a reasonable plan. We therefore also want to insure that

B. A small number of traitors cannot cause the loyal generals to adopt a bad plan.

§4.09 — ‘Byzantine failures’ arise when parties distributed within a communication network, containing unreliable nodes, are obstructed from reaching agreement, because they cannot confidently establish among themselves what has in fact been communicated, or from which agents messages have been received. A global perspective appears unobtainable, and local perspective is vulnerable to compromise. The extreme difficulty involved in Byzantine communications makes them a model coordination problem, of special relevance to Internet-connected agencies. Crucially, for our purposes here, and beyond, the problem follows upon an assertion of immanence (a critique), since it is defined primarily by the absence of a transcendent tribunal with global insight. None of the ‘generals’ are able to stand outside the system, call upon an authoritative criterion from beyond it, or even direct their communications around it. Their relation to each other is technically flat (or peer-to-peer). Any solution has to be drawn from out of the system itself – which is to say, from the self-organizational resources inherent to sheer multiplicity.

§4.091 — Nakamoto Consensus, in game-theoretic context, is the name for a solution to the Byzantine Generals’ Problem, based upon proof-of-work. By including proof-of-work within each message (hashed block), the generals are able to make the measure of agreement reached – i.e. computational power committed – into an intrinsic property of their communications. Agreement about the message is folded into the message. As blocks are chained, securely, in strict succession, the signal of consensus strengthens. In meeting a reiterated proof-of-work criterion, the blockchain accumulates immanent credibility. It replaces an extrinsic – and intractable – question about the reliability of communications with an intrinsic communication of reliability. Trust is made into the message.
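
The mechanism can be illustrated with a deliberately toy proof-of-work chain in Python (the block format, difficulty, and messages are all invented for illustration; actual Bitcoin mining differs in every practical detail):

```python
import hashlib

DIFFICULTY = 4  # toy target: hash must begin with this many zero hex digits

def mine(prev_hash: str, message: str) -> dict:
    """Search for a nonce whose inclusion gives the block's hash the
    required leading zeros. The expended search effort is the 'proof of
    work' folded into the message itself."""
    nonce = 0
    while True:
        payload = f"{prev_hash}|{message}|{nonce}".encode()
        digest = hashlib.sha256(payload).hexdigest()
        if digest.startswith("0" * DIFFICULTY):
            return {"prev": prev_hash, "msg": message,
                    "nonce": nonce, "hash": digest}
        nonce += 1

def valid_chain(chain) -> bool:
    """A block is credible only if its hash meets the target AND commits
    to its predecessor; rewriting any past block means redoing all the
    work that follows it."""
    prev = "0" * 64  # genesis predecessor
    for block in chain:
        if block["prev"] != prev:
            return False
        payload = f"{block['prev']}|{block['msg']}|{block['nonce']}".encode()
        if hashlib.sha256(payload).hexdigest() != block["hash"]:
            return False
        if not block["hash"].startswith("0" * DIFFICULTY):
            return False
        prev = block["hash"]
    return True

chain = []
prev = "0" * 64
for msg in ["attack at dawn", "confirm dawn", "reconfirm dawn"]:
    block = mine(prev, msg)
    chain.append(block)
    prev = block["hash"]

assert valid_chain(chain)
chain[0]["msg"] = "attack at dusk"   # a traitor rewrites history...
assert not valid_chain(chain)        # ...and the accumulated work exposes it
```

Agreement is not asserted alongside the messages but demonstrated within them: the longest valid chain simply is the measure of computational power committed to it.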

[1] Within the terrestrial biosphere eusociality is most vividly represented by the Hymenoptera (ants, bees, wasps) and by termites. Unsurprisingly, therefore, the concept has been advanced primarily by entomologists, with Suzanne Batra and E.O. Wilson especially significant, in both cases on the basis of insect models. Nevertheless, truly communistic mammals do exist, instantiated by two species of mole-rat. As with eusocial insects, mole-rat societies are rigidly segmented between fertile and infertile castes (a biological precondition for equilibrium communistic organization). Eusocial species incarnate the solution to a coordination problem. The games involved (searches for evolutionarily stable strategies) have been resolved at the genetic level. That the pseudo-individuals within insect hives or colonies do not engage in competitive games with each other is precisely what eusociality means. Within (merely) social species, in contrast, genetics is under-determining, setting only general parameters for intra-social cooperative and competitive behavior. The execution of games is delegated to phenotypic performance, without access to any collective optimum state, or established strategic equilibrium. Such animals thus inherit the plasticity implied by ongoing (and uncompletable) games – which opens the sphere of culture, as a semi-autonomous domain of variation and emergent outcomes.

[2] The same set of distinctions between the social, the a-social and the eusocial, is invoked by James A. Donald in his path-breaking study on the foundations of Natural Law:

The tripartite distinction echoes Aristotle’s classical statement, from the Politics: “Man is by nature a social animal; an individual who is unsocial naturally and not accidentally is either beneath our notice or more than human. Society is something that precedes the individual. Anyone who either cannot lead the common life or is so self-sufficient as not to need to, and therefore does not partake of society, is either a beast or a god.”

[3] We can say, more precisely, that any game overseen by an external referee is embedded within a greater game, perhaps recursively, until reaching the transcendental plane which isn’t subject to adjudication by anything beyond itself.

[4] The ubiquity of the PD coordination model does not escape Venkatesh Rao, who observes: “… the well-known Iterated Prisoner’s Dilemma (IPD) model [is] sometimes called the e. coli of social science research.” In the words of Simon Dedeo: “As a tool for the mathematical study of human behavior, [PD] is the equivalent of Galileo’s inclined plane, or Gregor Mendel’s pea plants.”

[5] Garrett Hardin’s ‘The Tragedy of the Commons’ (1968) first coined the term that now seems so indispensable. Despite its compelling simplicity, there is little sign of subsequent intellectual convergence upon what this model of overexploitation through coordination failure implies. At the political level, socialists and libertarians – equally – find their analyses and prescriptions supported by it. An ideological meta-tragedy has thus confirmed its insight in the very process of failing to draw common conclusions from it. Hardin’s classic essay can be found at:

[6] Radical coordination failure in biological systems is epitomized by the parasite that kills its host. Despite the difficulty of evaluating deep historical evidence, under natural conditions in which extinction is the fate of approximately all species, it nevertheless seems reasonable to interpret this extreme pessimal equilibrium as the exceptional case. It is widely recognized that diseases tend to decline in malignancy over time, as self-destructively extravagant forms of parasitism are weeded from the biological record. Epistemological and ontological selection effects here converge, as the ‘phenomenon’ of coordination failure is edited from the realm of evidence. (That we will tend to see what works is Darwinism itself.) As with the tragedy of the commons, parasitical relations – enveloping every kind of predator-prey interaction – are vulnerable to overexploitation failures. The attendant arms races are important drivers of biological diversity, and phenotypical extravagance. The effects of competition for light among trees – roughly, trees themselves – are only the most vivid example of the way biological form is dominated by the outcome of a long history of default to non-coordination.

[7] In The Strategy of Conflict (1960) Thomas Schelling emphasizes the importance of ‘credible commitment’ to classically-structured games. His insight is satirized – with great insight – in Kubrick’s Dr. Strangelove, which builds its plot around the understanding that the limit signal of commitment is strategic automation (or automatic retaliation). The strategic irrationality of making this extreme commitment without signaling it is a central comic device of the movie.  

[8] An entire poetics could be constructed in this space. Based upon the lacuna of credible commitments in the pure linguistic realm, it would reverse the game-theoretical problem into a source of creativity, by conceiving it as a rhetorical generator. (That is an undertaking for another occasion.)

[9] The Byzantine Generals’ Problem – which is the difficulty of achieving ‘Byzantine coordination’ – was initially named ‘The Two Generals Paradox’ upon its formulation by Jim Gray (in his ‘Notes on Data Base Operating Systems’, 1978), and was then generalized – to larger multiple agent systems – by Leslie Lamport, Marshall Pease, Robert Shostak (in 1980).

As humorously reformulated by Satoshi Nakamoto, in a post on the Cryptography mail list that scrupulously preserves the critical abstract properties of the problem: “A number of Byzantine Generals each have a computer and want to attack the King’s wi-fi by brute forcing the password, which they’ve learned is a certain number of characters in length. Once they stimulate the network to generate a packet, they must crack the password within a limited time to break in and erase the logs, otherwise they will be discovered and get in trouble. They only have enough CPU power to crack it fast enough if a majority of them attack at the same time. […] They don’t particularly care when the attack will be, just that they all agree. It has been decided that anyone who feels like it will announce a time, and whatever time is heard first will be the official attack time. The problem is that the network is not instantaneous, and if two generals announce different attack times at close to the same time, some may hear one first and others hear the other first.” The same post explains how a proof of work solution can be achieved:

It is especially important to note that the BGP formalizes the problem of coordination (in general) as synchronization. As already remarked (in Part One), it articulates a problem whose insolubility, in the context of cosmo-physical theory, coincides with general relativity, spacetime, and the renunciation of absolute succession. A solution to the BGP, therefore, is intrinsically post-relativistic. (Given the restoration of succession to the order of signs that follows from the innovation of the blockchain, the application of the ‘post-’ prefix in this case has exceptional – and reflexive – conceptual legitimacy.)

See also The Problem of Firing-Squad Synchronization, whose relevance is implicit in its name:

Crypto-Current (038)

§3.8 — Setting out on the path to a cognitive integration of Bitcoin calls for both anticipation and critical retrospection, and in fact compels them. Bitcoin drives a migration long promised by transcendental philosophy, from naïve ontology to a practical acknowledgment of the essence of being as the criterion of reality (finally indistinguishable from absolute succession, or order in-itself).[1] What emerges is nothing less than an artificial universe, founded – groundlessly – upon a spontaneously-engineered consistency. Once it is granted, practically, that no assertion of truth can be effectively sustained against a predominance of cognitive capability, all prospect of Archimedean (epistemological) leverage is subtracted. Bitcoin at once systematizes and implements this insight within its cycle of auto-production, establishing the foundations of transcendental authority through a realization of semiotic singularity. Truth is that which survives a process of elimination biased against duplicity.

§3.81 — The elegance, or economy, of Bitcoin’s virtual universe is fully consistent with a certain ontological luxuriance, encompassing a population of agents (represented by accounts), territories (wallets), objects (coins and coin-fragments), events (transactions), a consensual history (the blockchain), and – providing an ultimate criterion of reality – matter (computing power). Such tropical frondescence is also ecological. It generates niches, as zones of specialization, competition, and proliferation. As with all cases of techonomic revolution, the result is a ‘Cambrian Explosion’ of unpredictable, cross-stimulated innovation. The very meaning of ‘species’ undergoes escalation. As a side-consequence of its unprecedented ontological severity, or selectivity, Bitcoin triggers a re-population of the world.

§3.82 — Given the common principle of viral hijacking and double-spending, any DSP solution makes an immediate contribution to the field of computer security. “Trusted third parties are security holes,” Nick Szabo writes.[2] Bitcoin as critique is immediately security innovation, because immanence is self-policing. Transcendent sources of protection are vulnerabilities. It follows that Bitcoin security threats are characteristically extrinsic, applying to the edges of its commercium, where violence and fraud can be targeted at ‘people’ (IRL-IDs) and their insecure human flesh-machines. Most crudely, an individual can be menaced with a weapon (in meatspace), and told to hand over his private key. Alternatively, residual intermediaries – entrusted with the safekeeping of bitcoins – can abscond with them. Such dangers are, however, exogenous. Even when they are associated with Bitcoin in public perception, their origin lies elsewhere.

§3.83 — Bitcoin has yet to be hacked.[3] The principal security threat to Bitcoin is still conceived – as it was already at the origin – as a ‘51% attack’, in which a hostile party (or coalition) commands sufficient applied computing power to overwhelm the consensus, and subsequently re-configure the protocol to its convenience. This vulnerability is finally game-theoretical rather than narrowly technical, as are Bitcoin’s defenses against it.[4] Incentives are an integral factor. Stated with maximum crudity: Why would an attacker be motivated to destroy an asset that has already been captured? Subversion of Bitcoin requires that one first own it, at least to the degree that its devaluation becomes a self-inflicted injury. These questions are addressed a little more fully in Chapter 4 (directly following).
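
The incentive argument is, at bottom, simple arithmetic. A back-of-envelope sketch (all figures here are invented, purely for illustration):

```python
# Hypothetical numbers only: an attacker must first acquire hashpower and,
# in effect, exposure to the asset, so a successful attack devalues
# holdings the attacker already owns.

def attack_payoff(stolen_value, holdings, devaluation):
    """Net gain from a 51% attack: what is stolen, minus the loss
    inflicted on the attacker's own holdings by collapsing confidence."""
    return stolen_value - holdings * devaluation

# Stealing 1,000 units while holding 50,000 that lose 80% of their value:
print(attack_payoff(1_000, 50_000, 0.8))  # -39000.0 -> self-inflicted injury
```

Whenever the attacker's stake times the induced devaluation exceeds the theft, subversion is strictly dominated by playing honestly.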

§3.84 — Bitcoin is nothing less than a semiotic restoration – an Occidental analog of the Confucian rectification of names – and actually something more, because it is irreducibly innovative (on the efficient model of critique). For the first time, the securitization of a sign, as an economic token, has been understood. Meaning becomes hard currency. The immense philosophical revolution is implicit: It can be demonstrably made impractical to lie. Thus, by a negative and ‘merely technical’ route, all prior discourse on truth has been bypassed. With Bitcoin, there is now a truth engine. The consequences are not easily delimited. Even if Bitcoin remains to be definitively comprehended as the long-anticipated end of philosophy, there has never previously been a more convincing model for it. We know, from around the back, what truth is now.

§3.9 — While this book contains numerous signs representing economic values, this does not mean that it is made – even partly – out of money. The expression ‘BTC 21,000,000’ – as it appears here and in comparable texts – evidently has no monetary value whatsoever. From this alone we can confidently presume that monetary signs have some crucially distinctive characteristic, which is only very inadequately captured by any general semiotic determination such as ‘representations of economic value’. A monetary sign is something more than a sign that means ‘money’. Money, nevertheless, is made out of monetary signs.

§3.91 — In order for signs to function as money, they have not only to represent value as a signification, or to indicate it (for instance as an account code), they also have to bear it, as something else. Alongside the semiotic aspects of signification and indication – and even perhaps on occasions instead of them – monetary signs require the characteristic of commutation, collection, or allocation.[5] They involve real, rather than merely metaphorical, substitution or exchange, as a condition of possibility for expenditure. A language-user can spend time and energy emitting words, but the words themselves are not – in any rigorous sense – spent. Vocabulary is not consumed in the process of speaking or writing, because a word is not – unless merely figuratively – ‘passed’ from one party to another, but rather duplicated each time a message is communicated. Where a message is spread, or proliferated, money is transmitted – in accordance with the rules of double-entry book-keeping, and contrary to the dynamics of multiplication through double spending. Money, when functioning as such, is added to one wallet or account only in being deducted from another. Whenever – in contrast – money operates in the manner of a linguistic sign, it is spent without cost, and rapidly reduced to worthlessness.
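
The contrast admits a minimal sketch (names and amounts are illustrative):

```python
# Linguistic vs. monetary signs: copying a message multiplies it; moving
# money conserves the total, debiting one account exactly as it credits
# another (double-entry).

def broadcast(message, recipients):
    """Linguistic sign: each recipient receives a duplicate, and the
    sender loses nothing."""
    return {r: message for r in recipients}

def transfer(ledger, sender, receiver, amount):
    """Monetary sign: a credit here is a debit there; nothing is spent
    that was not held."""
    if ledger[sender] < amount:
        raise ValueError("insufficient funds")
    ledger[sender] -= amount
    ledger[receiver] += amount

ledger = {"alice": 100, "bob": 0}
total_before = sum(ledger.values())

print(broadcast("hello", ["alice", "bob"]))  # duplicated, not transmitted
transfer(ledger, "alice", "bob", 30)
assert ledger == {"alice": 70, "bob": 30}
assert sum(ledger.values()) == total_before  # money conserved, not copied
```

The DSP is precisely the threat that a digital monetary sign will behave like the first function while purporting to behave like the second.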

§3.92 — It would be convenient if the word ‘token’ were available to carry the sense of the allocative sign, and there is some indication that the word is being adopted in crypto-currency circles in this way, indifferent to potential interference (and confusion) from its previously established technical and philosophical usage.[6] In their ordinary deployment, tokens count as money. Yet precisely because they allocate more than they signify, their meaning has remained – overwhelmingly – lodged in obscurity. They circulate in immense numbers, saying little.

§3.93 — If a new semiotic settlement is to follow in the wake of the Bitcoin protocol, and its solution to the DSP, there is an alternative common term all-but destined – if not, in fact, simply destined – to be cemented into the foundations. The allocative sign is the coin. General acceptance, in this regard, requires only an increment of abstraction, accompanied by an automatic reversal. Once the crypto-currency ‘-coin’ suffix, rather than alluding to concrete specie, acquires the status of a defining model, the word ‘coin’ becomes the technically-precise bearer of a semiotic function. A coin, then, would be fully characterized as a unit of DSP-resolved currency, typically instantiated as a highly-virtualized, Internet-communicable ledger entry, reproduced on a blockchain.

§3.94 — Beside the signifier and the index – or no less beneath them – is the coin.

[1] Melanie Swan, whose writings on Bitcoin are distinguished by their extraordinary visionary sweep, describes the cryptocurrency as “in some sense … a supercomputer for reality”. (This is a ‘sense’ that she proceeds vigorously to explore.) Unfortunately, her framework of understanding is impaired at its highest level by a propensity to abundance theorizing, under the sign of cornucopian illusion. This error is common – and even typical – among those drawing upon transhumanist inspiration. As the genealogy of Bitcoin vividly demonstrates, the primary manifestation of digital abundance is not the supersession of the commodity, but spam. Bitcoin restores robust scarcity, precisely insofar as it filters out spam money. …

[2] See:

[3] Bitcoin is an experiment in digital security. See:

For an examination of more exotic threats, in the emerging epoch of quantum computing, see Vitalik Buterin:…

[4] See the Bitcoin paper, section 6: “The incentive may help encourage nodes to stay honest. If a greedy attacker is able to assemble more CPU power than all the honest nodes, he would have to choose between using it to defraud people by stealing back his payments, or using it to generate new coins. He ought to find it more profitable to play by the rules, such rules that favour him with more new coins than everyone else combined, than to undermine the system and the validity of his own wealth.”

In this same vein, Morgen E. Peck asks: “Why trust Bitcoin, or more specifically, why trust the technology that makes Bitcoin possible? In short, because it assumes everybody’s a crook, yet it still gets them to follow the rules. … In old security models, you tried to lock out all of the greedy, dishonest people. Bitcoin, on the other hand, welcomes everyone, fully expecting them to act in their own self-interest, and then it uses their greed to secure the network.” This is of course simply liberalism, as it was once understood. Bernard Mandeville’s The Fable of the Bees had already securely apprehended the principle. Originality lies in the implementation.

[5] The task of completing the basic triad of semiotic dimensions is a voyage into terminological torment. For signs to fold-back so far into themselves is an invitation to madness. If ‘commutation’ is vulnerable to ruinous interference from its dominant mathematical usage, ‘collective’ buckles under its hyper-density of ideological associations. To collect is to accumulate. The most primitive money, Nick Szabo suggests, consists of collectibles. Yet a social collective, in its strong ideological sense, reduces property to its zero-degree (with the full suppression of exclusive use). To invoke ‘collectivism’ in the context of monetary semiotics, then, could quite reasonably appear as a gratuitous provocation, only partially excused by the entertainment potential it releases. Its abrasiveness would most likely prove culturally unsustainable.

[6] The most firmly-established technical determination of the word ‘token’ is that locked into the logical and philosophical ‘type / token’ distinction, which has been adopted into computer science and programming. It distinguishes a type or class of thing from a thing. Wikipedia illustrates the distinction through an unimprovable sentence from Charles Sanders Peirce: “There are only 26 letters in the English alphabet and yet there are more than 26 letters in this sentence.” The counting of letter-types and letter-tokens is different. Only in the latter case does the arithmetic incline to economics, opening to factors such as production capacity and cost, batch sizes, and supply limits. (Transition to the economic register occurs via the product prototype, and its special, initial, or unique costs.) This book refers ambiguously either to the output of an authorial and editorial process, or to a unit from a print run. The digital complication of the distinction, or the meaning of an instance and its economics (encompassing the entire conceptual and practical problematic of intellectual property), is in this case especially evident. Typal property, of the kind found in IP, is reliably confounding. This coin, similarly, splits on the type / token fracture line, which divides it between its twin faces as an example of a class of coinage, and as a unit of currency. A ‘token’ in this sense has a definite relation to the idea of the non-duplicitous or allocative sign, since it isolates non-generic (or ‘numerical’) identity.
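
Peirce’s sentence can be made to count itself (a direct check of the distinction; case and punctuation are normalized, digits excluded):

```python
# Letter-types vs. letter-tokens in the Peirce sentence quoted above.

sentence = ("There are only 26 letters in the English alphabet "
            "and yet there are more than 26 letters in this sentence.")

letter_tokens = [c for c in sentence.lower() if c.isalpha()]
letter_types = set(letter_tokens)

print(len(letter_types))   # distinct letter-types: at most 26
print(len(letter_tokens))  # letter-tokens: well over 26
assert len(letter_types) <= 26 < len(letter_tokens)
```

Only the token count opens onto economics: types are not consumed by instantiation, tokens are produced one at a time.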

Crypto-Current (037)

§3.7 — ‘Singularity’ is a stressed sign, even in advance of its capture by theories of decentralized crypto-currency. It carries a complex of meanings that can easily appear inconsistent, and perhaps only arbitrarily concatenated (although this is not a conclusion drawn in the present work). The simplest – logico-grammatical – sense and usage of the term is fixed by contrast to plurality. ‘Singularity’ is the state of being singular (undoubled, or in any way further pluralized). This austere meaning has been overwhelmed by more exotic cosmo-physical, eschatological, and philosophical references – to the event horizon of gravitational collapse, to the ‘wall across the future’ drawn by emergent superintelligences, and to non-generic being beyond the metaphysics of unity.[1] The term is further complicated by its substantial overlap with individuation, which has itself accumulated technical semantic mass through its application to the study of complex systems. It is an essential characteristic of any complex system that it individuates (itself).

§3.71 — Bitcoin Singularity is over-determined within this cloud of associations. It is not only – as already proposed – an autonomization event, or threshold of individuation, but also a de-pluralization (through resolution of the DSP), and even a crisis (or ‘critical-point’) in the history of terrestrial intelligence, with definite invocation of Technological Singularity – for which it arguably provides an infrastructural foundation. Singularity eludes comparison. It can be designated, but not definitely signified. It marks a limit of objectification, rather than an object. Kantian transcendental realism – whose place-holder is the non-objective thing-in-itself – prepares us for it.

§3.72 — Solving the DSP upon the digital plane requires that the relevant entities – units of value – can be copied without being multiplied. Unless carefully formulated, therefore, the problem can appear simply insoluble (as a straightforward contradiction). How can digital replication be assimilated to the conservation of singularity? As seen, repeatedly, such apparent contradiction (or pseudo-paradox) is the reliable indication of an incompletely resolved diagonal problem. The solution is the scarce sign, consolidated as a concept, but also – and no less fundamentally – actualized as a technical achievement. Bitcoin realizes a diagonal function, instantiated through digitally-replicable but economically precious signs.

§3.73 — The Bitcoin singularity simulation is – among many other things – a philosophical event of extraordinary significance: the technical initiation of absolute succession. From this point, history explicitly enters the phase of synthetic ontology, or the techno-commercial production of being. Reality is re-grounded in a catalyzed – and henceforth catallactic – construction, which functions as an ultimate criterion. In all questions directed towards the veracity of signs, the blockchain is – if as yet only virtually – the terminal tribunal.[2] Intrinsic to this innovation is the necessity, or strict principle, that no superior authority is possible. Within the entire cosmos of signs, encompassing all social and cultural exchanges, it is only through the blockchain – or some adequate analog – that the extinction of duplicity is ensured.

§3.74 — Such claims can only appear hyperbolic. They correspond, as previously noted, to the objective idealism of transcendental philosophy, insofar as they dismiss all prospect of external epistemological leverage as pre-critical. Nothing can be brought to bear upon Bitcoin from without that is not manifestly inferior to it in respect to the capability for truth validation. There cannot be an intellectually compelling reason for any anthropo-philosophical criticism of Bitcoin to be believed. To be discredited, in this ultimate or transcendental milieu, is only to be effectively selected against. Such an eventuality does not depend upon a philosophical decision (in the still prevailing sense of this term), but upon abandonment through hard-forking and effective loss of consensual support. The blockchain automatically facilitates the subtraction of every cosmos – or advancing world-line – compatible with duplicity. Block validation, then, is the basic mechanism of a selective ontology.

§3.741 — It has to be expected that no less than several decades will be required for the full epochal radicality of this transition to be appreciated, at an even approximately adequate scale. The current (Perez) ‘Great Surge of Development’ and its installation of blockchain-based distributed systems sets the pace of cultural assimilation. In accordance with rhythmic historical precedent, the ‘wild exaggerations’ of the germinal phase become the conventional wisdom of the mature techonomic order.

[1] ‘Singularity’ has been an over-invested term, even prior to its inflation by intellectual fashion. In its philosophical usage, it refers to non-generic being. This acceptation has twin lineages, within Anglophone and Continental traditions, but these converge upon a (single) conceptual core. Both draw – critically – upon the Leibnizean principle of the identity of indiscernibles, which proposes that no two things can be different if their complete (or maximally-elaborated) logical definitions do not differ in any respect. This is a principle that succumbs to the general crisis of logicism, as it unfolded within the early 20th century, most decisively in the work of Gödel and Turing. It draws upon the notion of a comprehensive definition, which falls prey to criticism based on the discovery of irresolvable incompleteness, or non-computable numbers. Cantor’s late 19th century demonstration of diagonal argument anticipates the crisis of logical comprehension, in its most abstract features. The singular begins where the project of definition encounters a rigorously-insurmountable limit. ‘This is not that’ does not even begin to tell us what ‘this’ is, or to provide its name. Determination-through-negation stalls at the threshold of singularity, where logical signifiers are supplanted by diagonal indicators, pointing into the rigorously incomprehensible. The numerical identity of the thing designates a logically-intractable excess, conserved (diagonally) even after infinitely exhaustive qualitative determination. Transcendental numbers provide the pattern. Individuation is ontologically basic (or transcendental), even if it is typically missed, or misidentified, as ego or object. Real selves and things, in themselves, are singularities. Reality disintegrates into them. No universe can encompass singularities. It is rather that any apparent universe is fragmented by them. (Black holes are not cosmic furnishings, but doors.) 
Singularities are transcendental by definition, since they elude all super-ordinate jurisdiction, or domain-subordination. Laws ‘break-down’ at their boundaries. They are thus elements of absolute multiplicity, or difference without genre. Historical or ‘Technological’ Singularity is – if only superficially – another thing entirely. Vernor Vinge describes it (perfectly) as a “wall across the future”. Historical structures can be extended up to, but not into it. This ‘Singularity’ sets the absolute limit of all projections. Like a black hole, it is epistemologically-impermeable. John Smart has suggested that it might even – sensu stricto – be a black hole, achieved as an engineered techno-compression catastrophe. According to this forecast, communication time-lag minimization drives implosion. Only collapsed matter is fast enough for the future. Translated into the terminology of Bitcoin, optimization of the block discovery rate tends to singularity. Because impending terrestrial Singularity is a thing, and not a signification, it overspills every attempt at comprehensive definition. This stream of references is therefore far from exhaustive. In particular, there is an additional noteworthy employment of ‘singularity’ within the sphere of aesthetic production, designating the threshold at which the name of an artist acquires distinctive consistency, and thus efficient dehumanization. With a sufficiently abstract sense of the ‘artist’ this usage curves back into the main current. Singularity is the ultimate agent, or it is nothing.

[2] Once again, the obvious reference is found here:

Crypto-Current (note)

Attentive readers will notice a radical format change (due to WordPress updating). Footnotes, in particular, now work more like those in the book, although more strewn. Links have been returned to URLs, which is probably less convenient, but also more book-like.

No real point in mentioning this, unless to pre-emptively dismiss rumors of sudden-onset insanity.

Crypto-Current (036)

§3.6 — The duplicity of the sign has numerous variations, and double spending (narrowly conceived) is by no means the only one with direct relevance to Bitcoin. A digital monetary system is intrinsically open to fraudulence (manipulative duplicity) at every scale, since not only its currency units, but also its associated websites, exchanges, and institutions – up to the level of an entire implemented protocol or commercium – are vulnerable, as a matter of first-order principle, to cloning. In this (widened) sense, the DSP is the indicator of a fully scale-free vulnerability.

§3.61 — Bitcoin, as a whole, is replicable open source software. It has no secure uniqueness, besides that – by no means inconsiderable – of coming first. The fact that the distinctive identity of Bitcoin inheres solely in its originality – which is to say its historical privilege – is already an invitation to clone invasion, at multiple levels. Since the avenue of monetary counterfeiting is blocked by the Bitcoin DSP solution, digital duplicity is displaced, and in fact up-scaled. From the corruption of currency units, it is redirected into the corruption of currency institutions and systems. Fraudulent entities proliferate at the edge of the Bitcoin system, from fly-by-night scam sites to entire exchange businesses (whose structural corruption is as likely due to the unconscious consequences of defective design, as to malicious criminal intention).

§3.62 — Among these dubious displacements of the DSP, the propagation of more-or-less Bitcoin-like currencies has a special place. The topic of altcoins is particularly engaging, and easily merits a dedicated work on its own account. As a deposit for creative techonomic endeavor, these variant cryptocurrencies are perhaps unsurpassed. Yet, when approached on the grimmest and most narrowly-critical track, they appear as deviant paths off the Bitcoin blockchain,[1] and – worse still – as a recrudescence of the DSP, amplified to the level of entire currency systems.

§3.63 — It is unnecessary to make too much of the fact that no less than three different altcoins have been brazenly named ‘Scamcoin’.[2] ‘Hammer of the altcoins’ Daniel Krawisz argues that they are all scams,[3] comparing them to cargo cults, for which there is an expectation of “similar results through blind imitation”. According to this argument, the proliferation of altcoins is a pathological phenomenon, to be denounced as an impediment to the emergence of Bitcoin’s natural monopoly (since, due to network effects, “one would always expect a single currency to overcome all its competitors”[4]). Because they sap network-effects, however feebly, altcoins are a parasitic drain, interfering with the ability of Bitcoin to rapidly reap the full consequences from its first-mover advantage. Krawisz writes: “…once Bitcoin exists, then there is no additional value, from a monetary standpoint, of creating knock-offs. … What makes Bitcoin great cannot easily be duplicated. … Altcoins can only be explained if we believe the purpose of cryptocurrencies is to make money rather than to become money.” 
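The natural-monopoly argument rehearsed here can be made concrete with a toy simulation (a sketch under stated assumptions, not a model of any actual market; all names and parameters are illustrative): if each new adopter joins a network with probability proportional to its current share raised to a power greater than one – super-linear network effects – an initial first-mover lead compounds toward effective monopoly.

```python
# A toy simulation of the natural-monopoly argument (all parameters are
# illustrative assumptions, not measurements): each new adopter joins a
# network with probability proportional to its current user count raised
# to a power alpha > 1 -- i.e. super-linear network effects.

import random

def adopt(users, alpha=2.0, steps=10_000, seed=0):
    """Run `steps` adoption events over competing networks."""
    rng = random.Random(seed)
    users = list(users)
    for _ in range(steps):
        weights = [u ** alpha for u in users]
        r = rng.random() * sum(weights)
        for i, w in enumerate(weights):
            r -= w
            if r <= 0:
                users[i] += 1   # this network wins the new adopter
                break
    return users

# The incumbent starts with a modest first-mover lead (60 vs 40 users);
# positive feedback compounds an early lead toward effective monopoly:
final = adopt([60, 40])
print(final[0] / sum(final))    # share of network 0 after 10,000 adoptions
```

With alpha = 1 (merely linear network effects) shares drift rather than lock in; the winner-take-all dynamic the argument invokes depends on the super-linearity assumption.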

§3.64 — Between Bitcoin and a close-clone altcoin, the difference that matters is invisible to even the most painstaking inspection of code. To avoid distraction, it is advisable to suspend all such comparison, and to assume – instead – perfect duplication. Bitcoin – as an event or real singularity – has no exclusive essence that can be separated from its history. It is merely an instantiation of its own code, even if the first one. Its currency potential is a matter of momentum, exhausted by its path dependency (or “history and community” as Krawisz puts it). Only the workings of nonlinear network effects, based upon its ‘first-mover’ or ‘incumbency’ advantage – rather than any determinable differences in kind – distance Bitcoin, in principle, from its proximate competitors.

§3.65 — Bitcoin does not defeat forgery by being difficult to forge, but rather – absolutely – the opposite. It abandons such terrain in advance, on the implicit assumption that all original identity is indefensible in the digital epoch. Synthetic being, alone, can secure itself. Once again, and not for the last time in this exploration, we are returned to the rift – the abyss. Bitcoin’s integrity is groundless. Every imaginable redoubt of essential uniqueness is denied to it in principle (or a priori). It can be based upon nothing other than the circuitry of auto-production, whose only ‘foundation’ lies within itself.

[1] There are three basic ways an altcoin can relate (‘cladistically’) to the Bitcoin blockchain. Competitor currencies, in particular, typically represent a separate lineage, initiated by cloning and minor modification of the Bitcoin protocol. Only slightly more speculatively – which is to say experimentally – they can be produced by ‘hard fork’ (speciation) events within the blockchain. Within the emerging digital ecology, hard forks can be expected to make an important contribution to basic political-economic conceptuality. Perhaps the most interesting possibility, with regards to evolutionary complexity, is provided by the option of attachment as side-chains. In this situation, a comparatively high degree of intricate, symbiotic co-evolution is built into the pattern of rising diversity from the beginning. See:

Bitcoin’s first hard fork occurred in August 2017, with the split of Bitcoin Cash. The break divided the crypto-currency between a major lineage prioritizing the security of a store of value, and a minor lineage prioritizing convenience as a means of payment. Monetary theory had become a process of experimental dissociation. Fitz Tepper at TechCrunch commented helpfully: “Essentially, like everything else in crypto, no one knows what’s about to happen next.”

In discussing the relation between the archaic RNA world and its obscure predecessor – perhaps Cairns-Smith-type information-preserving clays – as a highly-abstract analogy to the potential transition to a post-DNA hegemonic information medium, Hans Moravec coined the term ‘replicator usurpation’. In this regard, the comparison of blockchains to genomes is of evident relevance. Both are characterized by the massive redundancy that comes from ubiquitous copying. Every cell or node contains a full version of the record. Additionally, speciation functions comparably in both cases. Variants stemming from any given speciation event (or hard fork) share a lineage. Diversity has a cladistic structure, or fragmentation record, registered in the conservation of common code.
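The cladistic point – divergence events conserved as common code – can be rendered as a few lines of toy Python (illustrative names only; no relation to actual Bitcoin data structures): a hard fork conserves the full shared history in both successor lineages, and that shared prefix is the record of their common ancestry.

```python
# A toy rendering of blockchain cladistics: a hard fork (speciation event)
# yields two lineages that conserve the whole pre-fork chain, then diverge.

def fork(chain, n=3):
    """Speciation: both lineages keep the entire shared history, then
    extend it with divergent blocks."""
    return (chain + ["A%d" % i for i in range(n)],
            chain + ["B%d" % i for i in range(n)])

def common_ancestry(x, y):
    """Length of the shared prefix -- the conserved common code that
    registers the cladistic relation between two lineages."""
    n = 0
    for a, b in zip(x, y):
        if a != b:
            break
        n += 1
    return n

genesis = ["blk%d" % i for i in range(5)]
btc_like, bch_like = fork(genesis)          # one fork event, two lineages
print(common_ancestry(btc_like, bch_like))  # -> 5: full pre-fork history shared
```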

Fred Ehrsam notes that: “Forking is a … critical evolutionary mechanism for blockchains. Just like mutations to DNA in biological organisms allow for evolution through natural selection, forking lets us run multiple experiments in parallel where the strongest versions survive.”


The topic of forking, amid other types of crypto-currency proliferation and diversification, will return in relation to the concept of inflation in a post-macroeconomic world. The emerging monetary schematics suggest spontaneous (decentralized) market regulation of the price of money via the propagation of difference rather than the amplification of the same. Speciation replaces printing as a deflation-control.

[2] The original ‘ScamCoin’ was released in January 2014. It was succeeded by two further altcoins bearing the same name.

[3] See:

[4] The argument for natural cryptocurrency monopoly, in its most abstract features, is a strict analog of the proposal, advanced among certain influential voices engaged with the prospect of AI-Singularity, that such an event would be expected to install an effectively-unchallenged ‘Singleton’. In both cases, the argument identifies a point of criticality (or singularity) at which first-mover advantage is amplified explosively, by powerful positive feedback, leading rapidly – or at least with exponential cybernetic inevitability – to total domination. An articulate defense of this idea has been presented by Nick Bostrom:

Crypto-Current (035)

§3.44 — From the perspective of the miner, bitcoins are immanent remuneration for primary production, or resource extraction. They function as digital gold. As the simulation of a finite resource, it is natural that their production rate should exhibit declining marginal returns. Each increment of mining effort confronts an increasingly challenging environment, under conditions of steady depletion. For Bitcoin, as for gold, economic dynamics automatically counter-balance industrial exertion, as prices adjust in response to supply constraints. This process of continuously revised bitcoin price discovery cannot be determined within the protocol, but occurs at its edges, where economic agents trade into, and out of, bitcoins – synthesizing the Bitcoin commercium with its outside.

§3.45 — Within the protocol, adjustments are restricted to supply modifications, modeling the depletion of an abstract resource that is advanced as a general commodity (i.e. money). Bitcoin splits its schedule of decreasing returns in two, separating its measures of reward and difficulty. This double contraction – while clearly redundant from the viewpoint of austere abstract theory – enables a superior degree of flexible calibration, in response to a dynamic environment, volatilized above all by rapid improvements in computational engineering (and product delivery). By dividing bitcoin output compression between two interlocking processes, the protocol is able to stabilize the rate of block validation in terms of an ‘objective’ (external) time metric. The difference between these two modes of nominal reward restriction reflects a schism in time, between the intrinsic, intensive, absolute succession of the blockchain, and the extrinsic, geometricized order of pre-existing (globalized) chronological convention. Integrated reward is a complex chrono-synthesis, occurring at the boundary where Bitcoin’s artificial time – proceeding only by successive envelopment (of blocks into the chain) – meets the social-chronometric time of measurable periods. ‘Ten minutes’ means nothing on the blockchain (in itself), until connected by an interlock that docks it to a chronometer.

§3.46 — Are not all blocks time-stamped? it might be objected. To avoid confusion at this point, it is critical to once again recall the difference between the ordinal and the cardinal, succession and duration. Time-stamps are ordinal indicators, without any intrinsic cardinality, and with merely suggestive cardinal allusion. They implement an ordering convention. Metric regulation of periods is an entirely distinct function. ‘Chain’ means series (and nothing besides).

§3.47 — The bitcoin reward rate halves, stepwise, in four-year phases, on an asymptotic progression towards the limit of BTC 21,000,000 – the protocol’s horizon of zero-return. Taken in isolation, this exponential decline looks smoothly Zenonian (asymptotic), or infinitesimalizing, until arbitrarily terminated at a set point of negligible output. It is scheduled to pass through 34 ‘reward eras’, in the last of which – with block 6,930,000 – BTC issuance reaches zero. Due to the power of exponential process, 99% of all bitcoins are issued by Era-7 (during which 164,062.5 bitcoins are added to the supply).* The end of Bitcoin’s mining epoch is anticipated in the year 2140. After this point, at a date so distant that it belongs to the genre of science fiction, continuation of the system requires that mining-based block validation incentives are fully replaced by transaction fees. Evidently, the transition process cannot be expected to await its arrival at this remote terminus, which marks a point of completion, rather than inauguration.
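The arithmetic of the reward schedule can be checked with a short Python sketch. The constants (a 210,000-block era, an initial 50 BTC subsidy, amounts tracked in integer satoshis and halved by right-shift) follow the protocol as publicly documented; variable names are illustrative.

```python
# A sketch of the reward schedule described above. The block subsidy
# starts at 50 BTC and halves every 210,000 blocks (one 'reward era').
# Amounts are tracked in integer satoshis (1 BTC = 100,000,000 satoshis)
# and halved by right-shift, which is how the protocol itself truncates
# sub-satoshi fractions.

BLOCKS_PER_ERA = 210_000
SAT = 100_000_000  # satoshis per bitcoin

def era_issuance(era):
    """Total satoshis issued during a given reward era (1-indexed)."""
    subsidy = (50 * SAT) >> (era - 1)  # stepwise halving
    return subsidy * BLOCKS_PER_ERA

# The subsidy reaches zero in the 34th era, capping total issuance:
total = sum(era_issuance(e) for e in range(1, 34))
print(total / SAT)                    # just under 21,000,000 BTC
print(era_issuance(7) / SAT)          # 164,062.5 BTC added during Era-7
cumulative_7 = sum(era_issuance(e) for e in range(1, 8))
print(cumulative_7 / total)           # ~0.992: over 99% issued by Era-7
```

The sketch confirms both figures cited in the text: Era-7 adds exactly 164,062.5 BTC, and cumulative issuance crosses the 99% mark within it.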

§3.48 — The reward schedule is further tightened by increasing difficulty of the hashing problem. Rather than executing a pre-programmed deceleration, Bitcoin’s rising difficulty responds dynamically to technological acceleration, and balances against it, thus holding the block validation rate roughly constant. Even as the reward rate tumbles – when denominated in BTC – the block processing rate is approximately stabilized, at a rate of one block every ten minutes, regardless of the scope and intensity of mining activity.
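In the actual protocol, the stabilization described here is a periodic retargeting: every 2016 blocks the elapsed time is compared against the expected two weeks (2016 blocks × 10 minutes), and difficulty is rescaled proportionally, with the adjustment clamped to a factor of four in either direction. A minimal sketch (names illustrative):

```python
# A sketch of Bitcoin's difficulty retargeting rule: every 2016 blocks,
# compare the actual elapsed time against the expected two weeks and
# rescale difficulty proportionally, clamping the adjustment to a factor
# of four in either direction.

RETARGET_INTERVAL = 2016                              # blocks per epoch
TARGET_SPACING = 10 * 60                              # ten minutes, in seconds
EXPECTED_SPAN = RETARGET_INTERVAL * TARGET_SPACING    # two weeks, in seconds

def retarget(difficulty, actual_span):
    """New difficulty after one 2016-block epoch lasting actual_span seconds."""
    clamped = min(max(actual_span, EXPECTED_SPAN // 4), EXPECTED_SPAN * 4)
    return difficulty * EXPECTED_SPAN / clamped

# If aggregate hashpower doubles, blocks arrive at ~5-minute spacing, and
# the next adjustment doubles difficulty to restore the ten-minute rate:
print(retarget(1.0, EXPECTED_SPAN // 2))  # -> 2.0
```

The clamp is what keeps the feedback loop stable: however violently hashpower swings within one epoch, difficulty moves by at most 4× per adjustment.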

§3.49 — ‘Difficulty’ modification is a synchronization. The Zenonian time of intensive compression that determines the BTC reward-rate is – taken on its own – wholly autonomous, or artificial. As already noted, its chronometric ‘ticks’ are block validation events, registered in serial block numbers (and their ‘epochs’). They have no intrinsic reference to the units of ordinary time. It is only with the stabilization of the block-processing rate that the time of Bitcoin is made historically convertible, or calendrically intelligible, through the latching of block numbers to confirmed or predicted dates. This is a supplementary, synthetic operation, which coincides with the protocol’s anthropomorphic adoption. The time of the blockchain is intrinsic, and absolute, but its history is a frontier, where it engages ‘us’. As the blockchain is installed, and thus dated, an artificial time in-itself – consisting only of absolute succession – is packaged as phenomenon.

§3.5 — It can easily be seen that bitcoin mining is an arms race, of the ‘Red Queen’ type.** Since the total bitcoin production rate has zero (supply) elasticity, local advances in production can only be achieved at the expense of competitors. In consequence, inefficient miners are driven out of the market (as their costs – especially electricity bills – exceed the value of their coin production). This brutal ecology has forced rapid technological escalation, as miners upgrade their operations with increasingly specialized mining ‘rigs’. In the course of this process, the standard CPUs initially envisaged as the basic engines of bitcoin mining have been marginalized by dedicated hashing hardware, from high-performance graphics processing units (GPUs) – originally designed for application to computer games – through field-programmable gate arrays (FPGAs), to application-specific integrated circuits (ASICs). Bitcoin has thus stimulated the emergence of a new information technology industrial sub-sector.

§3.51 — With the completion of this production cycle, Bitcoin Singularity is established in a double sense (we will soon add others). An unprecedented event has occurred, upon a threshold that can only be crossed once, and an innovation in autonomization attains actuality, establishing the law for itself. Bitcoin provides the first historical example of industrial government. It is ruled in the same way that it is produced, without oversight. At the limit, its miners are paid for the production of reality – effectively incentivized to manifest the univocity of being as absolute time.***

* For a more detailed description of the Bitcoin reward schedule, see.
For a compact, chronometric representation of mining difficulty, see.

** The Red Queen dilemma, as formulated by Lewis Carroll in Through the Looking-Glass, is that “it takes all the running you can do, to keep in the same place.” Daniel Krawisz makes another comparison: “When a person upgrades their mining computer, they mine at a faster rate and therefore earn more bitcoins. However, when everyone upgrades, the mining does not become more efficient as a whole. There is only supposed to be one new block every ten minutes regardless of how hard the network is working. Instead, the network updates the difficulty to require more stringent conditions for future blocks. All miners may work harder, but none is better off. It is rather like a forest, in which every tree tries to grow as tall as possible so as to capture more light than its fellows, with the end result that most of the solar energy is used to grow long, dead trunks.”

*** The doctrine of the univocity of being is derived from Duns Scotus, and passes into modernity by way of its implicit contribution to Spinozistic ontology, as re-activated by Deleuze. It can be formulated in various ways. Most basically, the meaning of ‘being’ is insensitive to its application, and unaffected by differences of kind. Thus, the being of any being is no different from that of any other. God is not a flake of dandruff (and differs very significantly in kind from one, whether the distinction is entertained from a theist or atheist perspective), but the being of God has no difference whatsoever from that of a flake of dandruff – and even if God is held not to exist, the being denied him is the same as that of any existent thing. In other (more ‘Heideggerian’) words: ‘Being’ is not susceptible to ontic qualification. In its pure conception, therefore, what is said by ‘univocity of being’ is exactly equivalent to ontological difference.