“How do we overcome this paradoxical era of hyped-up individualization that results precisely in the algorithmic outsourcing of the self?”
– Geert Lovink, Networks Without a Cause
“…software studies need to be open to a plurality of approaches and techniques, striving to use those tools that provide us with useful empirical material for making sense of the sociality and spatiality of code.”
– Rob Kitchin and Martin Dodge, Code/Space: Software and Everyday Life
In her essay Red Stack Attack! Algorithms, Capital and the Automation of the Common (2014), written for the accelerationist reader, Tiziana Terranova tells us that what is at stake is nothing less than the relationship between ‘algorithms’ and ‘capital’: “the increasing centrality of algorithms to organizational practices arising out of the centrality of information and communication technologies stretching all the way from production to circulation, from industrial logistics to financial speculation, from urban planning and design to social communication” (381).
Thinking on the above, I was reminded of what James C. Scott once said in his Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed:
The planned “scientific city,” laid out according to a small number of rational principles, was experienced as a social failure by most of its inhabitants. Paradoxically, the failure of the designed city was often averted, as was the case in Brasília, by practical improvisations and illegal acts that were entirely outside the plan. Just as the stripped-down logic behind the scientific forest was an inadequate recipe for a healthy, “successful” forest, so were the thin urban-planning schemata of Le Corbusier an inadequate recipe for a satisfactory human community.2
In both statements above we encounter what the authors of the two-volume The Probabilistic Revolution call the power of probability: “Probability theory appeared to provide an answer to the problems of drawing inferences from data subject to a variety of uncontrolled influences and the need to find rules for theory evaluation in these circumstances” (3).3 In the matter of theory, only two scientific disciplines have truly bound themselves to probabilistic and statistical constructions: physics and evolutionary biology. Yet it is within this world of probabilistically uncertain mathematics that both quantum theory and certain forms of economic theory (neo-Keynesianism) would forge their tools. As the complexity of probabilistic and statistical equations became increasingly difficult for the mathematicians themselves to master, the need for an alternative arose. It was out of this need that the information-processing or computer age was initiated. The notion of planning anything these days is beyond our human computational capabilities: ergo, we invented algorithms to do the job for us. But algorithms inhabit not only the virtual spaces of hardware and computers; they are the engines of creation that drive our social and political domains as well. From Wall Street to the great financial institutions of Europe and Asia, we are caught in the complex web of an accelerating war of competing algorithms.
Yet it was actually a difficulty faced by gunners in WWI that would become the engine driving the future of this whole information age. Thorstein Veblen’s nephew Oswald realized the need for a better and more accurate way of firing large artillery, and he needed the help of human computers – mathematicians – to do the job. As George Dyson tells it:
Veblen organized the teams of human computers who were placed under his command, introducing mimeographed computing sheets that formalized the execution of step-by-step algorithms for processing the results of the firing range tests. It took the entire month of February to fire the first forty shots, yet by May his group was firing forty shots each day, and the growing force of human computers was keeping up.4
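The flavor of the step-by-step arithmetic those computing sheets formalized can be suggested with a toy sketch: marching a shell forward in small time increments, exactly the sort of repetitive calculation a human computer would grind through by hand. Everything here – the muzzle velocity, the drag coefficient, the Euler time step – is a hypothetical illustration, not a reconstruction of the actual firing-table procedures.

```python
import math

def range_of_shot(v0, elevation_deg, drag=0.00005, dt=0.01, g=9.81):
    """Step a shell forward in time until it returns to ground level.

    A crude explicit-Euler integration of projectile motion with a
    simple quadratic air-resistance term; returns the horizontal
    range in meters. All constants are illustrative only.
    """
    theta = math.radians(elevation_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(theta), v0 * math.sin(theta)
    while True:
        speed = math.hypot(vx, vy)
        # drag opposes the direction of motion, growing with speed
        ax = -drag * speed * vx
        ay = -g - drag * speed * vy
        # one row of the "computing sheet": advance position, then velocity
        x += vx * dt
        y += vy * dt
        vx += ax * dt
        vy += ay * dt
        if y <= 0.0 and vy < 0.0:  # shell has come back down
            return x
```

Each pass through the loop is one line of the sheet; a month of such lines, multiplied across a team, is what Veblen’s group compressed into routine.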
But it would be one of his recruits, Norbert Wiener – “a twenty-four-year-old mathematical prodigy well trained after two years of postdoctoral study in Europe, but socially awkward and discouraged by the failures of his first teaching job” (KL 637-639) – who would eventually work out how to calculate artillery fire effectively. After the war Veblen would become instrumental in bringing together many of the mathematicians who would ultimately provide the knowledge base from which our digital age was first conceived. As Dyson relates it, quoting Freeman Dyson: “The School of Mathematics has a permanent establishment which is divided into three groups, one consisting of pure mathematics, one consisting of theoretical physicists, and one consisting of Professor von Neumann” (KL 987). It was in this third kingdom of mathematics, as formulated by von Neumann, that the digital universe was conceived and “numbers would assume a life of their own” (991).
But as Terranova relates, algorithms would not be confined to the digital universe alone; they would become part of our everyday life, becoming increasingly coextensive with processes of production, consumption, and distribution as displayed in logistics, finance, architecture, medicine, urban planning, infographics, advertising, dating, gaming, publishing, and all kinds of creative expression (music, graphics, dances etc.) (382).
Algorithms, Capital and Automation
In this section of her essay Terranova plays off the notion of automation, and specifically off two types of automation: the industrial-thermodynamic and the digital electro-computational models. The industrial type gave rise to a system ‘consisting of numerous mechanical and intellectual organs so that workers themselves are cast merely as its conscious linkages’ (i.e., we’ve seen this already in Marx’s ‘Fragment on Machines’ in the reader, 55). The digital form of automation, on the other hand, involves the brain and nervous system of living labor as intellectual or cognitive labor, which unfolds within “networks consisting of electronic and automatic relays of a ceaseless information flow” (383). It is within this digital form of automation and its spatial model that she will locate the political possibilities of any new algorithmic modes.
After describing the typical nature of algorithms (i.e., what they do, the work they perform, how they are situated within certain material and immaterial assemblages, etc.) she remarks that as far as capital is concerned “algorithms are just fixed capital, means of production finalized to achieve an economic return,” just like any other commodity (385). In this sense algorithms have replaced living labor – the worker herself – as the site where the temporal aspects of labor (labor time, disposable time, etc.) play themselves out. Instead of the alienated presence of the human in the machine, a mere appendage driving and guiding it through its everyday processes, the human has been stripped out of the process altogether as non-essential or disposable, and the algorithm as an abstract machine now occupies that site.
Yet, as Terranova reminds us after Marx, we must not reduce the algorithm to its “use value” only, but must also see it within the context of “aesthetic, existential, social, and, ethical values” (386). She asks whether it wasn’t the reduction of software to its exchange value that drove many hackers to opt out of the strictly commercial world and invent an alternative type of economics (her example: Richard Stallman and the Free and Open Source movement). Isn’t this, she asks, at the heart of the hacker ethic and aesthetics – the need to escape the constraints of exchange value that capital has imposed upon the software industry?
She also reminds us that we must not equate technology in some absolutist fashion with ‘dead labor’, ‘fixed capital’, or ‘instrumental rationality’; rather, we should understand that the reduction of labor costs enabled by capital’s investment in technology frees up ‘surplus’ labor not for the benefit of the worker herself, as free time, but as that part of the cycle of production and exchange which is continuously reabsorbed as profit and gain for the few (the collective capitalists) at the expense of the many (the multitudes) (387).
She describes the litany of effects that this neoliberal form of capitalism has brought to fruition in the closing time of our era: global poverty, psychic burnout, environmental degradation, resource depletion, war, etc. To remedy this she aligns herself with Maurizio Lazzarato’s notion of a post-capitalist society based on an autonomous and enduring focus on subjectification, one that entails not only a better distribution of wealth but also the reclamation of ‘disposable time’ – that is, “time and energy freed from work to be deployed in developing and complicating the very notion of what is ‘necessary’” (387).
Against the exploitation of the existing and corrupt profit system of neoliberal technocapitalism, with its cycle of crash and burn at the expense of the many, she argues that the freeing up of ‘disposable time’ could finally fulfill Marx’s dream of the free creation of new subjectivities able to reshape what is “necessary and what is needed” (388). This is not some return to a pristine natural world; it is the hard work of feeding populations, constructing shelters, providing education and healthcare, caring for children and the elderly, etc. What we need, she tells us, are new ways of achieving these goals – ways that no longer exploit for profit and gain but bind us to a ‘commonfare’, a notion drawn from the work of Andrea Fumagalli and Carlo Vercellone: “the socialization of investment and money and the question of the modes of management and organization which allow for an authentic democratic reappropriation of the institutions of Welfare… and the ecologic re-structuring of our systems of production” (388-389).
The Red Stack: Virtual Money, Social Networks, Bio-Hypermedia
She follows Benjamin H. Bratton in developing a new nomos of the earth that links technology, nature, and the human in what he terms the ‘stack’ (389-391). As she tells it, the stack supports and modulates a kind of ‘social cybernetics’ able to compose ‘both equilibrium and emergence’ (390). The stack provides a platform hooked into what Williams and Srnicek term ‘The Network’: as a ‘megastructure’, the stack becomes a cartographic device that incorporates a normative, standards-based verticality and a topographically layered organization of artificial and human components, both everyday and digital (see the essay for details).
Against the mapping provided by Bratton she proposes an alternative she terms the ‘Red Stack’ – a new nomos for a post-capitalist commons (390). To build it, she tells us, we must engage three aspects of the socio-technical systems of innovation: virtual money, social networks, and bio-hypermedia. On money she cites authors as diverse as Christian Marazzi (money as a series of signs), Antonio Negri (money as an abstract machine), and Maurizio Lazzarato (money as both exchange and as investment in alternate futures), along with Andrea Fumagalli, who asks whether the money being created in the digital realm (i.e., bitcoin, etc.) as an experiment in alternative exchange might offer a way to “promote investment in post-capitalist projects and facilitate freedom from exploitation, autonomy of organizations, etc.” (392). She affirms the central role algorithms will play both as creators of virtual money and as its possible politically inclined agents (390). Any plan will need to incorporate these virtual monies as part of the subjectivation process, in the creation of productive subjectivities open toward the “empowering of social cooperation” (390).
Social networks and social plug-ins are so prevalent, and rely on such a complex set of data structures and algorithms to support the interactions within these spaces, that circumventing the strictures of capitalist modes with post-capitalist modes of use will entail not only the organization of resistance and revolt but also the creation of new social modes of self-creation and self-information. At the moment these are aligned with notions of autonomy and singularity, but they could instead be linked to form new collectives, new assemblages that within the red stack would hijack existing social networks and repurpose them as a distributed platform for learning and education: fostering and nurturing new competencies and skills, forging planetary connections, and developing new ideas and values (395).
Coined by Giorgio Griziotti, the term ‘bio-hypermedia’ names the interface between our bodies and the technological devices that have become our intimate connections to the world of relations. As devices are miniaturized and mobilized, and as apps become the downloadable extensions of this world of relations, we enter ‘code spaces’ that are becoming less virtual and more actual as software moves from the desktop into the everyday world of objects. More and more we inhabit the infosphere like fish in the ocean, swimming in the information and communication that swirls around us. As she describes it, these new “spatial ecosystems emerging at the crossing of the ‘natural’ and the artificial allow for the activation of a process of chaosmotic co-creation of urban life” (396). Rather than being subsumed within the networks of consumption and surveillance, as in the neoliberal order, the new post-capitalist world will open up a new ‘imaginary’ and make room for alternative forms of hardware design and applications for these collective social devices (396).
In conclusion she offers the notion that algorithms will be the basis of any ongoing construction of the commons. Not only will algorithms be a central component within The Network; they will open new potentialities for post-neoliberal modes of governance and post-capitalist modes of production (397). This will entail nothing less than a takeover of the very infrastructures of the current corporatized internet, repurposing them toward an open, egalitarian social system no longer based on monetization and privatization – a way out of the neoliberal order of debt, austerity, and accumulation (397). She tells us this is not a pipe-dream but a “program for the constituent social algorithms of the common” (397).
In Part Two, Section Four, I’ll open up toward the essays of Luciana Parisi, who deals with speculative reason in the age of the algorithm. Then we’ll move on to Reza Negarestani, Ray Brassier, Benedict Singleton, and Patricia Reed, along with a final gambit or rebuttal from Nick Land in his Teleoplexy: Notes on Acceleration, within this same volume.
1. #Accelerate#: The Accelerationist Reader. Eds. Robin Mackay & Armen Avanessian (Urbanomic, 2014)
2. Scott, James C. Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed (Yale University Press, 1998). Kindle Edition, Kindle Locations 5777-5781.
3. The Probabilistic Revolution. Two Volumes. Editors: Lorenz Kruger, Gerd Gigerenzer, and Mary S. Morgan (MIT Press, 1987)
4. Dyson, George (2012-03-06). Turing’s Cathedral: The Origins of the Digital Universe (Kindle Locations 633-636). Knopf Doubleday Publishing Group. Kindle Edition.