Protocols: Rage Against the Machine

Roy Ascott, Minitel, 1985, in Les Immatériaux, Centre Pompidou

The meltdown of the financial markets in 2008 began to provoke a new way of looking at machines. In an article titled “Market: Rage Against the Machine”, journalists analysed how the market could be further ruined by machines in addition to its own disruption. Financial workers are not factory workers, who in the old days dared to propose the sabotage of machines. But the rage against machines is increasingly significant; the reporters wrote: “What frightens investors most is a sudden evaporation of liquidity, when everyone pulls back at once and there is no one to provide a firm price to an investor wanting to sell. In 1987, investors accused some market makers of not answering their phones so that they would not have to buy shares from panicking sellers. Today, human market makers have largely been replaced by ultra-fast computer systems trading with high frequency. But like the human traders of yesterday, the machines can and do back away if markets are disrupted.”1

In this example, we see not only protocols – which are in some way invisible behind the screen and the ecstasy of gambling on a global scale – but also standards – how trading is implemented globally by connecting different machines with different protocols – and social norms: usages, distrust, rage, the habit of not picking up the phone when grumpy, the bodily gesture of staring at the display of stock prices.

The understanding of protocols in current media theory is more or less based on Alexander Galloway’s book Protocol: How Control Exists after Decentralization (2004), which excellently illustrates how network protocols embed different forms of control. The common take on Galloway always sees protocol as an agreement or a diplomacy of communication; this understanding is unfortunately very limited. Hyperkult 2013, with its theme Standards, Norms and Protocols, is also an occasion to revisit the question of protocols. That protocol is a means of control is by now evident, but in order to understand protocol, it is necessary to discuss it together with norms and standards.

If a textbook definition of protocol is required, then most fundamentally we can define a protocol as an agreement:

Basically a protocol is an agreement between the communicating parties on how communication is to proceed. As an analogy, when a woman is introduced to a man, she may choose to stick out her hand. He, in turn, may decide either to shake it or kiss it, depending, for example, on whether she is an American lawyer at a business meeting or a European princess at a formal ball.2

Another definition, from An Educator’s Guide to School Networks, gives a better picture of the functionality of a protocol:

A protocol is a set of rules that governs the communications between computers on a network. These rules include guidelines that regulate the following characteristics of a network: access method, allowed physical topologies, types of cabling, and speed of data transfer.3

Understood as such, a protocol is an etiquette; whether it is diplomatic or not depends on purposes, situations and functionalities. A protocol for network security is different from the MIDI protocol that transfers signals between keyboards and synthesizers. Protocols are inventions that solve problems of discontinuity or blockage in technical progress; for example, packet switching solves the problem of the centralisation of traffic. In comparison with standards, we can all invent different types of protocols, but we cannot easily invent standards, since what matters most for standardisation is not its content but its capacity or potential to be universalised, backed up by different institutions and interests.

Simondon, in Imagination et Invention (2008), has a short passage explaining the relation between protocols and norms. To be clear, Simondon did not use the word protocol but rather spoke of the formalisation of mental images. Formalisation has to do with the different levels of granularity that keep different agents in the same operational process compatible with one another. In order for these protocols to extend to a wider milieu, they have to be universalised. Universalisation is a process; it has to be distinguished from the universal. The universal is always a goal ahead, the absolute that one can never reach. Standardisation is often driven by different interests and institutional obligations. Standards organisations such as the W3C and ISO are bodies that promote standards and are entangled with industrial interests; it is indeed intriguing that the W3C, an organisation that proposes open standards or even an open web, considered integrating DRM into HTML5.
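The claim that anyone can invent a protocol – a set of rules governing an exchange – can be made concrete with a deliberately toy sketch. The message types and transition rules below (HELLO, DATA, BYE) are entirely invented for illustration; no real protocol is being described:

```python
# A deliberately invented toy protocol: a session must open with HELLO,
# may carry any number of DATA messages, and must close with BYE.
# The "protocol" is nothing more than this table of permitted transitions.

VALID_TRANSITIONS = {
    None: {"HELLO"},           # a session must open with HELLO
    "HELLO": {"DATA", "BYE"},
    "DATA": {"DATA", "BYE"},
    "BYE": set(),              # nothing may follow BYE
}

def check_session(messages):
    """Return True if the message sequence obeys the protocol's rules."""
    state = None
    for msg in messages:
        kind = msg.split(" ", 1)[0]
        if kind not in VALID_TRANSITIONS.get(state, set()):
            return False       # the rule is violated; communication fails
        state = kind
    return state == "BYE"      # a valid session must be closed explicitly

print(check_session(["HELLO", "DATA x=1", "BYE"]))  # True
print(check_session(["DATA x=1", "BYE"]))           # False
```

The point of the sketch is only that a protocol is cheap to invent: a table of rules suffices. What cannot be invented so easily is the institutional backing that would make such rules a standard.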

What is important for Simondon is the question of compatibility present in social and technological systems. The progress of technical objects is based on different levels of compatibility, which he calls orders of granularity. Roughly, we can say there are two approaches: from the technical level and from the users’ level (Simondon calls these the minor and the major use of technology). Consider an ERP system (SAP, Oracle, or one of the ERP systems for small and medium companies): the protocols developed by engineering companies are more or less generalisations of the operations of logistics companies. Protocols work well for machines, since the negotiations are simple enough – for example, the handshake of the TCP protocol. Protocols are also limitations: not only those that regulate machines, but, through their realisation, they create other kinds of limitation on human users, both direct users and those who are governed by machines that employ these protocols.
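How simple such a machine negotiation is can be suggested with a sketch of the TCP three-way handshake (SYN, SYN-ACK, ACK). This is not a real network stack – the two “hosts” are plain Python objects exchanging labelled messages – but the sequence of states follows the actual TCP connection setup:

```python
# A toy simulation of the TCP three-way handshake.
# The state names (CLOSED, SYN_SENT, SYN_RECEIVED, ESTABLISHED) follow
# TCP's connection states; everything else is simplified for illustration.

class Host:
    def __init__(self, name):
        self.name = name
        self.state = "CLOSED"

def three_way_handshake(client, server):
    """Return the transcript of messages if the handshake succeeds."""
    transcript = []
    # Step 1: the client proposes a connection.
    client.state = "SYN_SENT"
    transcript.append(("client -> server", "SYN"))
    # Step 2: the server acknowledges and makes its own proposal.
    server.state = "SYN_RECEIVED"
    transcript.append(("server -> client", "SYN-ACK"))
    # Step 3: the client acknowledges; both sides now agree a channel exists.
    client.state = "ESTABLISHED"
    server.state = "ESTABLISHED"
    transcript.append(("client -> server", "ACK"))
    return transcript

client, server = Host("client"), Host("server")
messages = three_way_handshake(client, server)
print([m for _, m in messages])  # the agreed sequence of the protocol
```

The negotiation is trivial precisely because everything contingent has been excluded in advance; the interesting cases, as the next section argues, are the ones such an agreement cannot absorb.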

Contingencies as Norms

In his keynote speech entitled Logistical Worlds: Designing Standards of Control, Gameplay and the Production of Knowledge, which explores what he calls “supply-chain capitalism”, Ned Rossiter talked about the nightmare of contingency, naming a few instances: “labour strikes, software glitches, inventory blowouts and traffic gridlock”. Contingencies happen when protocols fail to deal with certain cases: for example, when a product is not delivered and the clerk cannot update the inventory on time, or when a product that exists in the inventory cannot be located. These contingencies could be predicted beforehand, but they cannot be explicitly expressed in the software, for quite a few reasons. Protocols should deal with contingencies, but they cannot deal with all of them, nor express them all explicitly, since a good operation should avoid as much contingency as possible. The anticipation of contingencies can lead to two responses: firstly, refusing these contingencies; secondly, dealing with some of the relatively more “common” ones.
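The first response – refusing the contingency – can be sketched in code. The inventory example below is hypothetical (the field names and statuses are invented here, not taken from any real ERP system): the software only knows the statuses it was designed for, so a contingent case such as “the product exists but cannot be located” has no representation and is simply rejected.

```python
# Hypothetical inventory sketch: the schema only admits the statuses
# that were foreseen at design time.

ALLOWED_STATUSES = {"in_stock", "delivered"}  # an assumed, minimal schema

def update_inventory(record, status):
    """Update a record's status, refusing anything the protocol cannot express."""
    if status not in ALLOWED_STATUSES:
        # The contingency is real, but the software has no way to say it;
        # in practice a human agent must absorb the gap.
        raise ValueError(f"status {status!r} is not representable")
    record["status"] = status
    return record

item = {"sku": "A-1", "status": "in_stock"}
update_inventory(item, "delivered")   # works: a foreseen case
try:
    update_inventory(item, "pending")  # contingent case: refused outright
except ValueError as err:
    print(err)
```

Extending the schema to admit a new status is conceptually a one-line change, but, as the next section notes, in an institutional setting that one parameter can take months to arrive.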

Contingencies are moments when a system comes into crisis, incompatibilities that were not prefigured. Contingencies are not totally undesirable; paraphrasing Quentin Meillassoux, we should also talk about the necessity of contingency, for otherwise there is no invention. Technical inventions in turn create incompatibilities on the social, political and economic levels: as the French historian Bertrand Gille has already shown, the technological system is always ahead of the human system, and the latter attempts to adapt by suppressing some of its possibilities. Social norms are also invented to compensate for such incompatibility, besides the constant pursuit from the technical side. Simondon writes: “in each epoch, the normative inventions operate a discovery of compatibility for the modes of existence that didn’t have sense or intersection in the precedent normative structure” (158). Simondon follows with an example: “when the ancients discovered that slaves were human beings and not goods or tools that talk, they gave a normative structure to the master-slave relation, modulating it with the model of a simpler and more primordial relation, such as that between the father and his children” (161).


In the logistical world, contingencies are technical norms. They appear all the time; contingency exists only for the protocols and algorithms implemented in machines, while all these accidents are not contingent at all for those who work with them. A simulation of truck routing looks beautiful, and we are easily impressed by the speed and aesthetics of acceleration. Yet road traffic, weather conditions, the mood of drivers and many other personal factors easily render these simulations useless spectacles. So social norms are invented to overcome this problem: human agents are used to compensate for the shortcomings of protocols and algorithms. Adding a single parameter to the software to indicate that a product is in a pending/missing status is going to take months; consequently, the clerks responsible for data entry will have to work another (unpaid) two to three hours to keep the stocks up to date. Port workers have to use the control room inside the crane as both toilet and kitchen, and reduce their lunch hour to 15 minutes. The beautiful human-machine interaction has to be questioned, and the analysis extended from the compatibility of interaction to the incompatibility between systems. The point here is not to propose rage against machines, but rather to understand how protocols are actually far more than communication agreements, and how protocols, standards and norms work together.


2 Andrew S. Tanenbaum, Computer Networks, 2002, p. 27.

