The codec moment: raise the standard high

Outside the technical literature there has been little study of the protocols and standards that run through software and hence through media ecologies, digital assemblages and information spaces. Even within software studies, the emphasis has been on software packages, whether business (Fuller 2003a), games (Galloway 2006, Wardrip-Fruin 2009), artworks (Fuller 2003b, Parikka 2010, Hertz & Parikka 2010) or anomalies such as spam or viruses (Parikka 2007, Parikka & Sampson 2009). In other words the codecs, standards and protocols on which those packages depend, or which artists and hackers use, have been under-analysed.

Galloway’s discussion (Galloway 2004) of the relationship between TCP/IP, DNS and Deleuze’s ‘control society’ (Deleuze 1992) was the first work to focus specifically on this ‘level’. I will argue that Protocol is undermined by a foundationalist and textualist approach, as evidenced in statements such as: “Protocol is a language that regulates flow, directs netspace, codes relationships, and connects life-forms” (p74). For now, though, the point is that it marks a turning point in software studies, albeit one that has not been picked up.

In 2006, Adrian Mackenzie sought to develop an “ontology of software” (p5) that included protocols and standards alongside languages and applications, as part of the wider, post-Manovich establishment of software studies/theory. Arguably it is this wider project that undermines a specific protocol studies/theory. For Mackenzie, protocols are an example of code, which is the object of his study and ontology; the specific nature and workings of the protocol-object are not fully addressed.

Within film studies, there has been some discussion of codecs and standards. In 2008, Sean Cubitt approached YouTube through the H.263 codec, which underpins the Flash video (.flv) format. Cubitt locates the codec in terms of its alliances with corporate interests (Adobe) and telecoms and non-governmental bodies (ITU, ISO, etc.) as a way of mapping its presence within global and neo-liberal relations and discourses of the public sphere. He goes as far as to say: “There is no internet without the standardisation of internet protocols” (p46). Cubitt picks up on Adrian Mackenzie’s short account of motion imaging codecs in Fuller’s Lexicon (Mackenzie 2008) where Mackenzie, taking a more constructivist line than Galloway, argues that “codecs structure contemporary media economies and cultures in important ways… [they] catalyze new relations between people, things, spaces, and times in events and forms” (p48). In particular he draws a connection between video codecs’ ‘transform compression’ and ‘motion estimation’, the techniques they use to compress but also to render motion, and a “relational ordering that articulates realities together that previously lay further apart” (p54).
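The ‘motion estimation’ Mackenzie refers to can be made concrete with a minimal sketch. The following is an illustrative assumption, not any real codec’s implementation: an exhaustive block-matching search that finds where a block of the current frame ‘came from’ in the previous frame, which is the basic operation behind codec motion vectors.

```python
# Minimal sketch of block-matching motion estimation (hypothetical;
# real codecs such as H.263 use far more elaborate, optimised searches).

def sad(prev, cur, bx, by, dx, dy, size):
    """Sum of absolute differences between the block of `cur` at (bx, by)
    and the block of `prev` displaced by (dx, dy)."""
    total = 0
    for y in range(size):
        for x in range(size):
            total += abs(cur[by + y][bx + x] - prev[by + y + dy][bx + x + dx])
    return total

def estimate_motion(prev, cur, bx, by, size=2, search=1):
    """Exhaustive search over a small window; returns the offset (dx, dy)
    into the previous frame that best matches the current block."""
    best, best_cost = (0, 0), sad(prev, cur, bx, by, 0, 0, size)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cost = sad(prev, cur, bx, by, dx, dy, size)
            if cost < best_cost:
                best_cost, best = cost, (dx, dy)
    return best

# Two tiny 'frames' in which a bright 2x2 patch moves one pixel right.
prev = [[0, 0, 0, 0, 0, 0],
        [0, 9, 9, 0, 0, 0],
        [0, 9, 9, 0, 0, 0],
        [0, 0, 0, 0, 0, 0]]
cur  = [[0, 0, 0, 0, 0, 0],
        [0, 0, 9, 9, 0, 0],
        [0, 0, 9, 9, 0, 0],
        [0, 0, 0, 0, 0, 0]]

print(estimate_motion(prev, cur, bx=2, by=1))  # → (-1, 0)
```

The offset (-1, 0) says the block matches the previous frame one pixel to the left, i.e. the content moved one pixel right; a codec would transmit that vector plus a small residual instead of the full block, which is how motion is both compressed and rendered.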

Both writers have continued to raise the standard. Cubitt has drawn connections between colour space standards and an emergent 3D scopic regime (Cubitt 2010), while Mackenzie discusses the experience of ‘wirelessness’ as articulated in and through the Wi-Fi standard (Mackenzie 2010).

So why the dearth of work at this level (sic)? Mackenzie is clearly correct to say that one problem with mapping digital assemblages and ecologies through these objects is that “they are monstrously complicated” (Mackenzie 2008: 48). Another issue might be that to approach the assemblage through software, and in turn through standards, raises the spectre of technological determinism, which despite renewed interest in McLuhan remains a dangerous label. Software studies has shown the importance of the code-assemblage relation; clearly that needs to be approached through all forms of code – packages, commands, programming structures and languages as well as standards and protocols (see other papers in Fuller 2008). I would argue that approaching these monstrously complicated objects as object-actants allows us to trace their workings without recourse to foundationalism, and to rework the idea of technological determinism as a dynamic process (in the way Mackenzie uses radical empiricism to map the flow of wireless experience). This is not an academic exercise: recent discussions of, and battles over, the political position and effects of Wikileaks need to take into account the HTML standard that allows the link, the MySQL database protocols that allow storage and retrieval and, of course, the TCP/IP and DNS protocols that allow the site(s) to be found or hidden. It is those standards that not only enable Wikileaks to function but also enable Amazon and PayPal to ‘de-visualise’ it, the US Government to trace it and activists to mirror it. Those alliances are en/infolded and en/incoded.
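The DNS dependence that the Wikileaks case makes visible can itself be shown at the level of the standard. The sketch below hand-assembles a DNS query message following the wire format defined in RFC 1035 (the hostname and query ID are illustrative): ‘finding’ a site is a protocol transaction, and striking the name from DNS hides the site even while the server stays online.

```python
# Sketch of a DNS query message per RFC 1035's wire format.
import struct

def dns_query(hostname, query_id=0x1234):
    # Header: ID, flags (0x0100 = recursion desired), 1 question,
    # 0 answers, 0 authority records, 0 additional records.
    header = struct.pack("!HHHHHH", query_id, 0x0100, 1, 0, 0, 0)
    # Question: each dot-separated label is length-prefixed, the name is
    # terminated by a zero byte, then QTYPE=1 (A record), QCLASS=1 (IN).
    qname = b"".join(bytes([len(p)]) + p.encode("ascii")
                     for p in hostname.split(".")) + b"\x00"
    return header + qname + struct.pack("!HH", 1, 1)

packet = dns_query("wikileaks.org")
print(len(packet))  # → 31 (12-byte header + encoded name + type/class)
```

Thirty-one bytes sent to a resolver is all that stands between a name and an address; it is this small, standardised exchange that allows a registrar, or a government leaning on one, to make a site unfindable without touching the site itself.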

Cubitt, S., 2008, Codecs and Capability, in Lovink & Niederer (eds), Video Vortex Reader: Responses to YouTube, Institute of Network Cultures, Amsterdam.

Cubitt, S., 2010, Making Space, Senses of Cinema (57).

Deleuze, G., 1992, Postscript on the Societies of Control, October, 59, pp. 3-7.

Fuller, M., 2003a, It Looks Like You’re Writing a Letter: Microsoft Word, in Behind The Blip: Essays On The Culture Of Software, Autonomedia, Brooklyn, NY, USA, pp. 137-65.

Fuller, M., 2003b, A Means of Mutation: Notes on I/O/D 4: The Web Stalker, in Behind The Blip: Essays On The Culture Of Software, Autonomedia, Brooklyn, NY, USA, pp. 51-68.

Fuller, M., 2008, Software Studies: A Lexicon, MIT Press, Cambridge, Mass.

Galloway, A.R., 2004, Protocol: How Control Exists after Decentralization, MIT Press, Cambridge, Mass.; London.

Galloway, A.R., 2006, Gaming: Essays On Algorithmic Culture, University of Minnesota Press, Minneapolis.

Hertz, G. & Parikka, J., 2010, Zombie Media: Circuit Bending Media Archaeology Into An Art Method, Vilém Flusser Theory Award 2010.

Mackenzie, A., 2006, Cutting Code: Software And Sociality, Peter Lang, New York.

Mackenzie, A., 2008, Codecs, in Fuller (ed), Software Studies: A Lexicon, MIT Press, Cambridge, Mass., pp. 48-54.

Mackenzie, A., 2010, Wirelessness: Radical Empiricism In Network Cultures, MIT Press, Cambridge, Mass.

Parikka, J., 2007, Digital Contagions: A Media Archaeology Of Computer Viruses, Peter Lang, New York.

Parikka, J., 2010, Ethologies of Software Art: What Can a Digital Body of Code Do? in Zepke & O’Sullivan (eds), Deleuze and Contemporary Art, Edinburgh University Press, Edinburgh, pp. 116-32.

Parikka, J. & Sampson, T.D., 2009, The Spam Book: On Viruses, Porn, and Other Anomalies From the Dark Side of Digital Culture, Hampton Press, Cresskill, N.J.

Wardrip-Fruin, N., 2009, Expressive Processing: Digital Fictions, Computer Games, And Software Studies, MIT Press, Cambridge, Mass.