SUCKA my code, baby: Peer-to-Peer's production of sprawling unkempt cultural knowledge archives

Introduction

Within the internet's relatively short lifespan, specific software applications, technical protocols, and hardware advances have fed its evolution as a communicative, social environment. Each advance contains within itself the distinguishing features of informational capitalism (in the sense that it is a new productive and social paradigm, much as industrial capitalism was before it), and also reproduces and expands informational capitalism. Such expansion, however, is significantly differentiated along spatial and political lines; the topography of globalisation is irregular, bumpy not smooth. Within this, technology is neither ideologically neutral nor somehow external to the societies from which it emanates; rather it is deeply embedded within the social, forming a “techno-social” field. As the entire world becomes increasingly globalised, that is, comprised of sets of deeply integrated and highly networked interdependencies, especially those of an economic and legal nature (leaving aside the environmental for now), currents of social change also move globally (as they always have) and speedily, riding on the waves of the techno-social. Surf's up, and there's no going back!

Since the beginning of internet time so-called “killer applications” built upon commonly-agreed communications protocols have opened up creative and productive possibilities for users.1 Email in general was perhaps the first killer app, at least for a relatively homogeneous academic and scientific community who were already familiar with the use of (initially mainframe time-share) computing systems. The emergence of text-based threaded discussion lists on a myriad of topics, as exemplified by Usenet, played a significant role in the geo-spatial and cultural diversification of the internet. Some Silicon Valley research centres such as Xerox PARC invested in pure research and development projects focussed on how people might interact and create together online, developing new Object-Oriented programming languages which supported real-time role-playing within stripped-down Command Line Interface (CLI) environments such as LambdaMOO. Later, the Mosaic and Netscape web browsers built upon HTTP (Hypertext Transfer Protocol), a networking protocol enabling people to make and share distributed, collaborative hypermedia via an easy-to-use graphical interface—the World Wide Web (WWW). It was not long before the advertising industry (led by Manhattan's DoubleClick), developers of secure forms of online electronic payment, and entrepreneurial companies with goods and services to sell entered the picture, partially transforming the “information super highway” (which was always a reductive representation of net use) into a vast electronic mall.

Lest people drift away from an increasingly commercialised and regulated internet environment, software facilitating the production of web-based personal journals (blogs) heralded the rebirth of the 1990s home page. This time it was a younger and typographically cooler demographic leading the way in net narcissism, building a perfect market for the next techno-social wave, social networking, as exemplified by platforms such as MySpace and particularly Facebook. Social networking sites took the genius of Tim Berners-Lee's hyperlink (invented to expand possibilities for meaningful communication) and transformed it into excess, an excess of ego-driven interconnections and personal data for capture by a global marketplace in which such data is a primary resource.

The phenomenon of peer-to-peer file-sharing over the internet, or P2P, first emerged in the mid-1990s, around the middle of this geological period. In this chapter I argue that an examination of some salient features of P2P, including historical, technological and social aspects, reveals it to be emblematic of how global networks function within a globalised info-capitalism, both as regimes of ordering and disordering. Both the ordering and disordering tendencies within the techno-social fields with which P2P engages are productive of the new, technological and social alike. The mass uptake of P2P despite legal and other barriers, and the social relations which file-sharing encourages, particularly the practice of sharing amongst strangers within loosely-constituted, unstable networks, might signify a new form of social reproduction directly born from the network form. Is the network an example of semi-organic parthenogenesis? Using documentary material in the public realm including media reports, academic research, file-sharing fora, and leaked intergovernmental treaty documents, I ground this theoretical proposition within a multi-perspectival historical framework. I focus primarily upon interconnected transglobal networks across three fields of practice: the field of file-sharing, which seeks complete network freedom (programmers and users); the field of corporate and government elites seeking network monitoring and regulation; and the field of academic, legal, and media actors circulating and analysing developments in the first two fields.

The Internet's Field of Dreams

From the outset, the internet was designed as a decentralised and undisciplined mechanism for communications and information exchange, a “network of networks, or an internetwork” (Terranova 2004: 41, emphasis in original). Its evolution has relied on “access to common code” and ICT resources, and the ability to engage with others in “unrestricted networks” (Hardt & Negri 2009: x). North American Cold War concerns had propelled early research on a distributed computer network, yet ARPANET, the internet's predecessor, was developed within, and culturally influenced by, the sphere of academia, and hence energised by the “political dreams” of 1960s “American counter-culture” (Pasquinelli 2010: 288). These visions varied according to the dreamers, from Black Power activists to Flower Power antiwar hippies, but a thread of autonomy and rejection of capital's status quo intertwined these dreams.

Launched in 1969, ARPANET was a promiscuous network which enabled different types of computers and operating systems to communicate with each other via common open technical standards and protocols. Computer scientists/hackers addressed the challenge of materially and culturally different hardware systems by developing common codes via which they could communicate, a kind of Esperanto for machines which could handle the polyglottal chaos, making love not war. In January 1983 the contemporary internet came into being, heterogeneous small computers communicating using the TCP/IP protocol (Gillies & Cailliau 2000). The civilian part of this network, ARPA Internet, flourished, and had produced three million hosts (nodes) by the time it was decommissioned in early 1990. By 1996 the new environment, now simply known as the internet, supported some thirteen million hosts. Similarly, the World Wide Web (WWW), the hypertextual, graphical platform which popularised and globalised internet usage, was developed by Tim Berners-Lee at the Conseil Européen pour la Recherche Nucléaire (CERN) in the early 1990s, motivated by the problem of how researchers could use the internet to easily share knowledge encapsulated in different formats (Gillies & Cailliau 2000). As both the internet and the WWW had been produced by inhabitants of the “academic gift economy” it was assumed that research outcomes would be shared for the social good (Barbrook 2005, no page numbers). This attitude has been embedded within the technological and social structures of the globally networked electronic domain since its inception.

Although the techno-social assemblages that constitute file-sharing have arisen not from comparable collective practices within the academic heartlands but rather from the singular obsessions of (mainly) young educated men, a similar techno-libertarian cultural imperative that “information just wants to be free” has shaped these projects (endnote 1 – source). To follow this imperative is to embrace disorder at a fundamental level, to harness chaos as a means of producing energies, desires, labour, and material across loosely-arranged, globally-emergent networks. It was not always this way: the earliest projects, exemplified by Napster, suffered and eventually failed because of their reliance on centralised models of organisation and exchange. Order was thus Thanatos, and once it imploded due to the inherent vulnerabilities order generates, the unquenchable Eros could really let loose.

File-sharing in general, with its new forms of gifted labour and autonomous circuits of exchange, threatens the established order of information capitalism. Despite the gloss of the digital new, info-capitalism continues to be partially based on hierarchical models of production, circulation, control and power, as even a brief evolutionary sketch reveals.

From Pyramids To Networks and Back Again: Taylorism, Fordism, post-Fordism and beyond

Driven by industrial engineer Frederick Taylor, the Industrial Revolution's early 20th century iteration had seeded a “scientific model” of production to enhance business efficiency and profitability (Marazzi 2008: 50). Taylor's monograph, The Principles of Scientific Management, advised dividing all tasks into precise steps which could be monitored; his recommendations, widely adopted, created a paradigmatic shift in the organisation of labour. The consequence of Taylorism's division of labour and task fragmentation was to further deskill and alienate workers. Fordism built upon these foundations with its centralised factory model spatially concentrating workers clocking on to perform coordinated, repetitive tasks on semi-mechanised assembly lines, space, time, and actions all externally regimented. The mass commodities produced were subsequently bought by the workers themselves, whose accord-guaranteed, relatively high wages had until the 1970s generated a hitherto unknown consumer class, with appetites fed by the burgeoning advertising field. Together Taylorism's micro-control of the “immediate productive processes” and Fordism's “regulation of the social cycle of reproduction” transformed capitalist production, complemented by the Keynesian welfare state's macroeconomic societal control mechanisms (Hardt & Negri 2000: 267).

When a cluster of geopolitical and social forces triggered a global but spatially-differentiated economic collapse in the 1970s, another socio-technological transformation occurred, similarly producing a major new paradigm of social organisation. The shift from industrialisation to post-industrialisation, or “informatisation,” of the services sector flowed through to secondary production (mirroring agriculture's earlier transition to semi-industrialisation), with “informationalised industrial processes” the apex of contemporary manufacturing (Hardt 1999: 90, 93). Japanese automobile maker Toyota forged the path of informatised industrial production by incorporating informatics advances into the heart of manufacturing processes. Toyotaism's Just-In-Time method pulled market information into the factory, enabling the rapid retooling of assembly lines; team work was integral to the production process, with factories divided into semi-autonomous groups of workers cooperating to produce whole products. Perhaps for the first time since industrialisation communication entered “directly into the productive process,” with the chain of production becoming a “linguistic chain” (Marazzi 2008: 49).

Here lies the advent of post-Fordism, the paradigmatic form of contemporary production, with its focus on communication and the increasing commodification and informatisation of all aspects of life, including the affective and the social. The network is post-Fordism's characteristic organisational form, replacing the Taylorist/Fordist pyramid of control. Labour is increasingly autonomous (that is, self-employed and/or self-organising), and “communicative-relational” (Marazzi 2008: 49). The governing social power over spatially and culturally disaggregated labour has partially dematerialised, becoming translucent. We are now conditioned to be our own oppressors. Likewise responsibility and accountability, perhaps most dramatically illustrated by the pass-the-parcel buck-passing by key protagonists in the Global Financial Crisis (2008 and arguably ongoing), do a good impersonation of The Invisible Man, minus the bandages and sunglasses. The post-Fordist paradigm replicates the earlier dominance of Taylorism, Fordism, and Toyotaism, and rather than superseding these models it incorporates their most useful features. For example, in the Taylorised call centre, workers' communication adheres to rigid scripts and is monitored for 'quality and training purposes'.

Unlike previous historical tendencies of organisation, post-Fordism extends beyond traditional sites of production (school, field, factory, office) into the whole of life. Informational capitalism “fuses” work and worker, putting to work their “entire lives” (Marazzi 2008: 50). Hence post-Fordist labour is sometimes depicted as “biopolitical labour,” drawing upon life (bios) to create “not only material goods,” but also “relationships,” “social life,” and “subjectivity itself” (Hardt & Negri 2004: 109, x). Denizens of advanced economies must be perpetually clocked on, logged in, GPS-able. The emplaced body can be of less value to capital than the free-ranging mind, as innovative ideas can be the greatest profit returners in this era. The Facebook story is a case in point (ref). For the mental-as-anything masses, work time increases to the point of being “work without end,” and wages continue to free-fall in enforced worship of 'productivity' (Cohen, cited in Marazzi 2008: 41). Any resistance to the totalising logic of biopolitical production must manifest less in those spatialised and temporalised locations formally associated with work, and more across time and space in the interstices of daily existence. As commodification's creep monitors and encloses public space and private time, the need to reinvent ourselves and our forms of social organisation intensifies. Here then is a key to the enormous popularity of file-sharing.

Informational capitalism has delivered increasingly cheap information communication technologies (ICTs) along with yawning lines of credit enabling even low-income earners to purchase super-size-me plasmas and tiny pods and pads. Similarly, various forms of subscription satellite, cable, and streaming services enable access to the latest television series and films. However, such content is typically deformed by both technical restrictions on its reproducibility and embedded advertising, and often limited by the geographical location of the subscriber. For all its global conglomeration, Big Content is hardly cosmopolitan, but rather as nationalistic as modernism's war babies. The business model is industrial, one-to-one plus one: one vendor, one buyer, one product. All seemingly orderly, if old-fashioned. Yet while a substantial market exists for the cultural material doled out from above by aggregates of producers, broadcasters, and ICT infrastructure corporations, so too does a culturally-differentiated and geographically-dispersed multitude of people committed to new parasitic systems of exchange within the belly of the beast.

What this multitude craves is One Love! And freely-formed networks of desire.

[add paragraph of new paradigm - post post fordism .. networks of pyramical structured nodes... the return of hierarchy... but embedded in a field of networks...]

Ordering Attempts in the Infosphere

The one quality which especially distinguishes digital goods from any other kind of non-organic goods is that they are infinitely reproducible at a negligible cost, and with little or no degradation in technical quality (depending on the method of reproduction used). When the popularity of file-sharing had become so established that Big Content was forced to take action to defend its profitability, it took a tripartite approach. Firstly, it fought battles on legal fronts spanning various national and intra-national jurisdictions. Secondly, it implemented technological barriers to the material reproduction of digitised artefacts, ranging from differentiated forms of encoding DVDs according to region of distribution, to making content playable on only specified hardware formats. And thirdly, it developed commercial products such as subscription-based streaming television in belated response to a largely untapped market. All three modes of attempted ordering have been less than successful, as we shall next see. And in part this failure is attributable to Big Content's fundamental lack of understanding of the transglobal, deeply networked, socially heterogeneous culture they were attempting to constrain and discipline.
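
This defining quality is easy to demonstrate concretely. The minimal Python sketch below (the filenames are hypothetical stand-ins) copies a digital artefact and confirms, via matching cryptographic hashes, that the duplicate is byte-for-byte identical to the original; unlike analogue media, no generation loss is possible.

```python
import hashlib
import shutil

def sha256_of(path: str) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()

# Duplication is near-costless and lossless: the copy is byte-for-byte
# identical to the original, as the matching hashes confirm.
# ("album.flac" is an illustrative stand-in path.)
shutil.copyfile("album.flac", "album_copy.flac")
assert sha256_of("album.flac") == sha256_of("album_copy.flac")
```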

Despite a huge investment of time and resources, the legal battles of the early noughties failed to curb the global expansion of file-sharing. Unsurprisingly, the primary staging ground for these contestations has been the United States, a 'naturally' litigious society, and the home of some very popular downloads including software packages, blockbuster movies, television series, and music albums. These fights have been framed as being about theft and piracy, using a tradition of Intellectual Property Rights (IPR) and copyright law which extends back to the medieval period (ref). While in some cases various coalitions of content owners have been awarded substantial damages, these legal wins have created unexpected consumer backlashes against the film and music industries in particular, especially when popular opinion deems judgements to be harsh and excessive. When Goliath attacks David and wins, the networks of swarming downloaders announce vengeance on websites such as torrentfreak.com, swearing to increase their file-sharing activities on principle.

In 2003 the Recording Industry Association of America (RIAA) led the way in the attempt to re-establish order in the electronic domain via legal means (Bridy 2011: 28-40). As the 1998 Digital Millennium Copyright Act (DMCA) had not foreseen the highly distributed nature of P2P file-sharing, it had not granted copyright owners the automatic right to force ISPs (who were increasingly providers of routing and transmission rather than web storage space) to disclose customer details. Consequently, the RIAA conducted a “coordinated legal campaign” described by Bridy (34-35) as a “class action in reverse, with the aggregation occurring on the defendants' side,” filing 30,000 so-called John Doe law suits over a five-year period. One of the highest profile cases was Capitol Records, Inc. v. Thomas, the first file-sharing copyright infringement lawsuit to be tried before a United States jury. Jammie Thomas-Rasset, a young Native American mother of four, was fined USD1.92 million for downloading 24 songs, a punishment many deemed to be disproportionate to the 'crime' (Yu 2010: 1389). Upon appeal this fine was heavily reduced to USD54,000, but a third trial in November 2010 to determine damages resulted in a jury awarding the record companies USD1.5 million (USD62,500 per song).

Eventually the public relations fallout from this and other cases caused the RIAA to abandon the mass litigation approach, and in December 2008 the RIAA publicly announced a new emphasis on fostering “greater cooperation” with ISPs (Yu 2010). This marks the shift to a new regime of info-ordering. Nevertheless, since 2009 new alliances of rights owners (including film producers and commercial pornography makers) have been filing new waves of John Doe litigation suits in the United States. This “profit center” mechanism is primarily aimed at gaining out-of-court payments from alleged offenders (Bridy 2011: 36-37).

Big Content has equated the collective practice of file-sharing with the individual crime of property theft, an economic reductionism which ignores the phenomenon's socio-political and cultural dimensions. This blinkered view partially explains the inability of punitive legal measures to curtail the practice or shift attitudes about its morality. While it is undisputed that many people participate in file-sharing because it is a means of acquiring cultural materials at no direct monetary cost to themselves, even a rudimentary examination of commentary on tracker-site forums, dedicated file-sharing news websites such as TorrentFreak, and respected digital news portals such as Ars Technica, reveals that the reasons why people download are more varied and complex than simply wanting to 'steal' something. Many people report that file-sharing sites are the only places which offer the specific materials they seek. Others express revulsion at how the culture industries are driven only by profit, at the expense of many creative artists and consumers, both of whom are “ripped off” continually by the corporations. This position is frequently accompanied by a stated willingness to pay (music) artists directly for their works, bypassing the “greedy middleman.” Yet others are incensed by the “bullying” tactics of the Motion Picture Association of America (MPAA) and RIAA with regards to specific lawsuits, and position their own file-sharing as an act of solidarity with defendants. (It must be acknowledged that on the whole the level of dialogue within some fora is not particularly high, especially on TorrentFreak, leaving the impression that all file-sharers are angry spoilt young men. On Ars Technica the debate is qualitatively different.)

Legal measures against individual file-sharers were not confined to the handful of high profile lawsuits which garnered so much mainstream and specialist media attention. The more common strategy has been for law firms representing major content rights holders to issue threatening Cease and Desist letters (sometimes with demands for payment) to individuals. By various means, legal and otherwise, the downloading and uploading activities of many can be tracked, and, with the assistance of Internet Service Providers (ISPs), customer contact details acquired. [more on this later]. The main response from file-sharers has played out on the social field: ignoring the letters and advising others to do the same, changing ISPs in favour of businesses that protect customer privacy, and pro-actively challenging the legality of the threats (ref that dodgy UK company - torrentfreak).

While legal measures targeted individuals and were resisted by linguistic and legal countermeasures, concurrent technological measures aimed at curtailing the phenomenon targeted all ICT users and film and music lovers. While the mass of people affected by them might not have been overly happy with the constraints, it was left to individual hackers to rebuff these measures by developing technological solutions, knowledge of which was subsequently rapidly seeded through communication networks and material objects. Thus file-sharers shifted the technological battleground to the techno-social field, revealing a more nuanced understanding of the dynamics of contemporary digital culture than that held by the corporate interests.

DeCSS: can software code be a transglobal expression of free speech?

The contestation between content owners and hackers sparked by the introduction of the Content Scrambling System (CSS) exemplifies the tension between ordering and disordering forces in the electronic realm. American movie studios had insisted that the CSS copy protection mechanism be included within the software engineering requirements for the nascent DVD-Video standard in 1996. The studios' aim was to combat the unauthorised copying of films. This technological attempt to control social behaviour was thwarted by various autonomous hacker groups working on technological countermeasures. In 1999 a Norwegian hacker group fronted by 16-year-old Jon Johansen released source code for a DeCSS utility, which enabled people to decrypt and digitise DVD movie discs played in the new DVD computer drives, saving the files to hard drives (Halavais 2003: 123). Because the DeCSS source code contained the algorithm for CSS, numerous other programmers subsequently used it to write similar programs for “ripping” DVDs. Although such ripping stripped away a DVD's unique features, such as the interactive interface allowing access to film 'chapters', and produced cumbersome 4-gigabyte digital files (which could, however, be converted to 700-megabyte DivX format files), people could now easily exchange cultural media which had been available only in the DVD format (Patrizio 1999, no page number).

This digital circumvention project from the hacking underground generated a series of legal challenges in the United States centring on the relationship between software algorithms and free speech, testing the bounds of the 1998 Digital Millennium Copyright Act (DMCA), which prohibited people from reverse-engineering hardware and software for illicit purposes. The cases drew media attention to underlying social and political issues surfacing as a result of technological change. While the details of the dozens of DeCSS-related trials are beyond the scope of this chapter, it is worth noting that variations of the DeCSS code were not only distributed via numerous internet channels including list-servs, mirror sites, and electronic greeting cards, but also printed on material objects such as t-shirts and ties, translated into audio files and animations, and published by entities such as the geek website Slashdot and the magazine 2600: The Hacker Quarterly (Touretzky 2000, no page number; Halavais 2003: 124). In a landmark ruling (later appealed) Judge Lewis Kaplan ruled in favour of the representatives of Big Content, a judgement which included the provocative assertion that hyperlinks (the very foundation of the World Wide Web and the consequent popularisation of the internet) were a “form of 'trafficking' in illegal goods, and therefore illegal under the DMCA” (Halavais 2003: 124).

Big Content in the United States, via its Content Scrambling System, had sought to impose a technological order upon material objects, DVDs, to prevent the release of their core contents into the unbounded and generally uncontrolled realm of the internet. Various collective efforts from transglobal manifestations of a hacker underground resisted these constraints by finding unprotected back doors into the CSS code and writing algorithms which decrypted the software locks. Unable to respond technologically, a coalition of content owners took the fight to the legal system, targeting those who had circulated specific instances of DeCSS, including even the makers of the DeCSS t-shirts. The DeCSS example illustrates how the social and the technological are deeply interwoven, and how ordering attempts in one or more fields (the technological, the legal) can generate disordering responses that span those same and additional fields (the cultural, the political).

Moreover, the implementation of CSS occurred within an old industrial model of both technical innovation and social organisation: a small cadre of salaried programmers developed a technological fix to a social problem anticipated by their employers, and the resultant product was marketed without regard to consumers' expressed desire to be able to copy DVDs. This imposition of a form of technological enclosure disregarded the unquestionably legitimate activity of backing up forms of media (audio CDs or film DVDs) which are unstable and liable to be unusable if scratched or otherwise physically damaged. Duplicating one's own media also makes it available to be played via other digital devices, a point which, although not pressing in the 1990s, has become absolutely critical in the contemporary electronic landscape of multiple digital devices.

In contrast, the development and release of the numerous iterations of DeCSS exemplifies a postindustrial, network form of innovation and organisation. Mirroring one of the precepts of Toyotaism, attending to marketplace desires (which conveniently dovetailed with software culture's ultra-libertarian ideology), programmers identified a techno-social problem: the curtailing of cultural freedom via technology. Consequently, self-managed, loosely-organised, and sometimes spatially-dispersed groups of independent programmers concurrently worked to unscramble CSS. The first successful solution was disseminated via transglobal electronic networks and geospatially-emplaced material artefacts, ensuring its rapid adoption by a computer-savvy multitude, and its reversioning and redistribution by other hackers. Moreover, the escalating cycles of conflict set in motion by DeCSS drew in other networks of solidarity and support, from across both computing subcultures and the legal-political fields.

The disorder DeCSS created provided a foundation for new transdisciplinary knowledge production, from software code to legal arguments. CSS had spawned a bastard child, DeCSS, and in turn DeCSS contributed to the expansion of networks of productive resistance, and the loose communities which coalesced within these networks. Hackers and file-sharers had expanded the technological battleground to the techno-social field, revealing a more nuanced understanding of the dynamics of contemporary digital culture than that held by the corporate interests, which were primarily concerned about limiting circulation of the new, believing that unauthorised copying would negatively affect their financial bottom line.

Over time empirical research on the buying habits of file-sharers has revealed that the dogma that 'piracy' diminishes corporate profits is not supported by the facts. UK research.... other research.....

Over a decade on, it is instructive to reflect on the passionate fight for an unlocked DVD format in the 1990s compared to the more recent explosion of proprietary hardware formats, and the public's acceptance of the barriers they impose on free circulation of content. The Apple Corporation has led the vanguard, with its iPhones, iPods, and iPads, and although it is not impossible to hack these devices so that they can more readily accept and exchange materials, the constraints have been embedded into the systems, requiring labour to disembed them. The Windows world is not much better, with special formatting and partitioning required to make portable hard drives able to receive files such as music and films from Apple computers.

What can we conclude from the mass embrace of proprietary formats? Has the sexiness of the design and marketing of the new miniaturised platforms seduced their owners into accepting the electronic chastity belts which accompany them? Does it make us more inclined to limit those important embodied, socialised forms of content exchange (such as taking your hard drive to a friend's place and swapping files after dinner) to situations where we all own the same genus of machines, in a form of self-imposed digital apartheid? Have we become less sensitised in general to the growing complexity of the machines which we interact with on a regular basis—cars, washing machines, sound systems—and the concomitant inability to take a peek under the hood? While the free software movement has significantly contributed to the collective construction of code from the ground up, and more localised free wireless initiatives have done the same for pirate or community radio and television broadcasts, in general the field of consumer electronics fosters a passive acquiescence to machine limitations. Leaving these speculations to one side for now, let us return to the evolution of file-sharing software and the creation of new informal networks and loose communities.

From Napster's centralised one-to-one to BitTorrent's deeply distributed peer-to-peer

Napster was a proto-P2P service developed by university student Shawn Fanning and launched in June 1999. Like the later massively popular social networking environment Facebook, Napster was Boston born and bred, coming from and responding to the desires of a privileged class of digital natives who had grown up after early post-Fordism and before the dotcom bust, more West Wing than The Wire. Building upon earlier, clunkier methods of distributed file-sharing (including the text-based communications platforms Usenet and Internet Relay Chat, and the later client/server platform Hotline), Napster offered people a direct way to share music files which had been digitised in the then-new highly compressed MP3 audio file format. Napster used its own servers to maintain a central registry which displayed computers logged on to the system, and the shareable music files they contained. Users could then use Napster's web interface to directly connect with these computers to retrieve files. Napster lasted only two years, until a series of legal challenges from a coalition of recording companies and individual bands (most infamously the heavy metal 'rebels' Metallica) eventually forced it to declare bankruptcy. By this time the platform had generated a groundswell of file-sharing activity amongst an estimated 25 million users, many of them riding on the bandwidth of North American universities. Moreover, it had created a collective unquenchable thirst for discovering and sharing the artefacts of popular culture online.
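
Napster's architecture can be sketched in a few lines. The Python below is a toy model of the idea rather than Napster's actual wire protocol (which was proprietary): a central index records which logged-on peers hold which files, while the files themselves travel directly between peers. The class name and peer address are illustrative inventions.

```python
from collections import defaultdict

class CentralRegistry:
    """Toy model of a Napster-style central index: it records which
    logged-on peers hold which files, but never stores the files."""

    def __init__(self):
        self.index = defaultdict(set)  # filename -> set of peer addresses

    def login(self, peer: str, shared_files: list[str]) -> None:
        for name in shared_files:
            self.index[name].add(peer)

    def logout(self, peer: str) -> None:
        for peers in self.index.values():
            peers.discard(peer)

    def search(self, name: str) -> set[str]:
        # The registry only brokers introductions; the actual transfer
        # then happens directly between the two peers.
        return self.index.get(name, set())

registry = CentralRegistry()
registry.login("203.0.113.5:6699", ["song.mp3"])
print(registry.search("song.mp3"))  # {'203.0.113.5:6699'}
```

The design's convenience was also its legal and technical Achilles heel: shut down the single registry and the whole system goes dark, a vulnerability the next generation of protocols was built to avoid.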

Paradoxically, one of the main seedbeds of informational capitalism in terms of cognitive labourers and technological innovation, the tertiary education sector, was simultaneously nourishing a phenomenon which would challenge capitalism's sacred cows—such as return on investment from the production of scarcity and secrecy. Innovation was streaming through the networks from below, and although in Napster's case there were clear commercial goals driving the project's founding business team, the desires and appetites for cultural materials being fed by the software system signalled something which probably had never been imagined or codified in any business plan. The mixed (cassette) tape, that 'old' music compilation format of the 1970s and 1980s which had provided the soundtrack to countless friendships, love affairs, protests, and rites of passage, was reborn as individualised downloaded playlists subsequently burnt onto now-affordable Compact Discs and circulated in the spatialised world. More significant than this technological shift from one-off 'hand-crafted' analogue tapes to infinitely reproducible digital media, however, was the emergence of a new social paradigm, that of sharing amongst strangers.

The development of the BitTorrent communications protocol was fundamental to the evolution of this paradigm during the post-Napster era. BitTorrent, written by software programmer Bram Cohen, was released in July 2001 via an announcement on a Yahoo internet group. In computing, a protocol refers to formats and rules for how machines (hardware/software assemblages) communicate and exchange data with one another. Whereas Napster had relied on both a centralised indexed file registry and one-to-one exchanges between pairs of peers who needed to consciously seek each other via the registry, the BitTorrent protocol enabled the creation of file-sharing “swarms.” A person would use one of various BitTorrent “clients” (software applications) to “announce” that they were “seeding” (distributing) a particular file from their computer. (Endnote? Each net-connected device has a unique net address or Internet Protocol address (IP address)). The protocol enabled BitTorrent software clients to break a file into small chunks, which others could download, chunk by chunk. As these downloaders incrementally acquired the file chunks, they would automatically “seed” them, thus taking the bandwidth pressure off the original “seeder” and making the process of file transference completely distributed over the internet. Thus the BitTorrent protocol harnessed the network power of the internet and its users, capitalising on the disorderly, unpredictable nature of technological and social networks to produce a system whose stability rested upon a foundation of complete chaos.
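
The chunking just described can be made concrete with a short sketch. In the BitTorrent (version 1) metainfo format a file is divided into fixed-size pieces and each piece is hashed with SHA-1, so that any chunk received from an anonymous stranger can be verified independently. A minimal sketch, with the piece size and file path as illustrative assumptions:

```python
import hashlib

PIECE_LENGTH = 256 * 1024  # 256 KiB, a commonly used piece size

def piece_hashes(path: str, piece_length: int = PIECE_LENGTH) -> list[bytes]:
    """Split a file into fixed-size pieces and SHA-1 hash each piece,
    as BitTorrent (v1) metainfo does. Any chunk received from an
    anonymous peer can then be checked against the published hash."""
    hashes = []
    with open(path, "rb") as f:
        while True:
            piece = f.read(piece_length)
            if not piece:  # end of file reached
                break
            hashes.append(hashlib.sha1(piece).digest())
    return hashes
```

It is this per-piece verifiability that makes trusting a swarm of strangers technically rational: trust attaches to the hashes, not to the peers.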

Everyone in a torrent swarm becomes a peer, from the original uploader/seeder to the subsequent downloaders who are concurrently also seeders. The process is messy and unpredictable, embodying a convergence of technological and social issues. People (including the initial file seeder) drop in and out of swarms; bandwidth can become choked; incomplete or corrupted versions of files compromise the integrity of downloads; upload and download speeds diverge wildly amongst swarm members; and legal threats menace Internet Service Providers (ISPs) and downloaders alike. However, the BitTorrent protocol was expressly designed to handle such variables, and because each file can be transmitted as non-sequential chunks by any and all logged-on nodes within a swarm, the inherently random, uncontrollable nature of the technological aspect of the process paradoxically assists the eventually ordered delivery of a requested file to someone's internet-connected device.
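
How such disorder converges on an ordered file can also be sketched: verified pieces arrive in any order, from any peer, and are slotted into their fixed positions; only when every index is filled can the file be reassembled. This is a conceptual model only (no real client is this simple), reusing the piece hashes from the previous sketch:

```python
import hashlib

class PieceAssembler:
    """Conceptual model of a client absorbing a swarm's disorder:
    pieces arrive in any order, from any peer, and are verified
    against their published SHA-1 hashes before being kept."""

    def __init__(self, expected_hashes: list[bytes]):
        self.expected = expected_hashes
        self.pieces: dict[int, bytes] = {}  # piece index -> verified data

    def receive(self, index: int, data: bytes) -> bool:
        if hashlib.sha1(data).digest() != self.expected[index]:
            return False            # corrupt or spoofed chunk: discard it
        self.pieces[index] = data   # duplicate arrivals overwrite harmlessly
        return True

    def complete(self) -> bool:
        return len(self.pieces) == len(self.expected)

    def assemble(self) -> bytes:
        # Only once every index is filled does the ordered file emerge
        # from the unordered stream of arrivals.
        return b"".join(self.pieces[i] for i in range(len(self.expected)))
```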

Because BitTorrent could handle large files far more easily than earlier platforms such as Napster, it became possible to share more than relatively small music files. Hence the diverse selection of cultural artefacts in underground circulation came to include movies, computer games, television episodes and indeed whole series, and software applications. As a consequence of this technological expansion the demographic composition of file-sharers became more heterogeneous, and the young, white, male college-educated cadre were joined by increasing numbers of people who did not fit that narrow identikit. This shift mirrored how the internet itself was becoming more diverse in terms of its users' ages, ethnicities, genders, spatial localities, and socio-economic class (ref). In turn, the popularisation of file-sharing further diversified the make-up of the internet's “netizens.”

While BitTorrent software clients enabled the technical exchange of digital files, in the deeply decentralised post-Napster paradigm something else was needed in order for people to search for and find the cultural materials they desired from a swarm of strangers. From this need arose a host of “tracker” websites which facilitated file-sharing by storing meta-data indices of downloadable material scattered across the internet. It is important to note that such tracker sites were not libraries hosting the digitised materials on their own servers; they were more like phone books, offering small “torrent” files which pointed to the materials' locations on peer computers. Tracker sites performed an ordering function; they indicated where specific artefacts could be located from an untold number of digital materials available to be downloaded from seeds scattered across the physical world and electronic nets.
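
The 'phone book' entries themselves are tiny. A torrent file is, in essence, a bencoded dictionary containing a tracker URL and just enough metadata to locate and verify the actual content held by peers. It is sketched below as a Python dictionary whose field names follow the BitTorrent v1 metainfo convention; the values are illustrative.

```python
# The shape of a BitTorrent (v1) metainfo ("torrent") file, shown as
# a Python dict prior to bencoding. Note that no content appears here:
# only a tracker URL plus enough metadata to locate and verify pieces.
metainfo = {
    "announce": "http://tracker.example.org:6969/announce",  # hypothetical tracker URL
    "info": {
        "name": "some-film.avi",   # suggested filename (illustrative)
        "length": 734_003_200,     # total size in bytes (roughly 700 MB)
        "piece length": 262_144,   # pieces of 256 KiB each
        "pieces": b"...",          # concatenated 20-byte SHA-1 piece hashes (elided)
    },
}
```

This is why hosting torrent files rather than content became such a contested legal grey zone: the artefact circulated by a tracker site is metadata, a pointer, not the copyrighted work itself.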

These tracker sites bifurcated into public trackers and private trackers. Public trackers can be accessed by anyone, and no download limits or “ratios” (the balance between data downloaded and seeded) are imposed on individuals' activities. Some well-regarded public trackers such as Demonoid are theoretically open to all, but only during periods when they open membership via their website or invites. Public trackers often reflect the disorganised, sprawling nature of the internet, with content ordered into a handful of broad categories (television, anime, music, etcetera), and a wide range of file formats allowed. User comments are often the only semi-reliable guide as to the quality of the seeded material. Over time certain public trackers have attracted massive user bases only to disappear, usually in response to legal threats or action taken against them by Big Content (coalitions of content owners). Others have stood their ground and fought back, garnering moral and political if not financial support from their users, with The Pirate Bay being the emblematic example (to be discussed in more detail later on).
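
The 'ratio' is simple arithmetic (bytes seeded back to the swarm divided by bytes downloaded from it), but, as the next paragraph suggests, private trackers treat it as the community's central disciplinary statistic. A minimal sketch of how a tracker might compute it, with illustrative figures:

```python
def share_ratio(uploaded_bytes: int, downloaded_bytes: int) -> float:
    """Share ratio as trackers commonly reckon it: data seeded back
    to the swarm divided by data taken from it. A ratio of 1.0 means
    a peer has given back exactly as much as it has received."""
    if downloaded_bytes == 0:
        return float("inf")  # a pure seeder's ratio is unbounded
    return uploaded_bytes / downloaded_bytes

# e.g. a peer that downloaded 700 MB and has so far seeded 350 MB back:
print(share_ratio(350_000_000, 700_000_000))  # 0.5
```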

In contrast to the open nature of public trackers, private tracker sites function to exclude most of those who would want to join them. With membership closed to the hoi polloi, new users can be admitted only in limited circumstances, including on the recommendation of existing members, via conditional invitations made available on other tracker sites (in which the invitee must prove their level of existing contributions), and via the 'illegal' selling or auctioning of memberships. Due to the ever-present legal threats faced by tracker sites, some of the most exclusive private trackers have become highly secretive, forbidding members to mention the sites' names on online fora. One of the most revered private trackers was Oink's Pink Palace (OiNK), a site dedicated to the creation of a community of music lovers via the sharing of music. OiNK captured the enthusiasm of around 200,000 people and access to their music collections, and its meticulously organised repository of high-quality formatted music files was regarded as a gold standard in file-sharing. The site was active from 2004 to 2007, until its creator, software engineer Alan Ellis, was charged with conspiracy to defraud, thus becoming the first person in Britain to stand trial for file-sharing (Ellis was found not guilty in January 2010) (ref)

The Anti-Counterfeiting Trade Agreement's (ACTA) production of networked battlegrounds in the new global info-order

Despite some significant legal wins in battles against Napster, Kazaa (a post-Napster file-sharing application built on the decentralised FastTrack protocol), torrent aggregator websites such as Mininova, and individual file-sharers, during the first decade of the new millennium the various alliances of Big Content were consistently losing the larger info-war of shifting people's attitudes towards, and practices of, file-sharing. Admonishment just wasn't working on those unruly hearts and minds, and instead disorder reigned wildly. People who had purchased films, music, books, software, and other cultural artefacts argued that they had a right to circulate digital copies of those materials over the internet, just as one might lend a book or a DVD to a friend. Moreover, people wanted to be able to share their enthusiasm for artefacts and their makers by setting up discussion fora on the torrent sites they inhabited. So circulation encompassed not only content, but also ideas and information about that content. Networks for file-sharing extend way beyond a technological facility to encompass some timeless aspects of what it means to be human, including the ability to tell stories about stories, a meta-level story-telling. As popular torrent sites like SuprNova and Mininova were forced to distribute only certain materials or monetise their exchanges, a thousand other tracker sites bloomed.

Unable to significantly impact the behaviour of individuals populating these networks, in 2008 various manifestations of Big Content shifted gear by initiating a two-pronged attack targeting Internet Service Providers (ISPs) and sovereign governments (Bridy 2011: 44). If the individual and collective behaviour of an increasingly heterogeneous class of info-serfs was out of control, then control needed to employ other means of expression. Firstly, transglobal corporations through their professional organisations insisted that ISPs around the world step up to the plate and monitor and discipline their customers' downloading behaviours. Secondly, they lobbied national governments to enforce this by introducing legislation favouring the interests of (often transnational or multinational) corporations over their citizens. The new “graduated response” or “three strikes” protocols gathered steam, with governments in Britain, Europe, Canada, and across Asia enthusiastically drawing up relatively similar laws to present to their parliaments (Bridy 2011: 44). Perhaps because of deeply libertarian values applied to home-turf issues, industry groups in the United States have held back from pushing a “statutorily mandated” system, opting instead to pursue “voluntary agreements” with ISPs (Bridy 2010: 45). A selective snapshot of recent three strikes legislation across three countries, France, X and X, and the overarching Anti-Counterfeiting Trade Agreement (ACTA), the apex of the coordinated international lobbying campaign by corporate rights owners, demonstrates Big Content's extraordinary power to influence, or attempt to influence, democratically elected governments.

The lengthy negotiations by 38 countries surrounding the drafting of the multilateral Anti-Counterfeiting Trade Agreement (ACTA) have been enveloped in secrecy resistant even to Freedom of Information requests, according to special interest groups and scholars locked out of the process (Geist 2009, np). The ACTA treaty evolved outside of “conventional” policy-making venues such as the World Intellectual Property Organization and the World Trade Organization, being negotiated instead through a “private network of, by and for invited corporate insiders,” as activist and author David Bollier noted (2009, np). Such circumstances foster “policy laundering,” that is, the use of “Trojan horse” international treaties to “justify the passage of controversial legislation within one's own country,” smuggling in “more expansive substantive rights” disguised as “better-coordinated enforcement” (Public Knowledge, quoted in Bollier; Bridy 2010: 7). As with other multilateral agreements (MAI, TRIPS, GATT, NAFTA, and so forth), deliberate non-transparency masks the simultaneously all-encompassing and minutely-detailed agendas of a class of powerful elites, in this case those associated with the convergent media, entertainment, and telecommunications industries. Consequently, treaty signatories can exclude the wishes of ordinary citizens regarding the contested issues.

With the United States being a significant treaty driver, some expected in 2008 that the incoming Obama Administration would relax ACTA's secrecy provisions. Instead the US government determined that the draft text was a “matter of national security,” withholding even the negotiators' names from Freedom of Information requests, “while sharing the negotiating text secretly with hundreds of industry lobbyists” (Love 2010, np). As I argue throughout this chapter, ordering attempts inevitably produce disordering functions, which generate their own socially productive momentum. Over time negotiation details and actual draft documents leaked out from the ACTA info-bunkers, increasing pressure on governments to open up the process. By March 2009 what little was known about ACTA came from a July 2008 “discussion memo” posted on Wikileaks.2

Although no longer available at Wikileaks (possibly due to data disruption caused by post-Cablegate mirroring of the site), I had previously downloaded this document. Entitled 'Japan – US Joint Proposal', its physicality is intriguing, being a very poorly photocopied artefact which bears the signs of a quick stealthy session by a mole at a copier. Some other leaked documents appear less harried. It is not clear who the ACTA whistle-blowers are, but presumably they are people engaged with the negotiating process at a high enough level to have decided that it is in the (global and national) public interest that these documents be opened up to wider scrutiny. The leaked documents thus become raw materials for cultural activists, privacy campaigners, legal experts, and interested others to support arguments against ACTA which, due to the secrecy, have been primarily formulated on political, philosophical, and ethical grounds. The negotiating texts become concrete proof of the intentions of the anonymous negotiators, revealing which parties will benefit from the treaty's provisions and which will be disempowered. Moreover, subtle linguistic shifts in the various iterations reveal the subtext of the tensions amongst the negotiating parties themselves, disorder within the ordering project, as Bridy (2010: 7-11) demonstrates.

In March 2010 the European Parliament voted 633 to 13 to demand disclosure of the negotiating text, forcing a “one time release” of the ACTA negotiating text on 16 April 2010 (Love 2010, np). Subsequently only the United States has blocked additional releases of the draft treaty. In a Resolution aimed at finalising ACTA details, in November 2010 the European Parliament noted the “public criticism of the secrecy of the negotiations as a clear signal of the political unsustainability of the negotiation process adopted,” thereby recommending that the ACTA Committee “operate in an open, inclusive and transparent manner” (Rinaldi et al 2010, np). The European representatives had recognised that the tactic of secrecy had completely failed to enable the construction of a new global info-order, and instead had opened the negotiating process to attacks from numerous quarters. Order had produced disorder, and this disorder was generating its own transglobal constituency, who were sharing information, ideas, strategies, and resources.

The rise of the 'graduated response' approach to file-sharers

Through its various iterations, the draft ACTA treaty has centred on internationally-cooperative mechanisms to enforce Intellectual Property Rights (IPR) covering copyright, trademarks, and patents. While the treaty clearly targets commercial counterfeiting operations, the phenomenon of file-sharing also falls within its ambit. This has concerned both lobby groups focused on digital privacy issues and also ISPs, who have found themselves potentially burdened with the tripartite role of surveillance officer, informer, and punisher in a digital panopticon scenario. When the final draft was publicly released, some earlier provisions, including the mandating of harsh punitive regimes against individual file-sharers, had been omitted ('ACTA Consolidated Text, Informal Predecisional/Deliberative Draft, 2 October 2010'). Pressure from inside and outside the formal process that the treaty be “more protective of the parties' sovereign prerogatives in areas relating to substantive rights, liabilities, and exceptions” had caused this apparent retreat (Bridy 2010: 8). Nevertheless “graduated response” measures remained “tacitly endorsed” by both the preamble and provisions promoting “greater cooperation between rights owners and service providers” (Bridy 2010: 1).

Moreover, powerful copyright advocates including the International Federation of the Phonographic Industry (IFPI) and the International Intellectual Property Alliance (IIPA) have operated concurrently outside of the treaty framework, pressuring individual governments in an “especially aggressive” way to insist that ISPs adopt an “active role in policing copyrights online” (Bridy 2010: 2). Furthermore, some countries, including the United States and Ireland, are exploring “private ordering” options in addition to “public law mechanisms” to enforce online copyright (Bridy 2010: 2). The ACTA project's overall success is demonstrated by the general acceptance by governments, if not their citizens, that the State must play a major role in protecting the interests of corporations in the information age. Clearly, the invisible hand of the market is not strong enough to control the activities of the wilful masses, but neither have the strong-arm tactics of the State produced quiet acquiescence, as the French example demonstrates.

In May 2009 the French Parliament passed the Création et Internet law, requiring Internet Service Providers (ISPs) to undertake “graduated response” or “three strikes” protocols in relation to account customers deemed to be engaged in alleged copyright infringements.3 Firstly, they would have to issue two warnings to such customers. Upon the third alleged infringement the ISP would be legally obligated to immediately terminate the customer's internet account (Bridy 2011: 52-56). A centralised blacklist would ensure that such terminated customers could not simply switch ISPs (Anderson 2009b, np).

Moreover, the law enabled ISPs to block popular file-sharing sites, and would penalise people for not securing their personal internet connections against illicit uses by others, especially via wireless networks. Although other countries including the United States, Italy, and Ireland had discussed similar legislation, France was the first country to write it into law (Anderson 2009a, np). The law, also known as HADOPI (High Authority for the Distribution of Works and the Protection of Rights on the Internet) after the administrative body which would implement it, had met significant public opposition. Furthermore, it contravened a European Parliament decision concluding that such punitive legislation would “violate the fundamental rights and freedoms of Internet users” (Ernesto 2009, np). One month after the law had been passed, France's highest legal authority, the Constitutional Council, ruled that the enforced loss of internet access would be unconstitutional and blocked this provision, empowering HADOPI only to warn identified downloaders but not to punish them by cutting their internet accounts. But just as the technological battlefields on which contestations over file-sharing take place are continually shifting, so too legal terrains exist in a state of constant flux.

In September 2009 a second iteration of HADOPI was passed into law. Infringement activities are detected by commercial internet security companies engaged by rights owners, thus bringing further rings of ordering mechanisms into the overall process and, concomitantly, the potential for generating new kinds of disorder. The security companies employ network monitoring software to detect alleged infringements, and then transmit to the rights owners “the IP address from which the files in question were available, the ISP of the alleged infringer, and the date and time of the alleged infringement”; the rights owners in turn refer each instance to HADOPI (ibid.). Some concerns about privacy had been taken on board: HADOPI may disclose an infringer's identity not to the rights owners but only to the ISP. ISPs are required to issue warnings to customers within 24 hours of receiving a HADOPI notification. If three alleged infringements occur within one year, HADOPI contacts a prosecutor, who brings the matter before a judge. Without undertaking any judicial investigation the judge can ban a person from using the internet for up to one year (Anderson 2009c, np). However, people have the right to appeal such judgements (Bridy 2011: 54).
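
The workflow just described amounts to a per-subscriber state machine, and can be sketched as such. The model below is purely illustrative (HADOPI's actual systems are not public code, and the class and subscriber names are invented): infringement reports accumulate against a subscriber, the first two within a rolling year prompt warnings, and a third escalates the case for judicial referral.

```python
from datetime import datetime, timedelta

WINDOW = timedelta(days=365)  # strikes are counted within one year

class GraduatedResponse:
    """Illustrative model of a 'three strikes' rule (not HADOPI's
    actual system): the first two reports within the window trigger
    warnings; a third escalates the case for judicial referral."""

    def __init__(self):
        self.reports: dict[str, list[datetime]] = {}  # subscriber -> report times

    def report_infringement(self, subscriber: str, when: datetime) -> str:
        history = self.reports.setdefault(subscriber, [])
        history[:] = [t for t in history if when - t <= WINDOW]  # expire old strikes
        history.append(when)
        if len(history) < 3:
            return "warn"        # ISP must notify the customer within 24 hours
        return "refer_to_judge"  # third strike within a year

gr = GraduatedResponse()
start = datetime(2010, 1, 1)
print(gr.report_infringement("subscriber-42", start))                       # warn
print(gr.report_infringement("subscriber-42", start + timedelta(days=30)))  # warn
print(gr.report_infringement("subscriber-42", start + timedelta(days=60)))  # refer_to_judge
```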

By October 2010 a French music industry body reported that its members, using a range of technological monitoring mechanisms to detect file-sharing activities, were sending around 25,000 music-related copyright infringement notifications to HADOPI per day (Pichevi 2010, np). However, based on a sample of 2,000 internet users, researchers from the University of Rennes in Brittany found that file-sharing had increased by 3 per cent since HADOPI, although people were changing how they obtained cultural materials, drifting from BitTorrent tracker sites towards both streaming content and direct download sites (Anderson 2010a, np). The study also discovered a clear relationship between the downloading and purchase of content, with fifty per cent of online purchasers of digital music and video also admitting to illicit downloading. The researchers concluded that HADOPI's banning of downloaders from the internet might “eliminate 27 percent of all Internet buyers of music and video,” thus paradoxically impacting the very profits Big Content had used the law to protect (ibid.).

-----------------

next para: spanish wikileaks revelations....pressure from US to change laws on ISPs

  • 1. Internet applications provide a means and an inspiration for people sharing common interests and passions to aggregate ideas and resources in rich communicative environments. Many such applications emanate from the grassroots of computing and hacking cultures, that is, individuals and autonomous communities solving problems and building solutions to needs they themselves have identified. However, this is not to underplay the role that educational and scientific institutions, and to a lesser extent corporations, have had in developing applications and providing (often free) server space. For example, in the 1980s and 1990s countless special interest groups gathered together via the Usenet network (ref). Text-based Multi-User Domains (MUDs) and MUDs Object-Oriented (MOOs) offered the first popular incarnation of internet-located role-playing games and collective experiments in playful improvisation with avatars. And early portals such as the WELL (Whole Earth 'Lectronic Link) operated as attractors of like-minded people in search of intellectual companionship and community.
  • 2. Michael Geist, law professor at the University of Ottawa (2010, np), has played a leading role in analysing such leaked content, particularly as it applies to Canada, maintaining an online repository of key documents, as has James Love through the auspices of Knowledge Ecology International (see 'The Anti-Counterfeiting Trade Agreement (ACTA)' 2011). While a number of legal activists from organisations such as the Electronic Frontier Foundation (EFF) have weighed in on the ACTA debate, law professor Annemarie Bridy's (2010) comparative analysis of leaked documents stands out for both its thoroughness and non-partisan approach.
  • 3. The French version of this protocol had first been seeded in a 2004 report by France's High Council of Literary and Artistic Property, was then ignored when in 2006 France amended laws to comply with an EU directive requiring “harmonization” of copyright law, and subsequently resurfaced in a 2007 French Ministry of Culture report recommending the establishment of an administrative body to oversee a “system of warnings and sanctions” (Bridy 2011: 52). Hence, Création et Internet had been preceded by a five-year period in which the basic idea of ISP involvement in direct punishment of file-sharers had been socialised via policy papers and commissioned research.

Comments


summer of love

yes,
cd invite ppl to respond to the theme of one love..
in the context of digiculture mebbe, or culture/capitalism/anarcho-lovism,..etc..something to anchor the love anyways
mebbe shortish pieces, 2-3K
cd be a sweet book, paper and/or print...like your Dive book

yessa

lets make a one love edition

jah rastafari
a

Plan B

Hi Josephine

That is a great tip-off - thank you. Will follow through today.

I am discovering that writing this piece is making a patchwork quilt without any boundary in sight! especially since I have rediscovered the Social Science Research Network last night, which anyone can join, and it contains a cornucopia of working papers and published academic papers, free to download.

A pretty interesting legal scholar in the p2p field is Annemarie Bridy, who in the paper below gives the clearest description of the technical aspects of how p2p works that i have read

Bridy, Annemarie, Is Online Copyright Enforcement Scalable? (January 13, 2011). Vanderbilt Journal of Entertainment & Technology Law, Forthcoming. Available at SSRN: http://ssrn.com/abstract=1739970

Cheers

Francesca

ipv6

hi Francesca,

This may interest you:

"All of this means that peer-to-peer protocols such as VoIP solutions and BitTorrent work worse over IPv6 than over IPv4. This situation probably won't be resolved any time soon, as people with "security" in their job title refuse to consider passing through unsolicited, incoming packets in any way, shape, or form."

http://arstechnica.com/business/news/2010/09/there-is-no-plan-b-why-the-...

For the whole story, click back to the first page of this article. From what I understand from a friend of mine who lectures on ipv6, Vesna Manojlovic of RIPE, ipv6 will in the end make peer to peer near impossible.

warmly,

J
*

oops :b

I will change it right away, thanks for detecting

mebbe we cd have a series of One Love texts one day

yeah, it's a great song!

Jah love

title

ahem, far from claiming ownership of a fantastic title of one of the greatest reggae songs of all times i only want to point out humbly that on these pages here it has already been used once just in case you were not aware http://www.thenextlayer.org/node/573

bestest
a.

technopolitical

Hi armin .

thanks for encouragement.

hope the bug leaves your body post haste! chamomile tea can be soothing. also fennel tea.

I don't think i am a member of techno group -- will keep fiddling on the draft and post the next iteration there over the next couple of days. its my first bit of writing since sending the Phd off, so its a relief to work on something that has an end in sight. Am putting in a few chunks from the thesis, they might eventually get weeded out again, we'll see

bye for now, rest up
francesca

very technopolitical

Francesca,

great draft and very technopolitical. are you member of that group? maybe should post it also there so that Brian can see. I am a bit ill with some strange stomach flu bug, so no further comments but i am looking forward to see this article develop
best
armin