The Rise of the Network Commons

Antenna installation at Haus des Lehrers, 2003, image courtesy Freifunk

The Rise of the Network Commons, a book project under development, by Armin Medosch, supported by EU project Confine.


The Rise of the Network Commons, Chapter 1 (draft)

The Rise of the Network Commons is the working title of a new book which I am currently writing. It returns to the topos of the wireless commons on which I worked during the early 2000s. In this new version, combining original research from my German book Freie Netze (2004) with new research conducted in the context of the EU-funded project Confine, the exciting world of wireless community network projects such as Guifi.net and Freifunk, Berlin, is interwoven with philosophical reflections on the relationship between technology, art, politics and history. This is the first draft of the first chapter. In the final version, the text may change significantly. Critique and comments are welcome. You can either send them to me by email or ask me for an account to post comments here: armin (a) easynet dot co dot uk.

The World of Guifi.net and the Dispositif of Network Freedom

On my recent visit to Barcelona in the context of the Confine project, Guifi.net founder Ramon Roca took me to Gurb, the village he comes from. It was there that Guifi.net was started in 2003, when Ramon realized that he would never get good bandwidth at a fair price in this remote area within sight of the foothills of the Pyrenees. Ramon, who is an IT professional but keeps his working life and his activities with Guifi.net separate, found that he could get broadband by using WiFi to connect to a public building on the outskirts of a nearby small town, Vic. Since then, Guifi.net has grown to become the largest WiFi community network in Europe, with currently more than 25,000 nodes. It is not entirely correct anymore to call it a wireless community network, since a growing number of nodes are connected by fiber-optic cable. Since Ramon and his collaborators have found out how relatively easy it is to work with fiber, he is on a new mission: to get fiber to the curb of as many houses as possible.

Visiting Gurb and talking to Ramon for nearly a full day revitalized my fascination for wireless (and wired) community networks. I wrote a book on wireless community networks in 2003, in German, under the title Freie Netze (Free Networks). The choice of title back then deliberately emphasised the analogy between Free Networks and Free Software. The title had been inspired by two very different influences. On one hand there was Volker Grassmuck's early book Freie Software (http://freie-software.bpb.de/Grassmuck.pdf). Volker's magisterial work provided deep insight into the history and politics of Free Software and stood out for me as an example of how a book on wireless community networks should be written. The other inspiration had been provided by a sweeping lecture in Vienna in June 2003 by Eben Moglen, lawyer of the Free Software Foundation and the legal brain behind the licensing model of Free Software, the General Public License (GPL). Moglen's thunderous and captivating speech presented the combination of Free Software, Free Hardware and Free Networks as a kind of holy trinity of the everything-free-and-open movement. Moglen's conclusion was that while Free Software was already an accomplished fact, and free hardware was the hardest part, free networks were a viable possibility, though there was still a long way to go to attain critical mass.

My book had come maybe a few years too early. When it appeared, some of the most important wireless community networks of today, such as Freifunk, Berlin, Funkfeuer, Austria, or Guifi.net, either did not yet exist or existed only in embryonic form. The model of wireless community networks on which my book had been based had been created by Consume.net in the UK. Consume.net was the outcome of an improvised workshop in December 1999 in Clink Street, near London's creative net art hub Backspace. I will describe the history of Consume in more detail below, but one key aspect of that initiative was that it was launched by non-techies. James Stevens, founder of Backspace, and Julian Priest, artist-designer-entrepreneur, provided the impetus for DIY wireless networking by sketching plans for a “Model 1” of WLAN-based community networking on a napkin during a tempestuous train journey in late summer 1999. Their “Model 1” – a name chosen for its association with Henry Ford's first mass-produced car, the Ford Model T or “Tin Lizzie” – was a techno-social network utopia.

The relatively young discipline of Science Studies teaches us that the technical and the social cannot, or should not, be considered as categorically separated. Technologies are “socially produced” is one of the key phrases in the discourse of science studies. They do not exist outside the human world but are the product of specific societies which exist under specific conditions and circumstances. Technologies are hybrids between nature and society, as science studies author Bruno Latour puts it. Moreover, a specific school of science studies, the Social Construction of Technological Systems (SCTS), has studied the co-evolution of large technological systems and social structures. SCTS pioneer Thomas P. Hughes, who studied the building of the first nationwide electrical grid, found that there are strong co-dependencies between technological and social systems. While there is undeniably a strong influence on the shaping of technologies exerted by business interests, Hughes' work emphasizes the co-dependencies between technologies and the people who build and maintain them, the technologists or techies – a term I will use from now on because it allows me to refer both to academic computer scientists and researchers and to autodidactic hackers, and one which I hope is not seen as derisive in any way.

Engineers and skilled workers involved in large technological projects bring certain predispositions to those projects; as projects evolve, the communities of techies develop certain habits and ways of working. The technological and the social system form a unity which determines how those technologies evolve in the future. What we can learn from science studies is that neither is science objective (in the strict sense of the word), nor is technology neutral. To believe the opposite would constitute either scientific objectivism – a rather outdated form of scientific positivism – or technological determinism, the belief that technology alone is the main factor shaping social developments.

James Stevens and Julian Priest, founders of Consume, are neither scientific positivists nor technological determinists. They conceived Model 1 as a techno-social system from the very start. Their ideas combined aspects of social and technological self-organisation. In tech-speak, the network they aimed at instigating was supposed to become a Wide Area Network (WAN). But while such large infrastructural projects are usually built either by the state or by large corporations, James and Julian thought that this could be achieved through bottom-up forms of organic growth.

Individual node owners would set up wireless network nodes on rooftops, balconies and window sills. Each node would be owned and maintained by its owner, who would also define the rules of engagement with other nodes. The network would grow as a result of the combination of social and urban topologies. The properties of the technology – strictly speaking there is no such thing as a property of technology in isolation, as I just explained, but let's reduce complexity for a moment – impose certain restrictions. WLAN, as the underlying technology of WiFi is called in more technical circles, operates in a part of the electromagnetic spectrum that does not pass easily through obstacles such as walls. Therefore there needs to be an uninterrupted line of sight from one node to the next. Node owners need a way of identifying each other in order to create a link. According to the properties of internetworking protocols, each of those links is a two-way connection, which means that data can travel as easily in one direction as in the other. Furthermore, node owners would agree to allow data to pass through their nodes. There would not only be point-to-point connections from one node to another, but larger networks, in which data can be sent and received via several intermediate nodes. Such a wide area community network would also have gateways to the Internet in order to allow the exchange of information between the local wireless community network and the wider networked world.
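
To make the idea of multi-hop forwarding concrete, here is a minimal sketch of such a topology as a graph. The node names and links are invented for illustration; real networks use routing protocols rather than a hand-written search, but the principle – data hopping across neighbouring nodes until it reaches a gateway – is the same.

    # Illustrative sketch only: a toy model of multi-hop forwarding in a
    # community network. Node names and links below are invented.
    from collections import deque

    # Bidirectional links: each WiFi link carries data both ways.
    links = {
        "rooftop-a": {"balcony-b", "windowsill-c"},
        "balcony-b": {"rooftop-a", "gateway-d"},
        "windowsill-c": {"rooftop-a"},
        "gateway-d": {"balcony-b"},  # this node also shares an Internet uplink
    }

    def find_path(network, source, destination):
        """Breadth-first search for the chain of nodes data would hop across."""
        queue, visited = deque([[source]]), {source}
        while queue:
            path = queue.popleft()
            if path[-1] == destination:
                return path
            for neighbour in network[path[-1]] - visited:
                visited.add(neighbour)
                queue.append(path + [neighbour])
        return None

    # Traffic from the windowsill node reaches the gateway via two hops.
    print(find_path(links, "windowsill-c", "gateway-d"))
    # -> ['windowsill-c', 'rooftop-a', 'balcony-b', 'gateway-d']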

Those desired characteristics of Model 1 were not actually invented by Julian and James. Those properties already existed, deep inside the technologies we use to connect, but working for the most part unnoticed by those who use them. The key term has already been introduced above, without further explanation: the “protocols” that govern the flow of information in networked communication structures. Protocols are conventions worked out between techies to decide how the flow of data in communication networks should best be organised. The basic protocols on which the net runs, such as the Internet Protocol (IP) and the Transmission Control Protocol (TCP), were defined decades ago by engineers and computer scientists working on the precursors of the net, Arpanet and NSFnet. Some people would go so far as to say that the Internet is neither the actual physical structure of cables and satellites used to connect, nor the content that travels via such structures, but that it is embodied in the suite of protocols commonly referred to as TCP/IP (those two are usually mentioned, but there are many more). The protocols are the essence of the net because they give it its key characteristics. I am not sure whether this is not a very refined form of technological determinism, but I would like to leave this question open for a moment.

The reason for this hesitation is that the protocols are not identical with the technology that uses them. The protocols are conventions that can be described in textual form. The way this is done is through so-called Requests for Comments (RFCs). Since the dawn of the net, RFCs have been defined in a way that runs counter to common understandings of how technologies are created. RFCs are approved by techies who congregate under the umbrella of the Internet Engineering Task Force (IETF). The arcane decision-making mechanisms of the IETF have since the very start been governed by maxims such as “rough consensus and running code”. People who develop new Internet technologies present them to their peers, who then react by making noises such as humming or whistling. The criterion for approval is not theoretical consistency but whether things actually work or not. The robustness and the freedom of the net are guaranteed, despite the lack of central coordination, by the self-organised decision-making power of those techies who meet at the IETF. While a lot of those people may have jobs with large corporations, when they meet at IETF conferences they still decide as technicians who adhere to their own codes of human responsibility.

It is amazing that, despite the commercialisation of the net, this has not fundamentally changed. Corporations and governments may seek to gain more and more control over the net, and while they are actually quite successful in doing so in some areas, the social protocols of decision making enshrined in the mores of the techno-social communities have so far been able to withstand all such assaults. On the layer of the protocols the net was, and still is, “free”.

Thus, when James and Julian wrote out the formula of growth for Model 1, they referred to a freedom to connect that is inherent in the way the Internet was originally conceived and the way it still functions now, on the layer of the protocols. The knowledge and awareness of that fact had become buried under new layers built on top of older layers in the course of technological improvement, but also through the commercialisation of the net in the 1990s. Consume.net was started at the cusp of what was then called the New Economy, a stock exchange boom fueled by the rise of information and communication technologies in general and PCs and the Internet in particular. The 1990s had been a very exciting decade which saw the net rise from obscurity – a communication technology used by scientists and a small number of civil society organisations, artists and freaks in the late 1980s and early 1990s – to a new mass medium driving, and being driven by, a gigantic economic machinery. In the process, a lot of the properties that had been dear to the early inhabitants of the net, the digital natives, became either sidelined or overshadowed by commercially driven interests and the secret workings of the deep state.

Model 1 was thus both a new techno-social invention and a return to the original Internet Arcadia. Against the tide of rising commercialisation and the inequalities and distortions that came with it, wireless community networks were supposed to bring back a golden age of networked communication, of equality and freedom. Technical and social properties were conflated into a model of self-organisation. The possibility for that was provided by a small and often overlooked feature of the technology. 802.11b was the technical name of the wireless network standard in use around 1999. It allowed two different operating modes: one where each wireless network node knew its neighbors and could receive and send data based on fixed routing tables, and another one, the ad-hoc mode, where nodes would spontaneously connect with each other. The ad-hoc mode needed to be supported by routing protocols best suited to the wireless medium. In a fixed network with cables, it is advantageous to work with fixed routing tables. When data arrives, the network node decides where to send it, based on its knowledge of the topology of the network. But in wireless networks that topology constantly changes. Nodes can break down due to atmospheric or environmental influences. The quality of a connection can change dramatically because of disturbances in the electromagnetic medium. Or a truck parks in front of your house and the line-of-sight is suddenly gone.
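
The following sketch illustrates, with invented node names, why a constantly changing topology calls for dynamic routing: each node's forwarding table has to be rebuilt from whatever links currently work, rather than being configured once and left alone. It is a toy model, not any of the actual mesh routing protocols discussed later.

    # Toy model: rebuild a node's routing table (next hop per destination)
    # from the links that are currently usable.
    from collections import deque

    def routing_table(links, node):
        """Breadth-first search from `node`: next hop towards each reachable node."""
        table, queue, seen = {}, deque([(node, None)]), {node}
        while queue:
            current, first_hop = queue.popleft()
            for neighbour in links.get(current, set()) - seen:
                seen.add(neighbour)
                table[neighbour] = neighbour if first_hop is None else first_hop
                queue.append((neighbour, table[neighbour]))
        return table

    links = {"a": {"b", "c"}, "b": {"a", "d"}, "c": {"a", "d"}, "d": {"b", "c"}}
    print(routing_table(links, "a"))   # 'd' is reached via 'b' or 'c'

    # A truck blocks the line of sight between a and b: drop that link ...
    links["a"].discard("b"); links["b"].discard("a")
    print(routing_table(links, "a"))   # ... and traffic to 'b' and 'd' now goes via 'c'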

For this reason, Consume.net started to get interested in a technology called mesh networking. In the year 2000 mesh network protocols were still very much in their infancy. There was a working group called Mobile Ad-hoc Networking (MANET), supported by the US military. In London, a small company was building something called Meshcube. It was a working technology, but it was not really open source and only the developer knew how to run it. When Consume.net started to work with mesh network technology, it still seemed a utopian technology. While neither James nor Julian were techies, they had the support of some very skilled hackers; but even so, the group was not in a position to significantly develop mesh network protocols. Mesh networking was a dream, something that was already on the horizon but not yet there.

This was a pattern established in 2000 and still very much in place in 2014: once the problems of mesh networking were solved, wireless community networks would flourish and become unstoppable. Social qualities, such as self-organization without centralized forms of control, were mapped onto technological properties, such as the ability of machines to automatically recognize each other and connect to build a larger cloud of networked nodes. The idea of network freedom – the ability to connect without having to apply to a central point of governance, and without having to go through a company such as a telecommunications operator (telco) – was supposed to further communication freedom, and thus the rights and ability of people to express themselves and communicate freely without top-down hierarchical control. The convergence of those ideas I call the dispositif of mesh networks and network freedom.

I am appropriating the term dispositif from Michel Foucault who used it to “refer to the various institutional, physical, and administrative mechanisms and knowledge structures which enhance and maintain the exercise of power within the social body” (Wikipedia: http://en.wikipedia.org/wiki/Dispositif).1

Our mesh network dispositif does not (yet) add up to all of society, but it is something that is widely shared among techies building wireless community networks. It is a discursive behaviour, but also a set of beliefs and a set of material assemblages. Assemblage, in turn, is another term that I appropriate freely from a French philosopher, Gilles Deleuze. While the dispositif does not exist outside time, it somehow hovers above the concrete historical moment. In this way, the dispositif of mesh networks has influenced wireless community networks since the year 2000. The assemblage, while also consisting of material and non-material components, is concretely manifest in the historical moment. The mesh network dispositif promises to bring about an era of unrestricted and seamless communication, free from technological and social constraints. This dispositif legitimates itself historically by the way the Internet was originally conceived. At the same time it contains the promise of a future in which the net will again be what it once was.

When I came to Barcelona in July 2014, I was thrilled to see that, as part of the EU-funded research project Confine, a project was under way to develop the Quick Mesh Project (QMP). QMP is a so-called free firmware, a Linux-based operating system for network devices. Many people now have wireless routers at home. When you buy Internet access from a provider, you often also get a box that lets you connect to the net wirelessly. QMP would replace the operating system of such a device with a much improved version, one that speaks the language of mesh network protocols. To give a simple example, if in a street of apartment blocks everybody who owns a wireless router replaces the firmware with QMP and then puts the router on the window sill, all those machines would automatically connect and build a network without using any cables or other hardware from commercial providers. It would make it easy and simple to connect without having to go deep into system settings. This has now changed from being a faraway utopian goal to something that is literally around the corner.
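
To give a feeling for what “automatically connect” means, here is a toy simulation of neighbour discovery – routers announcing themselves and remembering who they can hear. This is only an illustration of the general idea; it is not QMP's actual protocol or code, and the router names and radio ranges are invented.

    # Toy simulation of neighbour discovery in a street of window-sill routers.
    # Not QMP's actual protocol; names and radio ranges are invented.
    class MeshRouter:
        def __init__(self, name):
            self.name = name
            self.neighbours = set()

        def hear_hello(self, sender):
            """Record another router whose hello broadcast we have received."""
            if sender is not self:
                self.neighbours.add(sender.name)

    def broadcast_hellos(routers, in_range):
        """Deliver hello broadcasts between every pair of routers in radio range."""
        for a, b in in_range:
            routers[a].hear_hello(routers[b])
            routers[b].hear_hello(routers[a])

    street = {name: MeshRouter(name) for name in ("no-3", "no-5", "no-7")}
    broadcast_hellos(street, in_range=[("no-3", "no-5"), ("no-5", "no-7")])
    for router in street.values():
        print(router.name, sorted(router.neighbours))
    # no-3 ['no-5'] / no-5 ['no-3', 'no-7'] / no-7 ['no-5']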

It may or may not succeed. One problem is that it resembles what Saskia Sassen described as an engineer's utopia. Techies, whether they are academically trained computer scientists, telecommunications engineers or self-taught hackers, tend to believe in the unlimited potential of technology. They see the potential of a technology. There is nothing wrong with that, on the contrary. It takes such people, capable of dreaming a different future based on creatively bending and twisting technologies. The problem, however, is that far-sighted techies tend towards a linear extrapolation of technologies into the future without considering other factors, such as politics, the economy, the fundamental differences between people in class-based societies, and so on and so forth. In this way, the highly productive mesh network dispositif gets turned into the dreamworld of the Internet cornucopia. The technology gets imbued with characteristics that are actually outside it and depend on factors beyond the influence of creative technologists. It becomes a messianic technology in the way the great philosopher of culture and technology Walter Benjamin theorized it in the 1930s.

(to be continued ...)

  • 1. The same Wikipedia page further defines the dispositif as “the interaction of discursive behavior (i. e. speech and thoughts based upon a shared knowledge pool), non-discursive behavior (i. e. acts based upon knowledge), and manifestations of knowledge by means of acts or behaviors [...]. Dispositifs can thus be imagined as a kind of Gesamtkunstwerk, the complexly interwoven and integrated dispositifs add up in their entirety to a dispositif of all society." (quoted from Siegfried Jäger: Theoretische und methodische Aspekte einer Kritischen Diskurs- und Dispositivanalyse http://www.diss-duisburg.de/Internetbibliothek/Artikel/Aspekte_einer_Kritischen_Diskursanalyse.htm)

Network Commons: dawn of an idea (Chapter 1, part 2 - Draft)

Mad James

Good ideas often pop up at the same time at various points on the Earth; they just seem to be in the air. And so it came about that around the year 2000, at different points on the globe, wireless free community networks were started: Consume.net in London, New York Wireless, Seattle Wireless and Personal Telco in Portland, Oregon, were among the first wireless community networks based on Wireless LAN, or WLAN. Nobody can really say which one came first. I was lucky to experience the development of Consume and free2air.org in London at close quarters. Therefore, in this chapter I will tell the story of those networks.

But before I go into the details of this story, it is worth remembering a bit how things were back then. Today, when the debate shifts to a topic such as so-called digital natives, many young people seem incapable of comprehending that there are middle-aged people like me who have spent a large part of their adult life on-line. I got my first computer in 1985 – though I should say we, because it was shared between my then girlfriend and myself. In 1989 it was followed by two new computers. She got an Amiga 2000, and I got a pre-Windows PC. So I spent a good deal of time learning key commands for the DOS version of Word, while my partner could do wonderful graphical stuff on her Amiga. We could even digitize video, change every single image and turn it into a loop that could be played out and recorded to tape. While I was envious of the slick graphical interface of the Amiga, my PC soon learned a new skill: communicating with other computers. That was when the whole on-line fun started.

Actually, we had to overcome a few obstacles first. In Europe, computer modems at the time – around the late 1980s to the early 1990s – had to be licensed by the national Post, Telecommunications and Telegraph company (PTT). This made the equipment prohibitively expensive for many. But we found a workaround. We traveled to West Berlin and there, in a store called A-Z Electronics, we could buy a 2400 baud modem on the cheap. This modem could be legally sold because it had one cable missing – a loophole in German law according to which it was legal to sell unlicensed equipment if it was not in a state to be used. After we smuggled it back to Vienna, we soldered the cable and connected the modem to the telephone termination point. Franz Xaver, a friend and artist-engineer, had to help solve issues with the arcane Austrian telephone system. Another friend brought a pack of diskettes and we installed Telix, a program for communicating with bulletin board systems (BBSs).

The BBS world was like a testing ground for virtual communities where certain types of behavior could form. These could be elements of a netiquette, but also an understanding of what it means to be on-line in the first place. Stories about early on-line communities by authors such as Sandy Stone1 and Howard Rheingold2 describe how these communities, some of which go back to the early 1970s, fostered social (or anti-social ;-) behavior.

First artistic experiments with “Art and Telecommunication”3 began in the late 1970s. The Canadian artist Robert Adrian X, who by then was living in Vienna, started an artist's conference board called Artex on a proprietary network in 1980. Fellow artist Roy Ascott described in vivid terms how it felt to be on-line and engage in real-time synchronous communication.

"Over the past three years I have been interacting through my terminal with artists in Australia, Europe and North America, once or twice a week through I.P. Sharp's ARTBOX. I have not come down from that high yet and frankly I don't expect to. Logging in to the network, sharing the exchange of ideas, propositions, visions and sheer gossip is exhilarating. In fact it becomes totally compelling and addictive." (Roy Acot 1984, quoted in Grundmann 1984, p. 28)'

Similar feelings have been shared by almost everyone since who has experienced an always-on network connection for the first time. But let's return to the BBS world, which could be quite wild at times. Artist-hackers such as Toek from the radio art and performance group DFM circumvented the fact that those systems did not really have graphical interfaces by creating log-on pages with flashing and blinking ASCII animations. Communications in those systems were uncensored – apart from the curiosity of the maintainer of the system – and sometimes one could encounter, without looking for it, cracked software or literature such as the “Hackerfibel” by the Chaos Computer Club, the Anarchist Hackers Cookbook, or The Temporary Autonomous Zone by Hakim Bey; one could also find software for war-dialling and similar things bordering on what was legally permissible. Thus was created the myth of the Internet as a kind of large DIY bomb-building workshop.

This is a pretty persistent myth, by the way, but it has maybe more to do with the criminalization of hacking by the US secret services, who seemed intent on demonizing an activity that many of those involved understood primarily as curiosity, research, interest, gaining new knowledge. When the Internet was opened up for public usage, it seemed to get populated very quickly by all kinds of creative spirits. In 1995, when I had, through work, my first always-on “broadband” Internet, the web seemed to consist primarily of artists, anarchists, trade unionists, multinational and non-governmental organisations, campaigners for the environment, workers' rights and indigenous groups, as well as the occasional commercial web page of a forward-looking company and the standard-setting physics department homepage which has been immortalized by artist Olia Lialina with the work "Some Universe": http://art.teleportacia.org/exhibition/stellastar/. Olia Lialina has also collected “Under Construction” signs such as this one, another charming aspect of the early web:

While the on-line world was colorful and intellectually stimulating, Internet access was not at all cheap at the time. We looked with envy at the US, where local calls were almost free. In Europe you not only had to pay the costs of a provider, but also the cost of the call for every minute you spent on-line. As the 1990s progressed, the modems got faster and maybe telephone rates a bit cheaper, but the situation remained fundamentally the same, except in those rare instances where people came up with inventive solutions.

Cheap Broadband for the Masses: Vienna Backbone Service

In Vienna, Austria, the media artist Oskar Obereder started an Internet service provider almost by accident. With some art school friends, Obereder had launched “A Thousand Master Works”, a project where artists produced multiples which were sold via a poster. Soon, the poster proved an inefficient method of keeping the offerings up to date. Obereder created a database and, together with some other artists, hackers and the editors of the music magazine Skug, brought the server on-line as a web-based ordering system. The same technology also supported Skug's database of independent music. This machine had to be online 24/7, so Obereder and Skug had to get a leased line. In order to share the cost, they distributed Internet access throughout the loft spaces of a former furniture factory where lots of other artists and creative people worked. Everyone who connected to this cable-bound ad-hoc net got the buzz of an always-on Internet, and Obereder inadvertently became a provider.

Working together with a small ISP, AT.net, Obereder and colleagues found out about a brand-new technology coming from California that allowed normal copper telephone lines to be used for broadband Internet connections. This was called DSL, and when they first contacted the manufacturer, they were told to get lost, because the manufacturer only sold to telecom providers. Finally, the Austrians got hold of a few modems and started laying the groundwork for what would become the Vienna Backbone Service (VBS). This network was offered by three small ISPs as a collaborative effort, but it was also “provided” by many of its first customers, who were hosting network exchange services in their cupboards.

Because of the “creative milieu”4 in which Obereder moved, he knew many artists and techies, or combinations of the two, who had high bandwidth needs and some technical skills. By now he had founded a company called Silverserver – later shortened to Sil – and they had found out that there was a special type of telephone line that you could rent quite cheaply from the incumbent and over which you could run DSL. Moreover, the cost depended on the distance to the nearest exchange. Silverserver started finding friends, who were also customers, who lived next to an exchange. In this way, they gained a foothold in many Viennese districts, from where they could spread out organically, initially offering always-on broadband at a tenth of the incumbent's price.

In 1998 the workshop and conference Art Servers Unlimited brought together about 40 artists, hackers and activists of all kinds at London's Backspace and the ICA. Obereder presented the model of VBS and James Stevens caught an earful of it. What he mainly got out of it was that you could grow a rather large network in a decentralized way, by a collaborative method that involved people taking responsibility for a node.

Consume – the culture of free networks

 Free Networking as social mechanism: Consume workshop with Manu Luksch, Ilze Black and Alexei Blinov, probably 2003

James Stevens and Julian Priest found another inspiration for their “Model 1” (see chapter 1) in the way WiFi was used, in a particular neighborhood and social environment, to share a leased-line Internet connection. At the turn of the millennium, James Stevens and Julian Priest had “worked for a decade almost in multimedia, making CD ROMs and websites, running around... then we decided to give it a try, and concentrate on doing more altruistic work.”5

Both had their offices in a special corner of Southwark, the London borough just south of the Thames: in Clink Street, in a small warehouse directly on the riverside called Winchester Wharf. Today, oh irony, the ground floor is occupied by a Starbucks. Adjacent to it there were other warehouses, converted into offices and studios for various creative outfits, from record labels to web and multimedia companies. In Winchester Wharf, the web company Obsolete and the Internet cafe Backspace enjoyed a few years of happy coexistence. Obsolete had quickly become successful by making web pages for Ninja Tune and other record labels located in the same building. After the record companies came blue-chip clients such as Levi's, intent on having a cool, young image. But James Stevens had already opted out at that time.

So he founded Backspace (http://bak.spc.org/), a place on the ground floor, with one window almost at water level when the tide was high. Fittingly, the homepage of Backspace showed (and still does show) a graphical animation of the river Thames with the websites hosted by Backspace floating like half-submerged buoys in the river. Descriptions of Backspace as an Internet cafe or gallery just show the inability of common language to describe what it was. It was a hub where people with all kinds of ideas – whether they were related to the Internet or not – came together to talk, organize, share. Backspace was a crucible of London's net art scene and digerati, where events such as the legendary "Anti with E" (http://www.irational.org/cybercafe/backspace/) conferences and lectures took place. Backspace also quickly became known for its regular live-streaming sessions, at first mainly radio, later also video, with Captain Gio D'Angelo often in command6.

That was only made possible because Backspace shared a leased line with Obsolete, who were just upstairs. The other small outfits in the area, on the same street but not in the same building, also wanted a share in the bandwidth bonanza. At first some sort of grey-area solution was considered, like finding a way of connecting the buildings via Ethernet, but that turned out to be impossible unless one broke the law or, legally of course, dug up the street as a provider company. At some point, someone must have stumbled over WiFi, or AirPort, as the version promoted by Apple was called. A lot of people in Clink Street were designers and thus Apple users, and Apple at the time was the first major consumer computer company to support WiFi, under the name “AirPort” (see the history of WiFi by Vic Hayes et al.7).

The creative cluster of artists, designers, musicians and entrepreneurs experienced the benefits of broadband and also the laws of network distribution. As Stevens and Priest noted, the maximum bandwidth available was only relevant at peak times, when everybody was on-line checking into the system, or after work, when people played games or watched videos. Otherwise, the 512K connection, which today would not be considered broadband anymore, gave everyone enough room to live, listen to music, build web pages or even play on-line games. But the bandwidth paradise on the southern shore of the Thames was not to last.

Winchester Wharf was sold as part of a general regeneration drive along the southern river bank, at a time when Tate Modern opened and the whole area underwent a wider transformation. “Between us, we both had an axe to grind when Backspace was closed, we sat together and talked about it and thought it was a good moment to put into practice some of the ideas that we have hatched and some of the things that we have experimented with,” remembered James in 2003. In late 1999, they organized a first workshop to start building Consume, in the offices of I/O/D, one of a web of companies and art groups in the area. For James Stevens, it was from the beginning a "social thing". "The idea that came out was much more straight forward than it looks now, but it was interlinking locations where people work and live using this wireless stuff. We did it already across the street, so that sort of scale where we had a grasp.” (James Stevens, interview with the author, 2003).

Consume workshop at the studio of AmbientTV.NET, London circa 2003

I received this invitation and remember that I was electrified (although it turned out that I could not participate in that first meeting). I knew that James Stevens was on to something. As he later put it in his own words, "it was on the cusp of a wave of awareness that was sweeping around, also economically we were in a funny state, in a kind of decline of the swell after all of that gluttony of that Dotcom shit." Within the space of a few years the Net had been completely transformed from a colourful space dominated by various leftist and creative types to a place apparently ruled and defined by multinational corporations.

The early WWW had generated a lot of enthusiasm about free speech and the possibilities of political self-organisation. It was seen as an electronic Agora, a place where democracy could be reinvented through participatory, electronically mediated processes. Yet in the eyes of the media, all attention was devoted to Internet startups such as Netscape and Amazon, who made billions with their IPOs. Ideas about freedom of speech and creative expression, held dear at places such as Backspace, were largely absent from public discourse. But in late 1999 the stock market boom had started to flounder, and in April 2000 the Nasdaq collapsed. Suddenly the pendulum swung back, and ideas about freedom of speech and political self-organisation returned. The call for the first Consume workshop was met with “a phenomenal response” according to Stevens. The question they asked themselves was: the technology had been shown to work in a relatively confined area. Could it be made to work over a mile or two? Could different areas be connected into a Wide Area Community Network? Stevens:

“There was a momentum there, in that way, because it grasped peoples attention and got them to come out, literally, just physically to turn up, gather at a meeting, and really, the second meeting that we had, we built nodes. It was really just like as direct as that: physically turn up and do it; those who could handled the Unix side of it, which is not everybody, obviously.” (Stevens 2003)

This workshop must have been in the first half of 2000. What they were out to do “was to provide ownership of network segments to self-provide those services and in addition to that do all sorts of node-to-node kind of benefits”, explained Stevens. But routing was soon found to be a core issue to be solved. The nodes deployed in such a network had to mesh, and this had to happen automatically.

This was a grave problem in 2000, since the Internet by then had been thoroughly commodified, and chunks of it handed over to companies, who could define their chunk as their "country", an Autonomous System, controlling the entry points to their part of the network. Routing between such systems is handled by the Border Gateway Protocol (BGP), and on such a technical level there is nothing to be said against it. However, it introduces a more tree-like, hierarchical structure, which is compounded by the problem of scarce, exclusively assigned public IP addresses. Due to the cascaded nature of networks, with many layers, users in internal networks are often connected via Network Address Translation (NAT). That means that the provider's router controls the connection to the outside world, and a node behind it is visible to the world only through that gateway. In other words, there is no publicly visible route to such a machine. If a lot of the people who share their network connectivity via wireless have such a provider, routing in the network becomes a problem. There are workarounds for that problem, but this is just one aspect of a protracted sequence of issues regarding wireless and routing.8
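
A small illustration of that point, using example addresses: the ranges reserved for internal networks (RFC 1918) are not routable on the public Internet, which is exactly why a node that only has such an address can be reached from outside only through its NAT gateway. The addresses below are just examples; 8.8.8.8 is a well-known public resolver address.

    # Addresses from the RFC 1918 private ranges (10/8, 172.16/12, 192.168/16)
    # are not routable on the public Internet; a node that only has such an
    # address is reachable from outside only via its NAT gateway.
    import ipaddress

    for addr in ("192.168.1.23", "10.0.42.7", "8.8.8.8"):
        ip = ipaddress.ip_address(addr)
        status = ("private, only reachable via its NAT gateway"
                  if ip.is_private else "publicly routable")
        print(f"{addr:>15}  {status}")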

At that point, in the year 2000, mesh networking technology was really in its infancy. Through the launch of Consume, a lot of gifted people started to get interested in mesh networking and similar ideas. It is fair to say that community networks took mesh technology out of the military closet and turned it into a working technology (a story which continues today with great intensity and to which I will return later in this book).

The way Consume grew, initially, owed much more to the special "genius" of James Stevens than to any technology. "Genius", a term usually reserved for artists or sometimes also scientists, in this context refers to a social skill. James Stevens has a special way of “growing” projects, of initiating them, bringing them into existence but then letting them go their own way. Rather than becoming the leader and figurehead, he tries to initiate a self-perpetuating idea. Maybe this also has something to do with his past in the underground music and squatter scene of the 1980s. Politically, those social scenes were, if not explicitly anarchist, connected with a deep-seated social and artistic liberalism that I found to be much more entrenched in England than in any other country I know.

For Stevens and Priest it was a long-term goal to “find an opportunity, within the legislation of radio spectrum, to use these domestic computer devices to interlink in a way that it was deemed possible to bypass the local loop”, argued Stevens. For him, what became a priority was advocacy, “promotion of systems that create a mesh over the topography. […] You just have to propagate the idea or possibility or potential across the landscape.” And that is what happened in the years 2000 to 2002. While Julian Priest had to take a step back for a little while for private reasons and because of a move to Denmark, James Stevens and a small but fast-growing group of volunteers were building Consume, a self-propagating net. A Consume mailing list and a website were launched. But the main mechanism for propagating the idea were the workshops. There were a number of workshops in spring and summer of 2002, one at the studio of Manu Luksch and Ilze Black, another one at Limehouse Town Hall, which I remember vividly.

The workshops offered something for everyone. First and foremost, they gave people in a particular area the opportunity to meet and discuss the possibilities of creating a local wireless community network. This involved the social side of getting to know other people in the area. This may not sound like much, but in London talking to neighbours is seen as something quite radical. The only apparently banal act of "talking to neighbours" went together with exploring the cityscape for suitable locations for antennas and repeaters.

Those who were inclined to do so built antennas, an activity that proved to be quite attractive to a diverse range of people. It is also something that turns the rather abstract idea of the network into something that can literally be grasped. Antenna building also involves learning about basic physics and the electromagnetic spectrum, which is very useful in a world pervaded by electronic devices.

Other workshop participants turned to the software side of things. At the time, old computers were used as wireless routers. They were taken apart, reassembled, equipped with network cards, turned into Linux machines and then configured with bespoke experimental routing software. The issues that arose with regard to routing and networking were publicly and hotly debated, which, in my case, triggered a steep learning curve. This was the time when I started to gain knowledge of IP numbers, address spaces, NATing and port forwarding, and, last but not least, routing protocols.

Screenshot of Consume Node Data Base of UK in text mode

As Stevens and a core group of supporters traveled up and down the country, workshopping, talking, advocating, Consume quickly developed a national dimension. Networks and nodes popped up all over the country. The vibrancy of Consume was based on the support it found among a wide range of people across the UK. Stevens advocated a model of decentralised person-to-person communication, realized via self-managed nodes. Decentralisation was at the core of the idea, politically as well as technologically. The network was not centrally owned and managed but came together as a result of the activities of many independent and self-motivated actors. James Stevens at the time argued:

“Creating any sort of infrastructural layer on the landscape, in an environment or the community, that's something that has always been left to the councils or commercial entities, but this is something that can be pulled out from the ground at any level almost really. A school can just decide to put up an access point: utilize, redistribute, in order to legitimately pass the network that it has got from its council network and say its available throughout the school without any wires.” (James Stevens 2003)

Stevens wanted to demonstrate that large infrastructural projects could be realised in a bottom-up manner, through processes of self-organisation and through the mobilization of social capital (rather than financial capital). This was only possible because Consume attracted some very gifted people, such as the Russian artist-engineer Alexei Blinov, founder of Raylab, later Hivenetworks; hacker-programmer-techies such as Jasper, who programmed the Consume node database, and BSD core developer Bruce Simpson; and network admin wizards such as Ten Yen and Ian Morrison. Other people who participated, such as Saul Albert and Simon Worthington, co-founder of Mute Magazine, could be described as non-commercial social entrepreneurs; their strength was also advocacy, creating ideas of their own and pulling in people and resources; the same can be said of artists and curators such as Manu Luksch, Ilze Black and myself, who, for a while, also belonged to the core of the London free network scene (I will dedicate a special chapter to art and wireless community networks later in this book). Another core participant was Adam Burns, who can claim to have had the same idea, more or less, by himself, and had set up the first wireless free network node in Europe, free2air.org.

You are Free 2 Air Your Opinion

Adam Burns and Manu Luksch explore skies over East London. Photo: David

While Consume was an early project, as an actually existing free network in London it had been preceded by free2air.org. Free2air.org was the virtual flag flown by Adam Burns, of Australian descent. In his day job he managed firewalls for financial groups; in his spare time he had set up an omni-directional antenna on a building on Hackney Road, just above the bus stop and the halal chicken shop. From there, everybody who was in range could pick up a signal.

"To my knowledge I am not aware of any other facility in Europe offering totally open network access like this. I do not want to know the name, the address, the credit card number, the colour of the eyes or hair of anyone who connects through to this network, that's unimportant to me. And I don't feel that this is a necessary requirement." (Adam Burns, Interview with Armin Medosch)

At the time of the interview, which must have taken place in autumn 2002, Adam Burns claimed that free2air had been active for 18 months. Thus, from late 1999 or early 2000, free2air.org, hosted by a machine called Ground Zero, offered free wireless Internet access to everyone passing through. Adam Burns had been involved with small ISPs in Australia in the early 1990s, when providing Internet access was done more out of ethical conviction than business sense. This background had inspired his keen sense of networking as a social project.

“Free2air is a contentious name, but one that I have chosen to use. Basically it has a dual meaning: once you establish such a network the cost of information travel is free. It's not a totally free service to establish, you need to buy hardware, you need computer expertise and so on. But the whole idea of ongoing costs are minimal. Secondly, what I liked about it is the plans for a distributed open public access network. It gets rid of the idea of a central ISP, in other words, globally around the world, when we are talking about the Internet or censorship or pedophiles hanging out, or bomb makers, there is a lot of concentration on what really goes on in networks. When you have got a lot of people passing information directly to each other its very hard to track down what information has and has not passed and how it got air. So there is a double meaning to free2air, it also means you are free to air your expressions without concern or problems in getting that message through.” (Adam Burns, 2002)

East End Net

an omni directional antenna by Consume, a spire and the towers of the financial center, East London 2002

Adam Burns became a central person in the London wireless scene around Consume and what came to be known as East End Net. The idea was to connect Limehouse Town Hall with the area around the office of Mute magazine at lower Brick Lane, and somehow also to connect Bethnal Green and central Hackney. That last bit was also where I lived at the time. While the large version of East End Net never materialised, we had our local version of it, with a connection from free2air.org to the “compound”, a large workspace building for small industries at the bottom of Broadway Market in E8.

With AmbientTV.net's help, the connection was spread wirelessly and by wire through the building. For several years a community of changing size, between 20 and 40 or 60 people, inhabited a chunk of the net. Due to the social composition of this area, a number of art projects using the free WiFi took place. I will turn to those projects at a later stage.

East End Net: The Original Map

While East End Net was never built in the way it was supposed to be, the discussions and the focus that it generated were highly productive in a number of areas. Several lines of flight take off from this point, all of which will have to be followed separately – so I will just hint at some of those ideas in overview form. The hand-drawn original map of East End Net was the starting point for a lot of ideas about the mapping of wireless networks, but also ideas about communal map making as such. It was the time when the OpenStreetMap project began, as it was recognised that even something as complex as a map could be built in a decentralized way by unpaid volunteers.

Consume's NodeDB, as already mentioned, was a quite early and successful attempt at building a website that facilitated registering a free network node through wiki-like functionality. The idea was that the database would not only contain technical information about nodes, but also additional information about services offered. In this way, the NodeDB would become the focus of community development and of micro-ecologies of small business, art, culture and activism.
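
As a purely hypothetical illustration – not Consume's actual schema – such a node entry might combine technical data with community information roughly along these lines (the names, coordinates and services below are invented):

    # Hypothetical shape of a node database entry; not Consume's actual schema.
    from dataclasses import dataclass, field

    @dataclass
    class NodeRecord:
        name: str
        owner_contact: str
        latitude: float            # shown on the community map
        longitude: float
        status: str = "planned"    # e.g. planned / testing / operational
        services: list = field(default_factory=list)

    node = NodeRecord(
        name="example-rooftop-node",
        owner_contact="someone@example.org",
        latitude=51.51,            # illustrative coordinates only
        longitude=-0.03,
        status="operational",
        services=["internet gateway", "local file share", "net radio stream"],
    )
    print(node)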

The communal building of a wireless mesh network over a large part of a metropolitan area also raised issues about ownership and responsibility. While, as we shall see, in Germany the discussion was from the very start dominated by anxiety about the legal repercussions of sharing an Internet connection, in London the discussion was about the notion of the commons. It was through Julian Priest that I was introduced to the work of Elinor Ostrom, who successfully contested the hegemony of the Tragedy of the Commons thesis – work for which she later received the Nobel Prize in economics. We started to discuss the implications of treating the network as a commons and sought ways of affirming this status of the network commons.

For me, personally, two fundamental insights emerged from my involvement at the time. Through participating in workshops and talking to techies, I started to understand a bit more of what happens behind the surface of the screen when one clicks on a web page or sends an email. As I gained insight into how networks function technically, I experienced this as a form of empowerment. In my view, everyone should understand at least a little bit about how networks work. Why? Because networking is not just about moving around bits and bytes; it is about communication, freedom of speech, democratic participation, the freedom to learn things. One big problem that we have in societies such as ours is that the division of labour imposed on us creates categorical separations between things that should be seen and understood as belonging together. Building and maintaining telecommunication networks is seen as a technical task, but it affects fundamental human rights and social issues. Thus, everybody should have at least some idea of how it works, as they otherwise cannot meaningfully participate in Network Society.

Thus, as a grand thesis I would like to introduce here, I propose that the involvement of ordinary people in building a network commons has a profound emancipatory effect. In particular, as the process allows people to learn more about the structure and functioning of the Internet, they gain a better understanding of what they can potentially achieve in societies and, no less important, of how to protect themselves from the harmful effects of information abuse by corporations and governments. As people learn how networks work, they can become teachers of the free network spirit. They will understand that they can become part of the network (and not only be users of a service provided by a corporation or the state), and can bring to it their own specialisations and ideas. Through that, the idea of the network also gets enriched.

Thus, the second part of the thesis is that free networks contribute to the democratisation of technology. Conventionally, technology is considered to be developed behind the closed walls of research labs. There, gods in white (or in jeans and black sweater vests) develop the technologies of the future, which the grateful people then consume as a commodity. In the way wireless community networks function, by contrast, the development of cutting-edge technology is opened up to wider mechanisms of participation. This second part of the thesis is almost confirmed already by the existence of a project such as the EU project Confine. Through the involvement of community networkers in shaping future technologies, those technologies become less elitist, less controlled by narrow commercial or security interests. The original peer-to-peer spirit of the net gets enhanced and made fit for the future in a network commons that is there to protect our democratic freedoms and rights.

Next, Chapter 2: Consume the Net - the internationalisation of an idea (watch this space ...)

  • 1. Stone, Allucquère Rosanne. The War of Desire and Technology at the Close of the Mechanical Age. MIT Press, 1996.
  • 2. Rheingold, Howard. The Virtual Community: Homesteading on the Electronic Frontier. MIT Press, 1993.
  • 3. Grundmann, Heidi. Art + Telecommunication. Vancouver, [B.C.]: Western Front Publication, 1984.
  • 4. I have written more extensively about this in Medosch, Armin. “Kreative Milieus.” In Vergessene Zukunft: Radikale Netzkulturen in Europa, 1., Aufl., 19–26. Bielefeld: Transcript, 2012.
  • 5. James Stevens, interview with the author, June 2003, private notes
  • 6. See the article by Josephine Berry: http://www.medialounge.net/lounge/workspace/crashhtml/cc/23.htm
  • 7. Lemstra, Wolter, Vic Hayes, and John Groenewegen. The Innovation Journey of Wi-Fi: The Road to Global Success. Cambridge University Press, 2010.
  • 8. Corinna "Elektra" Aichele, a free networker from Berlin, has summed up those problems and possible solutions much better than I ever could in her book "Mesh - Drahtlose Ad-hoc Netze", Open Source Verlag, 2007


Free network banner

Consume the Net: The Internationalisation of an Idea (chapter 2, part 1, draft)

Wireless Community Networkers

This chapter starts out with a summary of the achievements of Consume.net, London, and then traces the development of this idea: how it was spread, picked up and transformed by communities in Germany, Denmark and Austria. The internationalisation of the free network project also saw significant innovations and contributions, with groups such as Freifunk developing a richer and more sustainable version of the network commons.

In London, Consume had developed a model for wireless community networks. According to this idea, a wireless community network could be built by linking individual nodes which would together create a mesh network. Each node would be owned and maintained locally, in a decentralized manner, by a person, family, group or small organisation. They would configure their nodes in such a way that they would link up with other nodes and carry data indiscriminately, regardless of where it came from and where it was going. Some of those nodes would also have an Internet connection and share it with everybody else on the wireless network. Technically, this would be achieved by using ad-hoc mesh network routing protocols, but those were not yet a very mature technology. Socially, the growth of the network would be organised through workshops, supported by tools such as mailing lists, wikis and a node database, a website where node owners could enter their node together with some additional information, which was then shown on a map. Within the space of two years, this proposition had become a remarkable success.

Consume nodes and networks popped up all over the UK. Consume had made it into mainstream media such as the newspaper The Guardian http://www.theguardian.com/technology/2002/jun/20/news.onlinesupplement1 . The project also successfully tied into the discourse on furthering access to broadband in Britain. The New Labour government of Tony Blair was, rhetorically at least, promising to roll out broadband to all as quickly as possible. This was encountering problems, especially in the countryside. The incumbent, British Telecom, claimed that in smaller villages it needed evidence of sufficient demand before it made the local telephone exchange ADSL-ready. ADSL is a technology that allows standard copper telephone wire to be used for higher transmission rates. The Access to Broadband Campaign (ABC) occasionally joined forces with Consume. The government could not dismiss this as anarchist hackers from the big city: these were “good” business people from rural areas who needed the Internet to run their businesses, and BT was not helping them. Consume initiator James Stevens and supporters traveled up and down the country, doing workshops, advocating, and talking to the media and local initiatives.

BerLon

In 2002 the opportunity arose to bring Consume to Berlin. Although living in London, I had been working as co-editor in chief for the online magazine Telepolis for many years, so I knew the German scene quite well. After quitting Telepolis in spring 2002, I traveled to Berlin to renew my contacts. The curator of the conference Urban Drift, Francesca Ferguson, asked me to organize a panel on DIY wireless and the city. This gave me the opportunity to bring James Stevens and Simon Worthington to Berlin, as well as nomadic net artist Shu Lea Cheang.

The idea emerged to combine our appearance at Urban Drift with a workshop that would bring together wireless free network enthusiasts from London and Berlin. Taking inspiration from Robert Adrian X's early art-and-telecommunication projects, we called this workshop BerLon, combining the names Berlin and London. Between 1979 and 1983 Robert Adrian X had connected Wien (Vienna, Austria) and Vancouver, Canada through four projects under the title WienCouver http://kunstradio.at/HISTORY/TCOM/WC/wc-index.html.

Our organisational partner in Berlin was Bootlab, a shared workspace in Berlin Mitte where many people interested in unconventional uses of new technologies had a desk. Some Bootlabers were running small commercial businesses, but most of them constituted the critical backbone of Berlin's network culture scene. Bootlab was a greenhouse for new ideas, a little bit like Backspace had been in the late 1990s in London. Our hosts at Bootlab were Diana McCarthy, who did the bulk of the organisational work, and Pit Schultz, who had, together with Dutch network philosopher Geert Lovink, invented the notion of net-critique and initiated the influential mailinglist nettime.

A little bit of additional money for travel support from the Heinrich Böll Foundation, the research and culture foundation of the German Green Party, enabled us to fly over some more networkers from London, such as electronics wizard Alexei Blinov and free2air.org pioneer Adam Burns. And as is often the case with such projects, it developed a dynamic of its own. Julian Priest came from Denmark, where he lived at the time, and brought along Thomas Krag and Sebastian Büttrich from Wire.less.dk. Last but not least, there were people from Berlin who had already experimented with wireless networking technology, among them Jürgen Neumann, Corinna "Elektra" Aichele and Sven Wagner, aka cven (c-base Sven).

The rest is history, so to speak. I would be hard pressed to recall in detail what happened. Luckily, the Austrian radio journalist Thomas Thaler was there. His report for Matrix, the network culture magazine of Austrian public radio ORF Ö1, gives the impression that it was a bit chaotic, really. There was no agenda, no timetable, no speakers list. Sometimes somebody grabbed the microphone and said a few words. As Thaler wrote, "London was clearly in the leading role" in what would have to be filed under "informal exchange." Most things happened in working groups.

One group discussed the networking situation in Berlin. There had already been initiatives to create community networks in Berlin, one called Prenzelnet, another one Wlanfhain (Wlan Friedrichshain). As an after-effect of the reunification of Germany, there were areas on the eastern side of Berlin that had OPAL, a fiber-optic access network, which made it impossible to use ADSL. What also needs to be accounted for is the special housing structure of Berlin.

As an after-effect of Berlin first having been an enclave of Western "freedom" and then having a wild East of occupied houses and culture centers right in its center, in Berlin Mitte and neighbouring areas, a relatively large number of people live in collective housing projects. These are not small individual houses but large apartment blocks, collectively owned. Freifunk initiator Jürgen Neumann lives in such a housing project, which was affected by the OPAL problem, so that 35 people shared an expensive ISDN connection. After learning about WLAN, he built a wireless bridge to an ISP for his housing association and spread the connection around the block. Other people who were already experimenting with wireless networks before BerLon were c-base and Elektra.

Another working group dealt with the question of how to define the wireless networking equivalent to the licensing model of Free Software, the General Public License (GPL). From Berlin, Florian Cramer, an expert on Free Software topics, joined this discussion. The question of a licensing model for Free Networks caused us quite a headache at BerLon, and we did not really find a solution there, but we managed to close in on the subject enough to finish the Pico Peering Agreement at the next meeting in Copenhagen.

At BerLon, Krag and Büttrich also reported on their engagements in Africa. There, well-meaning initiatives trying to work with Free and Open Source technology often encounter socially difficult and geographically rugged environments.

I can't claim to know in detail what happened in the other working group, the one on networking in Berlin, but the result is there for everyone to see. This was the moment of the inception of Freifunk, the German version of wireless community networking. Freifunk (which, in a word-for-word translation, means simply "free radio transmission") is today one of the most active wireless community networking initiatives in the world. Ironically, while Consume is defunct today, Freifunk became a fantastic success story. With German "Vorsprung durch Technik," Freifunk volunteers contributed significantly to the praxis of wireless community networks; in particular, the Freifunk firmware distribution and the adoption and improvement of mesh networking made important contributions to inter-networking technology. Freifunk's existence, vibrant and fast growing in the year 2014, is also testimony to the social viability of the Consume idea.

However, I am not claiming that Freifunk simply carried out what Consume had conceived. That would be a much too passive model of transmission. Freifunk, just like Guifi, contributed significant innovations of its own. Nor am I claiming that Freifunk jumped out of the BerLon meeting like the genie out of the bottle. A number of significant steps were necessary. However, it is also undeniably the case that BerLon provided the contact zone between Berlin and London. This set in motion a process which would eventually lead to a large and successful community network movement.

Jürgen Neumann and a few other people from Berlin decided to hold a weekly meeting, WaveLöten (wave soldering), every Wednesday at c-base, starting on 23 October 2002, very soon after BerLon. WaveLöten was an important ignition for Freifunk in Berlin. As Neumann said, the lucky situation was that there was a group of people who understood the technical and social complexity of this and each started to contribute to the shared project of the network commons – Bruno Randolf, Elektra, cven (c-base Sven), Sven-Ola Tücke and others on the technical side, Monic Meisel, Jürgen, Ingo, and Iris on the organisational and communicative side.

Why could Freifunk thrive in Berlin and Germany while Consume lost its dynamic in the UK? The answer is not simple, so I am just raising the question here; it will pop up throughout this book. What makes a wireless community network sustainable? Why do some communities thrive and grow while others fall asleep?

Copenhagen Interpolation and the Pico Peering Agreement

BerLon was followed, on March 1st and 2nd 2003, by the Copenhagen Interpolation. On this occasion the Pico Peering Agreement was brought to a satisfactory level. I am happy to have contributed to writing it, and as this story has developed since, it has found some implementation. The Denmark meeting was also quite small. There were people from Locustworld, the Wire.Danes, Malcolm Matson, and Jürgen Neumann, Ingo Rau and Iris Rabener from Berlin. In Copenhagen they decided to hold the first Freifunk Summer Convention in Berlin in September 2003.

At BerLon we had discussed the social dimensions of free networking. What were the "social protocols" of free networking? The answer was to be given by the Pico Peering Agreement, a kind of bill of rights for wireless community networking.

It had all begun with discussions about how to improve the NodeDB. James Stevens expressed his desire that a node owner could also choose a freely configurable license – to create a bespoke legal agreement on the fly for their network on the basis of a kind of licensing kit. The node owner should be able to choose from a set of templates to make it known to the public what their node offered and under which conditions. This work should be done with the help of lawyers so that node owners could protect themselves. This seemed a good idea but was way too complicated for what our group was able to fathom at the time. We needed something much simpler, something that expressed the Free Network idea in a nutshell.

The success of Free Software is often attributed to the "legal hack," the GPL. This is a software license which explicitly allows users to run, copy, use and modify software, as long as the modified version is again put under the GPL. This "viral model" is understood to have underpinned the success of Free Software. Today, I am not so sure anymore whether this is really the main reason why Free Software succeeded.

Maybe there were many other reasons: that there was a need for it, that people supported it with voluntary labour, or that the development model behind Free Software, the co-operative method, simply resulted in better software than the closed model of proprietary software with its top-down hierarchical command system. In any case, we thought that Free Networks needed an equivalent to the GPL in order to grow. But how to define such an equivalent?

With software, there is one decisive advantage: once the first copy exists, the costs of making additional copies and disseminating them through the net are very low. Free Networks are an entirely different affair: they need hardware, which costs money, and this hardware is not just used indoors but also outdoors, where it is exposed to weather and other environmental influences. Free Networks cannot really be free as in gratis. They need constant maintenance and they incur not inconsiderable costs.

The mental crutch that got us there was the sailing-boat analogy. If there are too many sailing boats at a marina, so that not all of them can berth at the pier, boats are berthed next to each other. If you want to get to a boat that is further away from the pier, you necessarily have to step over other boats. It has become customary that you are allowed to walk over other boats in front of the mast. You don't pass at the back, where the more private areas of the boat are – the entrance to the cabins and the steering wheel – but in front of the mast. In networking terms, that would be the public, non-guarded area of a local network, also known as the demilitarized zone (DMZ).

We agreed that it was a condition of participation in a free network that every node owner accept to pass on data destined for other nodes without filtering or discrimination. We can claim that we defined what today is called network neutrality as the centerpiece of the Copenhagen Interpolation of the Pico Peering Agreement: http://www.picopeer.net/PPA-en.html.

While it is important, and I am happy to have contributed to it, I see things slightly differently today. I think the real key to Free Networks is the understanding of the network as a commons. The freedom in a network cannot be guaranteed by any license but only by the shared understanding of the network commons. The license, however, is an important additional device.

Fly Freifunk Fly! (Chapter 2, part 2, draft)

The Copenhagen Interpolation had instilled confidence in the very small number of participants, including a delegation of three from Berlin. In Berlin, the domain Freifunk.net was registered in January 2003. The name was coined by Monic Meisel and Ingo Rau over a glass of red wine. Their initial impulse, according to Monic Meisel, was to create a website to spread the idea and make the diverse communities that already existed visible to each other. They wanted a domain name that would be easily understood, a catchy phrase that conveyed the idea.

Early Map of Berlin Backbone, courtesy Freifunk

Freifunk is a good name. It carries the idea of freedom, and the German word "Funk" has more emotional pull than "radio." Funk is funky. The German word "Funke" means spark; early radios actually created sparks to generate electromagnetic waves. "Funken" thus means both to create sparks and to transmit wirelessly. Meisel, who at the time worked for a German web agency, also created the famous Freifunk logo and the visual identity of the website.

 Freifunk Logo by Monic Meisel, image courtesy Freifunk

It seems that Freifunk took off because of a combination of reasons. It quickly found support among activists all over Germany, not just Berlin. It had people who had a good understanding of technology and made the right decisions. And Freifunk did very good PR from the start. Jürgen Neumann quickly emerged as a spokesperson for the fledgling movement. However, he could always rely on other people around him to communicate the idea through a range of different means. Freifunk was, from the start, more of a network of people than Consume ever was. When James Stevens decided to stop promoting Consume, it ceased to exist as a nationwide UK network of networks.

In spring and summer 2003 the Freifunk seed was sprouting in Berlin. I was writing my German book and started to put draft chapters into the Freifunk wiki. Freifunk initially grew quickly in Berlin, in particular in areas that had the OPAL problem and thus could not get broadband via ADSL, as the Wayback Machine shows.

In June 2003 the Open Culture conference, curated by Felix Stalder in Vienna, brought together a number of wireless community network enthusiasts. There, Eben Moglen, the lawyer who had helped write the GPL, gave a rousing speech. His notes consisted of a small piece of paper on which he had written:

free software – free networks – free hardware.

Eben Moglen at OpenCultures conference 2003; Image courtesy t0 / WorldInformation.org

The holy trinity of freedom of speech and participatory democracy in the early 21st century. His speech was based on the Dotcommunist Manifesto, which he had published earlier that year. Moglen skillfully paraphrased the communist manifesto by Marx and Engels, writing: "A Spectre is haunting multinational capitalism--the spectre of free information. All the powers of 'globalism' have entered into an unholy alliance to exorcize this spectre: Microsoft and Disney, the World Trade Organization, the United States Congress and the European Commission." Moglen argued that advocates of freedom in the new digital society were inevitably denounced as anarchists and communists, while actually they should be considered role models for a new social model, based on ubiquitous networks and cheap computing power. His political manifesto pitted the digital creative workers against those who merely accumulate and hoard the products of their creative labour.

While sharply polemical and as such maybe sometimes a bit black and white in its argumentation, Moglen's Dotcommunist Manifesto is correct insofar as it lays out a social conflict which characterizes our time and is still unresolved. The new collaborative culture of the Net would in principle enable a utopian social project, where people can come together to communicate and create cultural artifacts and new knowledge freely. This world of producers he juxtaposes with another world which is still steeped in the thinking of the past, which clings to the notion of the production of commodities and which seeks to turn into commodities things that simply aren't. This is the world of governments, of corporations and of lobbyists who make laws in their own interest which curtail the freedom and creative potential of the net.

There is no reason why a network should be treated as a commodity. The notion of access to the Internet is, as the free network community argues, a false one. The Internet is not a thing to which one gets sold access by a corporation. As a network of networks, everybody who connects to it can become part of it. Every receiver of information can also become a producer and sender of information. This is realized on the technical, infrastructural layer of the net, but it has not yet filtered through to mainstream society.

Freifunk Summer Convention 2003

In September 2003 the first Freifunk Summer Convention, FC03, happened in Berlin at c-base. This memorable self-organised event, from September 12 to 14, brought together a range of people and skills and gave some key impulses to the movement to build the network commons. Among the people who had joined of their own volition were activists from Djursland, a district in the north-east of Denmark, a rural area with economic and demographic problems. Djurslands.net demonstrated for the first time that you could have a durable, large-scale outdoor net with a large number of nodes. The people from Djurslands.net brought a fresh craftsman approach to free networking, with solidly welded cantennas (antennas made from empty food cans). At the Freifunk convention it was decided to hold the next community network meeting in Djursland in 2004, which turned out to become a major international meeting of community networkers in Europe.

 Map of Djurslands.net

Reports conflict as to whether Bruno Randolf showed the mesh-cube, a technology he developed for a company in Hamburg, at FC03. According to a recent entry on the timeline, the development of the Meshcube only began in earnest after FC03. At the time, Julian Priest wrote in the Informal Wiki:

"Bruno Randolf ran mesh routing workshop. After a good discussion covering the main mesh protocols and solutions, aodv, mobile mesh, scrouter, and meshap, around 10 - 15 linux laptops were pressed into service as mesh nodes using the mobile mesh toolset. Tomas Krag crammed a couple of wireless cards into his laptop, (which kind of fitted), and ran the border router and others stretched the network around the buildings. Many discussions about how to assign ip addresses in the mesh followed, maybe ipv6 mobile ip and zero conf can be ways forward here. Bruno demoed the jaw dropping 4G mesh cube.. 4 cm cube sporting up to 4 radios, smc type antenna connectors, a 400 Mhz mips 32Mb flash 64M ram, with power over ethernet and usb, currently running debian. A space to watch for sure."

The Meshcube made use of small industrial chips optimized for running an embedded Linux distribution. It worked with the AODV routing protocol (CHECK). Another early protocol was OLSR, developed by Andreas Tønnesen as a master's thesis project at UniK, the University Graduate Center in Oslo. However, it seems that at FC03 it was Mobile Mesh that was discussed and tested. See this early entry on meshing in the Freifunk wiki, probably written by Elektra.

Thus it is confirmed that on a mild day in September 2003 in Berlin, a couple of dozen geeks could be seen walking around the streets with laptops, making remarks about pings and packets that were incomprehensible to ordinary passers-by. This was the beginning of a long and fruitful engagement of free network communities with mesh routing protocols (see also this report from 2003).

Shortly after FC03, the Förderverein Freie Netzwerke was founded, a not-for-profit organisation whose aim was the furthering of wireless community networks. The convention had also mobilized a television crew, who made a short film (in German) for Polylux TV (Real Video stream): http://brandenburg.rbb-online.de/_/polylux/aktuell/themen_jsp.html

It shows a number of free network advocates including this author at a slightly more youthful age.

As the video makes evident, Freifunk advertised itself from the start as a social project about communication and community. Freifunk created an efficient set of tools to be picked up as a kind of community franchise model, as Jürgen Neumann calls it. There is the Freifunk website with a strong visual identity, and the domain name, which also works as the ESSID of the actual networks. Everybody can pick up a Freifunk sub-domain and start a project in a different locality. Freifunk initially grew out of Berlin's creative new media scene, so that from the very start interesting videos and other new media content were produced.

Another decision that would prove beneficial was that Freifunk started early on to build a Berlin Backbone: long-distance connections between high-rise buildings with reliable radio links. Freifunk was really good at choosing buildings – and getting access to them – with suitable roofs where weather-proof installations could be made. The idea of the Berlin Backbone was a good one from the start; it gave the community something to experiment with. In an interview with radio journalist Thomas Thaler, Sven Wagner advocated the Berlin Backbone as a network linking Berlin's big alternative culture centers such as "Tacheles, CCCB, bootlab, Lehrter Kulturfabrik and c-base and a few other projects". Wayback Machine link: Thomas Thaler, ORF Matrix, 19 November 2003, transcript.

I believe that for those early long-distance connections mesh-cubes were used. Those links, however, did not mesh, as they were set up on fixed routes. But from those points bandwidth was then redistributed. Thus, from early on a Berlin Backbone grew, as shown in this image, which appears to be from July 2003. (Meanwhile, the Berlin Backbone receives financial support from the regional government – more about that in a future installment of this story.)

Berlin Backbone Summer 2003

In London, if you look at an early map of East End Net, the dots are there but they are not connected. Between Cremer Street, Free2air.org and Limehouse there was never a connection. This has partly to do with the urban topology of London, partly with the social structure. Everyone is much more commercially minded, even the church.

In May 2002 there was a Consume workshop in Limehouse Town Hall, where networkers discovered the spire of the adjacent church as an ideal antenna mounting point for a long distance connection. The Vicar, however, had already sold access to the spire of his church to a mobile telephone company.

It seems significant that today's Berlin Backbone uses quite a few churches. Another aspect of the social side is that in Berlin it is easier to find people who have time to engage in voluntary labour. The combination of lower costs of living and the remainder of a welfare state makes it easier for socially motivated techies to devote unpaid labour time to such projects. In London, that capitalist behemoth, everybody is under permanent pressure to make money, unless one is very privileged or young enough to live in insecure squats. Such observations, however, should not lead us to false conclusions. Around 2003-04, Consume was still very innovative and dynamic, while Freifunk was also developing rapidly.

If we follow this list of links from the Wayback Machine https://web.archive.org/web/20030723203256/http://freifunk.net/wiki/FrontPage then we can see that in autumn 2003 there were already quite a few initiatives. The timeline, which has recently been begun as a cooperative work, shows similar results. http://pad.freifunk.net/p/ff-timeline

Spring 2003 also saw the early beginnings of Funkfeuer in Austria. Funkfeuer, which means radio beacon, was initially built by the artist Franz Xaver for Silverserver. When the provider decided that this was commercially not viable, the network was taken over by a group of volunteers, among whom was Aaron Kaplan. He had already, together with the Austrian digital civil rights initiative Quintessenz, set up an open WLAN hotspot in Vienna's Museum Quarter (Museumsquartier). Funkfeuer has since successfully branched out to Graz and a number of rural locations.

In the above-mentioned interview, Elektra also made a strong statement in support of meshing technology, expressing confidence that the Free Software community would solve its problems. This confidence would prove to be justified. In autumn 2003 Elektra spoke about joining together a Linux distribution such as Knoppix with everything a wireless community node should be capable of, especially meshing. The protocol under deliberation was still Mobile Mesh, but this would change soon.

[Next ... The Social Technologies of the Network Commons]

The Social Technologies of the Network Commons (Freifunk 2, draft)

The social technologies of the wireless community network are technologies specifically developed to support social goals, such as community networking. Typically, new technologies are developed by large firms or the state. The achievements of wireless community networks demonstrate that there is an alternative: community-based innovation. This chapter presents the genealogy of some of the key technologies needed for wireless community networking and discusses their social content.

A happy accident in June 2003 opened the gates to a wireless cornucopia. Developers on the Linux kernel mailinglist started making comments about a product of the company Linksys, which Cisco Systems had bought in March 2003 for US$500 million. The firmware of the wireless access point WRT54G included both the Linux kernel and other code released under the General Public License (GPL). The company had published the firmware as a binary, but not the source code, which was against the terms of the GPL. As the developer who revealed the GPL violation quite happily added, this meant that a whole family of chips by one wireless chipset maker, Broadcom (802.11b/g), supported Linux.

“Complaints appeared on discussion boards such as LKML and Slashdot claiming that Linksys was violating the GPL by not providing source code for certain code used in its WRT54G wireless access point.” (see Linux Insider: http://www.linuxinsider.com/story/43996.html)

The Free Software Foundation stepped in, leading the campaign for enforcement of the GPL. Cisco was brought to comply by publishing the source code. This enabled a revolution, albeit one that most people have never heard of before: the revolution of firmware flashing.

Based on the source code for the Linksys product, a new initiative called OpenWRT came into existence and began making an increasingly stable Linux distribution specifically for small wireless devices.

The WRT54G became the most popular router for community networks, and this is where OpenWRT got its name from. Meanwhile, OpenWRT works on many embedded devices, such as WLAN routers but also wireless hard-drives and basically everything that networks. (Here is a long list of devices on which it can possibly run: http://wiki.openwrt.org/toh/possible)

Firmware replacement had been the battle-cry of East London artist-engineers such as Alexei Blinov and Adam Burns in 2003 and 2004. It was possible, in principle, even before OpenWRT, but it was very hard. At the time, early ideas for Hivenetworks were being floated around. This project, which went through many instantiations, was initially quite utopian.

It was a bold claim that you could make multi-hop networks using mesh network protocols that use every available snippet of network connectivity, be it Bluetooth, wireless, LAN or what have you; furthermore, those devices should also be self-announcing if they carried any services. Taking inspiration from the zeroconf protocol, the meshing devices would announce whether they offered services such as streams, voice chat, podcasts, Skype, Jabber and others.
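To make the idea of self-announcing devices a bit more tangible, here is a hypothetical sketch in Python, using only the standard library. It is not zeroconf/mDNS and not the actual Hivenetworks code; it merely shows the gist: a node broadcasts a small description of the services it offers, and neighbouring nodes listen for such announcements. The port number and service names are invented.

# Hypothetical sketch of a self-announcing node (not actual zeroconf/mDNS):
# the node broadcasts a small JSON description of its services over UDP,
# and any neighbour listening on the same port learns what is on offer.
import json
import socket
import time

ANNOUNCE_PORT = 5354  # arbitrary port chosen for this example
SERVICES = {"node": "rooftop-a", "services": ["audio-stream", "jabber"]}

def announce_once():
    """Broadcast one service announcement to the local network."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(json.dumps(SERVICES).encode(), ("255.255.255.255", ANNOUNCE_PORT))
    sock.close()

def listen(seconds=10):
    """Collect announcements from neighbouring nodes for a few seconds."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", ANNOUNCE_PORT))
    sock.settimeout(1.0)
    heard = []
    deadline = time.time() + seconds
    while time.time() < deadline:
        try:
            data, addr = sock.recvfrom(4096)
            heard.append((addr[0], json.loads(data.decode())))
        except socket.timeout:
            pass
    sock.close()
    return heard

if __name__ == "__main__":
    # Run announce_once() on one node and print(listen()) on a neighbouring one.
    announce_once()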

With OpenWRT, this utopia came that bit closer, but as this http://www.wireless-forum.ch/forum/viewtopic.php?t=16140 interview with Sven-Ola Tücke also attests, it still needed great skill and tenacity. You needed to set up a build directory on a Linux machine for compiling the firmware. Compiling a Linux kernel on one machine for another, very different machine is difficult; my London friends Alexei and Adam called it cross-chain compilation. Once you had a customized Linux image for your device, you needed to transfer it to the device and "flash" it, that is, replace its existing firmware with the new one.

If anything went wrong, you had not flashed it but "bricked" it: the device's electronics were as dead as a brick. As Alexei Blinov pointed out, that's quite unfair to bricks, because they can actually still be useful; a dead WRT54G is really only electronic toxic trash. Once you had successfully replaced the firmware, you still needed to configure it, all via ssh and the shell. Last but not least, you had to make OLSR work with the firewall. If everything then worked well, you could indeed mesh using cheap, near-ubiquitous devices in autumn 2004. The liberation of hardware to build the network commons through OpenWRT was undoubtedly a great step; now software experimentation could begin.

Another important step was the release of OLSR 0.4.8. In this announcement by Elektra, the advances of this protocol are explained: http://de.indymedia.org/2004/12/101054.shtml The Optimized Link State Routing (OLSR) protocol was initially developed by Andreas Tønnesen as a master's thesis at UniK - University Graduate Center. In the context of Wizards of OS he was invited to give a presentation at c-base. The conference series Wizards of Operating Systems was initiated by Volker Grassmuck and ran from 1999 to 2006. In 2004 we made a panel on free networks and a workshop. The latter was organized by the community, that is, largely by Elektra and Sven Wagner. At the main conference Dewayne Hendricks talked about a 2 Gigabit network for California, constantly referring to the "holodeck".

The workshop was dedicated to mesh networking, and was to have lasting repercussions. The Freifunk community started using OLSR, which gave decisive impulses for its further development. As Elektra writes, as soon as communities started using mesh networks, the technology started to flourish. The community networks have the decisive advantage of having real test conditions. This also underpins the Confine project, which has developed a community testbed. Many problems of routing stem from the wireless medium, which is unpredictable.

At WOS3 a number of meshcubes from 4G Systems were used to create the network at the conference venue. It ran OLSR, and the quality of service was horrible, according to an online posting. But within the space of a few months the metric used was significantly improved. OLSR started to choose routes according to the actual quality of transmission, using the so-called ETX metric (Expected Transmission Count).
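To give a sense of what the ETX metric does, here is a minimal sketch of the textbook definition (not of the olsrd source code): the ETX of a link is the inverse of the product of the measured delivery ratios of probe packets in the forward and reverse direction, and the ETX of a route is the sum of its link values; the router then prefers the route with the lowest total ETX rather than the one with the fewest hops.

# Minimal sketch of the ETX (Expected Transmission Count) route metric.
# ETX of a link = 1 / (df * dr), where df and dr are the measured delivery
# ratios of probe packets in the forward and reverse direction.
# ETX of a route = sum of the ETX values of its links; lower is better.

def link_etx(df, dr):
    """Expected number of transmissions needed for one successful exchange."""
    return 1.0 / (df * dr)

def route_etx(links):
    """Total ETX of a multi-hop route given (df, dr) pairs per link."""
    return sum(link_etx(df, dr) for df, dr in links)

# A two-hop route over two good links...
good = route_etx([(0.9, 0.9), (0.9, 0.9)])   # roughly 2.47
# ...beats a one-hop route over a lossy link, despite having more hops.
lossy = route_etx([(0.5, 0.6)])              # roughly 3.33
print(good, lossy, "-> prefer the two-hop route" if good < lossy else "")

The example shows why this mattered so much in practice: in a wireless mesh, a longer route over reliable links is often better than a short route over a lossy one, something a simple hop-count metric cannot see.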

This was the start of a long story of community development of mesh networking. Whoever wants to get deeper into the technology should read "Mesh - Drahtlose Ad-hoc Netze" by Corinna "Elektra" Aichele. According to her, the release of OLSR 0.4.8 in December 2004 was a major step in mesh routing. We could say that mesh routing is the paradigmatic technology, the one most expressive of this dispositif, because it brings together technological and social goals and advantages. It expresses the ideal that everyone could connect to everyone in a decentralized way.

I will return to the topic of mesh routing but for now stick to the timeline. Some of the "technologies" which communities have developed are rather more like techniques, social technologies in a more direct sense, ways of doing things. Around 2004-05 Cornelius Keller put OLSR-Experiment http://olsrexperiment.de online. This was a website which, following a logic based on Berlin's postcodes, handed out IP addresses to people who wanted their wireless router to be part of Freifunk. The visual logic of this method can be followed here:

As another snapshot shows, in a series of core meetings, the Berlin community decided how to use the IP addresses for a Class B Net.

Logic of address space usage: https://web.archive.org/web/20050219235855/http://www.freifunk.net/wiki/CoreMeeting20041020
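I do not know the exact allocation scheme that was used, but the principle can be sketched roughly as follows: within one reserved "Class B" (/16) range, each postcode area gets its own slice of address space, and node owners receive the next free address from the slice matching their postcode. All ranges and postcode mappings below are invented for illustration and may differ considerably from the real olsrexperiment.de scheme.

# Hypothetical sketch of postcode-based IP allocation for a community network.
# The 10.36.0.0/16 range and the postcode-to-slice mapping are invented.
import ipaddress

COMMUNITY_NET = ipaddress.ip_network("10.36.0.0/16")  # one "Class B" for the city
# Carve the /16 into /24 slices and assign one slice per postcode area.
POSTCODE_SLICES = {
    "10435": ipaddress.ip_network("10.36.35.0/24"),   # example postcode area
    "10997": ipaddress.ip_network("10.36.97.0/24"),   # example postcode area
}

def next_free_address(postcode, already_assigned):
    """Hand out the next unused host address in the slice for this postcode."""
    slice_ = POSTCODE_SLICES[postcode]
    for host in slice_.hosts():
        if host not in already_assigned:
            return host
    raise RuntimeError("address slice for %s is full" % postcode)

assigned = set()
ip = next_free_address("10435", assigned)
assigned.add(ip)
print(ip)  # 10.36.35.1 -- the first node registered in this postcode area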

Last but not least, in autumn 2004 Sven-Ola Tücke started developing the Freifunk Firmware. Initially it was developed for the Berlin Backbone, but it then became the foundation for the whole Freifunk community. The Freifunk Firmware (FFW) brought together OpenWRT and all the software components you needed to run a Freifunk node, in an easy-to-use, web-based installation process.

This is more or less how it still works today. You buy a device from a recommended list of compatible devices. Then you go to the Freifunk webpage and request an IP address. You then get the automatically compiled installation package for your device. You install the firmware via the web interface, enter the IP address, and away you go.

In 2005 OpenWRT released the so-called White Russian version, which was the first stable release. The Freifunk Firmware still relies on a follow-up version of that release.

There is no doubt that the Freifunk firmware contributed massively to the take off of Freifunk in Germany and Funkfeuer in Austria, who made a customized version, based on the same foundations. As Jürgen Neumann said in an interview, when he saw Sven-Ola's Freifunk Firmware for the first time, he knew that Freifunk was becoming a reality.

The coming together of this assemblage of artifacts – the liberation of hardware to build the network commons, mesh routing and the Freifunk Firmware – stimulated a rapid growth of free networks. Other regional flavours of the firmware were made for Funkfeuer, Guifi and Wireless London. In the mid-2000s, Freifunk communities mushroomed, many of them in former GDR territory. The availability of small and cheap hardware for Open Source experimentalism was a great breakthrough. Potentially, you could network everything with everything and make alternative ad-hoc infrastructures.

Those breakthroughs in peer-based innovation fell into a climate that was thick with techno-political promise. The tone had been set by Wizards of OS in 1999, which was the first significant attempt (to my knowledge) to think through Free and Open Source Software also as a political project.

Political means, in this case, that it has a social significance beyond that usually granted to software. The success of Linux and the things that could be done with it prompted the art scene – but also intellectuals in the social sciences and elsewhere – to ask what that freedom or openness was, and if and how it could be harnessed for things other than software.

Looking back at this period, a number of things happened in close proximity: Creative Commons made a breakthrough, with one million of its licenses in use in 2003. Wikipedia had been started in 2001 and was gaining critical mass. Many Linux distributions appeared for creative tasks, such as Dyne.org, Puredyne and Knoppix. The protocol Netsukuku was developed, an explicitly political technology, a peer-based routing protocol. It has changed a lot since its first release in 2005, and now seems to be taking a very interesting direction. http://en.wikipedia.org/wiki/Netsukuku

In 2003, together with FACT and commissioned by Michael Connor, we made a brochure and DVD with Kingdom of Piracy http://www.fact.co.uk/projects/kingdom-of-piracy.aspx. It was like a toolbox for free culture: in software, intellectually, and in art.

The many-headed hydra of community mesh

The early 2000s were an era of rapid growth of the digital commons. The network commons in practice enabled a range of other creative practices. At the time, do-it-yourself map making and the larger framework of locative media were an exciting new topic, debated at the Cartographic Congress in London in 2003 and at two workshops organised by RIXC in Latvia. WLAN technology opened the possibility of so-called war-driving: driving, cycling or walking through an area with a laptop or similar device, scanning for networks. Depending on the ethical stance taken by those doing the scanning, no intrusion happens, but the information gathered can be used for location-based tasks.

One thing that can be done is simply measuring the signal strength of WiFi networks. You can also use the identifiers of access points for WiFi triangulation – establishing your own location in relation to geolocative database information. This enabled art-driven, non-commercial projects to experiment with locative media art at a time when commercial applications were still lagging behind. The network commons and locative media art seemed like a perfect marriage, but it was a short spring. While OpenStreetMap made rapid gains in the mid-2000s, Google Maps was released at around the same time. Mapping became part of this sprawling data monopoly. Many of the ideas that were hatched by independents and small groups in the early-to-mid 2000s ended up getting recuperated by the info-oligarchy.
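For the technically curious, here is a minimal sketch of the geometry behind such experiments, under the usual simplifying assumptions: received signal strength is converted into a rough distance with a log-distance path-loss model, and a position is then estimated from three access points with known coordinates. All coordinates, signal values and model constants are illustrative; real-world WiFi positioning is far messier.

# Sketch of WiFi-based positioning: RSSI -> rough distance (log-distance
# path-loss model), then trilateration from three access points with known
# positions. All coordinates, RSSI values and model constants are invented.
import numpy as np

def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exponent=2.5):
    """Estimate distance in metres from received signal strength (very rough)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def trilaterate(aps, distances):
    """Least-squares position estimate from AP coordinates and distances."""
    (x1, y1), rest = aps[0], aps[1:]
    d1 = distances[0]
    A, b = [], []
    for (xi, yi), di in zip(rest, distances[1:]):
        # Subtracting the first circle equation from the others linearises
        # (x - xi)^2 + (y - yi)^2 = di^2 into A [x, y]^T = b.
        A.append([2 * (xi - x1), 2 * (yi - y1)])
        b.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    solution, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return solution

aps = [(0.0, 0.0), (50.0, 0.0), (0.0, 50.0)]   # known AP positions in metres
rssi = [-65.0, -72.0, -70.0]                    # measured signal strengths
dists = [rssi_to_distance(r) for r in rssi]
print(trilaterate(aps, dists))                  # rough (x, y) estimate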

Around that time the moniker Web 2.0 was introduced. A lot of things that were the product of community-based innovation and had been created in a decentralised way were reunited in a new centrality, the soon-to-rise monster of "social media". As Evgeny Morozov explains, the capacity to coin those terms gives a lot of interpretative power to Silicon Valley (see this article: http://www.thebaffler.com/salvos/the-meme-hustler).

In my view, you could say that the early 2000s were a time when a peer-to-peer view of the world was formulated on many layers. Michel Bauwens and the P2P Foundation's wiki http://p2pfoundation.net/ also give testimony to that. But these were still ideas held by a relatively small elite. The majority of the world was busy with "the war on terror." As we now know, it was used as a pretext to build a gigantic surveillance machinery.

Web 2.0 turned out to be the slogan for the commodification of community-based innovation, while under the surface massive data harvesting was going on. As an economic model, it was not to last.

The banking crash of 2008 revealed a systemic crisis. The informational paradigm does not generate effective demand in the rich countries, there is downward pressure on the cost of labour, and those conditions have not significantly improved since the crash. The former West is in crisis, and community-based innovation could be a way out of it.

Wireless community networks should not be considered in isolation but as part of this larger movement for the commons as the last great liberal utopia. Already before the crash, but now with greater intensity, people have been getting involved in food coops, urban gardening and time-sharing economies of all kinds. Peer-based commons production makes sense when capitalist ways of organization break down. A project to bring wireless community networks to deprived areas in Detroit, the Digital Stewards project, has received mainstream media attention recently.

The former motor city is still in a process of de-urbanization. Inner-city communities have appallingly low broadband connection rates, a digital divide reminiscent of a developing country in the heart of industrial America: http://detroitdigitalstewards.tumblr.com/
A project to bring WiFi to a community on the outskirts of Valparaiso, Chile, has also started and is documented here: http://thenextlayer.org/blog/478

(to be continued)

File attachment: plz-zentrum.png (42.51 KB)

Free Networks: We Are Only Just Beginning

Gio and Alexi in the Wireless Spring

"We are only just beginning" is the message I have picked up from the two biggest communities in Europe, Guifi.net and Freifunk. Since the publication of the first chapter and this one, Guifi.net has grown from approximately 25.000 to 26.500 nodes. Similarly, the political implications of the free network movement have become more easily visible today. As Jürgen Neumann and Monic Meisel report, Freifunk has "unfortunately" benefited massively from the Snowden revelations. Since it has become known how massive the surveillance machinery is, self-managed networks suddenly make much more sense again. While many things came together to make Freifunk possible, one thing was less in its favour: the German legal climate. Freifunk finds itself at the center of a prolonged battle about "Störerhaftung".

In Germany, from the very start, I was asked by potential node owners whether they wouldn't get themselves into trouble by offering unprotected WLAN access. There was and is a big worry about so-called Störerhaftung - liability for violations of laws by users of an open WiFi access point. Legal precedents have been set in Germany where owners of WLAN routers were held responsible for violations of the law by people who had logged on to their access point and used it, for instance, for filesharing of copyrighted music and films. Such cases, however, have been relatively rare. The real problem is that in Germany there exist specialized law firms which have made it their business model to send threatening letters to people whom they accuse of a violation. While often there is hardly any evidence, and the intention of those law firms is not actually to take those cases to court, they offer people an easy way out by paying a certain sum for an out-of-court settlement. People unaware of their legal rights, and maybe scared of entering a prolonged legal battle with an opponent they consider superior, give in and pay. This is a real nuisance and has created a situation of insecurity for participants in Freifunk. However, this is not the only threat to network freedom in Germany. There are also genuine cases, where lawsuits have been brought by content providers and copyright owners who think their copyright has been infringed.

Freifunk has been battling those problems for a long time and has come up with some creative solutions. One is to build a local community association, which can then obtain the same status as a provider. While this solves the "Störerhaftung" problem, it may create new ones, such as costly legal obligations for data retention. So Freifunk, in collaboration with the OpenWRT team, released the first 100 "Freifunk-Freedom-Fighter-Boxen" – wireless routers flashed with an OpenWRT release configured to route all data via a virtual private network (VPN) through an ISP in Sweden, so that German law does not apply.

Freifunk Freedom Box on Mainstream TV

This action, launched in 2012 under the title "Freifunk statt Angst" (Free radio instead of fear) http://freifunkstattangst.de/2012/06/14/aktion-gegen-storerhaftung-anony... created plenty of publicity. In an initial action, Freifunk gave away 100 Freifunk Freedom Fighter Boxes. When users logged on to such a router, they got a splash page which informed them about the political background: http://anon.freifunk.net/
According to Jürgen Neumann, one of the founders of Freifunk, the action had always been meant as a temporary publicity stunt. However, the battle against Störerhaftung in Germany turned out to be quite a protracted one, and for many people the box offered a relatively safe option.

At the same time, the Freifunk Freedom Fighter Box was just one step in a larger counter-offensive on many layers. At the time of writing, in November 2014, there were several court cases going on concurrently. As Monic Meisel reported on November 27, 2014, one group of lawsuits had been stopped because the claimant had withdrawn all allegations. They obviously accepted that the accused was a member of Freifunk and that, because of its quasi-provider-like status, there was no liability. See Monic's update: http://freifunkstattangst.de/2014/11/23/update-zu-den-feststellungsklagen/ At the same time, the public climate also changed. In this article, Prof. Thomas Hoeren http://www.sueddeutsche.de/digital/forum-wer-haftet-1.2473293 reports that a number of court decisions had gone against "Störerhaftung." Thomas Hoeren is a leading Internet law expert in Germany, specialized in issues where technology and restrictions on it infringe on people's civil liberties. According to Hoeren, the liability of node owners is not a foregone conclusion.

Störerhaftung has meanwhile become recognized as an impediment to the development of a creative information society in Germany. The number of open WLAN hotspots in proportion to the number of citizens is very low in Germany. As I can confirm as a frequent traveller, in many countries around the world finding an open WLAN hotspot is quite easy, but not so in rich and bandwidth-saturated Germany. German media, such as Heute, the news program of public national television, have started to recognize that Störerhaftung is one of the main reasons why there are so few open WLAN hotspots in German cities, towns and villages. http://www.heute.de/freie-wlan-netze-funkstille-in-deutschen-staedten-re... The German coalition government has announced in its coalition agreement that it wants to enable open WLAN hotspots in German cities. A number of cities, such as Hamburg, now also want to realize this. But until Störerhaftung is revoked, there will be legal insecurity. A new draft law has been created which, it is claimed, will provide legal certainty for node owners. But the devil is in the detail, and this law is so badly drafted that it could actually achieve the opposite, argues Prof. Hoeren http://www.sueddeutsche.de/digital/forum-wer-haftet-1.2473293-2 .

Freifunk started a campaign in 2014, intensified in 2015, asking its members to write to their MPs, urging them not to vote in favour of this law and to demand significant changes. http://freifunkstattangst.de/2015/03/13/neuregelung-der-stoererhaftung-w... By March 2015, more than 200 MPs had received letters informing them about the deficiencies of this draft law. In my view, these are good steps, but they need to be intensified.

Wireless community networks are growing up; they are getting a political voice. In such cases the technology itself is not the centre of attention but serves as a catalyst to ferment wider political action. The issues and hurdles posed to wireless community networking turn activists into educators of the public. Together with other activists working against surveillance and for open data, their activities raise issues that make the network more transparent for citizens. We can thus see different layers of being political.

On the most basic layer, material access is political, because there are situations and places where the availability of free or cheap broadband is an issue. We are reaching a point where not having Internet access seriously disadvantages you – you cannot fully participate in society. Freifunk Hamburg, for instance, tells in this podcast https://hamburg.freifunk.net/2014/11/freisprech-12.html about a refugee camp on the premises of the St. Pauli church, where Freifunk Hamburg established a free hotspot without mightily banging its drum about it. Freifunk is now working with refugees in several cities, providing them with free Internet, without advertising this too loudly, because of the many forms of discrimination refugees face (not just in Germany).

While access to the network is one layer on which freedom can be formulated, another layer is the actual shape of the network. In the early 2000s, when free network movements started, broadband via ADSL and cable often came with restrictions, such as no fixed IP addresses, an automatic reset of the network connection once every 24 hours, and an imbalance between upload and download speeds. The actual shape of the network connection, its technical properties, also defines how "free" a network is. This affects deep layers of network technology, where access points and providers have the power to filter and monitor traffic. Wireless community networks thus have an important function in educating the public, but also politicians, about the social necessity of net neutrality.

Finally, also on the layer of applications and content, networks can be more or less free. Here, one of the major sources of insecurity arises through issues such as copyright or, more generally speaking, intellectual property. In countries with a repressive regime, freedom of speech and other issues are also at stake. In my 2004 book Freie Netze I tried to systematize those ideas by creating a layered model of network freedom. If demand arises, maybe I will translate and update this model.

In the meantime, however, since I wrote that earlier book, a major economic crisis has happened. Free network activists often appeared to repeat, rather robot-like, Richard Stallman's dictum that the "free" in free software is not about free beer, but about free speech. Well, applied to networks, this opinion may turn out to be narrow and dogmatic. Free or at least cheap telecommunications is an important issue of our times. Especially after the outbreak of an economic crisis, even in the richest countries there is a digital divide, as some groups or strata of society tend to have no Internet access or no PCs. Such issues are often connected with intractable social problems, where issues of class, economics, gender and ethnicity all come together.

This goes so far that a recent study in wealthy Austria concluded that 600.000 adult Austrians (out of a population of 8.5 million) are affected by functional illiteracy. Digital literacy is thus a major issue which has repercussions for many other areas which affect the basic life chances and citizen rights of people. This poses questions for self-serving views of some free network activists who think their networks are free because they use "free software." When people have problems with reading and writing, the potentially "liberating" technology actually just creates further obstacles, as more basic problems need to be addressed first.

On the other hand, it is exactly the potential of the Internet to create an open knowledge society that makes it still so attractive, and which could also benefit disadvantaged people. In the following chapters, I will thus try to address those questions. On the one hand, the paradigm change from industrial to information society has remained incomplete. It has become stuck halfway, where older layers and mindsets prevail and prevent the full emancipatory potential of the Net from being realized (see next chapter, The Incomplete Paradigm Shift). On the other hand, I am certainly not the first or only person to have recognized that there are complex relationships between free networks, free software and society. Those problems pose themselves in especially sharp focus when free networks are created in poor countries and rural areas.

The Incomplete Paradigm Shift

June 18th 1999

This chapter takes a bird's-eye view of history, locating the development of wireless community networks within a historical transition from industrial to information society. Following the thesis that this paradigm shift has become stuck, creating serious obstacles for realizing the emancipatory potentials of information society, the conclusion can only be that those obstacles need to be overcome in order to realize "Society in Ad-hoc Mode" as a positive, really existing utopia.

The historical context of the problems and issues regarding wireless community networks is what I call an incomplete paradigm shift. The term "paradigm" is used here in a specific and well-defined sense. While the paradigm was introduced into scientific language by Thomas Kuhn's seminal book The Structure of Scientific Revolutions [1], it was given new meaning by the innovation school in economics which, building on Kuhn's work, coined the term "techno-economic paradigm" [2]. Christopher Freeman and colleagues at the Science Policy Research Unit (SPRU), a semi-independent research institute connected with the University of Sussex, developed a theory of innovation in industrial societies. They claimed that technological progress since the beginning of the Industrial Revolution did not occur in a linear way, but in bursts and bouts, followed by periods of only incremental change. Influenced by the Austrian economist Joseph Schumpeter and by the Russian economist Nikolai Kondratiev, they argued that technological innovation was linked to the business cycle. It had long been known that economic activity in capitalist economies followed patterns of expansion and contraction. There is a short-term cycle of 3 years and a medium-term cycle of 10 years, which Marx had already observed and commented on, but it was Kondratiev, studying long-term price developments of staple foods such as grains, who found that there were so-called "long cycles" of 50 years, which could be separated into two parts: an upswing of 25 years, followed by a downswing of roughly equal length. Those time periods are not mechanical, but research since Kondratiev has confirmed the existence of swings in economic activity of between approximately 40 and 60 years. [3]

The period of a downswing, especially in its later stages, is usually experienced as a severe economic crisis. Schumpeter's contribution has been to show that such a crisis can only be resolved by clusters of innovations. In order to resolve a crisis of a paradigm in decline, a new paradigm has to come into place. This new paradigm will typically consist of new “leading technologies” but also new business models and new ways of working. It is never just the technology alone which allows a paradigm shift to happen, but without technological change it would also not be possible. However, in order for this technological change to happen, mindsets of people also need to change, new laws need to be made, a wholly new business environment needs to be created. That explains why it takes such a long time, 25 years, a whole generation, for a new paradigm to come into place.

The Venezuelan economist Carlota Perez, who also worked with Freeman and SPRU, has developed a stylized model of paradigm change which gives this whole development further plausibility. According to Perez, the new paradigm develops inside the womb of the old one. [4] Perez has divided the 50 years of the long cycle into four quarters, separated by an interstice. The first quarter is when innovation gets started, usually by forward-looking people: inventors, entrepreneurs, risk-taking financiers, but also, I would add, artists, activists and independent technological innovators. Once they have been able to show the feasibility of an innovation, others jump on the bandwagon and an investment frenzy starts. This leads to over-investment and a first crisis – an interim period of uncertainty. Once those insecurities are overcome, the paradigm reaches the maturity stage. In this stage, all the innovations made before become fully relevant on a societal scale. This is the roll-out phase of the paradigm, when knowledge about new business processes and new patterns of behavior gets widely shared. Once this is achieved, however, the benefits of the new technologies, new business models and new ways of working start to decrease. Since everybody now knows how to do it, the competitive advantage is gone, and the paradigm enters its fourth and last stage, saturation.

The key point that Perez makes, however, is that during the maturation and especially the saturation phase, aspects of a new paradigm are already being developed, albeit not yet widely recognized. While the benefits of the existing paradigm can still be exploited, something new is already brewing under the surface. At this stage, however, it is hard to say what the new paradigm will really be made of.

It needs to be pointed out that theories about techno-economic paradigms have serious weaknesses. They imply a quite mechanistic view of historical development, and as such suggest a depoliticized view of history. History is always defined through human struggles which have many aspects, be they of a political, cultural or religious kind. Work under the title "Technopolitics", initially undertaken by Brian Holmes and Armin Medosch and meanwhile shared more widely by a Technopolitics working group in Vienna, has widened the scope and perspective to look not just at techno-economic but also at techno-political paradigms. History is not only defined by economics and technology, but also by politics, which implies raising the fundamental question of how we want to live, as individuals and as social groups or classes.

The economic innovation school appears to almost willfully exclude such a political perspective, implicitly suggesting that capitalism itself will continue forever, in one way or the other. But as recent years have shown, the struggle is exactly also about that aspect of the argument, with new social movements in the European and global South suggesting a different type of economic model, often based on commons-types of economic activity where cooperation plays a larger role than competition.

It is now not so difficult to apply those theoretical concepts to recent history. According to Freeman, Perez and other scholars, the fourth long cycle had been defined by the industrial mass production of consumer goods, in particular cars, by communications technologies such as radio and television (which relied on a centralized "broadcast" structure), and by cheap energy based on oil and other fossil fuels and nuclear power. This paradigm reached maturity during the first decades after the Second World War, when it allowed for a long boom of economic expansion, led by the United States. It was successfully copied by nations which had been defeated in that war but had now become the biggest American allies, Germany and Japan, as well as by other states in Europe. This industrial paradigm is often referred to as Fordism, after Henry Ford, who invented the core production technology supporting it, the assembly line.

However, it is very important to point out that this paradigm also had an economic and political aspect. Because of the economic, political, or simply human catastrophes of the first half of the 20th century, and because of the existence of the Soviet Union, which presented itself as an alternative socio-political model, capital was willing to compromise and find a way of co-existence with labour. This manifested itself in concessions to organised labour, such as the right to form trade unions and the agreement to collective bargaining. These institutional arrangements guaranteed rising wages and rising living standards in the USA, Western Europe and Japan for 25 years. In the 1970s, however, for a combination of reasons, this model entered a crisis, and the new technopolitical paradigm, information society, began, at first under the surface of what was then called "Post-Fordism."

The 1970s were a period of crisis and transition, when the new paradigm was kickstarted by the mass production of microprocessors by Intel in 1970-1, making computing small and cheap. This was at first only recognized by an avant-garde of techies, intellectuals and financiers, people who met, for instance, in the Homebrew Computer Club, or worked in research labs such as Xerox PARC, where the first GUI was created. Yet by the end of that decade, the first Personal Computers (PCs) were brought to market, and the electronic and digital world started to capture the popular imagination through video games and films such as Tron. Now, rather than continuing with such a chronological narrative, I would like to point out that by the early 1990s, information society was established, and with the opening of the Internet, an investment frenzy started, at the time known as the New Economy. It first hit the headlines of newspapers globally when the browser company Netscape reached a valuation of more than two billion dollars at its initial public offering (IPO) on the stock market.

When we now take a look at the old paradigm, Fordism, and the new paradigm, information society or "informationalism," we can see that in many aspects the latter turned into the complete opposite of what had been in place before. Where the old paradigm had depended on hierarchical chains of command from top to bottom, the new paradigm fostered much flatter hierarchies and cooperation. This found its most pronounced expression in the leading sector, ICT, where "commons based peer production" became the new norm. This term, coined by Yochai Benkler, describes a new cooperative type of production, pioneered in free and open source software.5 People decide themselves on which projects they want to work and freely associate themselves with software projects. These projects are then often not organised in a completely egalitarian way; sometimes there are so-called benevolent dictatorships. But the core issue is that it is free cooperation, and that the results of that cooperation enter a digital commons, a pool of resources which can in principle be used by all.

I could continue now with a much longer list of transitions from the old to the new paradigm, but would rather restrict it to a few core examples. Another important point is the type of media used. Fordism relied on a centralized model of broadcast media, with electronic media such as radio and television sending out their messages to people. "Feedback" was provided mainly through viewer statistics, but also through focus groups used in product marketing. The informational paradigm is characterized by "pull"-type media, where people either communicate with each other directly, through the Internet, or use "on demand" platforms to watch what they want, when and where they want it. This would seem in principle to foster a much more egalitarian media culture, a "read and write" media culture, as Lawrence Lessig, advocate of the Creative Commons licenses for free content, has called it.6

The problem I wanted to get at through this rather lengthy parenthesis is that all those great ideas and innovations have somehow become stuck halfway. It is true that, in principle, free cooperation has become much more important than hierarchical top-down structures. However, hierarchies have not gone away, and command structures have become established on another level. It is true that the combination of cheap computing power, laptops and the Net has enabled a much more egalitarian media culture. At the same time, however, new centralized media powers have arisen which did not even exist 20 years ago, companies such as Google and Facebook, which have acquired a centrality compared to which Henry Ford's business empire pales.

One aspect of this paradigm change which has not been mentioned yet must be added quickly: "financialisation" and "neoliberalism." Financialisation describes a process whereby ever more areas of the economy were reshaped according to principles stemming from high finance and finding their most potent expression in computerized, networked stock markets. This means that even companies which on the surface still mass-produce consumer goods now act according to a new set of principles. While in the old paradigm the Fordist multinational corporation had been hierarchically organized, subsuming under one company all kinds of activities – development, production, marketing, catering, cleaning – in today's financialised economies corporations have been broken up and have shed all those parts which do not promise a maximum of profit. Production typically happens abroad, in so-called low-wage countries, while things such as cleaning or catering or transport and logistics are outsourced to companies exposed to breakneck competition.

This system has arisen in tandem with the neoliberal economy. We can say that while financialisation is the "mode of production" of information society, neoliberalism is its political ideology. It suggests as the best way forward a scaling down of the state and its functions, while everything should be ruled by market mechanisms. This, it needs to be said, is an ideology. The reality is different. In neoliberalism, the markets are not free and the state retains an important function, but this is rarely ever said. Neoliberalism is now the ruling ideology, and as such it does not have to care about reality. It has won the argument, at least as far as business circles and politicians are concerned, and as a consequence many rights and achievements of the labour movement have been rolled back. This has led to a much more uneven economic development, with a rising gap between rich and poor. Even the OECD, itself a kind of neoliberal think tank of the most developed countries, has recently conceded that never before has income inequality been as pronounced as now.

As a result, the paradigm shift has remained incomplete. As Karl Marx and Friedrich Engels had already observed in the mid 19th century, capitalism is technologically innovative. This creates conditions which would theoretically enable a new type of society. The informational paradigm has the potential to enable a knowledge society, a cultural society, where sharing, learning and the creative realisation of the self become core aims. These beneficial aspects and potentials of the liberal technological utopia are constantly undermined by capitalism's need to maintain current social relations. It has to capture the surplus of freedom in order to maintain the political status quo. Thus, we have "Störerhaftung", data retention laws, surveillance, Big Data, the rule of the financial markets, the command of capital.

The bigger sweep of history shows that there is a structural analogy between the distributed or decentralized structure of the Net and the ideals of the revolutionary sixties. The global revolts of '68 were directed against the docility induced by one-directional, one-dimensional societies of mass production, where TV organized the consciousness of the worker-consumers. The drive for decentralization has come from many corners, but has its origins in the social movements of the 1960s. The foundational technologies of the Internet were developed in the late 1960s by staff and students at public universities, who made the results of their research public, thus creating the foundations of the digital commons. Inspired administrators such as J.C.R. Licklider were driven by visions of networked digital public libraries.7

Information society has inherited those ethical values, which have also become embodied in the structure of digital technologies in general and the Internet in particular. From this point of view, the Net as it exists today is a mesh network, and it is free and neutral, at least on the level of protocols, as I have written in the first chapter. Information society as such, however, was only established in core nations in the 1980s and 1990s, and not everyone loves its decentralizing, horizontal, participatory groove. So there are continuing tensions and contradictions between those forces who still defend their privileges, sources of income and patterns of thought of the old paradigm, and those who propagate bottom-up social self-organisation and a free culture of sharing and cooperation. Advocates of free culture need to be careful, however, not to become victims of their own ideology.

As this article on Rhizome has pointed out, there is a connection between mesh networking and decentralization in general, but this opens the danger of a re-centralization. In an article in 2004 I speculated about similar issues, regarding a "Society in Ad-hoc Mode." We have to be careful not to be carried away too much by those technological and political analogies. A mesh network can also be used by the army; indeed, the first mobile ad-hoc networks were developed by the US army. The social version of the ad-hoc mode may have liberating potentials, but we must not forget that neoliberalism is the political economy of informationalism, and that means that ever more areas are exposed to financialisation. The plutocracy of global finance prefers "ad-hoc" structures such as the G7/8/20 conferences, convened however it likes, to more democratically legitimate structures such as the UN; globally important decisions are made by ad-hoc committees rather than by democratically legitimized multinational bodies. The allocation of the means of social production – and that is what finance is – is regulated by stock markets which are increasingly networked and automated, rather than by considerations about the well-being of people, animals, plants and the sustainability of natural resources. Decentralization can become a dangerous ideology when detached from actual social content. On a political layer, it then becomes either a form of libertarianism or anarchism.

The negative effects of financially driven globalization have been countered by new global protest movements that emerged as a specific new political culture of the net in the 1990s. As the old class politics were replaced by a newly constituted "working class" rechristened as the "multitudes," new forms of networked protest were pioneered in the 1990s. With support of the Association for Progressive Communications https://www.apc.org/ during the Chiapas uprising in 1994, messages from Subcomandante Marcos were smuggled out of the Lacandona jungle via the Net and triggered a global campaign of solidarity which stopped the Mexican army from carrying out genocide against the descendants of the Maya people.

The increasing financialization during the era of the New Economy peaked in protests against the financial centers and free trade, such as June 18th and the Battle of Seattle in 1999. The multitudes got together on the streets, organized in a decentralized way, via the net. It is no coincidence that June 18th and Seattle were foundational moments for Indymedia. Those, however, were only early high-points of a new form of networked protest that has since reappeared under various names, from the Arab Spring to the Indignados to Occupy – movements for the right to democratic self-organisation supported by a variety of DIY network technologies.

All major protests against G7/8 meetings after Seattle had independent media centers, IMCs, which in some cases were attacked by the police. Ad-hoc networks of mobile devices carried by crowds can make uncensored communications possible, even when mobile phone networks are shut down. The ad-hoc mode, the power of self-organisation, has become part of a wider epistemological shift in information society. Starting in the 1980s but intensifying in the 1990s, there was a flood of terms such as emergence, complexity and self-organisation spilling over from techno-science into common language. These are all terms which come from second-order cybernetics, the cybernetics of cybernetics, and form an epistemological framework for network society. In some cases they have become mixed with other terms from the social sciences and philosophy, such as "spaces of flows" and "lines of flight". In some cases this is just old-fashioned philosophical idealism in a new dress. In the worst case, this can become part of an ideology where neoliberalism, libertarianism (or anarchism), high-tech and finance meet to create new ideologies of power and domination, for which the best example is still Kevin Kelly's book Out of Control.8

Against this backdrop, I have for years advocated a political understanding of the term self-organisation. One of the few coherent concepts of self-organisation was developed by the philosopher, psychoanalyst and political activist Cornelius Castoriadis.9

Castoriadis' ideas centre on autonomy (self-determination) as opposed to heteronomy (outside control). In his view, self-organisation is not simply a better model for organisation or management, but a principle for "the permanent and explicit self-institution of society; that is to say, a state in which the collectivity knows that its institutions are its own creation and has become capable of regarding them as such, of taking them up again and transforming them."10 Castoriadis went back to the direct democracy of the Greek city state in order to find out how democracy could reinvent itself today. This vision could also be achieved by using self-organising technologies such as mesh networks. What is dangerous, however, is any belief that automatically links the technological with the social level of self-organisation.

The protest movements of the late 1990s and the concepts and ideas of free software have inspired new ideas regarding the possibility of self-organisation. In the 1990s this led to a lively discourse, first about the digital commons, then about the notion of the commons in general. The rise of information society enabled an avant-garde of software developers to create the digital commons. As I have described in much more detail in my article "Shockwaves in the New World Order of Information and Communication",11 the success of the digital commons has since been transposed into other areas. People such as Michel Bauwens of the Peer-2-Peer Foundation are propagating the idea of the commons as a new social model that could be applied in all areas. After the financial crash of 2008, the commons movement has gathered steam internationally. Electoral victories by protest movements in Greece and Spain signal that a political change has started which could lead not just to a new techno-economic but also to a different political paradigm, in which the commons and social justice play a greater role.

While I do not insinuate that every member of the free network movements shares leftist political ideas, I propose to consider such a larger socio-economic environment. The self-organising mesh network could thrive much better in a self-organising society. We currently live in an ongoing era of insecurity. The new paradigm is not yet in sight; its shape remains to be determined. I think that, without this being a foregone conclusion, commons of all types, technological, social and political, could play a much greater role in the next 25 years, while at the same time we need to be cautious regarding the ideology of information society, which has made the language of self-organisation, emergence and complexity its own while actually building new hierarchies and new forms of domination and repression.

  • 1. Kuhn, Thomas S. The Structure of Scientific Revolutions. Chicago: University of Chicago Press, 1962.
  • 2. Freeman, Christopher, and Luc Soete. The Economics of Industrial Innovation. 3rd rev. ed. Cambridge, Mass.: MIT Press, 1997.
  • 3. Goldstein, Joshua S. Long Cycles: Prosperity and War in the Modern Age. New Haven: Yale University Press, 1988.
  • 4. Perez, Carlota. Technological Revolutions and Financial Capital: The Dynamics of Bubbles and Golden Ages. Cheltenham, UK; Northampton, MA, USA: Edward Elgar Publishing, 2002.
  • 5. Benkler, Yochai. The Wealth of Networks: How Social Production Transforms Markets and Freedom. New Haven, Conn.: Yale University Press, 2006.
  • 6. Lessig, Lawrence. Free Culture: How Big Media Uses Technology and the Law to Lock Down Culture and Control Creativity. London: Penguin Books, 2004.
  • 7. Waldrop, Morris Mitchell. The Dream Machine: J. C. R. Licklider and the Revolution That Made Computing Personal. New York: Penguin Books, 2002.
  • 8. Kelly, Kevin. Out of Control: The New Biology of Machines, Social Systems, & the Economic World. New York: Basic Books, 1995.
  • 9. Cornelius Castoriadis took part in the attempted Communist coup in Greece in 1944. This experience turned him into an opponent of Stalinism and he went to France, where he joined the Trotskyites, soon leaving again due to their authoritarian tendencies. He then founded the group “Socialisme ou barbarie” and the publication of the same name. In his work with this group, he developed his ideas of self-organization, using the example of wildcat strikes, among others. He was one of the first radical socialists in France at the time to publicly criticise Stalinism, as well as publishing critiques of Marx’s historical determinism.
  • 10. Castoriadis, Cornelius. The Castoriadis Reader. Oxford; Cambridge, Mass.: Blackwell Publishers, 1997, p. 30.
  • 11. In: Paul, Christiane, ed. Blackwell Companion to Digital Art. Hoboken, NJ: Wiley-Blackwell, forthcoming.

Free Networks Between Countryside and City, between North and South

Mariposa Hill, Valparaiso

The previous chapter has delved into some of the bigger implications of free networks in relation to the overall historic development. It has described this development as an incomplete paradigm shift, characterized by an ongoing structural crisis of information society. This chapter starts with the question of what makes a network sustainable. On the surface of things, it looks as if the conditions for growth are better in rural areas, where there are no good alternatives provided by the telecommunications industry. Examples in Spain and Germany, as well as in Greece, show that there can be successful models that bring together community initiatives with municipalities. This appears to have worked less well in the USA, where, after a good start in the early 2000s, hardly any wireless community networks exist today. It seems that the relationship between rich and poor in the US is almost like the relationship between the overdeveloped world and the poor nations of the South. This chapter finishes with a more sustained look at selected projects from the global South.

As the introduction to this chapter has stated, it often looks as if the main difference, regarding the demand for wireless community networks, is whether broadband is available at reasonable cost or not. In East Berlin and East Germany, Freifunk initially found a lot of support because of the presence of the OPAL network, a fiber-optic backbone which prevented the roll-out of ADSL. The availability of broadband often comes down to the difference between the city and the countryside. It is no coincidence, then, that Guifi.net in Spain had its origin in rural Catalonia.

In the Catalan countryside, the problem was and is that it is hard to get broadband Internet at an affordable price. As the very first chapter has pointed out, Guifi.net originated in Gurb, a small village near the larger town of Vic. Twelve years ago, there was no broadband in Gurb, not even for a very high price. Now it is attainable through telecommunications carriers, but the price is very high. The reason, according to Guifi.net founder Ramon Roca, is the collusion between business and politics. The incumbent telecommunications provider has no incentive to change its business practices. Since the liberalization of telecommunications laws in Spain, no significant competitor to the incumbent has arisen. Guifi started in the countryside out of a real need. According to my interview with Ramon Roca, initially they could use a public library as an access point.

Ramon Roca: One of the things that helped us a lot in the beginning, maybe we did not have the Internet at home, but looking at places where there was some public institutions like libraries was a way of sharing the Internet.
In our case it was a public library, it was paid through our taxes - So we were already paying those Internet accesses; and they were happy for sharing that. It was free in this case in terms of gratis, so it was paid already in taxpayer's money.

Ramon suggests there are a number of ways in which community networks and public institutions can cooperate. This is an open model of collaboration between the community network and the communes, the small local political entities. It is not a one-size-fits-all scheme of "municipal wireless." There have been several schemes around the world where cities made grandiose announcements about bringing free WiFi to public places, but then soon had to backtrack for various reasons. What Guifi.net promotes is something else, a proposal for cooperation between communities and public entities. For the politicians, who want to get re-elected, it is good to support Guifi because they can say they brought their citizens cheap broadband, and for Guifi, support by the municipality makes life much easier. Similar models have also been developed in Germany, where Freifunk communities have entered successful collaborations with local communities or firms, and in Greece, where in the Sarantaporo http://www.sarantaporo.gr/ area wireless community networks are being built.

The enthusiasm around Sarantaporo is a reminder of how exciting all this was here, too, ten years ago. There seems to be a better chance that community WiFi prospers when it taps into other needs of a region or community. The fruit and vegetable farmers in Sarantaporo hope to gain more direct access to markets, and thus fairer prices.

In cities such as Barcelona it is different. There, various providers offer different types of broadband, from ADSL to cable to fiber optic, at relatively affordable prices. The widely shared assumption is that in such a setting there need to be other motivations to participate in a wireless community network; for the municipality the motivation to support such a network seems small, as politicians can say that the market provides for all needs. But this is not really true.

I would advise caution with regard to such assumptions. In the still ongoing economic crisis, the price of broadband is not negligible, especially if you are on a low income. In cities there is also a digital divide, but it is part of a larger divide, of social stratifications in class-based societies where the class structure is often veiled behind a language that implies there is just one large middle class. This class structure often coincides with shifting urban geographies. I think community networks would be wise to adopt strategies which make the point strongly that collective and not-for-profit network provision is also cheaper and fairer, so that even in the city it offers an economic advantage. The second point is that in cities, too, cooperation between community networks and political entities can be beneficial.

In Berlin, a welcome development has been the financial support of the regional government for the Berlin Backbone, built by Freifunk. This gives Freifunk resources to work with, such as money for hardware and access to public buildings, but also added legitimacy in the public eye. Getting access to tall public buildings such as the town hall in the Berlin district of Neukölln and using it as a hub for the wireless backbone has had a very positive impact on the perception of Freifunk by the public and in the media, according to Jürgen Neumann. Freifunk had used some tall buildings and also churches in Berlin for many years for its wireless backbone. Only after getting access to the town halls in Kreuzberg and Neukölln did Freifunk suddenly become celebrated as the Robin Hood of network society, especially since it used those buildings not only as supernodes for its backbone but also to distribute open public wireless access. Citizens in some of the edgier inner-city areas of Berlin can now access the net on their smartphones and tablets while waiting to conclude their public errands.

It appears that both Guifi and Freifunk have successfully built models for the growth of community networks across large metropolitan areas – and this is indeed the case: their networks cover not just cities such as Berlin and Barcelona, but whole regions such as Catalonia and eastern and northern Germany. The crucial point is to tap into real needs, which are always somewhat specific and local, and find a layer where it is possible to bring those needs and resources together. Resources in this context, however, means the mobilization of people to come together and cooperate.

This appears to have worked less well in the USA. In the most powerful nation of the world, the regulatory climate and the general business environment are so strongly pro-business that community networks have a hard time getting going at all. The USA really poses a conundrum. In 2003-5 there were community networks such as NYC Wireless, Seattle Wireless, and Personal Telco in Portland, Oregon. Those initiatives were quite vocal and participated in regional and international debates. Nowadays you have to search for them like needles in a haystack. As this article shows, http://technical.ly/2015/04/06/12-communities-experimenting-mesh-networks/ there are still some wireless community networks in the USA. This is really great, but Pittsburgh's mesh network with its 11 nodes looks a bit meagre compared with Guifi's almost 28.000 at this point in time, April 9 2015, or Freifunk's nearly 13.000.

The story of one such project, Wireless Philadelphia, is told in this report by New America http://www.newamerica.net/files/nafmigration/NAF_PhilWireless_report.pdf (this link still worked in March 2015; it seems some major re-shuffling is going on at New America and its associated websites). The city of Philadelphia created a quango, Wireless Philadelphia, with the aim of creating a city-wide wireless network. However, this quango made the mistake of handing over the commission to build the network to a private company, rather than considering alternatives (such as an initiative by community activists). This created a dependency and weakened Wireless Philadelphia's ability to carry out its declared goal of closing the digital divide. Now a newly configured Wireless Philadelphia is trying to find other ways of furthering network access. The regulatory climate is difficult, to put it mildly. Companies use the courts to prevent cities from supporting non-commercial networks for poorer citizens, as this is considered "unfair competition."

Free Networks in the Global South

It is a question that stares you in the face when you study wireless community networks: hardly any seem to exist in the USA today, despite the work of organisations such as the Open Technology Institute, which does its best to promote and study wireless community networks. OTI, formerly part of the New America Foundation (which does not seem to exist anymore under that name, or is now simply called New America), is behind projects such as the Digital Stewards scheme in Detroit, where people are sent into poor areas to raise digital literacy. After the riots in Ferguson and Baltimore in recent months, the world has been reminded that race and class divisions run through US society which are reminiscent of the divisions between the rich overdeveloped world and the poor global South. A scheme such as Digital Stewards is reminiscent of approaches taken many years ago in so-called "development projects" with ICT in what used to be called the Third World.

The basic scheme behind such projects was that the good knight from the North came with his horse and shining armour to bring the Internet to the suffering people of the South. For "horse and shining armour" think, of course, jeep, laptop and solar panels. From very early on it was considered a good idea to use wireless networks in poor, rural areas. In my book Freie Netze (2004) I had a chapter about a number of such projects (pp. 152-157). Lee Felsenstein is not just a pioneer of computer science but also a pioneer of community networks, having built the Community Memory project in the Berkeley area between 1972 and 1974, probably the first computerized community network in the world. In the 2000s, Felsenstein was involved with the JHAI foundation, which undertook ICT projects in Laos and Cambodia. They developed a special-purpose computer with low energy consumption, resistant to the extreme climatic conditions, to link villages and assist them in important issues such as crop selection and bringing their harvest to markets. Also in the early 2000s, Harvey Brainscomb built a wireless network for the government of Bhutan. Meanwhile, I would assume, many more projects of this kind exist.

While I would not doubt for one millisecond the good intentions of everyone involved, the problem with those schemes is their one-sidedness, and the specific ideas regarding "development" and "aid" they are often connected with. Thomas Krag and Sebastian Buttrich of Wire.less.dk were involved with Geekcorps and went to Ghana to build wireless networks. They went there well prepared, bringing technology such as the solar-powered "Autonokit", a set of hardware and software components that should allow building a wireless community network based on free and open source software in the African countryside. What they found out is that our notions of free and open source do not necessarily function in Africa in the same way. In areas where poverty is endemic and education and knowledge are bottlenecks that impede development, some of the people they had to work with, such as local business people and ISPs, were not fond of the idea of sharing knowledge. They were afraid that if they trained people so that they could build wireless community networks, those people would walk away and found their own companies.

Elektra (right hand side) at a workshop at Espacio G

Even where the social separations are not nearly as pronounced as in sub-Saharan Africa, obstacles arise from the nature of the social environment. In 2010, Ignacio Nieto reports, an extremely interesting project was launched in Santiago, Chile. After a meeting of free network activists from Latin America in Uruguay, this group, together with long-term Freifunk activist Elektra, came to Santiago to realize a project that would use a wireless community network to connect to a pirate television station. Through the wireless network, an internet portal was created through which everyone would be able to post video, which would then be re-broadcast on the television station. The project very nearly succeeded, but after Elektra, who had provided a lot of the technical expertise, left, technological development stopped. It also seems that there were issues around the appropriation of funds.

Antenna installation on Mariposa Hill

In 2014 Elektra was again invited to Chile, this time by Espacio G, an alternative gallery and hacker space in Valparaiso. There, poorer areas on the outskirts had been devastated by fire. While people live on the hills, all public services are in the valley. So the idea was to connect the two hillsides through a mesh network. Again, a prototype was built with Elektra's help. In a recent interview she called those types of projects "helicopter drop" projects. As a well-meaning person she participated, but was already aware that this was possibly not very sustainable. And again, soon after Elektra's departure the project fell apart, for a number of reasons. One reason, according to Ignacio's report http://www.thenextlayer.org/node/1325, was that the people in the poor neighbourhoods of Valparaiso were not motivated enough. They probably did not feel they had a real stake in the project.

It does not necessarily have to be that way. Carlos Rey-Moreno works at the University of the Western Cape in South Africa. His project, which also received support through the Open Call of the Confine EU project, built a wireless mesh network and Voice-over-IP (VoIP) service in a tribal area of the Eastern Cape province where the Mankosi people live. http://villagetelco.org/deployments/mankosi-south-africa/ Here, the aim has been from the beginning to involve the community as much as possible in order to create a sustainable model for a village telco. As Carlos Rey-Moreno told me in an interview, it is important to consider the specific circumstances that came together.

"Mankosi is composed of twelve villages, around which 6000 people live in 500 households. The average income per household, consisting of around 10 people, is about 53 Euros. In this community, there is coverage from mobile telephone operators but they tailor their services for more wealthier urban users. South Africa is the second most unequal country in the world when it comes to income distribution. With regard to mobile communications, matters are made worth by middlemen who go to town and bring the airtime, so that there is a markup for airtime, local people are charged even more than everyone else. We are talking of about 30% of household income going into phone communication, with all the hazards that implies for other areas, such as health, education."

The project used "mesh potatoes" from the Village Telco project as hardware. Twelve houses were chosen as nodes, equipped with solar panels and antennas. The local people were involved in all aspects of the project, such as choosing the houses and installing everything. It was not always a smooth process, reports Carlos. The locals are used to relying on the tribal elders for all decisions. The result is that not everything is always very transparent. For example, some of the owners of the houses where nodes were installed did not tell everyone else that this new infrastructure was a shared property. After seven months' absence, Carlos returned and started a process of public meetings.

"Now we have regular meetings with about 10 people meeting monthly, people from every village, so that it has become much easier to reach decisions. This is now beginning to take root, that working together is a better way, they start to apply that to other areas as well. Some sort of transformative effect appears to be taking place, apart from the network as such."

Initially, the plan had been to use the network mainly for voice calls between the twelve villages. But then the villagers raised the demand to also make break-out calls into the telephone network. As a result, a cooperative was formed, which has attained the status of a local network carrier. As a small provider, it could negotiate better conditions with a commercial VoIP company. Carlos stresses that the project has now become self-sustaining. Income is raised by using the solar panels to charge mobile phones for a small fee. Break-out calls can be made at a quarter of the normal cost. The maintenance and operation of the network is now in the hands of the people of Mankosi. It is true that the University of the Western Cape provided initial capital and that Carlos' role was important in overcoming initial hurdles regarding technological and social issues. He thinks it was important that, of the 20 months of the project's duration, he spent around ten months in Mankosi. At the same time, he thinks it was important that he took care not to impose himself on their decision-making processes and to allow them to find their own feet. Now, he thinks, this has created a model that could be replicated with much less work in other, similar areas.

The Mixed Political Economy of Guifi.net

This chapter takes a closer look at the different economic models used by Freifunk and Guifi. In particular, it investigates in which ways Guifi's mixed political economy has contributed to its growth. This sub-chapter also investigates the terms in which network freedom is defined and with which other ideas and measures it is connected.

Guifi and Freifunk have chosen different models. In Germany, it seems, a high ethical stance is adopted by the volunteers who are building and maintaining networks. The initial Model 1 as proposed by Consume (see Chapter 1) was that each node should be built and maintained by its owner. But this turned out to be slightly utopian. Building proper, reliable nodes still goes far beyond the capacity of the average user. So in Germany, the networks are built by volunteers, who donate free labour to build and maintain them.

In Spain that happens too, but in addition to voluntary work there is also the option of having people come and build one's node in a paid capacity. Only slowly am I starting to understand what a complex "being" Guifi.net is. I am not implying that volunteers in Spain are less idealistic than their German peers. But I would like to highlight that Guifi has created a unique mixed economy where capitalist elements can co-exist with the commons, and vice versa.

Guifi manages an expanded web of contacts between node builders and node owners via the Guifi website. This website is, by the way, far more than simply a website; it is more like a central hub that facilitates the growth of Guifi. A quite elaborate social media system has been built which allows users to rate network builders, who are often small IT companies consisting of one or two people. In times of a severe economic crisis in Spain, this enables aficionados of free software and free networks to earn a bit of money. Maybe this gives Guifi a chance to maintain its fast growth rates: https://guifi.net/guifi/menu/stats/nodes

In Germany, it seems, paying people to build networks is anathema to most. The shared ethical stance - which has been voluntarily adopted and not been imposed by anyone - demands that people build free networks through free labour. Both models have their pros and cons. The German model works as long as enough techies are available to donate their time. Even the Berlin Backbone is built without paid labour. The funding from the Berlin Brandenburg Media Agency is only used for hardware and other materials.

The Spanish model seems to work pretty well too. But it can also have centrifugal consequences. Some of those service providers will always want to privatize the network segment they have created. They will try to take their customers with them and build a service provider company. Preventing this, however, is the job of the Guifi foundation.

Both Guifi and Freifunk have become very strong in advocacy. Neither sees itself as a centralized company or a network provider. Neither the Guifi Foundation nor the Freifunk Förderverein (its not-for-profit umbrella organisation) runs those networks. Their task is to advocate the building of free networks in two directions: on one hand, towards the official world of institutions, the city, town and village administrations, and on the other, internally, towards the community of active and potential network builders. But there is also a difference regarding the type of advocacy.

In my view - which I do not claim to be objective, and anyone is welcome to correct me - Guifi advocates the right of access to the Internet as a fundamental freedom and right for all people more strongly than anybody else, while Freifunk argues slightly differently, emphasising the political implications of a free network, free from government surveillance and commercial interests, which may distort network freedom.

Guifi is consciously creating a network commons, and also uses the term commons in its own language.

Ramon Roca: the network is managed as in commons, whatever you have wireless, and cable bound, whichever protocols, and all the economics – which are a lot - have to respect that the network is in commons, it is not in control of a single person, single company, single point of interest. That does not mean that there can be no business, a lot of business can happen around that but based on the service.

He insists that whoever does business there by building, planning and maintaining the network "has to respect that the return comes from his services. It is not coming from claiming ownership of the network and then asking a higher price to whoever wants to use it, they have to respect the Internet as commons."

RR: We are not building a private Internet, we are part of the Internet. Internet is the result of the networks, so we are simply a part of that. We have a portion of the Internet that works as in commons, and other portions maybe do not work as a commons. And the only thing we have to do is to interconnect. So what happens inside the commons is we do not charge anything for interconnecting.

The mixed economy allows people to build a business model, for instance by providing some kind of support after disasters: they guarantee to bring a node back up within four hours for a certain fee. But no one charges for interconnecting between networks, which is called peering. Guifi is systematically peering, for free, explains Ramon:

"You don't take economic advantage from each other, don't be intrusive in terms of looking what they are doing, in terms of privacy also which is taken care of by the licenses, and doing the business but not in controlling the network, keeping it as in commons."

In order to do so, it would be good to be able to rely on the state as a benevolent partner. Commons theorists maintain that there should be alliances between democratic governments to create commons-enabling legislation. Yet regulation can easily be circumvented.

Ramon Roca: "But even talking in market terms, everybody knows that the market does not work if there is no competition. There are many ways of avoiding competition. So that's why in every country you have a regulatory agency to ensure whether that happens or not. When there is an incumbent with too much difference from the others, there is no free competition, and they will do whatever they can to protect their position, such as creating bureaucratic problems. They will use any opportunity to create bureaucratic problems ... this is a very long story. And this differs between countries, but often the regulator can be captured by the lobbies' interests, because they are very powerful."

The interests of the incumbent often weigh more heavily on the minds of politicians than the interests of the majority of people. Nevertheless, so far Guifi has been able to fight off any challenges, whether they come from government or business.

In order to keep the balance between commercial business and those merely participating in the network as a commons, all users of Guifi.net have to subscribe to "The Compact for a Free, Open & Neutral Network" (FONN Compact). This contract has carefully drawn on other examples such as the Pico Peering Agreement and has enshrined network freedom in a small number of principles to which everyone has to subscribe, thus allowing a network to grow which has different property structures but works as a commons regardless, through its commitment to interconnect. The three principles deserve some closer explanation. I quote:

“1. It is open because it is universally open to the participation of everybody without any kind of exclusion nor discrimination, and because it is always described how it works and its components, enabling everyone to improve it.
2. It is free because everybody can use it for whatever purpose and enjoy it as foreseen in the freedoms of the “General principles” section, independently of his network participation degree.
3. it is neutral because the network is independent of the contents, it does not influence them and they can freely circulate; the users can access and produce contents independently to their financial capacity or their social condition. The new contents produced by guifi.net are orientated to stimulate new ones, or for the network administration itself, or simply in exercise of the freedom of adding new contents, but not to replace or to block other ones.
It is also neutral with regard to the technology, the network can be built with whatever technology chosen by the participants with the only limitations resulting of the technology itself.”

However, any rule needs enforcement in order to function. Guifi.net has chosen to look at Elinor Ostrom's research on how commons can function and avoid their "tragedy." From Ostrom's design principles for a commons, they have chosen:

4) Effective monitoring by monitors who are part of or accountable to the appropriators

This is one of the reasons why Guifi applies network monitoring methods. This is another thing which initially perplexed me. In the early days of Consume, metering would have been seen as a first step towards charging, and thus in opposition to the spirit of network freedom. Ramon Roca disagrees:

"We aim for net neutrality, not only the commons. That's far from being a religion; to be neutral we should remain agnostic in all aspects: technology, between volunteer activity or professional activity, allow all uses from free as in beer to commercial services on it..., be safe from governments,... not to say about politics or religions. Important not to become fundamentalist."

It seems that Guifi not only has no problem with metering traffic but, on the contrary, sees it as a prerequisite for building an effective network commons:

"For sustainability of the commons and to manage the network you need capacity planning and economics involved in investments or operating expenses (regardless of whether there is money in between or not). For sure this requires metering for managing the network itself, to diagnose where more capacity is required, etc ..."

As the mixed political economy of Guifi.net includes governments and businesses, the metering also serves the aim of checking whether people pay their dues. The condition for businesses and administrations participating in the commons infrastructure is that they have to compensate for it by giving something back.

The Guifi.net website is the embodiment of this social and technological construct which is Guifi, a network commons based on a mixed political economy. The metering capacity is also a necessity for network planning. And as of recently, Guifi can no longer be called a purely wireless community network, since it has started to deploy more and more fiber:

Ramon Roca: “we started in 2009; we were realizing that fiber was getting much cheaper, and also much more reliable and capable; but it's a difficult journey, with lots of bureaucracy, it's a complex project, but still a planning issue.”

Once the new way of working with fiber has been mastered, explains Ramon, it is quite easy to roll out and can be much more cost effective than anything else. With fiber, Guifi can offer one Gigabit per second symmetric bandwidth. Fiber, Ramon Roca is convinced, is the future.

All those properties together – Guifi's mixed economy, its strength in advocacy, the existence of effective mechanisms for conflict resolution through supervision, and its agnosticism, if one can call it that, towards anything fundamentalist – make this a very open, very adaptable model, which is, in my view, the secret of Guifi's success. And Guifi's mixed political economy has allowed it to grow at a rate that has made itself visible in government statistics.

Ramon Roca: “in 2004 the region, Osona with 150.000 inhabitants, was ranked 31st in Catalonia in terms of bandwidth, and now we have 10 percent above the average, because it was the first region to reach the European average. So now we can provide that we are the only county on Catalonia that meets the European average, and this year we went above the average. If you look to the statistics you see we make the difference, it's the 10 percent. By having alternatives it makes sure meeting the digital agenda, like 2020, of the European Union. We are maybe still minoritarian, we are still 10 percent, but that 10 percent will make a difference.”

[to be continued]

The Obsessive Utopia of Mesh Networks

Paul Baran Network Topology

"The sleeping beauty of mesh has been kissed into life by the community," explains Elektra in her book. The community has made it possible to have decentralized wireless networks which connect small local cells, automatically linked by intelligent software (Aichele 2007, p. 15).

In this chapter, a closer look is taken at developments around mesh networks, based on a study trip to Barcelona and supplemented by further research. This chapter also asks the difficult question of how the mystifications of technology might be overcome. Are better mesh routing protocols really the answer to all problems?

In one of the previous chapters I stated that there is a significant difference between town and countryside. In many rural areas, it is virtually impossible to get affordable broadband Internet. This problem has actually furthered the growth of wireless community networks in the countryside. A widely shared view is that it is much more difficult to mobilize people for wireless community networks in urban areas, where a variety of possibilities for network access exist and where the urban topology makes networking difficult. This, however, while broadly true, may not always be the case. In some areas of Barcelona, wireless community networks are growing, and they are developing and using the latest mesh network technologies.

Routing is generally a very interesting area. Dijkstra's algorithm is one of the earliest path-finding algorithms, conceived by computer scientist Edsger W. Dijkstra in 1956 and published in 1959.

The Dijkstra algorithm is as basic for the current political and cultural system as cars – or traffic lights – were for the previous one, but nobody knows it, except for experts, computer scientists, techies. It would not surprise me if it was included in the Evil Media book, since this is something that has become part of the technological unconscious. It has an agency of its own, like the repressed. This is definitely the case with the information infrastructure.
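
To make this more tangible, here is a minimal sketch of the idea in Python. The node names and link costs are hypothetical, and real routing daemons implement this far more efficiently; the sketch only shows the principle of always expanding the cheapest known path first:

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source to every reachable node.

    graph: dict mapping node -> list of (neighbour, link_cost) pairs.
    Returns a dict node -> cost of the cheapest known path.
    """
    dist = {source: 0}
    queue = [(0, source)]            # (cost so far, node)
    while queue:
        cost, node = heapq.heappop(queue)
        if cost > dist.get(node, float("inf")):
            continue                 # stale entry, a cheaper path was already found
        for neighbour, weight in graph.get(node, []):
            new_cost = cost + weight
            if new_cost < dist.get(neighbour, float("inf")):
                dist[neighbour] = new_cost
                heapq.heappush(queue, (new_cost, neighbour))
    return dist

# A toy topology with hypothetical node names and link costs.
toy_net = {
    "A": [("B", 1), ("C", 4)],
    "B": [("A", 1), ("C", 2), ("D", 5)],
    "C": [("A", 4), ("B", 2), ("D", 1)],
    "D": [("B", 5), ("C", 1)],
}
print(dijkstra(toy_net, "A"))   # {'A': 0, 'B': 1, 'C': 3, 'D': 4}
```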

The process of forwarding 'packets' from one node to the next on the net is called routing. The politics embedded in this process concern fundamental freedoms and rights. Until now, the neutrality of these protocols has been maintained, because they are jointly developed by bodies such as the IETF and IEEE. The commonality of the net depends on neutrality at some layer, and even in the turbo-capitalist world we live in this is still safeguarded. Mesh routing protocols are refinements of "normal" routing protocols.

Pau Escrich is one of a team of researchers working on the Confine project, and he is also a Guifi activist.

Pau Escrich: “I realized that in my district, a Barcelona neighborhood which is called Sants, there was not any node of the Guifi.net project. So, following the approach - think globally, act locally - I started contacting people from the neighborhood. We built a nice group of folks interested on building a free network, and after having some meetings we started deploying nodes. Now, four years after this, we have around 50 nodes in this area.”

Pau and colleagues started using new technology based on mesh routing protocols. Most of Guifi.net does not use mesh protocols, but standard routing technology such as the Border Gateway Protocol (BGP). In such a network, a group of routers under a single administrative policy – an Autonomous System (AS) – is managed using BGP for interior and exterior routing. If you compare an AS with a country, the gateway router controls entrance to and exit from that country. The benefit is that a node inside this "country" does not need to know the route to every other node on the net; it only needs to know the nearest gateway router.
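
To illustrate this principle only (the prefixes, addresses and names below are hypothetical and have nothing to do with Guifi's actual configuration), a node inside an Autonomous System keeps routes for destinations in its own "country" plus one default route pointing at the gateway, which speaks BGP to the outside world. Real routers do longest-prefix matching on IP addresses rather than exact dictionary lookups:

```python
# Hypothetical prefixes and addresses, for illustration only.

INTERNAL_ROUTES = {              # destinations inside our AS -> next hop
    "10.1.1.0/24": "10.1.0.1",
    "10.1.2.0/24": "10.1.0.2",
}
DEFAULT_GATEWAY = "10.1.0.254"   # border router speaking BGP to neighbouring ASes

def next_hop(destination: str) -> str:
    """Use an internal route if we know one; otherwise hand the packet to the
    gateway and let BGP between Autonomous Systems worry about the rest."""
    return INTERNAL_ROUTES.get(destination, DEFAULT_GATEWAY)

print(next_hop("10.1.2.0/24"))   # inside the AS  -> 10.1.0.2
print(next_hop("8.8.8.0/24"))    # outside the AS -> 10.1.0.254
```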

The resulting network topology is one that could be described as decentralized, according to the classification in Paul Baran's seminal study from the early 1960s (see image above). A decentralized topology is a mixture between a hierarchical, star-shaped network and a completely distributed or mesh network in which no node takes on the role of a center.

In Catalonia, Guifi.net has a decentralized topology with SuperNodes which are connected with each other, and to which many Nodes are connected; the Nodes are only connected to their SuperNode, not to each other. This works reasonably well but does not fulfill the criteria of the wireless community network dispositif, which demands a more egalitarian topology.

Pau Escrich: “The SuperNode network creates what we call the Backbone and this Backbone network is decentralized, but the level below (the Nodes layer) is very centralized and it represents more than 80% of the network devices. So this is an actual control point; the groups, individuals or companies controlling these SuperNodes are the actual managers of the network. This is what we are trying to skip by developing and using QMP.”

QMP stands for Quick Mesh Project, a Linux distribution based on OpenWRT and specifically made for mesh networks. QMP is based on a predecessor project which was developed in the context of another community network initiative. In 2007 a small group in Gracia, a pleasant neighborhood in Barcelona which extends from just behind the Sagrada Familia into a more leafy and hilly area, started a small mesh network called GSF. Roger Baig, a key figure in Guifi.net, was involved in this. By his own account, he had been involved in free software since the 1990s and "installed a server in each village around my area." "Initially," he said, "I was not so skilled in networks, still learning, so OpenWRT was fresh air for me."

The group looked for funding and managed to win a contest organized by a foundation named PuntCat (the managers of the .cat top-level domain). They received 15.000 Euros to start the project, reports Pau Escrich.

The development of QMP then started in earnest after 2010, when a small group of convinced mesh networkers dedicated themselves to building a new distribution from scratch. Part of this group was the German developer Axel Neumann, who at the time also lived in Gracia. After three years they launched the first stable release, and QMP is now used in many places around the world. http://qmp.cat/

Axel Neumann is the key developer of BMX6 http://bmx6.net/projects/bmx6, one of a number of the latest incarnations of B.a.t.m.a.n., a mobile ad-hoc mesh network protocol.

Axel Neumann is writing software for the Confine project, helping to run its testbed, Community Lab. He is also the main developer of bmx6, one of a number of Batman forks. Axel was fascinated by complex problems early on: how do you make a map of a landscape that constantly changes, or maintain routing tables in a network where nodes constantly appear and disappear? Axel became interested in Batman through Freifunk.

Pau Escrich: “B.a.t.m.a.n. was born in Berlin as an alternative to OLSR. Its approach is different for a node running the routing protocol; instead of knowing the whole network topology (as OLSR does), in B.a.t.m.a.n. every node only knows its next best step to reach any other node in the network. So if all the network participants are doing the same, the user data will be routed from one side to the other, always following the best path. This approach is called distance vector.”

B.a.t.m.a.n. is actually an acronym and stands for “better approach to mobile ad-hoc networking.” The initial idea came from Corinna “Elektra” Aichele, who also started developing it and was soon joined by Axel Neumann. To cut a long story short, after B.a.t.m.a.n. emerged as an alternative to OLSR – the first mesh protocol to become more widely used by the community – a rivalry developed which inspired the Wireless Battle of the Mesh. This is a kind of contest where community networkers meet to test and compare different protocols. The next Wireless Battle of the Mesh will happen in Maribor, Slovenia, in August 2015: http://battlemesh.org/ Meanwhile, a number of different flavours of B.a.t.m.a.n. exist besides BMX6.

Batman is a distance vector protocol. OLSR is a link state routing protocol in which every node has a map of the network and can decide where to send packets first. Distance vector, Axel explains, is more like sending somebody on a hike without a map, but telling them to look out for the signs. Distance vector is simpler in a certain way, but has other consequences: the signs have to be put in place and they have to be kept up to date. This is done by flooding the net with messages from the target node. Axel is now working on bmx6, trying to improve the way this flooding of messages is done. “Speaking in the abstract,” Axel explains, “it is like compressing data.”
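
Axel's hiking analogy can be illustrated with a toy simulation in Python. This is a hypothetical, hop-count based sketch of the distance vector idea only; the actual B.a.t.m.a.n. and BMX6 protocols judge links by how reliably originator messages arrive, not by hop count. Each node ends up knowing only the next signpost, never the whole map.

    from collections import deque

    # Toy mesh topology: which nodes can hear each other over WiFi.
    links = {
        "A": ["B", "C"],
        "B": ["A", "C", "D"],
        "C": ["A", "B", "D"],
        "D": ["B", "C"],
    }

    def flood_originator(origin):
        """Flood an originator message from `origin`; every node remembers the
        neighbour it first heard the message from, which becomes its best next
        hop back towards `origin` - the 'sign' on the hiking trail."""
        next_hop = {}
        visited = {origin}
        queue = deque([origin])
        while queue:
            node = queue.popleft()
            for neighbour in links[node]:
                if neighbour not in visited:
                    visited.add(neighbour)
                    next_hop[neighbour] = node   # first sender = best next hop
                    queue.append(neighbour)
        return next_hop

    # Node D learns only "to reach A, hand the packet to B" - never a full map.
    print(flood_originator("A"))   # -> {'B': 'A', 'C': 'A', 'D': 'B'}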

Pau Escrich: “We choose BMX6 because it fits our requirements: scalability, good performance, capable to run on a low-resources machine and IPv6 support. In addition Axel Neumann, its main developer, is a good friend of the Guifi.net community and he joined to the QMP team, so we are really having a routing protocol which is very adapted to our needs.”

Programmers such as Axel and Pau are deeply fascinated by the capacities of mesh protocols like OLSR or B.a.t.m.a.n. in terms of self-organization. In the network topology of Guifi.net as described above, a SuperNode may control 50 or 100 nodes. While the backbone is decentralized, the leaves are highly centralized. For community network activists, the network topology is not just a technical issue; it also expresses a political desire.

Pau Escrich: “When I was a kid I was enthusiastic about Che Guevara, Gandhi and these people in history who changed the world and fought for the freedom of ordinary people. I also liked computers a lot. So I found the free software movement as a perfect scenario to follow my ideas.”

The mesh networking community is striving to build a completely egalitarian, uncensored, free and open network. Axel Neumann believes that the future belongs to multipolicy routing: each node decides autonomously, yet everything still works together. In community networks it would be asking too much to constantly hold meetings for policy decisions. Batman Advanced, the other main Batman fork, operates on Layer 2, the data link layer; to the user, the whole network appears like one big switch, but Axel says he cannot be sure how far this can scale. He proposes using Batman Advanced within local clouds and bmx for long-distance connections. Locally, the user would then be able to move from cell to cell as with a mobile phone.

Distributions with several mesh protocols are already in actual use in Spain, Germany, Austria, Argentina and Nicaragua, and possibly in Chile too. A major effort for a new distribution is called Libre-Mesh, an attempt at globalising the Freifunk firmware undertaken by several community networks across continents, including Freifunk, Ninux and Guifi, together with people in Argentina. http://libre-mesh.org/attachments/download/20

The latest Freifunk distribution, kathleen, also has Batman and OLSR installed. It offers a lot of improvements in the direction of autoconfiguration and ease of use, and better management of IP spaces and DNS services. https://github.com/freifunk-berlin/firmware/blob/0.1.0/README.md
With regard to the politics of Freifunk, I was able to make interesting observations when a massive flamewar broke out between someone who apparently wanted to use the Freifunk label for his own cause and everybody else.

The discussion about what makes a network “free” or “open” was raging on WLAN-News, one of the main lists for exchange on Freifunk issues. The story, which had already rumbled through local Freifunk lists, was that a wannabe entrepreneur wanted to do something “like” Freifunk, with its endorsement and under a Freifunk subdomain, but with policies that conflicted with some of its rules. It seems he wanted to build his own network using a tall building, paying someone to host a router on their balcony.

The ensuing discussion was like a look into the collective psyche of the free network community. The community objected to a whole range of issues, one of them being that the other network would not mesh. Admittedly, the “entrepreneur” did not make his case any easier with his very angry tone, accusing Freifunk of acting like a closed shop. The whole idea smacked of opportunism. But what I found interesting was the emotional intensity with which mesh was argued as a political project: only the mesh network is really a free network.

On one hand, I do believe that things such as Libre-Mesh can make a difference, since they create the possibility of a global, independent infrastructure, the network commons, reclaimed by its users.

In political terms, this could be described as libertarian or anarchist, or as grassroots, bottom-up, self-organized democracy. The desire of mesh network developers is to give the Net a technical structure which makes it difficult to impose any top-down control structure.

To my ears, this sounds a bit like the initial idea behind the Internet in the first place. However, as the history of the Net has shown, such a decentralized structure on the technological level does not make the Net immune to other forms of centralization and control. Capitalism knows many ways of bending and taming the liberatory potential of new technologies. Google is the best example: it can exert control without directly owning the whole of the Net; it does not need to shape traffic flows at control points such as routers or hubs.

In my conversations with community network activists, I tried to explain that a decentralized network can also serve top-down organizations and, vice versa, that a centralized network can also serve the struggle of a movement for freedom. This winter I visited the Museum of the Revolution in Havana, where you can see the radio transmitters built by technicians for Fidel Castro and Che Guevara. There was nothing decentralized about this technology, but it served the purposes of the revolutionary struggle perfectly well. Whenever I try to make such an argument, it seems I am running into walls.

“If you have a centralized network you have a weak network; distributed things are the basis for the freedom of technologies,” Pau insisted. The “freedom of technologies” is constituted by the three freedoms Eben Moglen introduced earlier in this text: free software, free networks, free hardware. Basically, everyone from Guifi.net whom I interviewed repeated those three freedoms to me like a mantra.

Pau Escrich and Roger Baig are part of a group of community networkers and researchers who work at the Universitat Politècnica de Catalunya (UPC) within the framework of the EU funded Confine project (http://confine-project.eu/). This project brings together community networkers as well as academic computer scientists and telecommunications researchers to build Community Lab, a testbed for many new WiFi applications.

One of them is Llorenç Cerdà-Alabern, an Associate Professor at the Universitat Politècnica de Catalunya in Barcelona. He lives in the district of Sants and wanted to contribute something practical to this project, so he put an antenna on his roof, which has now become a hub in the mesh in Sants built using QMP. Llorenç thinks that this cooperation between networking enthusiasts and academic researchers is beneficial because the community is much more oriented towards practical results, whereas researchers can look further into the future.

Llorenç is using his position in the network topology to conduct experimental measurements and write papers about them, for instance on “Topology Patterns of a Community Network: Guifi.net.”1 He has also written a topology generator, a tool that visualizes the network between Sants and UPC. The resulting page is definitely worth studying: http://dsg.ac.upc.edu/qmpsu/index.php
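
To give a flavour of what such topology measurements involve, here is a small illustrative sketch in Python. The link data is invented and networkx is merely assumed as a convenient graph library; this is not Llorenç's actual toolchain.

    import networkx as nx

    # A handful of invented wireless links between rooftop nodes.
    edges = [
        ("roof-1", "roof-2"), ("roof-1", "roof-3"), ("roof-2", "roof-3"),
        ("roof-3", "roof-4"), ("roof-4", "roof-5"), ("roof-2", "roof-5"),
    ]
    g = nx.Graph(edges)

    # Degree distribution: how many links each node maintains.
    print(dict(g.degree()))
    # Average shortest path length: how many hops a packet needs on average.
    print(nx.average_shortest_path_length(g))
    # Clustering coefficient: how mesh-like, rather than star-like, the net is.
    print(nx.average_clustering(g))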

The community network provides the unique opportunity to run live field tests, studying relatively large-scale wireless networks under real conditions. Ivan Vilata-i-Balaguer is also working within the Confine project; his responsibility is to provide services for the implementation of the testbed, Community Lab. For those more technically interested, there are some slides here:
http://wiki.confine-project.eu/_media/pub:community-lab-intro_fosdem-2014.pdf

“We have the community device,” explains Ivan, “we chose to run the experiments on a separate device.” The Community Lab testbed is actual hardware: a device which is placed next to nodes in community networks and on which experiments run. So they had to ask a lot of questions, explains Ivan, questions such as “how do we manage all this hardware; we are talking about nodes on the community network which can be used to run experiments.”

Most research is usually done by research groups in controlled environments where there are no users. In a community network with real users, things are different:

Ivan Vilata-i-Balaguer: “the experiments must not overwhelm the community network with traffic, must not crash it, when experiments crash, and it also should not affect node ownership, so we cannot expect total control from testbed operator; we had a lot of open topics to think about and we had to find an architecture that meets all these requirements.”

The research devices can be used for different experiments, remotely controlled from servers hosted at an organization called Pangea. According to Ivan Vilata-i-Balaguer, in July 2014 there were 120 to 130 research nodes, of which about 90 were in perfect working condition, “but we are expanding the testbed and developing the software.”

The maintenance of the research devices is an issue, because most of the nodes belong to individual node owners, so updating the software and keeping the nodes in working order is precarious. As a result, most of the nodes which are connected to a research device “are operated by people who work for CONFINE, so the communication overhead is not so big,” explains Ivan. Confine merges community network and academia in the same project. It enables, for example, advanced monitoring capacities which Guifi.net can use to keep the network transparent and enforce the rules and principles of the network commons. It also enables more experimental topology generators such as the one by Llorenç Cerdà-Alabern; but this is just a fraction of what is going on.

The project is also providing data sets from those measurements for other researchers: http://wiki.confine-project.eu/experiments:datasets This page gives an overview of all the partners and activities, such as Athens Wireless, Ninux in Italy, Funkfeuer and many more: http://community-lab.net/

As part of the project, a whole range of social projects have been added through an open call. http://wiki.confine-project.eu/experiments:opencall2

One of those projects is CONFLATE, which uses “the new Research Devices of Ninux.org to deploy a simple (but practical) OpenFlow based DASH Live Video Streaming service for real users of Ninux.org,” the website informs us. This and other projects will be topics of future articles.

However, at the risk of being seen as a party pooper, I feel the need to also share an observation which stems from a rather long-term engagement in the field. It seems that many participants in this movement display facets of what Joseph Weizenbaum called the obsessive-compulsive programmer.
In his 1976 book Computer Power and Human Reason,2 Weizenbaum, a great critic of computer science from within, wrote this famous passage about the

“bright, young men of disheveled appearance, often with sunken glowing eyes, can be seen sitting at computer consoles, their arms tensed and waiting to fire their fingers, already poised to strike, at the buttons and keys on which their attention seems to be as riveted as a gambler’s on the rolling dice.”

The real issue here is not about appearances, nor about pointing fingers at computer enthusiasts or techies, as I prefer to call them. We should not blame techies for what are actually the contradictions of this society. In the knowledge economy, almost everyone is quite compulsive about their work, and in the 21st century many people have become “hackers” according to McKenzie Wark's definition.3 In Fun and Software, a recent book edited by Olga Goriunova, one of the pioneers of Software Studies, the authors treat this subject in a more even-handed way.4 Wendy Hui Kyong Chun and Andrew Lison argue that there is a dialectic at work between fun and exploitation.5 Techies such as Linus Torvalds write software “just for fun,” but their political naivety also makes them subject to exploitation. For some, the fascination with technology, which itself is not the real problem, can turn into an obsessive-compulsive disorder.

The real problem seems to be not the obsession, which actually drives innovation, but the one-sided belief that there is a technological fix for each and every social ill. The bigger question rumbling throughout this draft book is whether community networks can alter the course of technology and whether a different relation between technology and society can emerge which could help to make society more democratic. The simple answer to the first part of the question is almost certainly a bold Yes!, of course. Community Lab certainly helps to generate a lot of data to improve mesh protocols and develop new methods and services. Yet the second part of the question is much more complex and demands further explanation.

In capitalist societies a heightened division of labour develops which drives people into increasing specialization. As a result, information and communication technologies (ICT) are a black box for most people: they use them, but have no idea how they work. This allows for what Critical Theory used to call the “mystifications of technology.” Societies get ever more fragmented, whereby small elites command a lot of power by using money and technology. To ordinary people it then looks as if they are controlled by technologies, mistaking social relations for relations between things. As technology becomes “mystified” in this way, the solution to the problem appears to be to create even better technology.

While techies, as individuals and citizens, may actually disagree with the political status quo and desire a free and egalitarian society, the course of technology as such – driven by their own free labour, produced out of their obsession with creative computing – exacerbates those divisions between powerful elites and ordinary user-citizens. Techies passionately believe that Free and Open Source Software will help to counter such developments. But while those technologies are transparent to experts, for ordinary people they are as opaque as a brick wall. Social mechanisms intrinsic to the techie community actually make matters worse.

The idealistic techie communities who produce FLOSS tend to have a missionary zeal about them and are very tightly knit 'communities of practice' who have created their own rules and codes, literally and metaphorically. This world of practices and ethics, highly complex as it has become, has few connections with the rest of society: it works well within the community, where everybody shares the same rules and values, but is completely impenetrable to non-members.

Just to give an example, it is completely beyond me why the testing of different mesh protocols has to be called a “battle mesh.” People who are otherwise really nice and sensible use, without further questioning it, a militarized language. This puts off many people who might otherwise be interested in joining those communities. The problem goes even further. Mobile ad-hoc networks were initially developed by the US military, and the new and improved mesh network protocols are almost certainly used by the military again. Mesh protocols can be used for creating swarms of semi- or fully automated weapons on a battlefield. These are issues that most people involved would rather avoid; when asked directly, they give evasive answers.

The “mystifications of technology” could be reversed by a two-way process which I tentatively call the socialization of technology. If more people learn how ICT works, it will become much harder for the powers that be to use and abuse those technologies. The socialization of technologies would also imply closer links between the people who develop technology and those who use it. This was the idea behind participatory design, pioneered in Sweden in the 1970s. Community networks in principle carry great hopes for initiating and furthering such processes of participatory design and socialization.

In reality, however, when I tried to find empirical evidence for those claims, I mostly gathered evidence to the contrary. I wanted to find out what drove people to work on those issues and how they developed the criteria for their code. And in 90 per cent of cases the answer was that the criteria were implicit, that the coders were following a shared tacit consensus. The question of the “user” of a piece of software ranks lowest among their concerns, as the developers – or like-minded people – are the users themselves.

This self-referentiality of community network activists extends to the three “laws” of Guifi.net. When asked about the freedom in free networks, everybody was quick to answer that this freedom was based on the open, free, and neutral character of the net. The reality is that the initial utopia of self-provision of networks is not really attainable. Most networks are built and maintained by professionals, and the users, by participating in such networks, learn little or next to nothing about the technology; it remains a black box and thus mystified.

Yet for exactly that reason, mesh is so important in upholding the decentralized utopia: if every node can mesh, you don't need expert knowledge at each node. The dream of mesh, however, is a Promethean fantasy inherent to all such technology; it is a form of the automatic utopia. The idea is that community networks will proliferate freely once mesh software is perfectly working and available.

In the meantime, however, the actual problems and possible impediments come from the social sphere, where lobby groups and continued neoliberalism create a difficult environment for community networks. At the time of writing, Freifunk in Germany finds itself in a renewed battle against “Störerhaftung,” the legal doctrine according to which a node owner is responsible for anything a user might do. This rule seems to have been created particularly to support the interests of the copyright industry. Now the German coalition government is drafting a new law which, if passed, would make wireless community networks next to impossible. The problem is a political one, not one of the efficiency of mesh networking.

To support the Freifunk campaign against the new draft law, follow this link:
http://freifunkstattangst.de/2015/03/10/wir-brauchen-eure-hilfe-helft-mit-die-stoererhaftung-fuer-wl...

[To be continued]

  • 1. Vega, D., L. Cerda-Alabern, L. Navarro, and R. Meseguer. “Topology Patterns of a Community Network: Guifi.net.” In 2012 IEEE 8th International Conference on Wireless and Mobile Computing, Networking and Communications (WiMob), 612–19, 2012. doi:10.1109/WiMOB.2012.6379139.
  • 2. Weizenbaum, Joseph. Computer Power and Human Reason: From Judgement to Calculation. W.H. Freeman & Company, 1976.
  • 3. Wark, McKenzie. A Hacker Manifesto. Harvard University Press, 2009.
  • 4. Goriunova, Olga. Fun and Software: Exploring Pleasure, Paradox and Pain in Computing. Bloomsbury Publishing USA, 2014.
  • 5. Chun, Wendy Hui Kyong, and Andrew Lison. “Fun Is a Battlefield: Software between Enjoyment and Obsession.” In Fun and Software: Exploring Pleasure, Paradox and Pain in Computing, 175–96. London / New York / Paris: Bloomsbury Publishing USA, 2014.

Towards the Network Commons (Conclusions)

This draft chapter summarizes my findings. Based on a recent trip to Germany, where vibrant new communities have triggered discussions about what constitutes the essence of Freifunk, I am suggesting that the future of wireless community networks lies in the notion of the Network Commons.

In the course of this book project, I have studied wireless (and wired) community networks, trying to establish the current status of this movement. Two main research questions have guided my inquiry. First, I wanted to see if and how wireless community networks connect with larger questions such as communications freedom; and second, I wanted to find out if those networks can play a role regarding the democratization of technology. This second question has two aspects. One regards the development of technology itself. Wireless community networks are not just consumers and users of technology, they are also actively developing it. My question was whether technologies developed by a community are fundamentally different from technologies developed by companies, and what would constitute such a qualitative difference. The second aspect regarding the democratization of technology concerns the role and function the respective technology plays within a community of users. In informational capitalism, technology in general and ICT in particular are key social agents. They are not just neutral tools but connected with wider social issues. Intricate knowledge of technologies, however, is restricted to narrow strata of society. The gap between the high priests of ICT and users, for whom it is a black box, cuts across society and political divides. My assumption is that a lack of knowledge also furthers other inequalities, economic and political ones. If wireless community networks thus further knowledge about network technologies, because their development and application is embedded in a wider community, then it could be said that they further the “socialization of technologies.”

When starting this study, I soon became aware that any proper method applied to the research question would require vast quantities of empirical research which, due to the limitations of this project, I would be unable to conduct. It would require, for instance, gathering comprehensive empirical evidence about who participates in those networks, what their backgrounds are and which ways of participating exist. That would mean engaging in field work with hundreds, not dozens, of interviews. My work has been supported by an EU grant in the context of the CONFINE project; I was employed by Verein Funkfeuer, Vienna, on a part-time contract of less than one working day per week for 14 months. I thus soon decided that I could only do qualitative research. My main methodology was participatory observation, conducted through interviews, research visits, websites, and mailing lists. I conducted about 20 interviews of different lengths and intensity.

By and large I think that my research questions have been validated. They were interesting questions to ask and they merit further attention. However, the nature of my research questions does not allow for a yes or no answer; any answer would necessarily be a complex assessment of a complex matter. My main case studies were Freifunk, Germany, and Guifi, Spain. In both projects, people are at work who share a certain ethics, and their goals coincide with my research questions. They are building wireless community networks with the aim of furthering communications freedom, free speech, and access to knowledge and information. This answers the first main research question, although important qualifications are to be made. The second question, regarding the democratization of technologies, yields more mixed results. The intention of the communities involved, in principle, is to further the democratization of technology, but there are divisions over how this is best achieved. As I have analyzed in the previous chapter, among some members of the developer and activist community mesh network routing protocols are idealized as a technological fix for all the problems of wireless community networks. There is a widely shared belief that once there exists firmware that is really easy to install and that uses mesh routing protocols, nothing can stop wireless community networks. This type of firmware now more or less exists: Quick Mesh Project and also the latest releases of the Freifunk firmware meet those requirements to a large degree. However, this still leaves open the question of how easy such software is to install and configure. And even if that part gets solved, there are other issues regarding the installation of antennas, energy supply and so on and so forth – technical hurdles are bound to continue to exist. Therefore, the main question regards the nature of participation in these projects, in particular the relationship between the core of activists, the people who participate in the network, and society at large. I have been trying to find out to what degree developers consider demands arising from the community and to what extent a knowledge transfer happens between techies and users. As I was lacking the means to answer those questions through a broad scientific study, which would require a different project with a significantly higher level of funding, I can only address those questions as an observer, participant and interpreter.

The Dispositif of the Self-Organizing Network

On May 14th 2015 I was invited to give a talk at the OpenTech Summit in Berlin. This was followed, on the 15th and 16th, by the Wireless Community Weekend at c-base, also in Berlin. At the OpenTech Summit I presented a summary of my findings. What I said was roughly the following:

In my view, ideas about wireless community networks in Europe were first raised by initiatives such as Consume and Free2air.org in London, around the year 2000. While other initiatives existed as well, nobody else made such a concerted effort – not just technologically but also ideologically and intellectually – in furthering those ideas. As I have described in the first chapter, Consume produced a dispositif of the network commons: an idea, but also material support structures and a set of methods which enabled the building of a network commons. I am aware that I am slightly misappropriating this term by Michel Foucault. Foucault's notion of the dispositif is largely concerned with how power distributes in society. The dispositif of the network commons is concerned with the distribution of a type of network that is as free from hierarchical power relations as possible. Consume's “model 1” was the idea of a network where each node is owned and maintained by its users. There is no centralized entity, neither technologically speaking (no supernodes which can become control points), nor organizationally – there should not be a company or other type of organization which runs the network. The network would be created through a process of social self-organisation. An important aspect of self-organisation would be provided by organizing workshops at regular intervals and having local meeting points which allow people to come together and share ideas, knowledge, skills and technologies.

The dispositif of the network commons traveled across the English Channel and found support in Germany. In Berlin, a group of people came together and started, first, a regular meeting at c-base, called Wavelöten, and soon Freifunk (free radio), an initiative to build a network commons, first in Berlin, later all over Germany.

Independently of Freifunk, a similar initiative had started in Austria, called Funkfeuer (radio beacon). Funkfeuer had the advantage of being able to start on the basis of an existing installation. In the late 1990s and early 2000s, the provider Sil had been one of the most innovative Internet companies in Europe and worldwide. As I have written in my article “Kreative Milieus” (2012), Sil was the result of a creative milieu, of the coming together of artists, hackers, designers and an entrepreneurial spirit. The company was among the first in Europe to offer fixed leased-line broadband Internet via ADSL at competitive prices, through a partnership with two other small providers called Vienna Backbone Service (VBS). In the late 1990s VBS/Sil was looking into ways of consolidating its success by moving into the wireless medium. The artist-engineer Franz Xaver, while working for Sil, created Funkfeuer, a wireless network on the rooftops of Vienna, built to professional standards. But that proved too costly for Sil, which was, after all, a commercial company. Sil abandoned the effort and for a while the antenna and router installations lay silent. Then a new initiative formed around the young computer technician Aaron Kaplan to revitalize Funkfeuer. He had actually read an early draft of my first, German book on Freie Netze, which gave him the idea. Initially, Funkfeuer also operated a free WiFi hotspot in Vienna's Museum Quarter, in cooperation with the NGO Quintessenz. The hotspot served the purpose of showing that an open public WiFi access point could be operated without submitting to regulations regarding mandatory data retention. To cut a long story short, Freifunk and Funkfeuer became a resounding success.

Both networks initially grew rapidly. Freifunk in Berlin was propelled by the lack of affordable broadband in certain areas. In former East Berlin, after German reunification, Deutsche Telekom installed a fiber-optic network called OPAL. The same story was repeated in towns and regions across the former GDR, in cities such as Leipzig. Because of the OPAL fiber network, those areas could not receive cheap ADSL broadband access. Freifunk thus had a strong argument: by joining Freifunk, people could gain fast Internet access almost free of cost. In Leipzig, Freifunk soon had 900 nodes; in Berlin, at one point, more than 1000. At the same time the German and Austrian free network communities were fervent developers of mesh network routing protocols. At first OLSR was adopted, then BATMAN was developed out of the heart of the community. Freifunk and Funkfeuer also developed organisational ideas of their own which went beyond what Consume had dreamed up. It can thus be generalized that they did not just adopt the dispositif of the network commons, but contributed to it significantly. One key difference was that Consume was very British in a certain sense, in that it had a strong libertarian or anarchist ideology at heart, which at some point becomes impractical when it comes to organizational issues. Those ultra-liberal instincts amounted almost to a fear of doing anything that might be seen as prescriptive or normative. The ideology of Consume was that the network had to grow by itself. But the reality was that the moment key protagonists of Consume withdrew from publicly advocating it, it stopped developing and then fell apart. Since around 2006-7 Consume has ceased to be a publicly recognizable entity. Freifunk and Funkfeuer, on the other hand, soon each founded a “Verein,” a registered non-commercial association which allows doing things collectively without running a business. Freifunk Germany was adamant from the very start that the “Förderverein Freie Netze” was not an umbrella organisation under which all other Freifunk initiatives had to be subsumed. And most importantly, the “Verein” was not to be confused with the function of a provider. Its role was merely to give the movement a kind of backing, by doing fundraising and giving it a voice publicly when talking to politicians and regulators.

As I have already recounted in previous chapters, while Freifunk was growing rapidly throughout the 2000s, German law – or rather German “legal practice” – had always posed a threat to the movement through the so-called “Störerhaftung.” This means that if a private person offers an open WiFi hotspot, this person can be held responsible for infringements committed by its users. There had been precedents in German law where people were held responsible for illegal filesharing over their open WiFi. This, however, was not even the main source of the problem for Freifunk. The real trouble is that there are law firms in Germany whose business model is to send threatening letters to everyone suspected of having broken the law. They send out mass letters to people supposedly running open WiFi routers, threatening to sue them unless they settle out of court by paying a certain sum. It is very rare that such a case actually comes before a court, because that is not really the intention of those law firms: their intention is to scare people into giving in and paying. Such practices troubled Freifunk activists. Therefore, the Förderverein Freie Netze created a workaround, the Freifunk Freedom Fighter Box, a WiFi router configured in such a way that it creates a VPN (a secured virtual private network) which routes Internet traffic via a provider in Sweden. The moment no data packet hits German ground, German legislation does not apply. This created a lot of publicity but also adversity: elements of the German press accused Freifunk of a lack of patriotism for going through a Swedish provider. Thus, Freifunk diversified the method and also found German providers who allowed tunneling to them. When access is provided by a bona fide provider, “Störerhaftung” does not apply, because providers, like telecoms, are not liable for violations of other laws by their users.

Providers, however, have other obligations. It was only relatively recently, in April 2014, that the EU data retention directive was struck down by a decision of the European Court of Justice. Member states are keen on reinstating a similar law which would force providers to store communications data for later use. This would be in total opposition to the values held by the free networks community, I would assume. Anyway, the issue I want to get at is that there is no ideal solution. Following the example of the Förderverein Freie Netze, many local Freifunk initiatives also formed a Verein and attained the status of an Internet service provider, which made them exempt from liability for the actions of their users.

The New Ideological Divide

In my presentation at the OpenTech Summit I argued that the belief in mesh networking technology as the “silver bullet,” the magical solution to all problems, was mistaken. While techies believed that technology would provide the fix for all problems, the real problems were of a social and political kind and not easily solvable through technology alone. I presented this with a punchline. I said that while hackers in the past had always told newbies RTFM, which stands for “Read the Fucking Manual,” I was now telling hackers my own version of RTFM, which in my case stands for “Read The Fucking Marx” (whereby Marx does not just literally mean Marx but all Marx-inspired social theory and critique). I honestly expected to get booed when I said that, but actually I was cheered on. Later I was to find out that my speech had touched on open questions in lively ideological discussions going on inside the Freifunk community.

As I found out through discussions around the barbecue at the Wireless Community Weekend (WCW) on the following day, Freifunk had gone through a specific curve in its development. After growing rapidly almost throughout the 2000s, the movement lost momentum once German infrastructure providers upgraded their networks, removing the OPAL obstacle to broadband via ADSL. As the obstacle to getting broadband disappeared, the incentive for joining Freifunk lessened, and around 2009 things started getting quiet around Freifunk. But then the Snowden affair kickstarted a new wave of free wireless networking. As people realized, through the revelations of whistleblower Edward Snowden, how widespread the snooping on their communication habits was, both by the state and by private companies, it became clear that the “free” in Freifunk had other connotations than just cheap Internet. A range of new initiatives started, especially in former West Germany, where Freifunk had not been that strong during its early years.

For instance, it was only in 2011 that Freifunk Rheinland (Freifunk in the Rhine valley) was founded. In 2013 it celebrated its 100th node; by May 2015 it had more than 1000 nodes (https://freifunk-rheinland.net/). Freifunk Rheinland understands itself as a loose connection of local networks in currently 42 towns. It has servers at major Internet exchanges and is also a member of RIPE (the regional Internet registry for Europe). In its vicinity there is also Freifunk NRW (North Rhine-Westphalia). It actually calls itself “Verbund freier Netzwerke” (https://vfn-nrw.de/uber-den-verein/), which insinuates that it is an actual umbrella organisation representing all the smaller networks belonging to it. This is a hierarchical organisation which is not in the spirit of the original idea, as is claimed in a forum post (https://forum.freifunk.net/t/freifunknrw-weiterhin-irrefuhrend/3448) which has attracted 254 responses so far.

However, to take things further, having a web-based forum is something that is anathema to most Freifunk people of the older generation. Some new initiatives present themselves to the public in a way which is not in the decentralized spirit of the original model at all. There are initiatives which present themselves and act as a kind of alternative Internet service provider. This goes deep into the way technology is used. There is a new version of the Freifunk firmware, called Gluon, which allows remote software updates. Freifunk Munich recently proudly informed its members about the successful remote upgrade of the firmware of 300 routers. Remote software upgrades of this kind fly in the face of the network commons dispositif. Some other people have even opened a Freifunk shop (http://freifunk-shop.net/) where you can buy hardware with the Freifunk firmware pre-installed. The Freifunk firmware, by the way, now exists in two main versions and more than 40 flavors.

The bottom line is that there is a new generation of activists who do not share the same set of values at all. It seems that they see the building of a Freifunk-type network as some sort of sport, proudly announcing whenever they break another quantitative landmark (1000+ nodes!). There is quite a variety of those new initiatives and one should avoid generalizing too much. Some initiatives are actually very close to the ideas of the original dispositif of the network commons. Others have barely hidden commercial aspirations. And others again, whilst operating under a non-commercial “Verein” and subscribing to the basic set of values, have condescending views of the people who participate in their networks, whom they see as end-users. Some of those differences may be based on a generation gap. Whilst it is always dangerous to classify a whole generation under this or that label just because they were born at around the same time, what seems obvious is that younger people have grown up within a neoliberal information society. They are net-savvy and naturally use the techniques of Web 2.0, but they have also been shaped by consumer society and a certain competitive attitude prevalent in the neoliberal age. This sort of edginess makes itself felt in forum postings which are outright hostile to the Förderverein Freie Netze.

The Verein created in Berlin in 2003, which has done so much for the network commons, gets denounced as an obstacle to growth. Its set of values – which can be summed up in the slogan Decentralize! – is even considered “dogmatic” or “fundamentalist.” They call Freifunk Berlin the “legacy” organization, as if it were an obsolete version of Freifunk, insinuating that the new model is better adapted to the contemporary landscape and has more efficient ways of growing networks.

Jürgen Neumann and Monic Meisel of the Förderverein Freie Netze try to counter those tendencies in a measured way. In those cases where the Freifunk logo and name are clearly abused for commercial ventures, they are working with lawyers to fight back. They try to work through the ideological differences by raising a discussion about the original values.

Memorandum of Understanding

At the Wireless Community Weekend (WCW) 2015 a “Memorandum of Understanding” (http://blog.freifunk.net/2015/memorandum-understanding) was released which summarized the original ideas of Freifunk and called for an open debate. At the same time a national Advisory Council was formed, which is meant to serve as a supervising instance in disputes about domain names and related issues. Whether that will help is not yet clear, but in the week after the WCW the general mood seemed positive, the new initiatives having been well received by the community. The Memorandum of Understanding and the Advisory Council are part of a larger change in strategy. The Förderverein Freie Netze no longer necessarily advocates the foundation of a Verein for each local initiative. They now say any group of people can start a Freifunk initiative, recognizing that the structure of those “Vereine,” so well known in Germany, tends to attract the wrong kind of people, experts in that type of community organizing. A negative example is provided, unfortunately, by Funkfeuer Vienna. This has become a tightly run organization, very inward-looking and barely communicating with the outside world. It seems significant also that Funkfeuer is stagnating: the number of nodes has been roughly the same for years, and the website has hardly any new content. While Freifunk and Guifi.net communicate with the world through thousands of channels, Funkfeuer oozes the spirit of a self-contained nerd nirvana. The prevalent attitude is similar to that of ham radio amateurs, a tightly knit group of males who like to experiment with the latest technology. Benefits to society may arise in times of natural disaster, but apart from that it is not evident if or in which ways this once so vibrant initiative participates in wider social issues.

Freifunk, on the other hand, as Elektra remarked with a laugh, has arrived in the mainstream of German society, with all the pros and cons that entails. In the region of Franken in the south, Freifunk has been adopted by the local branch of the CSU, the Christian conservative party. Local and regional newspapers are full of articles about Freifunk, not all of them positive. It seems that Germany is a more politicized society, where issues such as surveillance, privacy, network freedom and communication freedom concern a growing number of people.

All those things were debated hotly at the WCW. The spectrum of opinions stretches from those who think that routers with pre-installed firmware, serviced and maintained remotely by experts, are the way forward, to those who think that this is a consumerist ideology which has nothing in common with the original idea. The latter insist that the transfer of knowledge between core activists and users is an important facet, that there should be no Freifunk shops and no pre-installed software. If that implies that growth is much slower or even stalls, then that is the price to pay; they are convinced that upholding the original idea is much more important than quantitative success expressed in numbers of routers flashed. But is there such a thing as the original idea? Hasn't the idea of what a free network is already changed? Has not the very success of Freifunk and Guifi shown that the idea of Consume was a touch too utopian, that it needed a less fundamentalist, slightly less decentralized approach? On the other hand, a centralized structure such as Funkfeuer, based on a Verein and a close community of males with an affinity for technology, can create a network which functions as a commons for its participants but appears as a closed network to the outside world. While being a closed community, Funkfeuer can still make important contributions to the development of experimental network technology.

Conclusions

After considering all the evidence, it seems a proven fact that community networks make unique and important contributions to communications freedom and the democratization of technology. Major European networks such as Guifi and Freifunk are growing at exponential rates, which creates all kinds of stresses and problems, but this in turn is just a sign of their vibrancy. Community networks have also demonstrated that they can make invaluable contributions in poor and thinly populated areas. Those success stories, however, are precarious, always threatened by the general tendencies of neoliberal information society. The course of development of information society, which has been characterised by an incomplete paradigm shift, is itself not yet a foregone conclusion. The combination of computer networks and computational devices has potential for emancipation and empowerment, but also for repression, exploitation and disenfranchisement. Within that scenario, the relation between society and technology is a key issue. Wireless community networks have the potential to close the digital divide and further knowledge about ICT in society, creating more sensibility about how people can make better use of those technologies. But this social aspect is not universally shared by all activists. Even in the world of free networks, where free and open source software is used, elitist attitudes sometimes prevail, which only reinforce other social divisions of wealth, class, education and gender. Therefore, the dispositif needs to be fundamentally revised. It needs to be spelled out what a free network is.

In my view, and this is my real conclusion from my engagement of more than ten years, it is more productive to ditch the notions of free networks and wireless community networks and to speak of the Network Commons. The Network Commons is not a solution that already exists, but something which has yet to be defined. Past attempts at defining the free in free networks centered on the so-called Pico Peering Agreement, but this was very minimalistic, overly determined by English anarcho-libertarianism. What is now needed is a definition of the Network Commons in a strong and normative sense, something that can be read as a kind of constitution and even be turned into a legally binding and accepted license, such as the Creative Commons license package. The idea of the network commons stood at the beginning of this book and stands also at its end. It is not a ready-made solution but an open horizon to be explored.