Load a web page in 2023 and it bristles with surveillance: cookies track our comings and goings, while ‘fingerprints’ uniquely identify a specific browser on a specific computer on a specific network. All of our interactions get recorded, and can be used to build a profile that grows into a simulacrum of tastes, behaviors and reactions. Measured minutely and continually – more than at any other moment in history – what do we get from all this surveillance? Very little. But the price of surveillance, once enormous, has dwindled to the point where it has become cost-effective to put us under watch – the better to understand what to show us, and when to sell it to us.
It feels almost ironic that the planetary-scale computing apparatus we call ‘the Web’ has been reduced to such an inconsequential task. Throughout the world, vast warehouses filled with racks of equipment burn through gigawatts of electricity – all to offer up a slightly cheaper hotel room.
The Web we have today – a behemoth using surveillance as the handmaiden of commerce – may look like the only possible way things could have happened. We built this fantastic machine – but were we destined to see it reduced to a sidewalk spruiker? Or did history offer us multiple paths to this future, with vastly different outcomes? Did choices made a generation ago shape the world we inhabit today?
In the beginning, the Internet – which predates the Web by more than twenty years – got its funding from the US military, through its Defense Advanced Research Projects Agency (DARPA). The only users of that network were a few universities and US defence contractors. Across a multi-year project to build up its core technologies, the Internet slowly added more universities, research facilities, and defence firms.
None of that research work had any commercial focus – quite the opposite: the network had an ‘Acceptable Use Policy’ that forbade any commercial traffic. You could not use the Internet to buy or sell anything. That didn’t matter to any of the users of the Internet – who mostly used it to communicate with one another using a new tool – ‘electronic mail’ – or participate in discussion groups on USENET, which I explored in my last column.
By the early 1980s the Internet had grown enough to necessitate a ‘split’ between the explicitly defence-oriented parts of the network – these became MILnet, then promptly disappeared behind security firewalls – and the rest of the Internet, which focused on research and educational uses. The larger of the two, under the aegis of the US National Science Foundation, became NSFnet. NSFnet retained the ‘Acceptable Use Policy’ of the early Internet – no commercial activity was permitted. Explicitly commercial organisations couldn’t even connect to NSFnet until a rule change in 1989 allowed them to use the network – but only for non-commercial purposes.
Those commercial organisations quickly saw the commercial potential of the Internet, and began a sustained lobbying effort to get changes to NSFnet’s Acceptable Use Policy. As a government-funded entity, NSFnet couldn’t turn a deaf ear to the demands of well-funded and highly influential companies – nor could the US Congress. In a series of rule changes and legislative initiatives from 1991 to 1995, NSFnet gradually removed the commercial restrictions in its acceptable use policy, opening the Internet to commercial firms for commercial purposes, and opening the network itself to for-profit providers of Internet access. That transition to commercial providers proved so successful that we almost never think about the Internet anymore, except in those rare moments when the network goes out at home, in the office, or on our mobiles.
Fortuitously, this transition to commerce happened at precisely the moment in time when the World Wide Web – a service that sits on top of the Internet – began to take off. The Web had non-commercial beginnings at CERN, the gigantic atom-smasher that nestles at the foot of the Alps just outside Geneva. Originally conceived as a way to allow the global physics community to share results and research across a vast and largely incompatible array of high-performance computers, the Web soon proved itself even more vital as a universal information resource. More than just the raw data and reports and papers, it encompassed all of the documentation and notes on the technologies, tools and software that physicists used to do their research.
The Web is both a technology – one that, at essence, allows any two computers anywhere to exchange data – and an idea that a single ocean of information can be created from the many independent pools of data on the multitude of computers scattered around the world. A single pool can be useful, but that ocean had been the dream of thinkers all the way back to Leibniz, who envisaged a “calculus ratiocinator” – a universal thinking machine with an infinite supply of knowledge to work with – in the late 17th century.
It’s that dream of an ocean of universal knowledge – recovering the legendary Library of Alexandria, destroyed multiple times between the first century BCE and third century CE – that brought four hundred researchers to CERN at the end of May in 1994, answering the call of Tim Berners-Lee, the ‘father’ of the Web. I sat among them, in CERN’s main theatre, watching as Tim made a brief opening statement, then introduced the opening keynote, by Dr David Chaum.
Very little documentation of this First International Conference on the World Wide Web exists. The Web, our medium for managing the memory of such events, barely existed. The smartphone was more than a decade in the future, and digital cameras remained expensive rarities. In all the years since, I’ve only found a handful of photos from the event. Nothing at all remains from the many talks delivered over the three days of the conference – except for one: Chaum’s talk, now on YouTube.
I certainly had no idea what Berners-Lee had planned for this opening keynote, and that video won’t be winning any awards; I doubt whoever sat behind the camera lens had any idea that they would be the first to film a commercial transaction happening over the Web. In the core 90 seconds from Chaum’s talk, you hear him talking about the need to pay for content on the Web, while he demonstrates a fully electronic, Web-based system for payments. In that room at CERN, Chaum gave us the first public demonstration of ‘digital cash’.
I’d already been doing Web research for six months and at no point had it occurred to me that the Web could be a commercial environment. I’m fairly certain that no one in the room – other than Berners-Lee and Chaum – had seriously entertained the possibility that the Web enabled commerce. But by the end of Chaum’s presentation, every person in that room saw the Web as not just the Cathedral of Knowledge – a second chance at the Alexandrian Library – but as the doorway to a planet-spanning bazaar.
The Web might have eventually found a commercial trajectory; Berners-Lee’s curation – putting Chaum’s digital cash demo at the front of the conference – moved the Web’s goalposts from library to storefront. In so doing, it also set in motion the commercial imperatives for Web-based advertising, followed by advertising analytics, then tracking, then profiling – and now, the full suite of surveillance capitalism. Would that have happened as quickly – or at all – if Tim had used the top spot to highlight a project that emphasised community, contribution, and knowledge-sharing? We’ll never know. But, judging from Tim Berners-Lee’s efforts since that day – in particular, the Solid project, which works to roll back many of the surveillant characteristics of the modern Web – he may harbour second thoughts.
NSFnet bowed to political and commercial pressure to become the modern, commercial Internet. Berners-Lee grasped the nettle directly, charting a course for the Web which placed commerce at its heart. Decisions made 30 years ago carried a price – one we’re still paying today.