About a billion seconds ago (that’s 31 years and 8 months) I had a virtual reality startup running out of the proverbial garage in San Francisco. I spent my time trying to work out how to engineer a million-dollar VR system into something that might cost less than a thousand dollars. Without that very substantial price reduction, VR would forever remain the tool of the few: the military, NASA, and a smattering of university labs with well-padded budgets.
This required a rethink of every element of a VR system, including the most important and most expensive – the generation of real-time three-dimensional images. It’s something we’ve grown so accustomed to seeing that we barely notice video games regularly producing cinema-quality imagery, or websites treating us to a bit of eye candy to hold our attention while they load the next page. Our computers can do this today because they’re both much faster and running much better software for creating those images.
A billion seconds ago, that software was being developed in two competing labs in the UK. Why there, and not in Silicon Valley? It has to do with the BBC Micro, that cheap-as-chips, first-generation ‘microcomputer’ that gave millions of Poms their first taste of computing – together with ‘The Computer Programme’, a weekly BBC series exploring the ins and outs of microcomputing. Although very bare-bones, the BBC Micro had a special mode for generating computer graphics – far more sophisticated than anything on offer from competitors Tandy, Apple, or even IBM – allowing aspiring young programmers to get stuck into image generation.
Fast-forward a decade and a bit, and those kids now had PhDs in physics and maths. Combined with their graphics nous, this gave them the perfect skillset to revolutionise the look of computing. In early 1992, Canon Labs – a UK division of the Japanese giant – shared some of that work.
But let’s step back for a moment: 1992 is effectively before the Web, which really didn’t begin to take off until the very end of 1993. So how did Canon Labs share their news before the Web? They used USENET – the original news service on the Internet.
USENET (pronounced uze-net) goes all the way back to 1980, as a technique that allowed loosely-connected computers to share and exchange ‘posts’ – messages – on a range of topics arranged in a hierarchy of ‘newsgroups’. The concept itself wasn’t at all new, borrowing freely from the ‘bulletin board’ systems (BBSes) already proliferating alongside microcomputers. Bulletin boards ran the gamut from tiny, community-based services through to larger commercial offerings such as The WELL, CompuServe, The Source, and America Online – the bulletin board minnow that swallowed the Time Warner whale.
USENET leveraged the growing number of university-based ‘minicomputers’ – medium-scale machines with lots of ‘terminals’ attached, so that tens, even hundreds of users could access the central system simultaneously. Using USENET’s software – which originally traded posts over dial-up UUCP links, and later via NNTP, the Network News Transfer Protocol, once machines were on the Internet – minicomputers could ‘ring’ one another, asking for the latest news posts while sending along their own. As these machines might make that call only once a day – this being some years before every computer was continuously connected to the Internet – it could take a day or two for a post on one machine to make its way across USENET, onto thousands of machines. But it eventually would, as machines rang one another, synchronising their newsgroups.
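That ‘ring up, swap what’s new’ exchange survives in the modern protocol. As a rough illustration – a minimal sketch only, with a placeholder server name – here’s how a client pulls fresh posts over NNTP using Python’s standard-library nntplib (which shipped with Python up to version 3.12):

import nntplib
from datetime import datetime, timedelta, timezone

# nntplib ships with Python up to 3.12 (removed in 3.13).
# news.example.com is a placeholder, not a real server.
with nntplib.NNTP("news.example.com") as server:
    # NEWNEWS asks the server which articles in a group have arrived
    # since a given moment -- the heart of the store-and-forward sync.
    since = datetime.now(timezone.utc) - timedelta(days=1)
    _, new_ids = server.newnews("sci.virtual-worlds", since)

    # Fetch each new article by its globally unique Message-ID; a real
    # peer would store it, then offer it onward on its next call.
    for message_id in new_ids:
        _, article = server.article(message_id)
        print(message_id, "-", len(article.lines), "lines")

Many servers disable NEWNEWS for ordinary clients, but the shape of the conversation – ‘what’s new since we last spoke?’ – is exactly the one those minicomputers had down dial-up lines.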
That’s a big deal: an individual working at one computer in one corner of the world could post to USENET, reaching hundreds of thousands of others, everywhere in the world. More than a decade before the Web turned everyone into a publisher, USENET already had. Its proliferation of newsgroups – topic areas identified with a taxonomy such as sci.math (for maths nerds), soc.motss (for ‘members of the same sex’, the first queer space on the Internet) or comp.lang.python (for aficionados of the Python programming language) – meant that anyone could ‘find the others’, locating their own community within USENET. If a community didn’t exist, a USENET user could – via a democratic vote – create a new one.
USENET became the early Internet’s ‘killer app’ – just as popular as email, and often more useful. That was certainly the case for me: as a regular subscriber and poster to sci.virtual-worlds, the newsgroup for VR research, I read the Canon Labs posting about their software-based approach to computer graphics, which promised a 1000x speedup without any expensive hardware. They claimed that all of that breakthrough graphics performance could be had on a garden-variety PC.
USENET had a reputation as a ‘spicy’ environment. Conversations could grow very heated and, occasionally, downright nasty. Most of the bad behaviours we see on today’s social media sites have their origins on USENET. Plenty of old-timers on sci.virtual-worlds, quick to judge, dismissed Canon Labs’ claims as a bad joke. “Someone posted this to prank us,” one replied. Few believed it.
Fortunately, I’d already seen the potential of software-based computer graphics – so I did believe it, and reached out to Canon Labs UK. A few weeks later, when they visited Canon Labs in Palo Alto, California, I was invited down for a private demo. They hadn’t lied; their software really did turn a thousand-dollar PC into a hundred-thousand-dollar graphics workstation. They had invented one of the things I needed for inexpensive VR.
That moment – just before the Web came on the scene – marked ‘peak USENET’. Hundreds of thousands of users posted tens of millions of words across many thousands of newsgroups. Something for everyone meant everyone used it – everyone who had access, that is, which meant highly technical individuals (such as myself), folks at Internet-connected universities (of which there were many), or employees of Internet-connected companies (a handful of big technology firms and defence contractors).
The Web slapped a pretty interface onto the Internet, providing a platform for universal publishing. That lowered the barrier to entry for USENET’s competitors; in the years that followed, sites like Slashdot, Digg and Stack Overflow offered rich forums for the technically inclined, while services such as America Online and Yahoo! provided forums for fans of a broad range of other topics. The Web made it easy to create and curate communities, and with that, the case for USENET collapsed. I remember finally signing off sci.virtual-worlds somewhere around mid-1998. It simply wasn’t interesting anymore – everyone had migrated to the Web to talk about their work.
USENET entered a long period of decline; for a long time Google Groups supported USENET, but it’s now just an archive of old postings. A network of USENET servers kept chugging along, but by the mid-2000s those machines had been overwhelmed by ‘alt.binaries’ groups – posters exchanging massive files of pirated films and television shows. No university or business wanted to handle that illegal traffic, so nearly all of them eventually pulled the plug. By the 2020s, USENET was on life support, and no one expected it to last much longer.
That may be changing. A recent report in The Register speaks of a ‘USENET revival’, with the 2020 re-establishment of its governing board – the folks who regulate the creation of new groups. Why? In an age when news has become increasingly centralised – driven by algorithm, agenda and avarice – a decentralised system for the propagation of news and commentary looks like a solution whose time may have come again. We need ways to keep in touch that can’t easily be interfered with; maybe not here, and maybe not all the time, but somewhere, at least occasionally. It’s this capability we need to preserve – for when it might be needed: a distributed, Internet-wide safety net for the distribution of news.
USENET connected me to Canon Labs, which opened the door to a meeting with Microsoft, where I stunned them with the virtual reality software I’d built on top of Canon Labs’ graphics engine, running on a garden-variety PC. Within a few months, Microsoft had purchased RenderMorphics, the UK competitor to Canon Labs, integrating its Reality Lab engine into Windows as ‘Direct3D’ – an essential component of every PC and Xbox – enabling me to build great VR software that didn’t cost a million dollars. Without USENET bringing me the good news, that might never have happened.