In June, the father of the World Wide Web, Sir Tim Berners-Lee, took his original source code for the very first Web server, “wrapped” it in a secure bit of digital packaging, brought it to the auction house Sotheby’s, and put it up for sale in a lot named, appropriately, “This Changed Everything”. So it did – and not just once, way back in the early 1990s, but once again, a billion seconds later.
The code had always been free. I used it myself a few years after he’d written it when developing the first 3D interfaces to the World Wide Web. His colleague Robert Cailliau, an unsung hero of the Web, convinced the bigwigs at CERN that Tim’s code would be far better, and go far further, if no one could claim ownership or demand a licence fee. The Web has always been “open source”, free to all to use and abuse.
So how can Sir Tim, 30 years later, take that same code and put it up for auction? How can he auction something that’s free – and also pretty much everywhere, in some version, on nearly every connected device created in the last few decades? He can’t claw it back off those devices and suddenly demand payment for something that has always been free. So, what was really for sale here? And how does he take code and make it “saleable” by auction?
The answer to both questions lies in the roots of another technology, something known as the “blockchain”. Too often spoken of in mysterious and almost reverent terms, blockchains are nothing more than data structures – think of them as envelopes securely sealed shut with a bit of mathematical wizardry to ensure the contents are authentic, have not been tampered with, and can be audited. Nothing about that seems weird – for a few thousand years we’ve had property deeds that do just this. Yet, because a blockchain performs all of this entirely digitally, without ever touching the physical world, it feels a bit magical to a civilisation that equates property ownership with a paper trail. A blockchain is that paper trail, fully digitised, weightless and invisible.
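For the technically curious, the “sealed envelope” can be sketched in a few lines of Python. This is a toy, not any real blockchain – the names are invented purely for illustration – but it shows the essential trick: each record carries a cryptographic fingerprint of its own contents and of the record before it, so tampering anywhere breaks the chain and is immediately detectable.

```python
import hashlib
import json

def seal(content, previous_hash):
    """Seal a record: its hash covers both its content and its predecessor."""
    block = {"content": content, "previous_hash": previous_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

# A tiny two-block "paper trail": a deed, then a transfer of that deed.
genesis = seal("deed: plot 1 belongs to Alice", previous_hash="0" * 64)
transfer = seal("deed: plot 1 passes to Bob", previous_hash=genesis["hash"])

# Auditing: recompute every fingerprint and compare it with what's recorded.
for block in (genesis, transfer):
    recomputed = hashlib.sha256(
        json.dumps(
            {"content": block["content"], "previous_hash": block["previous_hash"]},
            sort_keys=True,
        ).encode()
    ).hexdigest()
    assert recomputed == block["hash"], "tampering detected"
print("chain verified: authentic, untampered, auditable")
```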
Berners-Lee tucked his source code for the Web within one of these wrappers, placing it onto the “blockchain” – an immutable, globally accessible record of its authenticity (yes, this is the real code), originality (every line of it is the real deal, no additions or deletions), and contents (here, look for yourself). Then, in a cultural act that has nothing at all to do with the digital realm, Sir Tim effectively “blessed” this code as the code. Because he wrapped it up in this way, he could proclaim it the genuine article.
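How can anyone check that the copy they hold really is the blessed original? Roughly like this – a toy sketch, with stand-in content rather than the actual auction lot: publish a cryptographic fingerprint of the code, and any copy, anywhere, can be tested against it.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """A SHA-256 digest: any change to the bytes changes the fingerprint."""
    return hashlib.sha256(data).hexdigest()

# Stand-in for the real source code (the actual lot is far larger).
original = b"#include <stdio.h>\nint main(void) { return 0; }\n"
published = fingerprint(original)   # this is what gets recorded immutably

# Later, anyone holding a copy can test it against the published fingerprint.
faithful_copy = bytes(original)
assert fingerprint(faithful_copy) == published       # authentic and unmodified

tampered_copy = original + b"/* one sneaky extra line */\n"
assert fingerprint(tampered_copy) != published       # any change is detectable
```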
Why go through all of these calisthenics? It turns out that other bits of data that had been wrapped – transforming them into so-called “non-fungible tokens”, or NFTs – had sold for astounding sums, culminating with the nearly A$100 million sale price for Everydays by digital artist Beeple, a collage of 5000 digital artworks that had already been distributed freely by the artist. Something could be free, yet still take its place among the highest prices ever paid for art.
In light of that precedent, it seemed obvious that Berners-Lee would make a motza (for charity) from his own source code. Raising over $7 million before the hammer fell, Tim showed that the Web – free to all – could still be bought and sold.
That’s the lesson most people took away from this: Tim got to turn his hard work into a pile of cash for some worthy causes. But there’s another piece to this, one that’s far more important – and, in light of current events, far more relevant. You see, this isn’t a story about money, or even the history of the Web. This is a story about code itself.
For much of the history of computing, code has simply been plain text, typed onto a punched card or paper tape, and later, directly into the computer via a keyboard and monitor. A program known as a compiler translates this code into instructions for the computer. That compiler detects errors in the code – most often syntax errors: typos, along with mistakes in the structure of the code itself. But no compiler has ever been able to check for logical errors – where the code instructs the computer to do the wrong thing. This means that code has always been able to crash a computer by making it do something effectively impossible. More recently, code has been written that can subtly corrupt a computer’s operations, making it prey to hackers, who can take it over, steal its data, then scramble its storage.
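A toy example makes the distinction plain – Python here, though the point holds in any language: the toolchain rejects malformed code outright, but code built around a wrong idea sails straight through.

```python
# A line like the one below is rejected before it ever runs, because it
# isn't well-formed code (a syntax error):
#
#     def average(values:  return sum(values / len(values)
#
# This version, by contrast, parses and runs without a single complaint,
# yet it is wrong - a logical error no compiler or interpreter will flag.
def average(values):
    return sum(values) / 2   # should divide by len(values), not 2

print(average([1, 2, 3]))    # prints 3.0; the correct answer is 2.0
```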
For this reason, programmers value the ability to inspect the code of a program before they run it. In a perfect world, they’d compile the source code for every program they run, and do that only after they’d had a chance to read through it, giving them some assurance that they’d avoid being hacked by a rogue program. A few operating systems – most notably Gentoo Linux – insist on making their users compile all of the system’s programs from source code, offering a certain assurance of security.
Most computers – including all of the billions of smartphones out there – have tens of thousands of programs on them, with hundreds running simultaneously, most written by commercial entities that consider their source code closely guarded intellectual property and do not offer it up for inspection to anyone outside the firm. Most code is a “black box” – you run it, and you have to trust that it does what it claims. That means code can sometimes be a wolf in sheep’s clothing.
This danger provided the rationale for Apple’s mandate that every app for its smartphones and tablets be digitally “signed” by the company. Without that signature, an app can’t be installed onto an iPhone or iPad – and an app only gets that signature after the developer submits its code to Apple for inspection. Apple uses tools to analyse the code for any potential mischief, signs the code, then puts the app into its App Store. That’s some protection – but the recent antitrust lawsuit between Apple and games developer Epic revealed that Apple has not always stopped dangerous apps from ending up on users’ devices. Instead, Apple has been caught out using its right-of-signature as a way to block competitive apps from its devices, promising safety while simultaneously reinforcing its monopoly.
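Stripped to its essentials, the signing scheme looks something like the sketch below. This is not Apple’s actual mechanism – it’s an illustration using Ed25519 signatures from the third-party Python cryptography package – but the shape is the same: the platform signs only what it has reviewed, and the device refuses anything whose signature doesn’t check out.

```python
# Illustration only: Ed25519 signatures (from the third-party "cryptography"
# package) standing in for whatever the platform really uses.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The platform holds the private key; every device ships with the public key.
platform_key = Ed25519PrivateKey.generate()
device_trusts = platform_key.public_key()

app_binary = b"...compiled app bytes..."        # placeholder content
signature = platform_key.sign(app_binary)       # issued only after review

def can_install(binary: bytes, sig: bytes) -> bool:
    """On the device: refuse anything whose signature doesn't verify."""
    try:
        device_trusts.verify(sig, binary)
        return True
    except InvalidSignature:
        return False

print(can_install(app_binary, signature))                  # True
print(can_install(app_binary + b"malware", signature))     # False
```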
Over the last year, several global-scale hacking events have been uncovered – first, the “SolarWinds” hack invaded the systems of the Fortune 100, along with more than a few US government agencies; now the “Kaseya” hack has invaded thousands of firms worldwide. Both are known as “supply chain” hacks, exploiting the fact that end-users of software unthinkingly download, install and run programs provided by a vendor – because they believe those programs are guaranteed to be safe. Hack the software provider, and all of its customers get hacked automatically. Imagine a hacker breaking into Apple’s App Store, then replacing all the apps with corrupted equivalents. When you’ve hacked the provider, you can sign the code so that it looks authentic.
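Here’s why a supply-chain compromise is so devastating, in the same illustrative terms as the sketch above (hypothetical names, toy signature scheme): once the attacker holds the vendor’s signing key, a poisoned update verifies just as cleanly as the genuine article.

```python
# Illustration only: the same toy signature scheme, now with the vendor's
# signing key in the attacker's hands.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

vendor_key = Ed25519PrivateKey.generate()     # stolen in the breach
customer_trusts = vendor_key.public_key()     # baked into every customer's install

poisoned_update = b"legitimate update" + b" + hidden backdoor"
forged_signature = vendor_key.sign(poisoned_update)   # attacker signs it themselves

# The check passes silently - to the customer, this looks entirely authentic.
customer_trusts.verify(forged_signature, poisoned_update)
print("update verified; installing...")
```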
Fortunately, there’s an answer to this most devious of hacks. It’s what Sir Tim Berners-Lee did with the most important code that’s ever been written. It changed everything, and will again.
Instead of relying on a single provider like Apple to sign code, we’ll be driven by necessity to a more comprehensive solution: a global “registry” of programs, each signed and added to the blockchain. We’re heading into a world where all code, running on all computers, has been wrapped, authenticated and audited, its safety guaranteed by the mathematics of the blockchain and vouched for by a global community of computers, each contributing to and supporting a master index of known safe software. There’s safety in numbers – perhaps enough that the sheep can fight off the wolves.
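To make that registry concrete, here is a deliberately naive sketch – a Python dictionary standing in for a distributed ledger, with every name invented for illustration: look up a program’s fingerprint, and run it only when enough independent auditors have vouched for it.

```python
import hashlib

# fingerprint -> list of auditors who have vouched for that exact program
registry: dict[str, list[str]] = {}

def fingerprint(binary: bytes) -> str:
    return hashlib.sha256(binary).hexdigest()

def register(binary: bytes, auditor: str) -> None:
    """An auditor vouches for one exact, byte-for-byte version of a program."""
    registry.setdefault(fingerprint(binary), []).append(auditor)

def is_known_safe(binary: bytes, quorum: int = 2) -> bool:
    """Safety in numbers: require several independent vouchers, not just one."""
    return len(registry.get(fingerprint(binary), [])) >= quorum

well_behaved = b"a well-behaved program"
register(well_behaved, "auditor-a")
register(well_behaved, "auditor-b")

print(is_known_safe(well_behaved))            # True - two independent vouchers
print(is_known_safe(b"never-seen binary"))    # False - refuse to run it
```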