The web's greatest minds explain how we can fix the internet

Tim Berners-Lee, Jimmy Wales, Wendy Hall and more on how we can reset the net

When Sir Tim Berners-Lee invented the world wide web in 1989, he designed it to be open. Anyone could use it to make a website, link to others and connect with people around the world. It promised a decentralised utopia of information sharing. But the web today is a very different place from what it was 28 years ago. Internet giants such as Facebook and Google now dominate the landscape, bolstered by the huge amounts of data they collect in return for providing free services. (As the saying goes: if the service is free, you are the product.) These platforms control much of what we see and do on the web, having transformed the way online industries from journalism and advertising to political campaigning work. And they contribute to the worrying trend of "fake news", helping to spread misinformation with algorithms that prioritise showing users information likely to provoke an emotional response over any concern for accuracy.

"For 20 years it was a reasonable assumption that if you kept [the web] open, great things would happen: Wikipedia, blogs, good things," says Berners-Lee, speaking at the London headquarters of the Open Data Institute, which he co-founded with Nigel Shadbolt in 2012. "Then people look at last year and think, whoa - to a certain extent they don't immediately assume that if it's on the web it's good." Now, he says, it's time for a mid-course correction.

As well as his role as director of the World Wide Web Consortium, Berners-Lee leads Solid, an MIT project that proposes one solution: decoupling applications from the data they produce.

Most companies offer an all-or-nothing approach to data sharing: to access their services, we must agree to hand over our data. As well as limiting our control over how that data is used, the model entrenches web monopolies by exacerbating the effect of vendor lock-in: Facebook owns the data you've produced while using it.

An app built using Solid architecture would ask users where they want to store their data. You might decide to designate your private cloud storage for social media, and a work server for professional projects. You'd retain ownership over data, and any app following Solid standards would need to ask for access.
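The decoupling the article describes can be sketched as a simple pattern: the app holds no data of its own, only a reference to a user-chosen store, and it must be granted access before it can read or write. A minimal illustration in Python follows; the `Pod` class and its method names are invented for illustration and are not the actual Solid API.

```python
# Hypothetical sketch of the Solid idea: data lives in a user-chosen
# "pod", and every app must be granted access before it can touch it.
# Class and method names are illustrative, not the real Solid API.

class Pod:
    """A personal data store owned by the user, not by any app."""

    def __init__(self, owner):
        self.owner = owner
        self._data = {}          # e.g. {"social/posts": [...]}
        self._grants = set()     # app names the owner has approved

    def grant(self, app_name):
        """The owner, not the app, decides who gets access."""
        self._grants.add(app_name)

    def write(self, app_name, key, value):
        if app_name not in self._grants:
            raise PermissionError(f"{app_name} has no access to this pod")
        self._data[key] = value

    def read(self, app_name, key):
        if app_name not in self._grants:
            raise PermissionError(f"{app_name} has no access to this pod")
        return self._data[key]


# The user points different kinds of data at different stores.
personal = Pod(owner="alice")
work = Pod(owner="alice")

personal.grant("social-app")     # the social app may use the personal pod...
work.grant("project-tracker")    # ...but only the tracker sees work data

personal.write("social-app", "posts", ["hello, web"])
print(personal.read("social-app", "posts"))
```

The point of the sketch is that ownership and access control sit with the pod, so any compliant app can be swapped out without the data moving with it.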

This would also give you access to all of the data you create. We don't have much control over how our data is used, yet we are also limited in what we can do with it ourselves. Berners-Lee gives the example of fitness-activity data: rather than it being locked up with a company, we should be able to decide whether or not to share this information and with whom. "If you can't read it, it should be because I've decided that you shouldn't read it - not because our machines won't talk to each other," he says. An app on the Solid platform could pull in your own data, plus any that others have shared with you. "[It's] much more powerful for you as a user, because you can integrate all the data that you have got access to," he adds.

Decoupling data from apps could also offer advantages to developers. They wouldn't need to spend as much on building a backend to store data, so scaling up wouldn't be such an issue - potentially levelling the playing field against the big platforms.

Take AI algorithms: what if an individual could run an AI product across their personal data - plus any other data they have access to - without having to share it? An insurance firm could send you a programme to run on your health data in order to offer a better quote, for example - but the calculations would happen on your machine and the company wouldn't get to take the data away with them. "This isn't somebody else running an AI on you; that's you running an AI on the whole planet - everything you can see, everything to do with you," Berners-Lee says.
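The pattern Berners-Lee describes inverts the usual flow: the company's code travels to the data, and only the derived result travels back. A toy sketch of the insurance example follows; the function name, the data fields and the scoring formula are all invented for illustration.

```python
# Sketch of the "code comes to the data" pattern: the insurer ships a
# scoring function to run locally, and only the quote, never the raw
# health record, leaves the user's machine. All names and the formula
# here are hypothetical.

def insurer_quote_model(health_record):
    """The insurer's program, executed on the user's own device."""
    base_premium = 500.0
    # Cap the activity discount at 300 minutes/week.
    discount = min(health_record["weekly_active_minutes"], 300) * 0.5
    return round(base_premium - discount, 2)


# Private data that stays on-device.
my_health_data = {"weekly_active_minutes": 240, "resting_heart_rate": 62}

# Run the company's code locally; share only the single derived number.
quote = insurer_quote_model(my_health_data)
print(f"Quote to send back: {quote}")  # the raw record is never sent
```

The design choice is that the insurer learns one number it needs to price the policy, while the health record itself never crosses the network.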

Developers also wouldn't have to worry about hackers breaking into centralised databases to steal data. This might be particularly attractive in light of regulation changes, such as the EU's General Data Protection Regulation (GDPR), which comes into force in May 2018 and places new obligations on organisations that collect personal data. In this sense, data, Berners-Lee says, may be less like the "new oil" and more like the new nuclear waste. "Companies will realise that every piece sitting in its system is potentially stealable," he says.

Berners-Lee is eager to see which community will be first to adopt Solid principles. Perhaps it will be coders or people who share a common privacy concern, such as journalists or lawyers. "I can imagine in a few years, a law company could say, 'We have a Solid server; we don't care which apps you use, they just better store everything on this server because it's illegal for us to store it off of it,'" he says.

Although changing the dynamics of data sharing could weaken the chokehold of today's web giants, there's no technological cure for everything. We can try to prevent the purveyors of fake news from gaming the system, but we can't stop them creating it or people believing it.

That is a more fundamental problem, which Berners-Lee never expected. We need to go back to basics, he says: "It's about re-establishing facts, which means re-establishing data and science as the basis for democracy."

Jaron Lanier: Save the internet - but change the business model

Something has gone very wrong: it's the business model. And specifically, it's what is called advertising. We call it advertising, but that name in itself is misleading. It is really statistical behaviour-modification of the population in a stealthy way. Unlike [traditional] advertising, which works via persuasion, this business model depends on manipulating people's attention and their perceptions of choice. Every single penny Facebook makes is from doing that, and 90 per cent of what Google makes is from doing that. (Only a small minority of the money that Apple, Microsoft and Amazon make is from doing that, so this should not be taken as a complete indictment of big tech.)

That business plan is precisely the nexus of evil in our time. And it must be ended. It is not a survivable business plan.

The behaviourist BF Skinner designed an experimental box for conditioning animals in laboratory experiments. A person in a Skinner box has an illusion of control, but is actually controlled by the box, or by whoever is behind it. In this case, the boxes are algorithmic. Because people are not physically contained in the Skinner box, you have to keep them attentive to the device. The only way to do that is to create a continuous sense of urgency, and that can only be achieved through conflict and danger. So intrinsically, the business plan breaks the world apart, including any efforts to stop it from doing so.

If you look at specific occurrences of evil, they usually happen around somebody who has figured out how to behaviour-modify other people - be it a cult leader, a dictator or a creepy religious figure. That's essentially what evil is.

Everybody likes Tim Berners-Lee. But I think what happened is that the web's one-way links (as opposed to the two-way links proposed by some early network designers) merged with a misguided ideology that everything must be free. By default, that left manipulation-for-pay as the only business plan standing.

If you offer an alternative, people will rise to it. Of the big social networks, LinkedIn is the one with the least bullying, fake news and ugliness. The reason is obvious: people can use LinkedIn to further their careers, so they have a motivation other than attention. People decide not to be assholes. The key thing to do is to make sure that alternative is available.

There are other available business plans. The things that Google does, and the things that Facebook does, are actually valued by people. This idea that there's only one business plan is wrong. If you outlaw that business plan, everything else may eventually sort itself out.

As told to Oliver Franklin-Wallis. Dawn of the New Everything (Bodley Head) by Jaron Lanier is out now

Eric Jardine: The dark web is the original ideal internet

"Some would say that the dark web is what the internet was supposed to be. The dark web returns to that initial vision [of the internet] by anonymising traffic, encrypting it and therefore making it private. It allows you to get around geographically imposed restrictions on content. That is used for benign purposes, like political activists in repressive regimes using it to try to dodge state repression. It also comes with a dark side, where criminals realise this is an efficient way to do bad things. The dark web, being at an early stage of growth, may not have formed into what it will eventually be; the incentive structure may be different. But design choices are about values. So asking 'how would you redesign [the internet]?' is asking: what kind of values do you want baked in?"

Eric Jardine is a research fellow at the Centre for International Governance Innovation

Brewster Kahle: It's time to make the web a level playing field

The web is amazing. It's simple. But we have technologies now that have improved on it. Encryption, for instance. We have JavaScript, so the browsers don't just display pages, they can run distributed code safely. That JavaScript layer can perform a more sophisticated routing and content gathering function. It's time to take the steps that would have been difficult in the 90s, when Tim was getting his system deployed.

It's still early days for many of these technologies. But there are promising first starts: decentralised networks range from IPFS to ZeroNet. Bitcoin and Ethereum are also interesting developments. Those of us in the nonprofit area - the Internet Archive, W3C, ICANN, Creative Commons - and government, must ensure it's a level playing field. I would like to see the decentralised web have time to incubate. It's time to pull people together again. Let's see if we can rally around a vision.

As told to Oliver Franklin-Wallis. Brewster Kahle is an internet entrepreneur and activist

Jimmy Wales: App stores are the next frontier for net neutrality

Net neutrality is important. It’s something we need to keep an eye on, but if we’re worried about companies being able to block rival services, there are other things to focus on. We should think about the upstream model, in which, for example, Apple has complete control over what goes on to iOS devices.

Imagine if, at the height of Microsoft’s power, with the release of Windows NT, it had announced: “Due to problems with spam and viruses, in the newer versions of Windows, we’re not going to allow any software on the system unless it’s sold through a Microsoft-approved store.” That would have been unacceptable. Yet somehow it’s accepted on our iPhones. To what extent is it acceptable that we have devices where we can’t choose what software to run?

It’s partly because there’s a benefit to it. But in Wikipedia’s early days, suppose I had needed to pay a licence fee to someone. That would have been problematic. Now suppose I want to start a service that competes with something that Apple wants to do, and they won’t let me put my app on their store. I suspect we’re going to see some legal challenges around that type of market power.

As told to David Baker. Jimmy Wales is the co-founder of Wikipedia and WikiTribune

Professor Dame Wendy Hall: Before you change the system, think of the downsides

"Could there be a different internet for the rest of the world? Well, [TCP/IP co-inventor] Bob Kahn's DOI [Digital Object Identifier] alternative architecture aims to give more control over access to links and this could allow countries to create their own internet within digital borders. This would mean the system breaks down. In fact, it's already happening: China imposes web censorship and blocks sites critical of it. Iran and Russia are keen to impose censorship too. DOI would make this easier. We have to think about what we might lose. The network would be compromised. It wouldn't be an internet, it would have walls. All the things we do to access information independently of where we live would break down."

Professor Dame Wendy Hall is regius professor in computer science at the University of Southampton

This article was originally published by WIRED UK