By Jason Dorrier
Aug 25, 2019
We’re a long way from the halcyon days of the early internet, when the promise of a decentralized digital communications network meant anyone could talk to anyone without appealing to the gatekeepers. Knowledge would be liberated from elite newsrooms and stodgy old encyclopedia sets. “Information wants to be free” was the slogan of the day.
We’ve since learned “free” information often has a price.
Fast-forward to 2019: science fiction author Neal Stephenson’s novel, Fall; or, Dodge in Hell, traces a near future in which the internet is nicknamed the “Miasma.” The digital haves and have-nots are separated by who can afford human “editors” to curate the web, filter out the rubbish, and guard their employers’ identities and information online. There are those privileged enough to float above the noise, selectively dipping in for reliable information, and those drowning in, and being driven mad by, the mass of disinformation and propaganda lurking below.
Science fiction reflects the cultural moment as much as it predicts the future, and Stephenson’s vision extrapolates and intensifies today’s challenges just plausibly enough to be unsettling.
Where’s all this headed? Can we still turn things around and recapture that early dream?
In a talk titled “What If the Internet Was Safe?” at Singularity University’s Global Summit in San Francisco this week, Doc Searls, editor-in-chief of Linux Journal, and Richard Whitt, president of the GLIA Foundation and founder of the GLIAnet Project, took a stab at an answer.
According to Searls, we need to adjust how we look at the online world. Asking how to make the internet safe is like asking how to make gravity safe, he said. “The internet is elemental. It’s a genie that’s not going back in the bottle.” We can’t make the internet safe any more than we can gravity, so instead, it’s our job to civilize it.
Life in the Wild
Whitt, a former Googler, suggested the current situation, in which a lack of trust and accountability reigns, was brought about by four mostly familiar root causes.
Network effects. Everybody wanted to be on Facebook because their friends were there. The more people there were on the inside, the more FOMO on the outside. Everyone piled in, until the company had captured a significant fraction of the world’s population in its products. Google search improved the more people used it, and the more it improved, the more users it attracted. These effects bestowed quick growth and dominance on today’s internet giants.
Web inputs. Users yield behavioral data (knowingly or not) that draws a picture of who they are, and what they like and dislike. Users also freely create content that draws in other users or can be mined for data. That is, users supply the inputs and outputs that drive the business.
The attention economy. A healthy platform is a platform stocked with highly engaged users. Companies are constantly questing after the next click and trying to figure out how to ensure users stick around. Strong engagement is driven by emotion—joy but also anger and fear—which has led to some of the “pernicious societal issues which we’re still grappling with.”
Platform dynamics. Digital platforms are still a fairly new type of business—and a potent one at that. In the last two decades or so, economists have shown that platform businesses are uniquely powerful in a way the old, more familiar models can’t match.
But simply attracting a billion users onto a platform doesn’t a business make. Google figured out it could pair ads with search back in 2002; Facebook and many others followed suit. It’s the self-reinforcing interplay of these trends and practices, and the incentives in the ad-based business model, that has resulted in some of today’s less-than-desirable outcomes.
That doesn’t mean we’re stuck with the system we’ve created. “This is all something [that’s] very new in the unfolding of this technology, and I would submit there’s ample room for us to say, you know what, we want to shift course a little bit and do something different,” Whitt said.
Take the Power Back
If the power is with the platforms today, the solution may be to invert that dynamic. In the current model, users are objects from which data is extracted and analyzed. A more people-friendly model would see us become clients and customers and regain some agency.
As Whitt wrote in a Fast Company article, “…what if users had the same power as platforms? What if users had a whole layer advocating for them—an arsenal of sophisticated tools to swat away invasive ads, safeguard their personal data, and negotiate fiercely with platforms?”
Whitt’s GLIAnet Project aims to build an ecosystem of such tools.
One component that might enable this new agency would be the development of personal, local AIs that act on our behalf, as opposed to platform AIs, like Alexa or Google Assistant, that collect our data, store the information on their servers, and share it with vendors. These personal AIs would be like “virtual envoys” that interact and negotiate with platform AIs.
Further, our online profiles—which websites we’ve visited or what we’ve purchased—would no longer be free for the taking but stored locally. These profiles and the data in them could then be far more selectively shared (in part or whole) in exchange for services provided online.
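To make the idea concrete, here is a minimal sketch of what locally stored, selectively shared profile data could look like. The names (PersonalVault, allow, share_with) and structure are illustrative assumptions, not part of any GLIAnet specification.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: profile data lives locally, and each vendor
# sees only the fields the user has explicitly granted.
@dataclass
class PersonalVault:
    data: dict = field(default_factory=dict)    # e.g. browsing or purchase history
    grants: dict = field(default_factory=dict)  # vendor -> set of permitted fields

    def allow(self, vendor: str, *fields: str) -> None:
        """Grant a vendor access to specific profile fields only."""
        self.grants.setdefault(vendor, set()).update(fields)

    def share_with(self, vendor: str) -> dict:
        """Return only the slice of the profile this vendor may see."""
        allowed = self.grants.get(vendor, set())
        return {k: v for k, v in self.data.items() if k in allowed}

vault = PersonalVault(data={"purchases": ["running shoes"],
                            "sites_visited": ["example.com"]})
vault.allow("shoestore.example", "purchases")
print(vault.share_with("shoestore.example"))  # {'purchases': ['running shoes']}
print(vault.share_with("adnetwork.example"))  # {}
```

The point of the design is that disclosure is opt-in per vendor and per field, rather than a blanket grant made once at sign-up.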
Some tools already exist, such as ad-blockers and VPNs, but they aren’t always easy to set up and use, and they’re standalone. The GLIAnet Project hopes to gather everything together into a more user-friendly, all-encompassing ecosystem that’s more or less plug-and-play.
With his project, Customer Commons, Searls is similarly working to shift the power dynamic.
Every app and online service you sign up for requires you to agree to its terms of service. Just think of the hundreds of contracts you’ve automatically checked the “agree” box on over the years. These agreements are long, legalistic, and variable. No one reads the fine print, and most don’t have the expertise to understand all the legal implications even if they did.
What if, instead of having terms presented to us, we presented our terms to companies? “You can show me ads, but don’t personalize them, and don’t track me. These are my terms; if you agree to them, I’ll happily use your service.” Then you have one set of terms that you understand because you set them, and you’d never have to check another “agree” box again.
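The example in the quote above could be expressed as one machine-readable set of user terms that every service is checked against. This is a hedged sketch; the term names below are illustrative and not a real Customer Commons vocabulary.

```python
# Hypothetical user terms, set once: "You can show me ads, but don't
# personalize them, and don't track me."
MY_TERMS = {
    "ads": True,             # ads are acceptable...
    "personalization": False,  # ...but not personalized ones
    "tracking": False,         # ...and no tracking
}

def service_acceptable(service_practices: dict) -> bool:
    """A service is acceptable only if every practice it actually
    engages in is one the user's terms permit."""
    return all(MY_TERMS.get(practice, False) or not used
               for practice, used in service_practices.items())

print(service_acceptable({"ads": True, "tracking": False}))  # True
print(service_acceptable({"ads": True, "tracking": True}))   # False
```

With terms expressed this way, the check can run automatically at sign-up, replacing hundreds of one-off “agree” boxes with a single policy the user actually wrote.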
Back to the Land
This isn’t about making the internet safe or stumbling on the perfect regulations. It’s about taking the internet at face value and inventing the tools to live there more comfortably.
The natural world isn’t safe. There’s not much we can do to change that. Instead, we’ve adapted to nature with personal technologies like clothes and shelter that provide protection from the elements. Now we live in nearly every clime, and instead of being constantly under threat by the natural world, we enjoy and find inspiration out in the wild.
Why not apply this approach to the internet too?