September 10, 2025 · 5 min read

Surveillance Capitalism: The Business Model We Never Agreed To

The internet was originally designed for curiosity, exploration, and sharing; surveillance capitalism was never part of its promise.

Orchid Labs

The internet was supposed to set us free. Hyperbole, perhaps, but we envisioned a decentralized network where information flowed freely, where curious minds could explore the world’s collective knowledge, and where communication transcended arbitrary geographical and political boundaries.

Unfortunately, because we neglected to endow the internet with a native form of payment and value transfer, we’re now left with a sophisticated surveillance apparatus disguised as convenience.


This transformation didn’t happen overnight, and it certainly wasn’t part of any explicit agreement. The business model that now dominates the internet – platforms using freely contributed data to sell ads – is not the ideal manifestation of the web’s original promise.

The Bait and Switch

The early internet operated on a simple principle: you paid for access, and in return, you could explore without being watched. Internet Service Providers charged monthly fees for connectivity, and websites existed to share information, sell products, or facilitate communication. The relationship between users and services was transparent: you knew what you were paying for and what you received in return.

Then came the second iteration of the web, with its promise of free services funded by advertising. The trade seemed reasonable at first: see some banner ads in exchange for email, social networking, or search. Over time, though, these platforms recognized the value of the data they could collect. More than selling ad space, they were selling access to the world’s attention and, increasingly, its collective purchasing power and behavior.

As we wrote in earlier blog posts, “Web 2.0 business models incentivized companies to watch and track us and sell our data.” The platforms needed to know everything about us to predict what we might do next, because prediction products were far more valuable than simple ad inventory. Our data became the raw material for an entirely new kind of capitalism.

The shift was gradual, so most users never noticed they’d “become the product.” Free email services scanned messages for ad keywords, social networks analyzed relationships to decide what content to show, and search engines catalogued queries to build psychological profiles. Each service framed these intrusions as ways to “improve user experience,” while being far less open about the value it derived from our information.

The Information Economy

The tracking unfortunately goes beyond what most users expect. Thanks to the information that’s now freely available about individuals using digital services, platforms know when someone is considering a major purchase, going through a breakup, or struggling with depression.

Platforms can now predict behavior with remarkable accuracy and even manipulate emotions through algorithmic content curation, influencing decisions in ways that are difficult to notice. They know which version of an advertisement will make users more likely to buy, which news stories will make them emotional enough to share – content now commonly referred to as “ragebait” – and which social interactions will keep them scrolling longest. Society is ever more aware of this “engagement” economy, which seeks to maximize the time we spend in various applications.

Unfortunately the personal costs are becoming clear. Studies link heavy social media use to depression, anxiety, and social isolation. Political discourse has become more polarized as algorithms amplify divisive content that generates engagement. Children struggle with body image issues exacerbated by platforms that profit from insecurity. Democracy itself faces threats from micro-targeted disinformation campaigns that exploit psychological vulnerabilities at scale.

The Illusion of Consent

Perhaps most troubling, this transformation occurred without meaningful consent. When people sign up for these services, they technically agree to all this data collection, but privacy policies have become incomprehensible: most people never read them, and those who do try often can’t understand them. Facebook’s privacy policy, for example, is longer than the U.S. Constitution.

Technically users can “opt out,” but these choices are largely illusory. Try using the modern internet without Google, Facebook, Amazon, or Apple, for example, and you’ll discover how thoroughly these companies have embedded themselves in digital infrastructure. Opting out often means opting out of digital life entirely.

In short, the burden of privacy protection has been shifted entirely to users, who must navigate increasingly complex systems designed to frustrate their privacy preferences and preserve service provider revenue streams.

The Network Effects Trap

The biggest reason people continue using these platforms is what economists call network effects. Not only are these platforms good at providing value based on the data they gather, they become more valuable as more people join, making it ever harder for competitors to gain traction.

Network effects create what are called “switching costs”: the difficulty of moving to alternatives. Switching requires sacrificing convenience, a price few users are actually willing to pay.
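The intuition behind network effects can be made concrete with a toy calculation. Under Metcalfe’s law – a common rule of thumb, not a claim from this article – a network’s value grows roughly with the number of possible connections between its users, so an incumbent with a hundred times the users offers on the order of ten thousand times the value, even if a rival’s product is better:

```python
# Toy sketch of Metcalfe's law: value ~ number of possible user pairs.
# The specific user counts below are illustrative assumptions.

def metcalfe_value(users: int) -> int:
    """Approximate network value as the count of possible user pairs."""
    return users * (users - 1) // 2

incumbent = metcalfe_value(1_000_000)  # an established platform
upstart = metcalfe_value(10_000)       # a competitor with 1% of the users

# 1% of the users yields roughly 0.01% of the connections,
# which is the "switching cost" a defecting user gives up.
print(f"Incumbent offers ~{incumbent // upstart:,}x the connections")
```

This quadratic gap is why merely building a better product rarely dislodges an entrenched platform.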

The platforms understand this dynamic and exploit it. They acquire potential competitors before they can achieve meaningful scale, copy features from upstart rivals, and use their resources to weaken alternatives through predatory pricing or exclusive partnerships. The result is a tech oligopoly in which a handful of companies control the digital infrastructure that modern life requires – and we’re watching the same dynamic unfold in the next great platform shift: artificial intelligence.

A Different Model

But the internet’s original promise need not be lost forever. As users become aware of the effects of surveillance capitalism, they are growing more privacy-conscious, seeking alternatives and demanding more of their governments.

The tools for change already exist. Privacy-preserving technologies can protect user data while enabling useful services, cryptoeconomic systems can align incentives between users and providers, and decentralized technologies coupled with open-source innovation can serve as an effective counterbalance to the concentration of power and influence in the hands of enormous internet companies.
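As one concrete illustration of a privacy-preserving technique (our own example, not a specific product mentioned above): randomized response, a classic form of local differential privacy, lets a service learn accurate aggregate statistics while any individual’s answer remains plausibly deniable:

```python
import random

def randomized_response(truth: bool) -> bool:
    """Report the true answer half the time; otherwise answer at random.
    No single report reveals the user's truth, yet aggregates survive."""
    if random.random() < 0.5:
        return truth
    return random.random() < 0.5

def estimate_true_rate(responses: list) -> float:
    """Invert the noise: observed_rate = 0.25 + 0.5 * true_rate."""
    observed = sum(responses) / len(responses)
    return (observed - 0.25) / 0.5

# Simulate 100,000 users, 30% of whom would truthfully answer "yes".
random.seed(0)
reports = [randomized_response(random.random() < 0.30) for _ in range(100_000)]
print(round(estimate_true_rate(reports), 2))  # close to 0.30
```

The service recovers the population-level rate without ever learning which individuals answered “yes” – the useful signal survives, the surveillance does not.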

We’re contributing our piece of the puzzle in pursuit of a better internet, and we hope you’ll join us for the ride.

Thanks for reading! Subscribe for free to receive new posts and support my work.