Remember when the internet was fun? When you could scroll through your feed without feeling like you'd just survived a psychological assault? When search results actually showed you what you were looking for instead of seventeen sponsored links and a recipe blog's life story before the actual recipe?
Yeah, me neither. Because that internet is dead, and we all watched it happen in slow motion while arguing about whether the dress was blue and black or white and gold.
Here's the uncomfortable truth that Big Tech hopes you never fully grasp: the internet isn't getting worse by accident. It's getting worse by design. Every frustrating experience, every rage-bait headline, every algorithmic rabbit hole that leaves you feeling vaguely anxious and definitely not smarter—that's not a bug. That's the business model.
The Enshittification Lifecycle
Cory Doctorow coined the term "enshittification" to describe the predictable decay of online platforms, and honestly, it's the most accurate word in the English language right now. Here's how it works, and I promise you've lived through this cycle multiple times:
Phase 1: Be Amazing to Users
The platform launches. It's clean, fast, and actually useful. There are no ads, or if there are, they're unobtrusive. The algorithm shows you things you actually want to see. You tell all your friends about it. You become emotionally invested. This is the honeymoon phase, and like all honeymoons, it's designed to end.
Phase 2: Be Amazing to Business Customers
Once you're hooked, the platform pivots. Suddenly it's all about advertisers and content creators. The algorithm starts prioritizing "engaging" content (read: content that makes you angry or anxious) because that keeps you scrolling. Your feed fills up with sponsored posts. The organic reach of the accounts you actually follow mysteriously plummets.
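If you want to see how little it takes, here's a toy ranking function in Python. The signals and weights are entirely made up for illustration, this is not any platform's actual formula, but it captures the logic: every input rewards reaction, and nothing rewards your wellbeing.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author_followed: bool    # did the user actually subscribe to this author?
    predicted_dwell: float   # seconds the model expects the user to linger
    predicted_replies: float # expected comments, arguments included
    is_sponsored: bool

def engagement_score(p: Post) -> float:
    # Hypothetical weights. The point: reaction is rewarded, wellbeing isn't.
    score = 2.0 * p.predicted_dwell + 5.0 * p.predicted_replies
    if p.is_sponsored:
        score += 50.0   # advertisers pay to jump the queue
    if p.author_followed:
        score += 1.0    # "organic reach" barely registers
    return score

def rank_feed(posts: list[Post]) -> list[Post]:
    # A calm update from a friend loses to an enraging stranger,
    # because the stranger generates more predicted replies.
    return sorted(posts, key=engagement_score, reverse=True)
```

Run that over any mix of posts and the quiet update from someone you follow reliably sinks below the sponsored post and the argument-bait.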
Phase 3: Extract Maximum Value
Now that both users and businesses are locked in, the platform starts squeezing. Ads become more intrusive. Features get paywalled. The user experience degrades just enough to be annoying but not quite enough to make you leave—because where would you go? All your friends are here. All your photos are here. Your digital life is hostage.
Phase 4: Repeat Until Collapse
Eventually, the platform becomes so degraded that users start leaving for the next shiny thing. The cycle begins again.
Sound familiar? It should. You've watched Facebook, Instagram, Twitter (sorry, "X"), TikTok, and basically every other platform follow this exact trajectory.
The Outrage Economy
Here's where it gets really fun: the algorithms that power these platforms have discovered something that media companies have known for decades—anger is the most engaging emotion.
A study from MIT found that false news stories are 70% more likely to be retweeted than true ones. Not because people are stupid (well, not entirely), but because false stories tend to be more emotionally provocative. They trigger outrage, fear, or tribal loyalty—all emotions that make you want to share, comment, and argue.
The platforms know this. They've run the experiments. They have the data. And they've made a calculated decision: your mental health is less important than their engagement metrics.
Every time you see a headline designed to make you furious, every time you get sucked into a comment war with a stranger, every time you close the app feeling worse than when you opened it—that's not an accident. That's the algorithm working exactly as intended.
The Search Engine That Forgot How to Search
Google used to be magic. You'd type a question, and it would give you an answer. Simple. Revolutionary. Life-changing.
Now? Good luck finding anything useful in the first page of results. You'll scroll past:
- Four ads that look suspiciously like organic results
- A "People Also Ask" box that answers questions you didn't ask
- A featured snippet that's probably wrong
- Three results from the same website with slightly different URLs
- A Reddit thread from 2019 that's somehow more helpful than everything else
Google's search quality has declined so noticeably that people are now adding "reddit" to their searches just to find actual human opinions instead of SEO-optimized garbage. When your users are actively working around your product, you've failed. But Google hasn't failed—they've just optimized for a different metric than "being useful."
They've optimized for ad revenue. And it turns out that showing you exactly what you're looking for doesn't maximize ad impressions. Keeping you searching, scrolling, and clicking does.
The Attention Merchants
Let's talk about what's actually being sold here, because it's not software and it's not services. It's you. Your attention. Your time. Your emotional state.
The average person spends over 6 hours a day on the internet. That's not because the internet is providing 6 hours of value per day. It's because it's been engineered to be addictive. Variable reward schedules (the same mechanism that makes slot machines addictive), infinite scroll, autoplay, notification badges that trigger anxiety—these aren't features. They're manipulation techniques.
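If "variable reward schedule" sounds abstract, here's a ten-line simulation in Python. The 30% payout probability is invented for illustration, no app publishes its real numbers, but the pattern is the point: most refreshes give you nothing, a few give you something, and your brain learns to keep pulling.

```python
import random

def pull_to_refresh(reward_probability: float = 0.3) -> bool:
    """One refresh: sometimes there's something new, usually there isn't."""
    return random.random() < reward_probability

def simulate_session(refreshes: int = 20) -> None:
    # Variable-ratio reinforcement: rewards arrive unpredictably,
    # which conditions you to keep checking "just in case".
    for i in range(1, refreshes + 1):
        hit = pull_to_refresh()
        print(f"refresh {i:2d}: {'new post!' if hit else 'nothing... try again?'}")

if __name__ == "__main__":
    simulate_session()
```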
Former tech executives have been remarkably candid about this. Sean Parker, Facebook's founding president, admitted the platform was designed to exploit "a vulnerability in human psychology." Tristan Harris, former Google design ethicist, has spent years warning that these platforms are "downgrading humans" and "hacking our minds."
They're not wrong. And the fact that the people who built these systems are now warning us about them should tell you everything you need to know.
The Privacy Illusion
While we're on the subject of uncomfortable truths: every "free" service you use is monetizing your data in ways you probably don't fully understand and definitely didn't meaningfully consent to.
Yes, you clicked "I agree" on the terms of service. No, you didn't read them. Neither did anyone else. They're designed to be unreadable—dense legal documents that would take hours to parse, updated frequently, and written to maximize the company's rights while minimizing yours.
The data being collected goes far beyond what you'd expect. Your location history. Your browsing patterns. Your purchase history. Your social connections. Your typing patterns. Your voice recordings. The photos you took but didn't post. The messages you typed but deleted.
All of this is being analyzed, packaged, and sold to advertisers who use it to manipulate your behavior in ways you're not consciously aware of. And the really insidious part? It works. Targeted advertising is effective precisely because it exploits psychological vulnerabilities you don't know you have.
The Return of the Small Internet
Here's the hopeful part of this otherwise depressing analysis: people are starting to notice. And they're starting to leave.
Not en masse—the network effects are too strong for that. But around the edges, something interesting is happening. Private Discord servers are thriving. Newsletters are making a comeback. RSS feeds (remember those?) are experiencing a renaissance among people who want to control their own information diet.
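If you want to try the RSS route yourself, a few lines of Python with the feedparser library will do it. The feed URLs below are placeholders; swap in whatever sources you actually trust. The result is a reading list with no ranking model, no sponsored slots, and nobody deciding what you see except you.

```python
import feedparser  # pip install feedparser

# Placeholder feeds: replace with the sources you actually want to follow.
FEEDS = [
    "https://example.com/blog/rss.xml",
    "https://example.org/news/feed",
]

def latest_entries(feed_urls: list[str], per_feed: int = 5) -> None:
    # No engagement model, no sponsored slots: just the newest items
    # from sources you chose, in the order they were published.
    for url in feed_urls:
        feed = feedparser.parse(url)
        print(f"\n== {feed.feed.get('title', url)} ==")
        for entry in feed.entries[:per_feed]:
            print(f"- {entry.get('title', '(untitled)')}\n  {entry.get('link', '')}")

if __name__ == "__main__":
    latest_entries(FEEDS)
```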
Smaller, private communities are emerging as an alternative to the algorithmic hellscape of mainstream social media. These spaces prioritize quality over engagement, conversation over content, and human connection over viral reach.
It's not a revolution. It's more like a quiet exodus. People are building their own corners of the internet, away from the attention merchants and the outrage algorithms. They're rediscovering what the internet was supposed to be: a tool for human connection, not a machine for extracting attention and selling it to the highest bidder.
What You Can Do
I'm not going to tell you to delete all your social media accounts and go live in the woods. That's not realistic for most people, and honestly, the internet still has genuine value when used intentionally.
But here are some things worth considering:
Recognize the manipulation. Once you understand how these platforms work, you can start to notice when you're being manipulated. That angry headline? Designed to make you click. That notification? Designed to pull you back in. That infinite scroll? Designed to keep you from ever stopping.
Curate ruthlessly. Unfollow accounts that make you feel bad. Mute topics that trigger you. Use browser extensions that hide recommended content. Take control of your information diet instead of letting algorithms decide what you see.
Seek out smaller spaces. Find communities built around genuine shared interests rather than algorithmic engagement. These spaces tend to be healthier, more substantive, and more rewarding.
Pay for things. This sounds counterintuitive in the age of free everything, but if you're not paying for a product, you are the product. Services you pay for have an incentive to serve you. Services that are free have an incentive to exploit you.
The Bottom Line
The internet is getting worse on purpose. Not because the people building it are evil (though some of them might be), but because the incentive structures reward degradation. Platforms that prioritize user wellbeing over engagement metrics will lose to platforms that don't. It's a race to the bottom, and we're all along for the ride.
But awareness is the first step toward resistance. Once you understand the game being played, you can choose not to play it—or at least to play it on your own terms.
The internet we have isn't the internet we deserve. But maybe, if enough of us start demanding better, we can build something that actually serves human flourishing instead of undermining it.
Or we can keep scrolling. The algorithm doesn't care either way.