Surveillance Capitalism
Every moment you're online, someone is listening. You know how it goes — you mention a product or a place to a friend, and then suddenly an ad for it shows up on your phone like it read your mind. It feels like your mic must be on all the time.
But here’s the thing: it probably isn’t. At least, not always. The truth is even weirder. They don’t need your microphone. Your phone is already bleeding data — your location, your searches, who you hang out with, how long you linger on a post — and the algorithms are so dialed in they can predict what you’re thinking before you even say it out loud.
Gentleponies, this is surveillance capitalism made manifest.
So what is surveillance capitalism, exactly? It's not just ads. It’s not just “data collection.” It’s a full-blown economic system built on turning your life into prediction fuel. Your movements, your moods, your messages, your late-night doomscroll spiral — all of it gets scooped up, packaged, and sold to whoever wants to influence what you do next.
It started with companies like Google realizing they could make more money not just by showing ads, but by predicting behavior. The more data they collected — clicks, searches, GPS pings — the more they could sell advertisers a glimpse into your future. And that quickly became the business model for the whole internet.
Now it’s everywhere. Social media, shopping apps, dating apps, your smart fridge. Every “free” service comes with a hidden cost: you're the raw material. Your attention, your habits, your impulses — that’s what’s being mined, refined, and monetized. And the better the system gets at predicting you, the easier it is to shape you.
But ads are only the beginning.
Once you can predict behavior, the next logical step is to influence it. And if you can influence behavior at scale, you're shaping elections, social movements, entire cultures, uprisings, even genocides.
Facebook (sorry, Meta) didn’t just enable targeted political propaganda with the Cambridge Analytica scandal — it actively helped fuel real-world violence. In Myanmar, the platform became a vector for anti-Rohingya hate speech, disinformation, and incitement to violence, helping pave the way for a brutal military-led genocide. Facebook’s own internal reports admitted it, long after the bodies piled up.
Google isn’t off the hook either. It quietly dictates what counts as knowledge, who shows up in search results, and whose voices get buried. These are the logical outcomes of a system that treats attention as a commodity and ethics as an afterthought.
The future of the planet itself is at stake here.
So what do we do? How do we cut out these new robber barons — the nouveau lords who are building Collapse Bunkers and digital HOAs and new empires?
First: understand that opting out isn’t as simple as logging off. The system is baked in. Even if you delete your social media accounts, data brokers still build “shadow profiles” of you from leaked lists, facial recognition, and the digital exhaust of your friends, your coworkers, your phone just being on. You don't need Facebook installed on your phone for them to get your data; the tracking rides along in the ads your browser loads, in embedded share buttons, and in analytics scripts baked into other apps and sites.
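To make that concrete, here's a toy Python sketch of how a single third-party tracker script, embedded across unrelated sites, stitches your visits into one profile keyed on a cookie. Every name and domain below is hypothetical; real trackers are far more sophisticated, but the linking mechanism is this simple.

```python
# Toy model: one tracker script loaded on many unrelated pages.
# The embedding sites differ, but the tracker's cookie ID is the same,
# so every visit lands in the same profile. All names are made up.

from collections import defaultdict

class ThirdPartyTracker:
    """Stands in for an ad/analytics script embedded on every page."""
    def __init__(self):
        self.profiles = defaultdict(list)  # cookie_id -> visit history

    def on_page_load(self, cookie_id: str, site: str, page: str) -> None:
        # Record which page on which site this cookie just viewed.
        self.profiles[cookie_id].append((site, page))

tracker = ThirdPartyTracker()

# The "same" tracker runs on three unrelated sites you visit.
tracker.on_page_load("cookie-123", "news.example",   "/politics")
tracker.on_page_load("cookie-123", "shop.example",   "/running-shoes")
tracker.on_page_load("cookie-123", "health.example", "/sleep-disorders")

# One profile now spans your news habits, shopping, and health worries.
print(tracker.profiles["cookie-123"])
```

No social media account appears anywhere in that sketch, which is the point: the profile exists whether or not you ever signed up.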
That’s why resistance starts with refusal. Refuse the default. Use tools that don’t spy. Switch to open-source software. Use browsers that block trackers, email providers that don’t mine your inbox, messaging apps that encrypt by default. We need to go on the defensive.
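For a rough sense of what tracker-blocking browsers and extensions do under the hood, the core move is just matching every outgoing request against a deny-list before loading it. This is a minimal sketch with made-up domains, not a real blocklist or a real browser API.

```python
# Minimal sketch of deny-list request blocking, the mechanism behind
# tracker-blocking browsers and extensions. Domains are hypothetical.

BLOCKLIST = {
    "tracker.example",
    "ads.example",
    "pixel.example",
}

def should_block(url: str) -> bool:
    # Strip the scheme and path to get the host, then match it against
    # the blocklist, including subdomains (e.g. cdn.tracker.example).
    host = url.split("//", 1)[-1].split("/", 1)[0]
    return any(host == d or host.endswith("." + d) for d in BLOCKLIST)

print(should_block("https://tracker.example/collect?id=123"))  # True
print(should_block("https://news.example/article"))            # False
```

Real blockers layer on pattern syntax, cosmetic filtering, and huge community-maintained lists, but a blocked request is still just a request that never gets made.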
But individual action alone isn’t enough. This is structural. It needs sabotage, regulation, collective pushback. We need laws with teeth, not PR fluff. We need networks that serve people, not profit. We need to treat data extraction like pollution — toxic, corrosive, and in urgent need of cleanup.
They built a panopticon, but we can tunnel underneath it. With encrypted meshes, we can build networks of human-level trust where the corpos can’t see. We may not be able to beat them at their own game, but we can build a new one.
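As an illustration of the core property an encrypted mesh relies on: whoever relays the message sees only ciphertext, and only the key-holders can read it. The sketch below uses a one-time pad because it fits in a few stdlib-only lines; real encrypted messengers use authenticated key exchange and vetted ciphers, not this.

```python
# Toy end-to-end encryption via one-time pad (stdlib only).
# Illustrative only: real tools use authenticated key exchange and
# AEAD ciphers. The property shown is the one that matters -- the
# relay in the middle carries bytes it cannot read.

import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    assert len(key) >= len(plaintext), "pad must cover the whole message"
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

# Two peers share a key out-of-band -- in person, human-level trust.
shared_key = secrets.token_bytes(64)

msg = b"meet at the usual place"
wire = encrypt(shared_key, msg)           # what the relay actually sees
assert decrypt(shared_key, wire) == msg   # only key-holders recover it
```

The trust model is the interesting part: the key never touches the network, so there is nothing in the middle worth surveilling.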
COMING SOON - How I'm going to try to implement that.