Undermined, Again, This Time by AI

Humanity is standing on the edge of something that looks like progress but feels like control. It is no wonder so many people feel their agency being undermined by AI. We are on the verge of an absolute nightmare: not because AI is evil, or because machines are about to wake up, but because the incentives behind them are wrong. If we don't fight for truly open, decentralized, open-source AI, we are royally screwed.

I didn't write Undermined because I hate technology. I wrote it because I learned what happens when power concentrates behind polished interfaces and promises. I learned what happens when systems you're told to trust quietly turn against you: not with drama, not with sirens, but with terms of service and "nothing we can do." That's how it starts. Silicon Valley is not a charity. These companies are not building intelligence for humanity; they are building leverage. Control the models, control the data, control the distribution, control the narrative. That isn't innovation; it's infrastructure capture. And infrastructure capture is how you undermine a society without firing a shot.

We've seen this pattern before. Banks promised stability. Exchanges promised security. Platforms promised connection. Then came lock-in, censorship, fees, and the quiet reminder to trust them while they decided what you could access and what counted as truth. AI will not just recommend videos or write emails. It will shape perception, influence elections, and determine who gets hired, who gets approved, who gets flagged as "high risk." Now imagine that power sitting in the hands of a handful of corporations whose legal obligation is shareholder profit, not your sovereignty, not your freedom, not your long-term survival. Profit.

When AI remains closed, centralized, and opaque, we aren't building innovation; we're building digital feudalism. You won't argue with the king. You'll argue with the algorithm, and you won't know who wrote it, what data shaped it, or what bias lives inside it. That is how societies get undermined: not overnight and not loudly, but slowly, invisibly, and permanently.

Open, decentralized AI is not a technical preference. It is a defensive posture. Open models mean transparency. Decentralization means resilience. Community governance means no single boardroom controls the direction of human thought. It may slow things down and it will absolutely reduce profit margins, but it also reduces the chance that one concentrated power structure quietly reshapes reality.

Undermined was a story about misplaced trust. About believing systems were neutral when they were incentive-driven. About learning that centralized control always favors the controller. AI is the next such system. The next decade will determine whether it becomes a tool of empowerment or the most sophisticated control mechanism ever built. Once intelligence centralizes, power follows. And once power follows, it rarely gives itself back.

If we don’t demand open systems now, we won’t get a second chance later. Because once the algorithm decides what’s true, who qualifies, who is credible, and who is safe, the debate is already over.

And this time, it won’t just be individuals who are undermined by AI.

It will be all of us.