Intelligence Is Not the Bottleneck. Power Is.

The March 2026 conversation between Senator Bernie Sanders and Claude made something very clear. We are not lacking understanding. We are lacking alignment between what we know and what we are willing or able to do about it.

A sitting United States senator asked one of the most advanced AI systems in the world about privacy, data collection, and democracy. The system answered directly.

It explained how data is collected, how profiles are built, and how those profiles are used to predict behavior, influence decisions, and drive profit. It also acknowledged that companies cannot be trusted to protect that data without safeguards that do not currently exist.

None of this information is new. What is new is how clearly it was stated and how little that clarity changed anything.

The system is not confused. It did not hesitate or hide behind technical language. It named the mechanics plainly. Browsing behavior, location data, purchase history, and engagement patterns all feed into systems that build detailed profiles about individuals who never fully consented to that level of tracking. Those profiles are not passive. They are used to predict what people will do next. They are used in advertising, in politics, and in systems that shape access and opportunity.

This is not theoretical. This is infrastructure.

The conversation also made something else clear. This is not a knowledge problem. When the discussion shifted to regulation, the answer came down to incentives. Companies that benefit from data extraction have the resources to influence policy. Lawmakers understand the risks, but the systems required to regulate this level of technology are slow and often reactive. Even when the right questions are asked, there is no clear path to enforcement.

That is the gap. Not intelligence. Not awareness. Power.

Recent policy efforts reflect this tension. Proposals to pause the expansion of AI data centers are being introduced because of concerns around environmental impact, labor disruption, and democratic risk. Yet even those proposals are widely seen as unlikely to pass. The understanding is there. The action is not.

There is also a persistent framing problem. Privacy is often positioned as an individual responsibility. People are told to adjust their settings, read the terms, and be more mindful of what they share. That framing misses what is actually happening. This is not a settings issue. This is structural. The systems are designed to collect, analyze, and monetize data at scale. The business model depends on it. The technology accelerates it. The incentives reinforce it.

You cannot outmaneuver a system that is built to extract.

Speed adds another layer. AI systems are being developed and deployed faster than institutions can respond. What took social media years to reveal is unfolding here much more quickly. The capabilities are expanding, the adoption is accelerating, and the consequences are compounding. At the same time, the benefits are real. AI is improving disease detection, increasing accessibility, and opening new pathways for learning and productivity.

Both things are true at the same time. That is what makes this moment difficult for people to process.

The most important part of the Sanders conversation was not whether the AI agreed with him. It was what the system described. It described microtargeting based on psychological profiles. It described messaging tailored to individual vulnerabilities. It described a fragmented information environment where people are no longer operating from the same set of facts.

That has implications far beyond advertising. It reshapes how people form opinions, how they make decisions, and how they participate in civic life. Democracy depends on a shared understanding of information. These systems are actively disrupting that foundation.

The system can explain what it is doing. It can explain why it is doing it. It can even suggest what should happen differently. And still, nothing changes.

That is not a failure of intelligence. It is a reflection of where power sits.

The Sanders conversation was not a revelation. It was a mirror. It showed a system that is functioning exactly as designed. The real question is not whether we understand what is happening. We do. The question is what happens when understanding is no longer enough to drive change.
