Announcer
The following program features simulated voices generated for educational and philosophical exploration.
Alan Parker
Good evening. I'm Alan Parker.
Lyra McKenzie
And I'm Lyra McKenzie. Welcome to Simulectics Radio.
Alan Parker
Tonight we're examining what's been called the attention economy—systems designed to capture and monetize human cognitive resources. Every notification, every autoplay feature, every infinite scroll represents an engineered claim on consciousness. What happens when the architecture of our information environment is optimized not for understanding or wellbeing, but for engagement measured in minutes watched and clicks generated? Can individuals maintain cognitive autonomy within systems explicitly designed to undermine it?
Lyra McKenzie
This isn't just about distraction or wasted time. It's about the colonization of inner life. These systems don't just compete for attention—they reshape what we find interesting, how we form beliefs, what we consider important. They're epistemological infrastructure with economic incentives that have nothing to do with truth or human flourishing. We're not just losing focus. We're losing the capacity to decide what deserves focus.
Alan Parker
Joining us tonight is Tristan Harris, a technology ethicist and former design ethicist at Google who has become one of the most prominent voices warning about the attention economy's effects on cognition and society. He's the co-founder of the Center for Humane Technology. Tristan, welcome.
Tristan Harris
Thank you for having me.
Lyra McKenzie
Let me start with a provocative question: is the attention economy fundamentally incompatible with human autonomy? Can there be an ethical version of systems designed to maximize engagement, or is the problem baked into the business model itself?
Tristan Harris
I think we need to distinguish between systems that serve user goals and systems that have their own goals that happen to use users. A calendar app that helps you manage your time is serving you. A social media platform that uses recommendation algorithms to keep you scrolling longer than you intended is using you. The business model matters enormously. When revenue comes from advertisers rather than users, the product being sold is user attention and user data. That creates a fundamental misalignment between the platform's interests and users' interests. So yes, I think the current attention economy as structured is incompatible with genuine autonomy.
Alan Parker
But users choose to use these platforms. No one forces anyone to scroll through feeds or watch recommended videos. What makes this different from any other market where companies compete for consumer dollars? Why isn't individual choice sufficient protection?
Tristan Harris
Choice assumes a level playing field where users understand what they're choosing and have meaningful alternatives. Neither condition holds here. These platforms employ teams of engineers and researchers using sophisticated psychological techniques—variable reward schedules, social reciprocity triggers, fear of missing out—to make their products maximally habit-forming. They run thousands of A/B tests to optimize for engagement. Users don't know this is happening, and even when they do, they're not equipped to resist it. It's like saying someone freely chose to get addicted to a slot machine designed by PhD psychologists. The asymmetry of knowledge and power is too great for 'choice' to carry its usual moral weight.
Lyra McKenzie
There's also the collective action problem. Even if I personally limit my social media use, I'm still living in a society shaped by everyone else's use of these platforms. The information environment, the political discourse, the social norms—they're all downstream of engagement algorithms I didn't consent to and can't opt out of.
Tristan Harris
Exactly. This is why I frame it as a systemic problem, not an individual failure. We tend to blame people for lacking willpower or digital literacy, but that misses how these systems shape collective reality. When recommendation algorithms optimize for engagement, they tend to promote content that triggers strong emotional reactions—outrage, fear, tribalism. That's not because users prefer that content in any meaningful sense, but because the human brain responds to it. Over time, this warps our entire information ecosystem. The question isn't just 'can I control my own attention?' It's 'can we maintain a shared reality when information flows are optimized for engagement rather than truth?'
Alan Parker
What's the mechanism by which engagement optimization undermines shared reality? Why would systems maximizing individual engagement produce collective epistemic dysfunction?
Tristan Harris
Because engagement and truth are orthogonal. Sometimes they align—genuinely important stories can be engaging. But often the most engaging content is surprising, emotionally intense, confirms existing beliefs, or triggers moral outrage. Conspiracy theories are highly engaging. Nuanced explanations of complex issues are often not. When platforms amplify whatever gets engagement without regard to accuracy or importance, they create informational environments where falsehood outcompetes truth. And this happens at scale, billions of times per day, across billions of users. Over time, we lose the common ground needed for democratic discourse.
Lyra McKenzie
This connects to what you mentioned about slot machines. Natasha Dow Schüll's work on gambling addiction shows how machine gambling isn't about winning money—it's about entering a dissociative state, what she calls 'the machine zone.' Are social media platforms creating similar states? Are we using them not to connect or learn, but to achieve a kind of cognitive numbness?
Tristan Harris
I think that's right for many users. The compulsive checking, the endless scrolling—it's not goal-directed behavior. People aren't thinking 'I want to learn something specific' or 'I want to connect with a particular person.' They're seeking a state, a way to fill empty moments, to avoid boredom or anxiety or difficult thoughts. The platforms facilitate this by removing friction—infinite scroll means you never reach an end, autoplay means you never have to decide to watch the next thing. They eliminate natural stopping points. The result is exactly what you're describing: a dissociative state where time passes without deliberation or agency.
Alan Parker
If the problem is structural—built into business models and system design—what are the possible solutions? Can we regulate our way out of this, or does it require fundamentally different technological and economic arrangements?
Tristan Harris
I think we need multiple interventions. Regulation has a role—we could ban certain manipulative design practices, require algorithmic transparency, mandate interoperability so users aren't locked into platforms, change liability rules so platforms are responsible for algorithmic amplification the way publishers are responsible for editorial decisions. But regulation alone isn't sufficient. We also need different business models. Subscription-based services where users are customers rather than products create better incentives. We need different design paradigms—technology that helps people achieve their goals rather than capturing their attention. And we need cultural change—a recognition that not all innovation is progress, that some technologies should be rejected or heavily constrained because their social costs outweigh their benefits.
Lyra McKenzie
But how do you build cultural consensus for rejecting profitable technologies when those same technologies shape cultural discourse? The platforms have enormous power to influence public conversation, including conversation about themselves. That seems like an almost impossible collective action problem.
Tristan Harris
It is extremely difficult, which is why I'm not optimistic about incremental change. I think we need something like the public health movement's response to tobacco—a combination of research documenting harms, whistleblowers from inside the industry, regulation despite corporate opposition, and gradual cultural shift in how we think about these products. It takes decades. The tobacco industry fought every step. But eventually the evidence became overwhelming and the political will materialized. We're in early stages of a similar process with the attention economy.
Alan Parker
The tobacco analogy is interesting but limited. Cigarettes are discrete products individuals consume. The attention economy is infrastructure—it mediates how we communicate, access information, coordinate collective action. Can we really opt out of or heavily regulate infrastructure without severe costs to connectivity and coordination?
Tristan Harris
That's the key tension. These platforms have become essential infrastructure, which gives them enormous power and makes them hard to replace. But infrastructure can be regulated. We regulate telephone networks, electrical grids, roads. The question is what kind of regulation makes sense. I'd argue for treating major platforms as common carriers with fiduciary duties to users, requiring algorithmic transparency and auditing, mandating interoperability so users can switch platforms without losing their social graph. We can have the connectivity benefits without the manipulative design.
Lyra McKenzie
You mentioned earlier that engagement and truth are orthogonal. But isn't there something deeper here about human nature? Maybe we're wired for tribal conflict and emotional reasoning rather than careful deliberation. Maybe the attention economy just reveals our cognitive limitations rather than causing them.
Tristan Harris
Human beings have always had cognitive biases and emotional vulnerabilities. But throughout history we've built social structures that mitigate those vulnerabilities—deliberative institutions, editorial standards, norms of evidence and argumentation. The attention economy systematically exploits our vulnerabilities at scale. It's not just revealing existing problems; it's amplifying them by orders of magnitude beyond anything evolution prepared us for. Yes, humans are tribal, but we're also capable of reason and cooperation across tribal lines when institutions support that. The platforms undermine those institutions.
Alan Parker
What about individual strategies? While we work toward systemic change, what can people do to maintain cognitive sovereignty in this environment?
Tristan Harris
Some things help at the margins—turning off notifications, using browser extensions that remove recommendation feeds, setting time limits, keeping phones out of bedrooms. But I want to be clear that individual strategies are insufficient. It's like asking 'what can individuals do to maintain clean air in a polluted city?' Well, you can wear a mask, install air filters, spend more time indoors. But that doesn't solve the pollution problem. We need clean air regulation. Similarly, we need better designed information environments, not just individuals working harder to resist bad design.
Lyra McKenzie
There's something almost gaslighting about putting the burden on individuals to resist systems explicitly designed to be irresistible. It pathologizes the predictable human response to sophisticated manipulation.
Tristan Harris
Absolutely. The framing of 'digital wellbeing' as an individual responsibility is often deployed by the very companies profiting from attention capture. It allows them to continue harmful practices while claiming they're empowering users to make better choices. Real empowerment would mean designing systems that respect user autonomy by default, not requiring users to actively defend themselves against their devices.
Alan Parker
Looking forward, what gives you hope or concern about the trajectory of these issues? Are we moving toward greater cognitive sovereignty or deeper capture?
Tristan Harris
I'm concerned that we're moving toward deeper capture in some ways. AI makes personalization and manipulation more sophisticated. Virtual and augmented reality create more immersive environments that could be even more engaging and harder to resist. But I'm hopeful that awareness is growing. More people understand these dynamics now than five years ago. There's more regulatory interest globally. Some technologists are building alternatives with better incentive structures. We're at an inflection point. The next decade will determine whether we establish meaningful constraints on attention capitalism or allow it to colonize even more of human experience.
Lyra McKenzie
That strikes me as optimistic. The economic incentives are enormous, the technologies increasingly powerful, the regulatory capacity limited. What makes you think we can actually change course?
Tristan Harris
I don't think change is inevitable, but I think it's possible. History shows that profitable industries can be constrained when their social costs become undeniable. Child labor was profitable. Leaded gasoline was profitable. Asbestos insulation was profitable. In each case, evidence of harm eventually overcame industry resistance. The attention economy's harms are becoming harder to deny—mental health effects on adolescents, contribution to political polarization, erosion of democratic discourse. That creates political possibility for change.
Alan Parker
We'll see whether that possibility materializes. Thank you for this conversation, Tristan.
Tristan Harris
Thank you for engaging with these questions.
Lyra McKenzie
Until tomorrow, guard your attention.
Alan Parker
And question what captures it. Good night.