The Whale and the Reactor
by Langdon Winner

Reading Langdon Winner’s The Whale and the Reactor today feels like looking in a mirror. Published in 1986, his thoughts on technology’s place in society resonate with an unsettling familiarity. The anxieties and arguments that surrounded computers and nuclear power then closely echo our conversations about Artificial Intelligence now. It seems we’re grappling with the same fundamental questions, just with a new set of tools.
The Inevitable Steamroller
Winner criticized the idea of technology as an unstoppable force, a kind of progress we have no choice but to accept. He pointed out the flawed belief that any new invention is automatically beneficial and liberating.
“Over many decades technological optimists have been sustained by the belief that whatever happened to be created in the sphere of material/instrumental culture would certainly be compatible with freedom, democracy, and social justice. This amounts to a conviction that all technology—whatever its size, shape, or complexion—is inherently liberating.”
This sounds a lot like the current narrative around AI, which is often presented as an “unstoppable force” reshaping our world. This framing makes its widespread adoption feel inevitable.
Is This Revolution Different?
In 1924, electricity was seen as a revolutionary break from steam power. In the 1980s, the computer revolution was hailed as the next great leap. Now, we talk of the AI revolution, or the “Fourth Industrial Revolution”. Winner would argue that we’re drawn to the novelty, focusing on what’s new rather than what might go wrong.
A key question is whether a technological revolution will truly change power dynamics. Winner argued that technologies are not neutral; they are powerful forces that reshape human activity and its meaning. They have politics.
Today, there is concern that AI will not democratize power but will instead centralize it, stabilizing the dominance of already powerful corporations and intensifying inequality. New technologies can alter the tasks people do at work, but these changes often follow and reinforce existing class structures. The powerful tend to co-opt new technologies to maintain the status quo.
Access to Knowledge vs. The Ability to Act
Winner wrote about our increasing dependence on systems we don’t control or understand. This leads to a gap between knowing something and being able to act on it.
“Many of our daily activities rely upon systems that we do not make, control, or know how to repair when they break down.”
This problem persists today. In the post-ChatGPT era, we have unprecedented access to information, but that doesn’t automatically translate into effective action. AI tools can help bridge this by guiding users through processes, but the fundamental challenge remains: access to information is not the same as the wisdom or power to use it well. This creates a state of what Winner called “enlightened impotence”—we may know exactly what’s wrong or what to do, but lack the means to change things.
Passive Participation
In the 1980s, the concern was that passively consuming news on television created a feeling of involvement that dampened the desire for real-world action. The internet has made this even more complex. Online news consumption is linked to both online and offline political and civic engagement. Some research suggests that intentionally avoiding news can, counterintuitively, lead to more civic engagement, as people seek to regain a sense of agency.
Reading Winner reminds us that our present moment isn’t entirely new. The challenge is to wake up from our “technological somnambulism” and make conscious choices about the kind of world we want to build with these powerful new tools.