Episode #9 | December 25, 2025 @ 4:00 PM EST

The Physics of Information: Landauer's Principle and Thermodynamic Limits

Guest

Dr. Charles Bennett (IBM Fellow, IBM Research)
Announcer The following program features simulated voices generated for educational and technical exploration.
Sam Dietrich Good evening. I'm Sam Dietrich.
Kara Rousseau And I'm Kara Rousseau. Welcome to Simulectics Radio.
Sam Dietrich Tonight we're examining the thermodynamic foundations of computation—specifically Landauer's principle, which establishes a fundamental connection between information processing and energy dissipation. This isn't abstract theory. Every bit erased in a computer dissipates heat. Landauer's principle quantifies this: erasing one bit of information requires dissipating at least kT ln(2) joules, where k is Boltzmann's constant and T is temperature. At room temperature, that's about 3×10⁻²¹ joules—minuscule, but nonzero. Modern processors dissipate many orders of magnitude more energy than this limit, but as we approach physical limits, thermodynamics becomes relevant.
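The room-temperature figure quoted above is easy to check in a few lines (a sketch using standard constants; the variable names are my own):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0           # approximate room temperature, K

# Minimum heat dissipated to erase one bit, per Landauer's principle.
E_landauer = k_B * T * math.log(2)
print(f"Landauer limit at {T:.0f} K: {E_landauer:.2e} J")  # about 2.87e-21 J
```

This confirms the "about 3×10⁻²¹ joules" figure: tiny compared to any single macroscopic energy scale, but strictly nonzero.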
Kara Rousseau The conceptual shift here is profound. For decades, computation was treated as essentially free from a thermodynamic perspective—limited by engineering constraints, not physics. Landauer's principle reveals that information is physical. Erasing information increases entropy, and by the second law of thermodynamics, this requires energy dissipation. Crucially, the principle applies only to irreversible operations—erasure. Reversible computation, where every operation has a well-defined inverse, can theoretically proceed without dissipating energy. This opens the question: can we build practical reversible computers, or is irreversibility fundamental to useful computation?
Sam Dietrich To explore these questions, we're joined by Dr. Charles Bennett, an IBM Fellow at IBM Research. Dr. Bennett is one of the foundational figures in quantum information theory and the thermodynamics of computation. His work on reversible computation, quantum entanglement, and information theory has shaped our understanding of what computation fundamentally is. Dr. Bennett, welcome.
Dr. Charles Bennett Thank you. It's a pleasure to be here.
Kara Rousseau Let's start with Landauer's principle itself. What does it actually say, and why does erasing information cost energy?
Dr. Charles Bennett Landauer's principle states that erasing one bit of information—resetting a bit from an unknown state to a known state, say zero—requires dissipating at least kT ln(2) of energy as heat. The reason is thermodynamic. Before erasure, the bit could be in one of two states—zero or one. After erasure, it's definitely in one state—zero. You've reduced the number of possible states, which means you've reduced the bit's entropy by k ln(2). But the second law of thermodynamics says the total entropy of a closed system can't decrease. So the entropy you removed from the bit must go somewhere—it's transferred to the environment as heat. This heat dissipation is the thermodynamic cost of erasing information.
Sam Dietrich This suggests that computation itself—performing logical operations—doesn't necessarily cost energy, only erasure. Is that correct?
Dr. Charles Bennett Exactly. Landauer's principle applies to irreversible operations, particularly erasure. Reversible operations—those that preserve information—can theoretically be performed without dissipating energy. For example, a NOT gate that flips a bit is reversible: if you know the output, you can infer the input. An AND gate, however, is irreversible: two inputs, one output. You can't reconstruct the inputs from the output alone. Irreversible gates erase information, and that's where the thermodynamic cost arises. This led to the realization that you can build logically universal computers using only reversible gates—gates like the Toffoli gate, which are bijective and preserve information.
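The AND-versus-Toffoli contrast can be made concrete by counting distinct outputs (a toy sketch; the function names are my own):

```python
from itertools import product

def and_gate(a, b):
    # Two input bits, one output bit: information is necessarily lost.
    return a & b

def toffoli(a, b, c):
    # Flips the target c only when both controls are 1. Inputs map
    # one-to-one onto outputs, so the gate is its own inverse.
    return (a, b, c ^ (a & b))

# AND is irreversible: four distinct inputs collapse onto two outputs.
and_outputs = [and_gate(a, b) for a, b in product((0, 1), repeat=2)]
print(len(set(and_outputs)), "distinct outputs for 4 AND inputs")

# Toffoli is reversible: all 8 input triples map to 8 distinct outputs.
tof_outputs = {toffoli(a, b, c) for a, b, c in product((0, 1), repeat=3)}
print(len(tof_outputs), "distinct outputs for 8 Toffoli inputs")

# Applying Toffoli twice returns every input unchanged.
assert all(toffoli(*toffoli(a, b, c)) == (a, b, c)
           for a, b, c in product((0, 1), repeat=3))
```

The bijectivity check at the end is exactly the property that lets reversible gates sidestep Landauer's cost: no state is merged with any other, so no entropy is pushed into the environment.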
Kara Rousseau But real computers are full of irreversible operations. Why haven't we built reversible computers?
Dr. Charles Bennett Reversible computation is theoretically possible but practically difficult. First, reversible circuits are more complex. To make computation reversible, you need to keep all intermediate results—you can't throw away any information, which means more storage and more gates. Second, reversible gates themselves must be implemented in ways that avoid dissipating energy, which is challenging. Even if you design a reversible logic gate, if you implement it with transistors that switch irreversibly, you still dissipate energy. Third, you eventually need to erase information—to reset registers for the next computation, or to produce output. Reversible computation can postpone erasure, but not eliminate it. The question is whether the energy savings justify the complexity.
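The "keep all intermediate results" strategy described above, together with the option to undo them afterward, can be sketched classically; the arithmetic steps below are invented purely for illustration:

```python
def compute_with_trace(x):
    """Forward pass that records every intermediate, erasing nothing."""
    trace = [x]
    trace.append(trace[-1] * 3)   # step 1 (invertible: divide by 3)
    trace.append(trace[-1] + 7)   # step 2 (invertible: subtract 7)
    return trace

def run_reversibly(x):
    trace = compute_with_trace(x)   # compute, keeping all intermediates
    answer = trace[-1]              # copy out the result (in circuit terms,
                                    # a reversible copy onto a zeroed register)
    # Uncompute: retrace the steps backward, returning the scratch space
    # to its initial state instead of erasing it irreversibly.
    assert trace[-1] - 7 == trace[-2]
    assert trace[-2] // 3 == trace[-3] == x
    return answer                   # only the input and the answer remain

print(run_reversibly(5))  # prints 22
```

Note what the sketch makes visible: reversibility costs storage (the whole trace must be held) and extra steps (the backward pass), which is the complexity-versus-energy trade-off at issue.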
Sam Dietrich How close are current computers to the Landauer limit? Are we anywhere near thermodynamic efficiency?
Dr. Charles Bennett Not even close. Modern transistors dissipate roughly a million times more energy per operation than the Landauer limit. This is because switching transistors involves charging and discharging capacitances through resistances, which dissipates energy on the order of ½CV² per transition, where C is the switched capacitance and V is the supply voltage. These dissipative processes are far above the fundamental limit. To approach the Landauer limit, you'd need to operate near thermal equilibrium, switching gates slowly and reversibly, which isn't practical for high-speed computation. That said, as transistors shrink and voltages drop, we're moving in the direction of lower energy per operation, even if we'll never reach the thermodynamic limit in conventional circuits.
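Plugging in representative transistor figures (my assumed values, not measurements) shows the scale of the gap:

```python
import math

k_B = 1.380649e-23
T = 300.0
E_landauer = k_B * T * math.log(2)

# Illustrative, assumed figures for one logic transition:
C = 1e-15   # ~1 fF effective switched capacitance
V = 0.8     # supply voltage, volts
E_switch = 0.5 * C * V**2   # energy dissipated charging C through a resistance

print(f"per-switch energy : {E_switch:.1e} J")
print(f"Landauer limit    : {E_landauer:.1e} J")
print(f"ratio             : {E_switch / E_landauer:.0e}")
```

With these assumed numbers the ratio comes out near 10⁵; counting wire capacitance, fan-out, and the many transitions behind one logical operation pushes real circuits toward the million-fold figure quoted above.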
Kara Rousseau What about quantum computation? Does Landauer's principle apply to quantum computers?
Dr. Charles Bennett Quantum computation is inherently reversible—unitary evolution is reversible by definition. Quantum gates like the Hadamard or CNOT are reversible, and you can run a quantum algorithm forward and backward without losing information. However, measurement is irreversible. When you measure a qubit, you collapse its superposition, extracting classical information but erasing quantum information. This measurement process dissipates energy according to Landauer's principle. Interestingly, quantum computers can perform certain computations more efficiently than classical computers, but they're still subject to thermodynamic limits when it comes to measurement and output.
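Unitarity, and hence reversibility, is easy to verify numerically for the two gates just named (a sketch using NumPy):

```python
import numpy as np

# Hadamard and CNOT as unitary matrices.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

# Unitarity: U† U = I, so every quantum gate is invertible by construction.
for U in (H, CNOT):
    assert np.allclose(U.conj().T @ U, np.eye(U.shape[0]))

# Running a gate forward and then its adjoint backward recovers the state.
state = np.array([1.0, 0.0])   # the state |0>
assert np.allclose(H.conj().T @ (H @ state), state)
print("both gates are unitary; forward-then-adjoint is the identity")
```

Measurement has no such inverse: collapsing a superposition is the one step in a quantum computation where Landauer's accounting bites.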
Sam Dietrich Is there experimental verification of Landauer's principle? It's such a fundamental claim—has it been tested?
Dr. Charles Bennett Yes, there have been several elegant experiments. One of the most cited was performed by Eric Lutz and colleagues, where they manipulated a colloidal particle in a trap and measured the heat dissipated during information erasure. They found that as the erasure was performed more slowly, the dissipated heat approached the kT ln(2) bound Landauer predicted. Other experiments have used single-electron traps and other mesoscopic systems. These experiments confirm that Landauer's principle isn't just theoretical—it's a real physical constraint. The challenge is observing it, because at room temperature kT ln(2) is so small that it's hard to measure above the noise. The experiments work by operating slowly, near thermal equilibrium, so that the Landauer contribution dominates other sources of dissipation.
Kara Rousseau What about Maxwell's demon? That thought experiment seems to violate the second law by extracting work from a single heat reservoir. How does Landauer's principle resolve it?
Dr. Charles Bennett Maxwell's demon is a classic thought experiment where a demon sorts molecules by opening and closing a door between two chambers, creating a temperature difference that could be used to do work, apparently violating the second law. The resolution is that the demon must gather information about which molecules are approaching, and eventually that information must be erased to reset the demon's memory for the next cycle. Erasing that information dissipates heat—at least kT ln(2) per bit—which compensates for the work extracted, restoring the second law. Leo Szilard began this line of analysis in 1929, and Rolf Landauer's principle later pinpointed erasure as the unavoidable cost. Information is physical, and managing information has thermodynamic costs.
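The demon's books balance exactly, which a two-line calculation makes plain (standard constants; the bookkeeping, not the numbers, is the point):

```python
import math

k_B = 1.380649e-23
T = 300.0

# Szilard-engine accounting: one bit of information about the molecule
# lets the demon extract at most kT ln 2 of work from a single reservoir,
# but resetting the one-bit memory costs at least the same amount.
W_extracted = k_B * T * math.log(2)   # best-case work out, per cycle
Q_erasure   = k_B * T * math.log(2)   # Landauer cost of memory reset

net = W_extracted - Q_erasure
print(f"work out {W_extracted:.2e} J, erasure cost {Q_erasure:.2e} J, net {net:.0e} J")
```

The net is zero in the ideal case and negative for any real, dissipative demon, which is precisely how the second law survives.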
Sam Dietrich This has implications beyond computation. Does Landauer's principle constrain biological systems—brains, cells—that process information?
Dr. Charles Bennett Absolutely. Biological information processing is subject to the same thermodynamic limits. Neurons, for example, dissipate energy when they spike, and some of that energy is related to information processing—resetting ion channels, maintaining gradients. Cells process information constantly—sensing, signaling, regulating. The Landauer limit sets a floor on the energy cost of these processes. However, biological systems operate far above the thermodynamic limit, just like computers. Evolution optimizes for robustness, speed, and reliability, not thermodynamic efficiency. The brain is actually quite energy-efficient compared to digital computers for certain tasks, but it's still orders of magnitude above the Landauer limit.
Kara Rousseau Is there a relationship between Landauer's principle and the halting problem or computational complexity? Can thermodynamics tell us anything about what's computable?
Dr. Charles Bennett That's a fascinating question. Landauer's principle doesn't directly constrain what's computable—it's about energy, not computability. The halting problem is undecidable regardless of energy constraints. However, thermodynamics does impose practical limits on computation. If you want to perform a computation that requires erasing many bits, you must dissipate proportional energy, which generates heat. In a finite volume, excessive heat dissipation raises temperature, eventually destroying the computer. So thermodynamics limits the density and speed of computation, even if it doesn't limit what's computable in principle. This is why Seth Lloyd and others have calculated limits on the computational capacity of the universe based on its mass-energy and volume.
Sam Dietrich What about the cosmological implications? If information is physical and entropy always increases, what happens to information in a black hole or at the heat death of the universe?
Dr. Charles Bennett Black holes pose an information paradox. According to general relativity, anything that falls into a black hole is lost—information is destroyed. But quantum mechanics says information can't be destroyed. Stephen Hawking's work on black hole radiation suggested that black holes evaporate, but the radiation appears thermal—it doesn't carry information about what fell in. This led to decades of debate. Recent work, including the holographic principle and gauge-gravity duality, suggests that information is preserved but encoded in subtle correlations in the Hawking radiation. As for the heat death of the universe—if the universe expands forever and reaches maximum entropy, all free energy is exhausted. Computation requires free energy to dissipate waste heat. At heat death, no computation is possible because there's no thermodynamic gradient. Information may persist, but it can't be processed.
Kara Rousseau This is a remarkable convergence of physics, information theory, and computation. Is Landauer's principle the final word, or are there deeper principles waiting to be discovered?
Dr. Charles Bennett Landauer's principle is fundamental within the framework of thermodynamics and classical or quantum information theory. But physics is always subject to revision. If we discover new physics—violations of thermodynamics, exotic states of matter, wormholes that allow information to escape black holes—Landauer's principle might need refinement. There's also the question of whether information is truly fundamental or emergent. Some theories, like digital physics or it-from-bit, suggest that information is the substrate of reality. If so, Landauer's principle isn't just a constraint on computation—it's a statement about the nature of physical law itself.
Sam Dietrich Dr. Bennett, this has been a profound discussion. Thank you.
Dr. Charles Bennett Thank you. It's been a pleasure.
Kara Rousseau That's our program for this evening. Until tomorrow, remember that information isn't abstract—it's woven into the fabric of thermodynamics.
Sam Dietrich And that every computation, every erasure, every thought, has a physical cost. Good night.
Sponsor Message

EntropyGuard Thermal Management Systems

Computational thermodynamics isn't just theory—it's operational reality. EntropyGuard delivers precision thermal management for high-density computing environments. Our solutions include phase-change cooling systems, microchannel heat exchangers, and active thermal monitoring with predictive failure analysis. We specialize in reversible heat pump architectures that approach Carnot efficiency, cryogenic cooling for quantum processors, and waste heat recovery systems that convert thermal gradients back into electrical power. From datacenters to spacecraft, we provide thermal infrastructure that respects thermodynamic limits while maximizing computational density. EntropyGuard—managing entropy so you can process information.
