Note: This is a research note supplementing the book Unscarcity, now available for purchase. These notes expand on concepts from the main text.
Simulation Science: What Physics Actually Says
Summary: The simulation hypothesis isn’t just philosophy - it connects to verified physics. This article explores the scientific foundations that make simulation theory plausible: quantum measurement, Planck-scale discreteness, computational efficiency, and the “it from bit” hypothesis. We’re not claiming we live in a simulation. We’re showing why the question isn’t crazy.
Not Proof, But Plausibility
Let’s be clear upfront: we have no proof we’re in a simulation. We also have no proof we’re not. This article isn’t arguing for simulation theory - it’s explaining why serious physicists take it seriously.
The core insight: our universe behaves in ways that are consistent with computational optimization. This proves nothing. But it’s genuinely strange.
The Double-Slit Mystery
The double-slit experiment is one of the most replicated results in physics, verified countless times since matter-wave interference was first demonstrated in the 1920s. Here’s what happens:
Particles behaving as particles: Fire electrons at a barrier with two slits while monitoring which slit each passes through. They pile up behind the slits like bullets - two bands, one per slit.
Particles behaving as waves: Fire electrons without monitoring which slit they pass through. They create an interference pattern - alternating bands of many and few detections - as if each electron went through both slits simultaneously.
The same particles. The same slits. The act of observation changes the outcome.
This isn’t an instrument limitation. The 2022 Nobel Prize in Physics went to Aspect, Clauser, and Zeilinger for experiments with entangled photons that violated Bell inequalities, ruling out local hidden-variable theories: quantum systems don’t carry predetermined values for all their properties before measurement. Einstein derided entanglement as “spooky action at a distance” and spent decades arguing that quantum mechanics must be incomplete. The experiments ultimately vindicated quantum theory.
The Simulation Interpretation (Speculation)
If you were designing a simulation to conserve computational resources, you wouldn’t calculate what no one observes. Store a probability distribution, render the result only when queried. Programmers call this lazy evaluation.
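The lazy-evaluation analogy can be made concrete. Below is a toy sketch - an illustration of the programming pattern, not a claim about physics. The `LazyParticle` class and its outcomes are invented for this example: a probability distribution is stored, and a definite value is computed only on first observation.

```python
import random

class LazyParticle:
    """Toy analogy for lazy evaluation: store a distribution,
    and sample a definite value only when first observed."""

    def __init__(self, outcomes, weights):
        self._outcomes = outcomes  # possible measurement results
        self._weights = weights    # their probabilities
        self._value = None         # nothing computed yet

    def observe(self):
        # The "render" happens only on the first query;
        # every later observation returns the same settled value.
        if self._value is None:
            self._value = random.choices(self._outcomes, self._weights)[0]
        return self._value

p = LazyParticle(["slit A", "slit B"], [0.5, 0.5])
first = p.observe()
assert p.observe() == first  # once queried, the outcome is fixed
```

Until `observe()` is called, no outcome exists anywhere in the program - only the recipe for producing one. That is the resource-saving pattern the speculation points at.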
The universe might do the same thing. Or it might not - we can’t tell from inside. But the behavior is consistent with computational optimization.
The Grain of Reality
Our universe appears to have a fundamental “pixel size” and “frame rate”:
The Planck Length
About 1.6 x 10^-35 meters - so small that if you scaled it to the size of a grain of sand, an atom would be larger than a galaxy. Below this scale, the concept of “space” breaks down into quantum foam.
The Planck Time
About 5.4 x 10^-44 seconds - the shortest meaningful time interval. Below this scale, as John Wheeler described it, “there would literally be no left and right, no before and no after.”
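Both Planck units follow from three measured constants. A quick sketch of the standard formulas, using CODATA values: the Planck length is sqrt(ħG/c³) and the Planck time is sqrt(ħG/c⁵).

```python
import math

# CODATA values in SI units
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s

planck_length = math.sqrt(hbar * G / c**3)  # ~1.6e-35 m
planck_time = math.sqrt(hbar * G / c**5)    # ~5.4e-44 s
```

Note that the Planck time is just the Planck length divided by the speed of light: the time light takes to cross one Planck length.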
What This Means (And Doesn’t Mean)
Physicists debate whether Planck-scale limits represent actual discreteness or merely the boundary of current theories. Loop quantum gravity predicts spacetime is woven from discrete chunks; string theory offers different answers.
The simulation interpretation: if you were designing a simulation and wanted to limit computational load, building in minimum resolution would be elegant. The universe may have done this for its own reasons - or the discreteness may be an artifact of our theories, not physical reality.
“It From Bit”
John Wheeler, who popularized the term “black hole” and trained generations of physicists including Richard Feynman, proposed a radical hypothesis: “it from bit” - physical reality (“it”) emerges from information (“bit”).
Wheeler’s claim: “Every physical quantity, every it, derives its ultimate significance from bits, binary yes-or-no indications.”
This doesn’t mean we’re in a computer simulation. It suggests something deeper: information may be more fundamental than matter. Mass, charge, spin - these might be patterns in an underlying informational substrate.
Supporting Evidence
- The holographic principle: Black hole entropy is proportional to surface area, not volume - as if information is stored on a boundary, like a hologram. Leonard Susskind and Gerard ’t Hooft developed this insight.
- Quantum information theory: Quantum mechanics describes evolution of information, not matter. The wavefunction isn’t a physical thing - it’s a probability distribution, a mathematical description of knowledge.
- Landauer’s principle: Information erasure has a physical cost - erasing one bit dissipates at least kT ln 2 of energy, where k is Boltzmann’s constant and T is the temperature. Information and physics are intertwined at the deepest level.
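The Landauer bound is easy to compute. A minimal sketch at room temperature (300 K is an assumed value for illustration):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in SI)
T = 300.0           # assumed room temperature, K

# Minimum energy dissipated when erasing one bit of information
landauer_bound = k_B * T * math.log(2)  # ~2.9e-21 joules
```

The number is tiny - about 3 zeptojoules per bit - but it is not zero, which is the point: even in principle, manipulating information costs energy.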
Computational Cosmology
If we’re in a simulation, what can we infer about the simulators?
The Efficiency Argument
A civilization capable of simulating a universe with 10^80 atoms must have computational resources we can barely imagine. But even they might optimize. Nick Bostrom’s original paper notes that you don’t need to simulate every atom - just enough to fool conscious observers.
Games use level-of-detail rendering: high detail near the camera, low detail in the distance. Quantum mechanics might be similar - high precision only where observers look closely.
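Level-of-detail rendering can be sketched in a few lines. This is a deliberately simplified illustration of the game-engine technique; the threshold distances and tier numbers are arbitrary values invented for the example.

```python
def detail_level(distance, thresholds=(10.0, 50.0, 200.0)):
    """Pick a rendering detail tier from distance to the camera.
    Tier 0 is highest detail; thresholds are arbitrary scene units."""
    for level, cutoff in enumerate(thresholds):
        if distance <= cutoff:
            return level
    return len(thresholds)  # beyond all cutoffs: lowest detail

assert detail_level(5.0) == 0     # close to the camera: full detail
assert detail_level(120.0) == 2   # mid-distance: coarser model
assert detail_level(1000.0) == 3  # far away: minimal detail
```

A real engine swaps mesh resolution and texture quality at each tier; the analogy is only that computation is concentrated where the observer is looking.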
The Ancestor Simulation
Bostrom’s argument runs:
- Advanced civilizations might run detailed simulations of their ancestors
- Each civilization might run many such simulations
- Simulated people would vastly outnumber “real” people
- Therefore, statistically, we’re probably simulated
This isn’t proof. The first premise might be wrong (simulations might be impractical, forbidden, or simply never run), and Bostrom himself frames the conclusion as a trilemma - at least one of three propositions is true - rather than a certainty. But it shows why the argument is taken seriously.
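The statistical step in the argument is simple arithmetic. A minimal sketch, where every number is an assumed placeholder, not a real estimate:

```python
# All figures below are illustrative assumptions, not data.
real_people = 1e11             # rough order of humans who have ever lived
sims_per_civ = 1000            # hypothetical number of ancestor simulations
people_per_sim = real_people   # each simulation runs a full history

simulated = sims_per_civ * people_per_sim
fraction_simulated = simulated / (simulated + real_people)
```

With even 1,000 full-history simulations, over 99.9% of all observers would be simulated - which is why, conditional on the premises, a randomly chosen observer is "probably" simulated. Everything hinges on whether the premises hold.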
The Speed Question
David Chalmers asks: does it matter if our simulation runs slowly? If the simulators pause our universe for a billion years (their time) while they grab coffee, would we notice?
No. Subjective time inside the simulation would be uninterrupted. You might be living at one billionth speed right now.
What We Can’t Know
No Exit
If we’re in a simulation, we likely can’t prove it from inside. We’d need to find bugs, seams, rendering errors - and the simulators have had our entire cosmic history to patch them.
Some theoretical approaches suggest discreteness might leave observable traces - for example, if spacetime were a lattice, it could imprint subtle anisotropies on the spectrum of ultra-high-energy cosmic rays. So far, no such evidence exists.
Turtles All the Way Down
If we’re simulated, are our simulators simulated? The regression problem suggests either infinite regress or a “base reality” that isn’t computed. We can’t know which.
What If?
Suppose tomorrow we discovered proof we’re in a simulation. What changes?
Probably not much.
Your coffee tastes the same. Your relationships are still real relationships. Your choices still have consequences. Meaning was always something we created - it doesn’t require specific metaphysical grounding.
As David Chalmers argues: “Virtual reality is genuine reality.” If we’re simulated, we’re really simulated. That’s still reality for us.
The Unscarcity framework doesn’t depend on resolving this question. Whether we’re carbon or code, Law 1 (Experience is Sacred) still applies. Consciousness has intrinsic worth regardless of its substrate.
Related Articles
- The Sacred Question - How religious and secular worldviews can coexist
- Consciousness Upload: The Last Identity Crisis - What happens when minds move between substrates
- The Constitutional Core - The five axioms that govern the MOSAIC
Further Reading
- Nick Bostrom, Are We Living in a Computer Simulation? (2003)
- David Chalmers, The Virtual and the Real (2017)
- John Wheeler, Information, Physics, Quantum: The Search for Links (1990)
- Rizwan Virk, The Simulation Hypothesis (2019)