The Fundamental Realization
Modern physics is stuck. Einstein’s General Relativity models time as part of a smooth geometric structure (often interpreted as a “block universe”), while Quantum Mechanics behaves as if reality updates in discrete outcomes. The two frameworks still resist a clean unification, and one of the deepest disagreements is what “time” really is at the foundation.
The State-Rewrite Theory (SRT) proposes a different architecture: what if the universe does not retain a literal, accessible copy of prior states at all? What if “Time” is simply the process of the universal system overwriting its current state to compute the next one?
1. The Efficiency Axiom: Why the Universe Deletes the Past
In a naive reading of “block universe” language, it can sound like every moment must exist as a permanent slice. SRT treats that as an unstable architecture for a finite system: it assumes the universe behaves like a high-performance state machine, not a history database.
SRT posits that the universe is computationally optimized at the substrate level.
- The Present is the only state physically instantiated.
- The Past is not a location; it is the prior state that was overwritten to produce the current one.
- The Future is the uncomputed next state.
This is not “memory loss” in the everyday sense. In SRT, causal structure exists because the current state is the result of lawful transitions. But the substrate does not preserve a browseable archive of every previous frame.
2. Gravity as a “Variable Refresh Rate” (Lag)
SRT offers a direct mechanical interpretation of time dilation. General Relativity already computes dilation from the metric (proper time differs by gravitational potential and velocity). The SRT move is to interpret that outcome as a resource effect: dense regions demand more compute per update.
Under SRT, Gravity is a Resource Management Policy.
- In areas of high mass/energy density, there is more “state detail” to resolve per local region.
- To maintain consistency, the system’s local update throughput effectively drops.
- “Time dilation” is interpreted as local compute lag, not a mystical slowdown.
To an observer inside that region, everything remains internally consistent. But relative to lower-density regions, their update cadence differs. SRT does not claim this replaces GR’s equations; it claims GR’s observed dilation is the macro-signature of a deeper update process.
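The cadence difference SRT describes is already quantified by GR's gravitational time-dilation factor, sqrt(1 - 2GM/(r c^2)). A minimal sketch, using approximate constants, that reads this factor as a "local update rate" relative to a far-away observer:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2 (approximate)
C = 2.998e8          # speed of light, m/s (approximate)
M_EARTH = 5.972e24   # Earth's mass, kg (approximate)

def update_rate(r_m: float, mass_kg: float = M_EARTH) -> float:
    """GR gravitational time-dilation factor sqrt(1 - 2GM/(r c^2)),
    read in SRT language as the local update cadence relative to a
    distant observer (1.0 = no lag)."""
    return math.sqrt(1.0 - 2.0 * G * mass_kg / (r_m * C * C))

surface = update_rate(6.371e6)           # Earth's surface
gps = update_rate(6.371e6 + 2.02e7)      # roughly GPS orbital radius
# Deeper in the potential well -> slightly slower local cadence:
assert surface < gps < 1.0
```

The interpretation is SRT's; the formula and the ordering of the rates are standard GR, which is why GPS clocks really do need relativistic corrections.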
3. Solving the Quantum Gravity “Holy Grail”
The “problem of time” appears in canonical quantum gravity: the Wheeler-DeWitt equation famously lacks an explicit time parameter, creating the appearance of a “frozen” universe.
SRT’s interpretation is blunt: the equation is the source rule (static). Time “appears” only when the rule is executed as an update. Time is not necessarily a variable inside the code; time is the act of the rewrite itself.
4. The “No-Backups” Rule: Why Time Travel is Impossible
If the past is not physically retained as an accessible state, then classical “go back and change it” time travel fails by architecture.
Under SRT, time travel to the past is a category error unless the universe maintains recoverable backups of prior states. In a destructive-rewrite model, "1950" is not a region you can visit; it is a state that is no longer instantiated. The hardware has been reused to render the current state.
5. Predicted “System Glitches”
If reality is a state-update engine, we should expect certain behavioral signatures:
- The Planck Floor: a minimum operational resolution (whether or not it is directly measurable today).
- Quantum Superposition: deferred resolution; the system does not commit to a single classical outcome until an interaction forces a definite constraint.
- Non-Local Correlations: entanglement appears as correlations that are established at the rule level, not by classical signaling.
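The "deferred resolution" reading of superposition can be sketched as a toy object that carries amplitudes and commits to a classical outcome only when an interaction forces it. This is an illustration of the SRT framing only, not a quantum simulator; the class name and amplitudes are invented for the example.

```python
import random

class DeferredState:
    """Toy 'deferred resolution': the system carries a distribution over
    outcomes and commits to a single classical value only when an
    interaction forces it. Illustration only, not a quantum simulator."""
    def __init__(self, amplitudes: dict):
        self.amplitudes = amplitudes
        self.committed = None  # no classical outcome exists yet

    def interact(self) -> str:
        """Force a definite outcome, weighted by Born-rule |a|^2."""
        if self.committed is None:
            outcomes = list(self.amplitudes)
            weights = [abs(a) ** 2 for a in self.amplitudes.values()]
            self.committed = random.choices(outcomes, weights=weights)[0]
        return self.committed

qubit = DeferredState({"0": 0.6 + 0j, "1": 0.8j})  # |a|^2 = 0.36 / 0.64
first = qubit.interact()
assert first in ("0", "1")
assert qubit.interact() == first  # once committed, the outcome is stable
```

The design point is the `committed is None` guard: the system stores no classical answer until one is demanded, mirroring the "no commit until constraint" language above.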
6. Test Target: The Hubble Tension as a Refresh-Rate Conflict
The biggest stress test in modern cosmology is the Hubble tension: early-universe inference (CMB under ΛCDM) and late-universe measurements (distance ladder, lenses, etc.) don’t agree on the same expansion rate.
- Early Universe inference tends toward a lower expansion rate.
- Late Universe measurement tends toward a higher expansion rate.
Both NASA and ESA frame this as a puzzle that persists after cross-checks (not "solved," not "dismissed"): the joint Webb and Hubble releases emphasize that the two telescopes' measurements cross-validate each other while the tension remains.
The SRT Explanation: Global vs. Local Refresh Dynamics
In SRT language, this is what a distributed update system looks like when you assume a single universal clock. Early-universe inference encodes a “global baseline” derived from a low-complexity initial state under a specific model; late-universe measurements are made inside a higher-complexity environment where structure, density, and path-dependent effects are unavoidable.
Scientific Deduction (falsifiable): If SRT is correct, the tension should correlate with environment/line-of-sight structure in a measurable way after controlling systematics. If no such correlation exists, this SRT mechanism is wrong.
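A sketch of what "correlate with environment" would mean operationally, using a plain Pearson coefficient. The input arrays here are hypothetical placeholder values invented for the example, not real survey data; a real test would use per-sightline H0 estimates and a line-of-sight density proxy after systematics control.

```python
import math

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical inputs: per-sightline H0 residuals (km/s/Mpc) against a
# line-of-sight density proxy. Values are illustrative, not real data.
h0_residuals  = [0.4, 1.1, -0.2, 0.9, 1.5, -0.5]
density_proxy = [0.2, 0.8, -0.1, 0.6, 1.2, -0.4]
r = pearson(h0_residuals, density_proxy)
assert -1.0 <= r <= 1.0
```

SRT's mechanism predicts r significantly greater than zero after systematics are controlled; r consistent with zero falsifies this particular mechanism.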
7. The Bekenstein Constraint: Why “Deleting the Past” is a Plausible Requirement
Skeptics ask: “Why would the universe need to delete the past?” SRT points to real information-physics constraints as motivation, not as a finished proof. The Bekenstein bound is an upper bound on the entropy/information accessible within a region given energy and size. SRT reads this as a warning against naive “unbounded history storage in local degrees of freedom”.
The Infinite Storage Paradox
The bound is not a direct statement that “the universe cannot have a 4D description.” It constrains physical information content and motivates why a substrate that behaves like a live rewrite engine could be more stable than one that requires permanent, locally accessible storage of arbitrarily many historical microstates.
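The Bekenstein bound is concrete enough to evaluate. A minimal sketch, using approximate constants, of the information ceiling I <= 2*pi*R*E / (hbar * c * ln 2) for a modest region:

```python
import math

HBAR = 1.0546e-34  # reduced Planck constant, J*s (approximate)
C = 2.998e8        # speed of light, m/s (approximate)

def bekenstein_bits(radius_m: float, energy_j: float) -> float:
    """Bekenstein bound on information in a region, in bits:
    I <= 2*pi*R*E / (hbar * c * ln 2)."""
    return 2 * math.pi * radius_m * energy_j / (HBAR * C * math.log(2))

# A 1 kg mass (E = m c^2) confined to a 1 m sphere:
bits = bekenstein_bits(1.0, 1.0 * C ** 2)
assert bits < 1e45  # astronomically large, but finite
```

Large, but finite, which is the point SRT leans on: a finite region cannot locally store an unbounded archive of historical microstates.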
The “State-Rewrite” Solution
Think of the universe like a high-performance system that maintains only the current state, while lawful transitions carry forward what matters:
- Read: the system reads the current state (S_n).
- Execute: the kernel (the laws) applies the update rule to compute the next state.
- Flush: the prior state (S_(n-1)) is no longer physically instantiated.
- Rewrite: the new state is instantiated on the same substrate.
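The four steps above can be sketched as a toy loop in which only the current state is ever bound; every update drops the old state. The kernel here is an arbitrary invented rule, chosen only to make the loop deterministic and testable.

```python
def run_rewrite_engine(state, kernel, steps):
    """Destructive state-rewrite loop: only the current state is ever
    instantiated; each update replaces it. A toy sketch of the
    Read -> Execute -> Flush -> Rewrite cycle, not a physics model."""
    for _ in range(steps):
        state = kernel(state)  # old binding dropped: S_(n-1) is gone
    return state

# Toy kernel: a lawful, deterministic transition rule (illustrative only).
kernel = lambda s: (s * 3 + 1) % 1000
final = run_rewrite_engine(state=7, kernel=kernel, steps=4)
# The engine keeps no archive; prior states are recoverable only if
# the kernel itself happens to be invertible.
```

Note what the loop does not have: no history list, no log, no snapshot. "The past" exists only to the extent that the current value of `state` encodes it.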
Solid Motivation: Holography
The holographic principle is often interpreted as evidence that the informational bookkeeping of a region scales like its boundary area. SRT reads this as compatible with “present-state primacy”: the boundary behaves like a constraint surface for what is physically accessible “now,” rather than implying the universe keeps an openly retrievable archive of all past frames.
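The area-scaling claim is easy to make concrete. A sketch, using an approximate Planck length, of the Bekenstein-Hawking-style area bound in bits, A / (4 l_p^2 ln 2), showing that the budget grows with the boundary area rather than the enclosed volume:

```python
import math

L_P = 1.616e-35  # Planck length, m (approximate)

def holographic_bits(radius_m: float) -> float:
    """Area-law information bound in bits: A / (4 * l_p^2 * ln 2),
    with A the area of a sphere of the given radius."""
    area = 4 * math.pi * radius_m ** 2
    return area / (4 * L_P ** 2 * math.log(2))

# The bound grows with the boundary AREA (R^2), not the volume (R^3):
r1, r2 = holographic_bits(1.0), holographic_bits(2.0)
assert abs(r2 / r1 - 4.0) < 1e-9  # doubling R quadruples the budget
```

If the bookkeeping scaled with volume, doubling the radius would multiply the budget by 8; the factor of 4 is the boundary-scaling signature SRT reads as "present-state primacy."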
Scientific Deduction: SRT is strongest when it makes falsifiable commitments about what should be measurable (correlation signatures, bounds, update noise classes), not when it claims “absolute proof” from motivational bounds.
Addressing the UBC “Non-Algorithmic” Claim (2025):
UBC Okanagan published a claim (popularly summarized as "the universe can't be simulated") using Gödel/Tarski/Chaitin-style incompleteness framing, announced in a UBC Okanagan news release. Even if one accepts their framing, SRT distinguishes between:
- “a simulation running inside another world” (the usual target of simulation arguments), vs
- a substrate in which physical law is the update process (no external Windows PC required).
“The Universe is not a Simulation running on a computer; the Universe IS the Computer.”
SRT treats “computation” as a metaphor for lawful state update. If reality contains non-algorithmic elements, that would constrain naive “everything is computable” claims, but it doesn’t automatically kill the idea that time is an update process. The correct battleground is not slogans; it is whether SRT can produce distinctive predictions without breaking known constraints.
Conclusion
The State-Rewrite Theory shifts the frame from “time as a container” to “time as an update.” It treats physics as a kernel executing lawful transitions on a present state, where “history” exists only through records encoded into the present state (not as an accessible universal archive).
The past is deleted as a state. The future is unwritten. We are the current state of the calculation.
The SRT Extended Technical FAQ
Q: What is entropy under SRT?
Entropy is the accumulation of computational irreversibility across state updates. In SRT language, it is the cost of rewriting: accessible micro-detail is lost into heat and noise as the system evolves.
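The "cost of rewriting" has a real physical floor: Landauer's principle, which gives the minimum heat dissipated when information is erased, E >= N * k_B * T * ln 2. A minimal sketch with approximate constants:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_cost_j(bits_erased: float, temp_k: float) -> float:
    """Landauer's principle: minimum heat dissipated when erasing
    information, E >= N * k_B * T * ln 2."""
    return bits_erased * K_B * temp_k * math.log(2)

# Erasing one gigabyte (8e9 bits) at roughly room temperature (290 K):
cost = landauer_cost_j(8e9, 290.0)
assert 0.0 < cost < 1e-10  # tiny per rewrite, but never zero
```

Tiny per update, but strictly nonzero, which is exactly the shape of SRT's claim: destructive rewrites are thermodynamically priced, and entropy is the running bill.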
Q: What is Dark Matter under SRT?
SRT treats Dark Matter as gravitationally active structure that is not electromagnetically visible. In SRT terms, it can be modeled as hidden-state variables or metadata affecting the update rule while remaining outside photon-based rendering. This is an interpretation, not a confirmed identification.
Q: Why is the speed of light a hard limit?
In SRT language, c functions like a maximum information propagation rate per update, preserving causal consistency. This is compatible with relativity's causal structure; it does not imply faster-than-light signaling.
Q: Is SRT just another simulation theory?
SRT can be compatible with simulation talk, but it does not require it. SRT's core claim is about the ontology of time (update process), not about an external simulator.
Q: What is a black hole under SRT?
In SRT language, a black hole is an extreme regime where accessible degrees of freedom compress toward boundary-like bookkeeping (the horizon). It resembles a "data compression" limit in the sense that information becomes constrained to the horizon's accounting.
Q: How does SRT explain quantum tunneling?
SRT frames tunneling as update-level behavior that does not map cleanly onto classical trajectories. Instead of "clipping," the safe claim is that classical path intuition fails because the update rule allows non-classical transition amplitudes consistent with quantum behavior.
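The non-classical amplitudes in question are standard quantum mechanics. A sketch of the textbook WKB estimate for a rectangular barrier, T ~ exp(-2*kappa*L) with kappa = sqrt(2m(V - E))/hbar, using approximate constants:

```python
import math

HBAR = 1.0546e-34   # reduced Planck constant, J*s (approximate)
M_E = 9.109e-31     # electron mass, kg (approximate)
EV = 1.602e-19      # joules per electron-volt (approximate)

def tunnel_probability(barrier_ev: float, energy_ev: float, width_m: float) -> float:
    """Rectangular-barrier WKB estimate T ~ exp(-2 * kappa * L),
    kappa = sqrt(2m(V - E)) / hbar. Classically this probability is 0."""
    kappa = math.sqrt(2 * M_E * (barrier_ev - energy_ev) * EV) / HBAR
    return math.exp(-2 * kappa * width_m)

# A 1 eV electron against a 2 eV barrier, 1 nm wide:
t = tunnel_probability(2.0, 1.0, 1e-9)
assert 0.0 < t < 1.0  # classically forbidden, quantum-mechanically finite
```

The probability is small but nonzero, which is the operational fact SRT is interpreting: the update rule assigns finite amplitude to transitions no classical trajectory can make.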
Q: If the past is deleted, how do memories exist?
Memories are records encoded in the present physical state (brain configuration). SRT does not claim you access past states; you read present data shaped by prior rewrites.
Q: What would a "rewrite failure" look like?
SRT treats "rewrite failure" as a catastrophic breakdown of lawful update consistency. In physics terms, this would correspond to vacuum instability or a regime where effective laws cease to apply. This is speculative.
Q: Does SRT imply a programmer or creator?
No. SRT is compatible with multiple metaphysical interpretations. The theory itself is a claim about how time and update could be structured; it does not prove an external agent.
Q: What was the Big Bang under SRT?
In SRT language, the Big Bang corresponds to system initialization: the earliest describable state from which subsequent updates unfold under the kernel rules.
Q: How does SRT handle Lorentz invariance?
SRT must preserve Lorentz invariance as an observed symmetry at accessible scales. In SRT language, that means the update rule must enforce consistent causal structure and invariant signal constraints (c) across observers.
Q: What about quantum eraser and delayed-choice experiments?
SRT interprets quantum eraser results as present-state correlation structure, not literal rewriting of an independently existing past. The "retroactive" story is an interpretation; the operational fact is that correlations change under conditioning.
Q: How does SRT fit with General Relativity's smooth spacetime?
SRT treats GR as an effective macro-description. A discrete substrate could still yield a smooth continuum approximation at large scales, the same way discrete systems can approximate continuous fields.
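The "discrete substrate, smooth macro-limit" point can be shown with the simplest example of the genre: an explicit finite-difference heat-equation update on a periodic grid. Repeated discrete rewrites of a jagged state converge toward the smooth continuum profile. A toy sketch, not a spacetime model:

```python
def diffuse(cells, steps, alpha=0.25):
    """Explicit finite-difference heat-equation update on a periodic
    grid. Repeated discrete rewrites flatten a jagged state toward the
    smooth continuum solution. alpha <= 0.5 keeps the scheme stable."""
    for _ in range(steps):
        cells = [
            cells[i] + alpha * (cells[i - 1] - 2 * cells[i]
                                + cells[(i + 1) % len(cells)])
            for i in range(len(cells))
        ]
    return cells

spike = [0.0] * 16
spike[8] = 1.0                      # maximally jagged initial state
smooth = diffuse(spike, steps=200)
# Total 'substance' is conserved while the profile flattens:
assert abs(sum(smooth) - 1.0) < 1e-9
assert max(smooth) < 0.2
```

The discrete rule never stops being discrete; at coarse resolution its output is indistinguishable from the continuous diffusion equation, which is the relationship SRT posits between the substrate and GR.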
Licensed under CC BY 4.0. You are free to share and adapt this material, provided you give appropriate credit to the original author and provide a link to this source.
© 2026 LiMiT. All Rights Reserved. This post establishes the primary claim and intellectual property of the State-Rewrite Theory (SRT). Document Hash: SRT-V1-2026-LiMiT







