Part 3: Without Quantum, We Can’t Physically Scale
A chapter in the series “Let Me Pop Your AI Bubble”
In Part 1, we discussed the growing data hunger and the vanishing supply of fresh, high-quality, human-made information. In Part 2, we highlighted the resource wall: finite power, water, and cooling capacity.
Now we reach the final boundary: the limits of physics itself.
Even if we had unlimited data and energy, we still couldn’t scale forever, because the classical computer has met its ceiling.
The End of Easy Scaling
For six decades, AI’s growth has ridden a single curve: Moore’s Law. This is the observation that the number of transistors on a chip roughly doubles every two years, making computers faster and cheaper. That exponential is now flattening.
Transistor density has slowed to single-digit gains per year (IEEE Roadmap 2024). Thermal dissipation is already at the limit of what air and water can carry away. And the cost of each new fabrication node now grows exponentially, no longer offset by falling cost per transistor (TSMC Financial Report 2023). Every doubling of compute now costs far more energy, money, and time than the one before.
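The gap between the old curve and the new one compounds quickly. A back-of-the-envelope sketch, assuming a hypothetical 5% annual density gain as a stand-in for "single-digit gains":

```python
# Illustrative arithmetic only: a decade of historical Moore's-Law scaling
# (doubling every 2 years) versus today's slower cadence (assume ~5% density
# gain per year, a hypothetical mid-single-digit figure, not a measured one).
years = 10
moore_pace = 2 ** (years / 2)   # doubling every 2 years -> 32x in a decade
slow_pace = 1.05 ** years       # 5% per year -> ~1.6x in a decade
print(f"Moore's-Law pace over {years} years: {moore_pace:.0f}x")
print(f"Single-digit pace over {years} years: {slow_pace:.2f}x")
```

Roughly 32x versus 1.6x over the same decade: the exponential hasn't just slowed, it has changed category.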
When OpenAI trained GPT-4, the training run consumed on the order of 10–20 gigawatt-hours of energy. The same amount of energy is enough to power ~2,000 U.S. homes for a year (MIT Tech Review 2024). GPT-5 or its successors would need multiples of that unless architectures change.
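The households comparison checks out as an order-of-magnitude estimate. A quick sanity check, using the article's reported range and an assumed average U.S. household consumption of roughly 10,500 kWh per year:

```python
# Back-of-the-envelope check of the scale claim. Figures are the article's
# rough estimates; the per-home figure is an assumed EIA-style ballpark.
home_kwh_per_year = 10_500           # assumed average U.S. household use
for training_gwh in (10, 20):        # reported range for the training run
    homes = training_gwh * 1_000_000 / home_kwh_per_year
    print(f"{training_gwh} GWh ≈ {homes:,.0f} home-years of electricity")
```

That lands between roughly 950 and 1,900 home-years, consistent with the "~2,000 homes" figure at the top of the range.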
Latency, Bandwidth, and the Speed of Light
Scaling doesn’t just mean more chips; it means more distance. Every new rack and every interconnect adds latency. Data has to move across boards, across fibers, across continents, and information can’t exceed the speed of light.
NVIDIA’s DGX-H200 White Paper (2024) admits that data-movement energy now exceeds compute energy in multi-GPU training clusters. The cloud has started to look more like an industrial supply chain.
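The speed-of-light floor is easy to make concrete. A hedged sketch of one-way light travel time over illustrative distances (real network latency is higher: fiber's refractive index, switching, and serialization all add on top, so these are hard lower bounds):

```python
# One-way propagation delay at the speed of light in vacuum.
# Distances are illustrative examples, not measurements of any real cluster.
C = 299_792_458  # speed of light in vacuum, m/s
for label, meters in [("across a rack row (10 m)", 10),
                      ("across a campus (1 km)", 1_000),
                      ("coast to coast (4,000 km)", 4_000_000)]:
    microseconds = meters / C * 1e6
    print(f"{label}: ≥ {microseconds:.3f} µs one way")
```

A transcontinental round trip alone costs tens of milliseconds, which is why training runs can't simply be spread across the planet.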
Quantum: The Hope & The Hype
This year’s Nobel Prize in Physics (2025) brought the quantum horizon slightly closer. The award went to John Clarke, Michel H. Devoret, and John M. Martinis for demonstrating macroscopic quantum tunneling and energy quantization in superconducting circuits. This is the physical basis of the Josephson junctions used in today’s quantum processors (Nobel Prize Press Release, 2025).
Put simply: they proved that an entire electrical circuit, something you could hold in your hand, can behave like a single quantum particle. When cooled to near absolute zero, the current in a superconducting loop can “tunnel” through an energy barrier rather than climb over it, and it can only occupy discrete energy levels; in other words, quanta.
This was the first time quantum mechanics was made visible on a human scale. That insight paved the way for superconducting qubits, now used by IBM, Google, and others.
As of 2025, IBM’s Condor processor runs on 1,121 qubits, albeit noisy ones. Error rates remain around 10⁻³ per gate, and coherence lasts only microseconds (IBM Quantum Development Roadmap 2025). Google’s Quantum AI Lab showed a narrow “quantum advantage” in a sampling task (Nature, 2023), but the experiment still required heavy classical post-processing to confirm the results.
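Why a ~10⁻³ per-gate error rate is crippling becomes obvious when errors compound over circuit depth. A simplified sketch, assuming each gate fails independently (real noise is more complicated, but the trend holds):

```python
# Illustrative only: if each gate independently succeeds with probability
# (1 - p), a circuit of d gates runs error-free with roughly (1 - p) ** d.
# This is a toy independent-noise model, not a description of real hardware.
p = 1e-3  # per-gate error rate in the ~10^-3 range cited above
for depth in (100, 1_000, 10_000):
    success = (1 - p) ** depth
    print(f"{depth:>6} gates: ~{success:.1%} chance of no error")
```

At a thousand gates the odds of a clean run fall to about a third, and at ten thousand they are effectively zero, which is why error correction, not raw qubit counts, is the real milestone.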
Quantum is a research frontier, not a production reality. It cannot yet train or serve large language models. But it’s no longer theoretical physics either: the Nobel work proved the principles can be engineered at the scale of real, buildable devices.
When quantum becomes stable and fault-tolerant, its implications for AI go beyond speed. Because qubits can occupy many states simultaneously, quantum machines could in theory perform more work per watt, slashing both energy and cooling demands, the twin bottlenecks described in Part 2: Power, Water & Server Farms. Every joule could go further; every computation could cost less heat.
The promise is a future where AI systems run not only faster but cleaner. Still, even if quantum conquers power and water, it can’t solve the first wall: data hunger.
Without authentic human input, no machine (quantum or classical) will have anything meaningful left to learn.
Beyond Quantum: Neuromorphic, Analog, and Memory-Centric Paths
If quantum remains distant, what’s next?
Neuromorphic computing tries to mimic the brain’s event-driven efficiency, using spiking neural networks to cut energy by up to 1,000× (Intel Loihi 2 Demo, 2024).
Analog or mixed-signal chips perform computation directly through electrical currents rather than binary switches, reducing power draw but increasing noise.
In-memory computing keeps data close to the processor, minimizing energy-hungry transfer bottlenecks that now dominate AI workloads.
All are promising, but none are mainstream.
We’re inventing new tools to run the same old models faster. What we need instead are new models entirely: smaller, context-aware, and able to interact with humans in real time, rather than today’s monoliths.
The Next Era
For now, quantum remains a research frontier, powerful in principle and fragile in practice. The next few years will belong to hybrid systems: smaller, distributed models running on specialized hardware and tuned by human feedback.
Expect the rise of neuromorphic accelerators in edge devices, analog co-processors in data centers, and memory-centric chips that minimize energy waste. These won’t look like the sleek, universal “thinking machines” once promised; they’ll be modular, purpose-built, and context-aware. They won’t truly understand context, but with these tools they’ll finally start using it.
Meanwhile, quantum will evolve quietly in the background: refining error correction, improving qubit stability, and moving from lab prototypes toward limited cloud access. When it matures, it will amplify these other advances, offering cleaner, more efficient compute, not replacing classical systems, but complementing them.
That’s the shape of the immediate future: a transition from monolithic AI toward collaborative intelligence. Human-guided, resource-aware, and grounded in reality. Helpful AI. Pragmatic AI. Human-centered AI.
Epilogue
Across these three walls (data, resources, and physics), one lesson repeats: scale is not intelligence.
In Part 1, we saw how AI feeds on a finite well of human creativity, consuming authentic data faster than it can be renewed.
In Part 2, we watched scale collide with the material world: grids that can’t supply infinite power and cooling systems that drain water tables.
In Part 3, we’ve arrived at the edge of physics itself, where even energy and matter impose their final terms.
Quantum computing offers a glimpse beyond that wall. For certain problems, a quantum algorithm can do the work of billions of classical operations, meaning every watt or drop of cooling water could yield far more computation. In that sense, quantum could eventually lighten the power and water burdens that define Part 2.
Quantum doesn’t erase scarcity; it only shifts it. Without a renewal of truly human data, the models running on those quantum machines will still be learning from echoes of themselves. We could run out of meaningful information long before we reach quantum maturity.
The path forward should be balanced intelligence: combining quantum efficiency, physical sustainability, and a steady supply of new human insight. The frontier belongs to those who can design systems that think with us, not for us. Systems where human judgment fuels the data, quantum efficiency eases the load, and trust anchors the loop.
Research & Reading
IEEE International Roadmap for Devices and Systems (2024). More-than-Moore Computing. https://irds.ieee.org/roadmap-2024
TSMC (2023). Annual Report: Technology Scalability and Cost Trends. https://www.tsmc.com/investorRelations/annualReports
MIT Technology Review (2024). The Hidden Cost of Training GPT-4. https://www.technologyreview.com/2024/03/18/gpt4-training-energy-costs
NVIDIA (2024). DGX-H200 Architecture White Paper. https://resources.nvidia.com/en-us-dgx-systems/dgx-h200-architecture-whitepaper
IBM Quantum (2025). Quantum Development Roadmap. https://research.ibm.com/blog/ibm-quantum-roadmap-2025
Google Quantum AI Lab (2023). Quantum Advantage in Random Circuit Sampling. Nature 621. https://www.nature.com/articles/s41586-023-06583-7
Nobel Prize in Physics 2025 — Press Release. https://www.nobelprize.org/prizes/physics/2025/press-release/
Nobel Prize in Physics 2025 — Popular Information. https://www.nobelprize.org/prizes/physics/2025/popular-information/
Devoret, M. H., et al. (1985). Measurements of Macroscopic Quantum Tunneling out of the Zero-Voltage State of a Josephson Junction. Physical Review Letters, 55(19), 1908. https://link.aps.org/doi/10.1103/PhysRevLett.55.1908
Nature News (2025). Groundbreaking Quantum-Tunnelling Experiments Win Physics Nobel. https://www.nature.com/articles/d41586-025-03194-2
Intel Labs (2024). Loihi 2 Neuromorphic Demonstration Report. https://www.intel.com/content/www/us/en/research/neuromorphic-computing.html
Wands, J. (2025). Let Me Pop Your AI Bubble — Part 1: Data Hunger. Substack. https://substack.com/home/post/p-175993916
Wands, J. (2025). Let Me Pop Your AI Bubble — Part 2: Power, Water & Server Farms. Substack. https://substack.com/home/post/p-176242138
