Moore's Malpractice
Knowing the Limits. Ignoring the Limits. Scaling the Failure.
A Technical Indictment of Electron-Density Computing
Every profession has a term for continuing despite knowing the risks. Medicine calls it iatrogenic harm—death by doctor. Engineering terms it negligent design. Law recognizes willful disregard. Finance prosecutes fraud by omission. The military court-martials for dereliction of duty.
Technology needs its own term: Moore's Malpractice. This is humanity's decision to continue scaling electron-density computation after the thermodynamic failure became undeniable, obvious, and mathematically proven.
This document presents the case that physicists, engineers, and industry leaders have known the core computational limits for decades. They understood Landauer's limit on the heat cost of irreversible computation. They witnessed Dennard scaling collapse as shrinking transistors stopped becoming more efficient. They acknowledged Amdahl's law capping the speedup parallelism can deliver. They observed Koomey's law of energy-efficiency doublings slowing. They studied Shannon's limits showing that noise erodes signal fidelity as density rises.
Yet the global technology sector made a deliberate choice: shrink transistors anyway, stack density anyway, increase clock rates anyway, electrify everything anyway, build heat traps called data centers anyway, and increase compute load by factors of a million—all while knowing full well the substrate was collapsing beneath them.
This is not innovation. This is malpractice—not by accident, but by intentional escalation of a failing architecture. What follows is the evidence, the timeline, and the consequences of this systematic failure.
The Moment Industry Knew It Was Unsustainable
1
1961: Landauer's Limit
Rolf Landauer at IBM demonstrated that erasing information is thermodynamically irreversible, dissipating a minimum of kT ln 2 of energy per bit destroyed. This wasn't a conjecture; it was mathematical proof. Heat is not a side-effect of electron computing. Heat is a fundamental, unavoidable requirement.
2
1985: Moore's Own Warning
Gordon Moore himself foresaw the scaling collapse, warning repeatedly that no exponential can continue forever and that nature imposes hard limits. Nature imposed those limits. The industry systematically ignored them.
3
2005: Dennard Scaling Dies
Around 2005, the scaling rules Robert Dennard had formulated at IBM in 1974 broke down: shrinking transistors no longer delivered proportional reductions in power. This was the official death certificate of electron-density scaling, the equivalent of a physician declaring: "This patient cannot tolerate more dosage." The industry's response? Triple the dosage.
4
2010s: Data Centers Heat the Planet
According to analyses by researchers including Jonathan Koomey and Eric Masanet, data center energy consumption grew by orders of magnitude, straining electrical grids toward collapse. Everyone in the industry saw the exponential heat curve. Everyone chose to ignore it.
The Paradox: More Scale Equals More Failure
PhotoniQ Labs identifies what we call the Heat-Driven Collapse Curve, though its foundations rest on established scientific principles that have been understood for decades. This isn't speculation—it's thermodynamics.
The curve is built on Shannon's noise limits, Joule heating, electron scattering, materials breakdown, leakage currents, hot-carrier degradation, fundamental thermodynamic inefficiencies, cooling asymptotes, and entropy production scaling exponentially with density.
The mathematics are unforgiving: if you increase transistor count, clock speed, parallelism, electrons per unit area, or switching frequency, you increase heat generation faster than you increase performance. This is an exponential divergence, not a linear trade-off.
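The divergence can be made concrete with a deliberately simplified model. Assume, illustratively rather than from any foundry roadmap, that each process node halves transistor area but, with supply voltage no longer scaling, cuts switching energy by only about 0.7x:

```python
def project(nodes: int, area_shrink: float = 0.5, energy_shrink: float = 0.7):
    """Toy post-Dennard projection: returns (density gain, power-density gain)."""
    density, power_density = 1.0, 1.0
    for _ in range(nodes):
        density /= area_shrink                        # 2x transistors per mm^2
        power_density *= energy_shrink / area_shrink  # 1.4x watts per mm^2
    return density, power_density

density_gain, heat_gain = project(5)
# Five nodes buy 32x the density but ~5.4x the heat per unit area:
# the thermal bill compounds even as per-device gains shrink.
```

Under these assumed shrink factors, the heat-per-area curve compounds at 1.4x per node no matter how the performance is spent, which is the divergence the paragraph above describes.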
"We hit a thermal wall. And then we hit ten more walls."
— Anant Agarwal, MIT Professor and Founder of Tilera
This is why Moore's Law didn't simply break—it self-immolated. The more the industry scaled electron-density computation, the faster it accelerated toward thermal collapse. And this is the precise moment when continuation became intentional malpractice.
Why Knowing the Limits Didn't Stop the Industry
This is where human psychology intersects with physical limits. Humans are rationalizing creatures, capable of extraordinary self-deception when survival is at stake. The technology industry justified continuing down a thermodynamically doomed path with a series of rationalizations that sound technical but are actually psychological defense mechanisms.
"Everyone Else Is Doing It"
The bandwagon fallacy masquerading as competitive necessity. If all competitors are driving toward a cliff, the market rewards those who accelerate fastest—until the cliff arrives.
"We'll Fix Heat Later"
The perpetual deferral of thermodynamic consequences. This is equivalent to a doctor saying "we'll deal with the bleeding after we finish the surgery."
"Cooling Tech Will Catch Up"
A faith-based assumption that engineering will overcome physics. Cooling has asymptotic limits that were well understood before data centers were built.
"AI Profits Justify It"
Short-term financial returns used to rationalize long-term systemic collapse. This is the logic of addiction applied to corporate strategy.
These are not technical arguments. They are survival strategies for executives facing uncomfortable truths. The industry knew electrons were a dead-end. They knew heat would scale faster than compute. They knew global energy grids would strain under the load. They knew data-center heat would destabilize urban infrastructure. They knew companies would eventually resort to nuclear power just to keep the lights on. They understood the economics were fundamentally unsustainable.
And despite this knowledge—or perhaps because of it—they escalated. That is the textbook definition of malpractice.
When Malpractice Becomes Fraud
Fraud, at its core, turns on two conditions: knowledge of material facts, and failure to disclose those facts to parties who would be harmed by their concealment. The computing industry has satisfied both conditions since at least the 1970s.
Knowledge of Limits
The industry has possessed detailed understanding of thermodynamic ceilings since Landauer's 1961 paper, with subsequent research by Dennard, Shannon, Amdahl, and Koomey providing increasingly precise quantification of the failure modes. This knowledge was not obscure—it was published in flagship journals, presented at major conferences, and widely discussed within the technical community.
Failure to Disclose
Despite this knowledge, the industry systematically concealed thermodynamic constraints from investors, regulators, municipal authorities, and the general public. Marketing materials describe systems as "scalable" without mentioning heat ceilings. Investor presentations omit discussion of fundamental energy limits. Municipal incentive negotiations fail to disclose long-term grid impacts.
Investor Misrepresentation
Companies market electron-bound neural networks as infinitely scalable technologies, raising billions in capital without disclosing that heat ceilings impose hard limits on maximum model size and deployment density.
Regulatory Omission
Data-center thermodynamic constraints are systematically omitted from regulatory filings, environmental impact statements, and public infrastructure planning documents.
Municipal Deception
Local governments are encouraged to subsidize energy-intensive facilities with promises of economic development, while long-term grid destabilization risks remain undisclosed until infrastructure failure becomes imminent.
Misdirected Safety Theater
AI companies dedicate enormous resources to warning about hypothetical "existential AI risk" while remaining conspicuously silent about the very real, very immediate existential risk of grid collapse and thermal runaway.
Under the Racketeer Influenced and Corrupt Organizations Act of 1970, a pattern of predicate offenses, such as mail and wire fraud committed through an enterprise for self-enrichment, can establish racketeering liability. The pattern of coordinated omission, coordinated silence, coordinated misrepresentation, and coordinated self-enrichment across the electron-based AI industry maps onto this statutory framework. This doesn't necessarily mean every individual executive harbors malicious intent, but collectively the system behaves exactly like an enterprise engaged in systemic fraud.
The Physics Case Against Electron Computation
Electron computation fails for three fundamental, unavoidable, and mathematically proven reasons. These are not engineering challenges to be overcome with better materials or smarter designs. These are laws of physics that constrain all possible implementations of electron-based information processing.
Thermodynamic Heat Limitations
Landauer's bound establishes a minimum energy dissipation per irreversible logical operation. This bound cannot be bypassed by materials science, manufacturing precision, or architectural cleverness. Every bit flip, every gate operation, every memory write generates waste heat that must be removed from the system.
As transistor density increases, heat generation scales faster than heat removal capacity. This creates an unavoidable thermal ceiling beyond which computation becomes physically impossible regardless of cooling technology.
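A minimal sketch of the bound itself, using only the Boltzmann constant; the 10^18 bits-per-second throughput below is an illustrative figure, not a measurement of any real machine:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def landauer_limit_joules(temperature_k: float) -> float:
    """Minimum dissipation per irreversibly erased bit: k_B * T * ln 2."""
    return K_B * temperature_k * math.log(2)

per_bit = landauer_limit_joules(300.0)   # ~2.87e-21 J at room temperature
# Even at this theoretical floor, a machine erasing 10^18 bits per second
# must shed ~2.9 mW; real CMOS dissipates orders of magnitude more per bit,
# which is why the floor matters as a direction, not a destination.
floor_watts = per_bit * 1e18
```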
Noise Scaling with Density
The Shannon-Hartley theorem demonstrates that channel capacity collapses as signal-to-noise ratio falls: lower operating energy and rising electromagnetic interference both erode the signal. Dense electron circuits inherently generate crosstalk, parasitic capacitance, and electromagnetic noise that corrupts signal integrity.
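The collapse is quantitative. The Shannon-Hartley formula, C = B·log2(1 + S/N), makes the error-free ceiling fall steeply with signal-to-noise ratio; the 1 GHz link bandwidth below is an illustrative assumption:

```python
import math

def capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley capacity: C = B * log2(1 + S/N)."""
    snr_linear = 10.0 ** (snr_db / 10.0)
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# A hypothetical 1 GHz on-chip link: as crosstalk drags SNR from
# 30 dB down to 0 dB, capacity drops from ~10 Gbit/s to 1 Gbit/s.
clean = capacity_bps(1e9, 30.0)   # ~9.97e9 bit/s
noisy = capacity_bps(1e9, 0.0)    # 1e9 bit/s
```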
Error correction can partially compensate, but correction itself requires additional computation, which generates additional heat, creating a vicious cycle that terminates in thermal runaway.
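The cycle has a simple closed form. If every unit of useful work triggers a fraction r of corrective work, and corrections are themselves error-prone, total work is a geometric series summing to 1/(1−r); this is a toy model of the feedback loop, not a model of any specific ECC scheme, and the overhead fractions are illustrative:

```python
def total_work_per_useful_op(overhead: float) -> float:
    """Sum of 1 + r + r^2 + ... : total compute once correction
    itself must be corrected. Diverges as overhead approaches 1."""
    if not 0.0 <= overhead < 1.0:
        raise ValueError("correction work >= useful work: runaway")
    return 1.0 / (1.0 - overhead)

# 10% overhead costs little (~1.11x); at 50% the total doubles;
# past that the series, and the heat it generates, diverge.
```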
Heat Exceeds Cooling Capacity
Cooling asymptotes represent hard physical limits determined by thermodynamics, fluid dynamics, and heat transfer coefficients. Air cooling peaks around 100 watts per square centimeter. Liquid cooling extends this to perhaps 300 W/cm². Phase-change cooling might reach 500 W/cm².
But modern processors already approach these limits, and the economic cost of cooling now exceeds the cost of the processors themselves. We've reached the point where computation is limited not by transistor availability but by our ability to remove heat fast enough.
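Taking those cooling ceilings at face value, the arithmetic for a single die is stark; the 6 cm² die area is a hypothetical chosen to be roughly the scale of a modern accelerator:

```python
# Cooling ceilings as quoted above, in W/cm^2 (illustrative round numbers):
COOLING_CEILING = {"air": 100.0, "liquid": 300.0, "phase_change": 500.0}

def max_sustainable_watts(die_area_cm2: float, method: str) -> float:
    """Power at which heat removal saturates for a die of this area."""
    return die_area_cm2 * COOLING_CEILING[method]

# A hypothetical 6 cm^2 accelerator die saturates air cooling at 600 W,
# roughly where flagship parts already operate; even phase change
# caps out at 3 kW before physics, not budget, ends the scaling.
air_cap = max_sustainable_watts(6.0, "air")             # 600 W
phase_cap = max_sustainable_watts(6.0, "phase_change")  # 3000 W
```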
The conclusion is inescapable: the more you compute using electrons, the faster you reach collapse. This is why the next era of computing must be fundamentally different—photonic rather than electronic, ambient rather than powered, thermodynamically scalable rather than heat-limited, low-entropy rather than high-dissipation, operating at room temperature rather than requiring industrial cooling infrastructure.
Everything else is malpractice. Everything else is choosing to ignore known physics for short-term profit. Everything else is building on a foundation that has already cracked.
Civilizational Consequences of Ignoring the Limits
The theoretical physics of computational limits might seem abstract, but their real-world consequences are already materializing across the globe. These aren't predictions—these are documented outcomes of continuing electron-density scaling past known thermodynamic boundaries.
Collapsed Municipal Grids
Northern Virginia, along with multiple locations across the United Kingdom, Sweden, and Ireland, has experienced grid strain or failure directly attributable to data center power demands. These aren't isolated incidents; they're the leading edge of systemic infrastructure collapse.
Heat Islands Around Data Centers
Peer-reviewed papers by Masanet, Shehabi, and Koomey document measurable temperature increases in urban areas surrounding large data centers. These heat islands affect local weather patterns, increase cooling demands for neighboring buildings, and create feedback loops that worsen energy consumption.
Data Center Water Shortages
Research from Lawrence Berkeley National Laboratory and Oregon State University demonstrates that data centers consume staggering quantities of water for cooling—often millions of gallons per day per facility. In water-stressed regions, this creates direct competition with agriculture and residential use.
"We're running up against physics. The easy scaling is over."
— Jeff Dean, Google Senior Fellow and Head of Google Brain
The Inevitable Economic Collapse
When trillion-dollar industries are built on thermodynamically impossible foundations, the question isn't whether collapse will occur—it's when, and how catastrophic the damage will be.
Analysts from Gartner, McKinsey, Morgan Stanley, and ARK Invest have quietly begun acknowledging what physicists have known for decades: artificial intelligence cannot scale infinitely using electron-based computation. The economics break before the physics do.
Energy costs are rising faster than model performance. Cooling infrastructure expenses now exceed compute hardware costs. Training runs that once cost millions now cost tens of millions, approaching hundreds of millions, with marginal performance improvements diminishing toward zero.
1000%
Energy Cost Increase
Data center energy costs have increased by an order of magnitude in five years, with no signs of plateauing.
40%
Of Budget on Cooling
Modern data centers now allocate roughly 40 percent of their operational budget to cooling infrastructure rather than computation.
$1T
Market Valuation at Risk
Over one trillion dollars in market capitalization rests on the assumption that current AI scaling trends can continue indefinitely.
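The growth rate implied by these figures is easy to back out; the tenfold-in-five-years input mirrors the energy-cost claim above, and the arithmetic itself is exact:

```python
def implied_annual_growth(overall_multiple: float, years: float) -> float:
    """Compound annual growth rate implied by an overall multiple."""
    return overall_multiple ** (1.0 / years) - 1.0

# An order-of-magnitude (10x) cost rise over five years implies roughly
# 58% compound growth per year, far beyond any grid build-out rate.
rate = implied_annual_growth(10.0, 5.0)   # ~0.585
```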
When this bubble collapses—and make no mistake, it will collapse—civil and criminal investigations inevitably follow. The companies currently positioning themselves as leaders in "AI safety" may discover they need entirely different kinds of lawyers. Those focused on preventing hypothetical future harms may find themselves answering for very real present deceptions.
The Systematic Pattern of Willful Disregard
What transforms technical failure into malpractice is the element of knowledge combined with conscious disregard. The computing industry didn't stumble blindly into thermodynamic collapse—it walked knowingly, with eyes wide open, choosing profit over physics at every decision point.
Knowledge
Published research, internal engineering reports, and decades of scientific consensus established clear thermodynamic limits.
Decision
Industry leadership made conscious choices to continue scaling despite known constraints, prioritizing market position over sustainability.
Concealment
Material facts about energy limits, heat generation, and infrastructure impacts were systematically omitted from public disclosure.
Enrichment
Executives and early investors extracted billions in value while pushing long-term consequences onto future stakeholders and the public.
This pattern—knowledge, decision, concealment, enrichment—is precisely what legal systems recognize as willful misconduct. It's the difference between a tragic accident and culpable negligence. It's the distinction between honest error and fraud.
The technology sector has spent decades building a narrative of inevitable progress, where Moore's Law represents an unstoppable force of innovation. But Moore's Law was always a temporary economic phenomenon, not a law of nature. Treating it as inevitable while ignoring the thermodynamic walls closing in represents a catastrophic failure of institutional responsibility.
Why Photonic Computing Is Thermodynamically Mandatory
The successor to electron computation isn't a matter of preference or competitive advantage—it's a matter of physical necessity. Photons offer properties that electrons fundamentally cannot match in information processing at scale.
The Electron Problem
  • Charged particles that generate electromagnetic fields
  • Interact strongly with their environment
  • Lose energy through Joule heating
  • Create crosstalk and interference
  • Limited by resistive losses in conductors
  • Require continuous energy input
  • Generate waste heat proportional to computation
The Photonic Solution
  • Neutral particles with minimal environmental interaction
  • Travel without resistance in appropriate media
  • Can occupy the same space without interference
  • Enable massive parallelism naturally
  • Operate at room temperature without cooling
  • Use orders of magnitude less energy
  • Generate negligible waste heat
This isn't about incremental improvement. Photonic computation represents a category shift in the fundamental substrate of information processing. Where electrons fight against thermodynamic limits, photons operate within them naturally. Where electron-density scaling has reached its terminal phase, photonic parallelism offers genuine scalability.
The question facing the computing industry isn't whether to transition to photonics—it's whether the transition happens as a planned evolution or a catastrophic collapse. Current trajectories strongly suggest the latter.
The Death Certificate of Moore's Law
Moore's Law didn't die peacefully in its sleep. It died screaming, superheated, unable to dissipate the energy it generated, choking on its own waste heat. The death certificate was signed in 2005 when Dennard scaling collapsed, but the industry refused to acknowledge the corpse for another fifteen years.
1
1965: Birth
Gordon Moore observes that transistor counts double approximately every year (a pace he later revised to every two years), enabling exponential performance growth at declining cost.
2
1974: Golden Age
Dennard scaling ensures that shrinking transistors also become more efficient, creating a virtuous cycle of performance and efficiency.
3
2005: Clinical Death
Dennard scaling terminates. Transistors shrink but no longer become more efficient. Heat becomes the limiting factor rather than transistor count.
4
2015: Denial Phase
Industry pivots to multi-core architectures, GPUs, and specialized accelerators—all attempts to work around thermodynamic limits rather than acknowledge them.
5
2023: Terminal Crisis
AI scaling hits energy walls. Data centers strain electrical grids. Cooling costs exceed compute costs. The thermodynamic bill comes due.
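The "golden age" and "clinical death" entries in the timeline above can be stated precisely. Under classical Dennard scaling, shrinking every device dimension and the supply voltage by a factor of 1/k yields, per the 1974 formulation:

```python
def dennard_scaling(k: float) -> dict:
    """Classical Dennard scaling factors when dimensions and voltage
    both shrink by 1/k (Dennard et al., 1974)."""
    return {
        "dimension":        1 / k,
        "voltage":          1 / k,
        "capacitance":      1 / k,
        "current":          1 / k,
        "gate_delay":       1 / k,             # C*V / I
        "power_per_device": 1 / k**2,          # V * I
        "device_density":   k**2,
        "power_density":    k**2 * (1 / k**2), # = 1: the free lunch
    }

s = dennard_scaling(2.0)
# One full shrink: 4x the devices at constant watts per unit area.
# After ~2005, voltage stopped scaling, and that last line stopped holding.
```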
What makes this particularly tragic is that the death was preventable—not by better engineering within the electron paradigm, but by acknowledging limits and transitioning to alternative computational substrates before crisis became inevitable. Instead, the industry chose escalation, extracting maximum value from a dying architecture while the consequences compound.
Ashby's Law and the Complexity Crisis
Among the lesser-known but equally devastating constraints on electron computation is Ashby's Law of Requisite Variety, which states that a control system must have at least as much variety in its responses as the system it controls. Applied to computation, this means that complex problems require complex computational substrates—and electrons simply cannot provide that variety without generating catastrophic heat.
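Variety is conventionally counted in bits, which makes the shortfall easy to quantify; the state counts below are arbitrary illustrations, not measurements of any real controller:

```python
import math

def variety_bits(distinguishable_states: int) -> float:
    """Ashby variety measured in bits: log2 of distinguishable states."""
    return math.log2(distinguishable_states)

# A regulator with only 4 distinct responses (2 bits) facing an
# environment with 1024 disturbance states (10 bits) is short 8 bits:
# by Ashby's law, at least that much disturbance passes through unregulated.
shortfall = variety_bits(1024) - variety_bits(4)   # 8.0
```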
Modern AI systems attempt to process increasingly complex data—natural language, high-resolution imagery, multimodal sensory inputs, real-time environmental data streams. Each layer of complexity requires exponentially more computational variety to capture the nuances and relationships in the data.
Electron circuits respond to this demand by packing more transistors into smaller spaces, increasing switching frequencies, and adding more layers of parallelism. But every increase in variety produces a corresponding increase in heat generation, electromagnetic interference, and power consumption.
This creates an impossible bind: the complexity of modern computational problems demands variety that electron systems cannot supply without thermally destroying themselves. We've reached the point where the substrate itself has become the bottleneck, where the medium constrains the message not because of insufficient density but because of excessive heat.
Photonic systems bypass this constraint entirely by enabling massive parallelism without the thermal penalties. Multiple photonic channels can operate simultaneously in the same physical space without interference, providing the requisite variety demanded by Ashby's Law without the thermodynamic costs that doom electron implementations.
The Path Forward: Acknowledgment and Transition
The computing industry faces a choice that is simultaneously simple and extraordinarily difficult: acknowledge reality and begin the transition to thermodynamically sustainable architectures, or continue the current trajectory toward inevitable collapse with all the economic, infrastructural, and legal consequences that entails.
01
Acknowledge Thermodynamic Limits Publicly
Industry leadership must stop marketing electron-based systems as infinitely scalable and begin honest disclosure of energy constraints, heat ceilings, and infrastructure requirements. This isn't optional—it's a legal and ethical obligation.
02
Redirect R&D Investment to Photonic Alternatives
The billions currently being poured into incremental electron-density improvements should be redirected toward photonic computing research, development, and commercialization. The technical foundations exist; what's lacking is institutional commitment.
03
Develop Hybrid Transition Architectures
Complete replacement of electron infrastructure will take decades. Interim hybrid systems that use photonics for high-bandwidth, low-heat operations while retaining electron circuits for control logic can provide a bridge.
04
Reform Disclosure and Regulatory Frameworks
Securities regulations, environmental impact assessments, and municipal infrastructure planning must incorporate thermodynamic constraints. Omitting energy and heat considerations from disclosures should be recognized as material misrepresentation.
05
Establish Industry Standards for Sustainability
Rather than waiting for regulatory mandates, forward-thinking companies should establish voluntary standards for disclosing energy efficiency, heat generation, and long-term infrastructure impacts. First movers will have competitive advantages when accountability inevitably arrives.
This transition will be painful. It will require admitting mistakes, writing down investments, and rebuilding technological stacks from the substrate up. But the alternative—continuing to scale a thermodynamically doomed architecture until catastrophic failure forces emergency response—is far worse.
Summary: Moore's Malpractice as the Defining Error of Our Age
Electron computation was a brilliant invention that enabled seven decades of unprecedented technological progress. Continuing electron-density scaling after its thermodynamic limits became undeniable was not brilliant—it was malpractice on an industrial scale.
The Definition
Moore's Malpractice: The deliberate continuation of electron-density computation after the physical, thermodynamic, and economic limits were known, causing systemic industrial, environmental, and civilizational harm.
The Evidence
Decades of published research, internal engineering knowledge, and observable infrastructure strain demonstrate that industry leadership possessed detailed understanding of computational limits and chose to ignore them.
The Consequences
Collapsed municipal grids, heat islands, water shortages, unsustainable energy consumption, and trillion-dollar market valuations built on thermodynamically impossible assumptions.
The Solution
Photonic computation is not optional. It is thermodynamically mandatory. The only question is whether the transition happens as planned evolution or catastrophic collapse.
This indictment is not criticism of individual engineers or scientists who have worked within the constraints they were given. It is recognition of a structural failure at the highest levels of institutional decision-making, where knowledge of limits was consistently subordinated to short-term financial imperatives.
The computing age's greatest achievement was building the electron-based information infrastructure that transformed civilization. Its greatest failure was continuing to build on that foundation after the cracks became visible, after the heat became unbearable, after the physics screamed for a different approach.
History will judge this era not by how fast we scaled electron computation, but by how long we ignored what we knew to be true. Moore's Malpractice stands as a warning: when profit systematically overrides physics, the physics always wins in the end.
Photonic computation is not a future possibility. It is a present necessity. The age of electron-density scaling is over. What comes next depends on whether we choose acknowledgment and transition, or denial and collapse.
Jackson's Theorems, Laws, Principles, Paradigms & Sciences…
Jackson P. Hamiter

Quantum Systems Architect | Integrated Dynamics Scientist | Entropic Systems Engineer
Founder & Chief Scientist, PhotoniQ Labs

Domains: Quantum–Entropic Dynamics • Coherent Computation • Autonomous Energy Systems

PhotoniQ Labs — Applied Aggregated Sciences Meets Applied Autonomous Energy.

© 2025 PhotoniQ Labs. All Rights Reserved.