Berkeley Lab Develops Noise-Powered Computing Design for Energy-Efficient AI

New thermodynamic computing approach uses heat and random electron vibrations to perform complex AI tasks at a fraction of traditional power costs.

Published on Mar. 6, 2026

Researchers at Lawrence Berkeley National Laboratory have developed a new design and training framework that allows computers to use thermal noise as a power source rather than a hindrance. The 'thermodynamic computing' approach can now mimic neural networks, performing complex, nonlinear machine learning tasks at room temperature with far less energy than classical computers, which typically spend vast amounts of power suppressing that same noise.
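
To build intuition for how thermal fluctuations can drive state changes "for free," consider a minimal sketch (an illustration, not the team's actual model): an overdamped particle in a double-well potential whose energy barrier is comparable to the thermal energy kT. Simulated with Euler-Maruyama Langevin dynamics, the particle hops between the two wells, its two "bit" states, powered by noise alone. All parameters below are assumptions chosen for illustration.

```python
import numpy as np

# Illustrative sketch only: a "thermodynamic bit" as an overdamped particle
# in a double-well potential U(x) = x^4 - 2x^2, whose barrier height (~kT)
# is comparable to the thermal energy. Thermal noise alone drives hops
# between the wells at x = -1 and x = +1.

rng = np.random.default_rng(0)
kT = 1.0          # thermal energy scale (units where k_B = 1)
dt = 1e-3         # integration time step
steps = 200_000

def force(x):
    # F = -dU/dx for U(x) = x**4 - 2*x**2
    return -4 * x**3 + 4 * x

x = -1.0  # start in the left well ("state 0")
hops = 0
prev_state = np.sign(x)
for _ in range(steps):
    # Euler-Maruyama update: deterministic drift plus a thermal kick
    x += force(x) * dt + np.sqrt(2 * kT * dt) * rng.standard_normal()
    state = np.sign(x)
    if state != prev_state:
        hops += 1        # a state change driven purely by noise
        prev_state = state

print(f"noise-driven state changes: {hops}")
```

The design problem is then to shape such dynamics so that the hops compute something useful, rather than spending energy to suppress them.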

Why it matters

This breakthrough in thermodynamic computing could dramatically reduce the energy required for AI inference, which currently consumes significant power. For scale, a single Google search uses enough energy to power a 6-watt LED for 3 minutes. By shifting AI workloads to this new noise-powered hardware, energy costs could plummet, making AI more accessible and sustainable.

The details

The key innovations are bypassing equilibrium constraints that previously limited the speed of thermodynamic computing, and developing new training methods to program these 'stochastic' systems to perform complex, nonlinear calculations. Researchers used digital simulations running on 96 GPUs to evaluate over a trillion noisy trajectories and find the optimal parameters for a noise-powered neural network. This allows the hardware to function more like a traditional processor, fast and predictable, but at a fraction of the power.
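
The paper's exact procedure isn't reproduced here, but the training loop the article describes, an evolutionary search over device parameters scored by averaging many noisy trajectories, follows a familiar pattern. In the sketch below, simulate_noisy_network, its toy dynamics, and every hyperparameter are placeholders for illustration, not the team's method:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_noisy_network(params, inputs, n_trajectories=1000):
    """Hypothetical stand-in: average the stochastic device's output
    over many thermally noisy trajectories for the given parameters."""
    # A toy nonlinear, noise-perturbed readout; the real dynamics would
    # come from simulating the device's stochastic equations of motion.
    noise = rng.standard_normal((n_trajectories, inputs.shape[0]))
    out = np.tanh(inputs @ params + 0.1 * noise)
    return out.mean(axis=0)

def loss(params, inputs, targets):
    return np.mean((simulate_noisy_network(params, inputs) - targets) ** 2)

# Simple (1 + lambda) evolutionary search: mutate, keep the best candidate.
inputs = rng.standard_normal((8, 4))        # toy dataset
targets = np.sin(inputs).sum(axis=1) * 0.1
params = rng.standard_normal(4) * 0.1
best = loss(params, inputs, targets)

for generation in range(200):
    candidates = params + 0.05 * rng.standard_normal((16, params.size))
    scores = [loss(c, inputs, targets) for c in candidates]
    i = int(np.argmin(scores))
    if scores[i] < best:                    # keep mutations that score better
        params, best = candidates[i], scores[i]

print(f"final loss: {best:.4f}")
```

The costly part is this simulation-based scoring, which is why training took 96 GPUs; the payoff is that once good parameters are found and built into physical hardware, inference runs cheaply.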

The players

Lawrence Berkeley National Laboratory

A U.S. Department of Energy national laboratory conducting scientific research in various fields, including energy efficiency and computing.

Stephen Whitelam

A staff scientist at the Molecular Foundry at Lawrence Berkeley National Laboratory and co-author of the research paper.

Corneel Casert

A researcher who used the Perlmutter supercomputer at NERSC to run the evolutionary simulations needed to train the noise-powered neural network.

What they’re saying

“Thermodynamic computing is noise-powered. The premise of thermodynamic computing is that if you take a physical device with an energy scale comparable to that of thermal energy and leave it alone, it will change state over time, driven by thermal fluctuations. The goal is to program it so that this time evolution does something useful.”

— Stephen Whitelam, Staff Scientist, Molecular Foundry, Lawrence Berkeley National Laboratory (Interesting Engineering)
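
A concrete (again purely illustrative) way to see how noise-driven time evolution can "do something useful": tilt the double-well with an input-dependent linear term, and the equilibrium probability of occupying the right-hand well becomes a smooth, sigmoid-like function of the input, exactly the kind of nonlinearity a neuron needs. The potential and parameters below are assumptions, not the paper's model.

```python
import numpy as np

# Illustrative: a tilted double-well U(x) = x^4 - 2x^2 - h*x, where the
# "input" h biases the device. At thermal equilibrium, the probability of
# occupying the right well follows a Boltzmann distribution and varies
# smoothly with h -- a sigmoid-like activation computed by noise alone.

kT = 1.0
xs = np.linspace(-3, 3, 2001)

def p_right(h):
    # Boltzmann weights over a discretized state space
    U = xs**4 - 2 * xs**2 - h * xs
    w = np.exp(-(U - U.min()) / kT)   # subtract min for numerical stability
    w /= w.sum()
    return w[xs > 0].sum()            # probability mass in the right well

for h in (-2, -1, 0, 1, 2):
    print(f"input h={h:+d}  ->  P(right well) = {p_right(h):.3f}")
```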

“Training a thermodynamic neural network by simulating it digitally is expensive. But once trained and built as physical hardware, we can perform inference on that hardware for a very low energy cost.”

— Corneel Casert, Researcher (Interesting Engineering)

What’s next

The Berkeley Lab team is now seeking experimental partners to translate these digital designs into physical hardware that can be tested and deployed.

The takeaway

By harnessing thermal noise as a power source rather than fighting it, thermodynamic computing could sharply cut the energy needed for AI inference. If computationally intensive AI workloads can be shifted onto noise-powered hardware once it is built, the cost of running AI could fall dramatically, making the technology more accessible and sustainable.