AWAY WITH HUGE, WATER- AND ENERGY-CONSUMING AI DATA CENTERS?


While our region is trying to digest the energy requirements of today’s AI data center technology, along with its unsustainable water requirements and air pollution challenges, science (a dirty word in some quarters of our federal government these days) is hard at work putting today’s massive AI data centers out of business.

Imagine a cabinet about the size of a small parcel van containing an AI data center equivalent in computing power to today’s monsters. One of the “tricks” employed in this shrinkage is that the cabinet interior is kept at -200 degrees Fahrenheit. Of course, removing enough heat from the cabinet to reach that temperature takes a lot of energy, likely equivalent to what is needed to cool a multistory office building in our borderland desert. By comparison, massive AI data centers like the one planned for Santa Teresa, New Mexico, are expected to require the energy (not to mention the water) of two typical U.S. cities combined, under the same climatic conditions.

Another trick is to use “quantum computing” in place of digital computing, the world of 0s and 1s. This step requires completely new chemistry in the chips; key is the use of tantalum instead of aluminum in the circuits that connect the components inside them. Such chips already exist. The remaining problem is to “train” them and to combine processing and data in the same chip, instead of the architecture we use today, in which processing chips are linked to separate data chips by conductors.
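For readers unfamiliar with the distinction, a standard textbook illustration (not drawn from the research described below) may help: a classical bit is always exactly 0 or 1, while a qubit can sit in a weighted blend of both states at once,

    |ψ⟩ = α|0⟩ + β|1⟩,  with |α|² + |β|² = 1,

and only collapses to a definite 0 or 1, with probabilities |α|² and |β|², when it is measured. The joint state of n qubits spans all 2^n classical bit patterns, which is where the technology’s potential computing power comes from.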

Since the “input” and “output” of a data center are digital, converting from digital to quantum and back again is a real head-scratcher; in the short term it will require today’s existing AI data centers to help out in the design phase.

The point is that our once world-leading research institutes and universities need the resources to move this technology to scale. None of what appears in this report is a secret, so you can be sure that the torrent of Chinese engineering talent is hell-bent on solving every challenge mentioned here. The retreat into some strange medieval world toward which some of our politicians seem to be heading puts all of us on a path to the edge of a cliff, one from which it will take generations to recover.

Success with the concept described here would relieve our environment of the stresses that today’s giant AI data center designs will place on all of us, not to mention the competition for our finite supplies of fossil fuels and usable water, a competition that will show up on all our electric, gas and water bills in the future.

Dan Townsend, January 29, 2026

While engineers are pursuing a range of technologies to develop qubits, the Princeton version relies on a type of circuit called a transmon qubit. Transmon qubits, used in efforts by companies including Google and IBM, are superconducting circuits that run at extremely low temperatures. Their advantages include a relatively high tolerance for outside interference and compatibility with current electronics manufacturing.

Graduate student Matthew Bland and postdoctoral researcher Faranak Bahrami, who are co-advised by Houck and de Leon, spearheaded the new chip’s design.

But the coherence time of transmon qubits has proven extremely hard to extend. Recent work from Google showed that the major limitation faced in improving their latest processor comes down to the material quality of the qubits.

The Princeton team took a two-pronged approach to redesigning the qubit. First, they used a metal called tantalum to help the fragile circuits preserve energy. Second, they replaced the traditional sapphire substrate with high-quality silicon, the standard material of the computing industry. To grow tantalum directly on silicon, the team had to overcome a number of technical challenges related to the materials’ intrinsic properties. But ultimately they prevailed, unlocking the deep potential of this combination.

Nathalie de Leon, the co-director of Princeton’s Quantum Initiative and co-principal investigator of the new qubit, said that not only does their tantalum-silicon chip outperform existing designs, but it’s also easier to mass-produce. “Our results are really pushing the state of the art,” she said.

Michel Devoret, chief scientist for hardware at Google Quantum AI, which partially funded the research, said that the challenge of extending the lifetimes of quantum computing circuits had become a “graveyard” of ideas for many physicists. “Nathalie really had the guts to pursue this strategy and make it work,” said Devoret, a recipient of the 2025 Nobel Prize in physics.

The basic processing unit of the new chip — a redesigned transmon superconducting qubit that uses tantalum on silicon — holds fragile quantum information intact nearly 15 times longer than today’s best industrial processors. Swapping that component into Google’s best chip would increase the machine’s performance by a factor of more than 1,000.

The research was primarily funded by the U.S. Department of Energy National Quantum Information Science Research Centers and the Co-design Center for Quantum Advantage (C2QA) — a center that Houck directed from 2021 to 2025, and where he is now chief scientist. The paper’s co-lead authors are postdoctoral researcher Faranak Bahrami and graduate student Matthew P. Bland.

Using tantalum makes quantum chips more robust

Houck, the Anthony H.P. Lee ’79 P11 P14 Professor of Electrical and Computer Engineering, said a quantum computer’s power hinges on two factors. The first is the total number of qubits that are strung together. The second is how many operations each qubit can perform before errors take over. By improving the quality of individual qubits, the new paper advances both. Specifically, a longer-lasting qubit helps resolve the industry’s greatest obstacles: scaling and error correction.
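A rough back-of-envelope illustration of the second factor (an illustration only, not a calculation from the paper): if a qubit stays coherent for a time T and each gate operation takes a time t_gate, then roughly

    N_ops ≈ T / t_gate

operations fit in before decoherence sets in. With gate speed held fixed, a qubit that preserves its state about 15 times longer allows about 15 times as many operations per qubit.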

The most common source of error in these qubits is energy loss. Tiny, hidden surface defects in the metal can trap and absorb energy as it moves through the circuit. This causes the qubit to rapidly lose energy during a calculation, introducing errors that multiply as more qubits are added to a chip. Tantalum typically has fewer of these defects than more commonly used metals like aluminum. Fewer errors also make it easier for engineers to correct those that do occur.

Houck and de Leon, who is an associate professor of electrical and computer engineering, first introduced the use of tantalum for superconducting chips in 2021 in collaboration with Princeton chemist Robert Cava, the Russell Wellman Moore Professor of Chemistry. Despite having no background in quantum computing, Cava, an expert on superconducting materials, had been inspired by a talk de Leon had delivered a few years earlier, and the two struck up an ongoing conversation about qubit materials. Eventually, Cava pointed out that tantalum could provide more benefits and fewer downsides. “Then she went and did it,” Cava said, referring to de Leon and the broader team. “That’s the amazing part.”

Nathalie de Leon specializes in engineering materials for quantum information technologies. She initiated the unusual collaboration between Houck, Cava and herself that has led to at least two major advances.

Researchers from all three labs followed Cava’s intuition and built a superconducting tantalum circuit on a sapphire substrate. The design demonstrated a significant boost in coherence time, in line with the world record.

Tantalum’s main advantage is that it’s exceptionally robust and can survive the harsh cleaning needed for removing contamination from the fabrication process. “You can put tantalum in acid, and still the properties don’t change,” said Bahrami, co-lead author on the new paper.


Once the contaminants were removed, the team then came up with a way to measure the next sources of energy loss. Most of the remaining loss came from the sapphire substrate. They replaced the sapphire with silicon, a material that is widely available with extremely high purity.

Combining these two materials while refining manufacturing and measurement techniques has led to one of the largest single improvements in the transmon’s history. Houck called the work “a major breakthrough on the path to enabling useful quantum computing.”

Because the improvements scale exponentially with system size, Houck said that swapping the current industry best for Princeton’s design would enable a hypothetical 1,000-qubit computer to work roughly 1 billion times better.
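One way to build intuition for that exponential claim (a toy model, not the authors’ actual analysis): if the better qubit multiplies each qubit’s contribution to overall circuit fidelity by some factor r, then an N-qubit machine improves by roughly r^N. Even a modest per-qubit factor of about 1.02 compounds to

    1.02^1000 ≈ 4 × 10^8,

on the order of the billion-fold figure quoted above, which is why per-qubit quality matters more and more as machines grow.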
