Three grand challenges — corrosion, carbon capture and fertilizer — meet at the quantum scale.
On a winter day in Wisconsin, you can watch chemistry at work. Road salt and moisture roughen a car’s frame. Steel structures slowly turn to rust. At the same time, we burn fuels that release carbon dioxide, and we rely on fertilizer to grow food — fertilizer that still takes a huge amount of energy to make.
Corrosion, carbon capture and fertilizer sound like separate problems. In my research as a physicist, I think of them as three stops on the same road trip, because all three are controlled by the same tiny actors: electrons. Electrons decide whether metal stays strong or crumbles. They rearrange when a catalyst turns CO₂ into something useful. And they govern nitrogen chemistry, including the reactions that produce the nitrogen-containing compounds that modern agriculture depends on. In the above figure, you can see “Rusty Coast,” “Carbon Capture Crags” and “Nitrogen Valley,” with the swirling tornado of electron correlation halting our path to useful solutions.
If we could reliably predict what electrons will do in complex materials and molecules, we could design better alloys, better catalysts and better chemical processes with far less trial and error. The catch is that electrons follow quantum mechanics. When many electrons interact strongly, the number of possibilities explodes, and even our best classical computers can struggle. Classical simulation tools are powerful and often spectacular, but some of the most important cases remain stubbornly hard.
That is why I work on quantum computing for chemistry and materials. A quantum computer processes information using quantum effects. Because electrons are quantum, it’s natural to hope a quantum computer can simulate them more directly than a classical machine can. In the best cases, this could speed up calculations dramatically.
But it’s not automatic. Today’s quantum devices are still at an early stage: impressive, but noisy and limited. For many useful simulations, we will need future, fault-tolerant quantum computers that can run long calculations while continually catching and correcting errors (think of it as built-in proofreading). My job is planning for that future: figuring out what problems are worth targeting, what algorithms make sense and what it would really take to run them on a real device.
A big part of my research is building resource estimates. These are end-to-end budgets that connect a scientific question to an engineering plan. We start with a concrete target, like a reaction step on a catalyst surface or an electron process relevant to corrosion. Then we choose a quantum algorithm, translate it into the operations the hardware can perform, account for the overhead of error correction and fold in realistic hardware assumptions. The output is something anyone can argue about productively: How many hours will it take? What are the key bottlenecks?
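To make the idea concrete, here is a back-of-the-envelope sketch of such a budget, assuming made-up placeholder numbers (the gate count, overhead and cycle time below are illustrative inputs, not figures from any real study or device):

```python
# Toy quantum resource estimate: wall-clock time from a few assumed factors.
# All numbers are illustrative placeholders, not real hardware figures.

def estimate_runtime_hours(logical_gates, overhead_factor, seconds_per_gate):
    """Total hours = (algorithm gates) x (error-correction overhead) x (time per gate)."""
    total_seconds = logical_gates * overhead_factor * seconds_per_gate
    return total_seconds / 3600

# Hypothetical inputs for a catalysis simulation on a future fault-tolerant machine:
hours = estimate_runtime_hours(
    logical_gates=1e10,     # cost of the algorithm after compilation
    overhead_factor=1.0,    # error-correction overhead folded into the gate time here
    seconds_per_gate=1e-4,  # assumed fault-tolerant logical cycle time
)
print(f"Estimated runtime: {hours:.0f} hours")  # prints "Estimated runtime: 278 hours"
```

Real estimates track many more quantities (qubit counts, code distances, magic-state factories), but the shape is the same: a chain of multiplications that exposes which factor dominates.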
Sometimes the answer is “too long,” and that’s valuable. For the kind of chemical accuracy industry needs, today’s best classical methods can still outperform today’s small quantum processors. Knowing that helps us focus. It tells us where classical computing is the right tool and where quantum methods might eventually be uniquely useful.
Other times, the estimates show why people are excited. Improvements across the stack compound. Better algorithms reduce the work, better compilation reduces overhead, better error correction reduces hardware needs and better hardware reduces the time per step. In one of our co-design studies on a CO₂-utilization catalyst, coordinated advances across these layers changed the projection by orders of magnitude, turning something that once looked like a multi-decade calculation into something closer to a day on a carefully designed future machine.
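The compounding is just multiplication, which is why coordinated advances matter so much. A minimal sketch, with invented speedup factors chosen only to show the arithmetic (not the actual numbers from the CO₂ study):

```python
# Independent improvements across the stack compound multiplicatively.
# All factors are illustrative placeholders, not results from any real study.

baseline_years = 30.0  # hypothetical initial projection

speedups = {
    "better algorithm (fewer logical gates)": 100.0,
    "better compilation (less overhead)": 10.0,
    "better error correction (smaller codes)": 5.0,
    "faster hardware (shorter cycle time)": 2.0,
}

total_speedup = 1.0
for source, factor in speedups.items():
    total_speedup *= factor

projected_days = baseline_years * 365 / total_speedup
print(f"Combined speedup: {total_speedup:,.0f}x")   # prints "Combined speedup: 10,000x"
print(f"New projection: {projected_days:.1f} days")  # prints "New projection: 1.1 days"
```

No single layer delivers the whole gain; four modest-looking factors together turn decades into roughly a day.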
Students are at the center of this. My graduate students do the research themselves, testing ideas, writing code, working with collaborators who build hardware and modeling real materials and molecules. Teaching is part of the mission, too, because the quantum workforce is expanding fast. I've helped develop new physics courses on quantum algorithms and error correction for UW–Madison's quantum computing master's degree. I also run a hackathon focused on one of the most practical questions students can ask: "What does it actually cost to use a quantum computer for science?"
When I look at rust, carbon and fertilizer, I see three everyday challenges tied together by electrons. I also see three opportunities. If we can harness quantum computers as they mature, we may be able to design materials and chemical processes with a new level of predictive power. That will translate fundamental physics into real benefits for people in Wisconsin and beyond.
—
About the Researcher
Matthew Otten is an assistant professor in UW–Madison’s Department of Physics and the principal investigator for the Otten Group, which researches theoretical quantum information science. His research mission is to make utility-scale quantum computing a reality.

