
Supercomputer Emulator: AI’s New Role in Science

by Matthew Hutson, from IEEE Spectrum

Artificial intelligence has become an indispensable tool in many scientists' lives, so much so that its use by researchers now has its own moniker, AI4Science, adopted by conferences and laboratories. Last month, Microsoft announced its own AI4Science initiative, employing dozens of people spread across several countries. Chris Bishop, its director, started on the science side before gravitating to AI. He earned a Ph.D. in quantum field theory at the University of Edinburgh, then worked in nuclear fusion before machine learning caught his eye in the 1980s. He began applying neural networks to his own work. "I was kind of 25 years early," he says, "but it really has taken off." He joined Microsoft Research's Cambridge lab in 1997, eventually becoming its director, and now has a new role. We spoke about the evolution of the scientific method, lasers versus beer, and nerdy T-shirts.

IEEE Spectrum: What is Microsoft AI4Science?

Chris Bishop: All it really is is a new team that we're building. We see a very exciting opportunity over the next decade at the intersection of machine learning and the natural sciences: chemistry, physics, biology, astronomy, and so on. It goes beyond simply the application of machine learning in the natural sciences.

How does it go beyond that?

Bishop: There was a technical fellow at Microsoft, Jim Gray, who talked about four paradigms of scientific discovery. The first paradigm is the purely empirical. It's observing regularities in the world around us.

"We see a new paradigm emerging. You can trace its origins back many decades, but it's a different way of using machine learning in the natural sciences."
-Chris Bishop, Microsoft Research

The second paradigm is the theoretical. Think of Newton's laws of motion, or Maxwell's equations. These are typically differential equations. It's an inductive step, an assumption that they describe the world more generally. An equation is incredibly precise over many scales of length and time, and you can write it on your T-shirt.

The third transformation in scientific discovery began in the middle of the 20th century, with the development of digital computers and simulations, effectively solving these differential equations for weather forecasting and other applications.

The fourth paradigm, taking off in the 21st century, is not about using computers to solve equations from first principles. Rather, it's about analyzing empirical data at scale using computers. Machine learning thrives in that space. Think of the Large Hadron Collider, the James Webb Space Telescope, or protein-binding experiments.

These four paradigms all work together.

We see a new paradigm emerging. You can trace its origins back many decades, but it's a different way of using machine learning in the natural sciences. In the third paradigm, you run a complicated simulation on a supercomputer; then the next day, somebody asks a different question. You take a deep breath and feed more coins into the electricity meter. We can now use those simulation inputs and outputs as training data for machine-learning deep neural nets, which learn to replicate or emulate the simulator. If you use the emulator many times, you amortize the cost of generating the training data and the cost of training. And now you have this hopefully fairly general-purpose emulator, which you can run orders of magnitude faster than the simulation.
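To make that fifth-paradigm loop concrete, here is a minimal sketch in Python. A crude damped-oscillator integrator stands in for an expensive supercomputer simulation, and a small neural network is trained on its input-output pairs to act as the emulator. Every function, parameter, and size here is illustrative, not anything from Microsoft's actual tooling.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Toy "simulator": crude Euler integration of a damped oscillator,
# a stand-in for an expensive supercomputer code.
def simulate(damping, stiffness, steps=20_000, dt=1e-3):
    x, v = 1.0, 0.0
    for _ in range(steps):
        a = -stiffness * x - damping * v   # acceleration
        v += a * dt
        x += v * dt
    return x  # final position: the quantity we want to emulate

rng = np.random.default_rng(0)

# Third paradigm: run the costly simulation many times to build a
# data set of (input parameters, output) pairs.
params = rng.uniform([0.1, 1.0], [2.0, 10.0], size=(500, 2))
outputs = np.array([simulate(d, k) for d, k in params])

# Fifth paradigm: train a neural net on those pairs so it learns to
# replicate, or emulate, the simulator.
emulator = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000,
                        random_state=0).fit(params, outputs)

# New questions are now answered by a single forward pass instead of
# re-running the integrator, amortizing the training-data cost.
print(emulator.predict([[0.5, 4.0]])[0], "vs", simulate(0.5, 4.0))
```

The speedup shows up on the query side: each emulator prediction is one cheap forward pass, while each simulator call repeats the full time integration.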

Roughly how much simulation data is needed to train an emulator?

Bishop: A lot of machine learning is an empirical science. It involves trying out different architectures and amounts of data and seeing how things scale. You can't say ahead of time, "I need 56 million data points to do this particular task."

What is interesting, though, are techniques in machine learning that are a little bit more intelligent than just regular training. Techniques like active learning and reinforcement learning, where a system has some understanding of its limitations. It could request more data where it has more uncertainty.
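Here is one way that idea can look in code: a minimal active-learning sketch, assuming deep-ensemble disagreement as the uncertainty estimate (one common choice among many). The simulator and all sizes are toy placeholders.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def simulate(x):
    """Expensive black-box simulator (a toy stand-in here)."""
    return np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(10, 1))           # small initial design
y = simulate(X).ravel()
pool = np.linspace(-2, 2, 400).reshape(-1, 1)  # candidate query points

for _ in range(5):  # five rounds of active learning
    # Train a small ensemble; the spread of its predictions is a
    # cheap proxy for the emulator's uncertainty at each candidate.
    preds = np.stack([
        MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                     random_state=seed).fit(X, y).predict(pool)
        for seed in range(5)
    ])
    uncertainty = preds.std(axis=0)

    # Spend the simulation budget only where the emulator is least
    # certain, then fold the new data point into the training set.
    query = pool[np.argmax(uncertainty)].reshape(1, 1)
    X = np.vstack([X, query])
    y = np.append(y, simulate(query).ravel())
```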

What are emulation's weaknesses?

Bishop: They can still be computationally very expensive. Additionally, emulators learn from data, so they're typically not more accurate than the data used to train them. Moreover, they may give insufficiently accurate results when presented with scenarios that are markedly different from those on which they're trained.

"I believe in 'use-inspired basic research' [like] the work of Pasteur. He was a consultant for the brewing industry. Why did this beer keep going sour? He basically founded the whole field of microbiology."
-Chris Bishop, Microsoft Research

Are all of Microsoft AI4Science's projects based on emulation?

Bishop: No. We do quite a bit of work in drug discovery. That's at the moment entirely fourth-paradigm-based. It's based on empirical observations of the properties of certain molecules, and using machine learning to infer the properties of molecules that weren't part of the training set, and then to reverse that process and say, given a set of properties, can we find new molecules which have those properties? We have a five-year research partnership with Novartis.
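As a rough illustration of that forward-then-reverse pattern, here is a hedged sketch: a property predictor trained on featurized molecules, with the "reverse" step done by brute-force virtual screening of a candidate library. The fingerprints and property values below are random placeholders (a real pipeline would use assay data and curated molecular representations), and generative models are the more sophisticated route to inverse design.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)

# Placeholder data: each molecule is a fingerprint-style bit vector,
# each target a measured property (random stand-ins here).
train_mols = rng.integers(0, 2, size=(1000, 128))
train_props = rng.normal(size=1000)

# Forward direction (fourth paradigm): learn property from structure.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(train_mols, train_props)

# Reverse direction, done crudely: score a large candidate library
# and keep the molecules whose predicted property is closest to the
# value we want.
library = rng.integers(0, 2, size=(50_000, 128))
target_property = 1.5
gap = np.abs(model.predict(library) - target_property)
top_hits = library[np.argsort(gap)[:10]]  # ten best candidates
```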

What are some other projects you're working on?

Bishop: We're looking actively at partnerships. Microsoft brings a couple of things. We have a lot of expertise in machine learning. We also have a lot of expertise in very-large-scale compute and cloud computing. What we're not endeavoring to do, though, is to be domain experts. We don't want to be a drug company, we don't want to be an expert in catalysis. We are bringing in people who have expertise in quantum chemistry, quantum physics, catalysis, and so on, but really to allow us to build an interface with collaborators and partners.

The bigger picture is we're working anywhere we've got these differential equations. It could be fluid flows, designing turbines, predicting the weather, large-scale astronomical phenomena, plasma in nuclear reactors. A lot of our emphasis is on molecular-scale simulation. Scientifically, it holds some of the most challenging and some of the most interesting problems, but the applicability is also enormous: drug discovery, sustainability. We've been thinking about direct air capture of carbon dioxide.

Is the goal to publish papers or to build intellectual property and products?

Bishop: We have, I guess, three goals. First and foremost, it's about building up our research. Peer-reviewed publication will be a key outlet.

Second, Microsoft is a company whose business model is empowering others to be successful. So one of the things we'll be looking for is how we can turn some of the research advances into cloud-based services which can then be used commercially or by the academic world. The breadth of applicability of this is potentially enormous. If you just think about molecular simulation, it's drugs, it's lubricants, it's protection against corrosion, it's carbon capture, it's catalysis for the chemical industry, and so on.

And then the third goal, ultimately, is to see real-world impact: health care, sustainability, climate change.

Do you foresee advances not just in the domains where you're helping partners but also in pure computer science and machine learning?

Bishop: That's a great question. I believe in "use-inspired basic research." People think in terms of a very linear model, in which you have basic research at one end and applied research at the other. A great example would be Einstein. He discovers stimulated emission with a pencil and paper and a brain, and then later it gets used to build the laser.

But there's a different kind of research, which is often characterized by the work of Pasteur. He was a consultant for the brewing industry. Why did this beer keep going sour? He basically founded the whole field of microbiology. I think about that as use-inspired basic research.

I hope to see that as we go after really hard problems. We're trying to build a neural net that can understand the dynamics of molecules, and we're going to need new neural-network architectures. And that might spill over into completely different domains.

What will the sixth scientific paradigm be? Will AI generate new hypotheses?

Bishop: I have no idea what the sixth paradigm is. But I think the fifth paradigm will keep us pretty busy for the next decade or more.

This transcript has been edited for brevity and clarity.
