Atomistic simulations (such as molecular dynamics) are the largest consumers of supercomputing time worldwide. They rely on one of two classes of models: very accurate but computationally expensive quantum-mechanical models that resolve the electronic structure, and empirical interatomic potentials that postulate a simple, fast-to-compute functional form of the interatomic interaction. Applying ideas from machine learning has recently been put forward as a promising way to get the best of both worlds: the accuracy of quantum mechanics and the computational efficiency of interatomic potentials.
The purpose of an atomistic simulation is often to explore a potential energy surface, for instance when the mechanism of a molecular reaction is not known in advance. This poses a challenge for machine-learning-based approaches: the training set must contain representative atomic configurations, which in such cases are not known a priori. The challenge can be addressed by active learning, which consists in exploring and learning the potential energy surface simultaneously.
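To make the idea concrete, here is a minimal, purely illustrative sketch of such an exploration-plus-learning loop in Python. Every ingredient is a stand-in of my own choosing rather than the actual method discussed in the talk: a toy 1D function plays the role of the quantum-mechanical reference, a polynomial fit plays the role of the machine-learning potential, a random walk plays the role of molecular dynamics, and the distance to the nearest training point serves as a crude uncertainty measure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for an expensive quantum-mechanical calculation:
# a one-dimensional "potential energy surface" E(x).
def reference_energy(x):
    return np.sin(2.0 * x) + 0.1 * x**2

def fit_surrogate(xs, es, max_degree=6):
    """Cheap surrogate model: a polynomial fit stands in for a
    machine-learning interatomic potential trained on the data so far."""
    deg = min(max_degree, len(xs) - 1)
    return np.polynomial.Polynomial.fit(xs, es, deg)

def novelty(x, xs):
    """Crude uncertainty proxy: distance to the nearest training point.
    Real schemes use, e.g., extrapolation grades or ensemble variance."""
    return np.min(np.abs(np.asarray(xs) - x))

# Start from a single known configuration.
train_x = [0.0]
train_e = [reference_energy(0.0)]
surrogate = fit_surrogate(train_x, train_e)

x = 0.0
threshold = 0.3  # query the reference once the walker strays this far from known data

# Active-learning loop: explore the surface with the cheap model,
# querying the expensive reference only for unfamiliar configurations.
for step in range(200):
    x += rng.normal(scale=0.2)           # exploration move (random walk in lieu of MD)
    if novelty(x, train_x) > threshold:   # configuration lies outside the training domain
        train_x.append(x)
        train_e.append(reference_energy(x))          # expensive reference call
        surrogate = fit_surrogate(train_x, train_e)  # retrain on the fly
    energy = surrogate(x)                 # otherwise trust the cheap surrogate

print(f"reference calls: {len(train_x)} out of 200 exploration steps")
```

The point of the sketch is only the structure of the loop: the simulation never leaves the region where the surrogate has been trained without paying for a reference calculation, so the training set grows exactly where the exploration goes.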
In this talk I will present my formulation of machine-learning interatomic potentials and an active-learning algorithm for training them. I will then illustrate applications of these methods in molecular dynamics, crystal structure prediction, alloy discovery, and cheminformatics. Finally, I will discuss the mathematical challenges related to machine learning and active learning of interatomic potentials.