For molecular systems we know the laws of physics to extreme precision. Yet, our ability to compute properties of these systems to numerical precision is very limited. This is mainly due to two sources of computational intractability: quantum mechanics and chaos. On the one side, the accurate solution of the Schrödinger equation for large molecular systems is computationally prohibitive. On the other side, even if we knew the forces exactly, the dynamical equations for the atoms would exhibit chaotic behaviour, which implies that, just as with the weather, we have limited capacity to predict future trajectories. Some aspects of these systems remain predictable at a macroscopic scale, but they require completely different variables, such as pressure, temperature and entropy. We call this emergent theory thermodynamics.
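The sensitivity to initial conditions mentioned above can be illustrated with a minimal sketch. The example below uses the logistic map at parameter r = 4 (a standard chaotic toy system, chosen here for illustration; it is not a molecular model): two trajectories whose starting points differ by 10⁻¹⁰ stay close at first but become macroscopically different after a few dozen steps.

```python
# Illustrative sketch of chaotic sensitivity to initial conditions,
# using the logistic map x_{n+1} = r * x_n * (1 - x_n) with r = 4,
# a well-known chaotic regime. This is a toy example, not a molecular model.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.3)
b = logistic_trajectory(0.3 + 1e-10)  # tiny perturbation of the start

# Early on, the two trajectories are still essentially identical...
early_gap = abs(a[5] - b[5])
# ...but after enough iterations the separation saturates at order one.
late_gap = max(abs(ai - bi) for ai, bi in zip(a[40:], b[40:]))
print(early_gap, late_gap)
```

Because the perturbation grows roughly exponentially (at the rate of the map's Lyapunov exponent), no achievable numerical precision in the initial condition buys more than a logarithmic gain in prediction horizon, which is exactly the obstacle to long-time trajectory prediction described above.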
In the completely different discipline of machine learning, a fairly similar phenomenon takes place. The microscopic variables of an image are given by its constituent pixel values. When we analyse the image with a deep neural network (DNN), we will detect edges in the first layer, corners in the next layer, then object parts, and finally, at the top of the neural network, entire objects. We understand the world at the "emergent" level of objects and their relations, not at the level of pixels and edges. In deep learning (DL), emergence happens automatically through learning and some inductive biases, such as symmetries.
A major question we want to address in this workshop is whether we can apply the same learning paradigm to the field of molecular science to learn the correct emergent variables and dynamics.
This workshop will include a poster session; a request for posters will be sent to registered participants in advance of the workshop.
This is a Julian Schwinger Workshop on Multiscale Physics, made possible by a gift from the Julian Schwinger Foundation for Physics Research (JSF).