Workflows for experimental facilities and the digital twin approach

Amedeo Perazzo
Stanford University

We recently designed and developed capabilities to stream the science data from the experimental facility to the computing facility, start the analysis job on the supercomputer, and report the results of the analysis back to the experimenters in quasi-real-time. This presentation will describe our approach and will introduce the idea of a framework that combines the analysis of experimental data with the infrastructure to spawn a digital twin of an experiment, one that can distinguish between competing hypotheses in real time and thus close the loop between experiment and theory in the study and design of novel materials. Studying materials such as polymers or catalysts through experiments requires exploring a high-dimensional parameter space under experimental constraints of time, sample availability, and cost. The critical roadblock to extracting more knowledge from spectroscopy and scattering studies at light sources is the lack of real-time interpretation of the data to inform experiment design.
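The abstract does not name the software stack behind the stream-analyze-report loop, so the following is only a minimal illustrative sketch, not the actual facility implementation: a hypothetical in-process queue stands in for the detector-to-HPC data stream, a stub reduction function stands in for the supercomputer analysis job, and a print loop stands in for the quasi-real-time feedback channel to the experimenters.

```python
# Illustrative sketch only (hypothetical components, not the facility's real stack).
import queue
import statistics
import threading
import time

data_stream = queue.Queue()   # stand-in for the experimental-to-computing-facility stream
results = queue.Queue()       # stand-in for the feedback channel back to the experimenters


def detector(n_shots: int = 5) -> None:
    """Hypothetical detector: emit one 'shot' of readings per iteration."""
    for shot in range(n_shots):
        data_stream.put({"shot": shot, "readings": [shot + 0.1 * i for i in range(10)]})
        time.sleep(0.05)      # pacing to mimic a live experiment
    data_stream.put(None)     # sentinel: end of run


def analysis_job() -> None:
    """Stand-in for the HPC analysis: reduce each shot as it arrives."""
    while True:
        shot = data_stream.get()
        if shot is None:
            results.put(None)
            break
        results.put({"shot": shot["shot"], "mean": statistics.mean(shot["readings"])})


def report_to_experimenters() -> None:
    """Quasi-real-time feedback: report each result as soon as it is ready."""
    while True:
        result = results.get()
        if result is None:
            break
        print(f"shot {result['shot']}: mean signal = {result['mean']:.2f}")


threads = [threading.Thread(target=f) for f in (detector, analysis_job, report_to_experimenters)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

In a real deployment the queues would be replaced by a wide-area streaming transport and a batch scheduler on the supercomputer; the sketch only shows the shape of the loop that closes experiment and analysis in quasi-real-time.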

Presentation (PDF File)
