Domains such as cosmology, astrophysics, and nuclear and particle physics rely heavily on stochastic simulation to design data analyses and perform inference with respect to fundamental physics theories. With advances in deep learning and Bayesian inference techniques, a new frontier has come into view that extends traditional scientific programming along two distinct directions: differentiable programming and probabilistic programming. The former plays an important role in navigating high-dimensional spaces, while the latter can manifestly incorporate generative modeling and inference into scientific data analyses. Both promise to expand the science reach of data-intensive domains through, e.g., the integration of domain knowledge and symmetries into neural networks, algorithmic robustness, alignment of scientific goals with training objectives, uncertainty quantification, causal inference, and more complete inference.
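To make the two directions concrete, the sketch below (not part of the program text; all names and numbers are illustrative assumptions) shows a toy stochastic forward model written in JAX: automatic differentiation of a log-posterior gives exact gradients for navigating the parameter space (differentiable programming), and the same log-posterior is the object a probabilistic-programming backend such as an HMC sampler would consume.

```python
# Minimal sketch, assuming a toy Gaussian "simulator"; illustrative only.
import jax
import jax.numpy as jnp

def simulate(theta, key, n=1000):
    # Toy stochastic forward model: Gaussian "events" whose mean is set by theta.
    return theta + 0.5 * jax.random.normal(key, (n,))

def log_posterior(theta, data):
    # Gaussian likelihood matching the toy simulator, plus a standard-normal prior.
    # The gradient of this function is what both gradient-based fitting and
    # gradient-based samplers (e.g. HMC) would exploit.
    log_lik = jnp.sum(-0.5 * ((data - theta) / 0.5) ** 2)
    return log_lik - 0.5 * theta**2

key = jax.random.PRNGKey(0)
data = simulate(1.3, key)          # stand-in for an observed dataset

grad_fn = jax.grad(log_posterior)  # differentiable programming: exact gradients
theta = jnp.array(0.0)
for _ in range(100):               # plain gradient ascent on the log-posterior
    theta = theta + 1e-4 * grad_fn(theta, data)
print(float(theta))                # converges toward the true value ~1.3
```

In a production pipeline the hand-written `simulate` would be replaced by the actual (differentiable) simulator, and the gradient ascent loop by an optimizer or sampler; the point is only that gradients and a probabilistic model share the same programmatic object.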
However, most production scientific pipelines are not yet compatible with these techniques, and their widespread adoption would represent a deep change in the scientific software landscape. This program will provide the opportunity for researchers from Computer Science and Fundamental Physics to develop applications as well as strategies for integrating these paradigms into existing scientific workflows, and to close the gap between what is theoretically possible and what is practically deployed.