## Sebastian Kaltenbach

Fellow

### Research and Interests

- Bayesian Strategies
- Uncertainty Quantification
- Physics-aware Machine Learning
- Reduced-order modeling
- UQ in multiscale problems
- Causal Inference

### Projects

- DCoMEX: Data-Driven Computational Mechanics at Exascale

### Publications

#### 2022

- S. Kaltenbach, P. Perdikaris, and P.-S. Koutsourelakis, “Semi-supervised Invertible DeepONets for Bayesian Inverse Problems,” arXiv preprint arXiv:2209.02772, 2022.

[BibTeX] [Abstract] [DOI]

Deep Operator Networks (DeepONets) offer a powerful, data-driven tool for solving parametric PDEs by learning operators, i.e. maps between infinite-dimensional function spaces. In this work, we employ physics-informed DeepONets in the context of high-dimensional, Bayesian inverse problems. Traditional solution strategies necessitate an enormous, and frequently infeasible, number of forward model solves, as well as the computation of parametric derivatives. In order to enable efficient solutions, we extend DeepONets by employing a realNVP architecture which yields an invertible and differentiable map between the parametric input and the branch net output. This allows us to construct accurate approximations of the full posterior which can be readily adapted irrespective of the number of observations and the magnitude of the observation noise. As a result, no additional forward solves are required, nor is there any need for costly sampling procedures. We demonstrate the efficacy and accuracy of the proposed methodology in the context of inverse problems based on an anti-derivative, a reaction-diffusion, and a Darcy-flow equation.

`@article{kaltenbach_semi-supervised_2022, title = {Semi-supervised {Invertible} {DeepONets} for {Bayesian} {Inverse} {Problems}}, journal = {arXiv preprint arXiv:2209.02772}, doi = {10.48550/arXiv.2209.02772}, urldate = {2022-09-17}, publisher = {arXiv}, author = {Kaltenbach, Sebastian and Perdikaris, Paris and Koutsourelakis, P.-S.}, month = sep, year = {2022}, keywords = {Computer Science - Machine Learning, Statistics - Machine Learning, Physics - Computational Physics}, }`
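The invertible map this abstract refers to is built from affine coupling layers, the core idea of realNVP. Below is a minimal, self-contained NumPy sketch of one such layer, illustrating why the forward map is exactly invertible and yields a cheap log-determinant; this is an illustrative toy under simplifying assumptions (linear scale/shift networks, made-up weights), not the architecture from the paper.

```python
import numpy as np

def coupling_forward(x, s_w, t_w):
    """One affine coupling layer: the first half of x passes through
    unchanged; the second half is scaled/shifted by functions of the first."""
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    s = np.tanh(x1 @ s_w)      # log-scale, bounded for numerical stability
    t = x1 @ t_w               # shift
    y2 = x2 * np.exp(s) + t
    log_det = s.sum(axis=-1)   # log|det J|, needed for exact densities
    return np.concatenate([x1, y2], axis=-1), log_det

def coupling_inverse(y, s_w, t_w):
    """Exact inverse: s and t are recomputed from the unchanged half."""
    d = y.shape[-1] // 2
    y1, y2 = y[..., :d], y[..., d:]
    s = np.tanh(y1 @ s_w)
    t = y1 @ t_w
    x2 = (y2 - t) * np.exp(-s)
    return np.concatenate([y1, x2], axis=-1)

rng = np.random.default_rng(0)
s_w = rng.normal(size=(2, 2)) * 0.1   # toy "network" weights
t_w = rng.normal(size=(2, 2)) * 0.1
x = rng.normal(size=(5, 4))
y, log_det = coupling_forward(x, s_w, t_w)
x_rec = coupling_inverse(y, s_w, t_w)
print(np.allclose(x, x_rec))  # True: invertible by construction
```

Because the inverse never requires solving an equation, posterior evaluations through such a map cost no extra forward solves, which is the property the abstract exploits.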

#### 2021

- J. Eichelsdörfer, S. Kaltenbach, and P.-S. Koutsourelakis, “Physics-enhanced Neural Networks in the Small Data Regime,” in Fourth Workshop on Machine Learning and the Physical Sciences (NeurIPS 2021), 2021.

[BibTeX] [Abstract] [DOI]

Identifying the dynamics of physical systems requires a machine learning model that can assimilate observational data, but also incorporate the laws of physics. Neural networks based on physical principles, such as Hamiltonian or Lagrangian neural networks, have recently shown promising results in generating extrapolative predictions and accurately representing the system’s dynamics. We show that by additionally considering the actual energy level as a regularization term during training, and thus using physical information as an inductive bias, the results can be further improved. Especially when only small amounts of data are available, these improvements can significantly enhance the predictive capability. We apply the proposed regularization term to a Hamiltonian Neural Network (HNN) and a Constrained Hamiltonian Neural Network (CHNN) for a single and a double pendulum, generate predictions under unseen initial conditions, and report significant gains in predictive accuracy.

`@inproceedings{eichelsdorfer_physics-enhanced_2021, title = {Physics-enhanced {Neural} {Networks} in the {Small} {Data} {Regime}}, doi = {10.48550/arXiv.2111.10329}, language = {en}, author = {Eichelsd{\"o}rfer, Jonas and Kaltenbach, Sebastian and Koutsourelakis, P.-S.}, booktitle={{Fourth Workshop on Machine Learning and the Physical Sciences (NeurIPS 2021)}}, year = {2021}, }`
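The energy-level regularization described above can be sketched in a few lines: the training loss gains a term penalizing predicted states whose energy drifts from the known energy of the initial condition. The NumPy toy below assumes a single pendulum and a simple mean-squared-error fit term; the function names, trajectory layout, and weight `lam` are illustrative, not taken from the paper.

```python
import numpy as np

def pendulum_energy(q, p, m=1.0, l=1.0, g=9.81):
    """Hamiltonian of a single pendulum: kinetic plus potential energy."""
    return p**2 / (2 * m * l**2) + m * g * l * (1 - np.cos(q))

def energy_regularized_loss(pred_traj, true_traj, e0, lam=0.1):
    """Dynamics-fit loss plus a penalty for drifting off the known
    energy level e0 of the initial condition."""
    fit = np.mean((pred_traj - true_traj) ** 2)
    q, p = pred_traj[:, 0], pred_traj[:, 1]
    energy_penalty = np.mean((pendulum_energy(q, p) - e0) ** 2)
    return fit + lam * energy_penalty

# Toy usage: a trajectory that drifts in energy incurs a larger loss.
true = np.stack([np.linspace(0.1, 0.0, 50), np.linspace(0.0, -0.1, 50)], axis=1)
e0 = pendulum_energy(true[0, 0], true[0, 1])
loss_good = energy_regularized_loss(true, true, e0)
loss_drift = energy_regularized_loss(true * 1.5, true, e0)
print(loss_good < loss_drift)  # True
```

In the small-data regime, this extra term supplies physical information (energy conservation) that the few available trajectories alone cannot.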

- S. Kaltenbach and P.-S. Koutsourelakis, “Physics-Aware, Deep Probabilistic Modeling of Multiscale Dynamics in the Small Data Regime,” 14th WCCM-ECCOMAS Congress 2020, 2021.

[BibTeX] [Abstract] [DOI]

The data-based discovery of effective, coarse-grained (CG) models of high-dimensional dynamical systems presents a unique challenge in computational physics and particularly in the context of multiscale problems. The present paper offers a probabilistic perspective that simultaneously identifies predictive, lower-dimensional coarse-grained (CG) variables as well as their dynamics. We make use of the expressive ability of deep neural networks in order to represent the right-hand side of the CG evolution law. Furthermore, we demonstrate how domain knowledge that is very often available in the form of physical constraints (e.g. conservation laws) can be incorporated with the novel concept of virtual observables. Such constraints, apart from leading to physically realistic predictions, can significantly reduce the requisite amount of training data, thereby reducing the number of required, computationally expensive multiscale simulations (Small Data regime). The proposed state-space model is trained using probabilistic inference tools and, in contrast to several other techniques, does not require the prescription of a fine-to-coarse (restriction) projection nor time-derivatives of the state variables. The formulation adopted is capable of quantifying the predictive uncertainty as well as of reconstructing the evolution of the full, fine-scale system, which allows the quantities of interest to be selected a posteriori. We demonstrate the efficacy of the proposed framework in a high-dimensional system of moving particles.

`@article{kaltenbach_physics-aware_2021, title = {Physics-{Aware}, {Deep} {Probabilistic} {Modeling} of {Multiscale} {Dynamics} in the {Small} {Data} {Regime}}, issn = {2696-6999}, doi = {10.23967/wccm-eccomas.2020.280}, language = {en}, urldate = {2022-09-17}, journal = {{14th WCCM-ECCOMAS Congress 2020}}, author = {Kaltenbach, S. and Koutsourelakis, P.-S.}, month = mar, year = {2021}, }`

- S. Kaltenbach and P.-S. Koutsourelakis, “Physics-aware, probabilistic model order reduction with guaranteed stability,” in International Conference on Learning Representations (ICLR), 2021.

[BibTeX] [Abstract] [DOI]

Given (small amounts of) time-series’ data from a high-dimensional, fine-grained, multiscale dynamical system, we propose a generative framework for learning an effective, lower-dimensional…

`@inproceedings{kaltenbach_physics-aware_2021-1, title = {Physics-aware, probabilistic model order reduction with guaranteed stability}, doi = {10.48550/arXiv.2101.05834}, language = {en}, urldate = {2022-09-17}, booktitle={{International Conference on Learning Representations (ICLR)}}, author = {Kaltenbach, Sebastian and Koutsourelakis, P.-S.}, month = mar, year = {2021}, }`

#### 2020

- S. Kaltenbach and P.-S. Koutsourelakis, “Incorporating physical constraints in a deep probabilistic machine learning framework for coarse-graining dynamical systems,” Journal of Computational Physics, vol. 419, 2020.

[BibTeX] [Abstract] [DOI]

Data-based discovery of effective, coarse-grained (CG) models of high-dimensional dynamical systems presents a unique challenge in computational physics and particularly in the context of multiscale problems. The present paper offers a data-based, probabilistic perspective that enables the quantification of predictive uncertainties. One of the outstanding problems has been the introduction of physical constraints in the probabilistic machine learning objectives. The primary utility of such constraints stems from the undisputed physical laws such as conservation of mass, energy etc. that they represent. Furthermore and apart from leading to physically realistic predictions, they can significantly reduce the requisite amount of training data which for high-dimensional, multiscale systems are expensive to obtain (Small Data regime). We formulate the coarse-graining process by employing a probabilistic state-space model and account for the aforementioned equality constraints as virtual observables in the associated densities. We demonstrate how deep neural nets in combination with probabilistic inference tools can be employed to identify the coarse-grained variables and their evolution model without ever needing to define a fine-to-coarse (restriction) projection and without needing time-derivatives of state variables. We advocate a sparse Bayesian learning perspective which avoids overfitting and reveals the most salient features in the CG evolution law. The formulation adopted enables the quantification of a crucial, and often neglected, component in the CG process, i.e. the predictive uncertainty due to information loss. Furthermore, it is capable of reconstructing the evolution of the full, fine-scale system and therefore the observables of interest need not be selected a priori. We demonstrate the efficacy of the proposed framework by applying it to systems of interacting particles and a series of images of a nonlinear pendulum. In both cases we identify the underlying coarse dynamics and can generate extrapolative predictions including the forming and propagation of a shock for the particle systems and a stable trajectory in the phase space for the pendulum.

`@article{kaltenbach_incorporating_2020, title = {Incorporating physical constraints in a deep probabilistic machine learning framework for coarse-graining dynamical systems}, volume = {419}, issn = {0021-9991}, doi = {10.1016/j.jcp.2020.109673}, language = {en}, urldate = {2022-09-17}, journal = {{Journal of Computational Physics}}, author = {Kaltenbach, Sebastian and Koutsourelakis, P.-S.}, month = oct, year = {2020}, keywords = {Bayesian machine learning, Coarse graining, Multiscale modeling, Reduced order modeling, Virtual observables}, }`
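The "virtual observables" idea above — treating an equality constraint as if it were a measurement observed to equal zero with very small noise — can be sketched with a minimal unnormalized log-posterior. This NumPy toy uses a two-particle system with a made-up momentum-conservation constraint and a standard-normal prior; all names and values are illustrative assumptions, not the paper's model.

```python
import numpy as np

def log_posterior(x, y_obs, H, sigma_obs, c, sigma_virtual=1e-3):
    """Unnormalized log-posterior over state x combining:
    - a real observation y_obs = H x + noise, and
    - a virtual observable: the constraint c(x) = 0 treated as if
      it were observed to be zero with tiny noise sigma_virtual."""
    log_prior = -0.5 * np.sum(x**2)                  # standard normal prior
    resid = y_obs - H @ x
    log_lik = -0.5 * np.sum(resid**2) / sigma_obs**2
    log_virtual = -0.5 * (c(x) / sigma_virtual) ** 2  # constraint as data
    return log_prior + log_lik + log_virtual

# Toy: two momenta in 1D; total momentum should be conserved (here, zero).
momentum = lambda x: x[0] + x[1]        # c(x) = p1 + p2
H = np.array([[1.0, 0.0]])              # only particle 1's momentum is observed
y_obs = np.array([0.5])

x_feasible = np.array([0.5, -0.5])      # satisfies the constraint
x_infeasible = np.array([0.5, 0.3])     # violates it
lp_f = log_posterior(x_feasible, y_obs, H, 0.1, momentum)
lp_i = log_posterior(x_infeasible, y_obs, H, 0.1, momentum)
print(lp_f > lp_i)  # True: constraint-violating states are heavily down-weighted
```

Because the constraint enters as an ordinary likelihood term, standard probabilistic inference machinery applies unchanged, which is what lets such constraints substitute for scarce training data.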

- S. Kaltenbach and P.-S. Koutsourelakis, “Physics-aware, data-driven discovery of slow and stable coarse-grained dynamics for high-dimensional multiscale systems,” in 1st NeurIPS workshop on Interpretable Inductive Biases and Physically Structured Learning, 2020.

[BibTeX]

`@inproceedings{kaltenbach_physics-aware_2020, title = {Physics-aware, data-driven discovery of slow and stable coarse-grained dynamics for high-dimensional multiscale systems}, author = {Kaltenbach, Sebastian and Koutsourelakis, P.-S.}, year = {2020}, booktitle = {1st {NeurIPS} workshop on {Interpretable} {Inductive} {Biases and Physically Structured Learning}}, }`

### Education

- Master of Science in Computational Mechanics, TUM, 2017
- Master of Science in Medical Engineering and Technology, TUM, 2017