High-Performance Uncertainty Quantification and Optimization
Korali is a high-performance framework for uncertainty quantification of computational models. Korali provides a scalable engine that enables sampling and optimization on large-scale HPC systems, and a multi-language interface that allows the execution of computational models written in C++ or Python, running sequentially or distributed (MPI), including pre-compiled/legacy applications.
Korali offers randomized, gradient-free optimization algorithms that scale almost perfectly on large computing architectures, making them well suited to large-scale engineering applications. Additionally, Korali implements a selection of well-known gradient-based optimization algorithms. [ Examples ]
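A minimal sketch of a Korali optimization experiment, following the pattern of the official tutorials; the exact configuration keys (e.g., "Optimizer/CMAES", "F(x)") reflect one version of Korali's dictionary-style interface and may differ in other releases:

```python
#!/usr/bin/env python3
import korali

# Objective function: Korali passes a sample object; the model reads
# the parameters and writes the objective value into "F(x)".
def negative_paraboloid(sample):
    x = sample["Parameters"][0]
    sample["F(x)"] = -x * x

k = korali.Engine()
e = korali.Experiment()

e["Problem"]["Type"] = "Optimization"
e["Problem"]["Objective Function"] = negative_paraboloid

e["Variables"][0]["Name"] = "x"
e["Variables"][0]["Lower Bound"] = -10.0
e["Variables"][0]["Upper Bound"] = +10.0

e["Solver"]["Type"] = "Optimizer/CMAES"   # gradient-free optimizer
e["Solver"]["Population Size"] = 16
e["Solver"]["Termination Criteria"]["Max Generations"] = 100

k.run(e)
```

Because the experiment is plain configuration plus a Python callable, the same script runs unchanged whether the engine executes samples sequentially or distributes them across nodes.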
Bayesian Uncertainty Quantification for the inference of parameters of computational models is one of the main pillars of Korali. Korali is designed to accommodate computationally demanding models by utilizing sampling algorithms that take advantage of parallel architectures. It offers a large variety of likelihood models and prior distributions, covering a wide spectrum of modeling needs. [ Examples ]
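As a sketch of how such an inference problem is set up, the following follows the pattern of Korali's Bayesian-inference tutorials; the specific keys ("Bayesian/Reference", "Sampler/TMCMC", etc.) and the data are illustrative and may vary across Korali versions:

```python
import korali

# Computational model: evaluates predictions at the points where the
# reference data was observed and reports them back to Korali.
def linear_model(sample, x_data):
    a = sample["Parameters"][0]
    b = sample["Parameters"][1]
    sample["Reference Evaluations"] = [a * x + b for x in x_data]

x_data = [0.0, 1.0, 2.0, 3.0]   # hypothetical observation points
y_data = [0.1, 1.9, 4.2, 5.8]   # hypothetical measurements

e = korali.Experiment()
e["Problem"]["Type"] = "Bayesian/Reference"
e["Problem"]["Likelihood Model"] = "Normal"
e["Problem"]["Reference Data"] = y_data
e["Problem"]["Computational Model"] = lambda s: linear_model(s, x_data)

e["Distributions"][0]["Name"] = "Uniform Prior"
e["Distributions"][0]["Type"] = "Univariate/Uniform"
e["Distributions"][0]["Minimum"] = -5.0
e["Distributions"][0]["Maximum"] = +5.0

e["Variables"][0]["Name"] = "a"
e["Variables"][0]["Prior Distribution"] = "Uniform Prior"
e["Variables"][1]["Name"] = "b"
e["Variables"][1]["Prior Distribution"] = "Uniform Prior"
# A Normal likelihood additionally requires a noise (sigma) variable,
# omitted here for brevity.

e["Solver"]["Type"] = "Sampler/TMCMC"
e["Solver"]["Population Size"] = 1000

korali.Engine().run(e)
```

The sampler draws the model parameters from the posterior implied by the prior, the likelihood model, and the reference data; the population of samples is what the engine parallelizes across compute resources.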
Deep Neural Networks and Gaussian Processes allow accurate approximation of unknown or intractable functions. With Reinforcement Learning, sequential decision-making problems with complex or chaotic dynamics can be solved. Together, these give Korali supervised and semi-supervised learning capabilities, building on its strengths in parallel tasking, optimization, and statistical modeling. [ Examples ]
The Korali work-distribution engine has been optimized to fully harness the computational resources of large-scale supercomputers, maximizing throughput and minimizing workload imbalance. Furthermore, Korali supports the execution of parallel (OpenMP, Pthreads), distributed (MPI, UPC++), and GPU-based (CUDA) models.
Korali offers a checkpoint/resume fault-tolerance mechanism in which the entire state of an experiment is stored regularly, so that the experiment can be resumed later after a workload split or an error, guaranteeing that a resumed experiment produces exactly the same result as if the original launch had run to completion.
Korali is designed as completely modular, extensible software. Researchers can easily integrate and test their own optimization and sampling algorithms in Korali. Likewise, new problem types can easily be added. All newly developed modules automatically benefit from Korali's scalability and fault-tolerance features without additional effort.
Korali applications can be programmed in either C++ or Python. Additionally, Korali can sample from computational models written in C++, Fortran, or Python, as well as from pre-compiled applications.
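The calling convention that makes this multi-language support possible is simple: Korali hands the model a dict-like sample object, and the model reads its parameters and writes its results back into it. A plain-Python sketch of that convention, using an ordinary dict as a stand-in for Korali's sample object:

```python
# A computational model, as Korali would call it: the framework passes
# a dict-like "sample"; the model reads the parameters and stores its
# result under "F(x)".
def model(sample):
    x = sample["Parameters"][0]
    sample["F(x)"] = -0.5 * x * x

# Stand-in for the sample object Korali would pass in:
sample = {"Parameters": [2.0]}
model(sample)
print(sample["F(x)"])  # -2.0
```

Because the model only interacts with the sample object, the same pattern works whether the model is a Python function, a C++/Fortran routine wrapped for Korali, or a launcher that invokes a pre-compiled executable and parses its output.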
Journal / Conference / arXiv Papers
Please use the following article to cite Korali:
S. Martin, D. Wälchli, G. Arampatzis, P. Koumoutsakos, "Korali: a High-Performance Computing Framework for Stochastic Optimization and Bayesian Uncertainty Quantification". arXiv 2005.13457. Zürich, Switzerland, May 2020.
D. Wälchli, S. Martin, A. Economides, L. Amoudruz, G. Arampatzis, X. Bian, P. Koumoutsakos, "Load Balancing in Large Scale Bayesian Inference". [To be published at] Proceedings of the Platform for Advanced Scientific Computing Conference (PASC2020). Geneva, Switzerland, June 2020.
G. Arampatzis, D. Wälchli, P. Weber, H. Rästas, and P. Koumoutsakos, "(μ,λ)-CCMA-ES for Constrained Optimization with an Application in Pharmacodynamics", Proceedings of the Platform for Advanced Scientific Computing Conference (PASC2019). Zürich, Switzerland, June 2019. [PDF][DOI]
Publications that use Korali
P. Karnakov, G. Arampatzis, I. Kičić, F. Wermelinger, D. Wälchli, C. Papadimitriou, P. Koumoutsakos "Data driven inference of the reproduction number (R0) for COVID-19 before and after interventions for 51 European countries". medRxiv 2020.05.21.20109314. Zürich, Switzerland, May 2020.