High-Performance Uncertainty Quantification and Optimization

Korali is a high-performance framework for uncertainty quantification of computational models. It provides a scalable engine that enables sampling and optimization on large-scale HPC systems, and a multi-language interface that supports sequential and distributed (MPI) computational models written in C++ or Python, as well as pre-compiled/legacy applications.


Latest Release

(06/11/2020) Korali v2.0.0 - Download / Release Notes / Documentation

What Korali Does


Optimization

Korali offers randomized, gradient-free optimization algorithms that scale almost perfectly on large computing architectures, making them well suited to large-scale engineering applications. Additionally, Korali implements a selection of well-known gradient-based optimization algorithms. [ Examples ]
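The idea behind such derivative-free methods can be sketched in a few lines. The following is a generic (1+1) evolution strategy written for illustration only; it is not one of Korali's solvers:

```python
import random

def gradient_free_optimize(f, x0, sigma=1.0, iterations=200, seed=42):
    """Minimize f without gradients: propose a Gaussian perturbation of
    the current point and keep it only if it improves the objective."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(iterations):
        candidate = [xi + rng.gauss(0.0, sigma) for xi in x]
        fc = f(candidate)
        if fc < fx:           # accept improvements only
            x, fx = candidate, fc
        else:
            sigma *= 0.98     # shrink the step size after a failure
    return x, fx

# Example: minimize the sphere function starting away from the optimum
best_x, best_f = gradient_free_optimize(lambda v: sum(t * t for t in v), [5.0, -3.0])
```

Because each candidate evaluation is independent, methods of this family parallelize naturally, which is what makes them attractive on large machines.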

Uncertainty Quantification

Bayesian uncertainty quantification for the inference of parameters of computational models is one of the main pillars of Korali. Korali is designed to accommodate computationally demanding models by using sampling algorithms that take advantage of parallel architectures. It offers a large variety of likelihood models and prior distributions, covering a wide spectrum of modeling needs. [ Examples ]
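As a plain illustration of sampling-based Bayesian inference (a textbook sketch, not one of Korali's own samplers), a minimal random-walk Metropolis sampler inferring the mean of noisy observations might look like this:

```python
import math
import random

def metropolis(log_posterior, theta0, steps=2000, proposal_sd=0.5, seed=1):
    """Random-walk Metropolis over a scalar parameter: propose a Gaussian
    step and accept it with the usual Metropolis probability."""
    rng = random.Random(seed)
    theta, lp = theta0, log_posterior(theta0)
    samples = []
    for _ in range(steps):
        prop = theta + rng.gauss(0.0, proposal_sd)
        lp_prop = log_posterior(prop)
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            theta, lp = prop, lp_prop     # accept the proposal
        samples.append(theta)
    return samples

# Infer the mean mu of noisy observations: flat prior, unit-variance
# Gaussian likelihood (all names here are illustrative)
data = [2.1, 1.9, 2.3, 2.0, 1.8]
def log_post(mu):
    return -0.5 * sum((d - mu) ** 2 for d in data)

samples = metropolis(log_post, theta0=0.0)
posterior_mean = sum(samples[500:]) / len(samples[500:])
```

Each likelihood evaluation here stands in for a run of the computational model, which is exactly the step Korali farms out to parallel workers.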

Machine Learning

Deep neural networks and Gaussian processes make it possible to accurately approximate unknown or intractable functions, while reinforcement learning solves sequential decision-making problems with complex or chaotic dynamics. Building on its strengths in parallel tasking, optimization, and statistical modeling, Korali offers supervised and semi-supervised learning capabilities. [ Examples ]
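To illustrate the kind of sequential decision-making problem reinforcement learning addresses (a generic textbook sketch, not Korali's RL module), consider tabular Q-learning on a one-dimensional chain where the agent earns a reward by reaching the rightmost state:

```python
import random

def q_learning(n_states=5, episodes=300, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning on a 1-D chain: actions move left (0) or right (1),
    a reward of 1 is earned on reaching the last state, which ends the episode."""
    rng = random.Random(seed)
    Q = [[0.0, 0.0] for _ in range(n_states)]   # Q[state][action]
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            # epsilon-greedy action selection
            if rng.random() < eps:
                a = rng.randrange(2)
            else:
                a = max((0, 1), key=lambda x: Q[s][x])
            s2 = max(0, s - 1) if a == 0 else s + 1
            r = 1.0 if s2 == n_states - 1 else 0.0
            # one-step temporal-difference update
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            s = s2
    return Q

Q = q_learning()
greedy = [max((0, 1), key=lambda a: Q[s][a]) for s in range(4)]
```

After training, the greedy policy moves right in every non-terminal state; real problems replace this toy chain with complex or chaotic dynamics, which is where parallel experience collection pays off.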

Design Principles


Scalability

The Korali work-distribution engine is optimized to fully harness the computational resources of large-scale supercomputers, maximizing throughput and minimizing workload imbalance. Furthermore, Korali supports the execution of parallel (OpenMP, Pthreads), distributed (MPI, UPC++), and GPU-based (CUDA) models.
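The principle behind minimizing workload imbalance is dynamic scheduling: an idle worker immediately picks up the next pending model evaluation. This can be sketched with a plain thread pool (Korali itself distributes work across processes and nodes; this is only an illustration of the scheduling idea):

```python
from concurrent.futures import ThreadPoolExecutor

def model(theta):
    # Stand-in for an expensive computational model evaluation
    return sum(t * t for t in theta)

samples = [[float(i), float(-i)] for i in range(8)]

with ThreadPoolExecutor(max_workers=4) as pool:
    # Evaluations are handed out one at a time: whichever worker finishes
    # first takes the next sample, so no worker sits idle while others
    # are still busy, keeping the load imbalance small.
    results = list(pool.map(model, samples))
```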

Fault Tolerance

Korali offers a checkpoint/resume fault-tolerance mechanism in which the entire state of an experiment is stored at regular intervals and can be resumed later, whether to split the workload across launches or to recover from errors, guaranteeing that a resumed experiment produces exactly the same result as if the original launch had run to completion.
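The checkpoint/resume idea can be illustrated with a toy experiment whose full state is written to disk after every step (a generic sketch; Korali's actual state files and their format are not shown here):

```python
import json
import os
import tempfile

def run(state, total_steps, checkpoint):
    """Advance the experiment, persisting the entire state after every
    step so that a later launch can resume exactly where this one stopped."""
    while state["step"] < total_steps:
        state["result"] += state["step"]      # one unit of work
        state["step"] += 1
        with open(checkpoint, "w") as f:
            json.dump(state, f)
    return state

def launch(total_steps, checkpoint):
    if os.path.exists(checkpoint):            # resume from the stored state
        with open(checkpoint) as f:
            state = json.load(f)
    else:                                     # fresh start
        state = {"step": 0, "result": 0}
    return run(state, total_steps, checkpoint)

ckpt = os.path.join(tempfile.mkdtemp(), "experiment.json")
partial = launch(3, ckpt)   # first launch stops after 3 steps
final = launch(6, ckpt)     # second launch resumes from step 3
```

Because the resumed run replays the remaining steps from the saved state, `final` is identical to what a single uninterrupted six-step launch would have produced.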

Extensible Engine

Korali is designed as completely modular and extensible software. Researchers can easily integrate and test their own optimization and sampling algorithms in Korali. Likewise, new problem types can easily be added. All newly developed modules automatically benefit from Korali's scalability and fault-tolerance features without additional effort.
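One common way to achieve this kind of extensibility is a module registry, where each new solver registers under a configuration name and the engine instantiates it from user configuration, so every module runs through the same shared machinery. The following is a hypothetical sketch of that pattern, not Korali's actual mechanism:

```python
# Hypothetical registry mapping configuration names to solver classes
SOLVERS = {}

def register(name):
    """Class decorator: make a solver selectable by its configuration name."""
    def wrap(cls):
        SOLVERS[name] = cls
        return cls
    return wrap

@register("Optimizer/RandomSearch")
class RandomSearch:
    def run(self):
        return "running " + type(self).__name__

def make_solver(config):
    # The engine only sees the registry, so new solvers plug in without
    # touching the engine code and inherit its shared execution path.
    return SOLVERS[config["Type"]]()

solver = make_solver({"Type": "Optimizer/RandomSearch"})
```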

Multi-Language Support

Korali applications can be programmed in either C++ or Python. Additionally, Korali can sample computational models written in C++, Fortran, or Python, as well as pre-compiled applications.
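Driving a pre-compiled model typically means launching the executable with the sample's parameters and parsing its output. A generic sketch of that wrapping step (here a Python one-liner stands in for the legacy binary, and the function name is illustrative):

```python
import subprocess
import sys

def external_model(theta):
    """Evaluate a pre-compiled executable as the computational model:
    pass the parameter on the command line, read the objective from
    stdout. `sys.executable -c` stands in for the legacy binary."""
    out = subprocess.run(
        [sys.executable, "-c", f"print({theta} ** 2)"],
        capture_output=True, text=True, check=True,
    )
    return float(out.stdout)

value = external_model(3.0)
```

A wrapper like this lets the sampler treat the legacy application exactly like a native C++ or Python model.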


Journal / Conference / arXiv Papers

Please use the following article to cite Korali:

Publications that use Korali

Talks, Posters and Lectures

Contact us

Principal Investigator

Development Team