Scaling HMC on large multi-CPU and/or multi-GPU architectures

Project reference: 2216

A considerable part of the data analytics and statistical modelling community relies on some form of probabilistic programming language and software to implement solvers for models written in that language. This approach lets users specify stochastic models in a natural way; the models are then translated into, e.g., C++ code which, upon execution, returns estimates of the model parameters (stochastic parameter fitting).

This project will continue the more physics-based implementation of Hamiltonian Monte Carlo (HMC) started during SoHPC 2021, to be used for sampling from probability distributions in a probabilistic programming framework, with performance, parallelism and portability as goals. To this end we will utilise MPI + OpenMP, potentially swapping the latter for TBB.
The first stage of implementation, achieved by SoHPC students, has demonstrated that reducing the complexity of the code and introducing hybrid parallelism can yield significant (roughly 2x or greater) runtime reductions even for simple models. A preliminary analysis has indicated that the performance gains are due in part to the programming model and in part to the re-introduction of physical parameters.
The successful applicant will work on a reimplementation of HMC with different sampling strategies in C++ and on its integration into the code framework. The project will involve recurring benchmarking and scalability analyses, as well as analysis of the underlying method to distinguish the effects of the physics-based formulation of HMC from those of the programming model. As such, familiarity with classical dynamics and the basics of statistical physics is highly advantageous, but the necessary background can be provided prior to the start of the project.
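For concreteness, the core HMC update the project builds on can be sketched in a few lines of NumPy. This is an illustrative single-chain sampler with a plain leapfrog integrator; the function names and default parameters are invented for the example and do not reflect the project's actual code:

```python
import numpy as np

def hmc_sample(log_prob, grad_log_prob, q0, n_samples=1000,
               step_size=0.1, n_leapfrog=20, rng=None):
    """Minimal single-chain HMC sampler (illustrative sketch only)."""
    rng = np.random.default_rng() if rng is None else rng
    q = np.asarray(q0, dtype=float)
    samples = np.empty((n_samples, q.size))
    for i in range(n_samples):
        p = rng.standard_normal(q.size)              # resample momentum
        q_new, p_new = q.copy(), p.copy()
        # Leapfrog integration of Hamilton's equations
        p_new += 0.5 * step_size * grad_log_prob(q_new)
        for _ in range(n_leapfrog - 1):
            q_new += step_size * p_new
            p_new += step_size * grad_log_prob(q_new)
        q_new += step_size * p_new
        p_new += 0.5 * step_size * grad_log_prob(q_new)
        # Metropolis accept/reject on the change in total energy
        h_old = -log_prob(q) + 0.5 * (p @ p)
        h_new = -log_prob(q_new) + 0.5 * (p_new @ p_new)
        if rng.random() < np.exp(h_old - h_new):
            q = q_new
        samples[i] = q
    return samples
```

Sampling a standard normal target, for instance, only requires `log_prob = lambda q: -0.5 * q @ q` and `grad_log_prob = lambda q: -q`; the physics-based formulation studied in the project concerns precisely the choices hidden in this sketch, such as the mass matrix and integration parameters.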

A purely Pythonic implementation, which will allow execution on the GPU, can be attempted either in parallel with the CPU-only implementation or after the latter has reached sufficient maturity.
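One natural route to such a GPU-capable Python version is to advance many chains in lockstep with array operations, so that the same code runs on a GPU simply by swapping the array library (e.g. NumPy for CuPy). The sketch below is an assumption about how this could be structured, not the project's design:

```python
import numpy as np  # could be swapped for cupy to target a GPU

def leapfrog_chains(q, p, grad_log_prob, step_size, n_steps):
    """One leapfrog trajectory for many chains at once.

    q, p: arrays of shape (n_chains, dim). All chains advance in
    lockstep, which maps directly onto GPU-style data parallelism.
    """
    p = p + 0.5 * step_size * grad_log_prob(q)
    for _ in range(n_steps - 1):
        q = q + step_size * p
        p = p + step_size * grad_log_prob(q)
    q = q + step_size * p
    p = p + 0.5 * step_size * grad_log_prob(q)
    return q, p
```

Because the leapfrog integrator is symplectic, the total energy of each chain should be nearly conserved over a trajectory, which also provides a simple correctness check for the vectorised version.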
The expected outcome will be a parallel and intuitively understandable implementation of the method to be submitted to the appropriate software package for inclusion.

Project Mentor: Anton Lebedev

Project Co-mentor: Vassil Alexandrov

Site Co-ordinator: Luke Mason

Learning Outcomes:
The student will acquire key skills such as:

– Familiarity with software development processes using Git.

– Familiarity with concepts of Bayesian inference and its applications.

– Fundamentals of statistical physics.

The student will also learn to benchmark, profile and modify CPU and multi-GPU code written in C++ and Python. Additionally, they will acquire skills in hybrid programming using MPI/OpenMP and in writing mixed-language programs.

Student Prerequisites (compulsory):
Necessary (exclusion constraints):

– Working knowledge of C++ or Python


Student Prerequisites (desirable):
Highly desirable (any two; could be acquired before starting the project):

– Familiarity with fundamental concepts of stochastics (PDF, CDF, Bayes’ rule).

– Hamiltonian mechanics or statistical physics.

– MPI/OpenMP programming.
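As a flavour of the first prerequisite, Bayes' rule P(A|B) = P(B|A) P(A) / P(B) can be checked with a small numerical example; the scenario and numbers below are purely illustrative:

```python
# Bayes' rule illustrated with a hypothetical diagnostic test
# (all probabilities here are made up for the example).
p_disease = 0.01               # prior P(A)
p_pos_given_disease = 0.95     # likelihood P(B|A)
p_pos_given_healthy = 0.05     # false-positive rate P(B|not A)

# Total probability of a positive result, P(B)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior P(A|B) via Bayes' rule
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
```

Despite the accurate test, the posterior probability of disease given a positive result is only about 16%, because the prior is small; HMC addresses the same kind of posterior computation when no closed form exists.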

Training Materials:
Will be provided to the student once selected, depending on the student's background, e.g.:
"A Conceptual Introduction to Hamiltonian Monte Carlo" by M. Betancourt.


Timeline:
Week 1: Training week
Week 2: Familiarisation with the existing code, software development with Git
Week 3–5: Project development
Week 5: Intermediate report, adjustment of the work schedule
Week 6–7: Project development
Week 8: Final report write-up

Final Product Description:
The final product will be an efficient parallel HMC implementation, together with a corresponding internal report convertible into a conference or journal paper.

Adapting the Project: Increasing the Difficulty:
The difficulty can be raised by:

– Using the existing implementation as a kernel of a more complex method.

– Interfacing with optimization methods such as simulated annealing.

– Deriving test cases for correctness of the stochastic method.

Adapting the Project: Decreasing the Difficulty:
The project consists of multiple steps building upon each other. A decrease in difficulty will correspond to aiming for an intermediate, rather than final goal.

The student will need access to multi-CPU and possibly multi-GPU machines, as well as standard computing resources (laptop, internet connection).

Hartree Centre – STFC

