Note: On Monday I started working on the PRACE Summer of HPC project called "Mixed precision linear solvers for lattice QCD". Since I think most people don't know much about the subject, I am writing this post to tell you more about lattice QCD. You don't need a degree in physics to understand it (or to enjoy it), and I promise I will not use math formulas or difficult words.
The wonders of the subatomic world: tiny particles and huge computers.
You have probably heard of protons and neutrons before. They are components of the atomic nuclei we are all made of. However, there are particles even smaller than neutrons and protons, and they are called quarks and gluons.
Quarks and gluons are confined through the strong interaction to form neutrons and protons, and QCD is the theory that explains how this happens. Put very simply, QCD is the theory of how it is possible for our world to exist, starting from the tiniest parts of us.
Ok, I have explained what QCD is, but what is lattice QCD? QCD involves some arduous mathematical expressions called path integrals. Does that sound troublesome?
Don't worry, you won't have to deal with path integrals personally: thanks to lattice QCD, these path integrals are discretised in such a way that they are reduced to numerical computations. These computations are carried out on supercomputers, and they are incredibly challenging: thousands of scientists around the world are studying new methods to solve lattice QCD problems, and some are even building special supercomputers for this purpose.
To sum up what we have so far: there is a theory (QCD) that tries to explain how matter is composed, starting from the tiniest particles. This theory is very complicated, so we use an approach (lattice QCD) that transforms its path integrals into a very complex numerical problem. Finally, we need supercomputers to solve this very complex numerical problem.
You are probably thinking: do you really need a supercomputer to compute interactions between tiny particles? The answer is yes: these computations are extremely difficult. How difficult, you may wonder?
First of all, lattice QCD is one of the original computational grand challenges.
“A grand challenge is a fundamental problem in science or engineering, with broad applications, whose solution would be enabled by the application of high performance computing resources that could become available in the near future”
Other grand challenges include speech recognition and computer vision, plasma dynamics for fusion, and weather forecasting (if you are interested, check the Wikipedia page).
Moreover, lattice QCD is so computationally expensive that several supercomputers have been designed exclusively to solve this kind of problem. During our training week in Jülich for the PRACE Summer of HPC we had the opportunity to see QPACE, a supercomputer developed for lattice QCD whose computational cores are an enhanced version of the PlayStation 3 processor! Many other famous supercomputers have also been designed with the challenge posed by lattice QCD in mind: the IBM Blue Gene/L is one of them. Just to give you an idea, the IBM Blue Gene/L cost 100 million dollars and took five years to build. Consider that a single scientist would need 177,000 years to perform the computations that Blue Gene can do in one second! Finally, QCD is such an important problem in HPC that it has been used for years as a benchmark: if you really wanted to test your supercomputer, you had to do it on a QCD problem!
This is what I found out during my first days working on the project. I have told you about the physical problem and its importance in HPC. In the following posts I will talk more about the techniques we will use to tackle this important problem, and I will keep you updated on my progress.