High Performance Data Analysis: global simulations of the interaction between the solar wind and a planetary magnetosphere

Project reference: 2211
Space is a harsh environment; however, it is not completely empty. In the solar system, space is filled with charged particles continuously expelled from the Sun's corona. This flow is the so-called "solar wind". Our goal is to deepen the understanding of the mechanisms at play in the interaction between this solar wind and the planets of the solar system.
Our group studies this interaction via three-dimensional, fully kinetic plasma simulations. In a nutshell, our simulation code solves the Maxwell equations for the electromagnetic fields, coupled with the equations of motion for the solar wind particles (which form a so-called "collisionless plasma"). This nonlinear coupling is at the origin of the complex and intricate behavior governing such systems. Ultimately, we aim to compare the results of our simulations with spacecraft observations around planets of the solar system. Our work therefore directly supports solar system exploration and the in-situ observations performed by several space missions from ESA (European Space Agency) and JAXA (Japan Aerospace Exploration Agency), such as the planetary exploration missions Rosetta, BepiColombo, and JUICE.
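To give a flavor of this field-particle coupling, here is a minimal, schematic sketch of the particle push at the heart of a particle-in-cell (PIC) code, written in Python/NumPy purely for illustration. All names are hypothetical, the fields are taken as given rather than solved for, and iPIC3D itself is an implicit PIC code written in C++, so this is a conceptual sketch rather than the actual implementation.

```python
# Schematic sketch of a PIC particle push (illustrative only:
# iPIC3D is an implicit C++ code; names here are hypothetical
# and the fields E, B are taken as given).
import numpy as np

def boris_push(x, v, E, B, q_over_m, dt):
    """Advance positions x and velocities v (both (N, 3) arrays)
    by one step under the Lorentz force, via the Boris rotation."""
    # half acceleration by the electric field
    v_minus = v + 0.5 * q_over_m * dt * E
    # rotation by the magnetic field
    t = 0.5 * q_over_m * dt * B
    s = 2.0 * t / (1.0 + np.sum(t * t, axis=1, keepdims=True))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)
    # second half acceleration, then position update
    v_new = v_plus + 0.5 * q_over_m * dt * E
    x_new = x + dt * v_new
    return x_new, v_new

# Example: 1000 particles in uniform fields
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=(1000, 3))
v = rng.normal(0.0, 1.0, size=(1000, 3))
E = np.tile([0.0, 0.0, 0.1], (1000, 1))
B = np.tile([0.0, 0.0, 1.0], (1000, 1))
x, v = boris_push(x, v, E, B, q_over_m=1.0, dt=0.01)
```

The Boris rotation is the standard explicit integrator for the Lorentz force because the magnetic rotation step conserves kinetic energy exactly in a pure magnetic field.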
With our HPC numerical simulations, typically running on Tier-1 and Tier-0 computing centers, we can model the Sun-planet interaction, but at the cost of producing large datasets of physical quantities around the planet (including electric and magnetic fields, particle trajectories, etc.). These outputs need careful and efficient post-processing. Such post-processing, based on a High-Performance Data Analytics (HPDA) approach, is the goal of this student research project.
This HPDA project focuses on increasing the efficiency and reliability of our post-processing techniques. Given the raw simulation outputs, the tasks for this project concern the computation of (1) velocity distribution functions of the particles, (2) particle currents and fluxes onto the planet surface, and (3) the global 3D shape of the planetary magnetic field; a minimal sketch of task (1) is given below. The accomplishment of these tasks will enable a much more efficient analysis of the simulations, and thus a much higher scientific impact. To conclude, the goal of this project is to build, from a given set of simulation outputs, efficient and scientifically rigorous post-processing routines that would eventually run in parallel on the computing center where the HPC simulations are performed.
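As an illustration of task (1), a velocity distribution function can be obtained by histogramming the particle velocities stored in the simulation output. The sketch below assumes the velocities are available as an (N, 3) NumPy array; the array layout and normalization convention are assumptions, not the project's actual output format.

```python
# Hypothetical sketch of task (1): build a velocity distribution
# function f(vx, vy, vz) by histogramming particle velocities.
import numpy as np

def velocity_distribution(v, bins=64, v_max=None):
    """Return a 3D histogram approximating f(v) from an (N, 3)
    array of particle velocities, normalized as a density."""
    if v_max is None:
        v_max = np.abs(v).max()
    edges = np.linspace(-v_max, v_max, bins + 1)
    f, _ = np.histogramdd(v, bins=(edges, edges, edges), density=True)
    return f, edges

# Example with synthetic Maxwellian velocities
rng = np.random.default_rng(1)
v = rng.normal(0.0, 0.5, size=(100_000, 3))
f, edges = velocity_distribution(v, bins=32)
print(f.shape)  # (32, 32, 32)
```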

Outline of the post-processing pipeline at the basis of this project. From left to right:
(left panel) we obtain the raw simulation data from our code, stored at the HPC center;
(middle panel) these data are first processed at the HPC center to visualize them and understand the overall dynamics of the simulation. The middle panel shows a 2D cut of the electron current around the planet; the solar wind flows from left to right, and the planet is shown as a solid gray sphere at the center of the box;
(right panel) in a second, more refined post-processing level, we address a specific physical process of interest in detail. As an example, the right panel shows a comparison between electron energy distribution functions from the simulation (top) and from Mariner 10 spacecraft in-situ data at Mercury (bottom).
Project Mentor: Pierre Henri
Project Co-mentor: Federico Lavorenti
Site Co-ordinator: Karim Hasnaoui
Learning Outcomes:
Development of HPDA skills for scientific research. The student will get to know the HPC methods used for the numerical modeling of space science phenomena and contribute to the interpretation of spacecraft observations.
Student Prerequisites (compulsory):
Background in physics, especially fluid mechanics and electromagnetism. Coding experience with a common scientific computing language (C/C++ and/or Fortran). Familiarity with the Linux environment.
Student Prerequisites (desirable):
Coding experience with Python.
Parallel coding (MPI and/or OpenMP).
Plasma physics.
Training Materials:
Some videos of our simulations: https://vimeo.com/user146522309
The simulation code: https://github.com/KTH-HPC/iPIC3D
A scientific article explaining the code: https://doi.org/10.1016/j.matcom.2009.08.038
A scientific article showing an application of this code: https://doi.org/10.1103/PhysRevLett.123.055101
Workplan:
The workplan is organized as follows:
- Training week (W1)
- Introduction to the working environment, the workflow of the simulation code, the structure of the code output, and the existing post-processing routines (W2)
- Definition of the most suitable approach to optimize the existing routines and implement the new ones (W3)
- Optimization and implementation (W4-W5)
- Validation and debugging (W6-W7)
- Final report writing (W8)
Final Product Description:
– Optimization of the existing post-processing routines (currently serial Python code) for both the first- and second-level analysis; these routines are to be parallelized, as sketched below, and to eventually run on the HPC center where the dataset is stored.
– Implementation of new post-processing routines (likewise parallel, and eventually running on the HPC center where the dataset is stored) to extend the second-level scientific analysis.
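As a rough illustration of how the serial Python routines could be parallelized, the sketch below distributes a list of output files over MPI ranks with mpi4py. The file pattern, the .npy format, and the analyze() body are placeholders, not the project's actual routines or data layout.

```python
# Hypothetical sketch: distribute post-processing of simulation
# output files over MPI ranks with mpi4py.
from glob import glob
from mpi4py import MPI
import numpy as np

def analyze(path):
    """Placeholder per-file analysis (e.g. a reduction over a field)."""
    data = np.load(path)  # assumed .npy field snapshot
    return float(np.mean(data))

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

files = sorted(glob("output/fields_*.npy"))       # hypothetical layout
local_results = [analyze(f) for f in files[rank::size]]  # round-robin split

# gather the per-rank results on rank 0 for the final report
all_results = comm.gather(local_results, root=0)
if rank == 0:
    flat = [r for chunk in all_results for r in chunk]
    print(f"processed {len(flat)} files")
```

Such a script would be launched on the cluster with, e.g., mpirun -n 16 python postprocess.py, so that each rank handles an independent subset of the output files.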
Adapting the Project: Increasing the Difficulty:
The new routines could be implemented directly within the simulation code structure, in order to perform the post-processing during the HPC run itself, instead of post-processing the simulation output afterwards.
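The idea is sketched below in Python purely for illustration (the actual change would live inside the C++ simulation loop): a diagnostic, here the particle flux onto a spherical "planet", is accumulated during the run instead of being recomputed afterwards from snapshots on disk. The toy free-streaming model and all names are hypothetical.

```python
# Conceptual sketch of in-situ post-processing: accumulate a
# diagnostic inside the time loop rather than from saved snapshots.
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(-2.0, 2.0, size=(10_000, 3))  # toy particle positions
v = rng.normal(0.0, 0.3, size=(10_000, 3))    # toy particle velocities
dt, r_planet = 0.05, 1.0

# drop particles that start inside the planet
outside = np.linalg.norm(x, axis=1) >= r_planet
x, v = x[outside], v[outside]

absorbed = 0  # running diagnostic, updated every step
for step in range(100):
    x += v * dt                                # free streaming (toy model)
    hit = np.linalg.norm(x, axis=1) < r_planet
    absorbed += int(hit.sum())                 # count impacts this step
    x, v = x[~hit], v[~hit]                    # remove absorbed particles

print(f"particles absorbed by the planet: {absorbed}")
```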
Adapting the Project: Decreasing the Difficulty:
Focus only on the optimization of the existing routines.
Resources:
- Datasets from HPC simulations will be provided
- The simulator will be provided
- No software license will be necessary
- To avoid foreseeable difficulties with access to Tier-0 facilities for summer students, we will instead use a Tier-2 center, to which the HPC output data will be transferred. The "Maison de la Simulation" has agreed to give both the students and the tutors access to the Paris-Saclay mesocentre (http://mesocentre.centralesupelec.fr/) for the duration of the project.
- NB: a computer will not be provided if the project is carried out online
Organisation:
IDRIS, Laboratoire Lagrange, Observatoire de la Côte d’Azur, Université Côte d’Azur, CNRS, Nice, France