LUP Student Papers

LUND UNIVERSITY LIBRARIES

Impact of different primary particles on Geant4 simulation execution time - study on protons and pions

Jigström Madeira, Rebecca LU (2021) FYSK02 20202
Particle and nuclear physics
Department of Physics
Abstract
The High Luminosity Large Hadron Collider (HL-LHC) at CERN, an upgrade to the LHC scheduled to begin operating in 2027, will further the potential for scientific breakthroughs in High Energy Physics. The project’s demand for computational resources, however, is predicted to exceed what will be available then. Recent research and development efforts therefore aim at optimizing the efficiency of the Monte Carlo simulations of the collisions in the LHC experiments, which today occupy 40% of those resources and are pivotal for the correct interpretation of the collected data.

This study focuses specifically on the ATLAS experiment and makes use of the Geant4 simulation software and its extensive libraries, used at the LHC to simulate the passage of particles through matter. It aims to provide useful data for future full-scale studies on improving the software’s time response. Recent studies have shown that parameters such as the build method (static or dynamic) and the version of the GCC compiler have a considerable impact on the execution time of the simulations. This research carries that analysis forward and studies the impact on simulation time of different primary particles created in the pp collisions, specifically pions, π±, and protons, p. The software’s virtual particles, geantino and charged geantino, were also studied.

A simulation benchmark with a simplified version of the ATLAS detector was run on the Aurora cluster at Lund University. This was carried out for 10 and 20 GeV particles, using both statically and dynamically compiled libraries. The statically compiled simulations were confirmed to decrease the execution time by 10%, as expected. In addition, all considered particles exhibited simulation-time distributions that agree with theoretical expectations. The virtual particles confirm the large contribution that the simulation of interactions in the detector makes to the execution time. Moreover, both the negative and positive pions registered a mean execution time about 4% smaller than the proton’s, in agreement with the pions’ smaller probability of interaction.
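To make the timing comparison concrete, the sketch below shows one way such a study could be driven. It is a minimal illustration under stated assumptions, not the actual thesis benchmark: the executable name atlasBenchmark, its acceptance of a macro file as an argument, and the event count are assumptions, while /gun/particle, /gun/energy and /run/beamOn are standard Geant4 particle-gun UI commands.

```python
"""Illustrative driver (not the thesis benchmark): time a Geant4-based
benchmark executable for each primary particle type and beam energy,
then compare mean execution times."""
import statistics
import subprocess
import tempfile
import time

PARTICLES = ["proton", "pi+", "pi-", "geantino", "chargedgeantino"]
ENERGIES_GEV = [10, 20]
EVENTS_PER_RUN = 1000   # assumed number of events per run
REPEATS = 5             # repeat each configuration to estimate a mean

def run_once(particle: str, energy_gev: int) -> float:
    """Write a one-off Geant4 macro, run the benchmark on it, return wall time in seconds."""
    macro = (
        f"/gun/particle {particle}\n"
        f"/gun/energy {energy_gev} GeV\n"
        f"/run/beamOn {EVENTS_PER_RUN}\n"
    )
    with tempfile.NamedTemporaryFile("w", suffix=".mac", delete=False) as f:
        f.write(macro)
        macro_path = f.name
    start = time.perf_counter()
    # "atlasBenchmark" is a hypothetical executable name for the benchmark build.
    subprocess.run(["./atlasBenchmark", macro_path], check=True,
                   stdout=subprocess.DEVNULL)
    return time.perf_counter() - start

def main() -> None:
    means = {}
    for particle in PARTICLES:
        for energy in ENERGIES_GEV:
            times = [run_once(particle, energy) for _ in range(REPEATS)]
            means[(particle, energy)] = statistics.mean(times)
            print(f"{particle:>16} @ {energy} GeV: "
                  f"mean {means[(particle, energy)]:.1f} s over {REPEATS} runs")
    # Comparison analogous to the one reported in the abstract: pion vs proton mean time.
    for energy in ENERGIES_GEV:
        ratio = means[("pi+", energy)] / means[("proton", energy)]
        print(f"pi+/proton mean-time ratio at {energy} GeV: {ratio:.3f}")

if __name__ == "__main__":
    main()
```

Pointing the same driver at both the statically and the dynamically linked builds of the benchmark would, under the same assumptions, allow the static-versus-dynamic comparison to be repeated as well.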
Popular Abstract
By accelerating bunches of 100 billion protons each to close to the speed of light and smashing them together, scientists at the LHC at CERN, one of the most sophisticated High Energy Physics apparatuses today, can study the smallest fundamental particles that make up all matter, including us, using various detectors. The LHC detector experiments are highly intricate, but there are still unanswered questions, and progress is necessary. The more energy we can collide particles with, the bigger the chances we have of finding new particles and new physics, and perhaps of answering questions such as “What and where is dark matter?”. Scientists at the LHC have, for this reason, worked to increase the energy of the collisions and to improve the detectors and experiments; the High Luminosity Large Hadron Collider, HL-LHC, is an upgrade to the LHC scheduled to be running by 2027.

The drawback, however, is that such advances will require an unprecedented amount of computational power, and, as foreseen today, the available resources will not meet these requirements. It is therefore imperative that the primary consumer of these resources be optimized to the fullest extent possible: the simulation of the collisions, and of the subsequent passage of the products through the various media of the detector, alone occupies as much as 40% of them.

Simulation is an important stage in the LHC experiments: for any scientific theory to be proven right, it needs to be backed up by simulation. Not only that, simulation was also crucial before construction, to predict equipment failure and guarantee the staff’s safety. The simulation software developed at CERN is called Geant4: a Swiss Army knife that provides scientists with all the tools they need to build their specific simulation in a detector, run it, and analyse the results in the most efficient way. The software is well capable of handling the amount of data it is required to process; however, simulating the millions of events that occur can still be extremely time consuming, and it can take Geant4 hours to process a single event.

The work of this thesis is intended to provide quantitative data on the impact that the type of particle being tracked has on the simulation time in the ATLAS detector experiment, given that different particles undergo different interactions within the detector. This study mainly focused on two particles detected in abundance by ATLAS, the proton and the pion: despite undergoing similar sets of processes, these particles have different masses and different probabilities of interaction. Simulations of Geant4’s virtual, non-interacting particles were also analyzed. Given the theoretical background and the framework of the particular simulation used, all collected data were in agreement with what would be expected from theory at the considered energies. These data will be useful and practical for ongoing and future research and development efforts aimed at reducing the execution time of the simulations without loss of data quality. These efforts will, hopefully, spark new scientific breakthroughs in High Energy Physics.
Please use this url to cite or link to this publication:
author: Jigström Madeira, Rebecca LU
supervisor:
organization:
course: FYSK02 20202
year: 2021
type: M2 - Bachelor Degree
subject:
keywords: Geant4, Monte Carlo method, Large Hadron Collider, simulation time, collider experiment
language: English
id: 9038005
date added to LUP: 2021-01-26 23:37:56
date last changed: 2021-01-26 23:37:56
@misc{9038005,
  abstract     = {{The High Luminosity Large Hadron Collider (HL-LHC) at CERN, an improvement to the LHC, scheduled to be operating in 2027, will further the potential for scientific breakthroughs in High Energy Physics. The project’s demand in computational resources, however, is predicted to exceed what will be available then. Thus, efficiency optimization of Monte Carlo simulations of the collisions that occur in the LHC experiments, which today occupy 40% of those resources and are pivotal for the correct interpretation of the collected data, is the aim of recent research and development efforts.

This study focuses on the ATLAS experiment, specifically, and makes use of the Geant4 simulation software and its extensive libraries, used at LHC to simulate the passage of particles through matter. It aims at providing useful data for future full scale studies on software time response improvement. Parameters such as the type of build method - static or dynamic - and version of the GCC compiler, from recent studies, have been shown to have a considerable impact in reducing the execution time of the simulations. This research carries forward this analysis and studies the impact of different primary particles, which are created in the pp collisions, on the simulation time, specifically the pions, π+/−, and protons, p. The software’s virtual particles, geantino and charged geantino, were also studied. 

A simulation benchmark was used, with a simplified version of the ATLAS detector, and was run through the Aurora cluster at Lund University. This was carried out for 10 and 20 GeV particles, using both static and dynamically compiled libraries. The statically compiled simulations were confirmed to decrease time by 10%, as was foreseeable. In addition, all considered particles exhibit simulation time distributions which agree with what would be expected from theory. The virtual particles confirm the large contribution that the simulation of interactions in the detector has on the execution time. Moreover, both the negative and positive pions registered a mean execution time about 4% smaller than the proton’s, in agreement with the pions’ smaller probability of interaction.}},
  author       = {{Jigström Madeira, Rebecca}},
  language     = {{eng}},
  note         = {{Student Paper}},
  title        = {{Impact of different primary particles on Geant4 simulation execution time - study on protons and pions}},
  year         = {{2021}},
}