
LUP Student Papers

LUND UNIVERSITY LIBRARIES

Monte Carlo production of proton-proton collision events using the ATLAS@Home framework

Sidiropoulos Kontos, Dimitrios LU (2018) FYSM60 20172
Department of Physics
Particle and nuclear physics
Abstract
The ATLAS@Home project is a volunteer computing project, part of the larger LHC@Home project, aimed at using the computational power of personal computers from volunteers around the globe who are interested in helping with the particle physics research taking place at the ATLAS experiment at the LHC. Up to this point it has run only ATLAS detector simulation tasks. This thesis explores the possibility of having the full generation of large Monte Carlo samples performed on this platform, after laying the theoretical physics groundwork and introducing the concepts and elements used by the platform. Such a task has not been attempted so far, but with computational resources increasingly limited for such tasks, due to the large amounts of data the LHC produces nowadays, the need for additional non-dedicated resources, such as those offered by ATLAS@Home, is growing. The study explores that possibility using reference Monte Carlo samples and tests whether their generation can be reliably reproduced on the virtual machine used by the project and in other environments such as the Grid and a local cluster. It also tests whether the generation and derivation of the simulation data into a file format appropriate for, and readable by, commonly used analysis software can occur in a single Grid task submission (since ATLAS@Home should also operate as a Grid site, from which the tasks are picked up and to which the output files return upon generation), without storing and transferring heavy intermediate generation files during this process. The study succeeds in meeting those conditions for the samples tested and proceeds to the successful submission of a number of such tasks to a test project, running in parallel to ATLAS@Home for such new kinds of submissions, with plans in the near future to have such tasks running on the main application.
Popular Abstract
The field of particle physics has seen significant strides in terms of new discoveries in recent years, mainly due to the contributions of the ambitious, large-scale experiments taking place at the CERN particle physics laboratory. The CERN facilities host the largest and most powerful particle accelerator, the Large Hadron Collider (LHC). In that accelerator, high-energy particle beams collide and new particles are created that eventually decay in a number of complex ways inside the detectors that constitute its four main experiments. The information provided by those collisions, relating to the nature of the produced particles, is stored and digitized, creating a "digital summary" of the event. At the ATLAS experiment currently, in what is called Run 2 of the LHC (the data from the first run were used to establish the discovery of the famous Higgs boson), the amount of data produced and stored from the proton-proton collisions is about 800 MB/s to 1 GB/s, a truly impressive amount. Processing all this data takes up considerable space and time at the computing facilities at CERN dedicated to data processing.
The interpretation of the results these data will produce requires another component: theoretical simulations of such collision events and of detector behavior, called Monte Carlo simulations. Such simulations, produced by so-called generators, consist of events that mimic those produced in particle accelerators and are based on the theoretical predictions of the model being tested. Comparing the two can lead to the confirmation or rejection of a proposed theoretical model. Those generated events also need to be processed, similarly to actual data, and thus they too draw on the processing power of the dedicated computing facilities at CERN.
However, the extremely high amounts of data produced in the current run of the LHC already consume most of the available computing power at CERN, which leaves little to no computational power for processing the needed large amounts of simulated events. To make up for this shortfall, it has been proposed to employ the processing power that volunteer computing resources can offer.
But what is volunteer computing? Imagine you need to perform a very demanding computing task for a project, and a friend who owns a computer capable of running it offers to lend it to you, when he is not using it, so you can finish your task. Volunteer computing works in a similar way: volunteers from around the globe offer the processing power of their personal computers' CPUs, when they are not using them, to help process simulations and data. Collectively, the processing power gathered from the many participating personal computers is considerable and to a degree comparable to that of the dedicated computing facilities at CERN. To participate, a volunteer simply downloads the relevant software (called BOINC) and picks the tasks they want to process.
This is what the ATLAS@Home project has been doing: taking advantage of the volunteer computing concept and assembling a community of volunteers around the globe who are interested in assisting with ATLAS research and willing to offer their computers' processing power, when they are not using them, to process mostly detector simulation tasks. Monte Carlo generation has not yet been tested on ATLAS@Home because of the conceptual differences such tasks show compared to actual data (for example, they have no actual input data, as they are randomized computer-generated events). However, with the increasing workload of the dedicated computing facilities at CERN, the need to produce these theoretical samples using alternative solutions, such as those provided by the ATLAS@Home framework, becomes ever more pressing. That is the target of this project: the successful production of Monte Carlo samples on volunteer computing resources, to ease the burden on the dedicated computing facilities at CERN.
The successful processing of such simulations with ATLAS@Home will be significant for the future of particle physics research: it will lead to higher accuracy of the ATLAS results, better interpretations, more efficient processing and possibly new discoveries in the field. With the first results looking hopeful, as a number of simulated events have already been generated on the platform at a test level, the possibilities for anyone interested in furthering our knowledge of the creation of the universe itself seem endless, and their chance to contribute is just one click away.
author
Sidiropoulos Kontos, Dimitrios LU
supervisor
organization
course
FYSM60 20172
year
type
H2 - Master's Degree (Two Years)
subject
keywords
Particle Physics, ATLAS, ATLAS@Home, framework, Volunteer Computing, Monte Carlo, BOINC, proton, collider, LHC, event, virtual machine
language
English
id
8932453
date added to LUP
2018-01-25 09:35:31
date last changed
2018-01-25 09:35:31
@misc{8932453,
  abstract     = {{The ATLAS@Home project is a volunteer computing project, part of the larger LHC@Home project, aimed at using the computational power of personal computers from volunteers around the globe who are interested in helping with the particle physics research taking place at the ATLAS experiment at the LHC. Up to this point it has run only ATLAS detector simulation tasks. This thesis explores the possibility of having the full generation of large Monte Carlo samples performed on this platform, after laying the theoretical physics groundwork and introducing the concepts and elements used by the platform. Such a task has not been attempted so far, but with computational resources increasingly limited for such tasks, due to the large amounts of data the LHC produces nowadays, the need for additional non-dedicated resources, such as those offered by ATLAS@Home, is growing. The study explores that possibility using reference Monte Carlo samples and tests whether their generation can be reliably reproduced on the virtual machine used by the project and in other environments such as the Grid and a local cluster. It also tests whether the generation and derivation of the simulation data into a file format appropriate for, and readable by, commonly used analysis software can occur in a single Grid task submission (since ATLAS@Home should also operate as a Grid site, from which the tasks are picked up and to which the output files return upon generation), without storing and transferring heavy intermediate generation files during this process. The study succeeds in meeting those conditions for the samples tested and proceeds to the successful submission of a number of such tasks to a test project, running in parallel to ATLAS@Home for such new kinds of submissions, with plans in the near future to have such tasks running on the main application.}},
  author       = {{Sidiropoulos Kontos, Dimitrios}},
  language     = {{eng}},
  note         = {{Student Paper}},
  title        = {{Monte Carlo production of proton-proton collision events using the ATLAS@Home framework}},
  year         = {{2018}},
}