
LUP Student Papers

LUND UNIVERSITY LIBRARIES

Scrutinizing the Schmidt Test and Exploring the Use of Machine Learning for Statistical Assessment of Radioactive Decay Chains Stemming from Superheavy Nuclei Research

Nelissen, Pim LU (2024) FYSK04 20241
Particle and nuclear physics
Department of Physics
Abstract
Experimental nuclear structure data from superheavy nuclei synthesis experiments often consists of correlated alpha-decay chains. In the absence of neutron detectors - which would fully characterize the exit channel after the fusion-evaporation reaction - the sequence of decay energies and half-lives is the 'fingerprint' of the exit channel itself. Experimental data in this region is sparse, and its interpretation can be liable to error or confirmation bias. The so-called "Schmidt test" is a method for determining the congruence of correlation times for a set of measurements of one decay step, but its outcome is not always entirely conclusive. This study evaluates the congruence derived from the Schmidt test using Monte Carlo simulated data with various levels of contamination from incongruent data. The study also evaluates the congruence of data stemming from single decays and from multi-step decay chains. A multi-layer perceptron was trained on features extracted from simulated one-step decay chain sets. The Schmidt test performs well for larger decay sets and when the half-life of the contaminating species is longer than that of the original species by a factor of 5 or 10. However, the test performs poorly at low counting statistics, where few recorded decay times are available. The newly proposed machine learning model outperforms the Schmidt test in certain high-statistics scenarios, but likewise fails when few decay times are available. Its performance is also poor when the half-life of the contaminant is shorter than the original half-life. The learning behaviour of the model is analysed, showing significant contributions from higher statistical moments during training. Future work includes incorporating chain correlations across multiple steps and alpha-decay energies, as well as exploring alternative machine learning models.
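The core of the Schmidt test can be sketched as follows: for n decay times drawn from a single exponential decay, the standard deviation of the logarithmic decay times is expected to be pi/sqrt(6) ≈ 1.28 independently of the half-life, and Monte Carlo simulation supplies n-dependent acceptance limits; contamination by a second species broadens the log-time distribution and pushes the statistic above the upper limit. The following Python sketch illustrates this; the set size, contamination fraction, half-life factor, and 90% confidence level are illustrative assumptions, not the parameters used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)

def sigma_theta(times):
    """Schmidt-test statistic: sample std. dev. of ln(t)."""
    return np.log(times).std(ddof=1)

def schmidt_limits(n, n_sets=10000, cl=0.90):
    """Monte Carlo acceptance limits of sigma_theta for n decay times
    from ONE exponential species (scale-invariant, so the half-life
    used in the simulation is irrelevant)."""
    sims = np.array([sigma_theta(rng.exponential(1.0, n))
                     for _ in range(n_sets)])
    lo, hi = np.quantile(sims, [(1 - cl) / 2, 1 - (1 - cl) / 2])
    return lo, hi

n = 50
t_half = 1.0
tau = t_half / np.log(2)  # mean lifetime from half-life

# Congruent set: all n decay times from a single species
pure = rng.exponential(tau, n)

# Contaminated set: 30% of events from a species with a
# 10x longer half-life (both numbers are illustrative)
k = int(0.3 * n)
mix = np.concatenate([rng.exponential(tau, n - k),
                      rng.exponential(10 * tau, k)])

lo_lim, hi_lim = schmidt_limits(n)
for label, data in [("pure", pure), ("contaminated", mix)]:
    s = sigma_theta(data)
    verdict = "congruent" if lo_lim <= s <= hi_lim else "incongruent"
    print(f"{label}: sigma_theta = {s:.3f} "
          f"(90% limits [{lo_lim:.3f}, {hi_lim:.3f}] -> {verdict})")
```

Because sigma_theta depends only on the shape of the log-time distribution, no knowledge of the absolute half-life is needed, which is what makes the test useful for sparse superheavy-nuclei data.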
Popular Abstract
Superheavy nuclei are, by definition, those with more than 103 protons. They are all man-made, created through the unlikely fusion of two lighter nuclei. After fusion, these new superheavy nuclei are highly radioactive and decay in a characteristic sequence, or 'chain', of radioactive decays. This chain serves as a fingerprint for the newly created nucleus. Experimental decay chain data consists of energies and associated decay times, which can be used to confirm the production of a nucleus. However, experimental data is sparse due to the extremely low probability of fusion. This, together with experimental limitations, means that uniquely assigning data to a specific nuclear origin is difficult. This thesis explores the so-called "Schmidt test", which is used to determine whether a set of decay times likely originates from one nucleus or not. A machine learning algorithm is also developed with the same goal as the Schmidt test, in order to avoid bias in the data interpretation. Simulations of decay chain data were performed to evaluate the Schmidt test and the algorithm. In certain cases, the machine learning model outperformed the Schmidt test. However, the algorithm fails to correctly predict congruence when the decay times of the contaminant are not sufficiently different from the other decay times. Both the Schmidt test and the machine learning algorithm are rather inconclusive in situations where very few decay times are available. This work gives some insight into the statistical properties of sets of decay times, but more work is needed to robustly determine which properties are most meaningful in identifying the originating nuclei of the decay data.
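The machine-learning approach described above can be sketched in a few lines: each simulated set of decay times is reduced to a small feature vector, and a multi-layer perceptron is trained to separate congruent from contaminated sets. In the sketch below the feature set (the first four moments of the log decay times), the network size, and the simulation parameters are assumptions for illustration, not the thesis's actual choices; the abstract only states that higher statistical moments contribute significantly.

```python
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def simulate_set(n, contaminated, factor=10.0, frac=0.3):
    """One set of n log decay times; optionally a fraction `frac`
    of events from a contaminant with `factor` x longer half-life."""
    k = int(frac * n) if contaminated else 0
    t = np.concatenate([rng.exponential(1.0, n - k),
                        rng.exponential(factor, k)])
    return np.log(t)

def features(theta):
    """Hypothetical feature vector: first four moments of ln(t)."""
    return [theta.mean(), theta.std(ddof=1), skew(theta), kurtosis(theta)]

# Build a labelled training sample of simulated decay-time sets
X, y = [], []
for _ in range(2000):
    label = int(rng.integers(2))  # 0 = congruent, 1 = contaminated
    X.append(features(simulate_set(50, bool(label))))
    y.append(label)
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=1000,
                    random_state=0)
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"held-out accuracy: {acc:.2f}")
```

With a 10x half-life contrast and 50 decay times per set this toy classifier separates the classes easily; as the abstracts note, performance degrades sharply when the contaminant half-life is close to the original one or when very few decay times are available.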
author: Nelissen, Pim LU
supervisor:
organization:
course: FYSK04 20241
year: 2024
type: M2 - Bachelor Degree
subject:
keywords: nuclear physics, superheavy nuclei, superheavy elements, alpha decay, decay chains, monte carlo simulations, schmidt test, machine learning, artificial neural networks, multi-layer perceptron
language: English
id: 9168893
date added to LUP: 2024-06-27 08:02:41
date last changed: 2024-06-27 08:02:41
@misc{9168893,
  abstract     = {{Experimental nuclear structure data from superheavy nuclei synthesis experiments often consists of correlated alpha-decay chains. In the absence of neutron detectors - which would fully characterize the exit channel after the fusion-evaporation reaction - the sequence of decay energies and half-lives is the 'fingerprint' of the exit channel itself. Experimental data in this region is sparse, and its interpretation can be liable to error or confirmation bias. The so-called "Schmidt test" is a method for determining the congruence of correlation times for a set of measurements of one decay step, but its outcome is not always entirely conclusive. This study evaluates the congruence derived from the Schmidt test using Monte Carlo simulated data with various levels of contamination from incongruent data. The study also evaluates the congruence of data stemming from single decays and from multi-step decay chains. A multi-layer perceptron was trained on features extracted from simulated one-step decay chain sets. The Schmidt test performs well for larger decay sets and when the half-life of the contaminating species is longer than that of the original species by a factor of 5 or 10. However, the test performs poorly at low counting statistics, where few recorded decay times are available. The newly proposed machine learning model outperforms the Schmidt test in certain high-statistics scenarios, but likewise fails when few decay times are available. Its performance is also poor when the half-life of the contaminant is shorter than the original half-life. The learning behaviour of the model is analysed, showing significant contributions from higher statistical moments during training. Future work includes incorporating chain correlations across multiple steps and alpha-decay energies, as well as exploring alternative machine learning models.}},
  author       = {{Nelissen, Pim}},
  language     = {{eng}},
  note         = {{Student Paper}},
  title        = {{Scrutinizing the Schmidt Test and Exploring the Use of Machine Learning for Statistical Assessment of Radioactive Decay Chains Stemming from Superheavy Nuclei Research}},
  year         = {{2024}},
}