
LUP Student Papers

LUND UNIVERSITY LIBRARIES

Latency Prediction in 5G Networks by using Machine Learning

Elgcrona, Erica and Mete, Evrim (2023)
Department of Automatic Control
Abstract
This thesis investigates the prediction of latency in a 5G network using deep learning techniques. The training set contained network parameters along with the measured latency, collected in a 5G lab environment during four different test scenarios. Four machine learning models were trained: a Feedforward Neural Network (FNN), a Convolutional Neural Network (CNN), a Recurrent Neural Network (RNN), and a Long Short-Term Memory network (LSTM). After the initial implementation, each model was refined using Bayesian optimization for Hyperparameter Optimization (HPO). In addition, both the standard mean squared error (MSE) and a custom asymmetric version of the mean squared error (AMSE) were used as loss functions.

Overall, all models were able to capture the latency behavior, although the FNN model was reactive rather than predictive and therefore not suitable for this task. Before Bayesian optimization, the models excluding the FNN had R² scores of 0.88–0.95; after Bayesian optimization, the scores increased to 0.96–0.98 for the first data set. According to the literature, custom loss functions can make the models even more suitable for practical use by penalizing underpredictions more severely than overpredictions.
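The record does not reproduce the thesis's actual AMSE formulation. As a minimal sketch only, one common way to build such an asymmetric loss is to weight squared errors more heavily when the model underpredicts (predicted latency below the true latency); the function name `amse` and the `under_weight` parameter below are illustrative assumptions, not the thesis's definitions:

```python
import numpy as np

def amse(y_true, y_pred, under_weight=2.0):
    """Asymmetric MSE sketch (hypothetical, not the thesis's formula).

    Underpredictions (y_pred < y_true) are weighted by `under_weight`,
    overpredictions by 1.0, so missing high latency costs more.
    """
    err = y_true - y_pred                      # positive err = underprediction
    w = np.where(err > 0, under_weight, 1.0)   # heavier penalty when under
    return np.mean(w * err ** 2)
```

With `under_weight=2.0`, underpredicting a 2 ms latency by 1 ms costs twice as much as overpredicting it by 1 ms, which matches the qualitative behavior the abstract describes for practical latency guarantees.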
author
Elgcrona, Erica and Mete, Evrim
organization
Department of Automatic Control
year
2023
type
H3 - Professional qualifications (4 Years - )
report number
TFRT-6211
other publication id
0280-5316
language
English
id
9136969
date added to LUP
2023-09-12 14:06:03
date last changed
2023-09-12 14:06:03
@misc{9136969,
  abstract     = {{This thesis investigates the prediction of latency in a 5G network using deep learning techniques. The training set contained network parameters along with the measured latency, collected in a 5G lab environment during four different test scenarios. Four machine learning models were trained: a Feedforward Neural Network (FNN), a Convolutional Neural Network (CNN), a Recurrent Neural Network (RNN), and a Long Short-Term Memory network (LSTM). After the initial implementation, each model was refined using Bayesian optimization for Hyperparameter Optimization (HPO). In addition, both the standard mean squared error (MSE) and a custom asymmetric version of the mean squared error (AMSE) were used as loss functions.

Overall, all models were able to capture the latency behavior, although the FNN model was reactive rather than predictive and therefore not suitable for this task. Before Bayesian optimization, the models excluding the FNN had R² scores of 0.88–0.95; after Bayesian optimization, the scores increased to 0.96–0.98 for the first data set. According to the literature, custom loss functions can make the models even more suitable for practical use by penalizing underpredictions more severely than overpredictions.}},
  author       = {{Elgcrona, Erica and Mete, Evrim}},
  language     = {{eng}},
  note         = {{Student Paper}},
  title        = {{Latency Prediction in 5G Networks by using Machine Learning}},
  year         = {{2023}},
}