Forecasting the USD/SEK exchange rate using deep neural networks
(2019) In LUNFMS-4037-2019, MASK11 20191, Mathematical Statistics
- Abstract
- This thesis is about predicting the average ten-minute closing bid price of the USD/SEK exchange rate by applying deep learning methods. First, the time-lag method is applied to the vanilla Feedforward Neural Network (FNN) to undertake one-step prediction. Secondly, three univariate Long Short-Term Memory (LSTM) models are used to undertake one-step and multi-step prediction. Each network is theoretically described and motivated.
The results indicate that both the FNN and the LSTM are applicable to time series prediction and that the LSTM outperforms the FNN. Furthermore, the results suggest the LSTM can outperform the naïve predictor by a small margin, though this remains uncertain. It is concluded that detecting structure in the exchange rate may require much more computing power, in order to learn from significantly longer input time series.
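The time-lag method mentioned above can be sketched as turning a univariate series into (lagged inputs, next value) pairs that a feedforward network can train on. The series and lag count below are illustrative, not the thesis's data:

```python
# A minimal sketch of the time-lag method: each training pair pairs the
# last n_lags observations with the value that immediately follows them.

def make_lagged(series, n_lags):
    """Return supervised pairs: inputs = last n_lags values, target = next."""
    pairs = []
    for t in range(n_lags, len(series)):
        pairs.append((series[t - n_lags:t], series[t]))
    return pairs

series = [1, 2, 3, 4, 5, 6]
print(make_lagged(series, 3))
# → [([1, 2, 3], 4), ([2, 3, 4], 5), ([3, 4, 5], 6)]
```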
Finally, some economic theory that could serve as potential inputs to improve the results is reviewed and presented. A brief discussion of overlooked biases is also included.
- Popular Abstract
- In the last couple of years, machine learning has become a hot topic in both the academic and the corporate world. Unprecedented amounts of data and new technological possibilities have enabled its application to difficult problems. One of the branches within machine learning is neural networks.
The term neural network comes from the similarity drawn to neurons in the biological brain. After receiving some input, the neuron is activated and the processed input becomes an output. Over time, the neuron is able to adapt and generalize.
Although neural networks are modeled on real neurons, they are mathematically motivated. The purpose of the most basic neural network, the Feedforward Neural Network
(FNN), is to approximate some function. The parameters, referred to as weights, are to be optimized. Information is first sent through the network to produce an output, a process referred to as forward propagation. Secondly, the error is sent backwards through the network using the backpropagation algorithm. Finally, the weights are updated, a step known as gradient descent. Done iteratively, this is what enables the network to learn.
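The forward-propagation, backpropagation, and gradient-descent loop described above can be sketched on a toy single-neuron model. This is an illustrative setup (made-up data, a linear unit, hand-derived gradients), not the thesis's actual network:

```python
# A minimal sketch of iterative gradient descent on one linear neuron:
# forward propagation computes y_hat = w*x + b, backpropagation gives the
# gradient of the squared error, and a small step updates the weights.

def train(data, lr=0.05, epochs=500):
    """Fit y ~ w*x + b by repeated forward/backward passes and updates."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in data:
            y_hat = w * x + b        # forward propagation
            err = y_hat - y          # prediction error
            grad_w = 2 * err * x     # backpropagated gradients of err**2
            grad_b = 2 * err
            w -= lr * grad_w         # gradient descent update
            b -= lr * grad_b
    return w, b

# Noiseless samples of y = 2x + 1; the loop recovers w ≈ 2, b ≈ 1.
data = [(0, 1), (1, 3), (2, 5), (3, 7)]
w, b = train(data)
print(round(w, 2), round(b, 2))
```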
The FNN, however, has difficulty learning temporal sequences, and in 1997 a new type of neural network was introduced by Hochreiter and Schmidhuber. The new neural network
was termed Long Short-Term Memory (LSTM) and can detect patterns in long sequences of data where observations are not independent.
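The mechanism that lets the LSTM retain information across long sequences can be sketched as one cell step with scalar state. The weights below are illustrative placeholders, not values learned in the thesis:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev, w):
    """One LSTM time step: gates decide what to forget, write, and emit."""
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])   # forget gate
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])   # input gate
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])   # output gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h_prev + w["g"][2]) # candidate
    c = f * c_prev + i * g   # cell state: kept memory plus new information
    h = o * math.tanh(c)     # hidden state / output
    return h, c

# Placeholder weights (input, recurrent, bias) for each gate.
w = {"f": (0.5, 0.1, 0.0), "i": (0.5, 0.1, 0.0),
     "o": (0.5, 0.1, 0.0), "g": (0.5, 0.1, 0.0)}
h, c = 0.0, 0.0
for x in [0.2, 0.4, 0.1]:    # a short input sequence
    h, c = lstm_step(x, h, c, w)
print(h, c)
```

The forget gate `f` is what distinguishes the LSTM from the FNN: it allows the cell state `c` to carry information forward across many time steps instead of treating each input independently.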
This thesis applied the FNN and the LSTM to forecasting the USD/SEK exchange rate. The data consisted of three years of ten-minute averages of the USD/SEK closing bid.
The results show that both the FNN and the LSTM can be applied to forecasting time series. But when benchmarked against the naïve predictor (the prediction is set to
whatever can currently be observed), both networks struggle to outperform it. It is concluded, first, that the LSTM learns to become the naïve predictor with some modification, and secondly, that the LSTM does not necessarily benefit from more historical data as input.
This suggests the data is random and lacks structure. However, it cannot be ruled out that the thesis failed to mobilize enough computing power and that more data is required to detect patterns.
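The naïve predictor benchmark described above can be sketched as follows; the exchange-rate values are made up for illustration, not thesis data:

```python
# A minimal sketch of the naïve predictor: the one-step-ahead forecast
# is simply the last observed value, scored here by mean squared error.

def naive_forecast(series):
    """One-step-ahead naïve predictions: y_hat[t] = y[t-1]."""
    return series[:-1]

def mse(pred, actual):
    return sum((p - a) ** 2 for p, a in zip(pred, actual)) / len(actual)

rates = [9.31, 9.33, 9.32, 9.35, 9.34]   # hypothetical USD/SEK levels
preds = naive_forecast(rates)            # [9.31, 9.33, 9.32, 9.35]
actual = rates[1:]
print(round(mse(preds, actual), 6))
```

Any model must beat this MSE to be useful; a network whose errors match the naïve predictor's has, in effect, only learned to repeat the last observation.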
Please use this url to cite or link to this publication:
http://lup.lub.lu.se/student-papers/record/8992603
- author
- Hamfelt, Thomas
- supervisor
- organization
- course
- MASK11 20191
- year
- 2019
- type
- M2 - Bachelor Degree
- subject
- keywords
- Neural networks, Deep Learning, Foreign Exchange, Finance
- publication/series
- LUNFMS-4037-2019
- report number
- 2019:K22
- ISSN
- 1654-6229
- language
- English
- id
- 8992603
- date added to LUP
- 2020-03-09 10:47:32
- date last changed
- 2020-03-09 10:47:32
@misc{8992603,
  abstract = {{This thesis is about predicting the average ten-minute closing bid price of the USD/SEK exchange rate by applying deep learning methods. First, the time-lag method is applied to the vanilla Feedforward Neural Network (FNN) to undertake one-step prediction. Secondly, three univariate Long Short-Term Memory (LSTM) models are used to undertake one-step and multi-step prediction. Each network is theoretically described and motivated. The results indicate that both the FNN and the LSTM are applicable to time series prediction and that the LSTM outperforms the FNN. Furthermore, the results suggest the LSTM can outperform the naïve predictor by a small margin, though this remains uncertain. It is concluded that detecting structure in the exchange rate may require much more computing power, in order to learn from significantly longer input time series. Finally, some economic theory that could serve as potential inputs to improve the results is reviewed and presented. A brief discussion of overlooked biases is also included.}},
  author   = {{Hamfelt, Thomas}},
  issn     = {{1654-6229}},
  language = {{eng}},
  note     = {{Student Paper}},
  series   = {{LUNFMS-4037-2019}},
  title    = {{Forecasting the USD/SEK exchange rate using deep neural networks}},
  year     = {{2019}},
}