
LUP Student Papers

LUND UNIVERSITY LIBRARIES

Evaluation of Ferroelectric Tunnel Junction memristor for in-memory computation in real world use cases

Guerin, Alec LU and Papadopoulos, Christos (2023) EITM02 20231
Department of Electrical and Information Technology
Abstract
Machine learning algorithms are experiencing unprecedented attention, but their inherent computational complexity leads to high energy consumption. However, a paradigm shift in computing methods has the potential to address the issue. This shift could be a move towards analog in-memory computing, a method which uses Ohm’s and Kirchhoff’s Laws, and carries out the processing directly where data resides. This approach is being propelled by the development of memristors, versatile memory devices that are programmable and energy efficient.

This thesis explores the capabilities of a newly engineered memristor device. This device, based on Ferroelectric Tunnel Junctions (FTJ), was developed by Lund University and presents a promising technology for analog in-memory computing. In this thesis, a mathematical model of the device was created and studied in a simulated setting. This provided the foundation for a sensitivity analysis of chosen neural network algorithms operating on hardware featuring FTJ devices. A variety of techniques were deployed to mitigate the hardware imperfections, such as hardware-aware training, which enhanced the resilience of the algorithms.
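To make the methodology concrete, the sketch below shows what such a sensitivity analysis can look like in practice: ideal weights are mapped onto a finite set of programmable conductance levels, perturbed with programming noise, and the resulting output drift is measured. The number of states and the noise level here are illustrative assumptions, not values taken from the thesis or from measured FTJ devices.

    # Minimal sketch of a weight-perturbation sensitivity analysis (illustrative only;
    # the device parameters below are assumptions, not the thesis's measured FTJ model).
    import numpy as np

    def quantize_to_states(w, n_states=16):
        """Map ideal weights onto a finite set of programmable conductance levels."""
        levels = np.linspace(w.min(), w.max(), n_states)
        idx = np.abs(w[..., None] - levels).argmin(axis=-1)
        return levels[idx]

    def perturb(w, rel_sigma=0.05, rng=None):
        """Add device-to-device programming variation as multiplicative Gaussian noise."""
        rng = rng or np.random.default_rng(0)
        return w * (1.0 + rel_sigma * rng.standard_normal(w.shape))

    # Example: how much does a layer's output drift under non-ideal weights?
    rng = np.random.default_rng(42)
    w_ideal = rng.standard_normal((64, 128))
    x = rng.standard_normal(128)
    w_hw = perturb(quantize_to_states(w_ideal), rel_sigma=0.05, rng=rng)
    drift = np.linalg.norm((w_hw - w_ideal) @ x) / np.linalg.norm(w_ideal @ x)
    print(f"relative output error: {drift:.3f}")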

The outcomes from this investigative approach are promising, particularly regarding the inference processes in neural networks. Our research demonstrated the effectiveness of all applied mitigation techniques. The standout discovery was the robustness of the Transformer algorithm compared to the convolutional one: it proved capable of withstanding hardware imperfections while producing results on par with those of the digital model.
Popular Abstract
Machine learning algorithms have become an integral part of our daily lives, powering applications and systems across various industries. However, their widespread use comes with a significant drawback - high power consumption. As the demand for machine learning continues to grow, there is an urgent need for innovative computing approaches that can tackle this power challenge. Enter Analog In-Memory Computing (AIMC), a promising solution that could revolutionize the way we process information. When combined with a type of memory cell called a Ferroelectric Tunnel Junction (FTJ) memristor, AIMC can perform the calculations needed for machine learning with much less power.

In this study, we evaluated the effectiveness of FTJ memristors, engineered at NanoLund, in enabling AIMC. Our findings were encouraging. Interestingly, FTJ memristors demonstrated impressive performance when interfaced with the same kind of transformer neural network that powers ChatGPT, a state-of-the-art algorithm. But our work did not stop there. We also explored the potential of AIMC for training neural networks and designed several techniques to enhance the precision of our networks. Finding the proper way to map an application onto the AIMC hardware is quite a challenge. Furthermore, we analyzed the sensitivity of the algorithms and proposed approaches that could make them functional. Through this comprehensive understanding, we suggest the implementation of AIMC for transformer models and pinpoint the key attributes of a successful design, known as mitigation techniques.
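One common way to realize hardware-aware training, sketched below, is to let each layer see a noisy copy of its weights during the forward pass, so that training settles on weights that tolerate device variation. This is a generic PyTorch illustration under assumed noise statistics; it is not the thesis's exact setup, which relied on the AIHWKIT simulation toolkit listed among the keywords.

    # Hedged sketch of hardware-aware training via noise injection during the forward pass.
    # The noise model and its magnitude are assumptions, not measured FTJ characteristics.
    import torch
    import torch.nn as nn

    class NoisyLinear(nn.Linear):
        """Linear layer that uses a perturbed copy of its weights while training,
        mimicking conductance programming noise of an analog crossbar."""
        def __init__(self, in_features, out_features, rel_sigma=0.05):
            super().__init__(in_features, out_features)
            self.rel_sigma = rel_sigma

        def forward(self, x):
            w = self.weight
            if self.training:
                w = w * (1.0 + self.rel_sigma * torch.randn_like(w))
            return nn.functional.linear(x, w, self.bias)

    # Drop-in replacement for nn.Linear inside a model, then train as usual:
    model = nn.Sequential(NoisyLinear(128, 64), nn.ReLU(), NoisyLinear(64, 10))
    x = torch.randn(32, 128)
    loss = model(x).pow(2).mean()
    loss.backward()  # gradients flow through the noisy forward pass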

The magic of AIMC lies in its simplicity. It's based on Ohm’s and Kirchhoff’s laws for multiplication and addition. These combined operations, known as multiply-accumulate (MAC) operations, are used extensively in the matrix operations required by machine learning algorithms. While computing them in the digital domain requires extensive hardware and power, in the analog domain simple resistors can do the trick.
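As a toy illustration of that idea: if each weight is stored as a conductance G and each input is applied as a voltage V, Ohm's law gives a per-device current G·V and Kirchhoff's current law sums those currents along each output line, so a whole matrix-vector product falls out of a single read. The conductance range and the differential mapping of signed weights below are assumptions chosen for the example.

    # Toy illustration of the MAC described above: Ohm's law (I = G*V) does the
    # multiplication and Kirchhoff's current law does the addition along each output line.
    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.uniform(-1.0, 1.0, size=(4, 3))   # weight matrix to implement
    v = rng.uniform(0.0, 0.2, size=3)         # input vector encoded as voltages (V)

    # Signed weights are mapped onto a differential pair of positive conductances.
    g_max = 1e-6  # assumed maximum device conductance, in siemens
    G_pos = np.clip(W, 0, None) * g_max
    G_neg = np.clip(-W, 0, None) * g_max

    # Each output line collects the currents of its row of devices (Kirchhoff),
    # each device contributing G * V (Ohm).
    i_out = G_pos @ v - G_neg @ v

    # The same result, computed digitally for comparison:
    print(np.allclose(i_out, g_max * (W @ v)))   # True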

So, why should you care? Because this revolution in machine learning could transform everything from artificial intelligence to data processing and beyond. It could make our technologies smarter, our data more meaningful, and our lives better.
author: Guerin, Alec LU and Papadopoulos, Christos
supervisor:
organization:
course: EITM02 20231
year: 2023
type: H2 - Master's Degree (Two Years)
subject:
keywords: FTJ, Ferroelectric Tunneling Junction, Analog in-memory computing, AIMC, Memristor, A.I., AIHWKIT, Semantic segmentation, Natural Language Processing, NLP, Neuromorphic Computing, Matrix Vector Multiplication
report number: LU/LTH-EIT 2023-943
language: English
id: 9129374
date added to LUP: 2023-08-29 10:42:27
date last changed: 2023-08-29 10:42:27
@misc{9129374,
  abstract     = {{Machine learning algorithms are experiencing unprecedented attention, but their inherent computational complexity leads to high energy consumption. However, a paradigm shift in computing methods has the potential to address the issue. This shift could be a move towards analog in-memory computing, a method which uses Ohm’s and Kirchhoff’s Laws, and carries out the processing directly where data resides. This approach is being propelled by the development of memristors, versatile memory devices that are programmable and energy efficient.

This thesis explores the capabilities of a newly engineered memristor device. This device, based on Ferroelectric Tunnel Junctions (FTJ), was developed by Lund University and presents a promising technology for analog in-memory computing. In this thesis, a mathematical model of the device was created and studied in a simulated setting. This provided the foundation for a sensitivity analysis of chosen neural network algorithms operating on hardware featuring FTJ devices. A variety of techniques were deployed to mitigate the hardware imperfections, such as hardware-aware training, which enhanced the resilience of the algorithms.

The outcomes from this investigative approach are promising, particularly regarding the inference processes in neural networks. Our research demonstrated the effectiveness of all applied mitigation techniques. The standout discovery was the robustness of the Transformer algorithm compared to the convolutional one: it proved capable of withstanding hardware imperfections while producing results on par with those of the digital model.}},
  author       = {{Guerin, Alec and Papadopoulos, Christos}},
  language     = {{eng}},
  note         = {{Student Paper}},
  title        = {{Evaluation of Ferroelectric Tunnel Junction memristor for in-memory computation in real world use cases}},
  year         = {{2023}},
}