LUP Student Papers, Lund University Libraries

Near-Memory Computing Compiler for Neural Network Architectures

Allfjord, Alex LU (2022) EITM01 20222
Department of Electrical and Information Technology
Abstract
With the increasing popularity of machine learning, both higher-performance and more energy-efficient circuits are needed to meet the demands of growing workloads. This master's thesis focuses on convolutional neural networks and implements a compiler that generates an accelerator architecture that can be tailored to performance needs. The implemented architecture uses near-memory computing to achieve higher performance and energy efficiency. This report gives an overview of the implemented architecture; area and performance results for an example use case are presented, and ideas for future improvements are listed.
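
To give a sense of what "tailored to performance needs" can mean for a generator-style compiler, the Python sketch below is a minimal, hypothetical illustration: a small set of architecture parameters (number of processing elements, near-memory buffer size, operand width) and a rough cycle estimate for a convolution layer derived from them. All names and the cost model are assumptions made for illustration only; they are not taken from the thesis.

from dataclasses import dataclass
from math import ceil

# Hypothetical architecture parameters that a generator-style compiler might expose.
# These names and the cost model are illustrative assumptions, not the thesis's design.
@dataclass
class AcceleratorConfig:
    num_pes: int          # parallel multiply-accumulate units
    buffer_kib: int       # near-memory buffer size per PE cluster
    data_width_bits: int  # operand width (e.g. 8-bit quantized weights)

def estimate_conv_cycles(cfg: AcceleratorConfig,
                         out_h: int, out_w: int, out_c: int,
                         k: int, in_c: int) -> int:
    """Very rough cycle estimate for one convolution layer:
    total MACs divided by the number of parallel PEs (assumes full utilization)."""
    macs = out_h * out_w * out_c * k * k * in_c
    return ceil(macs / cfg.num_pes)

if __name__ == "__main__":
    # Two candidate configurations, trading area for throughput.
    small = AcceleratorConfig(num_pes=16, buffer_kib=64, data_width_bits=8)
    large = AcceleratorConfig(num_pes=256, buffer_kib=512, data_width_bits=8)
    for cfg in (small, large):
        cycles = estimate_conv_cycles(cfg, out_h=56, out_w=56, out_c=64, k=3, in_c=64)
        print(f"{cfg.num_pes:4d} PEs -> ~{cycles} cycles for a 3x3, 64-to-64 channel layer")

Comparing two candidate configurations in this way mirrors the kind of area-versus-performance trade-off the abstract alludes to: a larger PE array lowers latency at the cost of silicon area.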
author: Allfjord, Alex LU
supervisor:
organization:
course: EITM01 20222
year: 2022
type: H2 - Master's Degree (Two Years)
subject:
report number: LU/LTH-EIT 2023-911
language: English
id: 9111229
date added to LUP: 2023-02-27 11:09:17
date last changed: 2023-02-27 11:33:17
@misc{9111229,
  abstract     = {{With the increasing popularity of machine learning, both higher-performance and more energy-efficient circuits are needed to meet the demands of growing workloads. This master's thesis focuses on convolutional neural networks and implements a compiler that generates an accelerator architecture that can be tailored to performance needs. The implemented architecture uses near-memory computing to achieve higher performance and energy efficiency. This report gives an overview of the implemented architecture; area and performance results for an example use case are presented, and ideas for future improvements are listed.}},
  author       = {{Allfjord, Alex}},
  language     = {{eng}},
  note         = {{Student Paper}},
  title        = {{Near-Memory Computing Compiler for Neural Network Architectures}},
  year         = {{2022}},
}