Lund University Publications

Augmentation Strategies for Self-Supervised Representation Learning from Electrocardiograms

Andersson, Matilda; Nilsson, Mattias; Flood, Gabrielle and Åström, Kalle (2023) 31st European Signal Processing Conference, EUSIPCO 2023. In European Signal Processing Conference, pp. 1075-1079
Abstract

In this paper, we investigate the effects of different augmentation strategies in self-supervised representation learning from electrocardiograms. Our study examines the impact of random resized crop and time out on downstream performance. We also consider the importance of the signal length. Furthermore, instead of using two augmented copies of the sample as a positive pair, we suggest augmenting only one. The second signal is kept as the original signal. These different augmentation strategies are investigated in the context of pre-training and fine-tuning, following the different self-supervised learning frameworks BYOL, SimCLR, and VICReg. We formulate the downstream task as a multi-label classification task using a public dataset containing ECG recordings and annotations. In our experiments, we demonstrate that self-supervised learning can consistently outperform classical supervised learning when configured correctly. These findings are of particular importance in the medical domain, as the medical labeling process is particularly expensive, and clinical ground truth is often difficult to define. We are hopeful that our findings will be a catalyst for further research into augmentation strategies in self-supervised learning to improve performance in the detection of cardiovascular disease.

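As a concrete illustration of the augmentations discussed in the abstract, the sketch below shows one possible NumPy implementation of a 1-D random resized crop, "time out" masking, and the asymmetric positive-pair construction in which only one view is augmented while the second is kept as the original signal. This is not the authors' code; the function names, crop scale range, mask width, and signal dimensions are illustrative assumptions.

import numpy as np

def random_resized_crop(signal, scale=(0.5, 1.0), rng=None):
    """Crop a random contiguous segment and resample it back to the original
    length by linear interpolation (1-D analogue of random resized crop)."""
    rng = rng or np.random.default_rng()
    x = np.atleast_2d(signal)                      # (leads, samples)
    n = x.shape[-1]
    crop_len = max(1, int(n * rng.uniform(*scale)))
    start = int(rng.integers(0, n - crop_len + 1))
    segment = x[:, start:start + crop_len]
    old_t = np.linspace(0.0, 1.0, crop_len)
    new_t = np.linspace(0.0, 1.0, n)
    return np.stack([np.interp(new_t, old_t, lead) for lead in segment])

def time_out(signal, max_frac=0.2, rng=None):
    """Zero out a random contiguous window of the signal ('time out')."""
    rng = rng or np.random.default_rng()
    out = np.array(signal, copy=True)
    n = out.shape[-1]
    width = int(rng.integers(1, max(2, int(n * max_frac) + 1)))
    start = int(rng.integers(0, n - width + 1))
    out[..., start:start + width] = 0.0
    return out

def make_positive_pair(signal, augment, symmetric=False):
    """Build a positive pair for SSL pre-training (e.g. SimCLR/BYOL/VICReg style).

    symmetric=True  -> two independently augmented views (the usual setup).
    symmetric=False -> one augmented view paired with the unmodified original,
                       the asymmetric strategy described in the abstract.
    """
    view_a = augment(signal)
    view_b = augment(signal) if symmetric else signal
    return view_a, view_b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ecg = rng.standard_normal((12, 5000))          # hypothetical 12-lead, 5000-sample record
    augment = lambda s: time_out(random_resized_crop(s, rng=rng), rng=rng)
    view_a, view_b = make_positive_pair(ecg, augment, symmetric=False)
    print(view_a.shape, view_b.shape)              # (12, 5000) (12, 5000)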
author
Andersson, Matilda; Nilsson, Mattias; Flood, Gabrielle and Åström, Kalle
organization
publishing date
2023
type
Chapter in Book/Report/Conference proceeding
publication status
published
subject
keywords
augmentation, ECG, electrocardiogram, pre-processing, representation learning, self-supervised
host publication
31st European Signal Processing Conference, EUSIPCO 2023 - Proceedings
series title
European Signal Processing Conference
pages
5 pages
publisher
European Signal Processing Conference, EUSIPCO
conference name
31st European Signal Processing Conference, EUSIPCO 2023
conference location
Helsinki, Finland
conference dates
2023-09-04 - 2023-09-08
external identifiers
  • scopus:85178373493
ISSN
2219-5491
ISBN
9789464593600
DOI
10.23919/EUSIPCO58844.2023.10289960
language
English
LU publication?
yes
additional info
Publisher Copyright: © 2023 European Signal Processing Conference, EUSIPCO. All rights reserved.
id
2c07f416-037f-41cd-95a1-e4e6dc105fbf
date added to LUP
2024-01-05 09:24:05
date last changed
2024-01-06 02:54:03
@inproceedings{2c07f416-037f-41cd-95a1-e4e6dc105fbf,
  abstract     = {{<p>In this paper, we investigate the effects of different augmentation strategies in self-supervised representation learning from electrocardiograms. Our study examines the impact of random resized crop and time out on downstream performance. We also consider the importance of the signal length. Furthermore, instead of using two augmented copies of the sample as a positive pair, we suggest augmenting only one. The second signal is kept as the original signal. These different augmentation strategies are investigated in the context of pre-training and fine-tuning, following the different self-supervised learning frameworks BYOL, SimCLR, and VICReg. We formulate the downstream task as a multi-label classification task using a public dataset containing ECG recordings and annotations. In our experiments, we demonstrate that self-supervised learning can consistently outperform classical supervised learning when configured correctly. These findings are of particular importance in the medical domain, as the medical labeling process is particularly expensive, and clinical ground truth is often difficult to define. We are hopeful that our findings will be a catalyst for further research into augmentation strategies in self-supervised learning to improve performance in the detection of cardiovascular disease.</p>}},
  author       = {{Andersson, Matilda and Nilsson, Mattias and Flood, Gabrielle and Åström, Kalle}},
  booktitle    = {{31st European Signal Processing Conference, EUSIPCO 2023 - Proceedings}},
  isbn         = {{9789464593600}},
  issn         = {{2219-5491}},
  keywords     = {{augmentation; ECG; electrocardiogram; pre-processing; representation learning; self-supervised}},
  language     = {{eng}},
  pages        = {{1075--1079}},
  publisher    = {{European Signal Processing Conference, EUSIPCO}},
  series       = {{European Signal Processing Conference}},
  title        = {{Augmentation Strategies for Self-Supervised Representation Learning from Electrocardiograms}},
  url          = {{http://dx.doi.org/10.23919/EUSIPCO58844.2023.10289960}},
  doi          = {{10.23919/EUSIPCO58844.2023.10289960}},
  year         = {{2023}},
}