Lund University Publications

Advancing non-invasive melanoma diagnostics with deep learning and multispectral photoacoustic imaging

Merdasa, Aboma; Fracchia, Alice; Stridh, Magne; Hult, Jenny; Andersson, Emil; Edén, Patrik; Olariu, Victor and Malmsjö, Malin (2025) In Photoacoustics 45.
Abstract

The incidence of melanoma is rising and will require more efficient diagnostic procedures to meet a growing demand. Excisional biopsy followed by histopathology is still the standard, and it often requires multiple surgical incisions with increasing margins due to inaccurate visual assessment of where the melanoma borders on healthy tissue. This challenge stems, in part, from the inability to reliably delineate the melanoma without visually inspecting chemically stained histopathological cross-sections. Spectroscopic imaging has shown promise for non-invasively characterizing the molecular composition of tissue and thereby distinguishing melanoma from healthy tissue based on spectral features. In this work, we describe a computational framework applied to multispectral photoacoustic (PA) imaging data of melanoma in humans and demonstrate how the borders of the tumor can be automatically determined without human input. The framework combines K-means clustering, for an unbiased selection of training data, a one-dimensional convolutional neural network applied to PA spectra to classify pixels as either healthy or diseased, and an active contour algorithm to finally delineate the melanoma in 3D. The work stands to impact clinical practice, as it can provide both pre-surgical and perioperative guidance to ensure complete tumor removal with minimal surgical incisions.
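The pipeline summarized in the abstract lends itself to a brief illustration. The Python sketch below shows the general pattern under stated assumptions: K-means clustering groups PA spectra into candidate training sets, and a small one-dimensional convolutional network then classifies individual spectra as healthy or diseased. All array sizes, layer widths, and training settings are hypothetical placeholders, the cluster-to-class assignment is simplified, and the active-contour step that yields the final 3D delineation is only indicated in a comment; this is not the authors' implementation.

# Hypothetical sketch of the framework described in the abstract.
# Shapes, hyperparameters and the synthetic data are illustrative only.
import numpy as np
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

n_pixels, n_wavelengths = 1000, 59            # assumed multispectral cube size
spectra = np.random.rand(n_pixels, n_wavelengths).astype(np.float32)

# Step 1: unsupervised K-means groups the spectra; in the paper, clusters are
# used to select unbiased "healthy" vs "diseased" training examples. Here the
# cluster index simply stands in for that label.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(spectra)
labels = torch.tensor(clusters, dtype=torch.long)

# Step 2: a small 1-D CNN classifies each pixel's PA spectrum.
model = nn.Sequential(
    nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
    nn.MaxPool1d(2),
    nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
    nn.Linear(32, 2),                         # healthy vs diseased logits
)

x = torch.from_numpy(spectra).unsqueeze(1)    # (pixels, 1 channel, wavelengths)
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for _ in range(5):                            # a few illustrative epochs
    optimiser.zero_grad()
    loss = loss_fn(model(x), labels)
    loss.backward()
    optimiser.step()

# Step 3 (not shown): per-pixel predictions would be assembled into a
# probability map and refined with an active-contour algorithm to obtain
# the 3-D tumor boundary.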

author
Merdasa, Aboma; Fracchia, Alice; Stridh, Magne; Hult, Jenny; Andersson, Emil; Edén, Patrik; Olariu, Victor and Malmsjö, Malin
organization
publishing date
2025
type
Contribution to journal
publication status
published
subject
keywords
Clinical translation, Deep learning, Melanoma, Photoacoustic imaging, Spectroscopy
in
Photoacoustics
volume
45
article number
100743
publisher
Elsevier
external identifiers
  • scopus:105009741400
  • pmid:40686556
ISSN
2213-5979
DOI
10.1016/j.pacs.2025.100743
language
English
LU publication?
yes
additional info
Publisher Copyright: © 2025 The Authors
id
f5e4c5e5-00b8-4f52-9b85-2e6e0291be82
date added to LUP
2025-09-12 14:17:09
date last changed
2025-09-26 18:57:32
@article{f5e4c5e5-00b8-4f52-9b85-2e6e0291be82,
  abstract     = {{<p>The incidence of melanoma is rising and will require more efficient diagnostic procedures to meet a growing demand. Excisional biopsy followed by histopathology is still the standard, and it often requires multiple surgical incisions with increasing margins due to inaccurate visual assessment of where the melanoma borders on healthy tissue. This challenge stems, in part, from the inability to reliably delineate the melanoma without visually inspecting chemically stained histopathological cross-sections. Spectroscopic imaging has shown promise for non-invasively characterizing the molecular composition of tissue and thereby distinguishing melanoma from healthy tissue based on spectral features. In this work, we describe a computational framework applied to multispectral photoacoustic (PA) imaging data of melanoma in humans and demonstrate how the borders of the tumor can be automatically determined without human input. The framework combines K-means clustering, for an unbiased selection of training data, a one-dimensional convolutional neural network applied to PA spectra to classify pixels as either healthy or diseased, and an active contour algorithm to finally delineate the melanoma in 3D. The work stands to impact clinical practice, as it can provide both pre-surgical and perioperative guidance to ensure complete tumor removal with minimal surgical incisions.</p>}},
  author       = {{Merdasa, Aboma and Fracchia, Alice and Stridh, Magne and Hult, Jenny and Andersson, Emil and Edén, Patrik and Olariu, Victor and Malmsjö, Malin}},
  issn         = {{2213-5979}},
  keywords     = {{Clinical translation; Deep learning; Melanoma; Photoacoustic imaging; Spectroscopy}},
  language     = {{eng}},
  publisher    = {{Elsevier}},
  journal      = {{Photoacoustics}},
  title        = {{Advancing non-invasive melanoma diagnostics with deep learning and multispectral photoacoustic imaging}},
  url          = {{http://dx.doi.org/10.1016/j.pacs.2025.100743}},
  doi          = {{10.1016/j.pacs.2025.100743}},
  volume       = {{45}},
  year         = {{2025}},
}