Lund University Publications

Measuring mammographic density : comparing a fully automated volumetric assessment versus European radiologists' qualitative classification

Sartor, Hanna; Lång, Kristina; Rosso, Aldana; Borgquist, Signe; Zackrisson, Sophia and Timberg, Pontus (2016) In European Radiology 26(12). p. 4354-4360
Abstract

OBJECTIVES: Breast Imaging-Reporting and Data System (BI-RADS) mammographic density categories are associated with considerable interobserver variability. Automated methods of measuring volumetric breast density may reduce variability and be valuable in risk and mammographic screening stratification. Our objective was to assess agreement of mammographic density by a volumetric method with the radiologists' classification.

METHODS: Eight thousand seven hundred and eighty-two examinations from the Malmö Breast Tomosynthesis Screening Trial were classified according to BI-RADS, 4th Edition. Volumetric breast density was assessed using automated software for 8433 examinations. Agreement between volumetric breast density and BI-RADS was descriptively analyzed. Agreement between radiologists and between categorical volumetric density and BI-RADS was calculated, rendering kappa values.
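
As a rough illustration of what the agreement analysis involves, the following is a minimal sketch of how observed (raw) agreement and Cohen's kappa can be computed for two paired categorical density ratings. This is not the study's analysis code; the ratings below are made up, and scikit-learn's cohen_kappa_score is used for convenience.

# Illustrative only: paired BI-RADS-style categories (1-4) for the same
# examinations from two raters (e.g. two radiologists, or software vs. reader).
import numpy as np
from sklearn.metrics import cohen_kappa_score

reader_a = np.array([1, 2, 2, 3, 4, 3, 2, 1, 4, 3])
reader_b = np.array([1, 2, 3, 3, 4, 3, 2, 2, 4, 2])

observed_agreement = np.mean(reader_a == reader_b)  # fraction of exact category matches
kappa = cohen_kappa_score(reader_a, reader_b)       # agreement corrected for chance

print(f"Observed agreement: {observed_agreement:.1%}")
print(f"Cohen's kappa: {kappa:.2f}")

Kappa discounts the agreement expected by chance alone, which is why it is lower than the raw percentage agreement reported in the results.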

RESULTS: The observed agreement between BI-RADS scores of different radiologists was 80.9 % [kappa 0.77 (0.76-0.79)]. A spread of volumetric breast density for each BI-RADS category was seen. The observed agreement between categorical volumetric density and BI-RADS scores was 57.1 % [kappa 0.55 (0.53-0.56)].
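
The kappa values above are reported with intervals in parentheses (presumably confidence intervals). The abstract does not state how these intervals were derived; one common, generic way to attach an interval to kappa is a nonparametric percentile bootstrap over examinations, sketched below. The function name and inputs are hypothetical, not taken from the paper.

# Sketch of a percentile-bootstrap interval for Cohen's kappa between two
# raters. This is a generic technique, not necessarily the method used in
# the paper; ratings_a/ratings_b are hypothetical paired category arrays.
import numpy as np
from sklearn.metrics import cohen_kappa_score

def bootstrap_kappa_ci(ratings_a, ratings_b, n_boot=2000, alpha=0.05, seed=0):
    """Point estimate and percentile bootstrap interval for Cohen's kappa."""
    a = np.asarray(ratings_a)
    b = np.asarray(ratings_b)
    rng = np.random.default_rng(seed)
    n = len(a)
    boot = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)   # resample examinations with replacement
        boot.append(cohen_kappa_score(a[idx], b[idx]))
    lower, upper = np.percentile(boot, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return cohen_kappa_score(a, b), (lower, upper)

With several thousand paired examinations, as in the trial, such resampling generally yields correspondingly narrow intervals.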

CONCLUSIONS: There was moderate agreement between volumetric density and BI-RADS scores from European radiologists, indicating that radiologists evaluate mammographic density differently than software does. The automated method may be a robust and valuable tool; however, differences in interpretation between radiologists and software require further investigation.

KEY POINTS:
• Agreement between qualitative and software density measurements has not been frequently studied.
• There was substantial agreement between different radiologists' qualitative density assessments.
• There was moderate agreement between software and radiologists' density assessments.
• Differences in interpretation between software and radiologists require further investigation.

author
Sartor, Hanna; Lång, Kristina; Rosso, Aldana; Borgquist, Signe; Zackrisson, Sophia and Timberg, Pontus
organization
publishing date
2016
type
Contribution to journal
publication status
published
subject
in
European Radiology
volume
26
issue
12
pages
4354 - 4360
publisher
Springer
external identifiers
  • pmid:27011371
  • scopus:84961637036
  • wos:000387810700018
ISSN
0938-7994
DOI
10.1007/s00330-016-4309-3
language
English
LU publication?
yes
id
e457afbd-f55e-4014-ba3d-fb5c3ad6c047
date added to LUP
2016-04-11 12:40:44
date last changed
2024-03-06 20:45:40
@article{e457afbd-f55e-4014-ba3d-fb5c3ad6c047,
  abstract     = {{<p>OBJECTIVES: Breast Imaging-Reporting and Data System (BI-RADS) mammographic density categories are associated with considerable interobserver variability. Automated methods of measuring volumetric breast density may reduce variability and be valuable in risk and mammographic screening stratification. Our objective was to assess agreement of mammographic density by a volumetric method with the radiologists' classification.</p><p>METHODS: Eight thousand seven hundred and eighty-two examinations from the Malmö Breast Tomosynthesis Screening Trial were classified according to BI-RADS, 4th Edition. Volumetric breast density was assessed using automated software for 8433 examinations. Agreement between volumetric breast density and BI-RADS was descriptively analyzed. Agreement between radiologists and between categorical volumetric density and BI-RADS was calculated, rendering kappa values.</p><p>RESULTS: The observed agreement between BI-RADS scores of different radiologists was 80.9 % [kappa 0.77 (0.76-0.79)]. A spread of volumetric breast density for each BI-RADS category was seen. The observed agreement between categorical volumetric density and BI-RADS scores was 57.1 % [kappa 0.55 (0.53-0.56)].</p><p>CONCLUSIONS: There was moderate agreement between volumetric density and BI-RADS scores from European radiologists indicating that radiologists evaluate mammographic density differently than software. The automated method may be a robust and valuable tool; however, differences in interpretation between radiologists and software require further investigation.</p><p>KEY POINTS: • Agreement between qualitative and software density measurements has not been frequently studied. • There was substantial agreement between different radiologists´ qualitative density assessments. • There was moderate agreement between software and radiologists' density assessments. • Differences in interpretation between software and radiologists require further investigation.</p>}},
  author       = {{Sartor, Hanna and Lång, Kristina and Rosso, Aldana and Borgquist, Signe and Zackrisson, Sophia and Timberg, Pontus}},
  issn         = {{0938-7994}},
  language     = {{eng}},
  number       = {{12}},
  pages        = {{4354--4360}},
  publisher    = {{Springer}},
  series       = {{European Radiology}},
  title        = {{Measuring mammographic density : comparing a fully automated volumetric assessment versus European radiologists' qualitative classification}},
  url          = {{http://dx.doi.org/10.1007/s00330-016-4309-3}},
  doi          = {{10.1007/s00330-016-4309-3}},
  volume       = {{26}},
  year         = {{2016}},
}