
Lund University Publications


Automatic determination of NET (neutrophil extracellular traps) coverage in fluorescent microscopy images

Coelho, Luis Pedro ; Pato, Catarina ; Friães, Ana ; Neumann, Ariane ; von Köckritz-Blickwede, Maren ; Ramirez, Mário and Carriço, João André (2015) In Bioinformatics 31(14). p. 2364-2370
Abstract

MOTIVATION: Neutrophil extracellular traps (NETs) are believed to be essential in controlling several bacterial pathogens. Quantification of NETs in vitro is an important tool in studies aiming to clarify the biological and chemical factors contributing to NET production, stabilization and degradation. This estimation can be performed on the basis of fluorescent microscopy images using appropriate labelings. In this context, it is desirable to automate the analysis to eliminate both the tedious process of manual annotation and possible operator-specific biases.


RESULTS: We propose a framework for the automated determination of NET content, based on visually annotated images which are used to train a supervised machine-learning method. We derive several methods in this framework; the best results are obtained by combining them into a single prediction. The overall Q² of the combined method is 93%. By having two experts label part of the image set, we were able to compare the performance of the algorithms against the human inter-operator variability. We find that the two operators exhibited a very high correlation in their overall assessment of the NET coverage area in the images (R² of 97%), although there were consistent differences in labeling at the pixel level (Q², which unlike R² does not correct for additive and multiplicative biases, was only 89%).
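The distinction the abstract draws between Q² and R² can be made concrete: Q² compares predictions against the ground truth directly, while the squared Pearson correlation R² is invariant under any affine (additive plus multiplicative) rescaling of the predictions. The sketch below uses the standard definitions of both metrics on invented toy data, not data from the paper:

```python
import numpy as np

def q_squared(y_true, y_pred):
    # Q²: fraction of variance explained by the raw predictions,
    # with no correction for additive or multiplicative bias.
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def r_squared(y_true, y_pred):
    # R² as the squared Pearson correlation: unchanged by any
    # affine transform of the predictions.
    r = np.corrcoef(y_true, y_pred)[0, 1]
    return r ** 2

y = np.array([0.1, 0.3, 0.5, 0.7, 0.9])   # true coverage fractions
biased = 2.0 * y + 0.2                     # perfectly correlated, but biased

print(r_squared(y, biased))   # 1.0: correlation ignores the bias
print(q_squared(y, biased))   # far below 1: Q² penalizes the bias
```

On this toy data Q² is actually negative, illustrating why two operators can agree almost perfectly by R² while their pixel-level Q² is noticeably lower.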

AVAILABILITY AND IMPLEMENTATION: Open source software (under the MIT license) is available at https://github.com/luispedro/Coelho2015_NetsDetermination for both reproducibility and application to new data.
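The paper's actual framework uses richer pixel features and learners (see the linked repository); as a loose illustration of the general idea — train a supervised pixel classifier on annotated images, then report coverage as the fraction of pixels classified as NET — here is a toy sketch on synthetic data, with a nearest-centroid classifier standing in for the real method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: a 64x64 "fluorescence image" and a manual
# annotation mask (1 = NET pixel, 0 = background). Real features would be
# richer (texture, neighborhood statistics); here a pixel's only feature
# is its intensity.
image = rng.random((64, 64))
mask = (image > 0.6).astype(int)   # stand-in for the expert labeling

X = image.ravel()[:, None]   # one feature (intensity) per pixel
y = mask.ravel()

# Toy supervised learner: classify each pixel by the nearer of the two
# class-mean intensities learned from the annotation.
centroids = np.array([X[y == c].mean() for c in (0, 1)])
pred = np.argmin(np.abs(X - centroids), axis=1)

# NET coverage = fraction of pixels classified as NET.
coverage = pred.mean()
print(f"predicted coverage: {coverage:.2f}, annotated: {y.mean():.2f}")
```

In practice one would train on annotated images and predict on new ones; here train and test coincide only to keep the sketch short.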

author
Coelho, Luis Pedro ; Pato, Catarina ; Friães, Ana ; Neumann, Ariane ; von Köckritz-Blickwede, Maren ; Ramirez, Mário and Carriço, João André
publishing date
2015
type
Contribution to journal
publication status
published
keywords
Algorithms, Extracellular Traps, Humans, Image Interpretation, Computer-Assisted, Microscopy, Fluorescence, Neutrophils, Observer Variation, Pattern Recognition, Automated, Reproducibility of Results, Software, Journal Article, Research Support, Non-U.S. Gov't
in
Bioinformatics
volume
31
issue
14
pages
7 pages
publisher
Oxford University Press
external identifiers
  • scopus:84941729482
  • pmid:25792554
ISSN
1367-4803
DOI
10.1093/bioinformatics/btv156
language
English
LU publication?
no
id
c8b7ce65-f83c-4496-909b-f584b922f7c2
date added to LUP
2017-09-19 12:20:04
date last changed
2024-03-31 16:53:27
@article{c8b7ce65-f83c-4496-909b-f584b922f7c2,
  abstract     = {{<p>MOTIVATION: Neutrophil extracellular traps (NETs) are believed to be essential in controlling several bacterial pathogens. Quantification of NETs in vitro is an important tool in studies aiming to clarify the biological and chemical factors contributing to NET production, stabilization and degradation. This estimation can be performed on the basis of fluorescent microscopy images using appropriate labelings. In this context, it is desirable to automate the analysis to eliminate both the tedious process of manual annotation and possible operator-specific biases.</p><p>RESULTS: We propose a framework for the automated determination of NET content, based on visually annotated images which are used to train a supervised machine-learning method. We derive several methods in this framework. The best results are obtained by combining these into a single prediction. The overall Q(2) of the combined method is 93%. By having two experts label part of the image set, we were able to compare the performance of the algorithms to the human interoperator variability. We find that the two operators exhibited a very high correlation on their overall assessment of the NET coverage area in the images (R(2) is 97%), although there were consistent differences in labeling at pixel level (Q(2), which unlike R(2) does not correct for additive and multiplicative biases, was only 89%).</p><p>AVAILABILITY AND IMPLEMENTATION: Open source software (under the MIT license) is available at https://github.com/luispedro/Coelho2015_NetsDetermination for both reproducibility and application to new data.</p>}},
  author       = {{Coelho, Luis Pedro and Pato, Catarina and Friães, Ana and Neumann, Ariane and von Köckritz-Blickwede, Maren and Ramirez, Mário and Carriço, João André}},
  issn         = {{1367-4803}},
  keywords     = {{Algorithms; Extracellular Traps; Humans; Image Interpretation, Computer-Assisted; Microscopy, Fluorescence; Neutrophils; Observer Variation; Pattern Recognition, Automated; Reproducibility of Results; Software; Journal Article; Research Support, Non-U.S. Gov't}},
  language     = {{eng}},
  month        = {{07}},
  number       = {{14}},
  pages        = {{2364--2370}},
  publisher    = {{Oxford University Press}},
  series       = {{Bioinformatics}},
  title        = {{Automatic determination of NET (neutrophil extracellular traps) coverage in fluorescent microscopy images}},
  url          = {{http://dx.doi.org/10.1093/bioinformatics/btv156}},
  doi          = {{10.1093/bioinformatics/btv156}},
  volume       = {{31}},
  year         = {{2015}},
}