“This ground truth is muddy anyway” : Ground truth data assemblages for medical AI development
(2025) In Sociologisk Forskning 62(1–2), p. 85–106
- abstract
- This article explores assemblages of ground truth datasets for the development of medical artificial intelligence (AI). By drawing from interviews and observations, I examine how AI experts developing medical AI relate to the referential truth basis of their work, their ground truths, as an epistemic concern. By addressing how datasets are assembled from different sources, and produced, augmented and synthesised, this study shows how ground truths are valued based on humanness, quality of medical expert judgements, temporality and technical qualities. Moreover, this article analyses truth practices as productive moments in AI development, the role of human expertise and the perceived strengths and limits of expert-based annotations. The valuations of ground truths shatter the image of medical classifications, and AI models, as stable neutral entities. Moreover, this article shows how valuations of ground truths encompass more than alignment with standardised expertise. To better understand the possibilities for medical AI to live up to ideals of accuracy, fairness, trustworthiness and transparency, we need more knowledge on assumptions, negotiations and epistemic concerns upon which medical AI is built.
Please use this url to cite or link to this publication:
https://lup.lub.lu.se/record/1cd2bcc0-9379-4ae7-99ee-aa004f735310
- author
- Högberg, Charlotte
LU
- organization
- publishing date
- 2025-06-12
- type
- Contribution to journal
- publication status
- published
- subject
- keywords
- AI, Artificial intelligence, ground truth, Medicine, Science and technology, epistemology
- in
- Sociologisk Forskning
- volume
- 62
- issue
- 1–2
- pages
- 85 - 106
- publisher
- Swedish Sociological Association
- external identifiers
- scopus:105008660329
- ISSN
- 2002-066X
- DOI
- 10.37062/sf.62.27826
- project
- AI in the Name of the Common Good - Relations of data, AI and humans in health and public sector
- language
- English
- LU publication?
- yes
- id
- 1cd2bcc0-9379-4ae7-99ee-aa004f735310
- date added to LUP
- 2025-02-25 09:30:08
- date last changed
- 2025-07-10 04:04:18
@article{1cd2bcc0-9379-4ae7-99ee-aa004f735310,
  abstract     = {{This article explores assemblages of ground truth datasets for the development of medical artificial intelligence (AI). By drawing from interviews and observations, I examine how AI experts developing medical AI relate to the referential truth basis of their work, their ground truths, as an epistemic concern. By addressing how datasets are assembled from different sources, and produced, augmented and synthesised, this study shows how ground truths are valued based on humanness, quality of medical expert judgements, temporality and technical qualities. Moreover, this article analyses truth practices as productive moments in AI development, the role of human expertise and the perceived strengths and limits of expert-based annotations. The valuations of ground truths shatter the image of medical classifications, and AI models, as stable neutral entities. Moreover, this article shows how valuations of ground truths encompass more than alignment with standardised expertise. To better understand the possibilities for medical AI to live up to ideals of accuracy, fairness, trustworthiness and transparency, we need more knowledge on assumptions, negotiations and epistemic concerns upon which medical AI is built.}},
  author       = {{Högberg, Charlotte}},
  issn         = {{2002-066X}},
  keywords     = {{AI; Artificial intelligence; ground truth; Medicine; Science and technology; epistemology}},
  language     = {{eng}},
  month        = {{06}},
  number       = {{1–2}},
  pages        = {{85--106}},
  publisher    = {{Swedish Sociological Association}},
  series       = {{Sociologisk Forskning}},
  title        = {{“This ground truth is muddy anyway” : Ground truth data assemblages for medical AI development}},
  url          = {{https://lup.lub.lu.se/search/files/221246566/Hogberg_This_ground_truth_is_muddy_anyway_SoFo.pdf}},
  doi          = {{10.37062/sf.62.27826}},
  volume       = {{62}},
  year         = {{2025}},
}