Lund University Publications

Technical Skills Assessment in a Coronary Angiography Simulator for Construct Validation.

Jensen, Ulf (LU); Jensen, Jens; Olivecrona, Göran (LU); Ahlberg, Gunnar and Tornvall, Per (2013) In Simulation in Healthcare: the Journal of the Society for Simulation in Healthcare 8(5). p. 324-328
Abstract
INTRODUCTION: The aim of this study was to evaluate technical skills in a coronary angiography (CA) simulator to establish the performance level of trainees and experts in virtual CA. The traditional master-apprentice way of learning CA is by practicing on patients, despite a known risk of complications during training. Safe CA training is therefore warranted, and simulators might be one possibility. Any simulator used must be validated with regard to its ability to separate trainees from experts. To our knowledge, construct validation of a CA simulator has not yet been published. METHODS: Ten cardiology residents without experience in CA, 4 intermediate operators, and 10 CA experts performed 5 CAs in the Mentice VIST (Vascular Intervention Simulation Trainer). Metrics reflecting proficiency, such as total procedure time, fluoroscopy time, and contrast volume, were extracted from the simulator computer and compared between the groups. All examinations were videotaped, and the number of handling errors was examined. The videos were evaluated by 2 experts blinded to the test subject's performance level. RESULTS: Experts outperformed trainees in all metrics measured by the simulator. Improvement was demonstrated in all metrics across all 5 CAs. Furthermore, beginners made more handling errors than experts. CONCLUSIONS: The Mentice VIST simulator can distinguish between trainees and experts in CA on the metrics extracted from the computer, which supports its construct validity.
author
Jensen, Ulf; Jensen, Jens; Olivecrona, Göran; Ahlberg, Gunnar and Tornvall, Per
organization
publishing date
2013
type
Contribution to journal
publication status
published
subject
in
Simulation in Healthcare: the Journal of the Society for Simulation in Healthcare
volume
8
issue
5
pages
324 - 328
publisher
Lippincott Williams & Wilkins
external identifiers
  • wos:000330308100005
  • pmid:23598862
  • scopus:84885380181
ISSN
1559-713X
DOI
10.1097/SIH.0b013e31828fdedc
language
English
LU publication?
yes
id
92643984-23f6-4425-8640-df2bc55a0d76 (old id 3733585)
alternative location
http://www.ncbi.nlm.nih.gov/pubmed/23598862?dopt=Abstract
date added to LUP
2016-04-01 09:48:12
date last changed
2022-04-27 07:39:22
@article{92643984-23f6-4425-8640-df2bc55a0d76,
  abstract     = {{INTRODUCTION: The aim of this study was to evaluate technical skills in a coronary angiography (CA) simulator to establish the performance level of trainees and experts in virtual CA. The traditional master-apprentice way of learning CA is by practicing on patients despite a known risk for complications during training. Safe CA training is warranted, and simulators might be one possibility. Simulators used must be validated regarding their ability to separate trainees from experts. Construct validation of a CA simulator, to our knowledge, has not yet been published. METHODS: Ten cardiology residents without experience in CA, 4 intermediate, and 10 CA experts performed 5 CAs in the Mentice VIST (Vascular Intervention Simulation Trainer). Metrics reflecting proficiency skills such as total procedure time, fluoroscopy time, and contrast volume were extracted from the simulator computer and compared between the groups. All examinations were videotaped, and the number of handling errors was examined. The videos were evaluated by 2 experts blinded to the test object's performance level. RESULTS: Experts outperformed trainees in all metrics measured by the simulator. Improvement was demonstrated in all metrics through all 5 CAs. Furthermore, beginners had more handling errors compared with experts. CONCLUSIONS: Mentice VIST simulator can distinguish between trainees and experts in CA in the metrics extracted from the computer and therefore prove the concept of construct validity.}},
  author       = {{Jensen, Ulf and Jensen, Jens and Olivecrona, Göran and Ahlberg, Gunnar and Tornvall, Per}},
  issn         = {{1559-713X}},
  language     = {{eng}},
  number       = {{5}},
  pages        = {{324--328}},
  publisher    = {{Lippincott Williams \& Wilkins}},
  series       = {{Simulation in Healthcare: the Journal of the Society for Simulation in Healthcare}},
  title        = {{Technical Skills Assessment in a Coronary Angiography Simulator for Construct Validation.}},
  url          = {{http://dx.doi.org/10.1097/SIH.0b013e31828fdedc}},
  doi          = {{10.1097/SIH.0b013e31828fdedc}},
  volume       = {{8}},
  year         = {{2013}},
}
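
For reference, a minimal LaTeX sketch of how the exported entry above could be used to cite this record; the file name references.bib and the plain bibliography style are assumptions, while the citation key is the record id taken from the entry itself.

% Minimal LaTeX document citing the record above.
% Assumes the BibTeX entry is saved as references.bib (file name is an assumption).
\documentclass{article}
\begin{document}
Construct validity of the Mentice VIST coronary angiography simulator has been
reported by Jensen et al.~\cite{92643984-23f6-4425-8640-df2bc55a0d76}.
\bibliographystyle{plain}   % any standard style works; plain is chosen here as an example
\bibliography{references}   % points to the assumed references.bib file
\end{document}

Compiling with pdflatex, then bibtex, then pdflatex twice resolves the citation in the usual way.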