
The interactive examination: assessing students' self-assessment ability

Mattheos, N; Nattestad, A; Falk Nilsson, Eva and Attstrom, R (2004) In Medical Education 38(4). pp. 378-389
Abstract
BACKGROUND The ability to self-assess one's competence is a crucial skill for all health professionals. The interactive examination is an assessment model aiming to evaluate not only students' clinical skills and competence, but also their ability to self-assess their proficiency. METHODS The methodology utilised students' own self-assessment, an answer to a written essay question and a group discussion. Students' self-assessment was matched to the judgement of their instructors. As a final task, students compared their own essay to one written by an 'expert'. The differences pointed out by students in their comparison documents and the accompanying arguments were analysed and categorised. Students received individual feedback on their performance and learning needs. The model was tested on one cohort of undergraduate dental students (year 2001, n = 52) in their third semester of studies, replacing an older form of examination in the discipline of clinical periodontology. RESULTS Students' acceptance of the methodology was very positive. Students tended to overestimate their competence in relation to the judgement of their instructors in diagnostic skills, but not in skills relevant to treatment. No gender differences were observed, although females performed better than males in the examination. Three categories of differences were observed in the students' comparison documents. The accompanying arguments may reveal students' understanding and methods of prioritising. CONCLUSIONS Students tended to overestimate their competence in diagnostic rather than treatment skills. The interactive examination appeared to be a convenient tool for providing deeper insight into students' ability to prioritise, self-assess and steer their own learning.
type: Contribution to journal
publication status: published
keywords: medical, students, education, undergraduate, clinical competence, standards, reproducibility of results, faculty, educational measurement, methods, self evaluation programmes, comparative study
in: Medical Education
volume: 38
issue: 4
pages: 378-389
publisher: Wiley-Blackwell
external identifiers:
  • wos:000220272400007
  • pmid:15025639
  • scopus:1842784967
ISSN: 0308-0110
DOI: 10.1046/j.1365-2923.2004.01788.x
language: English
LU publication?: yes
id: 093993ba-5915-44a2-afa8-f06902ffe472 (old id 284802)
date added to LUP: 2007-10-27 19:19:11
date last changed: 2017-07-02 03:32:28
@article{093993ba-5915-44a2-afa8-f06902ffe472,
  abstract     = {BACKGROUND The ability to self-assess one's competence is a crucial skill for all health professionals. The interactive examination is an assessment model aiming to evaluate not only students' clinical skills and competence, but also their ability to self-assess their proficiency. METHODS The methodology utilised students' own self-assessment, an answer to a written essay question and a group discussion. Students' self-assessment was matched to the judgement of their instructors. As a final task, students compared their own essay to one written by an 'expert'. The differences pointed out by students in their comparison documents and the accompanying arguments were analysed and categorised. Students received individual feedback on their performance and learning needs. The model was tested on one cohort of undergraduate dental students (year 2001, n = 52) in their third semester of studies, replacing an older form of examination in the discipline of clinical periodontology. RESULTS Students' acceptance of the methodology was very positive. Students tended to overestimate their competence in relation to the judgement of their instructors in diagnostic skills, but not in skills relevant to treatment. No gender differences were observed, although females performed better than males in the examination. Three categories of differences were observed in the students' comparison documents. The accompanying arguments may reveal students' understanding and methods of prioritising. CONCLUSIONS Students tended to overestimate their competence in diagnostic rather than treatment skills. The interactive examination appeared to be a convenient tool for providing deeper insight into students' ability to prioritise, self-assess and steer their own learning.},
  author       = {Mattheos, N and Nattestad, A and Falk Nilsson, Eva and Attstrom, R},
  issn         = {0308-0110},
  keywords     = {medical, students, education, undergraduate, clinical competence, standards, reproducibility of results, faculty, educational measurement, methods, self evaluation programmes, comparative study},
  language     = {eng},
  number       = {4},
  pages        = {378--389},
  publisher    = {Wiley-Blackwell},
  journal      = {Medical Education},
  title        = {The interactive examination: assessing students' self-assessment ability},
  doi          = {10.1046/j.1365-2923.2004.01788.x},
  url          = {https://doi.org/10.1046/j.1365-2923.2004.01788.x},
  volume       = {38},
  year         = {2004},
}