Development and testing of an assessment instrument for the formative peer review of significant event analyses
J McKay,1 D J Murphy,2 P Bowie,3 M-L Schmuck,3 M Lough,2 K W Eva3

  1. Department of Postgraduate Medicine, University of Glasgow, Glasgow, UK
  2. NHS Education for Scotland, Glasgow, UK
  3. McMaster University, Hamilton, Ontario, Canada

Correspondence to: Dr J McKay, NHS Education for Scotland, Postgraduate General Practice Education, 2 Central Quay, 89 Hydepark Street, Glasgow G3 8BW, UK; john.mckay{at}nes.scot.nhs.uk

Abstract

Aim: To establish the content validity and specific aspects of reliability for an assessment instrument designed to provide formative feedback to general practitioners (GPs) on the quality of their written analysis of a significant event.

Methods: Content validity was quantified by application of a content validity index. Reliability testing involved a nested design, with 5 cells, each containing 4 assessors, rating 20 unique significant event analysis (SEA) reports (10 each from experienced GPs and GPs in training) using the assessment instrument. The variance attributable to each identified variable in the study was established by analysis of variance. Generalisability theory was then used to investigate the instrument’s ability to discriminate among SEA reports.
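An item-level content validity index of the kind applied here is conventionally computed as the proportion of expert panellists who rate an item as relevant (for example, 3 or 4 on a 4-point relevance scale). A minimal sketch of that calculation, using a hypothetical 10-expert panel and illustrative ratings rather than the study's actual data:

```python
def item_cvi(ratings, relevant=(3, 4)):
    """Item-level content validity index: the proportion of experts
    rating the item as relevant (3 or 4 on a 4-point scale)."""
    return sum(r in relevant for r in ratings) / len(ratings)

# Hypothetical relevance ratings from a 10-expert panel for one item.
panel = [4, 3, 4, 4, 3, 4, 2, 4, 3, 4]
print(item_cvi(panel))  # 0.9  (9 of 10 experts endorse the item)
```

An item is usually retained when its index meets a pre-set threshold (0.78 or higher is a common convention for panels of this size).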

Results: Content validity was demonstrated with at least 8 of 10 experts endorsing all 10 items of the assessment instrument. The overall G coefficient for the instrument was moderate to good (G>0.70), indicating that the instrument can provide consistent information on the standard achieved by the SEA report. There was moderate inter-rater reliability (G>0.60) when four raters were used to judge the quality of the SEA.
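For a simple reports-by-raters design, a generalisability coefficient like the one reported above is formed from estimated variance components: the variance attributable to the reports themselves, divided by that variance plus the error variance averaged over the number of raters. A sketch of the formula with hypothetical variance components (not the values estimated in this study):

```python
def g_coefficient(var_report, var_error, n_raters):
    """Generalisability coefficient for a reports x raters design:
    G = var_report / (var_report + var_error / n_raters)."""
    return var_report / (var_report + var_error / n_raters)

# Hypothetical variance components, for illustration only.
var_report, var_error = 0.8, 1.2
print(round(g_coefficient(var_report, var_error, n_raters=4), 2))  # 0.73
```

The formula makes the study's key lever explicit: averaging over more raters shrinks the error term, so inter-rater variation can be offset by increasing the number of peer assessors per report.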

Conclusions: This study provides the first steps towards validating an instrument that can provide educational feedback to GPs on their analysis of significant events. The key area identified to improve instrument reliability is variation among peer assessors in their assessment of SEA reports. Further validity and reliability testing should be carried out to provide GPs, their appraisers and contractual bodies with a validated feedback instrument on this aspect of the general practice quality agenda.

  • CVI, content validity index
  • GP, general practitioner
  • SEA, significant event analysis


Footnotes

  • Funding: NHS Education for Scotland.

  • Competing interests: None.