Credentialing and Certifying with Simulation

https://doi.org/10.1016/j.anclin.2007.03.002

Assessment and evaluation are integral parts of any educational and training process, and students at all levels of training respond by studying more seriously for the parts of a course that are assessed. To promote effective learning, assessment with simulation and other teaching methods should be both formative and summative, because the ultimate goal is to ensure professional competence. This article describes a model of medical competence and focuses on the use of medical simulation in the assessment and evaluation of different levels of clinical competence, using examples from the authors' experience.


Miller's model of medical competence

A well-established model of medical competence was suggested by Miller [3]. Miller's pyramid presents four layers of competence, defined as (1) knows, (2) knows how, (3) shows how, and (4) does.

  1. The “knows” level refers to the recall of facts, principles, and theories. The traditional method of assessing this level was the essay, later replaced by multiple-choice tests that assess the recall and application of knowledge [4].

  2. The “knows how” level involves the ability to solve problems and describe

Objective structured clinical examination

The objective structured clinical examination (OSCE) consists of multiple stations around which examinees rotate, performing specific tasks on which they are assessed by examiners or by the simulated patients on whom the tasks are performed [16]. The OSCE is widely used for assessment and certification around the world. This tool is used for the evaluation of foreign doctors wishing to practice in the United States [17]; by the General Medical Council [18]; by the Medical Council of Canada [19]; and for the evaluation

Admission to Tel Aviv University Sackler School of Medicine

Most medical schools' selection processes rely mainly on tools that measure cognitive factors. The most common tools record candidates' previous academic achievements, coupled with norm-referenced cognitive examinations [35]. These tools can be administered efficiently to large numbers of candidates, produce data that can be objectively scored and analyzed, and are good predictors of students' academic success in medical school [36]. Interviews incorporated into the admission process are the

The Israeli board examination in anesthesiology

Many institutions have introduced high-fidelity medical simulation into residents' training. The interrater reliability, construct validity, and value of simulation-based scenarios as an effective tool for the evaluation of residents have been explored. In the United Kingdom, an OSCE was incorporated into the Fellowship of the Royal College of Anaesthetists (FRCA) examinations. The areas tested in the OSCE stations include resuscitation, technical skills, anatomy, history-taking, physical

Keys for success in incorporating simulation and objective structured clinical examination into assessment and evaluation

The incorporation of the OSCE into the assessment of medical personnel involves a structured process of examination development, including the definition of assessment conditions, tasks, and scenarios on the basis of accepted medical protocols and broad professional consensus. This process contributes to the objectivity, content validity, and face validity of the examination, and to its positive reception by the medical community. In the authors' experience, a major key for success in such

References (42)

  • G. Bordage. An alternative approach to PMP's: the ‘key-features’ concept.
  • S.M. Case et al. Extended-matching items: a practical alternative to free response questions. Teach Learn Med (1993).
  • R.M. Harden et al. Assessment of clinical competence using an objective structured clinical examination (OSCE). Med Educ (1979).
  • J. Turnbull et al. Clinical work sampling: a new approach to the problem of in-training evaluation. J Gen Intern Med (2000).
  • P. Ram et al. Assessment of general practitioners by video observation of communicative and medical performance in daily practice: issues of validity, reliability and feasibility. Med Educ (1999).
  • P.G. Ramsey et al. Use of peer rating in physician performance. JAMA (1993).
  • J. Rethans et al. Assessment of performance in actual practice of general practitioners by use of standardized patients. Br J Gen Pract (1991).
  • Lockie C. The examination for membership in the Royal College of General Practitioners of England. Royal College of...
  • E.R. Petrusa. Clinical performance assessment.
  • A. Ziv et al. Lessons learned from six years of international administrations of the ECFMG's SP-based clinical skills assessment. Acad Med (1998).
  • P. Tombeson et al. Defining the content for the objective structured clinical examination component of the Professional and Linguistic Assessment Board examination: development of a blueprint. Med Educ (2000).