| Reference | Strategy of event identification | Degree of automation* | Automated method source of data | Comparison method source of data | Gold standard applied by independent, blind reviewer? | Gold standard applied regardless of automated outcome? | Study method applied to independent patient set? | Comments |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| **Field-defined** | | | | | | | | |
| Nebeker et al18 | Computer algorithms | Chart review for study; unclear whether strategy aims to be Full or Partial | ICD-9 CM codes | Medical record | Yes | Yes | No | Used the Hougland et al30 methodology to apply HOCTA (hierarchically optimal classification tree analysis) to administrative data and develop surveillance rules identifying ADEs manifesting as either bleeding or delirium. Requires expert computer programming. |
| Zhan et al17 | Patient Safety Indicators | Full | ICD-9 CM codes | Medical record | Unknown | Unknown | No | DVT/PE events flagged by ICD-9 CM codes were compared with those discovered by gold-standard chart review. The sample studied was a random sample abstracted by the Medicare Patient Safety Monitoring System. |
| Brossette et al44 | Nosocomial Infection Marker | Full | Medical record and lab database | Medical record | Yes | Yes | No | Nosocomial Infection Marker (NIM) program by Med Mined, Birmingham, Alabama. Took about 10 min/week to maintain. Total time for NIM: 2 h per 10 000 admissions, compared with medical record review at 1.5 full-time employees per 10 000 admissions. |
| Hougland et al30 | Automated ICD-9 code strategy | Full (review of flagged charts was for study purposes only) | ICD-9 CM codes | Medical record | Yes | Yes | No | An expert panel identified 416 ICD-9 CM codes to represent ADEs (flagged ADEs); chart review was then performed to assess the codes' ability to detect and identify ADEs. |
| Polancich et al15 | Patient Safety Indicators | Full | Administrative data, billing data, ICD-9 CM diagnosis and procedure codes | Medical record | No | No | No | Designed to test the validity of Agency for Healthcare Research and Quality (AHRQ) PSIs for detecting hospital-acquired decubitus ulcers. Only a sample of cases was manually reviewed. |
| Dormann et al26 | Automated laboratory signal detection | Full | Demographics, history, lab findings, drugs, and diagnoses | Medical record | Unknown | Unknown | No | Used automated lab signals (ALS) and changes in ALS to identify ADEs. The automated system flagged potential ADEs, which were then sent as alerts to physicians. Use of delta ALS (change) improved on the earlier methodology of Dormann et al25. |
| Trick et al20 | Computer algorithm | Full and Partial | Medical record; lab, pharmacy, and radiology databases; microbiology | Medical record; lab, pharmacy, and radiology databases; microbiology | Yes | Yes | No | Compared manual and computer-assisted surveillance of central venous catheter bloodstream infections using data from two hospitals. Different computer algorithms, developed for full or partial automation, were tested. |
| Levy et al23 | Automated laboratory signal detection | Partial | Lab database | Lab database and clinical data | Unknown | Yes | No | Implementation of the pilot program described by Azaz-Livshits et al22. Computerised lab data were monitored to detect ADEs using the same signals as the pilot study. |
| Azaz-Livshits et al22 | Automated laboratory signal detection | Partial | Lab database | Lab database and clinical data | Unknown | Yes | No | Pilot program to develop and assess computerised laboratory data as an ADE detection tool in a 34-bed medical ward in Jerusalem, Israel. Lab signals were generated by computer, then verified by a team. Computerised patient data were limited at this hospital, but lab data were fully electronic. The system's cost was reasonable compared with the costs of ADEs. |
| Jha et al32 | Automated triggers | Partial | Medical record | Medical record | Yes | Unknown | No | Computer-based ADE identification using modified rules from Classen et al8 (1991) as automated triggers with which the electronic record was screened. Rules were modified during the study to increase PPV, and new rules were created. The trained reviewer and physician were blinded to the detection method. The automated method required 11 person-hours per week versus 55 for chart review and 5 for voluntary reporting. |
| **NLP** | | | | | | | | |
| Penz et al47 | Computer algorithms and natural language processing | Partial | Text records: daily progress notes; consultation, nursing, and procedure notes; operative reports; discharge summaries | Text records: daily progress notes; consultation, nursing, and procedure notes; operative reports; discharge summaries | No | No | No | Compared two methods for semiautomated review of text records within the VA database: NLP (MedLEE) and a phrase-matching algorithm (PMA). Limited by incomplete or inaccurate documentation, incomplete coding, spelling errors, sentence structure, and abbreviations. Time- and technology-intensive. |
| Forster et al34 | Computerised screen for trigger words in free text | Partial | Discharge summaries | Discharge summaries | Yes | Yes | No | Used the automated adverse-event lexicon of 104 terms from Murff et al33. A computerised search engine (dtSearch Desktop) scanned discharge summaries to detect potential harm. Specificity was higher for non-elective admissions and for discharge summaries dictated by residents/staff versus medical students. Automated detection reduced physician time by one-fifth. |
| Melton and Hripcsak48 | Natural language processing | Partial | Discharge summaries | Full electronic chart; combined electronic and paper chart for a subset of 100 patients | No | No | No | NLP system (MedLEE) used to identify 45 New York Patient Occurrence Reporting and Tracking System event types. Chart review of a random sample of 1000 charts by a physician and an independent informatician assessed the NLP program's performance. Results were biased towards patients with electronic discharge summaries. This method is technologically intensive. |
| Murff et al33 | Computerised screen for trigger words in free text | Full (goal is a fully automated system; manual review of subsamples performed for the study) | Discharge summaries | Medical record (not otherwise specified) | Yes | Yes† | Yes | Brigham and Women's Hospital, using the Brigham Integrated Computing System. A computerised screening tool searched free-text discharge summaries for trigger words indicating possible adverse events. The list of automated trigger words was compiled using Harvard Medical Practice Study definitions as a base. The electronic method alone was compared with electronic plus manual review in two cohorts. Reviewers were blinded to whether the screening tool had identified the admission. |
ADE, adverse drug event; AE, adverse event; DVT, deep venous thrombosis; NLP, natural language processing; PE, pulmonary embolism; PPV, positive predictive value; PSI, patient safety indicators; VA, Veterans Administration.
* We define fully automated methods as those in which the identification of harm was not followed by further chart review, and partially automated methods as those in which patient records flagged by the automated detection of potential harm (eg, a 'trigger') were manually reviewed to verify harm.
† The authors manually reviewed a random 25% sample of screened-negative charts, then used this sample to estimate the number of adverse events occurring in the entire set of screened-negative charts.
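The fully automated, field-defined strategies in the table (eg, Zhan et al17; Hougland et al30) reduce computationally to set membership: a discharge is flagged when any of its ICD-9 CM codes appears in a pre-specified flag set, and no follow-up chart review occurs. A minimal sketch of that core, using an invented three-code flag set and invented discharge records for illustration (the Hougland panel's actual set contained 416 codes):

```python
# Hypothetical flag set for illustration only; the expert panel in
# Hougland et al identified 416 ICD-9 CM codes representing ADEs.
ADE_FLAG_CODES = {"E934.2", "E930.5", "995.2"}

def flag_discharges(discharges, flag_codes=ADE_FLAG_CODES):
    """Return ids of discharges carrying at least one flagged code.

    In a fully automated strategy the flag itself counts as the
    identified event, with no further chart review.
    """
    return [d_id for d_id, codes in discharges.items()
            if flag_codes & set(codes)]

# Invented discharge records keyed by id, each a list of ICD-9 CM codes.
discharges = {
    "D1": ["401.9", "E934.2"],   # carries a flagged code
    "D2": ["250.00", "414.01"],  # no flagged codes
}
print(flag_discharges(discharges))  # -> ['D1']
```

The validation step the studies describe (gold-standard chart review of flagged and unflagged records) happens outside this loop and is what the table's "gold standard" columns characterise.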
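The partially automated trigger-word strategies (eg, Murff et al33; Forster et al34) share a similarly simple core: scan free-text discharge summaries for terms from an adverse-event lexicon and queue matching records for manual review. A sketch of that matching step, assuming an invented three-term mini-lexicon and invented summaries (the published lexicon contained 104 terms):

```python
import re

# Hypothetical mini-lexicon for illustration; the adverse-event
# lexicon used by Murff et al and Forster et al had 104 terms.
TRIGGER_TERMS = ["hemorrhage", "delirium", "deep venous thrombosis"]

def flag_summaries(summaries, terms=TRIGGER_TERMS):
    """Return ids of discharge summaries containing any trigger term.

    Whole-word, case-insensitive matching; in a partially automated
    workflow flagged records go on to manual review to verify harm.
    """
    patterns = [re.compile(r"\b" + re.escape(t) + r"\b", re.IGNORECASE)
                for t in terms]
    return [record_id for record_id, text in summaries.items()
            if any(p.search(text) for p in patterns)]

# Invented example summaries keyed by admission id.
summaries = {
    "A1": "Post-op course complicated by lower GI hemorrhage.",
    "A2": "Uneventful recovery; discharged home in good condition.",
    "A3": "Developed delirium on day 3, resolved with supportive care.",
}
print(flag_summaries(summaries))  # -> ['A1', 'A3']
```

Plain substring or phrase matching like this is what makes the approach cheap but also explains the limitations Penz et al47 note (spelling errors and abbreviations defeat exact matching), which is the gap NLP systems such as MedLEE aim to close.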