Crises in clinical care: an approach to management
W B Runciman1, A F Merry2

1Department of Anaesthesia & Intensive Care, University of Adelaide and Royal Adelaide Hospital, Adelaide, South Australia, Australia
2Department of Anaesthesiology, University of Auckland, Auckland, New Zealand

Correspondence to: Professor W B Runciman, Australian Patient Safety Foundation, GPO Box 400, Adelaide, South Australia 5001, Australia; research@apsf.net.au

Abstract

A “crisis” in health care is “the point in the course of a disease at which a decisive change occurs, leading either to recovery or to death”. The daunting challenges faced by clinicians when confronted with a crisis are illustrated by a tragic case in which a teenage boy died after a minor surgical procedure. Crises are challenging for reasons which include: presentation with non-specific signs or symptoms, interaction of complex factors, progressive evolution, new situations, “revenge effects”, inadequate assistance, and time constraints. In crises, clinicians often experience anxiety- and overload-induced performance degradation, tend to use “frequency gambling”, run out of “rules” and have to work from first principles, and are prone to “confirmation bias”. The effective management of crises requires formal training, usually simulator-based, and ideally in the inter-professional groups who will need to function as a team. “COVER ABCD–A SWIFT CHECK” is a precompiled algorithm which can be applied quickly to facilitate a systematic and effective response to the wide range of potentially lethal problems which may occur suddenly in anaesthesia. A set of 25 articles describing additional precompiled responses, collated into a manual for the management of any crisis under anaesthesia, has been published electronically as companion papers to this article. This approach to crisis management should be applied to other areas of clinical medicine as well as anaesthesia.

  • anaesthesia complications
  • crisis management
  • adverse events


A “crisis” has been defined as “a turning point” and, in the context of health care, “the point in the course of a disease at which a decisive change occurs, leading either to recovery or to death”.1 These definitions do not convey the daunting challenges faced by a clinician suddenly confronted with having to respond to a life threatening crisis in clinical medicine. The task facing the clinician is to detect that a crisis is at hand, to diagnose its underlying cause, and to take the necessary steps to divert the course of the patient’s condition from impending disaster towards recovery. This task is not always managed adequately, and the consequences of this inadequacy are sometimes tragic.

A substantial body of work has been done on managing crises during anaesthesia—much of it using high fidelity simulators.2–8 It has been clearly shown that experience in itself does not inoculate against failure, and that even senior consultants often make apparently basic errors in crises.9 These errors are understandable, but they cost lives and expose clinicians in the “front line” to litigation or even criminal prosecution.10 Crisis management needs to be improved urgently. This is true in anaesthesia, and also in other areas of clinical medicine. How is this improvement to be achieved?

In this article we present an overview of some of the important reasons why clinical crises continue to be so challenging and why clinicians may have difficulty responding appropriately. Some of these points are illustrated by an account of a clinical mishap and a case is made for the use of pre-compiled responses combined, ideally, with team training. A set of pre-compiled responses to crises during anaesthesia is presented in 25 articles which have been published electronically with this issue of the journal (box 1).11 The first of these describes how these responses were developed over a period of more than 15 years, involving 10 consensus meetings, each attended by 60–100 anaesthetists and human factors experts, and an exhaustive iterative analysis of 4000 incidents by a team of over 30 researchers. We argue that the need for this type of approach is as great in other fields of medicine as in anaesthesia.

Box 1 “Crisis management during anaesthesia” web resource list11

What messages can be drawn from the story in box 2 to guide us towards safer practice in the future?

Box 2 A tragic case

Richard Davis*, a fit, active 13 year old boy, died after a minor out-of-hours procedure for an infected knee.12 The anaesthetic was administered by a senior consultant anaesthetist, Dr Gale*, and proceeded uneventfully until, at the end of the procedure, Richard was transferred from the operating table. At this point he regurgitated and aspirated at least some gastric contents. He developed difficulty with breathing. He sat up and removed the laryngeal mask which had been used to maintain his airway, and then proceeded to display the signs of acute laryngospasm and, shortly thereafter, of frothing negative pressure pulmonary oedema. He was returned to a supine position and his trachea was intubated by Dr Gale. Dr Gale found it almost impossible to ventilate Richard’s lungs through the endotracheal tube. She took appropriate steps to determine that the tube was in the trachea and not blocked or kinked, and then removed and replaced it. She went on to perform other tasks one might expect in the management of a patient whose lungs and circulation were progressively deteriorating (the administration of 100% oxygen and of adrenaline, for example). She called for help very early in the development of this crisis, but effective help (in the form of an anaesthetic colleague who had to come in from home) took nearly 30 minutes to arrive. The contribution of junior doctors on the “arrest team”, who arrived earlier, was ineffectual. Richard’s condition deteriorated rapidly. When the second anaesthetist arrived, Dr Gale handed him the reservoir bag of the anaesthetic circuit to hold while she suctioned the endotracheal tube again. The bag did not deflate through the disconnected circuit, alerting the second anaesthetist to the fact that the filter used to protect the anaesthetic breathing circuit had become blocked, presumably by the frothing pulmonary oedema. Removing the filter restored adequate ventilation, but Richard had unfortunately developed irreversible brain damage by this stage and life support was discontinued the following day.

A charge of manslaughter

At Dr Gale’s subsequent trial for manslaughter—for failing to recognise that the filter was blocked—evidence was presented to the effect that this blockage could not have occurred until quite late in the proceedings when the froth in the circuit started to dry and encrust the surface of the hydrophobic filter. All four expert witnesses called in the case said that the general conduct of the resuscitation was within the limits of acceptable practice, and none was prepared to criticise without reservation Dr Gale’s failure to identify the problem with the filter. There was some criticism of the fact that she had not followed a crisis management algorithm known as “COVER” which had been published in the anaesthetic literature some years previously13 (see table 1), particularly in that she had not eliminated the patient breathing circuit and replaced it with a self-inflating bag as a formal diagnostic manoeuvre to distinguish between a patient related problem and an equipment related problem. However, it was accepted that not all anaesthetists agree that an algorithm based approach is the best way to manage a crisis. Moreover, this algorithm at that time did not explicitly require the elimination of filters because the problem of a blocked filter had not been anticipated or previously reported (the use of filters in anaesthetic circuits represents a relatively recent evolution in practice). The protocol has subsequently been modified in this regard, but would have been deficient at the time of this particular case.

*The case is a genuine one; the names are pseudonyms.

Table 1

 Crisis management algorithm – memorise and practise: “COVER ABCD” as published in 199313 (the sequence is “AB COVER CD” when the patient is breathing spontaneously)

The obvious message, that the filters used in anaesthetic circuits can block, is certainly germane to safety in anaesthesia, but the most important message is more generic: there is clearly an urgent need for approaches to the management of crises in clinical medicine that are effective and universally accepted, and that take into account both the challenging features of clinical crises and the inherent limitations of human beings. Furthermore, the practice and artefacts of clinical medicine (protocols, equipment, drugs) are continuously evolving, so these approaches need to be revised regularly.

WHY ARE SOME CRISES SO CHALLENGING?

It is clear when reading incident reports and observing clinicians attempting to handle crises in simulators that a number of factors frequently combine to make crises more challenging.2–16 These are listed in box 3 and are discussed in the light of the story in box 2.

Box 3 Factors which may conspire to make crises challenging

  • They may present with opaque, non-specific signs or symptoms.

  • They may arise from the interaction of many complex factors.

  • The problems may evolve, revealing additional layers of complexity.

  • The particular set of circumstances may never have been encountered before.

  • Recently introduced processes and equipment may bring new unforeseen problems.

  • Skilled assistance may not be available in the necessary time frame.

  • They may have to be resolved very rapidly if disaster is to be averted.

Crises may present with opaque, non-specific signs and symptoms

Of nearly 500 anaesthetic crises which resulted in death, well over half presented initially with signs as non-specific as changes in blood pressure, heart rate, and rhythm.14 In the tragic case of Richard Davis, the presenting features did not provide a clear indication of what the problem was, although he was clearly hypoxic. However, the differential diagnosis for hypoxia is huge. For example, among 179 incidents presenting with cyanosis or desaturation on pulse oximetry, there were over 20 different causes as diverse as hypoxic gas mixtures, unrecognised oesophageal intubation, anaphylaxis, air embolism, tension pneumothorax, and pulmonary oedema.15,16 Each of these superficially similar clinical situations requires an urgent but quite different response which, if delayed, may lead to death or permanent harm.

Clinical crises often arise from the interaction of many complex factors

They occur in a “dynamic environment with complex interactions between pathophysiology and disease processes, staff, infrastructure, equipment, and policies and procedures”.10 In Richard’s case the stage was set for disaster by the need for an urgent procedure out-of-hours in a small hospital, the lack of immediately available skilled help, Richard’s vulnerability due to his septic state, and a complex problem which arose suddenly without any warning.

The problem may evolve, revealing additional layers of complexity

The situation faced by Dr Gale was exceedingly difficult, involving multiple factors in a complex and rapidly evolving emergency which started with a “patient problem” (post-obstructive pulmonary oedema) and migrated to a very unusual “equipment problem” (the blockage of a filter), both of which manifested as difficulty in ventilation of the patient’s lungs.

The problem faced may not have been encountered before

Dr Gale had never encountered the particular set of circumstances that she was faced with. In fact, over two thirds of iatrogenic incidents in acute care hospitals that lead to patient harm are rare events which occur only once or twice a year, or even more infrequently.17 An individual clinician has no hope of getting a feel for all of these problems.

Recently introduced processes and equipment may bring new unforeseen problems

Health care and its artefacts (protocols, drugs, equipment) are continuously evolving and changing, bringing with them unforeseen and often unforeseeable new problems that can suddenly manifest when least expected. The filter blockage in Richard’s case is a good example. Filters in the breathing circuit were introduced into anaesthetic practice to protect patients from cross infection. The blockage of a filter, and the subsequent harm to a patient, is a form of “revenge effect”—the phenomenon by which an intervention to reduce one risk actually introduces another, which may remain unforeseen until a problem occurs.18

Skilled assistance may not be available in the necessary time frame

In Richard’s case, the junior medical staff available provided no useful assistance in the diagnosis or management of the problem, and more skilled help (although called for very promptly) was not immediately to hand.

Crises may have to be resolved very rapidly if disaster is to be averted

Many medical crises can lead to irreversible brain damage within minutes—especially those involving a problem (or, in Richard’s case, more than one problem) in the complex chain of oxygen delivery from a source of gas to the tissues.

WHY DO CLINICIANS HAVE DIFFICULTY RESPONDING APPROPRIATELY TO SOME CRISES?

Cognitive strategies and work practices which have evolved for speed and efficiency under normal working conditions may become maladaptive in uncertain or unusual circumstances.13,19,20 A number of factors which can affect human performance in a crisis may conspire to impede the rapid resolution of a dangerous problem (box 4).

Box 4 Factors which can affect human performance in a crisis

  • The usual cognitive strategy, “frequency gambling”, may be counterproductive in a crisis.

  • The clinician may run out of rules or apply the wrong rules.

  • Working from first principles, although powerful, may be too slow and laborious.

  • Anxiety engendered by imminent disaster may degrade performance.

  • The workload may be excessive in a crisis.

  • A “mind set” may lead to “confirmation bias” (that is, new information is discarded).

The usual cognitive strategy, “frequency gambling”, may be counterproductive in a crisis

Doctors are taught to look for common problems and not to be unduly enticed by the possibility of the unusual. “Frequency gambling” (in a medical context) involves choosing the most likely diagnosis without formally excluding all alternatives. This is a cognitive strategy necessary to function in complex environments,20 but one which militates against the diagnosis of an unusual problem. A blocked filter had not been reported in the first 2000 incidents submitted to the Australian Incident Monitoring Study.21
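
To illustrate why this heuristic fails for rare events, the following minimal sketch (in Python, with invented cause names and report counts that are not taken from the AIMS data) contrasts frequency gambling with working through a precompiled checklist.

    # Hypothetical report counts for causes of difficulty in ventilating the
    # lungs; the names and numbers are invented for illustration only.
    reported_causes = {
        "laryngospasm": 120,
        "bronchospasm": 85,
        "circuit disconnection": 40,
        "blocked endotracheal tube": 25,
        "blocked circuit filter": 0,   # never reported before this case
    }

    def frequency_gamble(counts):
        """Act on the single most frequently reported cause."""
        return max(counts, key=counts.get)

    def precompiled_checklist(checklist):
        """Visit every step of a precompiled checklist in a fixed order,
        regardless of how often each cause has been seen before."""
        for cause in checklist:
            yield cause

    print(frequency_gamble(reported_causes))              # laryngospasm
    print(list(precompiled_checklist(reported_causes)))   # includes the rare cause

A cause with no prior reports can never “win” a frequency gamble, whereas a checklist that includes eliminating the circuit and filter reaches that step every time it is run.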

The clinician may run out of rules or apply the wrong rules

Human beings function preferentially by pattern recognition and the use of “rules” to navigate routine tasks efficiently.20 In a crisis, a situation that matches a pattern from past experience and can be resolved with one of the clinician’s pre-stored, rule-based responses usually is resolved in this way. It is when the clinician runs out of rule-based responses, or when an inappropriate rule is applied, that the situation moves out of control. At this stage it becomes necessary to work from first principles.

Working from first principles, although powerful, may be too slow and laborious to be effective in a crisis

The capacity of human beings to work things out from first principles (so called knowledge-based or deliberative reasoning) is impressive, but it is laborious and time consuming.12,13,20 Rule-based decisions can be made almost instantaneously; thinking things out from first principles is slow and demanding of concentration, which may be compromised if the clinician’s arousal increases beyond the point of optimal performance.22

Anxiety engendered by imminent disaster may degrade performance

This is almost inevitable in the nightmare-like circumstances of trying to save the life of a previously well patient whose condition continues inexorably to deteriorate despite repeated and increasingly desperate attempts to turn around the crisis. This is one of the advantages of getting prompt assistance from helpers with less emotional investment in the case. When the other anaesthetist arrived to help Dr Gale, he had no knowledge of the progression of events, no preconceived idea of the problem, and was less inhibited by anxiety. The situation is analogous to that in which Brian Mehler arrived in the control room of the nuclear power station at Three Mile Island and, with “fresh eyes”, was able to avert catastrophe by identifying the nature of a problem that had eluded the fifty operators, engineers, and supervisors who had been present during the evolution of the crisis.23

The workload may be excessive in a crisis

In a crisis there may be multiple tasks needing simultaneous organisation and execution, leading to “overload” and a degradation in performance. This contrasts with the “everyday” workload, in which fewer tasks arise and can be dealt with sequentially.

A “mind set” may lead to “confirmation bias”

The tendency to interpret new information as supportive of an early view of events (even when the new information might, in fact, be pointing elsewhere) is known as “confirmation bias”.20 For someone in Dr Gale’s position, identifying a blocked filter as an isolated problem at the beginning of a case would be quite different from diagnosing a late blockage in the context of numerous other problems. Dr Gale had used the circuit (including the filter) for Richard’s entire anaesthetic without difficulty. More than one person had noted that air entry into the chest was present during the developing crisis (which would have been incompatible with a completely blocked filter, at least at that stage). There was clear evidence of a genuine patient-related problem—frothing pulmonary oedema. Each newly acquired fact would have been added to Dr Gale’s developing mental picture of what became, with the passage of time, a “strong but wrong” impression of the situation.20 “Fixation errors” may not only manifest in this way—“it’s this and only this”—but also as “it’s anything but this” or as “this can’t be happening”, leading to failure to commit to effective management.24

THE NEED FOR PRE-COMPILED RESPONSES

Dr Gale was found not guilty. Her responses were considered to constitute reasonable practice, so this was not a case of criminal negligence.12 In fact, this tragedy occurred in the hands of a well motivated, conscientious, fully trained, experienced, medically qualified, specialist anaesthetist who was present in the operating room throughout the case, who was not in any way impaired, and who was held by four experts to have been practising to a reasonable standard. Thus, we have to conclude that, even in skilled hands, the current standards of crisis management may fall short of the aim of the Anaesthesia Patient Safety Foundation, that “no patient shall be harmed by the effects of anaesthesia”.25

We have to accept that there will always be challenges that exceed any person’s ability to react adequately and in time, if he or she has to rely entirely on memorised rules supplemented by thinking from first principles.12 We are convinced that the crisis faced by Dr Gale would have been extremely difficult for any anaesthetist to handle on this basis.

On the other hand, the problem may not have been inherently impossible to resolve if appropriate “pre-compiled” algorithms had been available in writing. The use of a systematic pre-compiled response would, at the very least, have increased the chance of identifying those aspects of the crisis amenable to treatment. Even if we assume that the early laryngospasm and pulmonary oedema were severe enough to overwhelm conventional treatment to the extent that brain death preceded the blockage of the filter, at least we would have known that that was the case if the blockage had been found and corrected as soon as it occurred. The tragedy in cases such as this lies, to a large extent, in the possibility that the outcome could have been averted by relatively simple means.

DEVELOPING PRE-COMPILED RESPONSES

As the exact manifestation of future problems is impossible to predict, the best approach is to try, ahead of time, to work out from first principles all that can go wrong, taking into account reports of things that have gone wrong. The aim is to develop a structured approach which will be clear and quick to follow and which will cover all contingencies.

Algorithms for crisis management are often divided into a “phase 1” or “learned” response, in which some immediate actions are taken from memory, backed up by written instructions (the “phase 2” response).13 Phase 2 responses should be understandable to people who are not necessarily very experienced or particularly skilled in that discipline; this may be important when emergency procedures have to be carried out after hours in small hospitals, as was the case with Richard Davis. This two-phase approach was taken in developing a core algorithm for anaesthesia crises represented by the mnemonic COVER ABCD–A SWIFT CHECK. The initial component of this algorithm can be learned and practised (see table 1) and can then be backed up by an easy-reference manual with supplementary sub-algorithms for specific problems. These sub-algorithms, and how they were developed, are presented as a set of 25 articles which accompany this overview (box 1).11
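
A minimal sketch of this two-phase structure, in Python purely for illustration: the step letters follow the COVER ABCD portion of the mnemonic named above, but the individual checks and the sub-algorithm entries are placeholders, not the published content of table 1 or of the 24 sub-algorithms.

    # Phase 1: a short, memorised scan applied in a fixed order to every crisis.
    PHASE_1_STEPS = list("COVER") + list("ABCD")   # mnemonic letters only

    # Phase 2: written, easy-reference sub-algorithms keyed by working diagnosis.
    # Keys and text are placeholders for the 24 published sub-algorithms.
    PHASE_2_SUBALGORITHMS = {
        "difficult ventilation": "open the relevant sub-algorithm in the manual",
        "desaturation": "open the relevant sub-algorithm in the manual",
    }

    def manage_crisis(run_check, working_diagnosis):
        """Run every phase 1 step without skipping, then consult the written
        phase 2 sub-algorithm for whichever problem has been identified."""
        for step in PHASE_1_STEPS:
            run_check(step)   # quick, always performed, never reordered
        return PHASE_2_SUBALGORITHMS.get(
            working_diagnosis,
            "no matching sub-algorithm: work from first principles and call for help",
        )

The design point is that the memorised phase 1 scan is always completed before any specific diagnosis is committed to, so that problems such as a blocked filter are eliminated even when they are not suspected.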

The concept of a core algorithm had its origins in 1988,26 and an early version of COVER, checked against the first 2000 incidents reported to the Australian Incident Monitoring Study (AIMS), was published in 1993.13 It permits 60% of all problems reported in these 2000 incidents to be addressed within 40–60 seconds and leads on to 24 sub-algorithms covering more than 99% of the remaining problems.

Gaba et al published a comprehensive book on crisis management in anaesthesia in the year after the publication of COVER.24 Commendably comprehensive algorithms are provided for 10 “generic events” and a further 73 “specific events”. Most of the core “steps” in COVER for an initial response to a severe event are recommended, but the book does not advocate always starting with a standard sequence of actions. The possibility therefore exists of choosing the wrong algorithm. For example, if the “hypoxaemia” or “cardiac arrest” algorithm had been chosen in Richard’s case there would have been no direct recommendation to replace the breathing circuit, whereas if the algorithm for “high peak inspiratory pressure” had been chosen there would have been (this occurs at the “E” (eliminate) stage of COVER, table 1).

It is necessary to strike a balance between a comprehensive compendium of detailed responses to all possible events and a simple set of recommendations which will be of practical value in a crisis in which the diagnosis has yet to be made. The COVER algorithm itself has been criticised for being too complex.27 The optimal level of detail for a crisis management tool to be used by the average practitioner has not been determined. The use of algorithms in crisis management has not yet gained universal acceptance. Important work remains to be done in this area.

ACCOMPANYING PUBLICATIONS

A crisis management manual was published by the Australian Patient Safety Foundation in 199628 and has since been updated. The manual has been crosslinked within an Australian Society of Anaesthetists’ on-line teaching module which is in preparation.29 The pre-compiled responses in this manual were developed largely on the basis of an analysis of successfully managed crises, as recorded by anaesthetists in relevant AIMS reports, checked against the first 4000 incidents reported to AIMS27 and iteratively modified on the basis of wide consultation and feedback. They are published electronically in 25 papers which accompany this issue of the journal.11 There are limitations to the approach outlined in the manual,27 and it is so far supported only by evidence at level IV.30 Nevertheless, the approach is rational and the 24 sub-algorithms, correctly used, should handle more than 99% of the crises faced by an anaesthetist.

Much remains to be done. Carefully designed simulation-based studies should be undertaken to evaluate the approaches advocated against possible alternatives and to systematically refine the sub-algorithms.27 The greatest limitation may, in fact, be in how successfully the recommendations are applied. For example, 58% of trainees made major errors in the management of a cardiac arrest in a simulator, in spite of the fact that this is one crisis for which there are accepted recommendations with which anaesthetists are supposed to be familiar.31

THE NEED FOR TRAINING

The successful management of crises requires formal training and regular practice.32–34 However, such training and practice is still the exception rather than the rule in anaesthesia (let alone the rest of medicine), although there is now a simulator-based course in Australia and New Zealand for the “Effective Management of Anaesthetic Crises”. Most anaesthetists spend the vast majority of their time on the provision of routine, relatively uneventful anaesthesia, not on crisis management. This problem pertains to other activities as well, such as aviation and monitoring the function of nuclear power plants. Reason has called it “the catch-22 of human supervisory control”.20

One of the reasons some anaesthetists may not have availed themselves of crisis management training appears to be the psychological phenomenon known as “optimism bias”, in which most individual members of a group view and report their abilities as better than the average of the group, a clearly impossible state of affairs.35–37 If individuals believe they can fall back on their (“superior”) ability to handle crises from past experience and from first principles, they are unlikely to be highly motivated to spend time and effort on training.

There are compelling reasons for training in crisis management to be conducted in interprofessional teams. For example, managing a difficult airway requires several people to cooperate closely in a tight time frame.38 Yet, doctors and nurses, who routinely have to work together in managing these problems, rarely train for this important eventuality in teams. The concept of crew (or crisis) resource management (CRM) training has been highlighted as a valuable tool in the management of crises in aviation.39 There has been an extensive body of work applying these concepts to anaesthesia.2–8,32–34,40

THE NEED FOR ONGOING SURVEILLANCE

COVER was updated in light of the new information that emerged from Richard’s case (see footnote to table 1). The need for ongoing collection of information about things that go wrong is increasing as the pace of medical advances quickens. This information should be collated in large databases and made available to all who need it. The databases should include detailed information from near-miss incidents and from occasions when patients have been harmed (“sentinel” and adverse events), collated using a common classification from reviews, incident reports, audits, medical records, complaints, legal files, registers, root cause analyses, and coronial recommendations. This will maximise the chance of any new dangerous problem being “designed out” of the system, included in relevant algorithms, or at least of being brought to everyone’s attention, ideally before patients are harmed.41
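
As a sketch of what such collation might look like in practice, the following Python fragment pools records from several sources under one classification; the field names, source labels, and threshold are assumptions for illustration, not the structure of AIMS or of any other existing database.

    from collections import Counter
    from dataclasses import dataclass

    @dataclass
    class IncidentRecord:
        source: str          # e.g. "incident report", "audit", "coronial finding"
        classification: str  # code from a common classification shared by all sources
        harm_occurred: bool  # near miss versus adverse or sentinel event
        narrative: str

    def emerging_problems(records, threshold=2):
        """Pool records from all sources and flag any classification reported
        at least `threshold` times, so that a new danger (such as a blocked
        filter) is noticed before many patients are harmed."""
        counts = Counter(r.classification for r in records)
        return sorted(c for c, n in counts.items() if n >= threshold)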

Key messages

  • Even experienced clinicians make basic errors in crisis situations (box 2).

  • This exposes clinicians to litigation and adverse publicity even if they are not at fault.

  • The use of pre-compiled responses is one strategy clinicians can use to improve their performance and reduce perceived culpability (table 1).

  • Crises may be very challenging (box 3).

  • Clinicians may have difficulty responding appropriately (box 4).

  • The use of pre-compiled responses should be supplemented by regular training, preferably in teams.

  • Crisis management algorithms should be regularly reviewed and updated in the light of new information about what may go wrong in health care.

  • Other disciplines in clinical medicine would benefit from the use of crisis management algorithms, based on consensus and evidence on what goes wrong in their field of practice.

CRISIS MANAGEMENT IN OTHER AREAS OF CLINICAL MEDICINE

Although protocols exist for the management of a variety of acute clinical conditions (beyond the boundaries of anaesthesia) at a local hospital level (for example, hypertension, eclampsia, dysrhythmias), only a few have been published42 or are widely accepted. One exception is the approach known as Early Management of Severe Trauma (EMST)43 and as Advanced Trauma Life Support.44 Originally developed by a practitioner whose relatives were poorly managed after suffering multiple trauma at a remote location, it has now been widely adopted and has undoubtedly improved the management of acute trauma. EMST training is undertaken in teams (and may be regularly updated), but these teams still seldom include nurses.

The need for algorithms and team training has been identified in several other areas—for example, problems with cardiopulmonary bypass.45 The longest standing widely accepted crisis management protocols are those for cardiac arrest. These have been progressively refined over half a century and are often nationally endorsed.46,47 Many hospitals require staff to be “certified”, sometimes at yearly intervals. It is now recognised that most patients do not suffer a cardiac arrest “out of the blue”; on the contrary, there are sufficient prodromal signs in the majority of cases for 6–8 hours before an arrest to allow a medical emergency team (MET) to be called at a time when it is possible to avert the impending crisis.48,49 Evidence is accumulating that this approach is associated with fewer cardiac arrests, fewer unexpected intensive care admissions and, possibly, improvements in hospital safety and an overall reduction in hospital mortality.50–52 However, to date there does not seem to be widespread realisation that the majority of MET calls are for a limited number of problems which are amenable to a structured approach. Such an approach would reduce variability in diagnostic and therapeutic interventions and decrease the likelihood of overlooking some of the more unusual problems when they do occur.

CONCLUSIONS

Many crises in clinical medicine are managed poorly, exposing patients to harm and clinicians to litigation. We as doctors owe it to our patients to make sure that contemporary knowledge of cognitive function and human performance is incorporated into our practice, so that the chance of avoiding potentially preventable disasters is always optimised. Anaesthetists have led the way in this endeavour, but much remains to be done. Not all anaesthetists have accepted the need for this approach and there is substantial scope for crisis management algorithms to be developed and adopted in a number of other areas in clinical medicine. A start has been made with the development of a set of tools underpinned by the 25 papers which accompany this article.
