GLOSSARY

 


 

 

Browsing Letter A

 

Active Error (or Active Failure)

An active error is one that occurs at the point of care, or "sharp end", and is committed by the person closest to the patient. As a result, these errors are usually easily visible; for example, the nurse who gives the wrong dose of insulin to the patient.

 In contrast, a latent error (or latent condition) may exist far away from the point of care or "sharp end". A latent error may be described as a more subtle and less evident change in the system or organization that often has a powerful effect on those at the "sharp end" of care.

 An example of a latent error would be the hospital administrator who schedules the nurse to work back-to-back eight-hour shifts, greatly increasing the chance that the exhausted nurse will make an error in insulin administration.

 One reason why recurrent active errors tend to be hard to prevent is that they may be largely due to latent errors which are often more powerful in their "downstream effects" but harder to spot. Until the latent error is corrected, other individuals, given the same unsafe conditions, will be prone to making errors at the "sharp end" of care.

 These terms were coined by the cognitive psychologist James Reason and are widely used today.

 

Adverse Drug Event (ADE)

An adverse event involving medication use that may or may not be due to a medical error. An example of an adverse drug event without an error would be the first episode of muscle pain following the administration of an appropriate dose of a statin. If, on the other hand, an excessively large dose of insulin was given because of the misreading of an order, that would be an ADE due to a medication error. Many ADEs are preventable, even if no medical error has occurred.

 

Adverse Drug Reaction

An adverse effect that occurs despite the correct use of a medication. An example would be orthostatic hypotension due to an antihypertensive, or a drug rash secondary to the same drug.

 

Adverse Event

Any injury caused by medical care. It may not be due to an error, but rather an expected or unexpected result of the therapy. This undesirable outcome is often misjudged because of "hindsight bias", but an adverse event by itself is not evidence of medical error.

 

Algorithm

A series of steps used to solve a problem. When an organization uses a single, well-validated written algorithm for a particular problem, and the care team is taught how to use the algorithm correctly, the number of errors in delivering that care can be greatly reduced. Widespread use of common procedures and algorithms can have a positive effect on patient safety.

 

Anchoring Bias (or Error)

This is a common logical error that frequently leads to faulty analysis in a clinical setting. This cognitive error is the mistake of allowing a first or early impression to be overvalued in making a diagnosis.

 An example of an anchoring bias is a common, but often incorrect rule of thumb, or "heuristic", such as "always trust your first impression".

 Another related example is the phrase, "this and only this". When an anchoring bias leads to a knowledge-based diagnostic error, physicians have a difficult time recovering from it and find it harder to re-evaluate the case in light of contradictory new information.

 

Authority Gradient

A term used widely in crew and organizational management circles to describe the balance of decision-making power in a given situation. Another term sometimes used is the steepness of command hierarchy.

 The danger of a steep authority gradient is the understandable reluctance of members who are lower in the hierarchy to challenge the leader, even when there is a concern that an error was made or an unsafe condition exists. Even mere clarification of an order or situation may be difficult.

 More successful "cultures of safety" make the necessary lines of authority clearer, more transparent, and more appropriate for the level of experience and skill, while allowing team members to question authority if there is a concern regarding an order or the analysis of the clinical needs of the patient.

 

Availability Bias (or Heuristic)

This bias occurs when the first explanation that comes to mind is overvalued and the process of analysis ends at that point. A common illustration is the well-studied finding that a physician who picks up a pen bearing the name of a drug is more apt to think of that drug when writing a prescription for a drug in that class.

 More serious for diagnostic problems is the tendency to choose the diagnosis already accompanying the patient chart as the cause for the present symptom set instead of exploring other options.


Browsing Letter B

 

Back-up Checks

A phrase used to indicate a re-examination confirming that a critical step has been carried out correctly. For example, a second nurse checks the dosage of insulin drawn up by the first nurse. Another example is a second physician on the team rechecking the initial orders to be sure they are complete and still appropriate. It is characteristic of a well-functioning team, a "culture of safety", that back-up checks are routine and accepted error reduction techniques. In general, the more critical the process, the more crucial back-up checks become, but they are also useful in lower risk situations.

 

Barriers

A term that is often used in a positive sense, as in barriers that prevent patient injury, such as in the "swiss cheese" model of patient safety strategies. Alternatively, it is used to describe obstacles to safer clinical practices, such as inadequate staffing in a hospital setting, or the unwillingness of team members to change a "workaround" even when it causes further problems in care.

 

Bayesian Approach

Using probabilistic reasoning that factors in the pre-test probability of the various disease states under consideration together with the sensitivity and specificity of the particular test or tests performed.

 This sound clinical approach is a logical way to deal with diagnostic uncertainty, but it is used far less often than it should be.
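
 To make the arithmetic concrete, the following is a minimal sketch in Python. The pre-test probability, sensitivity, and specificity are invented numbers used only to illustrate how a test result updates the probability of disease; it is not clinical guidance.

    def post_test_probability(pre_test, sensitivity, specificity, test_positive):
        """Update the probability of disease after a test result (Bayes' theorem)."""
        if test_positive:
            true_positive = pre_test * sensitivity
            false_positive = (1 - pre_test) * (1 - specificity)
            return true_positive / (true_positive + false_positive)
        false_negative = pre_test * (1 - sensitivity)
        true_negative = (1 - pre_test) * specificity
        return false_negative / (false_negative + true_negative)

    # A positive result on a moderately accurate test raises a 10% pre-test
    # probability only to about 50%; it does not make the diagnosis certain.
    print(post_test_probability(0.10, 0.90, 0.90, test_positive=True))   # ~0.50
    print(post_test_probability(0.10, 0.90, 0.90, test_positive=False))  # ~0.01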

 

Beers Criteria

These criteria define medications that should be avoided in older patients, whether a specific medicine or a dosage level or frequency that is unsafe. The criteria grew out of a guide for clinical practice and for health services research and quality assurance review, and cover numerous categories of drugs.

 

Benchmarks

A positive outcome or result that is derived from empiric data comparing several institutions or individuals that can serve as a standard of excellence.

 An example would be the observation that the top 10% of all practices achieved a mean A1c of 7% in their Type 1 diabetic patients. An A1c of 7% may then be chosen as the benchmark.

 For a benchmark to be useful, it should have face validity, be clinically important, be reasonably achievable, and be free of unintended consequences.
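
 A minimal sketch of how a benchmark might be derived from empiric data. The practice-level A1c values and the "top 10%" cutoff below are invented for illustration only.

    # Practice-level mean A1c values (invented data; lower is better).
    practice_mean_a1c = [8.4, 7.9, 7.2, 6.9, 8.8, 7.5, 7.0, 8.1, 7.3, 9.0]

    def benchmark(values, top_fraction=0.10):
        """Return the mean result achieved by the best-performing fraction of practices."""
        ranked = sorted(values)                      # best (lowest A1c) first
        cutoff = max(1, int(len(ranked) * top_fraction))
        top_performers = ranked[:cutoff]
        return sum(top_performers) / len(top_performers)

    print(f"Benchmark mean A1c: {benchmark(practice_mean_a1c):.1f}%")   # 6.9%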

 

Black Box Warnings

Prominent warning labels placed on packages for some prescription medications in the United States. The genesis of these warnings is a result of either post-approval clinical trials or post-market drug surveillance that document serious adverse reactions to the drug. As a result, the US Food & Drug Administration (FDA) may require a pharmaceutical company to add a black box warning on the label of a particular drug.

 The key to understanding black box warnings is to realize that the warning points the clinician towards the relative risk and benefit for that patient, but does not rule out the use of the medication. Implicit in black box warnings is the understanding that the risk-benefit ratio of a particular drug for a particular patient may vary widely. The clinical information in the black box warning is meant to help delineate the relative risk-benefit ratio. Sometimes black box warnings appear years after the drug is approved, emphasizing the need for post-market surveillance for all medications.

 

Blunt End

The term "blunt end" refers to the portions of the health care system often far away from the point of care, which have a profound influence on the performance of personnel at the "sharp end" of care. Latent errors or conditions occur at the "blunt end" of care. Examples at the "blunt end" include a hospital administrator’s decision to under staff a nursing unit. Although the error may be made by an exhausted nurse in her last hours of a long shift, the decision at the "blunt end" of care makes that kind of error more likely.


Browsing Letter C

 

Checklist

Checklists are used widely in both medical and non-medical settings, because of the many advantages of using a written listing of the sequence of the actions that will be performed in a clinical setting. The strength of checklists derives from the fact that they represent algorithms that are understood by all, and represent rules of appropriate behavior that the team is expected to follow.

 

Clinical Decision Support System (CDSS)

Systems designed to support and improve clinical decision making, either diagnostic or therapeutic in nature. They are particularly powerful when embedded in an electronic health record, where a series of prompts may guide the user through clinical tasks, such as beginning the sequence of a drug interaction check. Another example is the generation of patient-specific recommendations after the user provides clinical information such as renal function or allergy history: the CDSS software factors that data into a recommendation of the appropriate dose or choice of an antibiotic. A related topic is clinical reminders, which may be warnings generated by the computer software, or prompts appearing at the time an order is entered.
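
 As an illustration of the kind of rule logic a CDSS might apply, here is a small sketch using a hypothetical drug name ("examplomycin") and invented renal-function thresholds; it is not clinical guidance and does not represent any particular CDSS product.

    def recommend_dose(creatinine_clearance_ml_min, penicillin_allergy):
        """Return a patient-specific recommendation from simple, invented rules."""
        if penicillin_allergy:
            return "Alert: documented penicillin allergy - consider a different drug class."
        if creatinine_clearance_ml_min >= 60:
            return "examplomycin 500 mg every 8 hours (hypothetical full dose)"
        if creatinine_clearance_ml_min >= 30:
            return "examplomycin 500 mg every 12 hours (hypothetical renal adjustment)"
        return "examplomycin 500 mg every 24 hours (hypothetical renal adjustment)"

    print(recommend_dose(creatinine_clearance_ml_min=25, penicillin_allergy=False))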

 

Clinical Guidelines

Clinical guidelines refer to an evidence-based review of best practices and recommendations concerning a clinical topic. Clinical guidelines are frequently submitted to the National Guideline Clearinghouse and posted on its website. The preferred model of a clinical guideline emphasizes clarity, brevity, and grading of the supporting evidence, and is transparent with respect to the supporting data behind the recommendations. Robust clinical guidelines form the basis for the performance measure sets used by the National Quality Forum, the Physician Consortium for Performance Improvement, and other related organizations.

 

Close Call

This term is used in patient safety discussions to describe an error or event that did not lead to patient injury, but easily might have. In settings where backup checks are common, close calls result when an error is corrected by someone who is checking the work of another. A synonym is the term "near miss".

 

Cognitive Factors

Many medical errors result from factors that affect how people think: limitations of memory and judgment, distraction, fatigue, and other extraneous factors. Cognitive factors are frequently underestimated in importance, yet may be very predictable in settings where things may not be what they seem.

 

Cognitive Impairment

This is a subset of cognitive factors, but one where some condition, either transient or permanent, is affecting the ability of the individual to correctly analyze, recall, or deal with a clinical situation. This is not merely a patient problem, but often may relate to those closely involved in the care of the patient, including the providers.

 

Cognitive Overload

This is an all-too-common problem in health care settings, often because of defective methods of communication to both patients and health care personnel. As the complexity or unfamiliarity of information presented increases, the error rate increases dramatically. Often the person providing the information is not aware that the person receiving the information no longer comprehends it well, and a subsequent error is likely. An example of this is a physician or educator who provides an intensive hour of education regarding self-care in diabetes to an individual who is recovering from diabetic ketoacidosis. The patient may be reluctant to admit that their thinking is less clear and that they no longer can recall much of what was said. Another example is an inexperienced house officer in a critical care setting who does not carry out complex instructions given to them during the crisis.

 

Communication

In the area of patient safety, disruption of communication or faulty communication techniques represent an extremely important and common source of misunderstandings which often result in unsafe patient conditions. The range of possible causes is very broad, and encompasses problems which may be institutional, systemic, technical, patient-related, and also due to lapses in the delivery of the information by the provider.

 

Competency

In the sphere of patient safety, competency is often defined as having the necessary knowledge or skill to perform a given procedure or function at a level that is determined to be sufficient for safe and acceptable care. Data from several countries shows that trainees often are not adequately supervised, and it may be unknown whether they have the necessary knowledge or technical skill to competently perform a particular procedure.

 

Complexity Theory

Originally defined as an approach to understanding the behavior of systems that exhibit non-linear dynamics, it is also used to describe the ways in which some complex and adaptive systems may produce novel behaviors that one would not expect given the properties of the individual components. The importance of complexity theory, when used in the patient safety literature, is that it emphasizes both the interactions between local systems and their environment, as well as the strategy of first studying a given system in depth prior to engaging in efforts to change it.

 An example of this would be in the realm of changing the behavior of a given group of physicians. Prior to instituting system changes to induce desired behavior changes, the complexity theorist would first try to understand what works best in that particular practice and what attitudes or structures underlie the preferences or behaviors of the individuals. The complexity theorist may conclude that in some situations a financial incentive would be ineffective because of the way it would affect how some in the system under study interact. The study of systems in the patient safety literature is related to complexity theory in this respect, but complexity theory is, at its heart, different from other major theories of organizational behavior.

 

Computerized Physician Order Entry (CPOE)

This refers to a computer-based system for entering orders, commonly used for medications, laboratory tests, x-rays, and other diagnostic tests. Depending on the extensiveness of the CPOE system, its capabilities may be global in nature, with access to pre-loaded algorithms as well as a wide variety of other routine needs of the hospitalized or clinic patient. Many CPOE systems offer clinical decision support (CDSS), including checks for drug interactions, alerts for drug allergies, and even dose modifications and suggestions regarding routes of administration. A poorly designed CPOE system can also introduce new categories of errors, depending upon how the data are displayed and how inappropriate any forcing functions are.

 

Confirmation Bias

This falls in the category of a cognitive error. It is the tendency to settle on an initial diagnosis and then overvalue the evidence that supports this initial hypothesis, rather than to give equal weight to evidence that may refute it or suggest an alternative. An example of confirmation bias may be the emergency room physician who makes the diagnosis of appendicitis in a child with severe abdominal pain and, after an elevated white blood count is found, barely looks at the chemistry studies, which show both an anion gap and an elevated blood glucose level and suggest an alternative explanation for the abdominal pain: diabetic ketoacidosis.

 

Coordination of Care

A central need of patients is to have their care organized around their particular individual problems and vulnerabilities. With more complex chronic illnesses or severe acute episodes, failure to adequately coordinate care may result in medical errors and patient injury. There is a very high frequency of adverse events within 30 days of a hospital discharge because of lack of coordination between inpatient care and outpatient follow up. The term "hand-off" is often used to describe the method by which information is transmitted from those giving care in one setting to those continuing the care in another.

 

Crew Resource Management

This general term refers to a variety of approaches used in team training and development. Originally developed in the aviation industry, it is widely used to improve management styles and organizational cultures in environments that are both high stress and high risk. The principles of CRM training are meant to improve communication skills and develop a more mutually supportive environment among team members, not to suppress the willingness of less senior personnel to speak up when they sense a problem. In many settings, simulations and repeated hands-on training are central to building this coordination of effort.

 

Critical Incidents

This term was first widely used in the anesthesia literature to describe occurrences that are significant, pivotal, or crucial in either a positive or negative way. In a sense, critical incidents reveal not only that there is significant potential for patient injury, but often uncover latent errors or important hazards that may exist in the organization. The term "close call" is related to critical incidents.

 

Culture of Safety

The term "culture of safety" refers to a system of care that establishes a collective point of view among its members and results in them working together as a team to protect the patient from accidental injury resulting from an error. Common characteristics of a COS include the timely communication of important information, and presence of "back-up checks" in critical settings. Examples of defective culture of safety include a rigid hierarchal system where no one feels empowered to correct an error made by the leader. Another would be where inadequate staffing eliminates the coordination of care of backup checks that reduce the frequency of injurious error. Error-prone organizations are more apt to fail to analyze their own errors, and instead focus on the errors of the person at the front line of care.


Browsing Letter D

 

Decision Support

A system that provides guidance or advice regarding a clinical decision made in patient care. Decision support may be delivered by computerized decision support systems, but it may also be paper-based, using triggers or flags to guide people toward or away from certain behaviors.

 

Depression

A mood disorder that is a common source of morbidity and mortality, and is a frequently under-recognized source of a latent condition that profoundly affects the communication and behavior of the patient. Frequently a patient who becomes clinically depressed will be much more vulnerable to having an adverse clinical event, in part because of the effects of the depression on their perceptions, cognition, and ability to both communicate and receive communication from others.


Browsing Letter E

 

Error

A mistake that may be categorized in multiple ways. There are errors of commission (doing something wrong) or omission (failing to do the correct thing). There may be an error of intention, that is trying to do something specifically but missing the mark, or an error where the intention is correct but the target is wrong. An example of this would be a rule-based error (strong but wrong), where the individual correctly carries out an incorrect behavior. Examples of omission may be slips or lapses, sometimes called skill-based mistakes. An example of a knowledge based error may be a diagnostic error. Other ways of describing types of errors include errors at the blunt end of care, the so-called latent errors, and errors at the sharp end of care, the so-called active errors or failures. Perhaps the best review on this subject can be found in James Reason’s book, "Human Error".

 

Error Chain

A term commonly used in root cause analysis, referring to the sequence of events that led to a disastrous outcome. In cases where the chain is tightly coupled, one problem may directly cause the next. Key categories include breakdowns in communication, poor leadership, failure to follow standard operating procedures, ignoring individual fallibility, and losing track of objectives. Error chain analysis is commonly used in teamwork training programs.

 

Error Mitigation

A way of softening or deflecting the negative impact of an error. An example is a physician who notes that his partner has not noticed an allergy alert on a chart and incorrectly ordered a closely related drug. The physician cancels the order before the patient receives the medication, and notifies his associate so a safer alternative can be used.

 

Error Recovery

A way of responding to an error that has already occurred and whose consequences may already be unfolding. An example is a physician noting that an x-ray result from two months ago, showing an abnormal chest film with a mass, has never been signed off by a colleague. The physician makes sure that the patient is promptly informed, apologizes for the delay, and ensures that the appropriate care is given as soon as possible.


Browsing Letter F

 

Face Validity

The degree to which a statement, concept, or study appears credible, because the results are consistent with prior knowledge and assumptions.

 

Failure Mode

A form of error analysis. This may be retrospective, as in root cause analysis, or prospective, as in efforts to predict the probability of error. The prospective approach is used in failure mode and effect analysis (FMEA), a method of weighing the relative urgency of a particular error.

 

Failure Mode and Effect Analysis (FMEA)

A prospective approach to preventing medical errors that weighs the relative importance of different errors and their likelihood of occurring. A formal FMEA study involves generating three numbers, each with a range of one to ten, for the categories of:

  • Severity (S) – the degree to which this error or failure will cause catastrophe;
  • Occurrence (O) – the relative probability of this error or failure occurring in the system;
  • Detection (D) – the ability of those in the system to detect the error or failure when it occurs, before it causes harm.

FMEA constructs a criticality index (CI), the multiplication product of these numbers, (CI=S*O*D), to indicate the relative urgency of taking steps to prevent the conditions that would allow this error or failure to occur. This allows a rank ordering or prioritization of quality improvement targets. Items ranked with the highest criticality index might receive the highest priority for improvement.
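
 A small sketch of the criticality index arithmetic described above, using hypothetical failure modes and invented S, O, and D scores:

    # Hypothetical failure modes with invented S, O, and D scores (each 1-10).
    failure_modes = [
        {"name": "Look-alike vials stored together", "S": 9, "O": 4, "D": 6},
        {"name": "Verbal order misheard",            "S": 7, "O": 5, "D": 3},
        {"name": "Infusion pump programming error",  "S": 8, "O": 3, "D": 7},
    ]

    for mode in failure_modes:
        mode["CI"] = mode["S"] * mode["O"] * mode["D"]   # criticality index

    # Rank-order so the highest criticality index gets the highest priority.
    for mode in sorted(failure_modes, key=lambda m: m["CI"], reverse=True):
        print(f"CI={mode['CI']:3d}  {mode['name']}")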

 

Failure to Rescue

This concept has been used increasingly in the literature on health care quality and safety. It derives from the observation that in certain settings there is a relationship between outcomes following a serious complication and other measures of the quality and safety of hospital care. For example, an adverse occurrence such as cardiac arrest, deep venous thrombosis, or sepsis can be used. If death follows any of these occurrences, it can be considered a failure to rescue. It is also clear, however, that without a robust comorbidity or severity index, data from this method of analysis might be inaccurate, or worse, discriminate against those who care for the highest risk patients.

 

Forcing Function

A design feature that allows or prevents a particular action by requiring that another function or action be performed beforehand. An example is the removal of multi-use vials for insulin administration from a clinical unit. Another is a CPOE system that will not print a prescription until a drug interaction check has been completed.
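
 A minimal sketch of how a forcing function can be expressed in software: the desired action is blocked until the prerequisite step has been completed. The Prescription structure and its fields are invented for this illustration.

    from dataclasses import dataclass

    @dataclass
    class Prescription:
        drug: str
        interaction_check_done: bool = False

    def print_prescription(rx):
        if not rx.interaction_check_done:
            # The forcing function: the action cannot proceed until the
            # prerequisite step has been performed.
            raise RuntimeError("Run the drug interaction check before printing.")
        return f"PRINTED: {rx.drug}"

    rx = Prescription(drug="examplomycin 500 mg")
    try:
        print_prescription(rx)
    except RuntimeError as err:
        print(err)                     # printing is blocked
    rx.interaction_check_done = True
    print(print_prescription(rx))      # now allowed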


Browsing Letter H

 

Hand-offs

Hand-offs are an important component of coordination of care efforts. Specifically, they refer to the transmission of information from one care provider to another as the patient moves from one setting of clinical care to another, a transition in care. Hand-offs are such a frequent source of error, in part because of the fragmented health care system, that they are often a proper target of high-value patient safety improvements. An example of a failed hand-off is the common finding that a patient who has a hip fracture from osteoporosis will leave the hospital after surgery and return to their primary care physician without either a diagnosis of osteoporosis or treatment recommendations.

 

Health Literacy

The ability of individuals to comprehend the basic health care information they need to have in order to make decisions about their own health care and to follow health care instructions.

 

Heuristic

Alternatively described as a rule of thumb, it is typically an informal rule, the result of trial and error, which is used as a cognitive shortcut. Although heuristics are widely used and highly efficient, they also are frequently incorrect. Pattern recognition is frequently used by experienced clinicians to make rapid diagnostic and therapeutic decisions. This too can be the source of important and serious medical errors. (See Anchoring Bias and Availability Bias)

 

High Alert Drugs

Drugs involved in a high percentage of medication errors or sentinel events. High alert drugs are those that are associated with a higher risk of abuse, errors, or other adverse outcomes.

 

High Reliability Organizations (HROs)

High reliability organizations are those which, despite operating under highly hazardous conditions, have relatively few adverse events. Studies have focused on nuclear power plants and air traffic control systems as examples of HROs. In the patient safety literature, HROs are defined as having nearly failure-free performance records, which is unusual in health care; health care systems usually have much higher failure rates than domestic air traffic control systems or nuclear power plants. Four common characteristics of HROs are noteworthy:

  1. Preoccupation with failure: acknowledgement of the high-risk, error-prone nature of the work and a determination to achieve consistently safe operations. It should be noted that health care has been described as a high error opportunity system, in which decisions are made with limited information.
  2. Commitment to resilience: the capacity to contain unexpected threats before they cause harm.
  3. Sensitivity to operations: allowing autonomy for those at the point of care in identifying and responding to threats to safety.
  4. A culture of safety, in which it is safe to draw attention to potential hazards or failures without censure.

 

Hindsight Bias

This cognitive error refers to the profound way in which knowledge of the outcome affects a person's judgment regarding causality. In patient care settings, it refers to the tendency for a bad outcome to distort the perception of errors. Prior knowledge of the outcome so often distorts retrospective analysis that fair analysis of the situation under study is seldom possible.

 

Human Factors Engineering (Human Factors)

Human factors engineering attempts to optimize systems design so as to best mesh the design with the individual characteristics and abilities of humans. Ideally, both the strengths and weaknesses of the physical and mental abilities of humans should be considered in the design of optimal equipment. For example, if an external insulin pump is too complex for a patient to use without frequent confusion, one could say the human factors engineering is inadequate because of the failure to account for the human factors that degrade performance when the individual uses the complex pump. In general, human factors engineering considers a wide variety of factors affecting system design and attempts to optimize the design, taking into account the limitations of the human-machine interface. Among the human factors engineering successes we take for granted is the standardization of complex instrumentation, which makes it more familiar and easier for people to learn. HFE studies often identify where people commonly make mistakes, and add forcing functions to limit the options for use of the device in question.

 An example of this would be an insulin pump that stops delivery if no programming of the pump has taken place in 12 hours. It requires the user to actively give a command at the desired intervals.


Browsing Letter I

 

Iatrogenic

An adverse event due to the medical care itself, rather than the underlying disease. From the Greek, iatros, for "healer", and gennan, "to bring forth".

 

Incident Reporting

Reports, usually from personnel directly involved in an incident, that identify an occurrence that may have led to an undesirable outcome. Incident reporting is a form of surveillance for errors which, in hospital settings in particular, tends to capture only a very small fraction of the true number of adverse events. In some systems, allowing incident reporting to be anonymous or confidential increases the frequency of reports. They are, nevertheless, highly useful in pointing out both active and latent errors in the system.

 

Informed Consent

A critically important process, from both the legal and the educational standpoint, that provides a formal method of assuring that the patient is correctly informed about a proposed therapy or test and fully understands the implications and alternatives. Despite its acknowledged importance, informed consent is often not obtained correctly, or may not be obtained at all. The failure to obtain informed consent properly may be a key link in an error chain that leads to an adverse outcome.

 

Institute of Medicine Improvement Aims

The six IOM Improvement Aims are as follows (2001):

Care should be:

  • Safe
  • Effective
  • Patient Centered
  • Timely
  • Efficient
  • Equitable

These six aims have been widely accepted as six essential goals of medical care.


 

Browsing Letter J

 

Just Culture

A recently coined term, popular in patient safety writings, which describes a culture in which personnel at the point of care are comfortable discussing their own errors and those of others while maintaining professional accountability. The distinction commonly drawn is that individual practitioners should not be held accountable for system failings over which they have no control, but there is zero tolerance for willful disregard of severe risks to patients or for gross misconduct. In short, a just culture accepts that competent professionals make errors, but does not tolerate reckless behavior or conscious disregard of crucial safety principles.


Browsing Letter L

 

Latent Error

Latent errors are commonly referred to as errors at the blunt end of care, in contrast to active failures, which are errors at the sharp end of care. The terms active and latent were originally coined by James Reason. See the definition of Active Error.

 

Learning Curve

This refers to the reality that performing any new skill is frequently associated with higher than expected complication rates and lower than expected success rates at first. It is one reason why new procedures are often associated with poorer outcomes than the same procedures once they become more established. It is another reason why the training of operators should take place under the close supervision of more experienced operators. It also highlights why simulations, whether used in teaching the correct procedure for fine needle aspiration of the thyroid or for CPR, make a great deal of sense.


Browsing Letter M

 

Medication Reconciliation

The formal process of reconciling the list of medications that a patient is actually taking at a given time with any previous or incomplete information that is available. It is generally recommended that this process be repeated every time the patient has a transition of care, especially when moving from one provider to another. The hazards of not having an accurate medication reconciliation are great, because potential drug interactions would be unknown, and duplication of medications and overdosage may occur. Even the understanding of the current clinical data regarding the patient will not be complete if the complete list of medications the person is taking is not available.

 The process of medication reconciliation can often be utilized to identify other problems, either in the system of care, or relating to cognition or communication problems. For example, a patient may not understand what they are taking, why they are taking it, and what the significance of the medication is. Medications may sound alike. Often the medication that one family member thinks the patient is taking may not be what another knows it to be. This central process, however, when it is performed, should result in a list that has a time and date associated with it, so that others who view it can see whether it has been superseded by new orders. Common omissions include not including over the counter medications, herbal medications, or illicit medications ranging from inappropriate steroid use to drugs of abuse. The complexity of medication reconciliation is often daunting. Its importance, however, is very great.

 

Mental Models

A psychological representation of real or hypothetical situations. Mental models are commonly used to both anticipate and explain events. Mental models may also include scripts or processes, and may lead to differing expectations. For instance, a physician may infer from a patient's demeanor that the patient is merely profoundly depressed, when the correct diagnosis is actually hypothyroidism with secondary depression and cognitive impairment.

 

Metacognition

A highly useful method of reflecting on one's own thinking processes and judging whether cognitive biases or shortcuts may have adversely affected one's decision making. Metacognition might lead a clinician to examine the cognitive bias that led to a rapid diagnosis that was not based on fact. Despite its usefulness, this method is underutilized.

 

Mistakes

An alternate way of describing errors is to distinguish between slips and mistakes. On the one hand, attentional behavior is characterized by active problem solving with conscious thought, analysis, and planning. Schematic behavior, in contrast, refers to automatic activities, the activities we perform as if by reflex. Using these terms, mistakes reflect failures during attentional behavior, as opposed to lapses in concentration, as in slips. Mistakes are more apt to be knowledge-based: an incorrect interpretation or the application of an incorrect rule. Slips tend to be oversights and are more readily corrected. Slips are more apt to occur with competing distractions, whether fatigue, emotion, or stress. Mistakes are more often due to lack of experience or inadequate training or knowledge. The frequency of slips can be reduced by changes in the design of the workplace, using algorithms or checklists, reducing fatigue of personnel, reducing stress, and decreasing the variation in design of key devices. The remedy for a particular mistake or slip should relate to the strategies that will best solve that particular problem. In redesign for safety concerns, one size does not fit all.


Browsing Letter N

 

Near Miss

See "Close Call"

 

Normal Accident Theory

A theory developed by the sociologist Charles Perrow, based on a careful analysis of the Three Mile Island nuclear power plant accident in 1979. The central tenet of the theory is that in such settings major accidents are inevitable and therefore "normal". Perrow proposed two factors that create such an environment: complexity and tight coupling. Complexity means that no single operator can see all the consequences of an action in the system. Tight coupling occurs when processes are intrinsically time dependent: once a process has begun, it must be completed within a certain period of time. Hospitals would be regarded in this sense as exhibiting both tight coupling and complexity. The theory has provided some extraordinary insights into possible failure modes for hospitals.

 

Normalization of Deviance

This term was coined by Diane Vaughan in her book, "The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA". Vaughan used the term to describe a gradual shift in what is considered normal after repeated exposure to deviant or unsafe behavior. This theme has been taken up by others, including Mark Chassin and Elise Becher, who coined the term "a culture of low expectations" to describe a health care system that routinely produces errors, desensitizing health care providers to malfunction.


Browsing Letter P

 

Patient Safety

Freedom from accidental or preventable injuries produced by medical care.

 

Pay for Performance (P4P)

The general strategy of changing behavior and promoting quality improvement by financially rewarding providers who meet certain performance standards relating to some measurement of health care quality or efficiency. There is very limited data showing the efficacy of pay for performance models in the United States. Barriers that are currently evident include the difficulty or unwillingness of purchasers of health care to use clinical data with risk stratification to make their judgments. This may be due to the widespread lack of electronic health records and difficulties in exporting data in a secure fashion for analysis.

 

Performance Measures, Performance Measurement

A formal process in which conservative, evidence-based measures of clinical excellence are derived from the clinical guidelines of professional societies. A performance measure set refers to a grouping of measures regarding a particular clinical condition, used for both quality improvement and accountability. Performance measures are widely used by both government and private sector payors to evaluate clinical quality of care, and have been used in both pay-for-participation and pay-for-performance programs. An example of the former is PQRI, the Physician Quality Reporting Initiative of the Centers for Medicare & Medicaid Services, which began in 2007.

 

Plan-Do-Study-Act (PDSA)

A cycle of activities advocated for process or system improvement, first proposed by Walter Shewhart, and popularized by his student, quality expert W. Edwards Deming. The PDSA cycle is one of the cornerstones of Continuous Quality Improvement (CQI).

  • Plan: Analyze the problem you intend to improve and devise a plan to correct the problem.
  • Do: Carry out the plan (preferably as a pilot project to avoid major investments of time or money in unsuccessful efforts)
  • Study: Did the planned action succeed in solving the problem? If not, what went wrong? If partial success was achieved, how could the plan be refined?
  • Act: Adopt the change piloted above as is, abandon it as a complete failure, or modify it and run through the cycle again. Regardless of which action is taken, the PDSA cycle continues, either with the same problem or a new one.

 

Potential ADE

A potential adverse drug event is a medication error, or drug related mishap, that reached the patient but did not produce injury. An example is the administration of a drug to a patient who was allergic to that drug, but no harm resulted.

 

Production Pressure

The pressure to put quantity of output for a product or service ahead of safety. In health care, production pressure refers to the delivery of services: the pressure to fit as many patients into the schedule as possible, or to provide as much education as possible within a given time period, for example in diabetes care. Production pressure produces an organizational culture that resists any attempt to slow the process down for any reason, including safety concerns. Many errors can be traced to the pressure to work as fast as humanly possible, regardless of the consequences.


Browsing Letter R

 

Rapid Response Team (Medical Emergency Team)

A team, new to many hospitals, that responds not just to cardiac arrests but to a wide range of acute changes in a patient's clinical status. A striking feature of such teams is that anyone in the hospital hierarchy can initiate the call without approval from more senior personnel. There is excellent documentation of the value of such programs in the best hospitals; part of the reason they prove cost effective is that the team is usually kept busy averting near misses.

 

Red Rules

Rules that must be followed to the letter. Any deviation from a red rule will bring work to a stop unless compliance is achieved. A common example is: no hospitalized patient can undergo a test of any kind unless they have an identification bracelet. If the ID is not present, then all work must stop in order to verify the identity and provide an identification band. Red rules are, by definition, ones that will be supported by top management and the entire organization. They should be clear, easy to follow and remember, and have a particular importance in patient safety activities and foster a culture of safety.

 

Reliable Organizations

In health care, reliability is defined as the measurable capability of a process, procedure, or health service to perform its intended function in the required time under commonly occurring conditions. A highly reliable organization accomplishes its desired goals the vast majority of the time and carefully studies its systems to minimize error, focusing on improving safety through systemic improvement.

 

Root Cause Analysis (RCA)

A retrospective structured process for identifying causal and contributing factors underlying critical incidents or adverse events. This is a highly structured and comprehensive analysis that will always involve multiple categories of analysis before coming to a conclusion. It is based on a detailed catalogue of the events leading up to the event under review. There is considerable question as to whether most adverse events have one true root cause; more often there are multiple contributing factors.

 

An alternative analysis is to focus on systems analysis as the major way to understand the relationships between the different parts of the health care system and their role in producing catastrophic error.

 

Rule of Thumb

See Heuristic

 

Run Charts

A quality control graph, a form of statistical process control, plotting an observation over time to see if there are runs of points above or below a center line, usually the average or median. Run charts are commonly used in health care quality monitoring.
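
 A minimal sketch of the idea behind a run chart, using invented weekly observations and a median center line; a real application would plot the points and apply formal run rules.

    from statistics import median

    # Invented weekly observations, e.g. minutes of delay in a process.
    observations = [12, 14, 11, 13, 15, 16, 17, 18, 16, 17]
    center = median(observations)    # center line

    for week, value in enumerate(observations, start=1):
        side = "above" if value > center else "below" if value < center else "on"
        print(f"week {week:2d}: {value:3d}  ({side} the center line)")

    # A long run of consecutive points on one side of the center line suggests
    # a real change in the process rather than random variation.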


Browsing Letter S

 

Safety Culture

Also called "a culture of safety". Safety culture usually embodies characteristics of high reliability organizations (see HROs).

 

Scope of Awareness

The scope of awareness refers to the fact that the person who initiates a therapeutic maneuver will seldom be able to see all of the potential consequences of what they have intended to do. Unknown comorbid conditions and interactions with other individuals may all introduce unanticipated consequences. In a culture of safety, the initial plan will be supported by members of the team, who can intercept unintended deviations from the intended therapeutic result and protect the patient from harm. When a system is under stress, for example because of production pressures or a lack of resources, the scope of awareness typically narrows, and the person writing the order may be even less aware of what the unintended consequences may be. An example is the physician who, with 20 minutes to spend with a patient, will carefully analyze the diagnostic options, but with only 5 minutes or less will be much more apt to rely on pattern recognition and introduce cognitive biases during the course of the analysis.

 

Sense Making

A term coined by Karl Weick in his book "Sensemaking in Organizations". It refers to the process by which an organization takes in information to make sense of its environment, to generate knowledge, and to make decisions, and to what happens when organizational sense-making breaks down. The consequences of such breakdowns have particular relevance for health care institutions.

 

Sentinel Event

An adverse event in which death or serious harm to a patient has occurred and the events are neither expected nor acceptable. A sentinel event is often an egregious error. It is common that investigation of such events will reveal many problems in current policies and procedures.

 

Sharp End

The personnel or parts of the health care system that are in direct contact with patients. In contrast to the layers of the health care system which are at the "blunt end" of care. (See Active Error)

 

Situational Awareness

The degree to which one's perception of the situation matches reality. In crisis management, situational awareness includes a wide variety of different and simultaneously occurring states, such as stress and fatigue among team members, appropriate immediate goals, environmental threats to safety, and deterioration in the status of the patient. During a crisis, the scope of awareness of individuals is often adversely affected; for example, a patient with a myocardial infarction may also develop severe hyperglycemia that remains unnoticed and untreated, and adversely affects left ventricular function, while the team is tending to the other immediate consequences of the acute infarction. It is always helpful to have an individual who focuses on maintaining situational awareness and is aware of the "big picture." Failure to work together collaboratively often creates an unsafe condition for the patient.

 

Six Sigma

The attempt to perfect a process or product to the degree that the defect rate is 3.4 per million opportunities, or six standard deviations from the population average. When it is applied to health care, which typically has failure or defect rates in the 2-5% range, Six Sigma performance will invariably require changes to the normal workflow; in fact, it is used when the normal workflow is thought to be playing a role in the unacceptably high defect rate. An example of an appropriate application in health care would be the effort to reduce catheter-related bacteremia rates to zero.
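
 A small sketch relating a sigma level to defects per million opportunities, using the conventional 1.5-sigma long-term shift; the printed values are approximations for illustration.

    from statistics import NormalDist

    def defects_per_million(sigma_level, long_term_shift=1.5):
        """Approximate defects per million opportunities for a given sigma level."""
        defect_rate = 1 - NormalDist().cdf(sigma_level - long_term_shift)
        return defect_rate * 1_000_000

    for sigma in (3, 4, 5, 6):
        print(f"{sigma}-sigma process: about {defects_per_million(sigma):,.1f} defects per million")
    # A 6-sigma process yields about 3.4 defects per million, while typical health
    # care failure rates of 2-5% correspond to 20,000-50,000 per million.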

 

Slips or Lapses

See Mistakes

 

Standard of Care

Often defined as the care expected from a reasonable practitioner with similar training practicing in the same location under the same circumstances. The standard of care may vary by community, depending on the resources. It is, however, thought to be defined in terms of the condition rather than the standard of a particular physician, i.e. generalist vs. specialist.

 

Structure-Process-Outcome Triad

Quality can be defined as the "degree to which health services to individuals and populations increase the likelihood of desired health outcomes and are consistent with current professional knowledge." Experts such as Avedis Donabedian have proposed that quality can be measured using aspects of care with proven relationships to desirable patient outcomes, such as process and structure measures. An example of a process measure is the use of aspirin and beta blockers for myocardial infarction. An example of a structural measure would be a dedicated diabetes unit for the care and education of patients with new-onset diabetes.

 

Swiss Cheese Model

A model developed by James Reason to show how analyses of major accidents and catastrophic system failures reveal multiple smaller failures leading up to the actual event. In the model, each slice of cheese represents a safety barrier, and each is imperfect; when the holes align, or when there are too few barriers or too many holes, an error can pass through and cause patient injury. Poorly designed work schedules, lack of teamwork, lack of communication between team members, and language barriers can all work together to cause injury in a complex system.

 

System

In patient safety parlance, a system is a set of interacting, interdependent elements that work together in an environment to perform the required functions in order to attain their goals. Some describe a system as multiple layers of an onion, or as multiple intersecting groups that may behave in complex and novel ways, not always in harmony with each other.

 

System Layers

This is derived from the "Crossing the Quality Chasm" Institute of Medicine report. In this report they identified multiple layers of the health care system that affect the ability to improve care:

  • The patient experience;
  • The functioning of microsystems;
  • The functioning of macrosystems, or large organizations that support the microsystems;
  • The environment (policy, payment, regulations) that shapes the intent, behavior, and opportunities of organizations.

 

System Under Stress

See Scope of Awareness

 

Systems Approach

In contrast to the more traditional "blame and shame" method of treating medical errors and quality problems as personal failings on the part of individual providers, the systems approach takes the view that poorly designed systems predictably and greatly increase the risk of human failings within them. For example, placing inexperienced personnel in complex and unfamiliar situations will predictably cause errors, just as exhausted personnel perform poorly, especially in stressful and distracting situations. The systems approach tries to identify the factors most likely to cause human error, and then to reduce the prevalence of these factors or minimize their impact. The systems approach will also tend to provide backup checks so that an error that does occur is prevented from causing harm. The systems focus pays attention to human factors engineering and to latent errors at the blunt end, which often have extraordinary power to influence actions at the point of care.


Browsing Letter T

 

The Five Rights

The five rights are defined as administering the right medication at the right dose at the right time by the right route to the right patient. This is thought to be a fundamental statement regarding nursing responsibilities in safe medical practices. However, modern analysis of culture of safety issues points out that the focus of the five rights is on individual performance, not team performance. Also, both human factors and system design issues, such as poor lighting, workload, and distractions, may confound the efforts of even the best nurse to perform the five rights flawlessly.

 

Time Outs

Planned periods of interdisciplinary discussion to ensure that key procedural details have been addressed. A time out is typically held prior to a surgical procedure to confirm the identity of the patient and to double check the procedure and subsequent plans. The concept has broad applicability to other settings where a critical procedure is about to begin.

 

Transitions in Care

See Hand Offs

 

Triggers

Signals used to detect likely adverse events. An example of a trigger might be the use of glucagon to treat hypoglycemia. In retrospective reviews, triggers have great potential as a patient safety tool, provided they are chosen carefully so that the trigger does not generate false alarms in higher proportion than real ones.
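
 A minimal sketch of a retrospective trigger review: medication administration records are scanned for "trigger" drugs whose use suggests a possible adverse event. The trigger list and records below are invented for illustration.

    # Trigger drugs and the adverse events their use may signal (illustrative list).
    TRIGGER_DRUGS = {
        "glucagon": "possible hypoglycemia from insulin or a sulfonylurea",
        "naloxone": "possible opioid over-sedation",
        "vitamin K": "possible over-anticoagulation with warfarin",
    }

    # Invented medication administration records.
    administrations = [
        {"patient": "A", "drug": "insulin"},
        {"patient": "B", "drug": "glucagon"},
        {"patient": "C", "drug": "naloxone"},
    ]

    for record in administrations:
        reason = TRIGGER_DRUGS.get(record["drug"])
        if reason:
            print(f"Patient {record['patient']}: trigger '{record['drug']}' - {reason}; flag chart for review")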


Browsing Letter U

 

Underuse, Overuse, and Misuse

Underuse is the failure to provide a service when it would have provided a favorable outcome for the patient. An example would be failure to provide an influenza immunization for a fragile diabetic patient.

 Overuse refers to providing a process of care where the potential for harm exceeds the potential benefit, such as the use of powerful antibiotics when they are not indicated.

 Misuse occurs when an appropriate process of care has been selected, but a preventable complication occurs. An example is prescribing excessively high levels of thyroid hormone replacement in a patient with thyroid cancer. The treatment not only suppresses the TSH level completely, which is intended and correct, but also leads to severe osteoporosis and atrial fibrillation due to the excessively high free thyroxine levels.


Browsing Letter V

 

Vulnerable System Syndrome (VSS)

James Reason has identified systems more prone to adverse events than others. The features that characterize these systems are three reinforcing elements:

  • Blaming those at the point of care;
  • Denying the existence of systemic weaknesses;
  • Pursuing the wrong kind of excellence.

An example of the wrong kind of goal would be a production-based goal, such as the maximum number of visits per day. Another example would be using waiting times as the major measure of success.


Browsing Letter W

 

Workaround

A method used by personnel at the point of care to bypass certain features of their system in order to accomplish their work. For example, when the pharmacy proves to be excessively slow in providing insulin to a hospital unit, a nurse may respond by obtaining a multiple-use vial of insulin and leaving it in a prominent place in the nursing station so it is available when she wants it. Though this solves the problem of the delay in receiving insulin from the pharmacy, it may create another potential problem: the vial may be confused with a vial of an unrelated medication, such as heparin, that is also kept at the nursing station. In the analysis of medical errors, workarounds are often problems that must be resolved, because the ad hoc decision of the person at the point of care to solve one problem often creates another latent error or condition that may be the source of a later, even more important error. The presence of workarounds should alert people to the fact that there is probably a serious problem in system performance that led to the decision to create the workaround, and correcting that underlying problem is just as important as preventing the workaround.
