Why a “y”? Perhaps it was a typo at first and I decided to go with it anyway, convincing myself it was the right thing to do, to grab your attention, and because it contained a hidden lesson or a double meaning, a portmanteau, crossing “mistake” and “mystery”. Or maybe I was just too lazy to go back and fix it, and would rather expend the effort to justify my action, even though it is clearly wrong. If you lisp the word, it is “myth take”. Or maybe I just love to play with words, like a toddler with food.
There are so many different ways that people confront mistakes.
Keep asking why, and the reason will come out, eventually.
Hospitals are dangerous places, full of fragile patients and complicated operations where the smallest mistake can have dire consequences. Disasters happen. The staff fails to question a woman dressed as a nurse, leaving her free to abduct an infant. A mop and bucket come into the hands of a martial arts champion having an emotional crisis, so he cracks the mop in two, creating a spear, and then he swings the bucket filled with water like a wrecking ball. A high wind blows a construction crane into the side of the building. Rival gang members shoot each other, arriving together in the emergency room to continue their confrontation, along with the rest of the gangs, who face off in the waiting room. An aging air-conditioning unit grows mold, infecting immunocompromised patients. And then there are the routine matters—falls, medication errors, retained sponges, respected professionals who practice well past their ability to practice well, fires, hurricanes, suicides.
Risk Managers are supposed to prevent disasters, or at least minimize their effects. Most of the time, that means dealing with the disaster immediately after it happens to their institutions. The other kinds of disasters—the ones that happened somewhere else or long ago, or the ones that haven’t happened at all but hypothetically could—tend to slide off the list of high priorities. One thinks of the long period when Florida enjoyed no serious hurricanes, the long stretch preceding Hurricane Andrew, or the century that stretched between global pandemics.
The Risk Manager confronts a disaster using a variety of tools. The most widely used of these tools for retrospective examination is the Root Cause Analysis. In simple terms, it means you gather the people most directly involved and begin to ask why the disaster happened.
After the reality unfolds, all the steps that might have been taken to prevent or blunt the disaster present themselves with increasing clarity the farther they are viewed in retrospect.
So, if you have a disaster in which the kingdom was lost, a root-cause analysis might yield the following result:
“For the want of a nail the shoe was lost,
For the want of a shoe the horse was lost,
For the want of a horse the rider was lost,
For the want of a rider the battle was lost,
For the want of a battle the kingdom was lost,
And all for the want of a horseshoe-nail.”
Conventional wisdom holds that you limit the inquiries to five levels of asking why, the way Benjamin Franklin famously did, above. The whys never end, really, though the wise do. There is an inevitable conflict that continually shifts back and forth between assigning responsibility to fix a problem and fixing blame to assign responsibility for the problem. You’ll find yourself lost in a rabbit hole of paradox and contradiction if you aim for the highest intellectual truth, or the lowest, like a toddler asking why, spurred at first by curiosity, then letting it become a game until it becomes a joke, and one at your expense.
If you go back to Ben Franklin’s chain of causation, the kingdom wasn’t really lost due to the fault of a blacksmith who skimped on a nail, or a rider who skimped on replacing the horse, or a general who skimped on an army so that the loss of a single rider made a difference, but rather a combination of all of them.
Mistakes are part of the waters of life. You could think of an enterprise as being like a river, with mistakes as wayward currents straining against the banks, continually testing the boundaries, but the wayward currents fall back into place and go with the flow—most of the time. Every now and then, those wayward currents surge into one another, coalescing into a mighty torrent that builds into a deluge.
Take a look at a type of disaster that happens all too often in a healthcare setting. A nurse gives the wrong medication to a patient and the patient dies. It is heartbreaking, and tragic. The first impulse is to punish the nurse. It seems very simple.
But maybe there were contributing factors. Maybe the medication had a misleading label, or maybe it was the medication the doctor had ordered, though the nurse should have questioned it, or maybe the pharmacist gave the nurse the wrong dose and it should have been caught, and maybe the nursing unit was seriously understaffed that night and full of concurrent emergencies and demands. Maybe many of these factors were in play simultaneously. And maybe the medication had nothing to do with the patient’s death.
As much as possible, it helps to analyze the disaster in a way that avoids assigning blame to individuals. The inquiry process works better when the participants don’t feel they’re under attack. They tend to be more candid that way. You get more information. It helps when people look squarely at their actions, accepting responsibility for what they have done, even recognizing that it led to, or contributed to, a bad result. Those are the kind of people most apt to learn from a mistake.
It is the sign of a healthy culture when this kind of blameless and candid examination is able to thrive. It isn’t easy. It runs counter to human nature.
I once asked a medical malpractice defense attorney to tell me what he would have done differently in a case he lost. He thought about it, seriously, for about a minute. “Nothing,” he said, and said nothing else. It was one of the worst answers he could have given me, but he didn’t understand why, for the same reasons he didn’t look critically at whatever decisions he had made that resulted in the case being lost. At least he didn’t try to bullshit his way around his mistakes, which is something that lawyers often do, with great skill.
If you go back to Ben Franklin’s example, you might want to pin the blame for losing the kingdom on a negligent blacksmith. And there’s a beguiling simplicity in treating the chain of causation in that way. Look for someone to blame. It bolsters the social value of personal responsibility. My experience with disasters shows that most of the time, it isn’t the fault of one person making a big mistake. Disasters are more often the result of a confluence of small mistakes ricocheting off one another. In any endeavor, especially in a complex endeavor like the delivery of healthcare, mistakes happen all the time. Mistakes are unavoidable.
There is a connectedness between the roles and responsibilities of the individuals engaged in the enterprise, the psychology of those individuals, their cultures, biases and propensities and those of the organization itself, and those of the community and society in which the culture functions. Each component has a distinct identity, a distinction that shimmers between being important on its own and important only as part of the totality.
There’s a useful metaphor for the dynamic interchange between these polarities in the philosophy surrounding Tai-Chi. In simple terms, it recognizes the unity between seeming opposites and promotes the values of flowing energy between them while maintaining balance and harmony. That principle can be a useful tool in gaining orientation in the midst of a disaster. The solution is very simple. You know it at once until you look deeper.
Simplicity and certainty, complexity and uncertainty can all be strengths or weaknesses, and maintaining balance, harmony, and flow between them can be a solution—so long as one doesn’t lose sight of the embedded contradiction.
Part of the nature of flowing between seeming opposite values is the principle that every strength eventually becomes a weakness, and every solution eventually becomes a problem.
Part of the time, and part of the staff, want to be told exactly what to do. And the other part of the time, and the other part of the staff, want the latitude to do things their own way. The specific times and specific staff members don’t sort into neat categories, and often overlap.
Tai-Chi promotes the notion that one can achieve effortless perfection by repetition and practice. Organizations often strive to reach the point of efficiency where operations function on automatic, like highly disciplined Tai-Chi practitioners performing their fluid routines. The motions seem to well up from an external source. The task assumes a kind of neutrality, like a surgeon who has sutured so many lacerations she might as well be sewing a garment. Or like driving to work every day by the same route—you insert your key into the ignition, and the next thing you know, you’re in the company parking lot. Practice makes perfect, or so some have said.
No individual, no organization, can attain true proficiency without going through a learning curve, stumbling at first, making mistakes— encountering the wrong way to do things, and avoiding it. And the right way to do something is subject to frequent changes.
And sometimes effortless perfection itself is the source of a disaster—that automatic functioning—the AutoMagic. Perfection doesn’t exist, except as an ideal, an illusion, though ideals and illusions are as important as recognizing that imperfections are unavoidable. In practice, there are always abnormalities, biases, glitches, and mistakes. Someone fails to appreciate a slight variation in the routine, a nuance that makes all the difference in the world. A variation breezes past because the participants had become uninvolved with what they were doing. Or one of them was too confident. Or not confident enough. Or too preoccupied with avoiding a previous mistake, so that he or she focused on the wrong aspect of the task. Or one or more of the participants put too much reliance on the others to perform repetitive checks and balances that each one should have been performing. Or, like the malpractice defense attorney who had done everything perfectly and still lost the case, got caught up in the competition and lost sight of the fact that his objective was wrong—that he should have stopped trying to win the case and should have focused on getting it settled.
Risk Management is best performed as a multi-disciplinary joint effort, with representatives from the affected parties. Not uncommonly, participants grope at a problem like blind men at the elephant, each sharing insights from a limited point of view. To maintain the flow of the process while holding onto the values of balance and harmony, it is useful to hold to the following ideals, even if they are illusions. Everyone present is working toward solving the problem, and each has a unique perspective that plays a vital role in reaching the solution. Everyone should approach every contribution as if it were true, because there is some context in which it is actually true. Is it true as a probability, or only in theory? Is it true as an ideal, something that should be? Is it scientifically proven to be true? Is it true in the past? Is it true as a shortcut? Is it true as an ambiguity? Is it true as a matter of faith? Is it true as a belief? Is it spiritually true? Legally true? Expediently true? Conveniently true? Hypothetically true? Artistically true? Beautifully true? And once you determine the context, ask if it is a context that will lead to the goal you want. It doesn’t matter what is actually true, as long as the organization does the right thing. It doesn’t matter if it is true, as long as it works. These ideals are myths, and contain embedded contradictions when followed through to their logical conclusion. But they work as great myths work. They transform cultures.
Not uncommonly, the participants approach the process as if it were a zero-sum game. The doctors might be looking to the finance managers to invest in new facilities or equipment or hire more staff. These tend to be generic solutions to most hospital disasters. The doctors are fighting for the lives of their individual patients while the finance managers are fighting for the life of the organization. There is a risk that the Risk Management process falls prey to the spirit of competition. The discussions become tactical and combative, full of red herrings, exaggerations, rhetorical ploys and theatrics. The goal is winning. Neither side cedes the moral high ground, with the value of even a single human life outweighing mere money, though if finances fall below a certain level, the quality of the institution’s services will fall with it, creating dangers for all patients.
Sometimes the organization actively encourages competitive Risk Management—sometimes without even realizing it, because competition is often an effective means of settling disputes. That bias is ingrained in many corporate cultures, as it is ingrained in American society itself, in the laws, in the politics, and in commerce. The organization defaults to a competitive approach. The participants become home-blind to the way they are competing, losing sight of the common purpose. If you look at the enterprise holistically, you can see intense competitions over a choice that amounts to nothing more than whether to pay the same amount of money out of the organization’s left pocket or its right pocket.
Instead of solving problems with a competitive process, you could resort to the opposite—an extreme of cooperation. One way is through bureaucracy, establishing a chain of command to address the decisions that involve unusual circumstances, or ambiguities, or unanticipated nuances. Bureaucracies tend to be slow moving and cumbersome. Organizations that are “flat” tend to be more agile, with empowered employees working at multiple levels. But that approach won’t work well for the part of the staff that wants to be told exactly what to do and how to do it, or for management that doesn’t want to pay at a level commensurate with the kind of decision-making skills needed to address ambiguities, nuances, and unusual circumstances.
You could spread responsibility for given tasks, so that all participants are supervising each other. They each might repeat a task that really only takes one person to perform—like counting instruments during surgery, or checking a patient’s name band, or reading a medication label. Most of the time, the task really doesn’t need to be repeated, so it slows down the process and adds to expenses. Sometimes, the checks and balances don’t even fix the problem, because the staff ends up passively accepting the results of what the first person was supposed to do, since that was the person with the main responsibility in the first place.
People and organizations try to conserve energy with such fervor, you’d think they were obeying a law of physics. This creates a natural tension between focusing on the role of process to correct a problem and focusing on the role of the individual participants. If you put too much reliance on the process, you remove the incentive for individual participants to take responsibility. A chore owned by everyone belongs to no one. If you put too much reliance on each individual, you ignore the constant of fallibility. A bias toward one or the other is part of each person’s orientation, and each culture’s, corporate or otherwise.
Rather than defaulting to using competition as a problem-solving tool, the organization may default to measuring value in monetary terms because that approach is ubiquitous in the larger culture. Money presents as a useful metaphor for measuring the way an organization expends and conserves its energy. That approach wears a veneer of science, but it is a thin veneer.
When there is a disaster, the solutions tend to be generic, a combination of one or more of the following. Counsel or retrain or discipline certain participants. Monitor their performance. Redesign the process, often in a way that shares responsibility among participants, building in checks and balances. Replace participants or products. Hire more people. Buy new products. Build more facilities. Eliminate the service line, or delegate it to an outside party.
You can often read the history of disasters past in a hospital’s policies and procedures. Those with redundancies and additional checks and balances present like the calcium deposits of old bone fractures on a radiograph.
Every person, and every culture, develops a level of tolerance for errors. Most of the time the errors are of no consequence. The wrong medication was given, but it worked anyway because of the placebo effect. Or it made the patient vomit, but he was fine afterward.
Sometimes the disasters arise as a result of tolerated dysfunctions that began as solutions to prior problems—but circumstances changed and no one noticed. Or dysfunctions that once were strengths but have transformed into the opposite due to overuse.
Because errors are ubiquitous and tolerated, within limits, they often get swept into the process, and after a while, dysfunction becomes part of the design. Like the way physicians might omit an informed consent discussion under the assumption the nurse had done it, and vice versa, with no one being the wiser as long as the consent form is signed, a myth that permits the omission of a job no one likes. Or health insurance companies that issue denials of coverage on reflex, and only back down for the loudest complainers, thereby saving the company money. Or rituals of cardio-pulmonary resuscitation, when the providers know it will do no good but a family member is insisting upon it. Or automated answering systems that save money for those who implement them but waste the time of those who call them. Or diagnostic testing to carefully monitor the course of a dying patient, though no medical decisions depend on the information. Or marketing that promises more than it delivers, but shorts the buyer just shy of being actionable. Or marketing that promises nothing, but does it so grandly that people spend money on things they don’t need and stop wanting once they gain possession. Or the way that the institution of health insurance allows people to tolerate rising medical costs, though the institution requires an infrastructure so costly as to rival the service it is supposed to support. Or the painkillers that create an addiction and cause more damage than the pain they were supposed to ameliorate. Or traffic accidents, a species of disaster that spawns whole industries around it, though they could be avoided almost entirely by strict adherence to rules and lower speed limits. People complain, but they learn to tolerate the dysfunction.
During the initial stages of the AIDS epidemic, the providers on the frontlines tried to get as many patients tested as possible—even the patients who had no symptoms and didn’t need the test results as part of their treatment. And even when there had been no staff exposure to blood or body fluids. The organization sought to dampen down the fervor for testing—reminding the providers to practice universal precautions, treating every patient as if they were HIV positive. But still the providers protested—claiming they handled patients differently when they actually knew their status.
It was a prime example of the principle that you don’t have to know the actual truth, but you do need to do the right thing. Testing, in this instance, could actually prove counter-productive by inspiring a false sense of security—particularly given that in the early stages of infection, patients might test negative but still be capable of transmitting the disease.
It took time, but eventually, universal precautions became routine. They became part of the culture within the hospital.
The issue arose again during the early stages of the Covid-19 pandemic. Prior to the vaccine, there was no way to tell for sure if someone had the virus or could transmit the virus. Even if they had it and recovered. At that point, everyone—virtually everyone on the planet—needed to be practicing bilateral universal precautions, not only treating everyone else as if they were infectious, but also treating themselves the same way. It also demonstrated the kind of conundrum a Risk Manager faces when trying to correct a problem by imposing a rule. Theoretically, if everyone scrupulously held to bilateral universal precautions, wearing masks and maintaining a six-foot distance from others, the risk of virus transmission should be low enough to allow loosening the lockdown. Putting the theory into practice showed, at least as of this writing, that people, or at least a significant percentage, could not be trusted to maintain even this modest discipline. One might hope the culture will change.
Risk Managers tend to adjust their rules to take into account the way the boundaries will be tested, setting them higher than what is actually needed. Like when you need motorists to drive no more than 40 miles an hour to keep a road safe, so you set the speed limit at 35, knowing that impatience will usually make drivers test the speed limit up to five miles an hour faster.
Once the Risk Manager identifies a set of corrective actions, there’s tension in the optimal way to implement them. The term optimal is itself a trigger—that is to say, the best way possible, given the assumption that all participants will scrupulously adhere to the prescribed actions.
In some instances, the rule needs to be set at the optimal level—like when the risk of a mistake is high and the consequences dire. Or when you have an authority, like the government or an administrative agency looking over your shoulder, or the public, when the mistake has hit the press.
Setting the rule at the optimal level also assumes there are no variations or nuances that might come into play—no ambiguities that call for the exercise of individual judgement. But the tributaries in the waters of life are as filled with variations and nuances and ambiguities as they are with the downstream mistakes.
When the Risk Manager commits to the optimal standard in writing, the rule has to be realistic and achievable by the people responsible for implementing it. You might strive to prevent patient falls by setting a five-minute standard for nurses to respond to patient call signals. Most of the time, nurses won’t be able to meet that standard. So, it becomes a liability trap if the patient climbs out of bed and breaks a hip after waiting ten minutes. Some organizations adopt detailed patient assessment tools to identify those at risk for falling, including assessments of infirmity, drug level, and cognition, so as to triage urgency for responding to call signals. Other institutions might skip the assessment step and use the fall prevention equivalent of universal precautions. Every patient sick enough to be in a hospital might be considered at risk for falling.
Because of the risk of creating a liability trap, some Risk Managers try to avoid putting optimal standards in writing. They might try to hedge the standards with legalistic disclaimers—something like this: “The standards stated here are guidelines for staff, to be adjusted in accord with personal professional judgement. These guidelines are not intended as statements of the standard of care, nor may they be used in connection with establishing liability for personal injury claims.” These types of disclaimers are a kind of lawyerly wishful thinking, an attempt to have it magically work both ways, setting a standard without setting one. But lawyers think that way all the time.
If you don’t commit the standard to writing, you run the risk of leaving it to oral tradition. Hospitals are often rich in this kind of folklore, and often the staff adheres to rules they think are written down, somewhere, though no one can ever find them.
There’s a kind of comfort to be had in a comprehensive, well-written rule. It is there for the staff to consult whenever the need arises, to tell them the way things should be done. For management, it works like a legal decree—it shall be done, though it might damn the institution when it isn’t. There is something to be said for the magic that is worked by a written rule. People are more prone to obey it, and the effect works like a posted speed limit, even if there is no cop in sight.
Some Risk Managers have a bias against written rules, claiming the staff doesn’t read them anyway. That’s a notion more likely to come to pass when the policy is weighted down with legalistic disclaimers, like the one above, or complicated hedges that defer to personal professional judgement. The latter puts the burden on individuals, demanding that all multiple conflicting priorities be taken into account, and all actions taken with the kind of preternatural wisdom that prevails in most legal controversies—that obtained in hindsight.
There are a variety of Risk Management approaches to deal with these competing priorities.
It is so much simpler just to discipline—maybe even fire—the person who made the mistake, and hang a bloody head on a pike as a warning. The first instinct is to assign blame, especially when you are the person being blamed. You want to shift it to someone else. Ultimately, the primary party being blamed when things go wrong will be the organization itself.
Though the prevailing view in Risk Management is to avoid assigning blame to a person, sometimes it is the right thing to do. A careful review of the disaster might identify the main problem as an employee who should have been fired sooner, and a corporate culture so intent on not assigning blame to individuals that it becomes lax in demanding excellence.
Rules that aren’t written with fixed, ascertainable standards are a challenge to enforce when the objectives aren’t reached.
Once rules are firmly in place, there are a number of ways to identify how and when they aren’t being followed. Mandatory self-reporting, through an incident reporting process, is one. Every time there is an error or deviation from standard hospital routine that has caused or could cause an injury or unnecessary expense, it should be reported in writing to the Risk Manager. The system helps to alert the organization to brewing problems. Reporting responsibility falls not only on the worker responsible for the error or deviation, but also on everyone who witnesses it, or has reasonable grounds to believe it occurred. So, the worker who fails to report his or her own error commits a serious policy violation, along with co-workers who know about it and fail to snitch.
If the Risk Management process works at its optimal level, it will seem unnecessary, a great deal of expense when things are functioning just fine. It gets taken for granted, like the activity of driving to work by the same route every day that seems to do itself. The driver might not feel the need to pay careful attention to the road ahead, and might even take a chance on sending a text message or running a stop sign.
The only way to avoid actually having disasters is to imagine them. That makes effective Risk Management a kind of exercise in shadow-boxing. Thinking about what might go wrong, immersing yourself in dreadful unrealities. Looking at the misfortunes of others. Extrapolating what might have happened as a result of accidents where no harm actually occurred.
Taken to its logical extreme, optimal Risk Management can be crippling. The imagined disasters contain limitless horrors. The best way to avoid risk is to eliminate uncertainty, though uncertainty in life is the only real certainty, and avoiding all risks carries its own set of risks. The optimal way to avoid uncertainty is the opposite of life.
There is no way to learn and grow without making mistakes, as long as you deal with them the right way.
Risk Management is much like gambling. There’s an element of skill in figuring out the odds, developing strategies to maximize them, looking at the upside risk and the downside, evaluating what is in front of you and what might lie ahead, and testing your luck. Gambling is a time-honored discipline for addressing uncertainty. It means losing part of the time.
I’m writing this essay on healthcare risk management in the midst of a disaster—the global Covid-19 pandemic of 2020. The disasters I’ve dealt with in the past are nothing on the scale of the present one, but the territory is somehow familiar. The physicians and the finance administrators are arguing about their respective expertise and about which of them should call the shots.
The biggest problem in America right now lies not in who is in charge, but in the way that decent, intelligent people are looking at the exact same set of circumstances and perceiving entirely different realities. But maybe that weakness can become a strength.
Correcting these mistakes must come from decent, intelligent people– both those who lead and those who would be led, a transformation not only from the top down, but also from the bottom up. One might wish that decent, intelligent people would learn from the Hospital Risk Management process to address this disaster—to find the meaning of the mistake, to recognize that the old ways of doing things don’t work anymore, that the world must be viewed in a new way, that blame is not a productive strategy, that competition and cooperation are values that flow and blend into one another, that they must be harmonized and balanced. The same principles apply to the values of individual responsibility and community effort, the value of the life of each person and the value of the many systems that support it, the questions that can be answered by science and those that can be answered by mere belief or social conventions, the values that can be measured financially and those that can’t, the things that are material and the things that are ideal, the importance of leaders who can connect with the culture and those who can transform it, the value of a unified culture and the value of diversification within it, the value of those who agree with us, and the value of those who don’t, the value of the separate components and the value of the whole.
We are one country, America.
America, we are one world.
I look at this disaster, and I think I have the answer, but I don’t, not alone. I’m not alone in thinking I have the answer. And it all seems very familiar and it seems very simple, but it isn’t. And I see a lot that is blameworthy, but it isn’t that simple, and I know it is not the answer. It is a mistake.