An excerpt from You’re the Leader. Now What? Leadership Lessons from Mayo Clinic.

What is most important when making a decision: process or expert analysis? If you’re like many people I’ve worked with, you’re probably thinking, “They both are,” but I’m asking you to pick just one. Which one is most important in improving the effectiveness of decisions?

A study published in the McKinsey Quarterly examined 1,048 complex decisions made by managers within large organizations about such things as major acquisitions, investment in new products, and choices of key technologies. First, the researchers asked the managers involved in each decision to estimate their application of various practices of analysis and process. Then they asked the same managers to assess the outcomes of their decisions based upon metrics such as revenue, profitability, market share, and productivity. The authors found that process is six times more important than analysis in improving decision-making effectiveness. Yes, a leader’s analysis—detailed financial modeling, sensitivity analysis, and predictions of the market’s reaction—was important. But far more important was the process used to make the decision. Did it include perspectives that contradicted the leader’s point of view? Did it capture the input of individuals with varied experiences and skills without regard to their formal rank within the organization?

The fact that process trumps analysis makes sense, doesn’t it? Our individual analysis is obscured by blind spots in our expertise. We can create intricate spreadsheets, explore subtle nuances, and deeply consider the possibilities, but still lack the essential data that lurks beyond our own perspective. A decision process that helps us see around blind corners will result in better decisions.

Re-evaluating and improving decisions is essential for advancing medical care. In the past, despite extensive expert analysis, medication errors were alarmingly common: the wrong drug, given to the wrong patient, at the wrong dose, at the wrong time. For example, when a patient presented to the critical care unit with pneumonia, treatment decisions came from the expert analysis of a single physician during a busy shift. The physician would scan through the patient’s past medical history, list of allergies, and test results, and then compare a list of local bacterial susceptibilities with the published opinions of experts—all to figure out the antibiotic and dose for treatment. You can imagine that such detailed analysis in a dynamic critical care unit could occasionally result in error, despite the best intentions.

In modern critical care units, the electronic health record, clinical pharmacists, and bar code scanners help the care team see through the blind spots of individual analysis. The electronic health record (EHR) identifies pertinent allergies, potential medication interactions, and appropriate antibiotic choices. Clinical pharmacists perform an independent bedside analysis of each patient’s treatment plan; they recognize important variables that the physician, nurse, and EHR might miss. Nurses use bar code administration systems to scan each medication and the patient’s wristband to ensure that the correct drug is infused. Such improvements help care teams leverage the expert analysis of several professionals to identify the right medication, for the right patient, at the right dose, at the right time.
In the emergency department, I treat patients based upon my expert judgment. I can explain to you the sensitivities and specificities of the tests I order, I can detail the physiology and pathology involved in my differential diagnoses, and I can explain the rationale for each part of my treatment plan. But despite my expert analysis, I can still miss important details. Did I get input from the family or the paramedics? The pharmacist noticed something in the past medical records that might be important, and the nurse thinks something else might be going on. Did I invite their opinions and listen to them? When I block out the perspectives and analyses of others as part of my process of caring for patients, no matter how on-point my own siloed analysis is, I am less effective.

The executive team at Quaker Oats performed detailed analysis, no doubt, before each of their actions. But did they seek out the insights of key customers, distributors, and promoters before making their decision? And if they got those perspectives, did they incorporate them into the decision process? Or did they reflexively think, “No. We disagree. We’ve come to the conclusion that they are all incorrect, and we will proceed with our plan”?

We’ve all worked with individuals who are deeply analytical—armed with spreadsheets and diagrams and facts—but just don’t seem to get the big picture. When I talk to leaders, I frequently hear about the pitfalls that occur when they rely on their own expert analyses—and it’s often a shock to them when it happens.

Soon after rising to an executive level at his company, one leader I coached got a question from the CEO about changing the employee health plan. The company needed to cut expenses without jeopardizing care for employees. The leader was new in his role but had been in the industry for more than a decade and felt secure in his expertise. He compared the two health plans using a spreadsheet and saw that there was a 98 percent crossover in approved primary care providers. The new list captured all of the preferred specialists and nearly all of the primary care physicians. This high number resonated with him, and he gave a confident decision to make the switch to the new, cheaper plan.

It turns out he should have been more concerned with the outlying 2 percent. When the health plan was switched, it became clear that those 2 percent represented an influential group of primary care physicians who were now excluded from the options employees could choose. The leader now had a stream of angry employees complaining that the switch had forced them to move away from long-term, trusted local providers and specialist physicians. Who cared about the 98 percent overlap? They missed the 2 percent no longer part of their insurance coverage.

The leader spent the next two years unraveling his mistake and rebuilding trust. He had overplayed his expertise. His confidence in his own background led him to see the decision as an easier and more direct choice than it really was. He was partially blinded by all that he knew. And he couldn’t have been more shocked by this outcome. The damage he did and the time it took to repair it came as a surprise. After all, he was an expert—who expected such a nightmare?