
THE ART AND SKILL OF DECISION-MAKING

Are you making good managerial decisions? Are you sure?

As managers, we make important work-related decisions that affect, at a minimum, our direct reports, our superiors, our organizations, and our careers. Nobody wants to appear indecisive, but being perceived as cavalier toward major decisions is not a good idea either. We want to make decisions quickly, but we do not wish to overlook key information.

On the surface, this give-and-take between speed and thoroughness appears to represent two opposite poles of a single decision-making process. However, research has shown that the way we make quick decisions and the way we make thorough decisions are really two separate things.

According to cognitive scientists, there are two modes of thinking: intuitive and reflective. In intuitive, or System One, thinking, impressions, associations, feelings, intentions, and preparations for action flow effortlessly. We're not consciously focusing on how to do those things; we just do them. In contrast, reflective, or System Two, thinking is slow, effortful, and deliberate. Both modes are continuously active, but System Two is typically just monitoring things. It's mobilized when the stakes are high, when we detect an obvious error, or when rule-based reasoning is required. But most of the time, System One determines our thoughts. Our visual system and associative memory (both important aspects of System One) are designed to produce a single coherent interpretation of what is going on around us. As System One makes sense of those inputs and develops a narrative, it suppresses alternative stories. (Kahneman et al. 1)

The great danger in making decisions intuitively is cognitive bias. BusinessDictionary.com defines cognitive bias as the "Common tendency to acquire and process information by filtering it through one's own likes, dislikes, and experiences." When cognitive bias is active, we run the risk of misapplying past lessons learned, or of inappropriately projecting and making incorrect assumptions about the situation. Again from Kahneman et al.:

Because System One is so good at making up contextual stories and we're not aware of its operations, it can lead us astray. The stories it creates are generally accurate, but there are exceptions. Cognitive biases are one major, well-documented example. An insidious feature of cognitive failures is that we have no way of knowing that they're happening: We almost never catch ourselves in the act of making intuitive errors. Experience doesn't help us recognize them. (By contrast, if we tackle a difficult problem using System Two thinking and fail to solve it, we're uncomfortably aware of that fact.) This inability to sense that we've made a mistake is the key to understanding why we generally accept our intuitive, effortless thinking at face value. It also explains why, even when we become aware of the existence of biases, we're not excited about eliminating them in ourselves. After all, it's difficult for us to fix errors we can't see. (Kahneman et al. 2)

Furthermore, System One can be quite sensitive to context. Our emotional or situational orientation at the time the decision is made will have considerable impact upon the choices we make.

Consequences

The impact of cognitive bias is multi-layered and potentially far-reaching. For instance, in a publicly held company or a non-profit, a manager is not making decisions about his or her own money. Rather, the money belongs to the organization and is meant to serve its clientele. Decisions based upon gut or instinct are difficult to defend when faced with questions from board members, funding sources, and the customers for whom the products or services were intended.

Another problem is the practice of some managers of making quick intuitive decisions about employees, after which everything the employees do is viewed through the lens of that original decision. Anything they do that might be consistent with the original decision is magnified and cited as evidence that the initial intuitive response was correct. Meanwhile, everything they do that is inconsistent with the initial intuition-led decision is minimized or viewed as a minor exception. Rather than routinely testing those impressions for accuracy, managers caught up in this bias have been known to twist information from all ensuing interactions in ways that justify their original opinion. This sometimes leads to inappropriate attempts to drive the employees away or fire them. That is, rather than treating the instinctive response as one clue to pursue when assessing a worker's ability, it becomes the cornerstone upon which all future treatment of the individual is built. Plans of correction are written in ways that enable the manager to catch the person doing something wrong rather than to actually correct or improve performance. This "gotcha" style of management is not only unfair and counterproductive, but can be a Human Resources department's nightmare as it tries to head off lawsuits for wrongful termination.

If we make up our minds first and then go about finding evidence to justify our decisions, it is intuition gone awry. It is instinct skewed by self-preservation, followed by reflective thinking that is actually a hunt for supporting evidence rather than a search for truth and fact. As a great systems trainer, Ken Alexander, once said to a group he and I were co-facilitating: "If you're immediately, absolutely, positively sure that you're right, you're probably wrong."

An additional source of trouble is the fact that, since cognitive bias is context-sensitive, we run the risk of being seen as erratic or overly emotional ("Don't ask her about it today because she got up on the wrong side of the bed." "Don't even think about persuading him today because he's in a bad mood." "He's shooting down everything today. Better ask another time."). If we don't invest the rigor, will someone who reviews our work in good faith at a later date say we should have known better? The problem is that we will make wrong decisions when we could have made correct ones. Moreover, our decisions will be indefensible. The legal term "knew or should have known" can come into play, along with other legal risks, when making human resource decisions where certain biases are legally prohibited.

Solutions: People & Processes

Past attempts at solutions have depended upon the belief that if we knew about the risk of cognitive bias, then we could protect against it. However:

You may accept that you have biases, but you cannot eliminate them in yourself. There is reason for hope, however, when we move from the individual to the collective, from the decision maker to the decision-making process, and from the executive to the organization. As researchers have documented in the realm of operational management, the fact that individuals are not aware of their own biases does not mean that biases can't be neutralized, or at least reduced, at the organizational level. (Kahneman et al. 2)

In other words, "forewarned is forearmed" does not apply here, because knowing we have biases does not enable us to catch them in ourselves while they are in operation. Biases are a sort of blind spot. This brings to mind an exercise I used in group sessions many years ago called the Johari Window. In it, we broke out information about an individual according to what the individual knows about him/herself and what others know about him or her.

Johari Window (paraphrased)

                                      Things I know        Things I don't know
                                      about myself         about myself
Things others know about me                                BB Area
Things others don't know about me

Characteristics of the individual are gathered and then written into the boxes. Since cognitive bias is rarely caught by the person exhibiting it, we turn our focus to the box where "Things I don't know about myself" and "Things others know about me" intersect. The nickname for that cell in the table is the "bad breath" (BB) area because, like halitosis, others tend to notice cognitive bias in us before we notice it in ourselves. So while the Johari Window illustrates the weakness created by cognitive bias, it also demonstrates one solution: the use of others' observations.

Someone might say, "I know myself, I know how to make decisions, and I usually make good ones. Why should I turn to anybody else?" Well, you know what you look like, don't you? You don't look half bad, right? But you still look in a mirror at least once every morning before you leave the house, don't you? Checking with others is very much like the view in the mirror. It provides an opportunity for reflection.

Thus, as Kahneman et al. stated above, there is reason for hope. Our decision-making mirrors are people and processes. They remind us of the need and opportunities for reflective thinking. Questioning our own judgment before making a big decision is a lot better than asking ourselves "How could I have been so stupid?" later.

People = Teamwork

Because other people can spot bias in us before we catch it in ourselves, good teamwork is essential. To be clear, this differs from group work (see more about the difference here). Team members must be not only permitted but also empowered and expected to challenge what they perceive as errors.

Similarly, teams can review one another's decisions. These teams can be lateral to one another or in a superior/subordinate relationship. This group-to-group process can be particularly helpful if one team is experiencing problematic dynamics. A typical example would be the review by a board finance sub-committee of a budget proposal developed by an executive leadership team.

Of course, these people- and team-oriented processes make it necessary to be surrounded by people whose opinions we respect. This means we hire to our weaknesses and by all means avoid bringing sycophants into the business.

Processes = Disciplined forms of review

Kahneman et al. encouraged the use of a checklist when reviewing proposals and recommendations, composed of questions to ask ourselves in order to limit the effects of cognitive bias:

1. Is there any reason to suspect motivated errors, or errors driven by the self-interest of the recommending team?
2. Have the people making the recommendation fallen in love with it?
3. Were there dissenting opinions within the recommending team?
4. Could the diagnosis of the situation be overly influenced by salient analogies?
5. Have credible alternatives been considered?
6. If you had to make this decision again in a year, what information would you want, and can you get more of it now?
7. Do you know where the numbers came from?
8. Can you see a halo effect?
9. Are the people making the recommendation overly attached to past decisions?
10. Is the base case overly optimistic?
11. Is the worst case bad enough?
12. Is the recommending team overly cautious?

These 12 questions should be helpful to anyone who relies substantially on others' evaluations to make a final decision. But there's a time and place to ask them, and there are ways to make them part and parcel of your organization's decision-making processes. When to use the checklist: this approach is not designed for routine decisions that an executive formally rubber-stamps. (Kahneman et al. 9-10)
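For readers who would like to see how such a checklist might be operationalized, below is a minimal sketch in Python. It is purely illustrative: the twelve questions come from Kahneman et al., but the class, the function names, and the yes/no scoring scheme are hypothetical conveniences of mine, not part of any published tool. The sketch records an answer and an optional note for each question, then flags the questions whose answers suggest bias and warrant follow-up.

```python
# Illustrative sketch only: a structured review form for the
# Kahneman et al. checklist. All names here are hypothetical.
from dataclasses import dataclass, field

# The twelve questions, abbreviated to keep the sketch short;
# the full wording appears in the list above.
CHECKLIST = {
    1: "Any reason to suspect motivated, self-interested errors?",
    2: "Has the recommending team fallen in love with its proposal?",
    3: "Were there dissenting opinions within the team?",
    4: "Is the diagnosis overly influenced by salient analogies?",
    5: "Have credible alternatives been considered?",
    6: "What information would you want in a year; can you get it now?",
    7: "Do you know where the numbers came from?",
    8: "Can you see a halo effect?",
    9: "Is the team overly attached to past decisions?",
    10: "Is the base case overly optimistic?",
    11: "Is the worst case bad enough?",
    12: "Is the recommending team overly cautious?",
}

# Assumption: for these questions a "yes" answer is the warning sign;
# for the remainder (3, 5, 7, 11) a "no" is the warning sign.
YES_IS_CONCERN = {1, 2, 4, 6, 8, 9, 10, 12}

@dataclass
class ChecklistReview:
    decision: str
    # question number -> (answer, reviewer's note)
    answers: dict = field(default_factory=dict)

    def record(self, number: int, answer: bool, note: str = "") -> None:
        self.answers[number] = (answer, note)

    def concerns(self) -> list:
        """Numbers of answered questions whose answers warrant follow-up."""
        return [n for n, (ans, _) in sorted(self.answers.items())
                if ans == (n in YES_IS_CONCERN)]

# Example: a finance sub-committee reviewing a budget proposal.
review = ChecklistReview("FY26 budget proposal")
review.record(2, True, "Sponsors dismissed every objection out of hand.")
review.record(3, False, "No dissent was documented.")
review.record(11, True, "Downside scenario was stress-tested.")
print("Follow up on questions:", review.concerns())  # -> [2, 3]
```

The value, of course, is not in the tooling but in the discipline: recording an answer and a note for every question makes it much harder to rubber-stamp a recommendation.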

Another important process-oriented solution was referenced in item 7 above: analytics. Analytics can be critical to the health, competitiveness, and growth of an organization, and they are an antidote to cognitive bias when properly utilized. But analytics are not just numbers. In a section entitled "Combining Art and Science," Davenport et al. write:

Analytics will continue to be critical in financial services and all other industries, so the best decision makers will be those who combine the science of quantitative analysis with the art of sound reasoning. (Davenport et al. 15)

In order to properly choose the items to be measured, counted, recorded, and analyzed, the user must exercise wisdom and perceptiveness. That is where art meets science, and the interaction develops fully when the numbers are interpreted. This does not mean we have to get it all right the first time we try. Rather, the selection and use of analytics is an iterative process. This process of trial and error continuously hones the information's relevance to the decisions we need to make.

While analytics are not perfect, we prefer them to the shoddy alternatives of bias, prejudice, self-justification, and unaided intuition. Humans often make long lists of excuses not to be analytical, but there's plenty of research showing that data, facts, and analysis are powerful aids to decision making, and that the decisions made on them are better than those made through intuition or gut instinct. (Davenport et al. 17)

Use wisdom in choosing what to measure, and measure those things accurately. Then let the data do the talking, and listen to them rather than trying to mine them for something that will help you make a point you already had in mind. Clear analytics can help us challenge our biases in much the same way a team member can call us out. In fact, analytics can help empower our team members to challenge our cognitive biases. The two complement each other quite well in a system designed to improve decision making.

Complications

It is true that the solutions listed above can be quite effective in reducing the negative impacts of cognitive bias. Unfortunately, it is also true that cognitive bias can interfere with the implementation of these solutions.

First of all, the teamwork solution can be difficult to carry out in hierarchical teams. The person at the top of an executive team's hierarchy is often the one who signs the paychecks. So if, for instance, the CEO has a cognitive bias that contributes to him or her being a control freak, it would take a particularly diplomatic and brave (or foolhardy) member of the executive leadership team to challenge that bias when it presents itself. On the other hand, if the person at the top of a hierarchical team invites and rewards constructive, well-considered feedback, both positive and negative, cognitive bias can be kept in check. Another factor in a team's ability to challenge bias is organizational culture. If, for example, an organization has a "fortress" culture, it can be very dicey for any team member to express a contrary opinion.

Secondly, review processes can be pressured or even circumvented if time is a major factor. Time and schedule are not always the highest priority, but putting the processes aside a few times can lead to putting them aside more often than is necessary. If nothing bad happens immediately as a result of bypassing the rigorous review processes, then the new, less rigorous approach eventually becomes the new formal process. This is similar to what author and researcher Diane Vaughan famously described as an "incremental descent into poor judgment" when she reviewed the dynamics that led to the space shuttle Challenger disaster. In addition, organizational culture plays a role here, much as it does in enabling or disabling a team's ability to confront cognitive bias. As Kahneman put it, executives need to be prepared to be systematic, "something that not all corporate cultures welcome" (Kahneman et al. 7).

Third, metrics are chosen by humans, data are interpreted by humans, and humans decide which analytics matter more than others. Some of those humans are the very ones whose cognitive biases the analytics are supposed to catch and stop. If that sounds like a bit of a Catch-22, you are correct: it can become one. Certainly, a malfeasant executive can cherry-pick and twist the analytics to support whatever point he or she wishes to make. Leaders seeking to act in the best interests of their organizations will make every effort to avoid misapplying the data in this way, but one cannot presume that to always be the case. Lastly, good analysts are not always easy to come by. People who can do the calculations while exercising appropriate judgment in going where the data tell them to go can be invaluable, and a certain amount of work and expense is usually involved in finding and keeping them.

The Result

So if we have the right people in an atmosphere that encourages open exchange, adhere to rigorous evaluation processes, and properly choose the right data to evaluate, we're home free, right? Well, not exactly.

The point of this exercise is not to always follow a single rigid process. For openers, the process must be designed to meet the needs of our particular organizations. Next, the impact of the processes and people must be routinely assessed and changed as needed. And then, most of all, we need to know when to put aside the information generated by the processes and go with our instincts.

But what about all the things in the first 80% of this article? They are all still in play. They are the rules. Now we're talking about when to make exceptions to the rules. Let's face it: if every executive were a rigid adherent to commonly held best practices, everyone would be making the same decisions in the same situations. That's not exactly creative. It's more like following the herd. That might seem safe, unless you've heard of a buffalo jump.

For those not already aware, there used to be millions of buffalo on the plains of North America. For thousands of years indigenous tribes had no horses or firearms, but the hide, meat, and bones of the buffalo were highly prized by them. They perfected the art of stealthily and nearly completely surrounding a herd, leaving only one opening, which faced a cliff. Suddenly the hunters would jump up from their hiding places and make a lot of racket, confusing and frightening the herd. The buffalo would then look around in the hope of seeing a member of their herd who appeared to know what to do. When they saw one running, they would follow. However, that one was usually a hunter (often wrapped in the skin of a buffalo) who would run toward the cliff and then jump to a pre-identified ledge just a few feet below the edge of the cliff. The rest of the herd would then go tumbling past him, crashing onto the rocks a hundred or more feet below. Other members of the hunting group waited near the bottom to collect the bounty gained by the buffalo's tendency to blindly follow the herd.

Since the sites of this activity have names such as Ulm Pishkun (in the language of the Blackfoot, "deep blood kettle") (Auchly 1) and estipah-skikikini-kots (Blackfoot for "head smashed in") (Parks Canada 1), "following the herd" acquires a less than palatable flavor, and one might realize it is far less safe than some presume.

The point is that following widely held best practices or rigorous procedures without being able to justify it is potentially just as damaging as making a decision based only on our guts with no clear justification. In an article about his interview with Malcolm Gladwell, Samuel Greengard quotes Mr. Gladwell as citing a study by Leybourne and Eugene Sadler-Smith: "Improvisation and intuition represent two important and related aspects of management in general and of the management of projects in particular" (Greengard 80). However, Gladwell went on to say, "The mistake people make with intuition is that they think it is something that resides in all of us. They believe that these feelings are things that all of us can trust under any circumstances" (Greengard 80).

This writer asserts that the same is true of procedures, information from teammates, and analytics. Nothing, whether instinct, colleagues, or sophisticated computerized analytics, can be treated as 100% reliable for every major decision. The art is in judging which to follow.

How do we do that?

Again, from the interview with Malcolm Gladwell via Samuel Greengard:

It's not intelligence. It has a lot to do with attitudes. It revolves around a person's willingness to be self-critical, to examine and critique beliefs. It's also about how open a person's mind is, how willing he or she is to test preconceived notions against new data, and how hard a person is willing to work. (Greengard 81)

Summary

So sometimes it's best to ride with our instincts, and other times we should follow the guidance provided by analytics and sound advice from others. How do we know which is right? How do we decide how to decide? Here are some things to keep in mind:

1. Recognize that "The Imperial CEO is dead as a dodo" (Kumar 1). Be open to constructive criticism and new ideas. Go beyond accepting them when they are offered: invite them, and reward those who provide them. This not only improves our decision-making, but also models the approach and attitude for those around us.

2. Obtain a clear idea of the status of your business, your market, and your industry. Be aware of macro factors and learn the nature of the business climate, both present and future. Then determine how much risk your organization can tolerate and should take on. If it can tolerate (or desperately needs) a significant amount of risk, we can probably go with our instincts once in a while, even when they go against the grain of conventional wisdom and disciplined processes. Having an overview of risk information ahead of time helps us judge whether it is necessary to do a full and formal risk assessment for a particular decision, or whether we instead have enough wiggle room to make a quick, instinctive major decision.

3. While seeking the opinions of others in order to make an executive decision, if at all possible, don't put things up for a vote. The results of a vote give the appearance of a group decision. Make it clear that while you need and value others' ideas, you will make the decision and be accountable for its results. Then, if we put aside advice from those who confront our cognitive biases, we know the risks of doing so and will live with them.

4. If you don't already have top-flight analytics, get to work on them now. If you have nothing in this regard, start small and work your way up, but always keep an eye on the goal of developing an enterprise-wide system of analytics.

One last thing: I have always ended up regretting it when I did not follow my instincts. But I'm not saying my first instinct was always right. Nor am I saying that data or advice from others led me astray. Rather, I did not follow my intuition to a conclusion. That is, as I look back, I realize I just didn't invest enough time and attention to find out whether my intuition was right before I abandoned it. That's what it all really comes down to: hard work.

References:

Auchly, Bruce. "Where the Buffalo Fell." Montana Outdoors, 2003. Online: http://fwp.mt.gov/mtoutdoors/HTML/articles/2003/ulmpishkun.htm

BusinessDictionary.com. "Cognitive Bias Definition." Online: http://www.businessdictionary.com/definition/cognitive-bias.html

Davenport, Thomas H., Jeanne G. Harris, and Robert Morison. Analytics at Work. Boston: Harvard Business School Publishing, 2010.

Greengard, Samuel. "Malcolm Gladwell on Intuition." PM Network, October 2011.

Kahneman, Daniel, Dan Lovallo, and Olivier Sibony. "The Big Idea: Before You Make That Big Decision." Harvard Business Review, June 2011.

Kumar, Vikas. "The Imperial CEO Is Dead as a Dodo." The Economic Times (Times News Network), November 4, 2005.

Parks Canada. World Heritage Canada, Alberta. Online: http://www.pc.gc.ca/progs/spm-whs/itm2/site6.aspx
