Skill at Decision Making: Evidence and Interpretation
by Langdon Morris
This post is adapted from Chapter 10 of Langdon Morris’ forthcoming book Strategy for the 21st Century, which is due to be published this spring.
This crisis has turned out to be much broader than anything I could have imagined.
Alan Greenspan
When the facts change, Sir, I change my opinion. What do you do?
Attributed to JM Keynes
In reflecting on the 2008 financial collapse that was triggered by the failure of the American subprime mortgage market, former US Federal Reserve Chairman Alan Greenspan testified before the US Congress that, “This crisis has turned out to be much broader than anything I could have imagined.”
As Chairman of America’s central bank, the most powerful banking institution in the world, Mr. Greenspan’s job was precisely to imagine what might happen under a very wide variety of possible scenarios, and to foresee and anticipate the traps, pitfalls, and potential disasters of the policies he was considering and indeed advocating, as well as of the policies he opposed. So by claiming that a series of events which did occur, and about which he had in fact been warned in advance, was “broader than anything he could have imagined,” he acknowledged both a failure of imagination and an equally significant failure of analysis. He also exemplified the difficulty we all face in trying to think clearly when our analytical process is inhibited by ideology, bias, and incomplete information. He was apparently looking at the wrong evidence, and his interpretation of the evidence he did examine proved to be shoddy.
This is not a problem that central bankers face alone. In fact, it is a challenge that everyone faces: the difficulty of identifying facts as facts, for they often do not appear as facts at all at first glance, and the commensurate danger of treating our assumptions as facts when they are not. Ive’s comment about being wrong gets precisely to a critically important issue, which we will discuss here.
Rational or Irrational
We often like to think that humans are rational beings, and a great many economists would have us believe that their science is supremely rational and logical, and that if we could just remove the irrational and counterproductive forces of regulatory interference in the market then the aggregate of highly logical individual choices would result in the optimal economic conditions for communities, nations, and the world. But we know better.
And now Mr. Greenspan does too, for he attributed his error to his great surprise that America’s corporate bankers did not behave rationally. We know, in fact, that people are far from rational, and that ideals and ideology are often far more compelling motivators than any other force. This is one of the only ways we can explain suicide terrorism, for example. It is also how we can explain the flood of subprime mortgages that crashed the economy, the people who go deeper into debt to keep up with the Joneses, and perhaps also the people who vote against their own economic self-interest.
We also know that the abstract ideal of a perfectly functioning economic system would require that perfect information be available to all participants who would then interpret it in a perfectly rational manner, and it’s utterly clear that our information is far from perfect, nor is what we have perfectly distributed, nor are we consistently good at interpretation. Whether it is conscious or unconscious, lag and inaccuracy and bias are inherent in every bit of news that we read or watch or hear.
This is critically important for us as futurists, analysts, planners, strategists, and leaders, because if we are to make sound choices about the challenging and complex issues that we face now, and will face in the coming years, it would be hugely advantageous to overcome our perceptual and cognitive limitations; to do that we need to know what those limitations are.
Ideology and Assumptions
What exactly is ideology? It’s a set of beliefs that exists independent of the evidence, and which is applied independently of evidence. Hence, what makes ideology so powerful is that it is generally impervious to facts. It is, rather, the basis through which the existence and the meaning of facts are inferred or deduced; the beliefs determine what we see.
Thus, when we observe opposing protest groups with differing opinions engaged in a screaming match, now a sadly regular occurrence on cable TV news, what we are usually witnessing is an emotionally charged conflict of ideologies that is generally defined by a disregard for or disinterest in evidence or facts.
Interestingly, this sort of conflict happens in science as readily as it does in politics, as scientists can be as selective in the application of facts and the fitting together of facts into interpretations as the rest of us are. One of Kuhn’s key points is that most of the adherents of an old paradigm have become emotionally invested in it, which contributes to their lack of appreciation for the new one. We also have to acknowledge, however, that there is no a priori reason to assume that those advocating the new model are correct. Hindsight embellishes the credibility of Newton and Einstein, but at the time they first published, it made sense to question the validity of their ideas. The problem arises when people either refuse to study the evidence (such as Galileo’s inquisitor) and therefore persist in ignorance, or when they discount the evidence because it does not conform to their pre-existing beliefs (Greenspan).
When our ideologies and biases predispose us to ignore certain distasteful evidence, then it’s only a matter of time until we are deceived. We will miss important evidence, and imagine evidence that doesn’t exist; we will give credibility to evidence that affirms previously held opinions and biases, and probably not even notice evidence that contradicts them. This is a chronic condition of politics, because the pro-change and anti-change camps come to the broader questions from opposite perspectives: the former embrace change in the belief that it will lead to a better future, while the latter oppose it on the grounds that it will only make things worse. The bias toward either position seems to be built in, and perhaps it is acquired in childhood or youth.
The proliferation of media has enabled a proliferation of polarized media outlets that specialize in one or the other ideological viewpoint, which has only amplified the tendency of ideologues to retreat more deeply into their own set of fortified positions. You can now watch, read, or listen to news and commentary that reinforces whichever political positions you feel most comfortable with, and you may never have to hear an opposing view at all, except in derision. This enhances social polarity by reinforcing biases and stereotypes, which is quite unhelpful to those interested in finding meaningful solutions regardless of the ideological nuances involved.
Hence, we are living in a time when political thought is dominated by prejudices, when many literally “pre judge” the outcomes beforehand, before the event occurs or the evidence is collected. Every issue is now framed by the opposing sides according to their ideological predispositions, and the meaning of every tiny shred of actual evidence is argued incessantly as the opponents try to craft a narrative that validates their hypotheses and biases. This is perhaps the worst thing that could happen in politics, as just when what we need is an abundance of evidence, clear thinking, astute forecasting, and an open and honest discussion about our options, what we have instead is an ideological shouting match.
Governments, being national monopolies, can persist for a while trapped in these illusions, but corporations generally cannot. Corporate managers suffer from exactly the same difficulties, although in their case the culprit is less likely to be described as ideological bias and more likely as flawed managerial assumptions. The broader result in business is the rapidly declining lifespan of major companies, which itself provides abundant evidence that many leaders are not managing well in markets where change is occurring so fast. This is the going-out-of-business curve showing itself again. The difference between government and business is generally competition: when a competing firm in the marketplace has a better grasp of the meaning of the evidence, it can design and produce better products and services as a result, and then has a good chance to prevail. Companies, that is, must deliver facts, while politicians can persist for some time in delivering ideology. But eventually change catches up with even the ideologues, as nations decline and fall when they fail to heed reality.
For leaders and strategists whose job is to enable their communities, nations, and organizations to adapt to the challenges of the real world rather than to a pretend world of ideology, the persistence of ideological prejudices and delusions is a deadly pattern. It becomes vitally important to recognize what is real, and not to substitute for it what is preferred or imagined, or what is ideologically aligned but not based in reality.
And among the additional challenges we face in this are the importance and persistence of social norms and group expectations, both of which work to bind us to present realities even when change and adaptation would serve us much better.
Norms and Rules
Scientists tell us that we are a species whose evolution is driven by social factors as much as or even more than by the physical environment, and that therefore the social standards, cues, and agreements that we make are central to our success. If this is actually the case, then it would be necessary for us to recognize and respond to those cues from a very young age. In fact, research has verified that babies only a few months old are quite capable of making these judgments, and that even by the age of three our children are fully aware of and operating within a social context.
By observing others, young children spontaneously infer context-specific rules for social life and assume these rules are norms – rules that others should obey. Deviations and deviants make children angry and motivate them to instill proper behavior in others.
Social norms are essential to sustaining a functional society; the problem arises, as I noted in the previous chapter, when the requirements for successful functioning change and the norms therefore need to change as well. But there is lag time involved, as society is of course a complex socio-technical system that is subject to all manner of pressures and influences. The visionaries call for change, the rebels and revolutionaries promote change, and the anti-revolutionaries and reactionaries tell us why we must not change. Amidst this cacophony, the context itself continues to change.
A key dimension of our vulnerability as decision makers is therefore our shared tendency to prefer to fit in with our peer group, which often leads us to modify our behavior and gloss over disagreements in order to reinforce social cohesion. Irving Janis named this tendency groupthink. In many social settings it is quite positive, but when we are confronted with challenging issues that must be decided, partial information to work from, and a lack of clarity about the options, this tendency to avoid conflict can lead to disastrous results. Janis points out that it’s quite common to observe what he refers to as mindless conformity and collective misjudgment of serious risks…
… which are collectively laughed off in a clubby atmosphere of relaxed conviviality. Consider what happened a few days before disaster struck the small mining town of Pitcher, Oklahoma, in 1950. The local mining engineer had warned the inhabitants to leave at once because the town had been accidentally undermined and might cave in at any moment. At a Lion’s Club meeting of leading citizens, the day after the warning was issued, the members joked about the warning and laughed uproariously when someone arrived wearing a parachute. What the club members were communicating to each other by their collective laughter was that “sensible people like us know better than to take seriously those disaster warnings; we know it can’t happen here, to our fine little town.”
A few days later some of these men and their families died in the collapse.
Janis organizes the book around case studies of notable examples of misjudgment in American history, including the failure of the US leadership to anticipate the attack on Pearl Harbor, the fiasco of the Bay of Pigs invasion in 1961, the American descent into the quagmire of the Vietnam War, and the Watergate cover-up by President Richard Nixon. In each case, Janis shows quite convincingly that the desire of those involved to sustain their convivial relationships prevented various individuals from raising doubts about the plans that were under way, and also allowed the groups as a whole to discount evidence that the plan was not a good one. Groupthink is the very downside of social normalization.
Part of the problem that all decision makers face is the difficulty of collecting and then evaluating worthwhile evidence. Often, it turns out, what we see is not as trustworthy as we may think, for our very process of looking at the world around us contains built-in biases.
Believing is Seeing
Bias is evident in what we see with our own eyes and in what we fail to see, as it turns out that the brain is quite selective about what it brings to our attention. The aspects of our experiences that we remember are not the objective recreations of past events that we suppose them to be; they are instead selective reconstructions that often align with our prejudices and biases, and therefore omit details that are not consistent with our preconceptions.
A small bit of common sense that we often hear repeated is the phrase seeing is believing, but, in the words of Ira Gershwin, it ain’t necessarily so. While we may believe that our beliefs are founded on an objective version of facts, it would be far more accurate to say the reverse, that believing is seeing, because what we see is actually a selective recreation, not an objective record.
We form our beliefs for a variety of subjective, personal, emotional and psychological reasons in the context of environments created by family, friends, colleagues, culture, and society at large; after forming our beliefs we then defend, justify, and rationalize them. Beliefs come first; explanations for beliefs follow. Our perceptions about reality are dependent on the beliefs that we hold about it. … Once beliefs are formed, the brain begins to look for and find confirmatory evidence, which adds an emotional boost of further confidence.
This insidious process of “belief confirmation” is a powerful force that embeds our beliefs still more deeply in our consciousness. And did you notice the phrase “our perceptions about reality are dependent on the beliefs that we hold about it”? Or more simply, we do not believe what we see; we see what we believe. “Beliefs come first, explanations for beliefs follow.”
This could well explain the trap that Mr. Greenspan fell into, as well as the literal hole that the good people of Pitcher died in. It also explains, at least partially, the difference between the curve of exponential change and the more leisurely one labeled the “going out of business curve,” which is broadly the rate at which many organizations adapt to change.
The greater the difference between these two curves, obviously, the greater the implied risk. Addressing the gap means taking action both on the cognitive aspects as well as in the hands-on managerial ones. However, in addition to the social pressures related to groupthink and the problems of belief preceding sight, the cognitive realm is also subject to biases, to particularly common thinking errors that stand in the way of clear judgments.
A broader category of errors called “cognitive biases” describes some of the ways in which our judgment commonly lapses, particularly in the face of novelty. Total Oil executive Guy Mansfield explains it this way:
Managers like to think that the decisions they make are objective and rational and have been executed taking into consideration all the pertinent facts. Studies by a variety of researchers including Rosenzweig, Kahneman and Schoemaker have identified six principal decision-making biases:
- Anchoring: Initial information (the anchor) is weighted too heavily, and subsequent evidence, including contradictory evidence, is under-weighted or ignored.
- Framing: How a situation is presented affects the decision; and it can easily, even if unconsciously, be framed to validate a given expectation or position.
- Availability Heuristic: Vivid and easily imaginable events and recent events are weighted disproportionately in making decisions. Something that occurred this morning, even if insignificant in the bigger picture, may exert disproportionate influence in a decision making process.
- Confirmation Bias: Initial decisions become self-fulfilling prophecies, and data are collected after the event to justify the decision. Contradictory evidence is often disregarded.
- Commitment Escalation: Previous commitments tend to influence present decisions; this is often referred to as “throwing good money after bad,” and generally refers to our unwillingness to walk away from a bad investment.
- Hindsight Bias: It is easy to construct a logical narrative to explain events in hindsight even when foresight had no clue what was coming. Nassim Taleb notes, “Past events will always look less random than they were.”
These biases are particularly pernicious in an era of accelerating change, in which the prior evidence is consistently likely to compose a fragmentary or even fully defective picture of the future.
Given the risks that organizations and society in general face due to the rate and acceleration of change, along with increasing complexity, we must undertake a systematic search for reliable evidence, and then interrogate it relentlessly in order to arrive at a reliable interpretation of what is real, as opposed to what we wish to be real.
The evidence I presented in Part 1 is intended to affirm the proposition that due to the confluence of five revolutions, the rate of change has raced beyond the capacity of most decision makers to grasp what’s going on, which of course prevents them from providing the leadership necessary to enable their organizations to successfully adapt. In Chapter 7 I contrasted the exponential rate of change, which I defined as reality, with the much slower rate of change that most companies can accommodate, which I described as the going-out-of-business curve.
Following this lesser curve inevitably puts governments and corporations into cycles of constant crisis management because they’re not anticipating change, but instead arguing about where to place the deck chairs as they careen into icebergs that they have failed to anticipate. This reactive dynamic is fully evident in every dimension of society, from the financial crises that are occurring with greater regularity in multiple countries, to the political and ideological, religious, and diplomatic crises that have degenerated into 50 civil wars across numerous regions and countries. The crises are certainly real – they directly and adversely affect millions of us, and billions more feel the impacts a bit less directly. And all of this leaves us with the uncomfortable feeling that our social systems are out of control, and that we don’t know what to do, accentuating the future shock/now shock that is pervasive.
And while it’s difficult to envision a plausible scenario in which the speed of change slows down, it’s quite easy to see how change could continue to accelerate. Consequently, crises may worsen but are unlikely to lessen. It is more realistic and more prudent to anticipate that they will become more frequent and their impacts will become more significant. The turbulence, in short, will increase, and the demands on managers will increase commensurately.
Despite the uncertainties, the proposition that we are in the midst of five revolutions does offer at least the beginnings of a model, and as I mentioned above, we have to have a sound model if we’re going to manage effectively in any situation. Perhaps we already do know a lot about our future; we certainly have a lot of insight about what to expect from the continuing advances of digital technology, about the likely impacts of climate change in both moderate and more extreme cases, about the massive economic dislocation that the shift from fossil fuels will cause, about the continuing advance of urbanization and the consequent decline in the rate of population growth, and about the continuing tumult throughout human societies as we struggle to cope with everything that’s changing. We know a lot about them because we’re already experiencing them; that is, we already have a lot of evidence.
We also have a lot of guidance from the many visionaries who have traversed this ground ahead of us, many of whom I mentioned in the previous chapter.
Prior to the global financial collapse of 2008, a number of insiders were quite outspoken in expressing concerns about the state of the American mortgage market, but few of them found much of an audience, and certainly not with Mr. Greenspan. For example, Ms. Brooksley Born, then head of the US Commodity Futures Trading Commission, raised concerns about mounting risks in the financial markets at a number of meetings for which the complete record is available, and she was roundly criticized and even publicly humiliated for having the impudence to suggest that there was anything wrong. Such public humiliations are characteristic of the groupthink environment, in which it’s quite common for those with views contrary to the consensus opinion to be forced out of the conversation.
It would have been a much better reflection on Mr. Greenspan’s capacity as a leader if he had at least considered the possibility of the subprime boom and bust, even if he had then rejected it, than for him to have to admit that it was entirely beyond his imagination. Had he considered it, his failure would have been one of interpretation and judgment, and we all recognize that failures of that type are a constant risk for all decision makers, and thus perhaps more excusable.
History did show that Ms. Born had a better grasp of the market risks than Mr. Greenspan did, and a better interpretation of the available evidence; indeed, in her regulatory role she was closer to the meaningful evidence, which makes it all the less excusable that her opinion was treated so caustically.
Another observer who had access to evidence, and who correctly forecast the calamity, was Nassim Taleb, who wrote this:
Regulators in the banking business are prone to a severe expert problem and they tend to condone reckless but (hidden) risk taking. … The government-sponsored institution Fannie Mae, when I look at their risks, seems to be sitting on a barrel of dynamite, vulnerable to the slightest hiccup. But not to worry: their large staff of scientists deemed these events “unlikely.”
Our failure to distinguish between the fragile and the antifragile is yet another cognitive lapse, Taleb’s quite valid point being that we mistakenly create systems with inherent fragility rather than antifragile ones that would serve us much better.
Among Taleb’s books, the most famous is probably The Black Swan, which discusses the mismatch between our understanding of what is possible and the much broader scope of what actually is possible. He suggests that many of us live within artificially constrained mindsets, and that our appreciation for the depth and complexity of the world is quite limited compared with the much greater complexity and possibility of the actual world. Consequently, Taleb suggests, we are surprised when we should not be, by events that could well have been foreseen. It’s obvious why his views are appealing here, when the topic is change and our capacity to recognize and prepare for it.
The concept of the black swan, from which his book gets its title, is this:
Before the discovery of Australia, people in the Old World were convinced that all swans were white, an unassailable belief as it seemed completely confirmed by empirical evidence. … This illustrates a severe limitation to our learning from observations or experience and the fragility of our knowledge. One single observation [or event] can invalidate a general statement derived from millennia of confirmatory sightings of millions of white swans.
Hence, Taleb reminds us that in addition to the driving forces of change that we have been examining so far, we should also consider events that, if they happened, would decisively change the world. Not all of them are nightmare events, but many are.
The black swan problem is also a cognitive one: we mistake the absence of evidence either way about whether something exists for evidence that it cannot exist. The concept of the black swan fallacy gives Taleb, and us, a shorthand way to explain that absence of evidence (never having seen a black swan) is not at all the same as evidence of absence (the nonexistence of black swans). The fact that we have no evidence for a thing is not proof that it does not exist.
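Taleb’s asymmetry can be restated in simple probabilistic terms. Here is a minimal sketch, using a Beta-Bernoulli model of my own choosing for illustration (not Taleb’s): confirmatory sightings shrink the estimated chance of a black swan but can never drive it to zero, while a single contrary sighting refutes the universal claim outright.

```python
def black_fraction_estimate(whites, blacks, prior_a=1.0, prior_b=1.0):
    """Posterior-mean estimate of the fraction of swans that are black,
    under a Beta(prior_a, prior_b) prior updated with sighting counts."""
    return (prior_a + blacks) / (prior_a + prior_b + whites + blacks)

def all_swans_white_refuted(blacks):
    """The universal claim 'all swans are white' survives only while the
    count of black sightings is exactly zero."""
    return blacks > 0

# Millennia of white swans make the estimate tiny, but never exactly zero:
estimate = black_fraction_estimate(whites=1_000_000, blacks=0)
```

Even after a million confirmatory sightings the estimate remains strictly positive, which is the probabilistic form of “absence of evidence is not evidence of absence”; and the first black sighting flips `all_swans_white_refuted` permanently, no matter how many white swans preceded it.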
This is precisely one of the traps that befell Mr. Greenspan. The argument made in attacking Ms. Born when she had the temerity to speak out was that there was no evidence that the practice of subprime mortgage lending was creating any problems, so obviously it wasn’t creating any problems. In fact it was creating huge problems, problems that just hadn’t revealed themselves to Mr. Greenspan quite yet, although others, including Born and Taleb, had seen plenty of evidence and had interpreted it correctly.
This also reflects an aspect of our world addressed by the comment attributed to John Maynard Keynes, which I used at the beginning of the chapter. Apparently no one has established that it is a direct quote from Keynes, but it has been widely attributed to him. The point, anyway, is that the passing of time reveals new information that was not previously known, and new information may include new facts that contradict the old ones. The ideologue will generally distrust and disregard the new facts, not believing them because they contradict the ideology, dearly beloved as it is. This leads, of course, to faulty decisions. Hence Keynes’ point: we are better off looking at the facts for what they are than for what we want them to be. Everything we have discussed so far in this chapter shows that this is harder to do than you might have thought, but that is precisely the challenge.
In our world, where reinforcement of ideologies and prejudices is the very purpose of a vast array of special-interest media outlets and personalities, it requires dogged persistence and uncharacteristic openness to allow the facts to speak for themselves.
Causality vs. Coincidence
As we seek to identify what’s real, as distinct from what is pre-believed, another challenge we face is the illusion created by relationships between events. When two events occur in sequence, our natural tendency is to conclude that the first caused the second. But in a complex and complexifying world, it’s quite common for events to occur in sequences that are not causal at all, but merely coincidental.
This was one of the insights that Forrester illuminated in his work on social systems, their increasing complexity, and the likelihood that preferred policy prescriptions would lead to unintended and undesirable outcomes. His discovery that the behaviors of complex social systems are often counter-intuitive meant that when government, for example, took action to deal with systemic problems, very often the results achieved were contrary to what was intended, because the actions were planned according to a defective understanding of cause and effect.
The very complexity of the systems we’re dealing with defeats our intent because of our lack of understanding.
Forrester came to this understanding by developing highly sophisticated models of these systems, and the techniques he pioneered became quite influential among his students at MIT, and elsewhere. He was achieving these results in the 1960s and 70s; today we have computers that are orders of magnitude more powerful, with which we are able to develop much more comprehensive models, and social scientists who have become quite adept at combining public opinion polls with in-depth behavioral research, thereby enabling us to master complexities that used to bedevil us.
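Forrester’s point about counter-intuitive system behavior can be illustrated with a toy stock-and-flow model. This is a sketch of my own for illustration, not one of Forrester’s actual models: a manager tries to hold a stock (say, inventory) near a target, but replenishment orders arrive only after a delay. The intuitively “decisive” remedy for a shortfall, reacting more aggressively, makes the system oscillate more violently, not less.

```python
# Toy system-dynamics model (illustrative sketch, not Forrester's own):
# a stock drained at a constant rate, replenished by orders that arrive
# after a fixed delay, under a proportional ordering policy.

def simulate(gain, steps=60, delay=4, target=100.0, demand=10.0):
    """Return the stock trajectory under a proportional ordering policy."""
    stock = 50.0
    pipeline = [0.0] * delay              # orders already placed, still in transit
    history = []
    for _ in range(steps):
        stock += pipeline.pop(0)          # the oldest order finally arrives
        stock -= demand                   # constant outflow
        # Policy: replace demand, plus a correction proportional to the gap.
        order = max(0.0, gain * (target - stock) + demand)
        pipeline.append(order)
        history.append(stock)
    return history

def swing(history):
    """Peak-to-trough amplitude over the second half of the run."""
    tail = history[len(history) // 2:]
    return max(tail) - min(tail)

cautious = simulate(gain=0.2)    # gentle correction: the oscillation dies out
aggressive = simulate(gain=0.9)  # "decisive" correction: the oscillation grows
```

The counter-intuitive result is that the aggressive policy, which looks more responsible at every step, produces much larger swings around the target, because each correction reacts to stale information: the gap it responds to is already being closed by orders still in the pipeline.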
This is as true in the physical sciences and economics as it is in the social sciences, giving today’s decision makers the benefit of much better tools. Tomorrow’s will be better still, and perhaps just in time to help us sort our way out of the mess that the five revolutions are bringing.
If we cannot distinguish between our ideological biases and meaningful evidence, then we are destined to be cast adrift by our illusions. If we do not distinguish between causal connections and coincidences, then we will constantly misidentify the true causes of our fortunes and misfortunes. This is worth emphasizing because both types of error occur chronically. These shortcomings are an example of what we might call “trained incapacity,” which is essentially a set of poor thinking habits that have been learned. We are incapable because we have been trained to be incapable, trained by a society that is rife with biases, including a preference for ideology and chronic lapses in rigor.
Happily, there is a different way. We train our capacity: we learn to gather sufficient evidence and then interpret it skillfully by decoding the relevant patterns. We map the intersections of events, forces, and ideas, which makes it possible to get a decent view of the future. The challenge, of course, is that we never know whether we have enough evidence, because the information available to us is always incomplete. Always.
This is why leaders develop both their skills and their judgment, and thereby increase their capacity to recognize facts, and also to make sound decisions even when they would prefer to wait for more information. And in the requirement that they make such judgments, they may be significantly aided by those who have already been thinking about the future.
This has always been one of the crucial roles that visionaries have played in society. As observers and advisors, as critics and challengers, as journalists and artists, those who have seen the future have frequently tried to point out its threats and dangers well beforehand.
In addition to his critique of group decision making, Irving Janis also gives examples of tension-filled situations in which the decision makers were rigorous, and he provides a helpful rubric of seven steps that constitute a sound and disciplined approach. These are essential skills:
- Thoroughly canvass a wide range of alternative courses of action.
- Survey the objectives and the values implicated.
- Weigh carefully the costs, drawbacks, and subtle risks of negative consequences, as well as the positive consequences, that could result.
- Search continuously for relevant information for evaluating alternatives.
- Take account of expert judgments, even when those judgments contradict initial preferences.
- Reexamine positive and negative consequences of the main alternatives, including actions that had initially been considered unacceptable.
- Make detailed execution plans, including contingency plans if things go wrong.
This, then, is our process for making decisions, and thus our method also of making predictions.
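Janis’s seven criteria lend themselves to being used as an explicit audit of a decision process. As one minimal sketch of that idea (the class, field names, and example decision below are our own illustration, not Janis’s terminology):

```python
# A minimal sketch of Janis's seven criteria for vigilant decision making,
# rendered as a checklist audit. Names here are illustrative assumptions.

from dataclasses import dataclass, field

JANIS_CRITERIA = [
    "canvassed a wide range of alternatives",
    "surveyed objectives and values",
    "weighed costs, risks, and benefits",
    "searched continuously for relevant information",
    "took account of expert judgments, even contrary ones",
    "reexamined all alternatives, including those first rejected",
    "made detailed execution and contingency plans",
]

@dataclass
class DecisionAudit:
    decision: str
    completed: set = field(default_factory=set)

    def mark(self, index: int) -> None:
        """Record that criterion `index` (0-6) has been satisfied."""
        self.completed.add(index)

    def gaps(self) -> list:
        """Return the criteria not yet satisfied."""
        return [c for i, c in enumerate(JANIS_CRITERIA)
                if i not in self.completed]

    def is_vigilant(self) -> bool:
        """Vigilant only if all seven criteria have been met."""
        return not self.gaps()

# Example: a process that skipped contingency planning.
audit = DecisionAudit("enter the new market")
for step in (0, 1, 2, 3, 4, 5):
    audit.mark(step)

print(audit.is_vigilant())  # False: one criterion remains unmet
print(audit.gaps())
```

The point of the sketch is simply that the rubric is checkable: a decision process is either complete against all seven criteria or it is not, and the gaps can be named.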
Changing Our Beliefs
Social norms are relevant and help to sustain society because they provide continuity and enable the sharing of gained knowledge across generations. This has been, as we saw, essential to our endurance as a species that is otherwise mediocre in its physical attributes, and it is the core of how we succeed in any particular context. Our ways of thinking, interacting, and deciding are all a function of where and how we live, and of the situations in which we find ourselves, and evolution has crafted us magnificently to occupy our niche.
The challenge we face today, however, is that our context, the world around us, is changing much faster than our genes can change. How will we handle that? We will have to change our beliefs.
Belief change comes from a combination of personal psychological readiness and a deeper social and cultural shift in the underlying zeitgeist, which is affected in part by education but is more the product of larger and harder-to-define political, economic, religious, and social changes.
The political, economic, religious, and social contexts are definitely changing, and quickly, but is the zeitgeist also changing? For that to happen, we may have to push it …
In particular, we have to develop and adopt more relevant beliefs about today’s and tomorrow’s worlds and how they work, and also develop our capacity to reason more effectively in a tense, complex, and adversarial decision-making environment, as Janis recommends.
This means, in effect, that we have to acknowledge that for the last 200 years we’ve been perfecting a particular type of economy that worked well in a particular situation in the history of a civilization on our particular planet, but now that the planet is changing and the economic context is changing fundamentally, we will soon have to make significant decisions about how to change our approach to civilization. This is no small challenge for us, a species aligned through social and historical norms that has attained the pinnacle of economic dominance and success. Just as we have mastered the game of today, the rules have changed.
The really good strategists have already begun to identify the new rules, and to inform us that we will soon be obliged to change our ways of thinking and acting or suffer the significantly unpleasant consequences. We will examine their evolving political, social, and economic maps and models in the next chapter, along with the actions that those maps may require of us.
 Henrich, Joseph. The Secret of Our Success: How Culture is Driving Human Evolution, Domesticating Our Species, and Making Us Smarter. Princeton University Press, 2016. P. 186.
 Janis, Irving L. Groupthink. Houghton Mifflin Company, 1982. Second Edition. P. 3.
 Shermer, Michael. The Believing Brain: From Ghosts and Gods to Politics and Conspiracies – How We Construct Beliefs and Reinforce Them as Truths. Times Books, 2011. P. 5.
 Mansfield, Guy. Developing Your Leadership Skills: From the Changing World to Changing the World. 2013.
 Taleb, Nassim Nicholas. Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets. Random House, 2004.
 Taleb, Nassim Nicholas. The Black Swan: The Impact of the Highly Improbable. Random House, 2007. P. 225.
 Taleb, Nassim Nicholas. The Black Swan: The Impact of the Highly Improbable. Random House, 2007. P. xvii.
 Janis, Irving L. Groupthink. Houghton Mifflin Company, 1982. Second Edition. P. 136.
 Shermer, Michael. The Believing Brain: From Ghosts and Gods to Politics and Conspiracies – How We Construct Beliefs and Reinforce Them as Truths. Times Books, 2011. P. 4.
Langdon Morris is Senior Partner of InnovationLabs, one of the world’s leading consulting firms working in the areas of strategy and innovation. He is author or coauthor of ten books on innovation. To learn more please visit www.innovationlabs.com.