Superstitious Learning and Groundhog Day

In the movie Groundhog Day (in case you haven't seen it), Bill Murray plays a narcissistic TV weatherman who is sent to Punxsutawney, PA, to report whether the groundhog, Punxsutawney Phil, sees his shadow. Instead, Murray finds himself repeating the same day, again and again. It doesn't matter what he does; he'll wake up again on Groundhog Day, in the same bed and breakfast, with the alarm clock radio waking him to the song "I Got You Babe." Recognizing his predicament, Murray begins to indulge every base impulse, leading to humorous scenes of him gorging on every donut in a diner and driving Punxsutawney Phil off a cliff in a pickup truck, among other sordid examples.

Throughout the movie, Bill Murray learns from experience that his selfish and indulgent behavior leads to repeated negative outcomes, including repulsing his love interest (Andie MacDowell). Slowly, over dozens of cycles, his character begins to transform into a "Renaissance Man." He learns to speak French and play the piano. Humorously, his helpful actions around town make him the target of a bidding war at a charity bachelor auction that evening. Needless to say, Andie MacDowell can't help but be impressed by Murray's redemptive transformation.

Contrast how Bill Murray learned from experience with how we learn in one-shot experiences. In one-shot experiences (e.g., choosing a college, a first job, a home), it is much costlier to "learn from experience" (see also Thaler, 2015, Chapter 6). Because we can't alter our approach slightly and see the results we get (as Bill Murray did), it is harder to learn what will produce better outcomes.

Without many feedback cycles to validly learn from experience, we are more prone to "superstitious learning" (March & Olsen, 1975), meaning that, in the search for causality, we attribute what we experience to some prior act. Maybe we received a "lucky rabbit's foot" as a gift before making a big, one-shot decision. We might superstitiously attribute the consequences we experience to that prior act. Or, more to the point, we might "learn" that the shadow of a groundhog somehow predicts future weather conditions. 🙂

Likewise, when we try to explain organizational performance, we may attribute the performance to some social norm. This may be plausible, but in other cases the norm is extraneous, with no causal impact. For example, I often teach a case study about Apple and the turnaround of the company by Steve Jobs. Jobs had many strengths and talents, but he also had some well-known tendencies to treat people in a less than respectful manner. It's cognitively tempting to attribute the success of Apple to the accountability created by his berating style of management. We'll superstitiously "learn" that a berating management style leads to superior performance. But in the one-shot experience of Steve Jobs turning around Apple, we cannot go back in time and see whether the company still would have succeeded without that style of management. Perhaps it was primarily Jobs' unique eye for talent and his focus on the beauty and simplicity of design that were causally important, and Apple succeeded in spite of his berating management style.

The point is to recognize whether we are learning like Bill Murray in a "Groundhog Day-like" environment, where we can improve with constant feedback and repetition (see also Kahneman & Klein, 2009), or whether we are in one-shot experiences. If we are in one-shot experiences, the likelihood of superstitious learning is high, heightening the need for tempered conclusions. It also heightens the need to learn how to learn. By this, I mean more systematically designing experiments to foster more valid learning. For example, rather than implementing only one option, we could (provided it is cost-effective) test two options, much like A/B testing in web analytics, where users receive one of two versions of a page and the designer can see which version leads to a higher click-through rate. By testing two options, we can more validly learn what produces better results, rather than making attributions that may be superstitiously derived.
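
To make the A/B testing idea concrete, here is a minimal sketch in Python. The function name and the traffic numbers are hypothetical, and the two-proportion z-test shown is just one common way to check whether the gap between two options is bigger than luck alone would produce:

```python
# Hedged illustration: a two-proportion z-test for an A/B test.
# All names and numbers are invented for the example.
from math import sqrt
from statistics import NormalDist

def ab_test_p_value(clicks_a: int, views_a: int, clicks_b: int, views_b: int) -> float:
    """Two-sided p-value for the difference in click-through rates of A and B."""
    rate_a = clicks_a / views_a
    rate_b = clicks_b / views_b
    # Pooled rate under the null hypothesis that both options perform equally
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (rate_b - rate_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical traffic: option B appears to outperform option A
p = ab_test_p_value(clicks_a=120, views_a=2400, clicks_b=165, views_b=2400)
print(f"p-value: {p:.4f}")  # a small p-value suggests the gap is not just luck
```

A small p-value is evidence, not proof, but the larger point stands: a designed comparison gives the kind of feedback that implementing a single option never could.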

So the next time you see yourself attributing causality (and, more importantly, taking action based on that attribution), ask yourself whether there are other plausible attributions, and how you know what you know. And if the situation affords the opportunity to test various options, work to design ways to more deliberately create feedback cycles and validly learn. You may not redemptively transform like Bill Murray does, but you'll increase the odds of reaching your goals.

References:

Kahneman, D., & Klein, G. (2009). Conditions for intuitive expertise: A failure to disagree. American Psychologist, 64(6), 515-526.

March, J. G., & Olsen, J. P. (1975). The uncertainty of the past: Organizational learning under ambiguity. European Journal of Political Research, 3, 147-171.

Thaler, R. H. (2015). Misbehaving: The making of behavioral economics. New York: Norton.

Don’t Tread on Me! Psychological Reactance as Omnipresent

If you've raised kids, or are trying to enact change in an organization, you know well the phenomenon of psychological reactance, although you probably don't know it by that name. You might call it the "terrible twos" or "change resistance."

Psychological reactance is the instantaneous reaction we have to being told what to do (Brehm & Brehm, 1981). This leads to some remarkable findings, one of which I came across while reading about correcting misinformation. As Lewandowsky, Ecker, Seifert, Schwarz, & Cook (2012) state:

"People generally do not like to be told what to think and how to act, so they may reject particularly authoritative retractions. For this reason, misinformation effects have received considerable research attention in a courtroom setting where mock jurors are presented with a piece of evidence that is later ruled inadmissible. When the jurors are asked to disregard the tainted evidence, their conviction rates are higher when an "inadmissible" ruling was accompanied by a judge's extensive legal explanations than when the inadmissibility was left unexplained (Pickel, 1995; Wolf & Montgomery, 1977)" (p. 116).

So, in this case, the mock jurors reacted to being told what to think, to the degree that they decided to use the inadmissible evidence just to spite the judge for speaking to them so authoritatively!

Given that the reaction seems instantaneous (and is present in most toddlers), is psychological reactance pre-wired? Are we born with a propensity to react against any limitation of our freedom? And if so, what would be the evolutionary justification for such a tendency? Jonathan Haidt offers one explanation in his book The Righteous Mind. He hypothesizes that psychological reactance evolved to avoid domination by an alpha male, which could help a collective survive. He cites a stunning passage from Christopher Boehm's Hierarchy in the Forest:

“A man named Twi had killed three other people, when the community, in a rare move of unanimity, ambushed and fatally wounded him in full daylight. As he lay dying, all of the men fired at him with poisoned arrows until, in the words of one informant, ‘he looked like a porcupine.’ Then, after he was dead, all the women as well as the men approached his body and stabbed him with spears, symbolically sharing the responsibility for his death” (Boehm as cited in Haidt 2012, p. 199).

There is a lot going on in this passage, but part of it is a community responding to potential domination by Twi. The difficult question to answer is whether societies (and individuals) that rose up against dominance were more likely to survive than those that passively accepted domination. Selection at a societal level is a controversial hypothesis, but if true, then psychological reactance may be a pre-wired disposition in us, making it all the more prevalent.

So, the next time you are seeking to implement a change in an organization, think of the villagers responding to Twi's attempted domination. You will surely be on the receiving end of psychological reactance. This is all the more reason to build cooperation and buy-in from the start by openly inviting contributions and participation. Also, beware of condescension. Much like the jurors using inadmissible evidence, people may be reacting not to the content of a change initiative but to the manner in which it is explained. Of course, I am aware of the irony of telling you what to do in a post about psychological reactance. Consider these "helpful suggestions." 🙂

References:

Brehm, S. S., & Brehm, J. W. (1981). Psychological reactance: A theory of freedom and control. New York: Academic Press.

Haidt, J. (2012). The righteous mind: Why good people are divided by politics and religion. New York: Pantheon Books.

Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131.

Pickel, K. L. (1995). Inducing jurors to disregard inadmissible evidence: A legal explanation does not help. Law and Human Behavior, 19, 407–424.

Wolf, S., & Montgomery, D. A. (1977). Effects of inadmissible evidence and level of judicial admonishment to disregard on the judgments of mock jurors. Journal of Applied Social Psychology, 7, 205–219.

When Do We Change Our Minds? Think of a Jenga Tower

I recently read Philip Tetlock’s excellent book Superforecasting: The art and science of prediction. (1) It is about improving our ability to make forecasts, which includes updating our beliefs as we learn new information about how the future may unfold.

One section of the book discusses belief updating, and Tetlock (along with his co-author Dan Gardner) uses a helpful metaphor for when we will or will not change our minds when confronted with new information. They say:

“Commitment can come in many forms, but a useful way to think of it is to visualize the children’s game Jenga, which starts with building blocks stacked one on top of another to form a little tower. Players take turns removing building blocks until someone removes the block that topples the tower. Our beliefs about ourselves and the world are built on each other in a Jenga-like fashion” (p. 162).

With this metaphor, they discuss how, if you are forecasting within your specialty, you can be more reluctant to discard certain beliefs when the domain is tightly connected with your identity and self-worth.

I like the metaphor of beliefs as a Jenga tower because it's easy to be pessimistic that we'll never change our minds when confronted with information that conflicts with our beliefs. In political science, researchers even find a strengthening of prior beliefs when people are confronted with conflicting evidence, called the "backfire" effect (Nyhan & Reifler, 2010). For example, if we are strong supporters of gun control or another hot-button issue, and we receive information that conflicts with our preexisting beliefs, we'll argue against it, thus strengthening our prior beliefs. This is certainly true in many circumstances, but occasionally we will update our views (even those close to the base of the Jenga tower) as "incongruency builds" over time (see Redlawsk, Civettini, & Emmerson, 2010). Yes, for many issues we won't budge (at least in the short term), because budging would bring the whole tower crashing down, but for beliefs near the top, we'll slowly remove and update some of them in accordance with the evidence we are presented with.
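
To give a rough quantitative flavor to how "incongruency builds," here is a minimal sketch of Bayesian belief updating in Python. This is my own illustration, not Tetlock's, and the likelihood numbers are invented; the point is only that a confidently held belief barely moves after one piece of conflicting evidence, yet drifts substantially as such evidence accumulates:

```python
# Rough illustration (not from Tetlock): sequential Bayesian updating of a
# confidently held belief. The likelihood values are invented for the example.

def bayes_update(prior: float, likelihood_if_true: float, likelihood_if_false: float) -> float:
    """Posterior probability of the belief after one piece of evidence (Bayes' rule)."""
    numerator = likelihood_if_true * prior
    return numerator / (numerator + likelihood_if_false * (1 - prior))

belief = 0.95  # a near-foundational belief, held with high confidence
for piece in range(1, 11):
    # Each observation is twice as likely if the belief is false as if it is true
    belief = bayes_update(belief, likelihood_if_true=0.3, likelihood_if_false=0.6)
    print(f"after {piece:2d} pieces of conflicting evidence: belief = {belief:.2f}")
# One piece leaves the belief around 0.90; ten pieces drive it below 0.05.
```

In Jenga terms, no single pull topples a block near the base, but a long, consistent run of conflicting evidence can eventually loosen even a confidently held belief.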

So, the next time you seem to be in an intractable conflict, think about where you are in the Jenga tower (and what this says about belief centrality). If you are near the base of someone's self-worth and deeply held identity, you have little chance of dislodging a core belief (at least in the moment). Instead, move up the tower toward less foundational assumptions. For example, we can discuss whether tax credits for low-emission vehicles are a worthy policy to combat pollution without having to debate the causes of climate change (see Lewandowsky, Ecker, Seifert, Schwarz, & Cook, 2012). This may be a more fruitful (and hopeful) way of conversing in domains that seem intractable, and eventually changing our (and others') minds.

(1) Don't be put off by the heroically dramatic title. The book is highly nuanced and impressively grounded. I'd put it in the same category as Kahneman's masterful Thinking, Fast and Slow.

References:

Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131.

Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32, 303-330.

Redlawsk, D. P., Civettini, A. J. W., & Emmerson, K. M. (2010). The affective tipping point: Do motivated reasoners ever “get it”? Political Psychology, 31(4), 563-593.

Tetlock, P. E., & Gardner, D. (2015). Superforecasting: The art and science of prediction. New York: Crown Publishers.

The Intuition of Truth

Whenever I finish a movie "based on a true story," there comes a moment when the credits begin rolling and I want to know how much of it was true. This happened recently after I finished "The Founder," about Ray Kroc and how he turned McDonald's into an American institution. It led to a brief internet search to learn about the deal that was signed (or not signed) with the original McDonald brothers and whether the movie's portrayal resembled reality. We only have so much time, of course, to dig into the "facts" of any movie "based on a true story," but the point remains that we have an intuitive reaction about the "truth." We'll certainly allow some "creative license," but we still want the major points to largely resemble what occurred.

Likewise, I heard a talk by the author Tim O'Brien around the fall of 2010. He is the author of, among other books, The Things They Carried, a collection of short stories about soldiers during the Vietnam War. I had read the book in the 1990s and was moved by its reflective tales of what war must have been like. During his talk, he told an amusing story about his young son peeing into a wastepaper basket in their family bathroom. I believe the basket was of the wicker variety, so urine pooled all over the bathroom floor. I do not remember the exact reason his son did this, but the reason was humorous, along with the story's numerous other particulars. The room was in laughter. When he finished, however, he said, "I just made that story up. It never happened." He went on to say that it didn't matter whether it happened or not; what mattered was the emotional truth of the story. I can remember vehemently disagreeing. It did matter.

Both of these scenarios (movies based on a true story and Tim O'Brien's "tale" of his son) sparked an intuitive reaction: that it matters whether something is true or not. But knowing "the truth" is very complex, of course. There are many meanings of the word "truth" (Horwich, 2010, 2013). As Horwich (2013) describes, truth has been defined as "correspondence with the facts," as "provability," as "practical utility," and as "stable consensus," but "all turned out to be defective in one way or another—either circular or subject to counterexamples" (para. 12). Despite the contested nature of truth, we don't want to "throw the baby out with the bathwater."

This brings me to the excellent TED talk by Michael Patrick Lynch titled “How to see past your own perspective and find truth.” Lynch, a philosophy professor at the University of Connecticut, states:

“Skepticism about truth…is a bit of self-serving rationalization disguised as philosophy. It confuses the difficulty of being certain with the impossibility of truth. Look — of course it’s difficult to be certain about anything…But in practice, we do agree on all sorts of facts. We agree that bullets can kill people. We agree that you can’t flap your arms and fly. We agree — or we should — that there is an external reality and ignoring it can get you hurt.”

Thus, while truth has many meanings and it is difficult to be certain about anything, we still need "truth" as a regulatory ideal. At the very least, a belief in truth is functional in that it helps us get to the bottom of issues and develop a more solid footing for our beliefs and opinions. Asking of any claim "is it true?" helps spark the critical thinking and dialogue needed to overcome a confirmatory stance toward the world (1). So we shouldn't throw out the notion of truth with postmodern skepticism, even if truth is difficult to ascertain and the word itself means many different things. We should, at least, listen to our intuitive reaction and curiosity about whether something is "true" (i.e., whether it actually occurred), and not so easily abandon this ideal.

(1) See Dalio (2017) or Smerek (2017) for embedding the social norm of “pursuing truth” in an organizational setting.

References:

Dalio, R. (2017). Principles: Life and work. New York: Simon & Schuster.

Horwich, P. (2010). Truth–meaning–reality. Oxford, UK: Oxford University Press.

Horwich, P. (2013, March 3). Was Wittgenstein right? The New York Times.

Lynch, M. P. (2017, April). How to see past your own perspective and find truth. TED.com

Smerek, R. E. (2017). Organizational learning and performance: The science and practice of building a learning culture. New York: Oxford University Press.

Forget Intelligence. Aim for Mental Complexity.

After reading the entire Oxford English Dictionary over the course of a year (a mere 20 volumes and 21,730 pages!), Ammon Shea was asked by NPR host Tom Ashbrook about some of the more compelling words he had learned.

Shea mentioned “apricity,” which is the warmth of the sun in the winter. As he says:

“[Apricity] is a word that I sincerely hope I will never work into general conversation…I am not a fan of using big words for their own sake. However, I do find now that in the [wintertime] as I am bathed in that warmth of the winter sun, I am more cognizant of the fact that it’s happening, knowing there is a word with which to describe it” (Shea, 2008).

After learning the word apricity, Shea began to experience the world with more cognizance. You've likely had the same experience when learning a new concept about workplace dynamics. Perhaps you've learned about personality dimensions such as introversion-extraversion, or how power dynamics can be at play in subtle ways. After learning these new ideas, you experience workplace dynamics with more cognizance. You see more and experience more, just like Ammon Shea experiencing the warmth of the winter sun.

However, when trying to explain any kind of mental development, we often attribute that development to an increase in "intelligence." Researchers in the field of adult development discuss a different form of intellectual competence: mental complexity (see Kegan & Lahey, 2009). Mental complexity is the variety of perspectives, concepts, and vocabulary we have to make sense of the world. With greater mental complexity, we can perceive more and take more effective action; we expand our action repertoire. As Weick put it, "No one is ever free to do something he can't think of" (1979, p. 193).

Of course, there have been many critiques of intelligence that have tried to take it off its explanatory pedestal. Most notable is Keith Stanovich's (2009) impressive work on rational thinking habits that are dissociable from traditional measures of intelligence. In addition, Carol Dweck (2006), with her research on the growth mindset, has helped us question our assumptions about intelligence, in particular how an implicit belief that intelligence is a fixed quantity limits whether we seek challenges and persist, because challenges might reveal what is innately lacking. If, instead, we view the mind as a muscle capable of growing, we are more likely to pursue challenges to expand our "intelligence." As important as this may be, we are still trapped within the frame of intelligence.

If, however, we aim for mental complexity, we can forgo any implicit assumptions we might have about intelligence and more freely expand our cognizance of what we are experiencing and what is occurring in the organizations in which we work.

What does this mean you should do? Increase the number of educational actions you take: read more, and persist through confusion to explain what you are experiencing with a more nuanced vocabulary. Do this not to increase your intelligence, but to expand your mental complexity.

References:

Dweck, C. S. (2006). Mindset: The new psychology of success. New York: Random House.

Kegan, R., & Lahey, L. L. (2009). Immunity to change: How to overcome it and unlock the potential in yourself and your organization. Boston, MA: Harvard Business School Publishing.

Shea, A. (2008, August 9). Reading the OED. On Point with Tom Ashbrook. [Audio Podcast] Retrieved from http://onpoint.legacy.wbur.org/2008/08/19/reading-the-oed

Stanovich, K. E. (2009). What intelligence tests miss: The psychology of rational thought. New Haven, CT: Yale University Press.

Weick, K. E. (1979). The social psychology of organizing (2nd ed.). Reading, MA: Addison-Wesley.

“Oh no, not another ‘learning experience’”

Some 15 years ago, while driving in Boston, I saw a bumper sticker that said:

“Oh no, not another ‘learning experience.’”

I still smile at the sentiment. We know we should learn from failure, but euphemisms such as having a "learning experience" seem tone-deaf in the face of the many disappointments and setbacks we encounter. Sure, at some point we'll make sense of the disappointment, but immediately glorifying failure as a "learning experience" doesn't seem to respect the disappointment itself.

I never met the owner of the car, but you have to wonder what sentiment and life experience led them to joyfully paste that phrase on their bumper. There is a forlorn stance in the opening "oh no," and a humorous refusal to face yet another "learning experience."

But the phrase points to something we've all experienced: many of the most important things we have learned in life have come from failure. In fact, when I ask students to write a reflective paper on what they have learned from failure, they find it can be tough to even identify "pure" failures in their lives. What was once a failure becomes transformed as we reinterpret the past and take alternative actions that lead to redemptive paths. We "failed" by getting rejected by one school, but were accepted at another and flourished.

Failure is a momentary label, and its impact can be transformed by what is learned—both individually and in the organizations in which we work. So in the face of disappointments, mistakes, errors, and failures, see the developmental potential in them, and while you still might bemoan another “learning experience,” make it exactly that.

Ryan Smerek is an assistant professor and assistant director of the MS in Learning and Organizational Change program at Northwestern University. He is the author of Organizational Learning and Performance: The Science and Practice of Building a Learning Culture.
