What Happens When Two Competing Ideas Are True in Managing People?

In teaching and in writing, I use a fair number of continuums to help move us off the traps of binary thinking. Our language tends to come in competing constructs that can obscure more nuanced views of reality: nature or nurture, fixed or growth mindset, smart or not smart, and so on.* Sometimes these binary categories adequately represent reality when there are discrete shifts, but more often they pull us toward untenable extremes, open to so many counterexamples that we are whipsawed back and forth before admitting that both sides are partially true (e.g., it’s nature and nurture).

While continuums are useful, helping us see how we might move back and forth along some binary given the context, there is another way of directly addressing the limits of our language: understanding paradox.

If you are like me, however, when someone begins speaking about paradoxes, it seems like an overreaching attempt at being profound. I’ll try to avoid this trap as I describe what I think is a compelling research study on paradoxical leader behaviors (PLBs) that was recently published in the Academy of Management Journal.

In the article, Yan Zhang and colleagues outline five dimensions of paradoxical leader behaviors (I’ve included a sample measurement item for each in quotes; a toy scoring sketch follows below):

  1. Treats subordinates uniformly while allowing individualization: “Assigns equal workloads, but considers individual strengths and capabilities to handle different tasks” (p. 548).
  2. Combines self-centeredness with other-centeredness: “Is confident regarding personal ideas and beliefs, but acknowledges that he or she can learn from others” (p. 548).
  3. Maintains decision control while allowing autonomy: “Makes decisions about big issues, but delegates lesser issues to subordinates” (p. 548).
  4. Enforces work requirements while allowing flexibility: “Is highly demanding regarding work performance, but is not hypercritical” (p. 548).
  5. Maintains both distances and closeness: “Maintains distance from subordinates at work, but is also amiable toward them” (p. 548).

These behaviors help us see continuing tensions that occur in management. You want high standards, but not to the extreme of being hypercritical; you want close relationships, but not best friendships; and you want to give employees autonomy, but balanced with control. Zhang et al. (2015) describe these as paradoxes because “rather than being ‘either–or,’ all things, including problems and challenges, are interrelated as ‘both–and’” (p. 539).
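Since each of the five dimensions above is assessed with rating items like those quoted, it is easy to picture how a combined PLB measure (like the one the study itself uses below) might be built. Here is a toy sketch in Python, assuming subordinates rate their supervisor on a 1–5 scale and that scores are simply averaged; the dimension labels and numbers are mine, and this illustrates the idea rather than Zhang et al.’s exact scoring procedure:

```python
# A toy sketch of rolling subordinate ratings (1-5 Likert) up into a
# combined PLB score; simple averaging is an assumption for illustration.
from statistics import mean

# Hypothetical data: each list holds three subordinates' ratings of one
# supervisor on one PLB dimension.
ratings = {
    "uniformity with individualization": [4, 5, 4],
    "self- with other-centeredness":     [3, 4, 4],
    "control with autonomy":             [4, 4, 5],
    "requirements with flexibility":     [3, 3, 4],
    "distance with closeness":           [4, 5, 5],
}

dimension_scores = {dim: mean(vals) for dim, vals in ratings.items()}
combined_plb = mean(dimension_scores.values())  # the combined measure

for dim, score in dimension_scores.items():
    print(f"{dim}: {score:.2f}")
print(f"Combined PLB score: {combined_plb:.2f}")
```

A real study would add reliability checks and validation, but the basic shape of a combined score is the same.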

You might argue, however, that a person in a leadership role should flexibly shift their approach in either direction given the context, making this just a repackaging of situational leadership and contingency theory. In response, the authors argue that choosing between these tensions should not be perceived as a “necessary evil,” as situational theorists might see it; instead, “to sustain long-term effectiveness, leaders must accept and harmonize paradoxes simultaneously” (p. 539). I’m not sure that contingency theorists would see choosing as a “necessary evil,” but the authors make a fair point about the need to harmonize the tensions and see them as occurring simultaneously, rather than shifting behaviors discretely at different times. Then again, perhaps it is both: in some conditions you are balancing these competing tensions, and in others you are shifting discretely.

Nevertheless, the research brings clarity to what can be hidden tensions, and the authors’ balanced approach helps overcome polemics about management derived from a selective reading of research (e.g., “Give everyone autonomy”).

While these paradoxical leader behaviors are descriptively helpful, Zhang et al. (2015) also find that supervisors who scored higher on a combined measure of paradoxical leader behaviors had positive effects on their subordinates. Specifically, in a sample of 76 supervisors and 588 subordinates across six companies in China, the subordinates of such supervisors showed higher work role performance, as measured by scales of proficient, adaptive, and proactive behavior. But why would this be the case?

The authors argue, first, that a leader serves as a role model “to show employees how to accept and embrace contradictions in complex environments” (p. 545), which leads to more effective action. Second, the authors argue that paradoxical leader behaviors more effectively create bounded and discretionary work environments. As they state, “bounded environments stress norms and standards whereby followers understand their work roles and responsibilities. At the same time, PLB gives followers discretion and individuality within the structure, allows them to be the focus of influence to maintain their dignity and confidence” (p. 545).

All of this sounds intuitively plausible and brings an Eastern perspective to harmonizing competing tensions. It also helps build additional nuance into our thinking (see an earlier post on mental complexity).

As you think about the five paradoxical leader behaviors listed above, which one resonates most with you? Are there other paradoxes related to managing that you would include? When does an “either/or” perspective obscure the phenomenon, such that “both/and” is needed?

References:

Walker, B. M., & Winter, D. A. (2007). The elaboration of personal construct psychology. Annual Review of Psychology, 58, 453-477.

Zhang, Y., Waldman, D. A., Han, Y-L., & Li, X-B. (2015). Paradoxical leader behaviors in people management: Antecedents and consequences. Academy of Management Journal, 58(2), 538-566.

*This is often referred to as our “personal constructs” derived from George Kelly’s (1955) The Psychology of Personal Constructs. For an updated overview of personal construct psychology, see Walker & Winter (2007).

Ryan Smerek is an assistant professor and assistant director of the MS in Learning and Organizational Change program at Northwestern University. He is the author of Organizational Learning and Performance: The Science and Practice of Building a Learning Culture.

Photo Source: Pixabay/Geralt

What Explains the Puzzle of Independent Thinking?

If you were like me in the aftermath of the 2008 financial crisis, you often heard (and believed) that no one saw it coming. Prominent firms went bankrupt with thousands of people working in them, people who had every reason to see the risk they were taking. If no one saw it coming, the financial crisis must have been the result of hidden and overwhelming financial complexity, beyond anyone’s ability to see.

Reading Michael Lewis’s book The Big Short dealt a dramatic blow to this naïve belief. Many individuals did see it coming, and theirs was more than a half-formed opinion: they made million-dollar bets based on their analysis. But why did the individuals portrayed in the book have it figured out while so many others were caught unaware?

I posed this question to an old college friend who has spent his career at prominent investment banks and hedge funds in New York City. He responded that you had to be an outsider to see it. To a large degree, the individuals in The Big Short were outsiders to Wall Street, although not exclusively, as seen in Steve Carell’s character in the movie adaptation.

So what might account for the independent thinking that allowed these individuals to see what others couldn’t? While this research is still early, there is likely a cluster of thinking habits (and situational conditions) that foster independent thinking. Some of these thinking habits are outlined in the “employee voice” literature in the field of management. Employee voice occurs when individuals speak up at work, sometimes about ethical violations, but sometimes because they see things differently and have the courage to speak their minds. (For an overview of the employee voice field, see Chamberlin, Newton, & LePine, 2017.)

Some of the thinking habits in the employee voice literature include having a proactive personality and a desire to make improvements, but these hardly explain the puzzle of independent thinking, especially in the face of the financial crisis.

One clue is that our thinking and motivation are powerfully shaped by the desire to be part of a social group. We are strongly motivated to belong and to be part of an “Inner Ring” (Lewis, 1944). This means we tend to agree with the dominant view and desperately fear being ostracized (Eisenberger, Lieberman, & Williams, 2003). Those who are outsiders are free of these constraints. This is certainly one plausible candidate to partly explain independent thinking.

In research I have done on employee voice, many interviewees discuss a spontaneous response that breaks with the social norms of their immediate reference group. The reason often given for this response is “integrity.” Respondents mention not being able to live with “integrity” if they didn’t speak their mind and defend a core value. There would be an unacceptable split between how they were acting and what they truly believed.

While relative immunity to the pull of the “Inner Ring” and a sense of integrity are both important, I think there was something else at play with the individuals who saw the coming financial crisis. For lack of a better term, I would call it a “commitment to the truth”: a dogged persistence to look at reality as clearly as possible and to listen for any hint of delusion in ourselves and in those around us.

It’s hard to know how a “commitment to the truth” develops; perhaps through exposure to scientific thinking, or by being part of an organization committed to truth (see Dalio, 2017). There are likely many routes to this disposition. If you have some candidates, please share them in the comments below.

References:

Chamberlin, M., Newton, D. W., & LePine, J. A. (2017). A meta-analysis of voice and its promotive and prohibitive forms: Identification of key associations, distinctions, and future research directions. Personnel Psychology, 70, 11-71.

Dalio, R. (2017). Principles: Life and work. New York: Simon & Schuster.

Eisenberger, N. I., Lieberman, M. D., & Williams, K. D. (2003). Does rejection hurt? An fMRI study of social exclusion. Science, 302, 290–292.

Lewis, C. S. (1944). The Inner Ring. Memorial Lecture at King’s College, University of London. Retrieved from http://www.lewissociety.org/innerring.php.

Ryan Smerek is an assistant professor and assistant director of the MS in Learning and Organizational Change program at Northwestern University. He is the author of Organizational Learning and Performance: The Science and Practice of Building a Learning Culture.

Photo Source: Clker/Pixabay

Superstitious Learning and Groundhog Day

In the movie Groundhog Day (if you haven’t seen it), Bill Murray plays a narcissistic TV weatherman who is sent to Punxsutawney, PA, to report on whether the groundhog, Punxsutawney Phil, sees his shadow. Murray, instead, finds himself repeating the same day again and again. No matter what he does, he wakes up again on Groundhog Day, in the same bed and breakfast, with the clock radio waking him to the song “I Got You Babe.” Recognizing his predicament, Murray begins to indulge every base impulse, leading to humorous scenes of Murray gorging on every donut in a diner and driving Punxsutawney Phil off a cliff in a pickup truck, among other sordid examples.

Throughout the movie, Bill Murray learns from experience that his selfish and indulgent behavior leads to repeated negative outcomes, including repulsing his love interest (Andie MacDowell). Slowly, over dozens of cycles, his character transforms into a “Renaissance man.” He learns to speak French and to play the piano. Humorously, his helpful actions around town make Murray the target of a bidding war at a charity bachelor auction that evening. Needless to say, Andie MacDowell can’t help but be impressed by Murray’s redemptive transformation.

Contrast how Bill Murray learned from experience with one-shot experiences. When we are in one-shot experiences (e.g., choosing a college, a first job, a home), it is much costlier to “learn from experience” (see also Thaler, 2015, Chapter 6). Because we can’t alter our approach slightly and observe the results (as Bill Murray could), it is harder to learn what will produce better outcomes.

Without many feedback cycles to validly learn from experience, we are more prone to “superstitious learning” (March & Olsen, 1975), meaning that in the search for causality we attribute what we experience to some prior act. Maybe we received a “lucky rabbit’s foot” as a gift before making a big, one-shot decision; we might superstitiously attribute the consequences we experience to that gift. Or, more to the point, we might “learn” that the shadow of a groundhog somehow predicts future weather conditions. 🙂

Likewise, when we try to explain organizational performance, we may attribute that performance to some social norm. This may be plausible, but in other cases the norm is extraneous, with no causal impact. For example, I often teach a case study about Apple and the turnaround of the company by Steve Jobs. Jobs had many strengths and talents, but he also had a well-known tendency to treat people in a less than respectful manner. It’s cognitively tempting to attribute the success of Apple to the accountability created by his berating style of management; we superstitiously “learn” that a berating management style leads to superior performance. But in the one-shot experience of Steve Jobs turning around Apple, we cannot go back in time and see whether the company would still have succeeded without that style. Perhaps it was primarily Jobs’s unique eye for talent and his focus on the beauty and simplicity of design that mattered causally, and Apple succeeded in spite of his berating management style.

The point is to recognize whether we are learning like Bill Murray in a “Groundhog Day”-like environment, where we can improve with constant feedback and repetition (see also Kahneman & Klein, 2009), or whether we are in one-shot experiences. If we are in one-shot experiences, the likelihood of superstitious learning is high, heightening the need for tempered conclusions. It also heightens the need to learn how to learn. By this, I mean more systematically designing experiments to foster more valid learning. For example, rather than implementing only one option, we would (where cost-effective) test two options, much like A/B testing in web analytics, where users receive one of two options and the designer can see which leads to higher click-through rates. By testing two options, we can more validly learn what produces better results, rather than making attributions that may be superstitiously derived.
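To make the A/B logic concrete, here is a minimal sketch in Python of how one might compare two options once the results are tallied. The function name and all the counts are hypothetical, and the two-proportion z-test is just one standard way to check whether a difference in click-through rates is larger than chance alone would produce:

```python
# A minimal sketch of analyzing an A/B test; all numbers are hypothetical.
from statistics import NormalDist

def ab_test(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-test comparing two click-through rates."""
    rate_a, rate_b = clicks_a / views_a, clicks_b / views_b
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    std_err = (pooled * (1 - pooled) * (1 / views_a + 1 / views_b)) ** 0.5
    z = (rate_b - rate_a) / std_err
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided test
    return rate_a, rate_b, p_value

rate_a, rate_b, p = ab_test(clicks_a=120, views_a=2400,
                            clicks_b=165, views_b=2400)
print(f"CTR A: {rate_a:.1%}, CTR B: {rate_b:.1%}, p-value: {p:.3f}")
```

With these made-up numbers, option B’s higher click-through rate comes with a small p-value, so we would be on firmer ground attributing the difference to the design itself rather than to luck (i.e., to superstition).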

So the next time you catch yourself attributing causality (and, more importantly, taking action based on that attribution), ask yourself whether there are other plausible attributions and how you know what you know. And, when the situation affords the opportunity to test various options, work to deliberately design feedback cycles so you can validly learn. You may not redemptively transform as Bill Murray does, but you’ll increase the odds of reaching your goals.

References:

Kahneman, D., & Klein, G. (2009). Conditions for intuitive expertise: A failure to disagree. American Psychologist, 64(6), 515-526.

March, J. G., & Olsen, J. P. (1975). The uncertainty of the past: Organizational learning under ambiguity. European Journal of Political Research, 3, 147-171.

Thaler, R. H. (2015). Misbehaving: The making of behavioral economics. New York: Norton.

Ryan Smerek is an assistant professor and assistant director of the MS in Learning and Organizational Change program at Northwestern University. He is the author of Organizational Learning and Performance: The Science and Practice of Building a Learning Culture.

Photo Source: Alexas_Fotos/Pixabay

Don’t Tread on Me! Psychological Reactance as Omnipresent

If you’ve raised kids or have tried to enact change in an organization, you know well the phenomenon of psychological reactance, although you probably don’t know it by that name. You might call it the “terrible twos” or “change resistance.”

Psychological reactance is the instantaneous reaction we have to being told what to do (Brehm & Brehm, 1981). This leads to some remarkable findings, one of which I came across while reading about correcting misinformation. As Lewandowsky, Ecker, Seifert, Schwarz, & Cook (2012) state:

“People generally do not like to be told what to think and how to act, so they may reject particularly authoritative retractions. For this reason, misinformation effects have received considerable research attention in a courtroom setting where mock jurors are presented with a piece of evidence that is later ruled inadmissible. When the jurors are asked to disregard the tainted evidence, their conviction rates are higher when an “inadmissible” ruling was accompanied by a judge’s extensive legal explanations than when the inadmissibility was left unexplained (Pickel, 1995; Wolf & Montgomery, 1977)” (p. 116).

So, in this case, the mock jurors reacted to being told what to think, to the degree that they decided to use the inadmissible evidence just to spite the judge for speaking to them so authoritatively!

With a reaction this instantaneous (and present in most toddlers), is psychological reactance pre-wired? Are we born with a propensity to react against any limit on our freedom? And if so, what would be the evolutionary justification for such a tendency? Jonathan Haidt offers one explanation in his book The Righteous Mind. He hypothesizes that psychological reactance evolved to avoid domination by an alpha male, which could help a collective survive. He cites a stunning passage from Christopher Boehm’s Hierarchy in the Forest:

“A man named Twi had killed three other people, when the community, in a rare move of unanimity, ambushed and fatally wounded him in full daylight. As he lay dying, all of the men fired at him with poisoned arrows until, in the words of one informant, ‘he looked like a porcupine.’ Then, after he was dead, all the women as well as the men approached his body and stabbed him with spears, symbolically sharing the responsibility for his death” (Boehm as cited in Haidt 2012, p. 199).

There is a lot going on in this passage, but in part it shows a community responding to potential domination by Twi. The difficult question is whether societies (and individuals) that rose up against dominance were more likely to survive than those that passively accepted it. Selection at the societal level is a controversial hypothesis, but if true, then psychological reactance may be a pre-wired disposition in us, making it all the more prevalent.

So, the next time you are seeking to implement a change in an organization, think of the villagers responding to Twi’s attempted domination. You will surely be on the receiving end of psychological reactance. This is all the more reason to build cooperation and buy-in from the start by openly inviting contributions and participation. Also, beware of condescension. Like the jurors using inadmissible evidence, people may be reacting not to the content of a change initiative but to the manner in which it is explained. Of course, I am aware of the irony of telling you what to do in a post about psychological reactance. Consider these “helpful suggestions.” 🙂

References:

Brehm, S. S., & Brehm, J. W. (1981). Psychological reactance: A theory of freedom and control. New York: Academic Press.

Haidt, J. (2012). The righteous mind: Why good people are divided by politics and religion. New York: Pantheon Books.

Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131.

Pickel, K. L. (1995). Inducing jurors to disregard inadmissible evidence: A legal explanation does not help. Law and Human Behavior, 19, 407–424.

Wolf, S., & Montgomery, D. A. (1977). Effects of inadmissible evidence and level of judicial admonishment to disregard on the judgments of mock jurors. Journal of Applied Social Psychology, 7, 205–219.

Ryan Smerek is an assistant professor and assistant director of the MS in Learning and Organizational Change program at Northwestern University. He is the author of Organizational Learning and Performance: The Science and Practice of Building a Learning Culture.

Photo Source: Chris Karidis/Unsplash

When Do We Change Our Minds? Think of a Jenga Tower

I recently read Philip Tetlock’s excellent book Superforecasting: The art and science of prediction. (1) It is about improving our ability to make forecasts, which includes updating our beliefs as we learn new information about how the future may unfold.

One section of the book discusses belief updating, and Tetlock (along with his co-author Dan Gardner) uses a helpful metaphor for when we will or will not change our minds when confronted with new information. They write:

“Commitment can come in many forms, but a useful way to think of it is to visualize the children’s game Jenga, which starts with building blocks stacked one on top of another to form a little tower. Players take turns removing building blocks until someone removes the block that topples the tower. Our beliefs about ourselves and the world are built on each other in a Jenga-like fashion” (p. 162).

With this metaphor, they discuss how, when you are forecasting within your specialty, you can be more reluctant to discard certain beliefs if the domain is tightly connected with your identity and self-worth.

I like the metaphor of beliefs as a Jenga tower because it’s easy to be pessimistic that we’ll never change our minds when confronted with information that conflicts with our beliefs. In political science, researchers even find a strengthening of prior beliefs when people are confronted with conflicting evidence, called the “backfire” effect (Nyhan & Reifler, 2010). For example, if we are strong supporters of gun control or another hot-button issue, when we receive information that conflicts with our preexisting beliefs, we argue against it, thus strengthening our prior beliefs. This is certainly true in many circumstances, but occasionally we will update our views (even those close to the base of the Jenga tower) as “incongruency builds” over time (see Redlawsk, Civettini, & Emmerson, 2010). Yes, for many issues we won’t budge (at least in the short term), because budging would bring the whole Jenga tower crashing down; but for other beliefs near the top, we’ll slowly remove and update some of them in accordance with the evidence we are presented with.
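As a rough illustration of how incongruency can build, here is a minimal sketch in Python of Bayesian belief updating. This is my gloss on the tipping-point dynamic, not a model taken from Redlawsk et al., and every probability in it is hypothetical:

```python
# A minimal sketch of belief updating via Bayes' rule; all numbers are
# hypothetical and chosen only to illustrate the dynamic.
def update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability of a belief after one piece of evidence."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

belief = 0.95  # a strongly held belief, near the base of the tower
for i in range(1, 6):
    # each observation is twice as likely if the belief is false
    belief = update(belief, p_evidence_if_true=0.3, p_evidence_if_false=0.6)
    print(f"After {i} incongruent observation(s): belief = {belief:.2f}")
```

No single observation topples the belief (it only drops from 0.95 to 0.90 at first), but the accumulation eventually does, which is one way to read the “affective tipping point.”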

So, the next time you seem to be in an intractable conflict, think about where you are in the Jenga tower (and what this says about belief centrality). If you are near the base of someone’s self-worth and deeply held identity, you have little chance of dislodging a core belief (at least in the moment). Instead, move up the Jenga tower toward less foundational assumptions. For example, we can discuss whether tax credits for low-emission vehicles are a worthy policy to combat pollution without having to agree on, or debate, the causes of climate change (see Lewandowsky, Ecker, Seifert, Schwarz, & Cook, 2012). This may be a more fruitful (and hopeful) way of conversing in domains that seem intractable, and of eventually changing our (and others’) minds.

(1) Don’t be put off by the heroically dramatic title. The book is highly nuanced and impressively grounded. I’d put it in the same category as Kahneman’s masterful Thinking, Fast and Slow.

References:

Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131.

Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32, 303-330.

Redlawsk, D. P., Civettini, A. J. W., & Emmerson, K. M. (2010). The affective tipping point: Do motivated reasoners ever “get it”? Political Psychology, 31(4), 563-593.

Tetlock, P. E., & Gardner, D. (2015). Superforecasting: The art and science of prediction. New York: Crown Publishers.

Ryan Smerek is an assistant professor and assistant director of the MS in Learning and Organizational Change program at Northwestern University. He is the author of Organizational Learning and Performance: The Science and Practice of Building a Learning Culture.

Photo Source: Guma89/Wikimedia Commons

The Intuition of Truth

Whenever I finish a movie “based on a true story,” there comes a moment, as the credits begin rolling, when I want to know how much of it was true. This happened recently after I finished The Founder, about Ray Kroc and how he turned McDonald’s into an American institution. It led to a brief internet search about the deal that was signed (or not signed) with the original McDonald brothers and whether the movie’s portrayal resembled reality. We only have so much time, of course, to dig into the “facts” of any movie “based on a true story,” but the point remains that we have an intuitive reaction about the “truth.” We’ll certainly allow some creative license, but we still want the major points to largely resemble what occurred.

Likewise, I heard a talk by the author Tim O’Brien around the fall of 2010. He is the author of, among other books, The Things They Carried, a collection of short stories about soldiers during the Vietnam War. I had read the book in the 1990s and was moved by its reflective tales of what war must have been like. During his talk, he told an amusing story about his young son peeing into a wastepaper basket in their family bathroom. I believe the basket was of the wicker variety, so that urine pooled all over the bathroom floor. I do not remember exactly why his son did this, but the reason was humorous, along with the numerous other particulars of the story. The room was in laughter. When he finished, however, he said, “I just made that story up. It never happened.” He went on to say that it didn’t matter whether it happened or not; it was the emotional truth of the story. I remember vehemently disagreeing. It did matter.

Both of these scenarios (movies based on a true story and Tim O’Brien’s “tale” of his son) sparked an intuitive reaction: it matters whether something is true or not. But knowing “the truth” is very complex, of course. There are many meanings of the word “truth” (Horwich, 2010, 2013). As Horwich (2013) describes, truth has been defined as “correspondence with the facts,” as “provability,” as “practical utility,” and as “stable consensus,” but “all turned out to be defective in one way or another—either circular or subject to counterexamples” (para. 12). Despite the contested nature of truth, we don’t want to throw the baby out with the bathwater.

This brings me to the excellent TED talk by Michael Patrick Lynch titled “How to see past your own perspective and find truth.” Lynch, a philosophy professor at the University of Connecticut, states:

“Skepticism about truth…is a bit of self-serving rationalization disguised as philosophy. It confuses the difficulty of being certain with the impossibility of truth. Look — of course it’s difficult to be certain about anything…But in practice, we do agree on all sorts of facts. We agree that bullets can kill people. We agree that you can’t flap your arms and fly. We agree — or we should — that there is an external reality and ignoring it can get you hurt.”

Thus, while truth has many meanings and it is difficult to be certain about anything, we still need “truth” as a regulative ideal. At the very least, a belief in truth is functional: it helps us get to the bottom of issues and develop a more solid footing for our beliefs and opinions. Asking of any claim “Is it true?” helps spark critical thinking and dialogue that overcome a confirmatory stance toward the world (1). So we shouldn’t throw out the notion of truth with postmodern skepticism, even if truth is difficult to ascertain and the word itself means many different things. We should, at least, listen to our intuitive reaction and curiosity about whether something is “true” (i.e., did it actually occur?), and not so easily abandon this ideal.

(1) See Dalio (2017) or Smerek (2017) on embedding the social norm of “pursuing truth” in an organizational setting.

References:

Dalio, R. (2017). Principles: Life and work. New York: Simon & Schuster.

Horwich, P. (2010). Truth–meaning–reality. Oxford, UK: Oxford University Press.

Horwich, P. (2013, March 3). Was Wittgenstein right? The New York Times.

Lynch, M. P. (2017, April). How to see past your own perspective and find truth. TED.com

Smerek, R. E. (2017). Organizational learning and performance: The science and practice of building a learning culture. New York: Oxford University Press.

Ryan Smerek is an assistant professor and assistant director of the MS in Learning and Organizational Change program at Northwestern University. He is the author of Organizational Learning and Performance: The Science and Practice of Building a Learning Culture.

Photo Source: Dino Reichmuth/Unsplash

Forget Intelligence. Aim for Mental Complexity.

Having read the entire Oxford English Dictionary over the course of a year (a mere 20 volumes and 21,730 pages!), Ammon Shea was asked by NPR host Tom Ashbrook about some of the more compelling words he had learned.

Shea mentioned “apricity,” which is the warmth of the sun in the winter. As he says:

“[Apricity] is a word that I sincerely hope I will never work into general conversation…I am not a fan of using big words for their own sake. However, I do find now that in the [wintertime] as I am bathed in that warmth of the winter sun, I am more cognizant of the fact that it’s happening, knowing there is a word with which to describe it” (Shea, 2008).

After learning the word apricity, Shea began to experience the world with more cognizance. You’ve likely had the same experience in learning a new concept about workplace dynamics. Perhaps you’ve learned about personality dimensions such as introversion-extraversion or how power dynamics can be at play in subtle ways. After learning these new ideas you experience workplace dynamics with more cognizance. You see more and experience more—just like Ammon Shea experiencing the warmth of the winter sun.

However, when trying to explain any kind of mental development, we often attribute it to an increase in “intelligence.” Researchers in the field of adult development discuss a different form of intellectual competence: mental complexity (see Kegan & Lahey, 2009). Mental complexity is the variety of perspectives, concepts, and vocabulary we have for making sense of the world. With greater mental complexity, we can perceive more and take more effective action; we expand our action repertoire. As Weick (1979) put it, “No one is ever free to do something he can’t think of” (p. 193).

Of course, there have been many critiques of intelligence that have tried to take it off its explanatory pedestal. Most notable is Keith Stanovich’s (2009) impressive work on the rational thinking habits that are dissociable from traditional measures of intelligence. In addition, Carol Dweck (2006), with her research on the growth mindset, has helped us question our assumptions about intelligence, in particular how an implicit belief in intelligence as a fixed quantity limits whether we seek challenges and persist, because challenges might reveal what is innately lacking. If, instead, we view the mind as a muscle capable of growing, we are more likely to pursue challenges to expand our “intelligence.” As important as this may be, we are still trapped within the frame of intelligence.

If, however, we seek to expand our mental complexity, we can forgo any implicit assumptions we might have about intelligence. Instead, by aiming for mental complexity, we can more freely expand our cognizance of what we are experiencing and what is occurring in the organizations in which we work.

What does this mean you should do? Increase the number of educational actions you take: read more, and persist through confusion to explain what you are experiencing with a more nuanced vocabulary. Do this not to increase your intelligence, but to expand your mental complexity.

References:

Dweck, C. S. (2006). Mindset: The new psychology of success. New York: Random House.

Kegan, R., & Lahey, L. L. (2009). Immunity to change: How to overcome it and unlock the potential in yourself and your organization. Boston, MA: Harvard Business School Publishing.

Shea, A. (2008, August 9). Reading the OED. On Point with Tom Ashbrook. [Audio Podcast] Retrieved from http://onpoint.legacy.wbur.org/2008/08/19/reading-the-oed

Stanovich, K. E. (2009). What intelligence tests miss: The psychology of rational thought. New Haven, CT: Yale University Press.

Weick, K. E. (1979). The social psychology of organizing (2nd ed.). Reading, MA: Addison-Wesley.

Ryan Smerek is an assistant professor and assistant director of the MS in Learning and Organizational Change program at Northwestern University. He is the author of Organizational Learning and Performance: The Science and Practice of Building a Learning Culture.

Photo Source: Kelly Sikkema/Unsplash