Metacognition Changed Our Lives — For Good and Bad

On Friday, your correspondent hosted a party and, typically, began to inflict his ideas on unsuspecting strangers. One guest — a Muscovite cognitive scientist researching early childhood learning at the Harvard Ed school — turned out to be a very willing victim. A long conversation about cognitive science ensued, centering on the value of “metacognition” — that is, thinking about thinking.

A little background: One of the most important Important Facts about the Modern World is the rise and dispersion of psychology. Throughout our history, we humans have always talked about our selves. But before the 20th century, our major tool for doing so was literature. When we sought to understand our lives and feelings, we looked for comparisons in myths and stories, we borrowed metaphors and constructed our own, we compared and shared our experiences with our acquaintances through conversation in an utterly unscientific manner; and with the rise of novels, we began to compare our inner experiences to those imputed to the characters within. But, post-Freud, educated people everywhere have instead begun to appropriate the vocabulary of psychology to understand themselves. We started with pseudo-scientific Freudian and Jungian lexica, and have since progressed to the language of brain chemistry. E.g., in the olden days, when we had a question like “why do couples cuddle?” we would maybe use a saccharine metaphor about a vine and a tree or borrow the myth of Aristophanes; from the 1920s through the 1970s or so we would talk about our relationship with our mothers or archetypes or something; today we tell a story about oxytocin and pair bonding and evolutionary psychology.

Is this revolutionary change in how we think about ourselves — and our new fixation on thinking about our brains — a good thing? Yes and no.

Your correspondent and interlocutor agreed on the main benefits of metacognition. The single biggest, most important takeaway from cognitive science (though many earlier philosophers, most notably Nietzsche, happened upon the same insight without scientific pretensions) is that the human mind does not reflect reality, but rather reconstructs it from a number of different sources — perception, ideas, biases, social cues, and all kinds of motivations. That is, our mind is less like a mirror and more like a painter: A typical painting obviously in some ways corresponds to the subject on which it is based, but it differs from a direct reflection (or re-presentation) according to, for example, the aspects of the scene the painter selected to bring within his frame, the aspects he has chosen to focus on, his attitude toward the scene and the subjects within, the other painters and schools of painting he identifies with and has learned from, the painters whom he hates and wishes to dissociate from, his own idiosyncrasies, the limitations of his own hand and brushes and palette, and his own desire to win fame by cultivating a unique style. To get a little cheesy about it, we can follow Oscar Wilde in saying that a portrait reveals more of the painter than the sitter. (The key problem with this metaphor is that people can easily see how a painting differs from the actual object it represents, whereas we are, tautologically, incapable of directly seeing how our mind’s reconstruction of reality differs from real reality.) This goes all the way down to how, for example, basic visual perception works: We naively assume that we are ‘taking in’ our entire field of vision all of the time, but in fact our brain only directly perceives the objects of our most intense focus and unexpected changes in the field of vision, while it continually reconstructs the rest based on memories, expectations, and other senses.

When you start to think about your mind this way — and combine that basic model of the brain with more specific examples and insights from other parts of the brain sciences, such as social psychology — it really does change your life. Before you write someone off as a jerk, you start to ask questions like: “Why am I perceiving this person as a jerk? Well, look at that jerkish behavior! But, then again, ‘jerkish’ is a matter of interpretation. If he had a different facial structure would I feel quite so sure? If my best friend did the exact same thing, would I consider it jerkish behavior? Or would I, feeling a little warmer toward the subject, be more likely to write it off as playfulness? Or might I attribute the behavior to a rare product of particularly bad, uncontrollable circumstances?” And then, you start to think things like this: “I think this group is wicked: Are they actually wicked? Yes, of course! I read all kinds of news reports about the wicked things they do! But wait. Could it be that, for whatever reason, I am disproportionately attentive to news stories about the wicked things this group does? Or, beyond that, could people in the news media be disproportionately attentive to, hence disproportionately inclined to report on, the wicked things this group does? In short, what do the actual rigorous statistics say?” Where your correspondent has written ‘this group’ the reader should insert, according to her own animosities, words like ‘people from neighborhood X’ or ‘Democrats’ or ‘Republicans’ or ‘evangelicals’ or ‘Jewish financiers’ or ‘working class toughs’ or ‘rich people’ or ‘football players’ or ‘lesbians’ or ‘XYZ activists’ or whatever.

And it also affects every interaction in social life: “That person at this party does not like me — Execute avoidance maneuvers. But am I sure he does not like me? Well, his body language is standoffish. Then again, so is mine. But that’s just because I know he does not like me! Wait: Maybe the problem is he thinks I do not like him? How to fix that?” In short, when you fully internalize the idea that your mind is a painter and not a mirror, it makes you much, much more skeptical about yourself and all of your ideas, and, accordingly, less assertive in your animosities, more willing to give people the benefit of the doubt, and, where applicable, more interested in actual scientific methods for figuring things out. This kind of mindset is really important for basic social life, but it could also have hugely useful social consequences. It could do everything from, e.g., (1) getting more people to take advantage of cheap housing in putatively ‘bad’ but statistically safe neighborhoods (which, in practice, would also lead to greater integration), to (2) reducing international conflict, as people become more cognizant of how xenophobia could be biasing their perceptions of other countries’ intentions, etc.

Does your correspondent sound pretty rapturous about the potential of metacognition for human comity and happiness, etc.? He is. Your correspondent and interlocutor also agreed that, in the digital present and the digital future, in which it is safe to assume that a piece of information that cannot be found via Google within 10 minutes probably does not exist, metacognition will become an increasingly essential intellectual tool. Since in a few years we will all have Google in our eyeglasses, we’re not going to need to be trained to get or retain information, but we will need to be very good at culling it, interpreting it, and preventing our own brains from distorting it.

***

But as the euphoria of the celebration subsided, your correspondent, the next day, considered three downsides to our society’s shift from literature to psychology as a way of understanding ourselves:

1. The more we conceive of all aspects of our personality and behaviors as brain-chemistry based, the easier it is to talk ourselves into feeling that we have no control over our flaws. And that makes it easier for us to excuse, hence less likely to change, those flaws. Your correspondent has actually read an interesting study (momentarily unfindable) in which researchers had test subjects read a paper that powerfully argued that humans have no free will: afterwards, the test subjects exhibited less motivation on tasks of self-control, ethical restraint, delayed gratification, etc. The belief that we are at the whims of our brain chemistry can be a self-fulfilling prophecy. There’s an obvious trade-off here: Insofar as brain chemistry actually does cause some people’s problems, it’s cruel and unfair to hold them accountable for those problems. But if we’re too lax, we harm people by giving them too-easy excuses.

2. The popularization of psychology has been infuriating in its invasion of political discourse. The conversations we really should be having in politics are about policies themselves. “Is policy X good or bad?” should be the question that constrains every political conversation. But, partly as a result of the popularization of psychology (though this may be just a result of nasty partisanship more generally) we are spending much more time talking about people and their presumed motivations. Using the ostensibly scientific language of psychology can be a neat way to veil the most bitter demonizations of your political opponents. Are Obama’s economic policies motivated by a professorial resentment of the rich? Are sanctions against Iran motivated in part by anti-Muslim animus? It doesn’t really matter for policy itself. What matters is whether the policies are just, legal, and likely to have better effects than their alternatives. And the more we pathologize our opponents’ psychology, the less time we have for discussion of those things. Does this sound like a pet peeve? It is.

3. The brain sciences broadly are displacing the humanities, reducing the time the educated public devotes to the latter. Taken to its extreme, this could be bad for at least three reasons: (i) The literary imagination is an intrinsically pleasurable and worthwhile thing. The story about oxytocin and the evolution of pair bonding is scientifically true, but Aristophanes’ myth is a bit more touching, and may be worth revisiting and passing down to the generations just for that. (ii) Given that our brains have evolved to learn from narratives, stories may inevitably be more resonant and memorable for us and therefore better learning devices as well; to get really good at thinking about the inner lives of humans, we probably still need to supplement our psychology textbook with some Jane Austen. (iii) Since psychology is a relatively new field, there are almost certainly insights about our selves contained in the “embodied wisdom” of literary traditions that psychology has not yet re-discovered. Given all that, it’s probably a bad idea to throw out the humanities just yet.

***

On the whole, metacognition is a hugely good thing for mankind. Part of the solution to the problems above is just to do metacognition more deeply and intelligently — that is, to be smarter and more knowledgeable about brain science. For example, the problem in #1 could be mitigated if we were more cognizant of and guarded against our own susceptibility to suggestion. Problem #2 can be alleviated precisely by recognizing Problem #2 — that is, by admitting that we often use ‘psychology’ as a handy veil for plain hostile demonizations of others. Brain science itself provides the best argument to prevent Problem #3, by illuminating how we learn from stories.


10 thoughts on “Metacognition Changed Our Lives — For Good and Bad”

  1. “Insofar as brain chemistry actually does cause some people’s problems, it’s cruel and unfair to hold them accountable for those problems.”

    This leads to an interesting case against the insanity defense. If the law presumes determinism/materialism, then saying “my brain made me do it!” is not very meaningful, since, um, everyone’s brain made them do everything.

    Number 2 is a great point. The Erik Corollary to this (yes– I went there) is when your opponent finds a way to ascribe your behavior to genetics. Boom! See, he can’t even /help/ but act that way! That’s so /like/ him!

    • “This leads to an interesting case against the insanity defense. If the law presumes determinism/materialism, then saying “my brain made me do it!” is not very meaningful, since, um, everyone’s brain made them do everything.”
      –Exactly. Determinism has hugely troubling moral implications.

  2. Re: “Your correspondent has actually read an interesting study (momentarily unfindable) in which researchers had test subjects read a paper that powerfully argued that humans have no free will: afterwards, the test subjects exhibited less motivation on tasks of self-control, ethical restraint, delayed gratification, etc. The belief that we are at the whims of our brain chemistry can be a self-fulfilling prophecy.”

    I know the NY Times wrote this study up, and I tried to find it too but couldn’t. Seems pretty obvious that it has been removed from the internet by some very powerful people who don’t want us to know that we have free will.

  3. Metacognition isn’t a tool you can just whip out on a whim. The problem with your argument is that many of the biases you say can be solved by knowing about and correcting for them through metacognition (confirmation bias, self-fulfilling prophecy, emotional bias, etc.) can’t easily be solved just by knowing that they play a part in your own decision making. It’s also important to consider that the first step in metacognition, knowing that these biases exist, is only possible because the research that has shed light on these biases came from external observation, not internal (metacognitive introspection resulted in Freudian psychology, which we now know is 100% untrue). Sure, now that we have read up on these psychological pitfalls, we can consciously try to train ourselves to correct for them after the awareness step. However, behavioral execution requires significant training and cognitive resources, and there’s a lot of research showing that as our cognitive resources are depleted, cognitive biases of all sorts become stronger. Unless we are monks who can clear our minds of distractions and focus on correcting these extremely strong biases, our cognitive resources are often maxed out. Of course, training your mind to correct for these biases is not useless, BUT I think much more effective ways are to develop social mechanisms (e.g., stickk.com, or using shame as a tool to keep one’s self from overeating), external architecture mechanisms (putting up a barrier so physical characteristics don’t influence judges’ opinions of an auditioning candidate), and computers to help us make more optimal decisions.

    • Mark, brilliant, extremely illuminating stuff. Agreed on all points: We can never fully overcome our biases, and so ultimately we’ll need to design institutions and procedures that filter out our biases, rather than just hope that individuals will all become less biased over time.

  4. “reducing international conflict, as people become more cognizant of how xenophobia could be biasing their perceptions of other countries’ intentions, etc.” <– metacognition can go both ways (right or wrong). this happens to be a way in which theory of mind results in a good consequence (because policymakers recognize that xenophobia exists and this recognition does not depend on metacognition but rather education)

    "Are Obama’s economic policies motivated by a professorial resentment of the rich?" <— this is theory of mind gone "wrong" or resulting in a less desirable outcome for policy. People create their own theories of what is going on in Obama's head.

    It's also important to point out that "theory of mind" does not equal "psychology" does not equal "metacognition". Metacognition in the strict sense refers to knowing about knowing (and is on an individual level). Theory of mind is using your mind to think about what others may be thinking (whether we extrapolate theories of mind using our own minds as models has been investigated, and the answer is no: there are various other ways in which we construct what others may be thinking). All humans have extremely well developed theory of mind capabilities, and this ability to think about human motivations outside of one's self is what some people erroneously refer to as "psychology". That policymakers consider the motivations of others in their decision-making and come up with erroneous theories of mind does not show that people are using "psychology" wrongly in policy circles. This is just people being people. Using psychological research, which comes out of rigorous empirical methods, has huge benefits in debiasing decisions that might otherwise come out of policymakers trying to "guess" how policies will affect individuals behaviorally and emotionally (which is how it was done for a long, long time until the last two decades, when behavioral research has become more important).

    • Awesome points, and once again, agreed on all.

      You’re right that I’m being a bit lazy in my uses of “metacognition” and “psychology” and “theory of mind.” I’ll just clarify that part of the problem is that this post tries to mix intellectual history and brain science. While psychology, metacognition, and theory of mind are all separate concepts, it does seem to me that they have been historically bundled together — i.e., as ‘psychology’ has begun to displace the humanities among the educated public, educated people have become more inclined to implicitly think about what’s going on in other people’s brain chemistry. I guess I’m really saying that the brain has become more salient in all of our thinking about everything. So, yes, while imputing motives to one’s political opponents is partly just ‘people being people,’ it seems to me it also has to do with the new intellectual preeminence of psychology. That’s not to say that we can “blame” psychologists for nasty, politicized imputations of motives. It just means that there are (small and outweighed) potential downsides to the overall intellectual shift which has made the physical brain more salient.
