On Friday, your correspondent hosted a party and, typically, began to inflict his ideas on unsuspecting strangers. One guest — a Muscovite cognitive scientist researching early childhood learning at the Harvard Ed school — turned out to be a very willing victim. A long conversation about cognitive science ensued, centering on the value of “metacognition” — that is, thinking about thinking.
A little background: One of the Important Facts about the Modern World is the rise and dispersion of psychology. Throughout our history, we humans have always talked about our selves. But before the 20th century, our major tool for doing so was literature. When we sought to understand our lives and feelings, we looked for comparisons in myths and stories, we borrowed metaphors and constructed our own, we compared and shared our experiences with our acquaintances through conversation in an utterly unscientific manner; and with the rise of novels, we began to compare our inner experiences to those imputed to the characters within. But, post-Freud, educated people everywhere have instead begun to appropriate the vocabulary of psychology to understand themselves. We started with pseudo-scientific Freudian and Jungian lexica, and have since progressed to the language of brain chemistry. E.g., in the olden days, when we had a question like “why do couples cuddle?” we would maybe use a saccharine metaphor about a vine and a tree or borrow the myth of Aristophanes; from the 1920s through the 1970s or so we would talk about our relationship with our mothers or archetypes or something; today we tell a story about oxytocin and pair bonding and evolutionary psychology.
Is this revolutionary change in how we think about ourselves — and our new fixation on thinking about our brains — a good thing? Yes and no.
Your correspondent and interlocutor agreed on the main benefits of metacognition. The single biggest, most important takeaway from cognitive science (though many earlier philosophers, most notably Nietzsche, happened upon the same insight without scientific pretensions) is that the human mind does not reflect reality, but rather, reconstructs it from a number of different sources — perception, ideas, biases, social cues, and all kinds of motivations. That is, our mind is less like a mirror and more like a painter: A typical painting obviously in some ways corresponds to the subject on which it is based, but it differs from a direct reflection (or re-presentation) according to, for example, the aspects of the scene the painter selected to bring within his frame, the aspects he has chosen to focus on, his attitude toward the scene and the subjects within, the other painters and schools of painting he identifies with and has learned from, the painters whom he hates and wishes to dissociate from, his own idiosyncrasies, the flaws of his own hand coordination and brushes and palette, and his own desire to win fame by cultivating a unique style. To get a little cheesy about it, we can follow Oscar Wilde in saying that a portrait reveals more of the painter than the sitter. (The key problem with this metaphor is that people can easily see how a painting differs from the actual object it represents, whereas we are, tautologically, incapable of directly seeing how our mind’s reconstruction of reality is different from real reality.) This goes all the way down to how, for example, basic visual perception works: We naively assume that we are ‘taking in’ our entire field of vision all of the time, but in fact our brain really only directly visually perceives the objects of our most intense focus and unexpected changes in the field of vision, while it continually reconstructs the rest based on memories and expectations and other senses.
When you start to think about your mind this way — and combine that basic model of the brain with more specific examples and insights from other parts of the brain sciences, such as social psychology — it really does change your life. Before you write someone off as a jerk, you start to ask questions like: “Why am I perceiving this person as a jerk? Well, look at that jerkish behavior! But, then again, ‘jerkish’ is a matter of interpretation. If he had a different facial structure would I feel quite so sure? If my best friend did the exact same thing, would I consider it jerkish behavior? Or would I, feeling a little warmer toward the subject, be more likely to write it off as playfulness? Or might I attribute the behavior to a rare product of particularly bad, uncontrollable circumstances?” And then, you start to think things like this: “I think this group is wicked: Are they actually wicked? Yes, of course! I read all kinds of news reports about the wicked things they do! But wait. Could it be that, for whatever reason, I am disproportionately attentive to news stories about the wicked things this group does? Or, beyond that, could people in the news media be disproportionately attentive to, hence disproportionately inclined to report on, the wicked things this group does? In short, what do the actual rigorous statistics say?” Where your correspondent has written ‘this group’ the reader should insert, according to her own animosities, words like ‘people from neighborhood X’ or ‘Democrats’ or ‘Republicans’ or ‘evangelicals’ or ‘Jewish financiers’ or ‘working class toughs’ or ‘rich people’ or ‘football players’ or ‘lesbians’ or ‘XYZ activists’ or whatever.
And it also affects every interaction in social life: “That person at this party does not like me — Execute avoidance maneuvers. But am I sure he does not like me? Well, his body language is standoffish. Then again, so is mine. But that’s just because I know he does not like me! Wait: Maybe the problem is he thinks I do not like him? How to fix that?” In short, when you fully internalize the idea that your mind is a painter and not a mirror, it makes you much, much more skeptical about yourself and all of your ideas, and, accordingly, less assertive in your animosities, more willing to give people the benefit of the doubt, and, where applicable, more interested in actual scientific methods for figuring things out. This kind of mindset is really important for basic social life, but it could also have hugely useful social consequences. Its effects could range from, e.g., (1) getting more people to take advantage of cheap housing in putatively ‘bad’ but statistically safe neighborhoods (which, in practice, would also lead to greater integration), to (2) reducing international conflict, as people become more cognizant of how xenophobia could be biasing their perceptions of other countries’ intentions, etc.
Does your correspondent sound pretty rapturous about the potential of metacognition for human comity and happiness, etc.? He is. Your correspondent and interlocutor also agreed that, in the digital present and the digital future, in which it is safe to assume that a piece of information that cannot be found via Google within 10 minutes probably does not exist, metacognition will become an increasingly essential intellectual tool. Since in a few years we all will have Google in our eyeglasses, we’re not going to need to be trained to get or retain information, but we will need to be very good at culling it, interpreting it, and preventing our own brains from distorting it.
But the next day, as the euphoria of the celebration subsided, your correspondent considered three downsides to our society’s shift from literature to psychology as a way of understanding ourselves:
1. The more we conceive of all aspects of our personality and behaviors as brain-chemistry based, the easier it is to talk ourselves into feeling that we have no control over our flaws. And that makes it easier for us to excuse, hence less likely to change, those flaws. Your correspondent has actually read an interesting study (momentarily unfindable) in which researchers had test subjects read a paper that powerfully argued that humans have no free will: afterwards, the test subjects exhibited less motivation on tasks of self-control, ethical restraint, delayed gratification, etc. The belief that we are at the whims of our brain chemistry can be a self-fulfilling prophecy. There’s an obvious trade-off here: Insofar as brain chemistry actually does cause some people’s problems, it’s cruel and unfair to hold them accountable for those problems. But if we’re too lax, we harm people by giving them too-easy excuses.
2. The popularization of psychology has been infuriating in its invasion of political discourse. The conversations we really should be having in politics are about policies themselves. “Is policy X good or bad?” should be the question that constrains every political conversation. But, partly as a result of the popularization of psychology (though this may be just a result of nasty partisanship more generally) we are spending much more time talking about people and their presumed motivations. Using the ostensibly scientific language of psychology can be a neat way to veil the most bitter demonizations of your political opponents. Are Obama’s economic policies motivated by a professorial resentment of the rich? Are sanctions against Iran motivated in part by anti-Muslim animus? It doesn’t really matter for policy itself. What matters is whether the policies are just, legal, and likely to have better effects than their alternatives. And the more we pathologize our opponents’ psychology, the less time we have for discussion of those things. Does this sound like a pet peeve? It is.
3. The brain sciences broadly are displacing the humanities, reducing the time the educated public devotes to the latter. Taken to its extreme, this could be bad for at least three reasons: (i) The literary imagination is an intrinsically pleasurable and worthwhile thing. The story about oxytocin and the evolution of pair bonding is scientifically true, but Aristophanes’ myth is a bit more touching, and may be worth revisiting and passing down through the generations just for that. (ii) Given that our brains have evolved to learn from narratives, stories may inevitably be more resonant and memorable for us and therefore better learning devices as well; to get really good at thinking about the inner lives of humans, we probably still need to supplement our psychology textbooks with some Jane Austen. (iii) Since psychology is a relatively new field, there are almost certainly insights about our selves contained in the “embodied wisdom” of literary traditions that psychology has not yet re-discovered. Given all that, it’s probably a bad idea to throw out the humanities just yet.
On the whole, metacognition is a hugely good thing for mankind. Part of the solution to the problems above is just to do metacognition more deeply and intelligently — that is, to be smarter and more knowledgeable about brain science. For example, the problem in #1 could be mitigated if we were more cognizant of, and guarded against, our own susceptibility to suggestion. Problem #2 can be alleviated precisely by recognizing Problem #2 — that is, by admitting that we often use ‘psychology’ as a handy veil for plain hostile demonizations of others. And brain science itself provides the best argument against Problem #3, by illuminating how we learn from stories.