You're a Mind Reader
You are a Mind Reader. Those words in the first chapter of Nicholas Epley's Mindwise sound more like "You're a wizard, Harry" than a renowned psychology professor's magnum opus. But he makes a good point:
Every day, we read others' minds without even thinking about it. From the moment I wake up (and dress to impress - but what's gonna impress others?) to the moment I go to sleep (and wonder why my friend was so irritable today), I x-ray people's skulls, inferring what they might think, feel & want.
And the list goes on:
- Saying the right thing to motivate your team (requires knowing their motives)
- Saying the right thing to impress your mother-in-law (requires knowing her beliefs & preferences)
This inherent ability to look into other people's heads has enabled humans to become the most socially connected species to ever walk this planet. But its ease is also its downfall. Mind reading is so automatic, so ingrained in our day-to-day, that we rarely even notice we're doing it - and barely ever question its results. The result: a ridiculous overconfidence in its accuracy. For example, married couples playing the Newlywed Game predicted an accuracy of 80%, when their actual accuracy was 30% (and random answering would've given 20%)¹.
This is not to say that the ability is useless - humans consistently perform above chance in pretty much any mind reading task they are given. That's impressive! They're just never as accurate as they believe themselves to be. But why is that?
The brain doesn’t correct for what it doesn’t know
As could be expected from a process as complex as x-raying someone else's skull, the errors causing our shortcomings are quite broad. However, one of the themes I repeatedly came across was the problem of not knowing what you don't know. This sounds stupid, I know. But I don't make the rules. Here's how it shows up in the three most common sources of error:
Excessive Egocentrism
The realisation that other people have other perspectives is something we grow into, not grow up with. We all remember playing Hide & Seek with a kid who is completely confident in their hiding spot: right on the couch with their hands over their eyes - I can't see you, so you can't see me.
For the longest time, psychology was convinced that this egocentrism is something we grow out of - surely, adults wouldn't be stupid enough to believe something like that, right? Well, as Epley et al. cleverly demonstrated in a 2004 study², our intuitive thinking remains egocentric - we just get better at overcoming that reflex through contemplative thinking. The brain makes inferences from our own perspective first, only to correct them to fit other perspectives later.
The problem is that these corrections routinely come too late & too little, leaving lots of egocentric residue in our judgements. A (non-exhaustive) selection of all the wonderful things this causes are:
- Overclaiming³,
- the Spotlight Effect & the Curse of Knowledge, and my personal favourite:
- People assuming their own attitudes & beliefs to be god's attitudes & beliefs - and if theirs change, god's change⁴.
That experiment is definitely worth its own article - there’s a lot to unpack - but the key takeaway is that the less we know about another mind (and with god, there’s not a lot of knowing involved), the more we project our own - but our brain doesn’t tell us. I mean, no one in their right mind would tell you that “god just believes what I believe” - but apparently he does.
Spectacular Stereotypes
Our opinions span a much bigger space than what we could ever directly observe. People will happily tell you what the average person thinks, even if they've never gone out and surveyed 10,000 random people. George W. Bush was certain about Saddam Hussein's intentions, even though the two never had dinner together. To imagine the minds of these distant, relatively unknown others, the mind pieces together a picture from what you've read or heard from others.
And this should, theoretically, make for wonderful results - humans are remarkably good at extracting averages from a group: in a split second, people can reliably determine the average size of a group of circles⁵. Sadly, however, people are not circles 😒. In a group of circles, no circle is out on a smoke break, no circle has to be read about in the news, and all circles are honest. Also, you don't have to judge circles' invisible features like beliefs or intentions.
The result of this imperfect world is beautifully illustrated in a 2011 study⁶: Respondents were shown pairs of pie charts representing the wealth distributions of three "hypothetical countries" and were asked to pick the country they'd prefer if they were randomly assigned to one of its wealth quintiles.
Unsurprisingly, people tend to like equality (92% chose the Sweden-like distribution over the US-like one). Also unsurprisingly, the results seem to confirm common stereotypes:
- Liberals were more likely to pick Sweden than conservatives were
- Relatively poor people were more likely to pick Sweden than relatively rich people were
- Women were more likely to pick Sweden than men
What makes this study worth mentioning is the magnitude of these differences - and what people predicted them to be:
- The difference between conservatives & liberals was 3.5% (Predicted: 35%)
- The difference between rich & poor was 3% (Predicted: 40%)
- The gender difference was 2% (Predicted: 12%)
The brain is great at extracting average tendencies from a complicated world. But (and this but is huge): the expected magnitude of those differences will predictably be wildly wrong! Our brain assumes circle circumstances, but the real world is just more complex. And since we don't correct for what we don't know, the result is an oversimplified mental model of the world in which "being a conservative" becomes 100% of a person's personality and differences become absurdly exaggerated. The mechanisms behind stereotypes deserve their own article for sure.
Ambiguous Actions
For the longest time, people believed that the earth was flat - and who could blame them: seeing its roundness requires a broader field of view than our eyes can achieve at ground level. After all, you can't infer what you can't see, right?
In my first semester of social psych, we learned about the fundamental attribution error. Sounds eerie, doesn't it? Also known as the correspondence bias, this refers to the false belief that a person's actions correspond directly to their mind. If someone's rude to you at a party, that's because they're a rude person, right? Makes sense. But what if their dog passed away yesterday? What if they just got fired today? Maybe both?
Inferring that a person is rude because they acted rude is like inferring the earth is flat because it looks flat - you disregard what you don’t know about the person/earth. People’s actions are heavily influenced by context, but that context is often hard to see. A 1977 study⁷ illustrated this so beautifully that I just have to share it:
The Setup: A mock quiz show
- Volunteers made up a studio audience watching a quiz show
- One volunteer is randomly chosen to be the questioner and instructed to go backstage and think of 10 very difficult trivia questions (on their own - this was before Google existed)
- One volunteer is randomly chosen to be the contestant, who then has to answer these very difficult questions on the show
- As expected, the contestants only averaged 4/10 correct answers to these quite difficult questions (e.g. "How many islands are in the Philippines?")
- All involved then had to rate how knowledgeable the contestant and the questioner were
The Result: Looking bright - Looking dim
First of all, it’s important to keep in mind that there was no real difference in intellect between questioners and contestants. They were randomly chosen. And everyone knew that. But well, one of them was asking smart questions that someone else couldn’t answer - looking very bright. The other one was struggling to answer those smart questions - looking pretty dim.
And so, the studio audience rated the questioners to be exceptionally knowledgeable, whereas the contestants barely made it above average. Even the contestants rated the questioners to be more knowledgeable than themselves.
Funnily enough, the only group that didn't assume a difference in intellect was the questioners, since they'd sat backstage struggling to come up with questions in their own fields of expertise. They knew that their field of expertise doesn't map perfectly onto others'. They knew the context best.
Our mind does not correct its confidence for what it doesn’t know. You might project your own attitudes, then sprinkle in a little stereotype and completely discard the context - and still be sure you’ve done a great job at mind reading. Remember, these processes happen almost automatically, and the results are delivered to your conscious mind as a fact. And the underlying mechanisms are so ingrained in our thinking that psychologists call them “fundamental”. So sadly, we’re not getting rid of that overconfidence anytime soon. So what can we do to improve our actual accuracy?
Perspective-Getting > Perspective-Taking
When it comes to improving our mind reading abilities, two intuitive ideas tend to come to mind:
- Improve your perception & interpretation of body language & facial cues
- Improve your perspective taking
Well, tough luck. When put to the test, body language conveyed minimal information beyond verbal cues⁸. But every liar has a tell? Mhhh, no. The intuitive sense of "true emotions leaking out" when lying is another wonderful example of the spotlight effect - people tend to dramatically overestimate the rate at which others can detect their lies⁹.
And perspective taking? The ability to imagine others' perspectives can be impressively accurate - for example, you don't give your son the same direct feedback that you would give a colleague, because you can imagine his reaction before it happens (and kids need more encouragement than your colleague does).
It's a fascinating mechanism combining egocentrism and stereotyping: you apply what you know about others and run a simulation in your mind, like "Would I like this present if I were my spouse?" or "Would I understand this presentation if I were my client?". But if you don't know what it's like to be the other person, no amount of imagination will magically create that knowledge. If you ask couples to predict their spouse's attitudes, instructing them to first write a paragraph from their spouse's perspective will actually decrease their accuracy (relative to a control group that just predicts without perspective taking)! Similarly, if your belief about the other mind is mistaken (hello, stereotypes), perspective taking will actually magnify that mistake's impact.
So, what does Epley recommend to become a better mind reader? Ironically, his advice is: Ask, don't read! I should've seen it coming. Hundreds of pages detailing all the limitations that come with x-raying someone's head, and this mind reading expert's best advice is JUST ASK. Delightful. And the worst part: it's good advice!
In the experiment where couples were asked to predict their spouse's attitudes, there was a third condition: perspective getting. Partners were allowed to interview each other about the questions before responding to the survey (no taking notes, though). Lo and behold, they performed way better!
It kinda feels like cheating, doesn’t it? I mean, of course asking is gonna be more accurate than guessing. So why do we spend so much time reading when we could just ask?
It all comes back to our hilarious overconfidence. In the experiment, the couples were also asked to predict their accuracy. Now surely, they must’ve known asking is better than guessing, right? Ummmmmmmmm, no.
“Getting perspective first requires knowing that you need it” - Nick Epley
Now, perspective getting is still far from fool-proof. You still need to make sure you understand others correctly, and make the barriers to telling the truth as low as possible. After all, no employee will give you their honest opinion on your leadership style if they fear getting fired. But even with its limitations, it's our best shot.
“Sometimes a sense of humility is the best our wise minds can offer, recognizing that there’s more to the mind of another person than we may ever imagine”— Nick Epley
Links
1 Swann, W. B., Jr, & Gill, M. J. (1997). Confidence and accuracy in person perception: do we know what we think we know about our relationship partners?. Journal of personality and social psychology, 73(4), 747–757. https://doi.org/10.1037//0022-3514.73.4.747
2 Epley, N., Morewedge, C. K., & Keysar, B. (2004). Perspective taking in children and adults: Equivalent egocentrism but differential correction. Journal of experimental social psychology, 40(6), 760-768. https://doi.org/10.1016/j.jesp.2004.02.002
In the study, children & adults (separately) were invited to play a "communication game". As the "mover" (left picture), they received instructions from a second person (in on the experiment) to move different objects around. Before the game, they were shown the instructor's perspective (right picture).
The Key Instruction: "Move the small truck". From the mover's perspective, the "small truck" is the one in the middle row, right box. That truck is obstructed for the instructor, though, so to him the "small truck" is the one in the top left. The "egocentric move", therefore, is moving the smallest truck - whereas the "right move" is moving the middle truck. To see how people react, the researchers placed a camera in the middle square to capture the participants' eye movements. So how did kids & adults compare?
Well, both looked at the smallest ("egocentric") truck first - the adults were just faster to get over their egocentric reflex ("Ahhhh, he meant that truck"). Also, adults grabbed the smallest truck half as often (25%) as the kids did (50%) - but that's not really the point here :)
3 Ross, M., & Sicoly, F. (1979). Egocentric biases in availability and attribution. Journal of personality and social psychology, 37(3), 322. https://web.mit.edu/curhan/www/docs/Articles/biases/37_J_Personality_Social_Psychology_322_(Ross).pdf
Overclaiming: If you add up how much people (e.g. married couples, team members) report having contributed to a joint product (e.g. household chores, team projects), you routinely end up with waay more than 100% - everyone sees their own contributions disproportionately. This gets worse the more people are involved (the bigger the team, the bigger the overclaiming). Funnily enough, this works both for desirable (doing the dishes) and undesirable (starting a fight) actions.
4 Epley, N., Converse, B. A., Delbosc, A., Monteleone, G. A., & Cacioppo, J. T. (2009). Believers' estimates of God's beliefs are more egocentric than estimates of other people's beliefs. Proceedings of the National Academy of Sciences of the United States of America, 106(51), 21533–21538. https://doi.org/10.1073/pnas.0908374106
5 Ariely, D. (2001). Seeing sets: Representation by statistical properties. Psychological science, 12(2), 157-162. https://doi.org/10.1111%2F1467-9280.00327
6 Norton, M. I., & Ariely, D. (2011). Building a Better America—One Wealth Quintile at a Time. Perspectives on Psychological Science, 6(1), 9–12. https://doi.org/10.1177/1745691610393524
7 Ross, L. D., Amabile, T. M., & Steinmetz, J. L. (1977). Social roles, social control, and biases in social-perception processes. Journal of Personality and Social Psychology, 35(7), 485–494. https://doi.org/10.1037/0022-3514.35.7.485
8 Gesn, P. R., & Ickes, W. (1999). The development of meaning contexts for empathic accuracy: Channel and sequence effects. Journal of Personality and Social Psychology, 77(4), 746–761. https://doi.org/10.1037/0022-3514.77.4.746
9 Gilovich, T., Savitsky, K., & Medvec, V. H. (1998). The illusion of transparency: Biased assessments of others' ability to read one's emotional states. Journal of Personality and Social Psychology, 75(2), 332–346. https://doi.org/10.1037//0022-3514.75.2.332