‘The Perils of Perception: Why We’re Wrong About Nearly Everything’ – a book about the constant interplay between the rational and the emotional. An interview with Bobby Duffy, the book’s author

Is there a benefit in being wrong?

Although the idea of ‘The Perils of Perception: Why We’re Wrong About Nearly Everything’, the recent book written by Bobby Duffy, Professor of Public Policy and Director of the Policy Institute at King’s College London, might seem quite simple – measuring what is versus what we think it is – it raises a lot of useful questions about the way we interpret and act upon the information we receive. As early as the introductory chapter, you get to answer a very simple perceptual question before moving on to the more difficult, more subtle stuff: is the Great Wall of China visible from space or not?

This question and a few others were the basis for the discussion with the author, Bobby Duffy, during his visit to Romania for the book’s launch by the Publica publishing house on 16 May. The interview was carried out by Alina Stepan, Country Manager of Ipsos Romania and Cluster Head of Ipsos South East Europe, and took place at Carturesti Verona.

One of the hypotheses you put forward in the book is that some people want to believe that the Great Wall of China is visible from space. What is the benefit for us in holding such a belief?

Great question! You are absolutely right that it is a really simple concept. You look at what people think is reality, you compare that with the actual figures, and the difference tells you much more about how we think than you would expect. It is a very simple concept, but it actually has a lot of depth to it. The Great Wall of China question – “Do you think that the Great Wall of China can be seen from outer space with the naked eye?” – actually opens up a lot of explanations for how we think about things. Because the answer is that it cannot be seen from outer space, but when you ask people in surveys around the world, around half of them think it is visible from outer space, time and time again.

We are kind of stuck with this, and there are a lot of explanations for that: is it just fast thinking? Is it a piece of trivia you have seen somewhere that got stuck in your brain? You have seen nothing that contradicts it, so it just sticks with you. But there is more to it than that; there is also an emotional aspect, because it is an interesting and vivid fact that attracts us. We are attracted to those vivid stories, things that are unusual and things that make us feel something about ourselves.

There is something very compelling about the idea that humans can build something that is visible from that far away, that astronauts, even aliens, could see our handiwork from outer space, and therefore we are more attached to it than we think. That sets up one of the key themes of the book: our perceptions of reality are driven by our emotions, much more than we would think. Sometimes we like to think of ourselves as very rational, considered humans, that we think things through, that we are fact-based, evidence-based. But it is not true. Our emotions and our perceptions of reality interact all the time.

That has big implications for politics, for how we live in society, but obviously for business, marketing and brands as well. That sense of emotional connection, and how it drives the way we see things, is vital to understand.


Thinking fast AND slow, not fast OR slow

That actually adds a nuance to the split people usually (and wrongly) make, that system one is purely emotional and system two is purely rational. 

We see a lot of fast and slow, system one–system two thinking in social psychology, in behavioral economics and behavioral science. It is a very useful concept for understanding the different types of reactions humans have, but it is not the entire truth. The idea that you have one block here and one block over there and they do not talk to each other is not how the human brain works. We are in constant interplay between the rational and the emotional, the fast and the slow. It works in a much more integrated, complicated pattern than the idea that you can be just one or the other.

That, in turn, has big implications for communications. I think in the distant past the model was just to tell people the features of the product, or to tell them your policies and your commitments as a politician, and let people decide. Then we grew into knowing that emotion and narrative are more important in how people react, and the pendulum swung away from facts and information towards emotion and just engaging people. I would say it swung too far, in the sense that it is now all about emotion, all about narrative, and not about the realities and the facts. We have set up this false opposition that you can either have facts and information or you can have emotions, and that is not really how we work as humans.

What we have got to do is combine the two. You can tell brilliant stories to people with facts, with information, with data. Getting that across – turning data into stories, turning information into stories, turning reality into stories – is the key here, and that is what we should be aiming for, not ignoring the features and the facts and just focusing on how you make people feel.


Our misperceptions show what we are afraid of

Should we aim to correct all these misperceptions, or do they have a good side – do they respond to a need, do they have a function in society?

The causes of the misperceptions serve really important human functions. One of them is called rosy retrospection, which means you look at the past more fondly. There have been lots of great psychological studies showing that, when you ask people about their history, they literally forget the bad bits, which is good for our psychological health, as we do not want to hold on to those bad bits too much. It is good to let things go, but the consequence for how we view the present and the future is that we think the past was better than it really was and that the present and the future are worse than they really are. And that has a bad side effect.

You have got a mix there: good at the human level – “yes, it is good that we do not remember all the bad” – and bad from that same point of view. There are definitely benefits in the causes of the misperceptions themselves; the underlying reasons for them lie in those human benefits. For me, the main benefit of misperceptions is what they reveal about how we think and, in particular, what we are worried about. We know that when people are particularly worried about an issue, they tend to inflate its scale in their heads. Take examples from social issues like immigration levels: people inflate immigration levels with very high guesses. For example, in Romania, the average guess is that 23% of the population are immigrants, when in reality it is 1%.


What this reveals is a concern, in a much more direct way than asking people what they are worried about, because it is less filtered, less thought through. You just ask people to estimate something; then part of the response is them trying to get the right answer and part of the response is them sending a message about what concerns them. Psychologists call these “accuracy goals” and “directional goals”. People have a goal of sending a message, whether they are aware of it or not. There is a slightly unconscious element to it, which makes misperceptions very useful.

None of the book and none of this data is about saying that people are stupid or that they should be perfectly informed. It takes time to learn about realities. If people are wrong about things, that is partly because they have been spending their time doing other things. It takes time to inform yourself perfectly, and the aim is not to be perfectly informed, because that would be exhausting. The fact that these misperceptions exist shows that we are getting on with the other parts of our lives as well, to some degree, so it is not all bad.

The more emotionally expressive you are, the more wrong your answers can be

Beyond the political and economic context of each country investigated in your book, is there a cultural factor explaining why some countries are closer to reality than others?

Across all of the studies, we grouped together all the data and looked at it overall, as this is a big dataset that Ipsos put together (13 countries were included in all of the studies). An index of who is most right and who is most wrong showed that, of the two least accurate countries, Italy is the most wrong by far, followed by the US. At the other end, the two most accurate countries were Germany and Sweden, with Sweden being the most accurate.

It is an amazing thing. Why is there such a difference between these countries? In the book, we tried to pull together all the evidence, all the explanations we could find, like education levels, the media environment, the political environment. None of it explains very much of the variation. There was only one index that was very highly related, and that was a measure of how emotionally expressive each country is. That comes from a global observational study of things like how often people touch each other when they talk or how loud their voices are when they laugh in a conversation. What we found is that Italy scores very highly, and so does the US, while Germany and Sweden score very low. This makes sense to me, because a big explanation for our misperceptions of reality, and for how we report them, is our emotional reaction and how willing we are to show those emotions when someone asks a question about immigration or crime rates or any of the other estimations. We have to avoid cultural stereotypes between countries – we cannot say that everyone in those countries is like that, or that being a particular way proves a certain fact – but there is this trend (more emotionally expressive countries tend to have the most misperceptions) across the different countries that seems to fit.

This is an emotional issue. It’s not about educational levels, not about the media particularly, not about politics individually, it’s about the whole emotional response.


Giving a very spontaneous response rather than a thoughtful one might make a difference as well. I was wondering whether the reaction time for answering these questions differs among the countries.

In one of the studies we asked people, “How sure are you that you have got the right answer? How confident are you in your answers?” There was an almost perfect inverse relationship: the more confident the country, the more wrong it was, not the more right. This relates to a social psychology finding called the Dunning-Kruger effect [editor’s note: the Dunning-Kruger effect occurs when people fail to adequately assess their level of competence – or, specifically, their incompetence – at a task and thus consider themselves much more competent than they really are].

People who are not skilled in something, or who lack the knowledge, think they are better than they really are, because they do not have the ability to know how wrong they are. The Scandinavian countries were cautious about how much they had got right, but they were more correct, whereas Italy, for example, was pretty confident in its answers, which were quite wrong. That speaks to whether, in people’s minds, the response is considered and thought through or not; people are not that aware of how considered it is. There is, though, a qualification to that: in some experiments, if I incentivize you to get the right answer rather than just ask you in a survey – that is, if I say “if you get close [to the correct answer] you are going to get some money or a voucher” – people become more accurate.

That fits again with the psychology models, because when you are answering these questions you have an accuracy goal, where you are trying to get the right answer, and a directional goal, where you want to send a message. All the incentive does is increase your focus on the accuracy goal and deflate your focus on the directional goal, whether you are aware of it or not. You can incentivize people to encourage more considered thinking, and that does tend to make people more accurate and to trigger more considered thoughts.

I’ve seen in your book that you actually exposed people to the correct information once you had gathered their erroneous answers. Are there demographic or social reasons why some people are more reluctant to resolve this cognitive dissonance by accepting the new (and correct) information?

There is one particular example where we asked people to guess the immigration level for their country, then took the people who guessed more than twice the actual level and asked them an additional question: “Your national statistics service says that immigration is only this level, but you say it is twice that level or more. Why do you think it is so much higher?”

The top two answers to that question both amounted to “I/we don’t believe you”:

  1. ‘National Statistics Service does not count illegal immigration, so their figures are wrong.’
  2. ‘I just do not believe you / You’re lying’.

People do hold on to those types of perceptions they have of reality. As you say, that is related to the theory of cognitive dissonance: when you have got a mental image of something, you think you know the fact, and if someone comes along and contradicts it, you try to hold on to it, because changing your world view causes you psychological pain.


This is related to confirmation bias: you look for information that confirms your view and dismiss information that does not. That points towards the answer that it is not so much demographic differences that explain this; it is more about how strongly (or not) people identify with the issue.

In the case of things like immigration levels, it is much more likely that people who dismiss the government’s information, people who hold strong views about immigration and people who support immigration-reduction [political] parties will overestimate the immigration levels for their countries. Therefore, it is important to know people’s identity and what they identify themselves with, in order to know whether a message is going to work with them. Some people hold on to their identities really strongly, if it is important to them; others are more fluid and will accept new information.

This is not just about immigration or politics; it applies in business and in lots of other ways. You must understand much more about people’s identity and where they are starting from before you try to convince them or get messages across to them.

Moving to the commercial side, this is why building brand equity is difficult. Changing people’s opinions about brands is not so easy either.

That’s right, it is difficult. But I think there is a broader point about also understanding the underlying ideology, the underlying values and world views. There is a lot more interest, in both the private and the public sector, in understanding people’s moral values, which underlie so much of their thinking. Jonathan Haidt, an American academic, has the ‘moral foundations theory’, which classifies people according to tested psychological metrics on how they view things like threat and security. These metrics are very powerful in predicting people’s opinions and behaviors. That is the kind of thinking we need to bring more into the private sector and into understanding communications and brands. Building on that, I think, is a strong way forward.