Thanks to social media and technology, the impact of fake news and misinformation has become increasingly relevant in modern societies. In the political field in particular, we have seen how fake news can influence election outcomes and fuel polarization. In the final event of the academic year, we delved into the psychological factors that underpin the spread of misinformation, the effect of memory on our decisions, and the methods that can be adopted to fight the diffusion of inaccurate news.
Joining the discussion were Professor Norbert Schwarz, Provost Professor of Psychology and Marketing at the USC Marshall School of Business; Nuala Walsh, Non-executive Director of the Global Association of Applied Behavioral Scientists and Vice Chair of UN Women UK; and communication expert Tim Ward, board member of the non-profit International Insights and co-author of the book “Pro Truth: A Practical Plan for Putting Truth Back into Politics”.
Professor Norbert Schwarz investigated the role of intuition in perceiving the truthiness of a particular piece of information and the various misleading factors that can influence our judgement. He suggested that truth is usually taken for granted and that we need some signals to trigger the sense of falsehood and to make us second-guess our initial intuition.
The behavioral scientist Nuala Walsh explored the relationship between fake news and fake memories, claiming that, beyond the huge role played by technology in spreading disinformation, emotions and biases are equally important factors in the proliferation of fake news. She argued that the reality of this phenomenon lies in memory itself, and that the combination of fake memories and fake claims can play a dangerous role in judgement and decision-making.
Last, Tim Ward presented us with a more practical perspective on the topic. He examined the cognitive biases most commonly exploited in politics, suggesting viable ways to fight them. He showed us how to move from fluently processing information to “getting a fishy smell” and questioning the news we encounter in the media and in political speeches. In particular, he focused on Trump’s 2016 campaign, analyzing the politician’s behavior and rhetoric.
How does the gut know truth?
Our first speaker, Professor Schwarz, introduced us to the ways in which our intuition suggests to us whether something is likely to be true or false. First, he explored the concept of ‘truthiness’, that is, the quality of seeming or feeling true without necessarily being true. This led his research to ask what makes information seem true and feel right. Emphasizing that accepting the information we receive is the default, he then delved into the factors that can trigger truth testing. He argued that when people do perform truth testing, they use extremely rational criteria, and he identified five questions people ask themselves when testing the truth of a claim:
- Is it compatible with what I know?
- Is it internally coherent?
- Do others believe it?
- Is the source trustworthy?
- Is there enough supporting evidence?
If the answers to these questions are affirmative, you process the information fluently and embrace it easily; if they are negative, you stumble.
However, the ease of accepting fluent statements can lead us to erroneously approve false yet fluent claims. Since we are very sensitive to the feeling of ease and smooth processing, any statement that triggers this feeling is more likely to be believed and accepted. The variables that provoke this feeling are of various types: stimulus variables, namely whether the content is coherent and consistent; perceiver variables, such as knowledge of the subject, personal goals and beliefs; presentation variables, namely whether the content is repeated, in which color and font it is presented, whether it is presented in rhyme or with a certain accent, and whether there is background noise during exposure; and context variables, namely concept accessibility and task relevance.
All these variables matter because people are very sensitive to how easy or difficult something is to process, but not to why that is the case.
Professor Schwarz then showed us some findings in several areas of interest:
1. Social consensus
A claim is more likely to be assumed true when we rely on the fact that other people believe it. Observing group discussions, researchers noticed that the estimated consensus increased both with the number of times an opinion was repeated and with the number of people presenting it.
2. Compatibility with knowledge
Relying on the way facts are presented to us, we don’t test a statement against our knowledge when we believe it is true. In a study, when people were asked “How many animals of each kind did Moses take on the Ark?”, they were more likely to notice the error in the question when the font was smaller and less clear.
3. Internal coherence
When a story is coherent and compatible with what we know, we intuitively assume it is likely to be true.
4. Source credibility
A message is more likely to be considered true when it comes from familiar and trusted experts. One study showed how the complexity and ethnicity of names on eBay affected trust: accounts with good reviews but hard-to-pronounce names were trusted only to the same extent as accounts with bad reputations but easy-to-pronounce names.
5. Supporting evidence
Relying on the amount of evidence presented, the trustworthiness of a claim increases when we can easily bring to mind evidence in its support.
Fake news and fake memories
Our second guest, Nuala Walsh, explored the link between fake news and memory.
She argued that, through repetition, fake news turns into fictitious stories that are easily remembered and hence create fake memories.
She then drew some distinctions between the two. While fake news is created intentionally by external actors, fake memories are products of natural distortion and are self-generated internally. Furthermore, fake memories are often triggered by emotions such as nostalgia or fear and, unlike fake news, they are impossible to check.
Regarding similarities, Walsh pointed out that both fake news and fake memories rely on association mechanisms that use complex networks of interconnected data, which are hard to control and detect.
Next, she focused on the reasons why people don’t bother checking information, identifying ego as one of the main sources of the problem, and she pinpointed the characteristics that false information must have in order to be shared. Among these features are social currency, namely the tendency to spread only news that boosts the sharer’s reputation and makes them look good, and the ability to trigger emotions and suggestibility.
An example of this phenomenon comes from a study of the most shared New York Times articles, which investigated the correlation between these stories and the emotions they triggered. The findings showed that the most shared articles were those that triggered emotions such as anger, amusement and anxiety, showing how strongly the emotions a story provokes determine whether or not people share it. The power of feelings also shows up in memory: people tend to think they remember something simply because they feel emotional about a specific topic. Such suggestions have a huge impact on memories and serious implications for false testimony, as demonstrated by research according to which, in the US, 69% of wrongful death penalty convictions were driven by misidentification of the suspect.
Claiming that misinformation and misremembering have a compound effect driven by repetition, Walsh suggested some ways to reduce their power and scale, such as pausing, probing, nudging people in the right direction through technology (fact-checking websites, for example), and framing decisions.
Cognitive biases in political speeches and practices: Trump’s campaign example
Our last speaker, Tim Ward, focused on practical examples of cognitive biases in politics and on practical ways to fight them, restoring truth to the political debate.
Observing Trump’s 2016 campaign, Ward noted how the politician adopted a new technique linked to his dishonest attitude. Starting from the assumption that all politicians tend to lie, he observed that Trump, unlike his predecessors, who backed down once the media exposed their lies, would continue to lie and would turn on the media, exclaiming “Fake news!”, thereby converting the ordinary corrective measure into an attack. In the aftermath of 2016, Ward therefore wrote a book, entitled “Pro Truth: A Practical Plan for Putting Truth back into Politics”, in which he identified some tools to overcome this issue. Using the example of the Trump campaign, we learnt about the use of the following biases in politics.
1. Confirmation bias
It’s the most natural tool used by politicians: telling people what they want to hear in order to confirm their prior beliefs.
One of the most evident examples of tailoring political messages is the wall policy promoted by Trump. The policy, which consisted of building a wall along the US–Mexico border, started as a metaphor in political speeches: it began as a way of talking about making immigration more difficult, but after observing the public’s reaction, Trump decided to make it an actual government policy, indulging his supporters’ emotions.
2. Rosy retrospective
It’s the idea that things were better in the past and that an idyllic past situation should be restored. An example of this bias, also exploited by Trump, is his slogan “Make America Great Again”.
3. Halo/horn effect
It’s the phenomenon whereby, once we associate something positive or negative with a person, we tend to keep associating other positive or negative things with them. Trump exploited this effect in his tweets, tagging his opponents with negative labels such as “Crooked Hillary”, “Lyin’ Ted” and “Sleepy Joe”, inducing the public to expect such behaviors from these people.
4. Group attribution error
It’s the cognitive bias according to which the attitudes and behaviors of one person are assumed to accurately reflect those of a whole group. Trump used it in his immigration campaign: Muslims are terrorists, Mexicans are criminals, and therefore we shouldn’t let them into the country.
Finally, Ward proposed some methods that can be used against such prejudices and false beliefs: actively searching for evidence that supports the view opposite to a politician’s, asking whether the lie serves the liar while listening to speeches, and, most importantly, never triggering the backfire effect; that is, never correct a person’s beliefs with bare facts and evidence, but try to find common values and common ground before disproving them.