Misinformation is rampant in today’s world, but mental health professionals can employ psychological science and other strategies to counter it, helping clients make better decisions and (hopefully) enjoy greater wellbeing.
Introduction
At this writing in 2024, we have come through a ghastly, costly global pandemic in which millions lost their lives, brutal wars are raging in Ukraine, the Middle East, and elsewhere, and voters in many countries face elections this year to select new leaders. All of these situations, and more, have contributed to a deluge of misinformation in the public sphere; much of it poses risks to individual health and/or societal wellbeing (related article: Rethinking Wellness: A Holistic Perspective on Health), to say nothing of the preservation of democracy.
In this article, we offer several examples of how false information endangers our mental health, examine the psychodynamics behind the rampant spread of misinformation, and summarise what mental health professionals of all stripes can do to counter it. Specifically, we investigate these questions:
- What factors make people vulnerable to believing – and perhaps acting on – misinformation?
- What interventions can we use to counter its effects?
The psychological and sociological factors that help misinformation spread
As the spread of misinformation has ramped up to warp speed globally through social media and other online platforms, so, too, has the research that aims to understand why people believe it and why they resist corrections. Multiple psychological and sociological factors come into play, but first, let’s distinguish between two terms you have undoubtedly heard: misinformation and disinformation.
Misinformation and disinformation
Misinformation can be defined as information that is inaccurate or contrary to scientific consensus, whereas disinformation refers to a deliberate effort to knowingly circulate misinformation to gain money, power, or reputation (Swire, 2022). An example of health-related misinformation might be the claim that you can make up, say, four or five hours of lost sleep in a single night (for more tips on effective sleep, read this article); research suggests that the body can recover one or two hours at most (Blackmores, 2021). In a more sinister vein, an ongoing (massive) example of disinformation at this writing is the Kremlin’s narrative around the “special military operation” (a euphemism for war) in Ukraine. Not only has the Russian government put out faked videos billed as “fact-checks”, but it has also made accurate reporting of the war a criminal offence (APS, 2022).
Why countering misinformation is important
Any mental health professional even peripherally engaged with cognitive behaviour therapy (CBT) is aware of how easily clients can hold irrational, unkind beliefs and thoughts which undermine their mental and even physical health. Similarly, we could cite myriad client scenarios wherein a client’s adoption of misinformation has hurt their psychological health and will continue to do so unless a serious effort is made to counter it. Here are just a few examples:
- An adult client with ADHD (for more on this topic, see the Mental Health Academy course Working with ADHD in Adults) has been told all his life that he is lazy and wonders, “Why bother trying to accomplish things?”. Neuroscience has now established that the ADHD brain is wired differently from a neurotypical brain; the “lazy” label is misinformation.
- A client continues to have her autonomy, self-esteem, and sense of self assaulted by her gaslighting narcissistic lover. The narcissist’s falsely critical accusations are motivated by his (maladaptive) need for unlimited power and control, so this treatment constitutes disinformation (i.e., it is deliberate). For a deeper dive on this topic, read Case Study: Narcissism in a Romantic Relationship.
- A client’s close friends tell them that taking medication strips people of their autonomy, so the client abruptly stops taking their medication, opting instead to “go cold turkey”. Stopping medication this way is widely considered unwise and can be dangerous for the client.
The factors
As noted, misinformation can spread because of psychological factors within us individually or because of sociological dynamics; some factors contain elements of both.
Swinging with the in-group
Have you heard the word “homophily”? How about the related popular term, “echo chamber”? Both of these refer to the fact that we tend to believe information that comes from our own “in-groups” more than that which arrives from an “out-group”. Most of us are “guilty” of homophily – that is, liking those who are similar to us (“homo” meaning “similar” or “same”, while “-phily” draws from the Greek “philos”, meaning “friendship/love”). Likewise, information that is compatible with our sense of identity or worldview can be processed and accommodated more easily than that which seems somehow alien or not like us, because we process information under the influence of pre-existing values and ideas: the phenomenon of motivated cognition (DeAngelis, 2023; APA, n.d.; Swire, 2022).
An extreme example of this is the one-sided “echo chamber”, created by the content-recommendation algorithms common on social media platforms combined with users’ preference for engaging with people who share their beliefs. In such self-contained information ecosystems, even highly uncommon beliefs are presented without opposition, so users perceive that a significant number of people share their views (APS, 2022). Examples include mis- or disinformation spread among American Republicans that Democrats dismiss as false, and the opposite: claims widely believed among Democrats that Republicans label as false.
In a health example, the pandemic exposed many echo chambers related to vaccine safety. Within them, misinformation can spread through the development of “multiple realities”, in which distortions catering to people’s wishful thinking are accepted as objective truths despite little or no supporting evidence (APS, 2022).
Confirmation bias; motivated reasoning
Most people seek out information that confirms their prior beliefs (confirmation bias), and they also engage in some motivated reasoning, defined as “reasoning toward a desired conclusion rather than an accurate one” – so suggests a professor who studies the science of denial (DeAngelis, 2023).
Continuing with our vaccine example above, we could expect that someone who has identified themselves as being an “anti-vaxxer” might be inclined to seek out all the articles and bits of information they can find about how vaccines can make a person sick with the virus that they purportedly protect against. Political forces may knowingly exploit individuals’ psychological biases and blind spots to intentionally sow disinformation; such campaigns are conducted with malicious intent to harm individuals or groups or to push public opinion toward a particular ideology. Sadly, these can be frighteningly harmful at both individual and societal levels, engendering political polarisation, conspiracy thinking, and mistrust of democratic processes (APS, 2022).
Repetition and emotion: sure to impress
Hucksters and conmen know that the more they repeat the (fraudulent) claims about their snake oil, the more likely prospective customers are to buy it. In a research summary, University of Western Australia cognitive psychologist Dr. Ullrich Ecker pointed to studies showing that when people see or hear the same false information multiple times – which social media accomplishes exponentially – the information becomes lodged in memory and is difficult to unseat even when corrective information is presented. This phenomenon is called the “illusory truth effect” (DeAngelis, 2023).
Here we can remind ourselves of oft-debunked claims that, despite being proven false, still linger in public discourse. One example is the assertion that vaccines cause autism.
Along with repetition, we are more susceptible to misinformation if we are in a heightened state of emotion – say, fear or outrage – when the information is presented. Similarly, if information appeals to such emotions, it is more likely to be believed, and it becomes harder to discern “fake news” from the real deal (APA, n.d.; DeAngelis, 2023). Even without intense emotions present, our tendency is to focus on understanding new information and figuring out how to act on it rather than evaluating it for accuracy (APA, n.d.), so it is no wonder that, in the grip of prefrontal cortex-inhibiting emotions, we are even less able to consider the facts analytically.
The profile of the person susceptible to misinformation
Finally, here, we note the personal traits and experiences that research has shown affect a person’s vulnerability to believing misinformation. Greater educational attainment, analytical reasoning capability, and numeracy skills have been shown to enhance resistance to misinformation, while anxiety (that factor of emotion again) increases a person’s likelihood of believing it. Greater social media use is linked to more susceptibility to misinformation, as is holding a politically extreme ideology. And if you have been frustrated about getting older, take heart: older adults have been shown to be better at identifying misinformation than younger adults (APA, n.d.; DeAngelis, 2023).
Anti-misinformation interventions
So, knowing what the factors are that make it hard to resist the tug of misinformation, what can we as mental health professionals do to counter its effects?
While researchers have identified categories of interventions at both the systems level (broad, systemic changes encompassing legislation and technology) and the individual level (individual behaviour change), we focus here on the latter as more within the typical therapist’s sphere of influence. Research (Gwiaździński et al., 2023) has identified four common individual-level interventions with some power to counter misinformation: debunking, pre-bunking, accuracy nudging, and literacy training/warning labels.
Debunking
Fact-checking, or debunking, corrects misinformation after people have been exposed to it. It is most successful when it includes a detailed explanation that refutes the incorrect information and replaces it with facts. Also, the earlier a piece of misinformation is refuted, the more likely the correction is to “stick” in the mind of the person exposed, because the longer misinformation has been in place, the more persistent it becomes.
Debunking can be time-consuming, and its effect fades over time, necessitating the same kind of repetition that we cited earlier as a factor in strengthening misinformation (Swire, 2022). Finally, here, some studies have shown that targeting behaviour directly can be especially effective when the desired behavioural change conflicts with the target individuals’ world view, because attempts to change only the behaviour (and not the whole world view) don’t require the individuals to alter beliefs that misinformation may have shaped. Here, “choice architecture” can be engaged, using psychological science to design the presentation of options in ways that make certain behaviours easier to follow through on – and transparency helps (APS, 2022; DeAngelis, 2023; APA, n.d.).
Pre-bunking
Less common a term than debunking, pre-bunking (pre-emptive debunking) attempts to keep people from falling for misinformation in the first place. Psychological inoculation is commonly used here; it involves exposing people to a weakened version of a falsehood to build resistance to future persuasion attempts. So, there could be a forewarning about an impending attack on a belief (e.g., “Warning: people may try to manipulate you by . . .” or “We warn you: your partner may try to keep you under his thumb by talking sweetly to you and promising that he will never hit you again”), followed by a statement that pre-emptively refutes the claim (e.g., “This is not true because . . .” or “Statistics show that once people have engaged in violence against their partners, they are likely to continue to do so”). Brief reminders, or “boosters”, can extend the effects of pre-bunking statements over time. This type of intervention has solid potential to forestall misinformation on a large scale but does depend on people actively participating in the messaging (Swire, 2022; APA, n.d.).
Accuracy nudges
These can range from passive design choices, like putting graphic warnings on cigarette packages, to proactive public education campaigns. The idea is that the nudge is built into the environment in which a decision will occur, so the individual is automatically exposed to the intervention. For example, an accuracy nudge could ask social media users to consider whether information is truthful before sharing it. Through interventions that change how users experience and consume content on social media, technocognition – the design of information technologies informed by cognitive science – can slow down the rate of communication so that people have time to consider the quality of the information they’re sharing (Gwiaździński et al., 2023).
A social norm nudge would highlight community standards, while motivational nudges are said to reward people for being as accurate as possible (APA, n.d.; APS, 2022). Cognitive-behaviourally based therapies, at least, rely heavily on psychoeducational “nudges” to help clients move toward and maintain adaptive thoughts and behaviours – so to some extent, mental health professionals already engage in this misinformation-countering intervention.
Literacy training/warning labels
Finally, interventions which help people improve their capacity to judge the quality and accuracy of information encompass the realms of health, media, and digital literacy. Such interventions may be included in formal education or community outreach programs, where critical thinking can (and perhaps should) be a central skill for combating misinformation. Award-winning writer and mediator David Evans shares these questions, which he advocates keeping at hand during any decision-making process where critical thinking and countering misinformation matter. He encourages us to ask:
- Who is saying it? (Are they dependable and trustworthy?)
- How do they know what they are saying? (Is their source credible and reliable? Are they open about where they got their information?)
- What’s in it for them? (Is there a conflict of interest, or do they have an obvious incentive to promote the idea they are sharing?)
- Have you explored your own biases? (As noted above, these may dispose us to falsely accept or reject an idea being presented) (Evans, 2021).
If you are interested in scientific literacy, consider reading our 3-part series on evidence-based practice: What is Evidence-Based Practice?, Evidence-Based Practice: Acquiring the Evidence, and Evidence-Based Practice: Appraising, Applying and Assessing the Evidence.
A word of caution
Spotting accurate information – or helping others do the same – does not guarantee that the newly “enlightened” person (including yourself) will be ready or willing to change their mind. There’s a science and an art to having meaningful and productive conversations with people whose opinions differ from ours – even when we strongly believe those opinions are based on the wrong information! This article by Dr. Adam Grant (who also wrote a very practical book on the subject: Think Again) provides some interesting perspectives, tips, and guidelines for anyone wishing to have better conversations about difficult/polarising topics.
Conclusion
In an age where anyone can become an instant journalist by simply posting whatever they want online, the ability to detect and counter misinformation is more important than ever. Along with our traditional counselling and psychotherapy skills, mental health professionals can play a large role in promoting healthy behaviours by using evidence-based tools, such as those noted in this article, to counter both misinformation and disinformation.
Key takeaways
- Countering misinformation and disinformation is crucial for both individual and public health.
- Many factors determine how susceptible a person is to believing misinformation; these include our identification with certain ideologies and groups, confirmation bias, repetition and emotion, and certain traits and experiences (e.g., lower educational attainment, anxiety, and greater social media use).
- Debunking, pre-bunking, nudging, and literacy training (including critical thinking skills) are all evidence-based interventions to help counter misinformation.
- Providing accurate information to someone whose opinion is based on misinformation does not guarantee they will be ready or willing to change their mind.
References
- Agiesta, J., & Edwards-Levy, A. (2023). CNN. https://edition.cnn.com/2023/08/03/politics/cnn-poll-republicans-think-2020-election-illegitimate/index.html
- American Psychological Association (APA). (n.d.). Using psychological science to understand and fight health misinformation: An APA consensus statement. APA.
- Association for Psychological Science (APS). (2022). New APS white paper takes on misinformation. APS. https://www.psychologicalscience.org/observer/white-paper-misinformation
- Blackmores. (2021). What is sleep debt and how do you recover from it? Blackmores. https://www.blackmores.com.au/stress-relief/how-long-does-it-take-to-make-up-a-sleep-debt
- DeAngelis, T. (2023). Psychologists are taking aim at misinformation with these powerful strategies. American Psychological Association (APA). https://www.apa.org/monitor/2023/01/trends-taking-aim-misinformation
- Evans, D. (2021). How to approach critical thinking in this misinformation era. Psychology Today. https://www.psychologytoday.com/us/blog/can-t-we-all-just-get-along/202108/how-approach-critical-thinking-in-misinformation-era
- Gwiaździński, P., Gundersen, A. B., Piksa, M., Krysińska, I., Kunst, J. R., Noworyta, K., Olejniuk, A., Morzy, M., Rygula, R., Wójtowicz, T., & Piasecki, J. (2023). Psychological interventions countering misinformation in social media: A scoping review. Frontiers in Psychiatry, 13, 974782. https://doi.org/10.3389/fpsyt.2022.974782
- Swire, B. (2022). Countering health misinformation: 5 lessons from an expert research psychologist. Harvard T.H. Chan School of Public Health. https://www.hsph.harvard.edu/chc/resources/countering-health-misinformation-lessons/