---
title: Debunking Misinformation
tags: under-construction, misinformation, debunking
permalink: https://c19vax.scibeh.org/pages/misinfo_debunking
---
{%hackmd FnZFg00yRhuCcufU_HBc1w %}
{%hackmd GHtBRFZdTV-X1g8ex-NMQg %}
<!-- This page is to input information about debunking misinformation and conspiracy theories. It is referenced from the handbook -->
- Fallout from conspiracy theories (effects on medical staff): [Guardian article](https://www.theguardian.com/commentisfree/2021/jan/01/healthcare-workers-covid-conspiracies-coronavirus-deniers)
- Guide from Canadian health authorities: [Vaccine misinformation found online (CCDR)](https://www.canada.ca/en/public-health/services/reports-publications/canada-communicable-disease-report-ccdr/monthly-issue/2020-46/issue-11-12-november-5-2020/vaccine-misinformation-found-online.html)
# Debunking misinformation
## Origins
Debunking refers to the practice of refuting a false or misleading factual claim ([Lewandowsky et al., 2020](https://skepticalscience.com/debunking-handbook-2020-downloads-translations.html)). The origins of this practice can be traced to the “muckrakers” of the Progressive Era in American journalism ([Graves & Amazeen, 2019](https://oxfordre.com/communication/view/10.1093/acrefore/9780190228613.001.0001/acrefore-9780190228613-e-808)). The writing of journalists such as Samuel Hopkins Adams, Upton Sinclair, George Seldes, and I.F. Stone challenged the claims of patent-medicine producers and other industrial and political misinformation peddlers ([Amazeen, 2020](https://journals.sagepub.com/doi/10.1177/1464884917730217); [Cassedy, 1964](https://www.jstor.org/stable/2710829?refreqid=excelsior%3Aa5cae0bf14aa48a6dc6c410895f5352f&seq=1#metadata_info_tab_contents)). The emergence of the internet enabled nascent “fact-checking” sites in the U.S. such as Spinsanity and Snopes.com to produce annotated analyses of political rhetoric and online rumors. The practice of debunking became professionalized with journalist-run efforts in the U.S., beginning with FactCheck.org in 2003 and followed four years later by PolitiFact.com and The Washington Post’s Fact Checker. Since 2015, the [International Fact Checking Network](https://www.ifcncodeofprinciples.poynter.org/) has institutionalized the practice of debunking around the world ([Graves & Amazeen, 2019](https://oxfordre.com/communication/view/10.1093/acrefore/9780190228613.001.0001/acrefore-9780190228613-e-808)).
## Is Debunking Effective?
Studies generally show that corrections are effective at reducing belief in misinformation (see [Graves & Amazeen, 2019](https://oxfordre.com/communication/view/10.1093/acrefore/9780190228613.001.0001/acrefore-9780190228613-e-808) for a review). While some studies have found conditions under which attitude change did not take place (e.g., [Garrett, Nisbet, & Lynch, 2013](https://onlinelibrary.wiley.com/doi/abs/10.1111/jcom.12038)), many have provided evidence of success ([Amazeen, Thorson, Muddiman, & Graves, 2018](https://journals.sagepub.com/doi/abs/10.1177/1077699016678186); [Fridkin, Kenney, & Wintersiek, 2015](https://www.tandfonline.com/doi/abs/10.1080/10584609.2014.914613)). A meta-analysis of debunking studies concluded that corrections can be modestly effective under specific conditions, particularly when it comes to health-related misinformation ([Walter & Murphy, 2018](https://www.tandfonline.com/doi/full/10.1080/03637751.2018.1467564)).
Specific to the debunking of vaccine-related misperceptions, however, the evidence is mixed. Although some attempts to correct misperceptions about childhood vaccines have failed ([Bode & Vraga, 2015](https://onlinelibrary.wiley.com/doi/abs/10.1111/jcom.12166); [Pluviano, Watt, & Della Sala, 2017](https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0181640)), other research offers evidence of success ([Nyhan et al., 2014](https://pediatrics.aappublications.org/content/133/4/e835); [Trevors & Kendeou, 2020](https://doi.org/10.1177/1747021820913816); [van der Linden, Clarke, & Maibach, 2015](https://bmcpublichealth.biomedcentral.com/articles/10.1186/s12889-015-2541-4)). Nonetheless, debunking misperceptions does not always lead to changes in related behavioral intentions ([Nyhan et al., 2014](https://pediatrics.aappublications.org/content/133/4/e835); [Swire et al., 2017](https://royalsocietypublishing.org/doi/10.1098/rsos.160802)).
## Knowing *When* to Debunk COVID-19 Misinformation
Because resources (both time and attention) are limited, knowing which claims to correct in the media is important. Debunking claims that few people have heard about may also needlessly draw additional attention to the misinformation ([Lewandowsky et al., 2020](https://skepticalscience.com/debunking-handbook-2020-downloads-translations.html)). Furthermore, belief in some types of misinformation has little correspondence to attitudes or intentions to engage in problematic behaviors such as aversion to mask wearing or participation in risky leisure activities ([Enders et al., 2020](https://misinforeview.hks.harvard.edu/article/the-different-forms-of-covid-19-misinformation-and-their-consequences/)). One rule of thumb is to correct misperceptions that at least 10% of a population holds ([Ajzen & Fishbein, 1980](https://books.google.com/books/about/Understanding_attitudes_and_predicting_s.html?id=AnNqAAAAMAAJ)).
According to a nationally representative study of U.S. adults conducted in June 2020, one in four (25%) participants either agreed or strongly agreed with the statement, “The coronavirus is being used to force a dangerous and unnecessary vaccine on Americans” ([Enders et al., 2020](https://misinforeview.hks.harvard.edu/article/the-different-forms-of-covid-19-misinformation-and-their-consequences/)). This is consistent with a SET-C ([Science in Emergencies Tasking: COVID-19](https://royalsociety.org/-/media/policy/projects/set-c/set-c-vaccine-deployment.pdf)) report from the British Academy and the Royal Society, which stated that vaccine-related misinformation often involves distrust of pharmaceutical companies, governments, and science.
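As a rough illustration of the 10% rule of thumb above, the sketch below estimates how widely a misperception is held from survey responses and checks the estimate against the threshold. It is not part of any cited study; the survey counts, function name, and choice of confidence interval are illustrative assumptions, and the ~25% figure simply mirrors the Enders et al. (2020) result reported above.

```python
# Minimal sketch: applying the "correct misperceptions held by >=10% of a population"
# rule of thumb to survey data. All numbers and names below are illustrative, not
# taken from the cited studies.
from math import sqrt

def belief_prevalence(believers: int, respondents: int, z: float = 1.96):
    """Sample proportion and 95% Wilson score interval for the share of
    respondents who endorse a piece of misinformation."""
    p = believers / respondents
    denom = 1 + z**2 / respondents
    centre = (p + z**2 / (2 * respondents)) / denom
    margin = z * sqrt(p * (1 - p) / respondents + z**2 / (4 * respondents**2)) / denom
    return p, centre - margin, centre + margin

THRESHOLD = 0.10  # the 10% rule of thumb (Ajzen & Fishbein, 1980)

# Hypothetical survey: 250 of 1,000 respondents endorse the claim (~25%,
# similar to the Enders et al., 2020 figure quoted above).
p, low, high = belief_prevalence(250, 1000)

# Using the lower bound of the interval is a conservative trigger: debunking is
# prioritised only when the data clearly indicate widespread belief.
if low >= THRESHOLD:
    print(f"Prevalence {p:.0%} (95% CI {low:.0%}-{high:.0%}): widespread enough to prioritise debunking.")
else:
    print(f"Prevalence {p:.0%} (95% CI {low:.0%}-{high:.0%}): below threshold; weigh the risk of amplifying the claim.")
```

In this hypothetical case the lower bound (~22%) is well above 10%, so the rule of thumb points toward correcting the claim rather than leaving it alone.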
It is also important to consider the situational context in which the misinformation is presented. A one-on-one personal conversation offline is different from a mediated conversation that can be amplified ([MacDonald, 2020](https://www.canada.ca/en/public-health/services/reports-publications/canada-communicable-disease-report-ccdr/monthly-issue/2020-46/issue-11-12-november-5-2020/vaccine-misinformation-found-online.html)). For example, studies have found that anti-vaccination mothers have a disproportionate voice online ([Krishna, 2017](https://www.tandfonline.com/doi/full/10.1080/1062726X.2017.1363047); [McKeever et al., 2016](https://www.tandfonline.com/doi/full/10.1080/15205436.2016.1148172)). When vaccine misinformation goes uncorrected online, a false consensus may grow, perpetuating the notion that most parents are against vaccinating their children ([McKeever & McKeever, 2019](https://theconversation.com/anti-vaccination-mothers-have-outsized-voice-on-social-media-pro-vaccination-parents-could-make-a-difference-120572)). Seeing others corrected online, known as observational correction, can lead to more accurate attitudes ([Vraga & Bode, 2017](https://journals.sagepub.com/doi/full/10.1177/1075547017731776)).
Before engaging, particularly if you are a healthcare provider, it is important to know whether the source of misinformation is a vaccine science denier or a simple refuser ([MacDonald, 2020](https://www.canada.ca/en/public-health/services/reports-publications/canada-communicable-disease-report-ccdr/monthly-issue/2020-46/issue-11-12-november-5-2020/vaccine-misinformation-found-online.html)). Simple refusers usually have one or two main concerns about vaccine uptake, whereas science deniers sit at the extreme end of the vaccine hesitancy continuum and hold such negative attitudes toward vaccination that providing additional scientific arguments is unlikely to be effective. [MacDonald (2020)](https://www.canada.ca/en/public-health/services/reports-publications/canada-communicable-disease-report-ccdr/monthly-issue/2020-46/issue-11-12-november-5-2020/vaccine-misinformation-found-online.html) offers strategies for engaging with each.
## Knowing *How* to Debunk COVID-19 Misinformation
*Provide factual alternatives*
One of the most effective methods of correcting misinformation is to provide an alternative factual explanation. This helps to “switch out” the inaccurate information for the accurate information ([Johnson & Seifert, 1994](https://psycnet.apa.org/doiLanding?doi=10.1037%2F0278-7393.20.6.1420); [Ecker, Lewandowsky, & Tang, 2010](https://link.springer.com/article/10.3758/MC.38.8.1087)). Ideally, the new accurate information should be as simple as, or even simpler than, the original misinformation.
*Cite a trustworthy source*
When it comes to the correction of misinformation, trustworthiness seems to play a much larger role than expertise ([Guillory & Geraci, 2013](https://www.sciencedirect.com/science/article/pii/S2211368113000752?casa_token=cgn3m-6JoP4AAAAA:wahJ4RuuncM_myF0h7vTqDkHRsCkTtxtXIqE7oslToBNGAmvuP85UQUG3-cfKeENoHWTRvb9Vg)). Whereas expertise is the extent to which a source is able to give accurate information, trustworthiness reflects the source's perceived willingness to provide accurate information (Pornpitakpan, 2004). Thus, it is important to provide information from a reputable source that is perceived to be trustworthy.
*Explain why*
Providing just a few sentences about *why* the misinformation is false is much more effective than simply stating that the information is untrue ([Ecker, O'Reilly, Reid, & Chang, 2020](https://onlinelibrary.wiley.com/doi/pdf/10.1111/bjop.12383)). This could entail explaining why the misinformation was initially thought to be correct, why it is wrong, or why the alternative information is accurate ([Kendeou, Walsh, Smith, & O'Brien, 2014](https://doi.org/10.1080/0163853X.2014.913961)). Research has found that detailed explanations help to prevent ‘belief regression’, whereby individuals initially update their belief after being exposed to a correction but do not sustain that change over time ([Swire, Ecker, & Lewandowsky, 2017](https://psycnet.apa.org/doiLanding?doi=10.1037%2Fxlm0000422); [Kowalski & Taylor, 2017](https://psycnet.apa.org/record/2017-19188-001)).
Much of the misinformation on social media about vaccines is in the form of storytelling – emotional anecdotes that can be amplified through likes and shares ([Shelby & Ernst, 2013](https://doi.org/10.4161/hv.24828)). For instance, a particularly pervasive narrative about the measles-mumps-rubella (MMR) vaccine is that it is injurious, causing children to develop autism overnight ([Krishna, 2018](https://www.tandfonline.com/doi/full/10.1080/10410236.2017.1331307); [Shelby & Ernst, 2013](https://doi.org/10.4161/hv.24828)). This story endures despite the link between the MMR vaccine and autism having been debunked ([Farrington et al., 2001](https://pubmed.ncbi.nlm.nih.gov/11395196/); [Godlee et al., 2011](https://doi.org/10.1136/bmj.c7452)). In part, this is because misinformation travels faster and farther than corrections ([Friggeri et al., 2014](https://www.aaai.org/ocs/index.php/ICWSM/ICWSM14/paper/view/8122)). Its spread is also amplified by the emotional content that social media algorithms prioritize ([Peters et al., 2009](https://onlinelibrary.wiley.com/doi/abs/10.1002/ejsp.523); [Vaidhyanathan, 2021](https://newrepublic.com/article/160661/facebook-menace-making-platform-safe-democracy)). Therefore, rather than relying solely on staid facts to correct health-related misinformation, debunkers can also use narratives ([Cappella et al., 2015](https://www.ingentaconnect.com/content/trsg/trs/2015/00000001/00000002/art00008;jsessionid=70n8cb8lc9kt2.x-ic-live-01); [Lewandowsky et al., 2012](https://journals.sagepub.com/doi/10.1177/1529100612451018); [Shelby & Ernst, 2013](https://doi.org/10.4161/hv.24828)), although debunkings do not need a storyline to be effective (Ecker, Butler, & Hamby, 2020).
It can also be useful to help people evaluate the plausibility of scientific claims about vaccine safety against alternative explanations that are compelling but fallacious (Lombardi et al., 2016; Sinatra & Lombardi, 2020).
<!--Page contributors: Michelle Amazeen, Briony Swire-Thompson-->
:::spoiler Relevant sections of the handbook (delete this after page complete)
About a third of the people who do not intend to get the vaccine are hardcore vaccine deniers / anti-vax extremists [DOI: 10.37016/mr-2020-44 - Link to wiki: vaccine deniers](https://hackmd.io/mRZDOWq7RfqYD_cDF4y-UQ). These people often also believe in misinformation and / or conspiracy theories about COVID-19 [DOI: 10.1098/rsos.201199], and they are difficult to convince of the necessity, efficacy, and safety of the COVID-19 vaccines.
=> They should be the target of systematic debunking efforts against misinformation (see below & Link to wiki: Debunking misinformation and conspiracy theories).
People who generally fear vaccines might not be conscious hardcore deniers; instead, they may be confused due to limited health literacy [DOI: 10.1016/S1473-3099(20)30724-6].
=> This can be tackled with the use of appropriate materials that facilitate comprehension of statistics and graphs [REF: Fischhoff, Brewer, & Downs, 2011; Gigerenzer, 2015]
:::
{%hackmd GHtBRFZdTV-X1g8ex-NMQg %}
{%hackmd oTcI4lFnS12N2biKAaBP6w %}