Underlying theories of misinformation

From Wikipedia, the free encyclopedia

The belief in and spread of misinformation (incorrect or misleading information) occur for many reasons beyond ignorance, such as moral values and motivated reasoning[1][2]. This is because decision-making entails both the cognitive architecture of the individual and their social context[3]. Thus, the phenomenon of misinformation can be explained not just by traditional science communication theories, but also by various psychological and social theories. These theories attempt to explain why individuals believe and share misinformation, and they also inform the rationale behind misinformation interventions seeking to prevent the spread of false information.

Science communication theories

Information deficit model

The information deficit model attributes false beliefs to a lack of understanding or a lack of information. It assumes that misinformation may be corrected by providing individuals with further credible information. Critics argue, however, that the model fails to address other reasons why individuals believe false information, such as the illusory truth effect (repeated statements receive higher truth ratings than new statements)[4]. In one study, for instance, participants relied on repeated false statements rather than on their stored knowledge when judging truth[4]. This helps explain why, for example, people deny climate change despite having access to evidence that suggests otherwise[5]. The model has therefore largely been rejected as a reliable explanation for why individuals believe misinformation.

Misinformation interventions such as fact-checking and debunking stem from the information deficit model, as they seek to correct false information with true information. While they may be useful in cases involving non-controversial or technical/quantitative issues, they tend to be less effective for highly salient or controversial issues, or issues of race, ethnicity, and culture[2].

Psychological theories

Inoculation theory

Inoculation theory is a psychological theory positing that preemptive exposure to misinformation techniques strengthens an individual's resilience to misinformation encountered later, because it becomes easier to spot and refute[6]. The crucial difference between this theory and the information deficit model is that the former emphasizes knowing the forms, techniques, and characteristics of misinformation, rather than knowing whether any particular claim is true.

The most common misinformation interventions rooted in inoculation theory are prebunking and gamified interventions that seek to inform participants about the various ways misinformation appears online. Examples of gamified interventions include Bad News, Harmony Square, and Go Viral!, among others[7].

Inattentional blindness

Inattentional blindness is a theory suggesting that individuals fail to perceive information due to a lack of attention. Research exploring attention and the sharing of misinformation found that participants shared misinformation because their attention was focused on factors other than accuracy[8].

The inattentional blindness theory, then, suggests that shifting attention to accuracy and veracity will increase the quality of news that people subsequently share, offering a useful framework for countering misinformation[8]. The most prominent type of misinformation intervention relying on the theory of inattentional blindness is nudging, which attempts to shape the decision-making and behavior of online users in a way that prevents the spread of misinformation.

Social theories

Although useful, psychological theories do not adequately capture the social nature of holding and sharing beliefs, especially online. Social theories offer an alternative to psychological theories by addressing this context.

Affect control theory (ACT)

Affect control theory (ACT) is a social theory proposing that individuals "perceive events and construct lines of social action that maintain preexisting sentiments for themselves"[9]. According to ACT, socialization imbues concepts with shared connotative meanings, known as sentiments, which humans use to make sense of experiences[10].

Research suggests that the "interpretation, encoding, and response to false information" is a process driven by affects, including the affect of credibility[10]. One study, for example, suggests that when people interact with misinformation that challenges their beliefs and perceptions, they will either reinterpret the information (deflect) or adjust their beliefs based on the credibility of the source of information. In fact, the researchers found that demonstrating that a source spreads falsehoods deliberately (disinformation) is more effective in discrediting opponents than claiming they spread falsehoods unintentionally (misinformation)[10]. This is one example of how ACT may be useful for developing strategies for discrediting sources of falsehoods[9].

Social network theory

Social network theory describes the structure of relationships and interactions between social actors. A fundamental concept in social network theory is a network, which consists of nodes or actors with a set of ties or connections between them. Nodes may be people, organizations, or other types of social entities, and ties may be communications, alliances, friendships, and more[11].

Such a representation of social actors is very applicable to online environments such as social media, where users (nodes) interact with other users by following, sharing, liking, re-posting, etc. (ties). The application of social network theory to social media provides useful insights into the spread of misinformation. For example, tightly connected networks may be used to represent echo chambers.

This theory is useful for devising countermeasures to misinformation at the platform level, such as down-ranking or removing posts and imposing forwarding restrictions on suspicious users. It is also useful for evaluating such countermeasures using social network metrics such as centrality, dispersibility, and influenceability[12].
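The node-and-tie representation above can be sketched directly in code. The following is a minimal illustration, not any platform's actual implementation: the user names and follower ties are entirely hypothetical, and degree centrality is used as the simplest of the centrality metrics mentioned. It shows how identifying the most central node suggests where a down-ranking or forwarding-restriction policy would cut the most potential spread paths.

```python
from collections import defaultdict

# Hypothetical follower ties: (follower, followed) pairs.
ties = [
    ("alice", "bob"), ("carol", "bob"), ("dave", "bob"),
    ("eve", "bob"), ("bob", "alice"), ("carol", "dave"),
    ("eve", "carol"),
]

# Build an undirected adjacency map: nodes are users,
# ties are connections between them.
adj = defaultdict(set)
for follower, followed in ties:
    adj[follower].add(followed)
    adj[followed].add(follower)

def degree_centrality(graph):
    """Fraction of the other nodes each node is directly tied to."""
    n = len(graph)
    return {node: len(nbrs) / (n - 1) for node, nbrs in graph.items()}

centrality = degree_centrality(adj)

# The most central node is the one whose removal or restriction
# would sever the largest number of direct spread paths.
most_central = max(centrality, key=centrality.get)
```

In this toy network, "bob" is tied to all four other users, so a countermeasure targeting that account would disrupt the most direct routes along which misinformation could travel; richer metrics (betweenness, eigenvector centrality) refine the same idea.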

  1. ^ Amin, Avnika B.; Bednarczyk, Robert A.; Ray, Cara E.; Melchiori, Kala J.; Graham, Jesse; Huntsinger, Jeffrey R.; Omer, Saad B. (December 2017). "Association of moral values with vaccine hesitancy". Nature Human Behaviour. 1 (12): 873–880. doi:10.1038/s41562-017-0256-5. ISSN 2397-3374.
  2. ^ a b Nyhan, Brendan; Reifler, Jason. "Misinformation and Fact-checking: Research Findings from Social Science" (PDF). New America Foundation.
  3. ^ Ecker, Ullrich K. H.; Lewandowsky, Stephan; Cook, John; Schmid, Philipp; Fazio, Lisa K.; Brashier, Nadia; Kendeou, Panayiota; Vraga, Emily K.; Amazeen, Michelle A. (January 2022). "The psychological drivers of misinformation belief and its resistance to correction". Nature Reviews Psychology. 1 (1): 13–29. doi:10.1038/s44159-021-00006-y. ISSN 2731-0574.
  4. ^ a b Fazio, Lisa K.; Brashier, Nadia M.; Payne, B. Keith; Marsh, Elizabeth J. (October 2015). "Knowledge does not protect against illusory truth". Journal of Experimental Psychology: General. 144 (5): 993–1002. doi:10.1037/xge0000098. ISSN 1939-2222.
  5. ^ Hansson, Sven Ove (2017-06-01). "Science denial as a form of pseudoscience". Studies in History and Philosophy of Science Part A. 63: 39–47. doi:10.1016/j.shpsa.2017.05.002. ISSN 0039-3681.
  6. ^ Traberg, Cecilie S.; Roozenbeek, Jon; van der Linden, Sander (2022-03-01). "Psychological Inoculation against Misinformation: Current Evidence and Future Directions". The ANNALS of the American Academy of Political and Social Science. 700 (1): 136–151. doi:10.1177/00027162221087936. ISSN 0002-7162.
  7. ^ Kiili, Kristian; Siuko, Juho; Ninaus, Manuel (3 January 2024). "Tackling misinformation with games: a systematic literature review". Interactive Learning Environments: 1–16. doi:10.1080/10494820.2023.2299999. ISSN 1049-4820 – via Taylor and Francis Online.
  8. ^ a b Pennycook, Gordon; Epstein, Ziv; Mosleh, Mohsen; Arechar, Antonio A.; Eckles, Dean; Rand, David G. (April 2021). "Shifting attention to accuracy can reduce misinformation online". Nature. 592 (7855): 590–595. doi:10.1038/s41586-021-03344-2. ISSN 1476-4687.
  9. ^ a b Kroska, Amy; Powell, Brian; Rogers, Kimberly B.; Smith-Lovin, Lynn (2023-01-01). "Affect Control Theories: A Double Special Issue in Honor of David R. Heise". American Behavioral Scientist. 67 (1): 3–11. doi:10.1177/00027642211066044. ISSN 0002-7642.
  10. ^ a b c Campos-Castillo, Celeste; Shuster, Stef M. (2023-02-01). "So What if They're Lying to Us? Comparing Rhetorical Strategies for Discrediting Sources of Disinformation and Misinformation Using an Affect-Based Credibility Rating". American Behavioral Scientist. 67 (2): 201–223. doi:10.1177/00027642211066058. ISSN 0002-7642.
  11. ^ Daly, Alan J.; Borgatti, Stephen; Ofem, Brandon (2010). "Social Network Theory and Analysis". Social network theory and educational change. Cambridge (Mass.): Harvard Education press. ISBN 978-1-934742-80-8.
  12. ^ Ng, Ka Chung; Tang, Jie; Lee, Dongwon (2021-10-02). "The Effect of Platform Intervention Policies on Fake News Dissemination and Survival: An Empirical Examination". Journal of Management Information Systems. 38 (4): 898–930. doi:10.1080/07421222.2021.1990612. ISSN 0742-1222 – via Taylor and Francis Online.