
Issues of information bias and accuracy are of increasing concern around the world across disparate educational contexts and settings. A growing consensus definition proposes that misinformation is inaccurate and misleading, while disinformation is deliberately so (Cooke, 2017; Oltmann et al., 2018). Both types of unreliable information are being rapidly created and disseminated across multiple information and communication technologies and platforms. The discourse surrounding mis/disinformation is often highly polarized and highly politicized. With this special issue of Information and Learning Sciences dedicated to the study of teaching and learning about misinformation, we seek to move beyond partisan arguments about sources and causes of misleading, inaccurate, false and satirical information to consider positively framed educational interventions.

Educational discourses around teaching and mis/disinformation have often been framed through concepts such as a “battle for truth” (Albert, 2019), positioning the entire internet as a “toxic place” (Singer and McConnell, 2021). These discourses feed into the moral panics (Bratich, 2020) and technopanics (Marwick, 2008) that often accompany the introduction of new information and communication technologies. The solution to these dire pronouncements of widespread societal decline is typically identified as teaching critical thinking and source evaluation heuristics (Auberry, 2018) or using teaching apps to guide students in how to avoid the influence of mis/disinformation (Roozenbeek et al., 2020). Such simplistic pedagogical framing ignores the social structures that give rise to the spreading of false, misleading and often harmful information, and the complex social contexts in which it grows and spreads. The papers in this issue offer more nuanced discussions of misinformation and disinformation in teaching and learning, and they consider how we as educators can help student learners, as well as public and key stakeholders, become more discerning in their interactions with information across many mediated and sociotechnical contexts.

Although the papers in this special issue focus largely on mis/disinformation in online communities and sources, we must recognize that misinformation and disinformation are manifestations of existing societal problems. Typically, these are enduring issues, such as social power imbalances, political bullying, economic inequity, political polarization and racism (Agosto, 2021). The spreading of mis/disinformation through the media is not new either. It can be traced in the US mass media back to the height of the “yellow journalism” era in the late 19th century, and likely much further back than that (McQueen, 2018). In the 1890s, sensationalism and exaggeration joined with poorly researched and rarely verified reporting, as seen in the intense competition between two New York City newspapers (Campbell, 2001). These media outlets valued sensationalism over truth in an effort to gain increasing market shares and proportionate profits.

We can readily identify some of those same trends and values today, across different information platforms and technologies, but often amplified in a way that earlier media could not have imagined. For example, disinformation about the COVID-19 pandemic and vaccination often relies on spurious connections, exaggerations, distortions and misrepresentations of research (Bond, 2021). But the potential reach of such purveyors (including the so-called “disinformation dozen,” the 12 people responsible for most COVID-related disinformation on Facebook) is exponential, as millions of people view and often share such claims (Center for Countering Digital Hate, 2021). During the height of yellow journalism, mis/disinformation was produced to drive newspaper sales; the same profit motives spur much contemporary mis/disinformation (Bond, 2021). YouTube personalities who release political conspiracy videos, for example, have much to gain from racking up views to bring in advertising dollars and little to gain from restricting their rhetoric to verifiable fact.

If these issues are not new, why is addressing mis/disinformation a pressing current pedagogical concern? Part of the current upsurge of concern stems from the highly polarized political sphere in the US and many other countries today, which feeds into, and is fed by, the structure of social media and speeds the spread and reach of mis/disinformation (Agosto, 2021; Osmundsen et al., 2021).

As Stephen Rea’s essay in this issue argues so cogently, political volatility and sociotechnical design intertwine to create digital extremism. Social media formats, in particular, support the spreading of highly emotionally charged content, such as exaggerated clickbait headlines and videos featuring angry speeches about artificially constructed social crises. As beings who typically connect strongly with human displays of emotion, people tend to believe highly emotional content more than they believe carefully verified scientific, scholarly or journalistic content, which is typically presented much more calmly and with relative dispassion (Sharot, 2017). This helps explain why, even in cases in which disinformation has been retracted or disproven, the original misleading messages so often live on online in video clips or memes shared over and over again among sympathetic audiences.

To cite a recent example, the Mayo Clinic website states with dispassionate clarity that “The drug ivermectin, used to treat or prevent parasites in animals and in humans, isn’t a drug used to treat viruses. The FDA hasn’t approved use of this drug to treat or prevent COVID-19. Taking large doses of this drug can cause serious harm” (DeSimone, 2021). Yet rumors that ivermectin can “cure” COVID-19 infections persist in social media, despite efforts of platform providers to label or erase COVID-related mis/disinformation and despite the CDC’s well-publicized August 2021 warning of adverse health effects associated with human use of ivermectin (CDC Health Advisory Network, 2021). A quick search of Twitter in late January 2022 returned several recent testimonial tweets from people claiming to have been “saved” by ivermectin. One such tweet, posted on January 28, 2022, ends with the rallying cry “Refuse to be bullied!,” meaning that people suffering from COVID infections should insist on ivermectin treatment. It received over 1,300 likes in a matter of hours.

Thus, social media serves to multiply the speed, reach and believability of misinformation and disinformation (Vosoughi et al., 2018). And the damage it creates can be far worse than in previous media environments, when the news and information cycle was much slower and more often comprised professional news organizations with editors, fact-checkers and established journalistic ethical standards (Agosto, 2018).

As a result, in today’s complicated media environment, mis/disinformation pedagogy requires a much broader approach than just teaching information evaluation and critical thinking skills for identifying misleading and false information. It requires stepping back from a focus on information and media as end products to leading students in conversations about the malfunctioning social structures that give rise to the creation of mis/disinformation in the first place. Rather than merely teaching students how to identify and avoid racist content online, for example, we must also teach them to recognize embedded power imbalances and inequalities in social, educational, economic, health and government structures and encourage learners to challenge these imbalances. Or, rather than focusing on the falseness of the pro-ivermectin Tweets, we need to teach students to consider why people lack trust in medical authority, how attitudes toward science can perpetuate medical misinformation, how the medical industry economic structure incentivizes disinformation for financial gain and so on.

It is also important to acknowledge the role of confirmation bias in diminishing our critical thinking and ability to challenge problematic messages and social structures, often despite receiving contradictory evidence after forming initial opinions (Nickerson, 1998). Returning to the example above, the ivermectin-promoting Twitter user’s rallying cry “Refuse to be bullied!” suggests confirmation bias at work. She is apparently aware of prevailing warnings against using the drug to treat COVID-19 infections, yet her belief that it helped her outweighs any contradictory messages. In a response to her original tweet, she later added, “They wish to silence the truth. it’ll never happen.”

Perhaps most significantly, we need to frame current educational conversations about mis/disinformation around the concepts of power, privilege and equity, which underlie human information practices and the information systems that humans design. Research shows that older adults, people of color and those with low education or low income, especially those with multiple axes of marginalization, both face more mis/disinformation and are more susceptible to it (Seo et al., 2021). Issues of power, privilege and equity are therefore fundamental to understanding how mis/disinformation begins and spreads, and these issues are also key to helping learners come to recognize that the messages in both professional and user-driven media embed the social structures from which they rise and the biases of the humans who create them. The ivermectin enthusiast from the Twitter example above, in another tweet, describes texting with her doctor about her COVID pneumonia treatment, indicating privilege of technology access, medical care, literacy, self-agency and more, all of which amplify the power and influence of her harmful message.

Clearly, educators have an important role to play in better preparing people to recognize misleading content, politically- and profit-driven vitriol and other forms of mis/disinformation. In the USA, we have largely ceded the responsibility for monitoring online discourse to platform providers (Agosto, 2021). This ad hoc regulation has proven to be uneven and ineffective, giving disproportionate power to the corporations that run these platforms and undue influence to advertisers and other self-promoting, profit-making entities. Again, we note that educational intervention is crucial. Educators at all levels should teach their students to challenge the existing power structures that produce and perpetuate mis/disinformation. However, pedagogy alone cannot solve the problem of mis/disinformation. Sociotechnically informed pedagogy is one part of an imperative multistakeholder strategy, including policy, economic, health and technical interventions.

This vision of an effective sociotechnically informed pedagogy underlies the papers in this issue. These studies and scholarly essays address theories, methods, research findings and frameworks for action in teaching practices across a wide range of disciplines and domains. They include research based on original data, as well as inquiries, observational reports, viewpoints and case studies written among collaborative researcher and practitioner coauthors. Together they problematize concepts of misinformation and disinformation within education, discuss evidence-based approaches scaffolded on knowledge of how students learn and share practical guidance on teaching information discernment and critical literacies in an ambiguous information ecosystem.

First, in a viewpoint essay informed by experience in developing open teaching modules, Rea lays the conceptual foundation for the papers that follow. “Teaching and Confronting Digital Extremism: Contexts, Challenges, and a New Digital Civics” highlights the role of context in teaching audiences about mis/disinformation. Rea outlines pedagogical challenges that can best be addressed with a classic sociotechnical approach which combines the study of information technologies and their social contexts of development and use. He defines “digital extremism” as the intersection of digital disinformation campaigns and political extremism, amplified by features of networked communication tools. Among the pedagogical challenges he identifies, “the trouble of getting students to understand digital extremism as a sociotechnical problem rather than as a social-or-technical problem” (p. 9) is likely of the greatest interest to educators and scholars working within the information and learning sciences. Information technology, Rea reminds us, does not operate in a vacuum. People – not platforms – create mis/disinformation. As such, sociotechnical approaches must include civics education and civics awareness.

Similarly, Michael Spikes and David Rapp position civics and current events-focused pedagogies as tools for cross-disciplinary instruction in “Examining Instructional Practices in News Media Literacy: Shifts in Instruction and Co-Construction.” They present a comparative case study of three US secondary classrooms to show how teachers choose, both through instructional planning and in-the-moment teaching decisions, to combine elements of instructionism (which views the teacher as key to determining content and method) and constructivism (which privileges individual experience and previous knowledge for guiding content and method). Spikes and Rapp’s deconstruction of teaching practice cases provides empirical support for “personal deliberation, social negotiation, and practice” (p. 28) as core to student learning and illustrates the blending of instructionist and constructivist techniques necessary for scaffolding secondary school students’ critical readings of news-related media.

Sarah McGrew and Ira Chinoy also position mis/disinformation pedagogy within the disciplinary lens of news media and journalism. In “Fighting Misinformation in College: Students Learn to Search and Evaluate Online Information through Flexible Modules,” McGrew and Chinoy test a set of four modules designed for asynchronous learning at the undergraduate level. The modules expand common postsecondary educational privileging of searching and evaluating information for academic purposes to present a holistic framing of information searching as core to students’ effective decision-making both in school and out. The results offer a positive angle on the mis/disinformation “problem”: students who completed the four relatively low-demand modules showed meaningful gains in generating and navigating reliable search results, skills that can improve both academic and daily life information interaction.

Next, in “Information Source and Content: Articulating Two Key Concepts for Information Evaluation,” Iulian Vamanu and Elizabeth Zak return to the two core principles that guide much of the mis/disinformation pedagogy at the primary through postsecondary levels to argue their continuing cogency in teaching “information consumers” critical analysis skills. Although many sets of information evaluation questions have been proposed for identifying mis/disinformation, the authors argue that most lack a basis in empirical research. They use the example of a recent viral piece of COVID-19 “research” to test their own set of evaluation questions, drawn from research in library and information science, evolutionary psychology and rhetoric studies. Their analysis of these literatures shows that methods of identifying source credibility and information soundness lie at the core of recommended information analysis practices across disciplines.

The final two papers in this themed issue each examine theory developed outside of education as the basis for pedagogical design interventions that can prepare students to challenge problematic social structures underlying the production and spread of mis/disinformation. With “Disinformation Detox: Teaching and Learning about Mis- and Disinformation Using Socio-Technical Systems Research Perspectives,” Paris Britt, Fina Marcello and Rebecca Reynolds seek to provide undergraduates with a fuller understanding of mis/disinformation as part of a broader information and information technology ecosystem. They offer a locally adaptable undergraduate syllabus with detailed instruction modules for engaging students in study and discussion. The syllabus frames critical literacy within a deeper understanding of the many broader social factors discussed above, such as power and privilege as embedded in technological and economic infrastructures. In addition to thinking critically about infrastructures, the authors argue the importance of leading students to think about mis/disinformation from a sociotechnical perspective, focusing not just on decontextualized “information” but on the people whose lives it most impacts – “those (masses) who are downstream, marginalized, under-represented, vulnerable, and most harmed” (p. 102).

Finally, Zimmerman and colleagues bring together Wilson’s (2016) general theory of human information behaviour, a foundation of the information science literature, and Levine’s (2014) truth-default theory, a newer theory from the social psychology and communication literature, in “Science Default to Truth in Information Behavior: A Proposed Framework for Understanding Vulnerability to Deceptive Information.” They create a framework for understanding how people navigate a complex information ecosystem that posits the detection of mis/disinformation as learned information practices embedded in social interaction and trust. As we see with the other papers in this issue, the concept of context underlies their approach: “How a person, in context, encounters and processes information affects their understanding of the information and how they decide to act upon that information” (p. 120).

Looking across all of the papers in this issue, context stands as a unifying theme. Together these works suggest that pedagogical approaches can only be effective if they frame mis/disinformation within the social context(s) in which it is created, distributed and challenged. Mis/disinformation does not exist in a social vacuum; neither can teaching and learning about it.

References

Agosto, D.E. (2018), “An introduction to information literacy and libraries in the age of fake news”, in Agosto, D.E. (Ed.), Information Literacy and Libraries in the Age of Fake News, Libraries Unlimited, Westport, pp. 1-9.

Agosto, D.E. (2021), “Q+A: Could information literacy lessons fix our ‘information disorder’ crisis?”, interview by Britt Faulstick, Drexel University News Blog, 16 December (accessed 18 December 2021).

Albert, M. (2019), “‘Deleting the deception’: new effort targets disinformation among youngest voters”, KYFF.com, 4 November (accessed 10 January 2022).

Auberry, K. (2018), “Increasing students’ ability to identify fake news through information literacy education and content management systems”, The Reference Librarian, Vol. 59 No. 4, pp. 179-187.

Bond, S. (2021), “Just 12 people are behind most vaccine misinformation on social media, research shows”, NPR, 14 May (accessed 10 January 2022).

Bratich, J. (2020), “Civil society must be defended: misinformation, moral panics, and wars of restoration”, Communication, Culture and Critique, Vol. 13 No. 3, pp. 311-332.

Campbell, W.J. (2001), Yellow Journalism: Puncturing the Myths, Defining the Legacies, Greenwood Publishing Group, London.

CDC Health Advisory Network (2021), “Rapid increase in ivermectin prescriptions and reports of severe illness associated with use of products containing ivermectin to prevent or treat COVID-19” (accessed 26 January 2022).

Center for Countering Digital Hate (2021), “Disinformation dozen” (accessed 26 January 2022).

DeSimone, D.C. (2021), “COVID-19 drugs: are there any that work?”, Mayo Clinic (accessed 28 January 2022).

Levine, T.R. (2014), “Truth-default theory (TDT): a theory of human deception and deception detection”, Journal of Language and Social Psychology, Vol. 33 No. 4, pp. 378-392.

Marwick, A.E. (2008), “To catch a predator? The MySpace moral panic”, First Monday, Vol. 13 No. 6.

McQueen, S. (2018), “From yellow journalism to tabloids to clickbait: the origins of fake news in the United States”, in Agosto, D.E. (Ed.), Information Literacy and Libraries in the Age of Fake News, Libraries Unlimited, Westport, pp. 12-36.

Nickerson, R.S. (1998), “Confirmation bias: a ubiquitous phenomenon in many guises”, Review of General Psychology, Vol. 2 No. 2, pp. 175-220.

Oltmann, S.M., Froehlich, T.J. and Agosto, D.E. (2018), “What do we do about ‘fake news’ and other forms of false information? The roles of the organization of false information, professional ethics and information literacy”, Proceedings of the Association for Information Science and Technology, Vol. 55 No. 1, pp. 719-721.

Osmundsen, M., Bor, A., Vahlstrup, P.B., Bechmann, A. and Petersen, M.B. (2021), “Partisan polarization is the primary psychological motivation behind political fake news sharing on Twitter”, American Political Science Review, Vol. 115 No. 3.

Roozenbeek, J., Van Der Linden, S. and Nygren, T. (2020), “Prebunking interventions based on ‘inoculation’ theory can reduce susceptibility to misinformation across cultures”, Harvard Kennedy School Misinformation Review, Vol. 1 No. 20.

Seo, H., Blomberg, M., Altschwager, D. and Vu, H.T. (2021), “Vulnerable populations and misinformation: a mixed-methods approach to underserved older adults’ online information assessment”, New Media and Society, Vol. 23 No. 7, pp. 2012-2033.

Sharot, T. (2017), The Influential Mind: What the Brain Reveals about Our Power to Change Others, Henry Holt and Company, New York, NY.

Singer, P. and McConnell, M. (2021), “Want to stop the next crisis? Teaching cyber citizenship must become a national priority”, Time (accessed 28 January 2022).

Vosoughi, S., Roy, D. and Aral, S. (2018), “The spread of true and false news online”, Science, Vol. 359 No. 6380, pp. 1146-1151.

Wilson, T.D. (2016), “A general theory of human information behaviour”, Information Research, Vol. 21 No. 4.
