Study Characteristics
Disciplines
Table 1 shows a breakdown of disciplines and the number of articles found therein. The largest concentration of articles returned by our search came from library and information science journals (n = 104, 47%), followed by the field of education (n = 61, 26.8%). College students were the largest target population represented in our study. While efforts to tackle mis/disinformation have traditionally involved librarians and educators in formal educational settings, the breakdown of our articles illustrates the extent of their continued importance in addressing disinformation literacy.
Table 1
Discipline breakdown in articles retrieved

Discipline | Number of articles
Librarian and Information Literacy Studies | 112
Education | 61
Media and Communication Studies | 16
Public/Medical Health | 7
Psychology | 5
Behavioural Science | 4
Journalism | 3
Visual Literacy Studies | 2
Other | 17
Methodology and Target Population
Our analysis of methodologies reveals that essays, newsletters, and opinion pieces (36.1%, n = 82) were the most prevalent type of article; these tend to offer a general review of issues in the information landscape and advocacy for media education. The second most prevalent were overviews of current programming and pedagogies (31.3%, n = 71), which included descriptive case studies, summaries, and suggested approaches to media literacy but lacked formal evaluations of particular approaches. Our corpus also included articles that used correlational and experimental study designs (28.6%, n = 65), of which 21 (9.2%) reported on field studies implemented in college or library settings. Finally, literature reviews (4.0%, n = 9) systematically identified themes, patterns, and gaps in media literacy research related to mis/disinformation. We were concerned to note that although media literacy interventions present the most useful basis for developing and implementing approaches against disinformation, they are clearly underrepresented in the extant literature.
Of the 227 articles reviewed, many were aimed at youth populations in formal education settings (46.7%, n = 106), while the remainder targeted populations of multiple or unspecified age groups (34.8%, n = 79), librarians and educators (11.0%, n = 25), or adult populations outside formal education (7.5%, n = 17). The articles aimed at youth catered to students in higher education (26.4%, n = 60), high school (11.0%, n = 25), or to “youth” or “students” of unspecified ages (9.3%, n = 21). As we will discuss, the scarcity of articles focused on adults outside of formal education demonstrates a noteworthy gap in the disinformation literature with respect to adult-specific needs around media literacy.
Defining Mis/Disinformation-Related Terms
The very concept of disinformation is contested and variously defined across the literature. We did not find an overarching term that encompassed all mis/disinformation threats, but instead a variety of related terms. The most prominent term to emerge in our review was “fake news” (81.1%, n = 184), yet it was not without criticism. Hobbs (2017b) notes that “fake news” is too broad to be a useful descriptor, while Head et al. (2018) stated that “fake news” is itself a propaganda strategy given its right-wing appropriation and associations. Faix (2018) stated that “fake news” has “come to be synonymous for any news which the reader doesn’t agree with, doesn’t like, or doesn’t want to acknowledge” (p. 44).
Further, the majority of articles (53.3%, n = 121) did not define the terms they used, and terms such as “fake news”, “misinformation”, “disinformation”, and “false information” were used interchangeably. However, within studies that did define their use of disinformation-related terms, there was extensive discussion of how to differentiate them.
Intentionality was stated to be an important marker of distinction between various false information-related terms, particularly misinformation and disinformation. Many studies drew upon Wardle and Derakhshan (2017), who define misinformation as false information shared without the intent of harm, disinformation as false information knowingly shared to cause harm, and mal-information as genuine and often private information shared to cause harm. These concepts were designated as information disorders under the label of fake news (cf. Damasceno 2021). Alvarez (2021) supported the view that fake news is an umbrella term for mis/disinformation, but also noted that mistakes and malicious intent can be difficult to distinguish, especially when targeted disinformation may be unwittingly spread by users who believe it is genuine. Given the complexity of false information across the digital landscape, our review reveals that definitions of concepts used to discuss and understand its nature and implications often lack consensus.
Summary of Realised Interventions
Of the 227 articles we reviewed, only 51 (22.5%) discussed interventions, programs, or curricula that attempted to foster (dis)information literacy knowledge and/or skills. From this sample, 22 articles (9.7% of total) formally assessed these interventions. While we found no consistent pedagogical or evaluative approach, some interventions shared suggested modes of delivery.
Course-Based Interventions
Several studies (n = 9) discussed course-based interventions that focused on teaching news or digital information literacy skills. For example, Cilella (2019) reported on a series of library-based workshops in which a librarian teaches the history of fake news, examines people’s motivations for producing and sharing fake news, and runs an activity called “Real or Fake?”. The author stated the activity helped the workshops gain traction and raise awareness about the difficulty of determining the validity of content online.
Murrock and colleagues (2018) described their Learn-to-Discern curriculum with undergraduate students, in which they defined phenomena like propaganda, fake news, and manipulation, discussed media ownership and types of media, and identified the consequences of harmful media practices like dehumanisation, stereotypes, and hate speech. Through practical exercises, students analysed media using debunking tools and identified markers of fake or manipulated content and propaganda. One month after the training, 80–90% of students reported using the news literacy skills they learned, and most reported cross-checking the news, a behaviour they retained a year and a half after the training.
Roschke (2018) and Blakemore et al. (2020) evaluated Massive Open Online Courses (MOOCs). One MOOC focused on how the media works and provided instruction on how to be “more active and informed media users” (Roschke, 2018, p. 7). The second incorporated a digital health literacy component that sought to improve students’ ability to evaluate health-care information (Blakemore et al., 2020). The digital literacy portion was adapted over the course of eight “runs” of the course. The first run included a video on how to conduct online searches for resources and another video that introduced students to the PubMed database. The MOOC was adapted four times to include elements like a plagiarism check and assessment guide, a reliability analysis task of four websites, and videos about source evaluation. The authors evaluated the number and quality of references used in final assignments as an indicator of digital literacy and found improvements between the first and the final two runs.
Hobbs and colleagues (2018) described a virtual knowledge exchange between American and German undergraduate students that focused on discussions and analysis of political propaganda. The program consisted of five learning experiences in which students discussed their previous experiences learning about propaganda, judged examples of international propaganda as harmful or beneficial, analysed and annotated propaganda videos, and compared and contrasted political campaign videos from German and American contexts. From this experience, students gained greater awareness of the importance of cultural specificity and context of the propaganda as well as the power of propaganda and its emotional potency (Hobbs et al., 2018).
Hanz and Kingsland (2020) provided a thorough description of their recurring workshop on fake news, offered initially to international high school students and subsequently to undergraduate and graduate students, faculty, staff, and alumni at McGill University. The workshop was conducted 19 times over two years and adapted throughout. It included presentations and interactive discussions on the history of fake news, forms of journalistic bias, the CRAAP test’s applications and limitations, the “4 moves and a habit” evaluation tool, tips to recognise photo alterations, Google search techniques, and analysing news articles that cite academic articles. Interactive components included participants sharing their own examples of fake news, reflecting on personal bias, and applying the CRAAP test to tweets and a news article. The workshop ended with a game of Fake or For Real. Takeaways included the need to provide a list of websites that debunk fake news during activities in which participants judge the veracity of information, and to ensure that the websites/resources provided do not contain misleading, false, or triggering content.
Colglazier (2017) described the high school history course he teaches and the lessons learned over time. He highlights the importance of asking questions rather than simply lecturing. His approach included teaching (and modelling for) students how to discern the source of information by cross-checking information with other sites (lateral reading), analysing information, and viewing online search results beyond the first page.
Jones (2018) described a classroom lesson that focused on deconstructing a news article that referenced an unpublished paper by a Harvard scholar on the policing and targeting of Black people in the USA. Students were assigned a New York Times article to read along with supplementary study questions and basic background information about the article’s topic of police brutality as context. Students were asked to think about story framing and author credibility. The second part of the lesson involved a class discussion in which students learned about information reliability and limitations of research methodologies. Then students were asked to analyse an Instagram post that presented a racially-biased interpretation of the Harvard scholar's work by questioning the post’s intentions and interpretation. Jones (2018) noted that students should have the opportunity to think through critical questions deeply after the class discussions.
In Walton and colleagues’ UK-based study (2018) on information discernment, A-level students identified three information sources they used recently, indicating their level of trust and reasons for choosing these sources. In small group workshops (fewer than 7 participants), groups designed posters on what constitutes good and bad sources. Follow-up interviews with seven students indicated that the sessions encouraged students to rely more on books and less on Wikipedia as their only source. Interviews with the students, teachers and librarians confirmed that students showed a greater sense of scepticism about information, and displayed improved questioning behaviour about source credibility.
Public Events
Three studies provided details about public events on disinformation and journalism. Rush (2018) and De los Santos et al. (2018) described panel events in which various professionals (namely professors, TV and radio journalists, museum leaders, and graduate students) discussed the history and challenges of fake news, how it goes viral, and how to judge the reliability of news and consume it responsibly. Participants in De los Santos et al.’s (2018) study found value in the shared authority present in the event, in which participants had the chance to ask journalists questions and scrutinise the news while journalists asked participants about their news choices.
Branstiter et al. (2018) presented three activities they conducted in a university setting that aimed to bring media literacy instruction to students who may not receive it in the classroom. Their activities included a fact-checking party during the final 2016 US presidential debate where students raised flags to indicate statements as being true, false, or red herrings; a teach-in where faculty gave 10-minute lightning talks about their area of research, including journalism, communication, and women and gender studies; and the designation of parts of the library as “free speech zones” where students could get a copy of the US constitution and write what they would change about it. These activities were meant to complement their media literacy library guides. The authors suggested that activities geared towards students should be planned outside exam time and advertised appealingly to encourage students to attend.
Web-based Interventions
Two articles reported on web-based interventions (Bonnet & Sellers, 2020; Tully et al., 2019). Tully and colleagues (2019) tested the effectiveness of news literacy tweets on people’s perceptions of information credibility and news literacy beliefs. The purpose of these tweets was to “mitigate the impact of exposure to misinformation about two health issues” and “boost people’s perceptions of their own media literacy and media literacy’s value to society broadly” (Tully et al., 2019, p. 23). They found that while news literacy messages can affect people’s perception of message credibility, viewing such messages once is not sufficient: campaigns should include multiple messages, and these messages should be tailored to the context of the information.
Bonnet and Sellers (2020) described an online, quiz-based activity in which participants received a quiz about COVID-19 each day for five days and “scrutinized memes on social media, doctors' credentials, news stories, treatment options, and research about the virus” in order to determine its credibility, trustworthiness, relevance of source, and accuracy (p. 2). After completing the quiz, participants received extensive feedback on how to evaluate and further research each day’s topic. Participants found the quiz fun and educational, and said it raised awareness about how difficult it has become to judge the validity of information without conducting further research. However, participants recommended offering multiple answer options on certain questions instead of only one, as some information could be labelled as both “valid” and “controversial”.
Game-based Interventions
Three game-based interventions were present in our dataset; two focused on information verification and judging the reliability and validity of information (Katsaounidou, 2019; Yang et al., 2021), and another used role-playing as a fake news producer to build awareness of how media deception can occur (Maertens et al., 2021). The former two games, MAthE (Katsaounidou, 2019) and Trustme! (Yang et al., 2021), both contributed positively to participants' ability to evaluate information or news online. However, Yang and colleagues (2021) found that their Trustme! game did not enhance scepticism, though it did improve information discernment skills. After playing Maertens and coauthors’ (2021) game, The Bad News Game, participants found fake news headlines significantly less reliable than before playing it. However, the authors caution that ratings of reliability do not necessarily reveal participants' beliefs regarding the news messages, recommending that future studies determine whether participants still believed the information regardless of their reliability assessment.
Visual Resources
Five articles reported on interventions that tested the effects of visual resources (infographics and videos) on various indicators of media literacy. Domgaard and Park (2020) randomised participants into three groups to view either an infographic with tips on finding fake news (infographic condition), a document that contained the text of the infographic (text-only condition), or a prompt instructing participants to press “next” (control group). They found that the majority of people had trouble verifying misinformation about vaccines, but that participants in the infographic condition were significantly better at doing so than the text-only and control groups, and showed lower trust in the false news articles than the control group. These results suggest that “visual cues can positively affect cognitive processing during the verification of vaccine news articles … and allowed participants for better retention of news literacy skill implementation while reading vaccine news articles” (p. 982).
Lewandowsky and Yesilada (2019) had participants watch either “inoculation” videos that described common rhetorical markers found in Islamophobic or radical-Islamist disinformation (e.g., making hasty generalisations, invoking emotion or using emotional language with moral undertones, and polarization) or a control video on bitcoin. This was followed by a second video that contained either Islamophobic or radical-Islamist content. They found that participants who watched the inoculation video perceived the Islamophobic or radical-Islamist video to be less reliable than control group participants did, indicated less agreement with the content of the video, and were less likely to share it.
Vraga et al. (2021) explored the effects of Facebook user corrections of sunscreen-related misinformation. They also investigated whether watching a news literacy video before watching videos that either promoted or shared misinformation about sunscreen would enhance the effects of the user correction. The news literacy video highlighted individual fact-checking practices such as evaluating the source, considering intent, and looking for evidence. They found that being exposed to misinformation can affect one’s beliefs in facts, and that a few corrections may not restore those beliefs to their pre-exposure levels. The authors also found that the news literacy video did not inoculate the participants against the misinformation. They posited that the perceived credibility of the misinformation’s sources (a person identified as a doctor and a fictional brand that appeared to be a health brand) likely undermined suggestions to evaluate the source. They suggest that, in this context, alternative literacies such as health or science literacy might be more relevant than the skills taught in news literacy, and that future research should explore how providing a mixture of literacies can aid people in better identifying and resisting health misinformation.
Alexander and Wood (2019) explored the effects of a satirical news video on students' engagement with news literacy education as well as its impact on how students critically consume the information in the video. They showed students enrolled in a mandatory first-year class at California State University five clips from satirical news shows like The Daily Show and The Colbert Report that touched on the concepts of plagiarism, authority, commercial bias, ethical use of information, information privacy, media ownership, and bias in journalism. Over 90% of students agreed that satirical news made the class more enjoyable, facilitated their understanding, and encouraged them to be more critical media consumers. The authors warn, however, that the satirical nature of the videos is not always understood and that educators should provide context and guided discussion alongside satirical videos.
Finally, Hwang and coauthors (2021) compared the effects of a general media literacy education video, a deepfake-specific literacy video, and no literacy program on adults' willingness to share a deepfake video. They found that while seeing a deepfake video enhanced the vividness, credibility, and persuasiveness of misinformation, and the willingness to share it, compared to text-only misinformation, participants who saw the general media literacy educational video were least likely to find deepfake videos vivid and persuasive. While there was no difference in perceived credibility of the misinformation between the two literacy conditions, there was a significant difference between these conditions and the no-literacy condition. These results suggest that an educational video tailored specifically to deepfake videos may not be necessary and that a general media literacy video can be effective at reducing the likelihood of accepting and sharing misinformation.
Recommendations from the Research
Considering Emotions and Practising Mindfulness
Within the scholarship that addresses the role of emotion (14.9% of total, n = 34), the need to understand how our emotions are targeted by news, propaganda, and disinformation via personalised messaging was frequently emphasised. A first and necessary step for media literacies is to develop awareness of the goals and strategies of targeted, emotion-focused narratives (Shumaker, 2019; Serrano-Puche, 2021; Sivek, 2018; Alvarez, 2021). Attempts to spread disinformation strategically rely on emotionally provocative, memorable content (Jaeger & Taylor, 2021; Serrano-Puche, 2021). Thus, students should be aware of how emotional data are collected and used, how their emotions are targeted and exploited, and the effect this could have on their online and offline behaviours (Sivek, 2018).
Scholars stress the importance of understanding the ways disinformation exploits identities and amplifies pre-existing divisions in society (Logue, 2020; Shumaker, 2019). This may include individuals becoming more aware of the psychological and emotional tendencies that lead them to engage with or resist information and that shape their information-sharing behaviours (Burkhardt, 2017; Logue, 2020; Doyle, 2017). When individuals recognise that their actions and attitudes are connected to their emotional state, they can more readily recognise how media could affect their emotions (Whiting, 2021).
Mindfulness surfaced as an invaluable practice within our fast-paced media culture and its pervasive barrage of emotionally heightened information (Sivek, 2018; Berkman, 2021; Canada, 2021; Wineburg & McGrew, 2019; Tsvetkova, 2017; Middaugh, 2019). It can be used to monitor our reactions to information and help us avoid snap judgements (Berkman, 2021). Drawing on Buddhist teachings, some scholars suggest promoting awareness of when, where, and how we are exposed to news and how we read and process news and information. Students can be encouraged to dedicate a specific portion of their day to consuming the news and to log their emotional reactions in the process (Sivek, 2018). Mindfulness media education integrates small pauses, breaths, and moments of reflection and attention to emotions, feelings, and thoughts. Media literacy programs can encourage “click restraint” (Wineburg & McGrew, 2019) and slow reading to regulate information flow and help maintain information equilibrium (Tsvetkova, 2017).
Other psychological and emotional competencies that emerged in our review included: self-awareness and self-management (Whiting, 2021; Cronkhite et al., 2020; Logue, 2020), empathy (Friesem, 2018; Berkman, 2021; Logue, 2020), compassion (D’Olimpio, 2021), and scepticism (Roschke, 2018; Khan & Idris, 2019; Sullivan, 2019; Burkhardt, 2017; Arth et al., 2019; Guess et al., 2020; Vraga & Tully, 2021).
Structural and Historic Knowledge and Civic Engagement
In line with critical media literacy and in response to exploitation of cognitive biases and the increasing use of emotional targeting and dataveillance, several authors highlighted the importance of teaching about structural and macro-level factors that contribute to the spread of mis/disinformation (n = 14). Learners need to be aware of algorithmic personalization and governance and that algorithms are not neutral but instead reflect the (often economic) intentions and goals of the programmer and the platform (Cohen, 2018; Dixon, 2021; Friesem, 2018; Hobbs, 2020). Many authors suggested teaching about the economics of the news and social media industries in addition to the influence of advertising profits on the design and function of algorithms, social media platforms, and news production (Alcolea-Díaz et al., 2020; Damasceno, 2021; De Abreu, 2021; Fielding, 2019; Friesen, 2018; Manfra & Holmes, 2018; Roquet, 2019).
Several articles recommended teaching about the history of fake news (n = 5). This gives students a deeper and broader understanding of its techniques and goals, and of how and why it circulates (Spratt & Agosto, 2017). Educators and librarians may use news articles from different historical contexts as a springboard for exploring similar and/or differing themes and techniques in the present (LaPierre, 2020).
Some authors (n = 6) indicated that media literacy should be intentionally civic and political (Arth et al., 2019; Berkman, 2021; Fister, 2021; Friesen, 2018; Middaugh, 2019; Mihailidis & Viotty, 2017), emphasising media literacy's aim of empowering learners as active citizens. Arth et al. (2019) recommended that media literacy skills be repositioned as a mode of civic participation rather than a skill for critically analysing individual texts. Finally, Fister (2021) suggested that we should frame media literacy through a lens of democracy rather than taking a partisan position. This is in line with several other authors who encourage perspective-taking and empathy in the consumption of media and social dialogue (Berkman, 2021; Friesem, 2018).
Lateral Reading and the Research and Scientific Process
Several articles suggested teaching lateral reading in media education (n = 12). Lateral reading is the process of searching for information about a source using resources external to it in order to evaluate its credibility (Faix, 2018; Miller, 2018). As opposed to checklists, which tend to focus on the singular text in isolation, lateral reading encourages visiting other websites to verify information from the website of interest (Wineburg & McGrew, 2019). Readers can search for information about who owns and publishes the information, funding sources, reviews, and other perspectives on the website’s content (Fielding, 2019; Faix, 2018; Manfra & Holmes, 2020).
A goal of teaching about lateral reading is to have students understand how the Internet and search engines are structured, and how to make searching and navigating effective and efficient (Wineburg & McGrew, 2019). Damasceno (2021) suggests using reflection questions, such as “what do I know about the source?” or “if this content is true, what else would be true?”, to guide lateral reading. Tynes et al. (2021) suggest that lateral reading alone is not enough and that it should be paired with critical reading so students understand how searches yield biased results. Overall, lateral reading is a frequently suggested intervention as it attends to the complexity of the fake news context (Sullivan, 2019).
Educators and librarians also suggested teaching research skills (n = 5). Sorrell (2019) introduced ways to incorporate Indigenous knowledge into information literacy, where students engage in the research stages of “thinking” (choosing a research topic), “planning” (developing research strategies), “life” (application of findings), and “assurance” (ensuring information/knowledge is responsibly accessed and utilised). The confusion around COVID-19 highlighted the need for greater awareness of scientific and research methods, including scientific consensus processes (Berkman, 2021; Linvill, 2019). Vraga et al. (2020) suggested that increasing general public knowledge of the scientific process “may facilitate acceptance of evolving recommendations… without undermining trust in scientists and health professionals” (p. 474; also see Badke, 2020).
Fact-Checking Sites
Fact-checking has been deemed by many “an essential skill for staying informed” (Batchelor, 2017, p. 144); indeed, a host of articles indicated that educators and librarians are increasingly compiling lists of automated fact-checking sites to aid students in detecting false and biased information (n = 10). Educators and librarians introduced fact-checking sites and tools to uncover potential prejudice (Faix & Fyn, 2020). There was consensus across articles on the most reliable fact-checking sites, which include Factcheck.org, run by the Annenberg Public Policy Center; Politifact.com, run by The Tampa Bay Times; and Snopes.com, run by David Mikkelson (Faix, 2018; Cilella, 2019; Jacobson, 2017; Batchelor, 2017; De Abreu, 2021; Baker, 2016). In a review of 10 fact-checking sites, Mallon (2018) also recommended Factcheck.org and Allsides.com, as well as games like Factitious and Bad News that provide engaging ways to test information discernment skills. Browser extensions such as Media Bias Fact Check, Fake News Alert, and B.S. Detector are also suggested to help flag fake or suspicious information while searching the web (Wade & Hornick, 2018). When considering strategies for implementing fact-checking sites in the classroom, Faix (2018) suggests beginning with a fun trending topic rather than one that students may already have a bias about.
Collaborative Learning and Active Participation
A number of scholars emphasised the importance of collaborative learning and active participation using discussion-based pedagogies as opposed to teacher-directed or lecture-based interventions (n = 10). This includes having students debate ethical issues about what counts as propaganda (Heller, 2021), group assignments and peer review to encourage collaborative problem solving (Kaufman, 2021), class discussions about source credibility and trustworthiness (Jones, 2018; Glisson, 2019; Hobbs, 2017a), and exploring misinformation through virtual reality (Young et al., 2021). Discussion-based interventions help students build confidence while thinking critically and deeply about media texts (Jones, 2018). While attempts to correct false information can lead to backfire effects and resistance from learners, including learners as active participants can give them more agency and help them “retain information more effectively” (Glisson, 2019, p. 477; Kaufman, 2021). Active learning approaches encourage learners to consider their role as civic participants, actively engaging in media in more complex ways that go beyond fact-checking (Manfra & Holmes, 2018).
Visual Literacy Strategies
Our review findings reveal the need for visual literacy skills to respond to a complex media ecosystem where images are highly vulnerable to distortion (n = 6). Visual thinking strategies encourage students to slow down and question images, trace the source of a photograph, and verify the legitimacy of URLs (Snelling, 2019). In their modules for teaching about fake news, librarians include the skills required to detect image manipulation, such as reverse image searching (Dahri & Richard, 2018). Swerzenski (2021) advocated expanding students’ visual literacy skills to include photo editing so that students understand how airbrushing, layers, and filters shape both perceptions of images and culture at large. Students may also consider the affordances and risks of using images or memes to accompany an article or topic, and how these visuals affect our understanding of the story (Wade & Hornick, 2018; Hobbs, 2017a). To keep up with the increasing circulation of fake visual content, educators can provide current and relevant examples of manipulated visual material and identify the techniques used (Walker, 2019).
Student- and Culturally-Centered Approaches
Several authors (n = 6) highlighted the need for media literacy interventions to be student-centered and tailored to the needs of the target individuals (Auberry, 2018; Brashier & Schacter, 2020; Moore & Hancock, 2020; Mutsvairo & Bebawi, 2019; Rosenzweig, 2017; Samtani, 2019). One size does not fit all, especially across geographic locations, age groups, learning styles, and social groups. For example, Mutsvairo and Bebawi (2019) spoke about how Western conceptualisations of fake news do not “resonate with local contexts in Africa and the Middle East” (p. 147), and indicated that educators should understand the regions and cultures in which programs are implemented and how fake news manifests in these contexts. For the older adult population, Brashier and Schacter (2020) stated that interventions should meet the shifting goals of this population, while Moore and Hancock (2020) suggested that programs should provide resources for fact-checking, since retired older adults may have more time to engage in fact-checking and be more civically and politically minded than younger people. Finally, Samtani (2019) stated that immigrants tend to be the target of many misinformation campaigns and that, for media literacy interventions to have the desired effect, immigrants’ needs (e.g., language, media diet) and perceptions of the media and its role in society should be considered.
Debates in the Literature
Despite unanimous concern regarding the spread of false information, many debates arose regarding how best to address it.
Debates about Checklist Approaches
We found that educational programs and media literacy guides often relied on information evaluation frameworks, usually in the form of checklists, to guide learners in assessing the credibility and reliability of media content and authors. A frequently used and recommended framework is the CRAAP Test (Currency, Relevance, Authority, Accuracy, Purpose; California State University, Chico, 2010). The CRAAP test was utilised and promoted in Canada (Hanz & Kingsland, 2020), Pakistan (Naeem & Bhatti, 2020), the UK (Clough & Closier, 2018), and Ireland (De Paor & Heravi, 2020). In South Africa, 44 of the 48 librarian guides addressing disinformation contained links to evaluation tools such as the CRAAP test and “How to Spot Fake News” (Bangani, 2021). The same trend existed in the US, where all 21 librarian guides evaluated suggested checklist approaches to detecting fake news or evaluating news sources (Lim, 2020).
In response to contemporary threats in our media ecosystem and criticisms of the CRAAP Test’s inability to address them, some authors suggested either adaptations to this test or novel frameworks. One adaptation was the “CRAAP-based” RADAR framework, which focused on evaluating rationale, authority, date, accuracy, and relevance (Neely-Sardon & Tignor, 2018). Other proposed checklists include the IMVAIN framework (independent, multiple, verify, authoritative/informed, named), SIFT (stop, investigate the source, find trusted coverage, and trace claims; Ojala, 2019), and CARS (credibility, accuracy, reasonableness, support; Jacobson, 2017; Gardner, 2017).
A third group of scholars challenged the effectiveness of checklists and suggested more current and nuanced approaches (5.3%, n = 12; e.g., Feekery & Jeffrey, 2019; Hodgin & Kahne, 2018). Checklists were criticised as being outdated, ineffective, and unrealistic in the context of a fast-paced infodemic (McGrew et al., 2017; Sullivan, 2019). Others expressed concern that the checklist approach “bypasses the need for critical thinking” (Saunders & Budd, 2020, p. 2; Johnson, 2018), “flattening” complex decision-making processes into a set of heuristics (Beene & Greer, 2021, p. 6) and limiting opportunities for open-ended, inductive, and exploratory approaches to texts (Shenton, 2021).
Debates about Authority
What counts as authoritative and credible information if there is no agreement about expertise and intent? How can educators teach both the importance of approaching sources critically while also retaining trust in and respect for field professionals? These questions were described as some of “the biggest challenges” facing educators and librarians (Badke, 2020, p. 37; De Paor & Heravi, 2020; Clough & Closier, 2018; Lim, 2020; Saunders & Budd, 2020; Smith, 2017).
Hobbs (2017b) highlighted how questions of authority have become more complex as social media brings “new forms” of authority (e.g., social media influencers). Scholars showcased differing opinions regarding the “Authority is Constructed and Contextual” frame from the Association of College and Research Libraries (ACRL) Framework, updated in 2016. Traditional instruction on authority focused on author credentials, with a preference for scholarly and government publishers, privileging institutional knowledge dominated by demographics with social power. The new approach to the authority frame recognises different types of authority, understands “lived experience as a particular kind of authority”, and acknowledges that authoritative information can be presented traditionally or untraditionally, such as through oral storytelling (Saunders & Budd, 2020, p. 3).
Others shared concerns that the “Authority is Constructed and Contextual” Frame does not adequately address today’s ‘post-truth’ reality (Lynch & Hunter, 2020). Saunders and Budd (2020) stated that the frame goes too far in challenging traditional indicators of authority, encouraging students to reject authoritative sources and consider more “unlikely” sources. Additionally, understandings of authority that are too open-ended leave it up to the receiver to determine the trustworthiness of any piece of information, leading to a call for a return to a world in which “we trust the expertise of those who know” (Badke, 2020, p. 38).
Debates about Scepticism
Debates regarding the evaluation of authority led to further questions and debates about the value of encouraging “scepticism”. Some authors strongly advocated scepticism as a quality critical for rejecting mis/disinformation (Sullivan, 2019) and an effective way to “combat the biases and psychological preferences built into our brains, at least long enough to consider alternatives” (Burkhardt, 2017, p. 26). Vraga and Tully (2021) found that those who are more news literate are more sceptical of social media information quality.
However, several scholars noted the challenge of encouraging scepticism without creating cynicism and distrust of all information (Albert et al., 2019). Scepticism involves doubt and asking questions, skills that are described as necessary for participatory democracy, while cynicism involves distrust and “creates disengagement and disillusionment” (Vraga & Tully, 2021, p. 154). For other scholars, scepticism comes at the expense of trust in government (Albert et al., 2019) and mainstream news (Guess et al., 2020; Vraga & Tully, 2021). Albert and coauthors (2019) even described scepticism as a “deterrent” (p. 34), and suggested educators and librarians instead “counter student skepticism” (p. 36). Sullivan (2019) cautioned specifically against the “dark side” of scepticism, namely that of “self-serving skepticism” (p. 99).
The risks associated with scepticism led some scholars to advocate a balanced approach that encouraged verification and analysis of credibility and content, while maintaining a sense of trust. This “healthy scepticism” approach encourages individuals to understand their participatory role and responsibility in sharing problematic information (Khan & Idris, 2019; Roschke, 2018). To avoid cynicism, individuals must not feel overwhelmed by information choices and must recognize the value of media literacy in a democratic society (Vraga & Tully, 2021).
Debates about Responsibility
Across this review, divergent perspectives about who is responsible and accountable for the health of the information and media environments appeared to influence the design and approach of the interventions. The extent to which individual users are positioned as responsible for the proliferation of disinformation varied. Some interventions helped participants to avoid consuming disinformation, often by relying on fact-checking and checklist frameworks (Dixon, 2021; Wade & Hornick, 2018), while others taught participants to prevent the spread and sharing of disinformation by emphasising media literacy as a self-reflective practice (Lynch, 2020; Krutkowski, 2019; Roquet, 2019). While most media literacy interventions focused on tools and strategies that can be employed by individual users, other articles invoked the role of governments and technology companies, such as social media platforms, in managing the disinformation landscape at large via policymaking and regulation (Gilchrist, 2018; Hanz & Kingsland, 2018). Faix (2018) noted that media literacy interventions based on source evaluation hold individual users responsible for seeking credible information without addressing the agents who create and publish disinformation, yet this burden of individual responsibility is also a necessity. Given the profitability and pervasiveness of disinformation, it is unlikely to be eradicated, and individuals therefore need to be empowered to make critical and informed decisions of their own.
Debates about Addressing the Political
Another prominent debate revolved around the degree to which media literacy should address the political aspects of disinformation, especially given the increased post-2016 concern regarding fake news. The term “fake news” was itself criticised as politically laden, given that it often serves to “amplify a right wing political agenda to undermine the credibility of centrist and center-left media organisations” (Head et al., 2018, p. 41). One group of scholars advocated rejecting the stance of neutrality and actively teaching political topics (e.g., elections, policies, and conspiracy theories; Jaeger, 2021; Hobbs, 2017a; Fister, 2021). Alvarez (2021) called upon libraries to take a proactive political stance by adhering to diversity, equity, and inclusion practices among staff, while also partnering with governments, nongovernmental organizations, civil society, and community groups such as local task forces and boards. While the politically motivated nature of disinformation is broadly highlighted within these studies, specific ideologies or actors are not necessarily mentioned.
Another group of scholars recommended avoiding explicit political discussions so as to prevent further polarization or backlash. Young et al. (2021) stated that communities predicated on disinformation offer emotional and social affirmation to members and cannot be dismantled by cold truths. Instead, programming must help people to build new communities based on truth, which is most effectively implemented through a broad approach to literacy. Similarly, Ireland (2017) encouraged media literacy training that teaches about elements of disinformation within general entertainment media to make programming both personally relevant and broadly inclusive. Dar (2021) put forward a “devil’s advocate” approach, whereby various perspectives on news events are considered without alienating learners of diverse political views.