The Case for a Lateral Reading Approach to Online Source Evaluation

by Ken Boyd and Jessica Leigh Johnston

Students today face an online world that is full of false and misleading information. While there are reliable and trustworthy sources out there, finding them can be a difficult task. It’s tempting to think that since students are digital natives, they are well-equipped to locate the good information among the bad, but this is, unfortunately, far from true.[1]

Foundational research carried out in the US has demonstrated that middle- and high-school students overwhelmingly lack the skills required to sort and interpret the information that reaches them through digital channels.[1] But there is good news, too: the application of contemporary source evaluation strategies has been shown to dramatically improve outcomes.[2]

In 2020, to address a need for tools and knowledge specific to Canada, civic education charity CIVIX launched a verification skills program built around contemporary evaluation techniques.

The program—CTRL-F: Find the Facts—is based on the same “lateral reading” skills professional fact-checkers use to quickly reach accurate conclusions about online sources and claims.[3] CIVIX collaborated with external evaluators to carry out a nationwide study to gain insight into how students evaluate online information and to measure the impact of CTRL-F participation on their reasoning and accuracy, as applied to real-life examples.[4]

CIVIX found that, like their American counterparts, Canadian students struggle to evaluate online information. When students applied the simple research skills taught in CTRL-F, however, their performance on the post-test improved dramatically. CTRL-F students were also much more likely than their peers in a control group to correctly assess trust and apply sound reasoning to example prompts. The study findings provide compelling evidence that the lateral reading skills at the heart of CTRL-F represent a pedagogical shift necessary for building the next generation of informed citizens.

In the study, conducted during the 2020/21 school year, 80 teachers took 2,324 students in grades 7 to 12 through the CTRL-F program. Teachers agreed to teach CTRL-F in its entirety—approximately seven hours of classroom instruction—and to participate in a two-hour training workshop before beginning. Students were taught to verify information by investigating its source, checking whether other reliable sources were reporting the same thing, and tracing claims back to their original context.

Students received a pretest containing four examples of sources and claims, a mix of reliable and unreliable. A week after instruction ended, they received a second, comparable test. A control group of 363 students received no CTRL-F instruction.

On the pretest, when students were asked to indicate their level of trust in a source and to explain their answer, the overwhelming majority (79%) referred to superficial elements of the site itself: the presence or absence of contact information, author names, the professional appearance of the site, and the number of ads on the page. Simply put, these signals failed students. The most concerning result: only 6% were able to successfully identify the agenda behind an anti-LGBTQ website.

In looking for an explanation for these findings, one need only consider the current dominant pedagogical approaches to information evaluation. Where CTRL-F teaches students to leave the page where information is found (lateral reading), students are far more commonly instructed to close-read, analyzing the information itself for markers of credibility (vertical reading).[5]

Vertical reading strategies are often packaged into checklists for students to apply. The most common of these is the CRAAP test, which asks students to thoroughly scrutinize information and score it according to a set of criteria related to currency, relevance, accuracy, authority, and purpose.[6] Close-reading this way, however, is an outdated approach that is prone to backfire when applied to online information, where key context is often absent.[3] Despite this, checklists are still the go-to tools for many media literacy organizations, educators, and university libraries.

Comparing individual students’ pretest and post-test responses illustrates the potential of lateral reading. Take, for example, the pretest rationale of one student who gave that anti-LGBTQ advocacy site a high trust rating: “The website isn’t cluttered with advertisements. There is contact information. They have stated their objectives.” Because the student relied on superficial criteria, that high trust score had no real foundation.

On the post-test, the same student had this to say about the decision to assign a low trust score to a climate-change-denying group: “At first, it seemed like the site was reliable, but after reading the Wikipedia site for a bit I found out that they are a leading promoter of climate change denial.”

This is the result the CTRL-F program aims to bring about. And, overall, post-test results were highly encouraging. There was an impressive increase in the use of lateral reading between tests: on the pretest, students left the page to conduct research only 11% of the time; on the post-test, that number rose to 59%. Use of reasoning also improved significantly: just 9% of pretest responses cited meaningful context to support a correct answer, compared with 50% of post-test responses.

There is still much room for improvement, but these results represent the impact of just one seven-hour intervention. It is CIVIX’s belief that if students were to encounter lateral reading as the default, across grades and subjects, they would be much better equipped to navigate our increasingly complex and polluted information environment and participate as informed citizens in our democracy.

The CTRL-F program, which is named for the keyboard shortcut for “find,” is available for free to all educators, in both English and French. The curriculum is anchored by short and engaging expert-led videos and interactive practice examples drawn from current events and a range of platforms, from professional news organizations to TikTok.

Professional development is available to support educators looking to implement the program, with workshops offered twice monthly throughout the school year. More information about programming, workshops, and the study findings can be found at ctrl-f.ca.


References
[1] Wineburg, S., McGrew, S., Breakstone, J., & Ortega, T. (2016). Evaluating Information: The Cornerstone of Civic Online Reasoning. Stanford Digital Repository.

[2] Wineburg, S., & McGrew, S. (2017). Lateral Reading: Reading Less and Learning More When Evaluating Digital Information. Stanford History Education Group Working Paper No. 2017-A1.

[3] Breakstone, J., McGrew, S., Smith, M., Ortega, T., & Wineburg, S. (2018). Why we need a new approach to teaching digital literacy. Phi Delta Kappan, 99(6), 27–32.

[4] Pavlounis, D., Johnston, J., Brodsky, J., & Brooks, P. (2021, November). The Digital Media Literacy Gap: How to build widespread resilience to false and misleading information using evidence-based classroom tools. CIVIX Canada.

[5] McGrew, S. (2021). Challenging approaches: Sharing and responding to weak digital heuristics in class discussions. Teaching and Teacher Education, 108, 103512.

[6] Blakeslee, S. (2004). The CRAAP test. LOEX Quarterly, 31(3), 4.


ABOUT THE AUTHORS

Ken Boyd and Jessica Leigh Johnston work at CIVIX Canada, a Canadian civic education charity dedicated to building the habits and skills of citizenship among youth under the voting age.


This article is featured in the Fall 2022 issue of Canadian Teacher Magazine.
