Article

Fake News and the “Wild Wide Web”: A Study of Elementary Students’ Reliability Reasoning

Jodi Pilgrim 1 and Sheri Vasinda 2
1 College of Education, University of Mary Hardin-Baylor, Belton, TX 76513, USA
2 College of Education and Human Sciences, Oklahoma State University, Stillwater, OK 74078, USA
* Author to whom correspondence should be addressed.
Submission received: 20 August 2021 / Revised: 23 September 2021 / Accepted: 25 September 2021 / Published: 1 October 2021
(This article belongs to the Special Issue Fighting Fake News: A Generational Approach)

Abstract

Online research presents unique challenges for elementary students as they develop and extend fundamental literacy skills to various media. Some features of internet text differ from those of traditional print, contributing to the challenges of discerning “fake news.” Readers must understand how to navigate online texts to conduct research effectively, while applying critical thinking to determine the reliability of online information. Descriptive data from an ongoing study revealed that children in grades 1–5 lack some basic understanding of how to search the “wild wide web.” Just as children benefit from explicit instruction related to text features, they benefit from instruction related to the features of the internet. This article presents a study of website evaluation that occurs early in the search process, prior to the selection of a particular website or article. The application of the web literacy skills required to conduct an internet search is addressed, and recommendations prompt teachers to consider searches beyond the “walled garden,” as well as ways to handle the “messiness” of internet exploration.

1. Introduction

Research processes for most of today’s young learners include online searches. However, the ability to conduct online searches and discern online information is a challenge for children and adolescents [1,2,3,4], and children struggle with basic skills [5,6]. This struggle is in part due to the unique features of what we call the “wild wide web” [4], which contains unvetted content, fake news, ads, and other features that distract from desired information and make internet searches complicated [2,7]. The term “fake news” has been used to describe fabricated news, with no factual basis, that is presented to the public as a credible report [8,9]. Loos, Ivan, and Leu [10] (2018) as well as other researchers [11,12,13,14] suggest that fake news threatens information access, which is a basic right of all citizens. In addition, the threat of “fake news” on the internet complicates instruction related to the internet as an information source. To discern credible information and news, readers must apply critical thinking to develop what we call “reliability reasoning” [5] (pp. 85–86), or the ability to determine the credibility of online information. The development of such critical thinking requires instruction and practice; yet many teachers are reluctant to allow children to search the wild wide web due to safety concerns [4]. A 2019 study found that most teacher-recommended websites designed for elementary students operate in neat, tidy, and safe walled gardens; students navigate pre-vetted websites, avoiding the “wild wide web” [4] (p. 97). In a walled garden environment, searches are restricted to content within the host’s website [15], which limits authentic experiences and does not pose the same “messy” (p. 112) challenges of discerning between relevant content and ads and other distractors.
How can students discern information on the internet without authentic practice? Children will use the internet as an information source, with or without instruction on how to do so. As educators, we have a responsibility to keep our young readers safe, and we also have a responsibility to equip them to handle the discoveries and distractions of wild online reading. Therefore, over the past five years we have continued our work with elementary students in grades 1–5 to understand the skills students exhibit when it comes to searching for and evaluating information on the internet. We recently revisited a 2006 University of Connecticut study in which seventh graders lacked the skills needed to determine the credibility of a website about a tree octopus. Using the same website twelve years later, we re-examined how 68 first- through fifth-grade students evaluated the source and shared rationales about its authenticity. Although the students in our study were more critical of the tree octopus article, 65% of students trusted the information. Only at the fifth-grade level did more students question the accuracy of the website information than those who trusted it [6]. Many students believed the tree octopus article to be credible because it had “real” pictures. If young learners trust “real” photos, then other issues present with fake news, in which articles hide behind a “mask of legitimacy” [3], may be problematic. Since this study, we have extended our work with students in grades 1–5 to evaluate concepts of online text and concepts of online research. For the purpose of this article, we discuss findings related to internet searches on the “wild wide web,” using two tasks that require the narrowing and evaluation of websites and their content. The guiding question for this study was: What search and evaluation skills do students in grades 1–5 demonstrate during an internet query?

2. Background

As students evaluate paper-based or web-based information, they must apply critical thinking skills, which involve the ability to analyze, assess, and reconstruct information [16]. Dewey [17] (1933) considered critical thinking to be a stance or disposition in which a learner actively applies reflective thinking. This view situates critical thinking within a constructivist theoretical perspective. Dewey suggested that learners think critically when “selecting and weighing the bearing of facts and suggestions as they present themselves, as well as of deciding whether the alleged facts are really facts and whether the idea used is a sound idea or merely a fancy” (pp. 119–120). Evaluating online information also reflects a new literacies perspective. A dual-level theory of New Literacies conceptualizes new literacies on two levels: upper case and lowercase new literacies [18]. In general, New Literacies (upper case) attempts to explain the phenomenon of new literacies (lower case) created by the emergence and constant influence of technology and the expanding definitions of literacy [18]. As patterns of findings evolve from new literacies studies, they inform this theory [18]. Critical literacies are among the principles of New Literacies that appear to be common across the research and theoretic work taking place.
The ability to think critically is a key factor in evaluating online information and becoming web literate [18,19]. Readers must become healthy skeptics [19] of online information, developing what we call reliability reasoning [5] to determine deceptions and truths found on the internet. Because we live in a world of convenient internet access and abundant information, teachers must understand, teach, and model web literacy skills [2], which entail the knowledge and skills required to locate, evaluate, synthesize, organize, and communicate information found online [2,20]. As Dalton [21] (2015) reported, “Web literacy is huge. It’s everything we do on the Web” (p. 605). Much of the literature related to web literacy skills focuses on the ability to evaluate the content of an article or other information found on the internet. We expand on current discussions to include search processes that lead to the desired content. We suggest that the issue of evaluating information begins early in the search procedure, prior to the selection of a particular website or article. The process of searching the internet and thinking critically about online information is often referred to as web literacy [2,20]. Students must understand how to conduct effective research, and part of this process requires them to understand the massive nature of the internet. A typical internet search results in millions of website suggestions. Students need basic knowledge of what a browser is and that an online search provides unlimited content. Students also need to practice evaluative skills and reliability reasoning in order to recognize ads and inappropriate or unrelated content.
When a search is initiated, internet users can see the number of “hits” a search produces in various ways. When using Google, the search engine reports the approximate number of websites the search returned. Figure 1 shows that a search for “dolphins” resulted in about 285,000,000 results. When using a tablet, such as an iPad, Safari is typically the browser used. With Safari, the number of results is not listed, but users can select “more results” at the bottom of the search page.
Because of the vast amount of information on the internet, the ability to narrow a search plays a role in finding information. Teaching students to narrow online searches enables them to significantly reduce the amount of information they must sift through. There are many ways to narrow a search, including altering key word phrases, using quotation marks, or applying Boolean terms. A search can be narrowed further by using tools such as the Google toolbar, which enables internet users to conduct advanced searches using criteria such as language, readability, file type, usage rights, or other settings. For example, a Google search for fake news yields approximately 992,000,000 results. An advanced search can treat terms as inclusionary or exclusionary; requiring the words “fake news” to appear in the title, for instance, yields 2,520,000 results. The results could be narrowed further by selecting language, location, date range, or domain options until a manageable number with a specific focus is curated.
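To make the mechanics of narrowing concrete, the short Python sketch below assembles search URLs from progressively narrower criteria. The quoted-phrase, intitle:, and site: operators reflect commonly documented Google search syntax; the helper function and the example queries are our own illustration and are not part of the COT-R study.

```python
from urllib.parse import urlencode

def build_search_url(terms, exact_phrase=None, in_title=None, site=None):
    """Assemble a Google-style search URL from progressively narrower criteria.

    Quotation marks force an exact-phrase match; intitle: and site: mimic
    filters available through an advanced search. This helper is only an
    illustrative sketch, not an instrument used in the study.
    """
    parts = list(terms)
    if exact_phrase:
        parts.append(f'"{exact_phrase}"')      # quotation marks narrow to the exact phrase
    if in_title:
        parts.append(f'intitle:"{in_title}"')  # require the phrase in the page title
    if site:
        parts.append(f"site:{site}")           # restrict results to one domain or extension
    return "https://www.google.com/search?" + urlencode({"q": " ".join(parts)})

# A broad search versus progressively narrowed ones:
print(build_search_url(["dolphins"]))
print(build_search_url(["dolphins"], exact_phrase="what do dolphins eat"))
print(build_search_url(["fake news"], in_title="fake news", site="edu"))
```

Each added operator shrinks the result set, which is exactly the behavior students were asked to demonstrate in the narrowing task described later.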
After a search is conducted and potentially narrowed, the next step is to determine which website to select for further examination. In our work, we have noticed that many students go straight to images, searching for visuals. As adult learners, we do this sometimes as well. However, ads and suspicious content may be avoided by applying evaluation skills early in the search process. Once a website is selected, evaluation continues as students examine the website’s content for relevance and accuracy. Reliability reasoning is no easy task! One could check the website’s URL for clues about a website’s content. Internet users must understand the domain and extension (.edu, .org, .com), find the author, and utilize many other clues URLs may provide. For example, the tilde (~) is a clue that the website is a personal page authored by any person without review or validation of content. In addition, suspicious content can be cross-referenced with other sites. Web literacy skills require critical thinking, a necessary skill in the information age.
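As an illustration of the kinds of URL clues described above, the following sketch gathers the domain extension, the presence of a tilde, and the connection scheme from a URL. It is a minimal example under our own assumptions (the URLs shown are hypothetical), and these surface clues are starting points for reliability reasoning rather than a verdict on credibility.

```python
from urllib.parse import urlparse

def url_reliability_clues(url: str) -> dict:
    """Gather surface-level clues from a URL, mirroring the cues named above.

    The clues (domain extension, a tilde marking a possible personal page,
    the connection scheme) support reliability reasoning but do not by
    themselves establish that a site is trustworthy.
    """
    parsed = urlparse(url)
    host = parsed.netloc
    extension = "." + host.rsplit(".", 1)[-1] if "." in host else ""
    return {
        "extension": extension,               # .edu, .org, .com, ...
        "personal_page": "~" in parsed.path,  # a tilde often signals an unreviewed personal page
        "secure": parsed.scheme == "https",   # encrypted connection (not a guarantee of accuracy)
    }

# Hypothetical URLs for illustration:
print(url_reliability_clues("https://www.example.com/dolphins"))
print(url_reliability_clues("http://www.example.edu/~student/dolphins.html"))
```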

3. Materials and Methods

Since 2016, we have conducted ongoing research to learn more about elementary students’ web literacy skills [6]. In order to assess web literacy skills, we initially developed the Concepts of Online Text (COT), which measured the knowledge of online navigation and text features of students in grades 1–5. Traditional assessments of concepts about print inspired the development of the instrument, which includes an observation protocol of online text, similar to the observation protocol Marie Clay [22] (1979) used with print-based text. Table 1 provides a comparison of the COT and Clay’s concepts about print assessment. The COT instrument consists of seven tasks that align with two main constructs: (1) website orientation and navigation and (2) knowledge of webpage text features. Construct 1 involves the orientation of a website, including the understanding of principles involving directional arrangement of text and media. Construct 2 involves the identification and understanding of webpage text features such as author, publisher, titles, headings, menus, captions, graphics, and hyperlinks. While emerging readers typically master print awareness and concepts of print in kindergarten [23], research conducted with the COT, published in 2018, indicated that knowledge of text features and website navigation occurs during the later elementary years [6].
The COT-R, an updated protocol, extends the assessment instrument to evaluate knowledge of internet research. The COT-R instrument added a research component to the assessment, which included additional constructs: (3) Application of Research Skills and (4) Evaluation of Online Information. Construct 3 involves the ability to use digital skills to search, save, cite, and share information. Construct 4 involves the ability to evaluate search results, websites, and content for relevance and the credibility/trustworthiness of sources. For the purpose of this study, we focus on construct 4, the evaluation of information found during an authentic search on the wild wide web.

Data Collection and Analysis

In the spring of 2020, we began recruiting teachers across the US to administer the COT-R to students in grades 1–5, with the goal of administering the assessment to at least 500 students. Prior to the pandemic, we recruited teachers from four states—one west coast state, an east coast state, and two southern states. Teachers completed a brief training session, in which the interview protocol administration and scoring processes were explained. After gathering both guardian consent and student assent, teachers conducted one-on-one interviews using the COT-R protocol. Teachers began data collection, which was interrupted temporarily as the doors of schools across the nation closed. Although data collection resumed during the fall of 2020, recruiting teachers to collect data was difficult, as teachers were overwhelmed with COVID-19-related issues. Therefore, data collection continued through the spring of 2021. A total of 354 first- through fifth-grade students participated in this study, including 183 female participants and 171 male participants. The authors and certified teachers trained to give the assessment collected the data. Table 2 presents the number of participants per grade level.
Students in this study used a laptop or desktop computer with the Google search engine. The research tasks began with a prompt in which students were asked to search for an animal, specifically a dolphin. If a participant needed help with spelling, the administrator assisted by spelling the word aloud or typing it for the student. Many students selected the target word from the auto-complete drop-down box, and a few students used the microphone feature to start their search. Students then examined the search results and discussed their search. Two tasks were assessed: the ability to narrow information and the ability to evaluate information. The first task was evaluated with the following prompts: (1) Show me how many websites your search provided and (2) Show me how you could narrow the dolphin search to find what dolphins eat. Answers that received credit for prompt one had to be specific. For example, a student might say, “A search for ‘dolphins’ provides 86,000 sites.” Most searches reveal multiple pages of sites, so a child earned credit if he or she understood that results extend beyond those visible on the first page; counting the links or websites visible on the first screen was not correct. Examples of actions that received credit on prompt two included adding keywords, typing a more specific question, or using quotation marks (with two or more words). Boolean terms (and, or, not) or the use of advanced searches also counted as appropriate actions. If students simply clicked on a link or indicated they did not know how to narrow a search, they received no credit. Teachers were provided a space to take notes during the administration of the assessment.
The second task was evaluated with the following questions and prompt: (3) How do you know which website will provide the best information about your topic? (4) Click on one of the websites you found. How can you tell if this website is relevant to your search? In other words, how can you tell if this website will give you the kind of information you need? and (5) How can you tell if this website will provide correct information that is true, or accurate? Examples of answers that received credit on question 3 included the website titles/subtitles, context clues, and credible sources. Examples of answers that received no credit included the first link, an advertisement, or images. Students received credit on question 4 if they were able to determine that the website(s) they selected matched their topic. For example, the child might say, “It is about dolphins.” A website about the football team, the Miami Dolphins, would be an inappropriate response to this question. Students received credit on question 5 if they were able to explain a way to check the validity or credibility of the website. They could respond with answers such as “Go to the home page and look for information about the publisher,” “It is part of the Family Education Network (a reliable source),” “Cross-reference the website,” or “I trust the author because s/he is a scientist (or other occupation)”. Examples of answers that received no credit included: it is the first website, it is not an advertisement, and it is a .org or .net (extensions that are not always reliable). Again, teachers were provided a space to take notes during the administration of the assessment.
In order to analyze the data, we examined student responses to the five prompts reflective of Construct 4. Binary data were analyzed using quantitative statistics in which students scored a “1” for a correct response and a “0” for an incorrect response. The number of correct responses on each task for each grade level was calculated and converted to a percentage. Teacher notes on the surveys served as a potential qualitative data source; although few teachers included written notes, this source was analyzed by searching for themes that emerged from each task/question.
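Since the quantitative analysis reduces to counts and percentages, the conversion from binary scores to per-grade percentages can be sketched as follows. The grade/score pairs below are invented placeholders for illustration, not the study’s data.

```python
from collections import defaultdict

def percent_correct_by_grade(responses):
    """Convert binary scores (1 = credit, 0 = no credit) into per-grade
    percentages like those reported in the Results tables.

    `responses` is a list of (grade, score) pairs; the sample below uses
    invented placeholder scores rather than the study's data.
    """
    totals, correct = defaultdict(int), defaultdict(int)
    for grade, score in responses:
        totals[grade] += 1    # count all responses for the grade
        correct[grade] += score  # count only the responses that earned credit
    return {g: round(100 * correct[g] / totals[g], 1) for g in sorted(totals)}

# Hypothetical scores for one prompt:
sample = [(1, 0), (1, 0), (1, 1), (2, 0), (2, 1), (3, 1), (3, 1), (3, 0)]
print(percent_correct_by_grade(sample))  # -> {1: 33.3, 2: 50.0, 3: 66.7}
```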

4. Results

The primary purpose of this research was to examine student knowledge and evaluation of information found during an authentic internet search. The findings pertain to initial outcomes for Construct 4 of the COT-R observational survey. Through the observational survey process, we were able to evaluate the search and evaluation skills of 354 students in grades one through five.

4.1. Task 1: Narrowing Information

Task 1 addressed the ability to narrow information and was evaluated with two prompts. The number of correct responses on each prompt per grade level was calculated and converted to a percentage. Table 3 presents findings from the prompt in which students had to determine how many websites a search provided. Overall, 20.1% of the participants earned credit for their response to this task. It was noted that in most cases, students either counted the number of results on each page or did not know how to determine the answer.
Table 4 presents findings from the prompt in which students had to narrow the dolphin search to find out what dolphins eat. Overall, 81.9% of the participants earned credit for their response to this task. It was noted that most students typed a question into the search bar in order to narrow the search. For example, a common search was “What do dolphins eat?”

4.2. Task 2: Evaluating Information

Task 2 addressed the ability to evaluate information encountered in an internet search and was assessed with three questions. The number of correct responses for each question per grade level was calculated and converted to a percentage. Table 5 presents findings from the prompt in which students had to determine which of the search results would provide the best information about the dolphin topic. Overall, 40.9% of the participants earned credit for their response to this task. One COT-R test administrator noted that students often referred to images when asked this question. This could explain why fewer students earned credit for this prompt as opposed to the next one.
Table 6 presents findings from the prompt in which students had to first select a website and then tell whether the website was relevant to their search. Overall, 74.6% of the participants earned credit for their response to this task. Because students had searched for what dolphins eat, many were able to use images on the selected website to confirm that they had found relevant information. Images seemed to catch a child’s attention more easily than other text features.
Table 7 presents findings from the prompt in which students had to determine the accuracy of the website they selected. Student performance was the weakest on this task. Overall, 18.9% of the participants earned credit for their response. Observation notes indicated that many students believed websites had correct information because the pictures were real.

5. Discussion and Implications

Although the tasks included in this study provide just a glimpse into the search process, it is clear that young readers need to develop skills to be savvy consumers of online information. According to our findings, many elementary students demonstrated a lack of knowledge about online research. We believe this is in part due to a misunderstanding of the nature of the internet as an information source. It is apparent in the first prompt for Task 1 that participants did not understand that a web search typically results in millions of website suggestions. This lack of understanding is a problem, and it is not a new one. A 2008 study of 7-, 9-, and 11-year-old children searching the internet in the home reported that the majority of the participants never went beyond the first page of results during a search [24]. The researchers also found that the first website result was typically selected to examine further. Students need to understand that the internet, a global library system, has become the largest repository for locating information [2]. They also need to understand that much of the information on the internet has not been vetted and, therefore, must be scrutinized.
Task 1 also assessed students’ abilities to narrow an internet search. Students performed well on this task, with more than 90% of students in grades 3–5 narrowing searches effectively. It was interesting that students knew how to narrow searches by changing the keyword to a question. For example, many students asked, “What do dolphins eat?” By using a question in the search bar, students were able to obtain more specific results that did not include websites about the Miami Dolphins, for instance. Even though search engines use key words in the websites they search, the questions asked seemed to provide a combination of key words that worked for this particular search. However, because students did not understand the vast number of results provided by a search, we wonder whether students would have narrowed the initial search if they had not been instructed to do so. We also wonder whether students tend to use questions instead of key words as they search the internet. These questions would be worth further investigation.
Task 2 assessed students’ abilities to evaluate information during an internet search. The results of the three prompts are similar to previous studies in which students were asked to evaluate information. Participants across all grade levels struggled with the question How do you know which website will provide the best information about your topic? To answer this question, students must start the process of evaluating information before they select a website; for example, it would be appropriate to avoid ads or irrelevant websites. Students in younger grades may not be able to read well enough to determine which website to select. It was noted that after initiating a search, many of the younger students started clicking images or websites without examining the list of website results. However, only 52.4% of third graders, 46.2% of fourth graders, and 55.7% of fifth graders answered this question correctly, indicating a need for explicit instruction related to how to examine search result lists. Internet search results are not numbered, but companies such as Google apply a ranking algorithm to determine the order of search results. A library database search, in contrast, presents numbered results drawn from vetted articles. Teaching students to search library databases may help with information location; however, students also need to be taught the skills necessary for locating accurate information on the wild wide web.
The last question was the most difficult for all students: How can you tell if this website will provide correct information that is true or accurate? Only 18.9% of participants were able to answer this question correctly. Participants were unable to verbalize ways to examine credibility. Inaccurate answers were common, with students either mentioning or pointing to “real” photos. This finding suggests students are often fooled by fake news on the internet that includes realistic photos.
Teacher data collectors for this study were surprised by student performance on the COT-R. Perhaps educators assume students know more than they do when it comes to the internet. We know this is the case with general technology use, as researchers [25,26] have challenged Prensky’s [27] idea that children born after the 1980s are “digital natives” who are fluent with computer and internet technology. Because students lack knowledge about searching the internet, they are at risk of being fooled by fake news. Education is key: students need increased opportunities to practice internet searches in safe environments. We are not suggesting teachers should provide the websites for research. We support instruction in which students engage in the “messiness” of online searches [4] (p. 98), where teachers guide students to become critical consumers of information. Students need authentic opportunities to safely search the wild wide web with teacher support and guidance. The need for strong web literacy skills will “increase, not decrease, the central role teachers play in orchestrating learning experiences for students as literacy instruction converges with internet technologies” [18] (p. 1173).
What does this mean for educators? Just as teachers teach nonfiction text features in paper-based books, including how to use the glossary, headings, charts, and tables and how to distinguish facts from opinions, in the online information age they are charged with teaching students how to determine source credibility and helping them develop reliability reasoning. This instruction needs to begin at an early age if we are to equip students with the tools and thought processes needed to critically examine information. The International Society for Technology in Education published standards for students that identify web literacy competencies for learning in the digital age [28]. The standards, adopted in all 50 US states and in many countries, are available in eight languages. Standard 3 relates to the content of this article with its focus on students as “Knowledge Constructors” [28] (para. 4). The corresponding skill states that “Students evaluate the accuracy, perspective, credibility and relevance of information, media, data or other resources” (para. 4). The age range for the student standards is unclear. Perhaps such skills should be the focus of teachers around the globe.

6. Conclusions, Limitations, and Implications for Future Research

The guiding question for this study was: What search and evaluation skills do students in grades 1–5 demonstrate during an internet query? Findings from this study indicate that upon initiating an internet query, participants in grades 3–5 could narrow an internet search efficiently. Even students in grades 1 and 2 had some success narrowing an internet search, with more than half of these students demonstrating the skill. Students also demonstrated proficiency at determining which of the websites might be relevant to their search. However, participants did not understand the breadth of the results their query provided; only at the fifth-grade level could half of the students determine how many websites their query produced. The most challenging of the research tasks was the evaluation of information. Participants lacked the evaluation skills needed to determine which website would provide the best information about their topic. Then, once they selected a website, they lacked the evaluation skills necessary to determine whether the website’s information was true, or accurate. Only 18.9% of the participants responded to this evaluation task with acceptable answers.
An educational approach using media literacy [10] and teaching strategies to determine reliable and trustworthy sources may be among the most important literacy work in the 21st century. Fake news will need to be addressed explicitly with educational strategies that equip students to navigate the wild wide web. Just as teachers model concepts with young students using big books [29] and enlarged texts, they can do the same with internet searches on large presentation screens. For example, rather than having an image or video at the ready, teachers can model search processes, including some typical internet missteps [30], and talk through their process, starting from the search engine or the opening page of a website.
This study had a number of limitations that should be considered by researchers seeking to replicate the study. Although our goal was to collect data from across the United States, most of the data in this study were collected from four states. In order for the data to be more generalizable, data need to be representative of each state in the United States. We feel the 2020 pandemic impacted our ability to recruit teachers during the spring of 2020. In addition, some states continued online learning during the fall of 2020. Finally, limited sociodemographic information prevented deeper analysis related to the implications of this study.
The findings from this study have implications for teacher preparation and development. Preservice teachers’ literacy education should extend to concepts of digital print. In addition, in-service teachers’ continued professional development should include evolving digital literacy skills. Navigating online texts is a current need, not a future need. Understanding student knowledge of digital literacy, as well as ways digital texts and media work in an online environment, provides insight into the instruction needed in current elementary settings. Rather than assuming students will learn the needed skills as they engage with online text, we must acknowledge the need for explicit instruction and the benefit of learning through experience.
Our plans for future research include the use of the COT-R with older participants. We will extend data collection into grades 6–8. Future research could also compare the performance of students from varying demographics, such as rural versus urban schools, or schools with and without 1:1 technology initiatives. Finally, the inclusion of participants from across the globe would provide further insight into students’ search and evaluation skills.

Author Contributions

Conceptualization, J.P. and S.V.; methodology, J.P.; formal analysis, J.P. and S.V.; investigation, J.P. and S.V.; resources, J.P. and S.V.; writing—original draft preparation, J.P.; writing—review and editing, S.V.; project administration, J.P. and S.V.; funding acquisition, J.P. and S.V. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the International Literacy Association’s Elva Knight Research Grant and the University of Mary Hardin Baylor’s Graduate Research Grant.

Institutional Review Board Statement

The study was approved by the Institutional Review Board of the University of Mary Hardin-Baylor (protocol #33, approved on 20 June 2019).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data is contained within the article.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Wineburg, S.; McGrew, S.; Breakstone, J.; Ortega, T. Evaluating Information: The Cornerstone of Civic Online Reasoning. Stanford Digital Repository. 2016, pp. 1–27. Available online: http://purl.stanford.edu/fv751yt5934 (accessed on 19 August 2021).
2. Leu, D.J.; Forzani, E.; Timbrell, N.; Maykel, C. Seeing the forest, not the trees: Essential technologies for literacy in the primary grade and upper-grade classroom. Read. Teach. 2015, 69, 139–145.
3. Dumitru, E. Testing children and adolescents’ ability to identify fake news: A combined design of quasi-experiment and group discussions. Societies 2020, 10, 71.
4. Vasinda, S.; Pilgrim, J. Are we preparing students for the web in the wild? An analysis of features of websites for children. J. Lit. Technol. 2019, 20, 97–124.
5. Pilgrim, J.; Vasinda, S.; Bledsoe, C.; Martinez, E. Critical thinking is CRITICAL: Octopuses, online sources, and reliability reasoning. Read. Teach. 2019, 73, 85–93.
6. Pilgrim, J.; Vasinda, S.; Bledsoe, C.; Martinez, E. Concepts of online text: Examining online literacy tasks of elementary students. Read. Horiz. A J. Lit. Lang. Arts 2018, 57, 68–82. Available online: https://scholarworks.wmich.edu/cgi/viewcontent.cgi?article=3460&context=reading_horizons (accessed on 19 August 2021).
7. Coiro, J.; Dobler, E. Exploring the online reading comprehension strategies used by sixth-grade skilled readers to search for and locate information on the internet. Read. Res. Q. 2007, 36, 378–411.
8. Allcott, H.; Gentzkow, M. Social media and fake news in the 2016 election. J. Econ. Perspect. 2017, 31, 211–237.
9. Brennen, B. Making sense of lies, deceptive propaganda, and fake news. J. Media Ethics 2017, 32, 179–181. Available online: https://epublications.marquette.edu/cgi/viewcontent.cgi?article=1491&context=comm_fac (accessed on 19 August 2021).
10. Loos, E.; Ivan, L.; Leu, D. Save the pacific northwest tree octopus: A hoax revisited. Or: How vulnerable are school children to fake news? Inf. Learn. Sci. 2018, 119, 514–528.
11. Van den Hoven, M.J. Towards ethical principles for designing politico-administrative information systems. Informatiz. Public Sect. 1994, 3, 353–373.
12. Rawls, J. Political Liberalism; Columbia University Press: New York, NY, USA, 1993. Available online: http://cup.columbia.edu/book/rawlss-political-liberalism/9780231149716 (accessed on 19 August 2021).
13. Bovens, M.A.P. Information rights: Citizenship in the information society. J. Political Philos. 2002, 10, 317–341.
14. Bovens, M.A.P.; Loos, E.F. The digital constitutional state: Democracy and law in the information society. Inf. Polity 2002, 185–197.
15. Techopedia. Walled Garden. n.d. Available online: https://www.techopedia.com/definition/2541/walled-garden-technology (accessed on 19 August 2021).
16. Foundation for Critical Thinking. Our Concept and Definition of Critical Thinking. 2017. Available online: https://www.criticalthinking.org/pages/our-conception-of-critical-thinking/411 (accessed on 19 August 2021).
17. Dewey, J. Experience and Education; Macmillan: New York, NY, USA, 1938.
18. Leu, D.J.; Kinzer, C.K.; Coiro, J.; Castek, J.; Henry, L.A. A dual-level theory of the changing nature of literacy, instruction, and assessment. In Theoretical Models and Processes of Literacy, 7th ed.; Alvermann, D.E., Unrau, N.J., Sailors, M., Ruddell, R.B., Eds.; Routledge: New York, NY, USA, 2019; pp. 319–346.
19. Leu, D. Schools Are an Important Key to Solving the Challenge of Fake News. 2017. Available online: https://education.uconn.edu/2017/01/30/schools-are-an-important-key-to-solving-the-challenge-of-fake-news/ (accessed on 19 August 2021).
20. November, A. Web Literacy for Educators; Corwin: Thousand Oaks, CA, USA, 2008.
21. Dalton, B. Charting our path with a web literacy map. Read. Teach. 2015, 68, 604–608.
22. Clay, M.M. The Early Detection of Reading Difficulties, 3rd ed.; Heinemann: Portsmouth, NH, USA, 1979.
23. Bear, D.R.; Invernizzi, M.; Templeton, S.; Johnston, F. Words Their Way: Word Study for Phonics, Vocabulary, and Spelling, 5th ed.; Pearson: Saddle River, NJ, USA, 2015.
24. Druin, A.; Foss, E.; Hatley, L.; Golub, E.; Guha, M.L.; Fails, J.; Hutchinson, H. How children search the internet with keyword interfaces. In Proceedings of the 8th International Conference on Interaction Design and Children, IDC 2009, Como, Italy, 3–5 June 2009.
25. Hargittai, E. Digital natives? Variation in internet skills and uses among members of the “net generation”. Sociol. Inq. 2010, 80, 92–113.
26. Margaryan, A.; Littlejohn, A.; Vojt, G. Are digital natives a myth or reality? University students’ use of digital technologies. Comput. Educ. 2011, 56, 429–440.
27. Prensky, M. Digital natives, digital immigrants. Horizon 2001, 9, 1–16.
28. International Society for Technology in Education. ISTE Standards for Students. 2016. Available online: https://iste.org/iste-standards (accessed on 19 August 2021).
29. Stahl, K.A. Complex text or frustration level text: Using shared reading to bridge the difference. Read. Teach. 2012, 66, 47–51.
30. Warlick, D.F. Redefining Literacy 2.0, 2nd ed.; Linworth Books: Columbus, OH, USA, 2009.
Figure 1. Google search result.
Table 1. Considerations for concepts of online text assessment based on concepts about print assessment.

Column headings: M. Clay’s Concepts About Print Assessment (concepts of print-based text; reader prompts) | Concepts of Online Text Assessment (considerations for COT assessment development)

Orientation or layout of text/front of book
  Reader prompts: Where is the front of the book? Where is the back of the book? Open the book to where the story begins.
  COT considerations: What parts of a website does a student need to know? The URL leads to the “book”/site; do students know this term and its purpose? Consider the layout of a website, including similarities to and differences from a print-based text.

Print, not pictures, carries the message
  Reader prompts: Show me the picture. Show me the words.
  COT considerations: Components of a webpage all carry meaning: print, visuals, hyperlinks, structure/organization, etc.

Direction of print
  Reader prompts: Show me where to start reading. Where do I read after this?
  COT considerations: Direction of print/reading is different on a webpage/website (not necessarily linear). How does a reader scroll and move forward/back?

Page sequencing
  Reader prompts: Where do I read after this?
  COT considerations: “Page” sequencing involves webpages within a site (not necessarily linear). How does a reader “turn pages” in a non-linear format?

Print features particular to online text
  • Hyperlinks (various formats and purposes: definitions, additional information, graphics, etc.)
  • Differences between websites and webpages (one hyperlink can lead to another website, taking the reader to another “book” rather than another page/chapter within the same book); can the reader differentiate?
  • Titles and headings (throughout the website/webpage)

Difference between letter and word
  Reader prompts: Show me one letter. Show me one word. Show me the first letter in a word. Show me the last letter in a word.
  COT considerations: This is requisite knowledge needed for reading online text.

Return sweep
  Reader prompts: Where do I read after this?
  COT considerations: The same skill is needed for tracking online text; however, online text may require clicking on “read more” types of links to additional webpages for the complete text and then navigating back to the original page.

One-to-one correspondence
  Reader prompts: Point to each word as I read this line.
  COT considerations: This is requisite knowledge needed for reading online text.

Punctuation
  Reader prompts: Do you know what this is? What is this for?
  COT considerations: This is requisite knowledge needed for reading online text.

Synthesizing information
  • How does a reader look at the various print components on a website/webpage and synthesize meaning? How do they determine the main idea or topic of a site/page?
  • How does a reader determine the author/owner/publisher of a website?
  • How does a reader determine the publication date of a website?

Evaluating information
  • What information does a reader need to evaluate the credibility of a website?
  • Which components are indicators a website can/cannot be trusted?

Source: Pilgrim et al., 2018 [6].
Table 2. Number of participants per grade level.

Grade Level    N
First          58
Second         78
Third          105
Fourth         52
Fifth          61
Total          354
Table 3. Show me how many websites your search provided.

Grade    % Correct
1        3.4
2        5.1
3        18.1
4        25.0
5        54.1
Table 4. Show me how you could narrow the dolphin search to find what dolphins eat.

Grade    % Correct
1        53.4
2        66.7
3        93.3
4        94.2
5        98.4
Table 5. How do you know which website will provide the best information about your topic?

Grade    % Correct
1        20.7
2        25.6
3        52.4
4        46.2
5        55.7
Table 6. Click on one of the websites you found. How can you tell if this website is relevant to your search? In other words, how can you tell if this website will give you the kind of information you need?

Grade    % Correct
1        37.9
2        58.9
3        86.7
4        90.4
5        95.1
Table 7. How can you tell if this website will provide correct information that is true or accurate?

Grade    % Correct
1        1.7
2        2.6
3        26.7
4        25.0
5        37.7
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
