The Cambridge Analytica story has foregrounded the imperative that we teach students to distinguish between fact and fiction online. All too often, however, the responsibility for this is left to librarians, who rarely have enough contact time with students to do any meaningful work, or, even worse, left to no one at all. Subject teachers have full syllabi into which detailed work on how to evaluate truth is hard to shoehorn. There needs to be some discussion of how this skill is taught explicitly and how it can then be reinforced across the curriculum.
The standard approach to teaching students how to evaluate websites is to use fake websites which have been created for pedagogical purposes. Here are some examples:
- Dihydrogen Monoxide
- The Pacific Northwest Tree Octopus
- All About Explorers
- Dog Island
- Download More RAM
Students are then asked to evaluate these websites, often alongside legitimate ones, to detect which are hoaxes. Common evaluation techniques are usually based around a checklist of concerns: the CRAP Detection method, for example. CRAP stands for Currency, Reliability, Authority and Purpose (or Point of View). Students measure any website against these criteria and then give an overall judgement. As an IT teacher, I have included this kind of exercise in my curriculum for many years. At my school the headmistress felt I should not use a word like CRAP, so I had to invert the acronym, as in the poster shown here.
- Currency: Is the information reasonably up to date? Does it matter in this case?
- Authority: Can the author be trusted? Are they an expert in their field? Do they have a reputation? Authority can mean an individual writer or the website or publisher as a whole.
- Reliability: Is the information factual or is it just an opinion? Does the author give sources so you can check up on what they are claiming?
- Purpose: Can you detect any bias? Is the site trying to sell you something? Are they trying to persuade you about something?
There are some problems with this approach, however. Students find it very difficult to move from a checklist to an overall evaluation. They tend to get bogged down in the detail and lose sight of the big picture. For example, a student may correctly identify the author as suspect, but then rate the website as reliable because it is up to date. Or they may discount a website simply because it is anonymous. Because context means everything, and truth depends on a wide range of concerns, it is hard enough for adults to pick through the minefield of detecting fake news online; for a teenager it is doubly difficult. No single factor should usually be taken as definitive.
So much rests on possessing a robust general knowledge. I would argue that while checklists are useful, they need to be combined with a process-oriented approach which is better able to balance all the factors involved.
The use of fake websites (usually created entirely for the purpose of teaching website evaluation) is also somewhat problematic. They are more suitable for younger students; with older teenagers it is better to evaluate real-world examples. Conspiracy theory websites present a much more nuanced content base for honing evaluation skills. The problem, though, is that conspiracies are not necessarily fake, and even highly intelligent and critical thinkers can disagree over which should be taken seriously and which not. As recent court papers attest, drug companies do tell deliberate falsehoods, and historians have exposed false flag operations such as the Gulf of Tonkin incident. And yet students need to be inoculated against undue trust in conspiracy theories. If you search for Climate Change on YouTube, most of the videos apparently question the reliability of the scientific evidence. If you Google Vaccinations and Autism you might well be misled into believing there is a real debate.
The only way to untangle fact from fiction is to have a world view based on a really good understanding of the Sciences and Humanities. Truth can be evaluated both on the basis of Coherence (whether what is being claimed makes logical sense) and Correspondence with the real world (whether it fits the data and evidence). Real facts can be totally misinterpreted, and logical claims can be based on shaky evidence. No checklist approach can really help untangle this, and yet evaluation needs to be based on a range of factors.
With this in mind I have, over the years, developed a model for teaching website evaluation which takes note of the factors, and tries to define an overarching process for evaluating coherence and correspondence. The poster gives some idea of the process, but I usually design worksheets customized for the particular task at hand with a space for answering the questions.
The first step is to complete an evaluation matrix. This can be calibrated in different ways, but produces an evaluation diamond which gives a graphic representation of the different factors. This allows the student to look into the Currency, Authority, Reliability and Purpose of the website, but to keep this information in the background. It does not immediately lead to an evaluation. The matrix, though, forms a visual reminder. The tighter the diamond the more likely the website is to be fake.
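For teachers who want to make the "tightness" of the diamond concrete, it can also be quantified. The sketch below is my own illustration rather than part of the original worksheet: it assumes each of the four CARP criteria is scored from 0 (poor) to 10 (strong) and plotted on its own axis, so the area enclosed by the diamond serves as a rough overall trust score.

```python
def carp_diamond_area(currency, authority, reliability, purpose):
    """Area of the CARP evaluation diamond.

    Each argument is a score from 0 (poor) to 10 (strong), plotted on
    its own axis: Currency on +x, Authority on +y, Reliability on -x,
    Purpose on -y. The four points form a quadrilateral; a tighter
    (smaller) diamond suggests a less trustworthy website.
    """
    for score in (currency, authority, reliability, purpose):
        if not 0 <= score <= 10:
            raise ValueError("scores must be between 0 and 10")
    # The shoelace formula for a quadrilateral whose vertices lie on
    # the axes reduces to half the product of the diagonal lengths.
    return 0.5 * (currency + reliability) * (authority + purpose)
```

A fully trustworthy site (all scores 10) gives the maximum area of 200, while a site that fails on every criterion collapses to 0; the 0–10 scale is an assumption and could be calibrated differently for younger classes.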
The student then answers four questions designed to get them to think about whether the information presented corresponds with the real world and is coherent. Only with the fourth question are students asked to give an overall evaluation based on their gut feel. This is done to discourage students from making an evaluation until all the other factors have been considered.
- Does the information fit with everything else you know about the world?
- Is the information confirmed in other sources?
- Does it make sense?
- What does your gut tell you? Give a rating from 1 (Fake) to 10 (Reliable)
Students seem to enjoy filling in the CARP diamond and comparing the shapes they produce with others’ responses. Having a visual summary of the evaluation checklist really helps stimulate discussion. The four questions allow students to use a search engine to fact-check the content and the author in greater detail. I would recommend that you scaffold this in any worksheet you provide. I always find it useful to have students work in groups to evaluate a few websites, and then report back to the whole class with their findings. You can use an online platform like Flipgrid to facilitate feedback. Working in groups encourages students to voice their responses to the website and defend their points of view.
I believe it is also vital to correct poor findings – and yes, I have had groups conclude in their presentations that the tree octopus is real, or that dihydrogen monoxide (water) is a dangerous substance.