Manuscript accepted and in press at the Journal of Risk Research

The Fake News Game: Actively Inoculating Against the Risk of Misinformation

Jon Roozenbeek*
University of Cambridge

Sander van der Linden**
University of Cambridge

*Correspondence to: Jon Roozenbeek, Department of Slavonic Studies, Sidgwick Avenue, University of Cambridge, Cambridge, UK CB3 9DA. E-mail: [email protected]

**Correspondence to: Dr. Sander van der Linden, Department of Psychology, Downing Street, University of Cambridge, Cambridge, UK CB2 3EB. E-mail: [email protected]

Abstract

The rapid spread of online misinformation poses an increasing risk to societies worldwide. To help counter this, we developed a "fake news game" in which participants are actively tasked with creating a news article about a strongly politicized issue (the European refugee crisis) using misleading tactics, from the perspective of different types of fake news producers. To pilot test the efficacy of the game, we conducted a randomized field study (N = 95) in a public high school setting. Results provide some preliminary evidence that playing the fake news game reduced the perceived reliability and persuasiveness of fake news articles. Overall, these findings suggest that educational games may be a promising vehicle to inoculate the public against fake news.

Keywords: fake news, inoculation theory, misinformation, post-truth, influence.

Introduction

In an age where almost half of all news consumers receive and share their news from online sources (Mitchell et al., 2016), false information can reach large audiences by spreading rapidly from one individual to another (van der Linden et al., 2017a). While modern societies have been described as increasingly "post-trust" (Löfstedt, 2005), some observers claim we have now entered an era of "post-truth" politics (Higgins, 2016). In fact, "post-truth" was named word of the year by Oxford Dictionaries in 2016, reflecting appeals to emotion and personal belief over objective facts (Oxford Dictionaries, 2016). Although not new (Cooke, 2017), the spread of false information has become synonymous with the term "fake news". A Google Trends analysis reveals that this term began to gain relevance in US Google searches around the time of the US presidential election in 2016, and has remained popular since (see https://trends.google.nl/trends/explore?q=fake%20news).

The risk that fake news poses to evidence-based decision-making is increasingly recognized by governments. For example, the UK parliament recently launched an investigation into how "fake news" is threatening modern democracy (Harriss & Raymer, 2017), and the World Economic Forum (2013) ranked the spread of misinformation as one of the top risks facing the world today. The study of the spread of false information, particularly through social media and online networks, has become a significant object of scholarly research (Boididou et al., 2017; Mustafaraj & Metaxas, 2017; Shao et al., 2017; van der Linden et al., 2017a). Scholars have theorized that fake news can exert a significant degree of influence on political campaigns and discussions (e.g., Allcott & Gentzkow, 2017; Groshek & Koc-Michalska, 2017; Gu, Kropotov, & Yarochkin, 2017; Jacobson, Myung, & Johnson, 2016). Although extensive research exists on political misinformation (for a recent review, see Flynn, Nyhan, & Reifler, 2017),
there is some debate about the extent to which fake news influences public opinion (Shao et al., 2017; van der Linden, 2017), including on social media (Bakshy, Messing, & Adamic, 2015; Flaxman, Goel, & Rao, 2016; Fletcher & Nielsen, 2017; Guess et al., 2018). Nonetheless, a majority (64%) of Americans report that fake news has left them feeling confused about basic facts (Barthel, Mitchell, & Holcomb, 2016), and a study carried out by YouGov (2017) found that while many people believe they can tell the difference between true and fake news, only 4% of those surveyed could systematically differentiate the two. Similarly, a survey conducted by Ipsos MORI found that 75% of Americans who were familiar with a fake news headline thought the story was accurate (Silverman & Singer-Vine, 2016). This is concerning because the functioning of democracy relies on an educated and well-informed populace (Kuklinski et al., 2000), and as such, the spread of misinformation has the potential to undermine both science and society (Lewandowsky et al., 2017; van der Linden et al., 2017a). For example, the viral spread of misinformation on issues such as climate change and vaccines can undermine public risk judgments about not only the state of scientific agreement but also the perceived seriousness of these issues (Lewandowsky et al., 2017; van der Linden et al., 2017b).

Given these findings, a more recent line of inquiry looks at how the fake news dilemma may be solved (Bakir & McStay, 2017; Lazer et al., 2017; van der Linden, 2017). For example, recent risk management initiatives have involved the announcement of controversial fake news laws (Bremner, 2018). Other proposed solutions range from making digital media literacy a primary pillar of education (Select Committee on Communications, 2017), to preventing false information from going viral in the first place or counteracting it in real time (Bode & Vraga, 2015; Sethi, 2017; Vosoughi, Mohsenvand, & Roy, 2017). Lewandowsky et al. (2017) call for technological solutions that incorporate psychological principles, which they refer to as "technocognition".
In a recent edition of Science, van der Linden et al. (2017a) call for a preemptive solution grounded in inoculation theory, which we explore further here.

Inoculation Theory

The diffusion of fake news can be modeled much like the spread of a viral contagion (Budak, Agrawal, & El Abbadi, 2011; Kucharski, 2016). Inoculation theory offers an intuitive solution to this problem (van der Linden, 2017). Inoculation theory was originally pioneered by William McGuire (1964) in an attempt to induce attitudinal resistance against persuasion and propaganda, in a manner analogous to biological immunization. To illustrate: injections that contain a weakened dose of a virus can confer resistance against future infection by activating the production of antibodies. Inoculation theory postulates that the same can be achieved with information. In other words, by preemptively exposing people to a weakened version of a (counter)argument, and by subsequently refuting that argument, attitudinal resistance can be conferred against future persuasion attempts (Papageorgis & McGuire, 1961). The inoculation process has an affective and a cognitive component, often referred to as "threat" and "refutational preemption" (McGuire, 1964; McGuire & Papageorgis, 1962); threat is not always manipulated, however, and there is some disagreement over its importance (see Banas & Rains, 2010). The role of perceived risk or threat is to motivate resistance, whereas refutational preemption involves providing people with specific arguments to help resist persuasion attempts (Compton, 2013; McGuire, 1964; McGuire & Papageorgis, 1962). Inoculation has a rich history in communication (see Compton, 2013 for a review), and the approach has been applied in various contexts,
most notably political campaigns (Pfau & Burgoon, 1988; Pfau et al., 1990) and health risks (Compton, Jackson, & Dimmock, 2016; Niederdeppe, Gollust, & Barry, 2014; Pfau, 1995). A meta-analysis found that inoculation is effective at conferring resistance (Banas & Rains, 2010). Importantly, however, inoculation research has traditionally centered around protecting "cultural truisms", that is, the types of beliefs that are seldom challenged, and relatively little is known about how inoculation works with respect to more controversial issues (McGuire, 1964; Wood, 2007; van der Linden et al., 2017b). Notably, in two recent studies, van der Linden et al. (2017b) and Cook, Lewandowsky, and Ecker (2017) found that inoculating people with facts against misinformation was effective in the context of a highly politicized issue (global warming), regardless of prior attitudes. Similarly, Banas and Miller (2013) were able to inoculate people with facts in the context of 9/11 conspiracy theories. Although promising, most of these studies have been lab-based and rely on passive rather than active refutation, meaning that participants are provided with both the counter-arguments and the refutations rather than having to actively generate pro- and counter-arguments themselves (Banas & Rains, 2010). McGuire hypothesized that active refutation would be more effective (McGuire & Papageorgis, 1961) because counter-arguing is a more involved cognitive process, and some early research has supported this (e.g., Pfau et al., 1997). In addition, many studies use a so-called "refutational-same" message, i.e., inoculating people against specific information to which they will be exposed later on, rather than a "refutational-different" message, where the message refutes challenges that are not specifically featured in a subsequent attack. Although research to date has mostly found subtle differences between different inoculation procedures (Banas & Rains, 2010), the hypothesis that inoculation could provide general immunity against the risk of fake news is intriguing because such general immunity

eliciting attitudinal threat and issue engagement (Pfau et al., 2009). Thus, consistent with inoculation theory, we hypothesize that playing the game will elicit greater affective involvement as compared with a control group (H4).

Method

The Fake News Game

The basic structure of the fake news game is as follows: first, players are divided into groups of 2–4 people. These groups are then randomly assigned one of four key characters. The characters were developed to reflect common ways in which information is presented in a misleading manner (Marwick & Lewis, 2017). The goal of each group is to produce a news article that reflects the goals and motivations of its assigned character. This way, each group approaches the same issue from a different angle. In short, the four characters are: 1) the denier, who strives to make a topic look small and insignificant; 2) the alarmist, who wants to make the topic look as large and problematic as possible; 3) the clickbait monger, whose goal is to get as many clicks (and by extension ad revenue) as possible; and lastly 4) the conspiracy theorist, who distrusts any kind of official mainstream narrative and wants their audience to follow suit.

Each group is given a so-called fact sheet in which the issue at hand is explained in detail and which serves as the background of the article that the players will produce. In our experiment, the overarching topic was immigration, and the specific salient risk issue was a 2016 report by the Dutch Central Agency for the Reception of Asylum Seekers (COA; https://www.coa.nl/en/), which stated that the number of incidents in and around Dutch asylum centers rose between 2015 and 2016 (COA, 2016).
The fact sheet mentions the number of incidents in both years, plus additional information such as the number of filed police reports and cases that made it to court. Additionally, the fact sheet lists a number of possible reasons behind the rise in incidents. Based on the specific goals and motivations of their character, players are then instructed to use the information from the fact sheet to create a fake news article.

The article itself has a systematic structure, consisting of, in order: a) an image, b) a title, c) a header, d) paragraph 1: numbers and facts, e) paragraph 2: interpretation and presentation of numbers and facts, f) paragraph 3: the cause of the problem, g) paragraph 4: consequences of the problem, h) paragraph 5: expert opinion, and i) a conclusion. For each part, groups are given a set of cards with 4 alternatives, each of which presents and interprets the fact sheet in a way consistent with one specific character. Players then pick one of these alternatives for each part of their article, based on their assessment of what their character would choose. Next, they put their choices together in the correct order to form the final article. The group with the most correct answers, i.e. the group that chose the most cards that correspond with their character, wins. Please see photos from the field (Figure 1) for a visual overview of the game.

Figure 1. Fake News Game.
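To make the scoring rule concrete, the sketch below models it in Python. This is purely illustrative and not part of the original game materials: the card texts, the part names shown, and the function name are invented for this example; only the four character roles and the "most matching cards wins" rule come from the description above.

```python
# Minimal illustrative sketch of the game's scoring rule (not the authors'
# materials): a group's score is the number of chosen cards that were written
# in the style of its assigned character. Card texts below are invented.

# Each article part offers four alternative cards, one per character; only the
# "title" part is sketched here as a hypothetical example.
CARDS = {
    "title": {
        "denier": "Minor uptick in asylum-centre incidents, officials unconcerned",
        "alarmist": "Incidents around asylum centres spiral out of control",
        "clickbait monger": "You won't believe what happened at this asylum centre",
        "conspiracy theorist": "What they aren't telling you about asylum centres",
    },
    # ... image, header, paragraphs 1-5, and conclusion would follow the same pattern.
}


def score_group(assigned_character, choices):
    """Count chosen cards that match the group's assigned character.

    `choices` maps each article part to the character whose card the group
    actually picked for that part; the highest-scoring group wins.
    """
    return sum(
        1 for picked_character in choices.values()
        if picked_character == assigned_character
    )


# Example: a group playing the "alarmist" picks the alarmist title card but a
# clickbait-style card for another (hypothetical) part, scoring 1 point.
choices = {"title": "alarmist", "header": "clickbait monger"}
print(score_group("alarmist", choices))  # -> 1
```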

Sample and Participants

We tested the game at a public secondary school in the central-eastern part of the Netherlands. On the day of the experiment, a total of 4 classes of students5 (N = 95) between the ages of 16 and 19 took part in the game. Students from these classes were randomly assigned to an experimental (n = 57) or control (n = 38) group6. The average age in the sample was 16.2 years (SD = 0.81). In total, 59% of the students were male and 41% were female. The groups were somewhat unbalanced, such that participants in the treatment group were more likely to be younger (p < 0.01) and female (74% vs. 53%, p = 0.08).

5 In the Dutch system, Hoger Algemeen Voortgezet Onderwijs (HAVO) and Voorbereidend Wetenschappelijk Onderwijs (VWO) are higher educational divisions that prepare high school students for university-level education.

6 There were two control and two treatment groups that were approximately the same size on paper, but on the day of the experiment there were more absentees in the control group, resulting in the observed imbalance.

Experimental Design and Procedure

We administered a reading task as well as a survey to evaluate our hypotheses and the effectiveness of the fake news game. The reading task involved reading one of two (randomly assigned) fake news articles about an issue closely related to the topic of the game. All facts and figures in these articles, except for proper names and names of institutions, were made up. The two articles were the same (insofar as this is possible) in their overarching topic, setup, structure, length, language use, and in terms of the techniques they used to mislead the audience (Marwick & Lewis, 2017; Hansen, 2017). Both articles related to polarized and contested risk issues about immigration and the refugee crisis in the Netherlands, and assigned blame to the same organization (the European Union). Additionally, the same misleading techniques were implemented in both articles: hyperbole (stylized exaggeration, see McCarthy & Carter, 2004) was used 4 times, the common-sense appeal (attempting to convince readers by appealing to perceived "common sense", see Hansen, 2017) was used 4 times, the argument from authority (asserting an argument is true based on the credentials of the person making it, ibid.) was used twice, conspiratorial reasoning (theorizing that small groups are working in secret against the common good, see Hofstadter, 1964; van der Linden, 2015) twice, demonization of the out-group (Atkinson, 2005) once, whataboutism (discrediting via hypocrisy accusations, or the "tu quoque" fallacy, see Hansen, 2017) once, and the ad hominem attack (Walton, 1998) also once.

The difference between the two articles lies in their framing (Entman, 1993). Article 1 focused on the increasing number of people making their way to Europe from Libya. The framing of the immigration issue was negative: the article suggested that the EU had failed to protect its borders, described immigrants through a negative metaphor, and claimed that this has caused the Dutch to lose faith in the EU. Article 2 did not frame immigration in negative terms, but instead focused on the dire situation in European refugee camps, attributed to a failure on behalf of the EU to create safe living conditions for refugees. As such, the two articles represent two dominant but ideologically opposite frames about the European refugee crisis (Greussing & Boomgaarden, 2017; Rodríguez Pérez, 2017; Zeitel-Bank, 2017). We introduced this variation to control for potential political biases.
The full (translated) articles can be found in the supplement. Before the start of the game, all participants filled out a short questionnaire to measure their general familiarity with the topic, political ideology, and demographic background information. After this, participants in the treatment condition (n = 57) were divided into groups, given materials, assigned a character, and, after a brief explanation by the research assistants, began playing the game.