Developing Academic Practice

New digital methods in a polluted information landscape

Developing Academic Practice 2021, 11–18.

Abstract

Fake news, or what is now more accurately termed information disorder, is an increasing problem. However, the ‘infodemic on the pandemic’ also highlights a related issue: the problem of fraudulent science and its amplification through social media and questionable scientific publishing practices, particularly deceptive or predatory publishing. This paper reports how insights gleaned from a critical digital pedagogy, and new work on the verification of sources, can help students and early career researchers navigate the ‘polluted’ information landscape that has become a ubiquitous feature of our online world. Drawing on the module ‘Critical Thinking in an Age of Fake News and False Accounting’, part of the University of Liverpool’s International Summer School, it shows how commonplace and easily accessible Open Source Intelligence (OSINT) techniques such as reverse image searching and geolocation can be used to develop a more critical perspective on what constitutes a reliable source of information. The research seeks to establish the degree of awareness students have of these techniques and whether they use them in their academic life. It thus contributes to existing initiatives on academic integrity and to the digital goals embodied in new curriculum developments. The paper also asks deeper, more speculative questions about the strategic amplification of disinformation and its relationship to Miltonian ‘marketplace of ideas’ conceptions of knowledge and understanding.


Introduction

In a book exploring knowledge in a social world, the social epistemologist Alvin Goldman (1999) comments on John Milton’s ‘Areopagitica’ and the emergent idea that reliable knowledge is best obtained through robust and sustained inquiry in a ‘marketplace of ideas’. Goldman argues, persuasively, that economic theory itself makes no such claims about the ultimate moral and social value of the kind of ‘products’ - intellectual or otherwise - that will emerge and dominate in a competitive market. On closer examination, the ‘marketplace of ideas’ conception, and the notion that ‘the truth will out’ or that ‘the facts will speak for themselves’ through vigorous debate, are highly questionable.

The spread of viral mis- and disinformation, the deliberate undermining of public trust in science, and the fact that ignorance is not a natural state waiting to be corrected but something that can be actively manufactured through information operations further undermine the claim (Wardle & Derakhshan, 2017; Krasodomski-Jones, Smith, Jones, Judson, & Miller, 2019). Although it may be comforting to believe that progressive ideas will ultimately win out - with the nasty Nazis limping off into obscurity having been dealt the telling (argumentative) blow - such conceptions seem increasingly utopian. And their utopian nature raises difficult questions for the academy and its purpose. Having created a short module on critical thinking and fake news, and having hoped that the former would have a positive impact on the latter, the author has had to question ‘market-based’ metaphors of knowledge creation while developing a critical digital pedagogy to illustrate the mechanics of disinformation spread and how the veritistic consequences of market-based free speech are anything but obvious in an online world.

One factor that calls into question the ‘market’ as some kind of maximizing process for truth and knowledge is that the efficacy of debunking initiatives is now widely debated (Chan, Jones, Hall Jamieson, & Albarracín, 2017; Lewandowsky, Ecker, Seifert, Schwarz, & Cook, 2012). Small-print, back-page retractions and perfunctory apologies no doubt do not help, but contrary to early advocates of marketplace metaphors like John Stuart Mill, the ‘marketplace’ simply does not guarantee that ‘truth will prevail in its collision with error’. This is especially true in conditions of great information asymmetry (Nadler, Crain, & Donovan, 2018). As Thorson (2016) suggests, despite the dogged nature of fact-checking and debunking initiatives, ‘belief echoes’ are common. As Cassam (2016) argues, even when people receive optimal information about a particular topic, their own ‘vice epistemology’ may prevent them from believing it. So might identity-protective cognition (Kahan, Braman, Gastil, Slovic, & Mertz, 2007), filter bubbles (Pariser, 2011), and confirmation biases, all of which have received significant attention. The work of Phillips (2018) on amplification also suggests that strategic silence is sometimes more effective in ensuring that truth wins out.

Methodology

The module aimed to introduce students to a number of key concepts and techniques within the fake news and fake science ecosystem. It did this through a blended learning approach which used the virtual bulletin board Padlet alongside small discussion groups and mini-lectures with tasks. The key concepts introduced included: ‘amplification’ and its problematical relationship to the ‘marketplace of ideas’ metaphor; information disorder; prebunking and debunking; inoculation; filter bubbles; and strategic silence. Techniques included: geolocation and reverse image search; the use of CrowdTangle to obtain information on social media and Raw Graphs to model this information in a network diagram; and basic use of the graphic design platform Canva to produce a newsletter. A two-hour, unstructured focus group discussion was held at the end of the module to ascertain opinions on the main research question. Eighteen students took part.

Inoculation theory and prebunking: Modelling the epistemic stance of falsifiers

Students examined research by Roozenbeek and Van der Linden (2019), who created a fake news game which guides players through the types of processes that lead to the production of fake news. The game is useful in demonstrating some of the business processes surrounding the spread of problematical information, and its theoretical background lies in the inoculation theory pioneered by William McGuire in the early sixties, which sought to produce attitudinal resistance to propaganda. McGuire’s research suggested that, just as with biological immunization, it might be possible to create mental ‘antibodies’ which make us less credulous after selective exposure - a prebunking rather than a debunking (McGuire, 1999).

Students produced a newsletter using Canva and a fake story from the U.K. or their own country. They were divided into role-playing teams that either produced fake news or fact-checked it. Thus, I took the four ‘characters’ proposed by Roozenbeek and Van der Linden - the denier, the alarmist, the clickbait monger, and the conspiracy theorist - and creatively transformed them into ‘reporters’ for a newsletter called The Facts: Giving You the Plain and Simple Truth. Hence, we have ‘Claudius Click-Bait Monger’, royal correspondent; ‘Alan Alarmist’, who wants to magnify everything; ‘David Denial’, a climate change activist who thinks it’s time for an ‘honest’ and ‘adult’ discussion about rising sea temperatures (and who works as a lobbyist for a fossil fuel company); and finally ‘Clara Conspiracy’, who just knows the moon landings were faked - just look at that flag blowing!

Figure 1. Wardle and Derakhshan’s (2017) three elements of information disorder: agent, message, and interpreter.

Students wrote a column in The Facts adopting the likely epistemic stance of one of these reporters, and the fact-checkers (who can also be ‘characterized’) then debunked them. If inoculation theory is accurate, students are prebunked during the creation process and perhaps debunked afterwards. The ‘producers’ had to think about what makes something ‘newsworthy’, while the fact-checkers had to follow a procedure to investigate the likely veracity of the claims. Students also thought closely about the motivations these ‘characters’ have for writing this type of copy, using Wardle and Derakhshan’s (2017) three elements of information disorder: agent, message, and interpreter (see Figure 1).

Academic integrity and new digital techniques: The use of reverse image search

Some of the aforementioned ideas have also enabled students to investigate the contract cheating industry, which now poses immediate problems for academic integrity (Westminster Higher Education Forum, 2019). The hope here was that, since contract cheating is far less inadvertent than plagiarism, arguments appealing to integrity might be ignored by individuals determined to cheat; showing students how they themselves are being duped may produce a more salutary effect. Using RevEye, students reverse image search the ‘writers’ employed by these sites and find that many, if not most, are stock images from photobanks. I call this particular activity ‘Ancestry’. For example, the ‘genealogy’ of ‘Beverley Lopez’, from a now defunct diploma mill, was traced through TinEye (see Figure 2).

Figure 2. ‘Ancestry’, or the genealogy of ‘Beverley Lopez’, traced using TinEye reverse image searching. Original image from a misleading ‘diploma mill’.
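
The ‘Ancestry’ activity can even be scripted. The following minimal sketch, in Python, builds reverse-image-search links from a list of profile-photo URLs taken from a contract cheating site; the example image URL is hypothetical, and the TinEye and Google URL patterns are assumptions based on their public search pages rather than documented APIs.

```python
# Minimal sketch of the 'Ancestry' activity: build reverse-image-search
# links for the 'writer' photos found on a contract-cheating site.
# NOTE: the URL patterns below mirror TinEye's and Google's public search
# pages; they are assumptions, not documented APIs.
from urllib.parse import quote

# Hypothetical image URL for illustration; substitute images from the site.
writer_photos = [
    "https://example-essay-mill.com/images/beverley-lopez.jpg",
]

for url in writer_photos:
    encoded = quote(url, safe="")
    print("TinEye:", f"https://tineye.com/search?url={encoded}")
    print("Google:", f"https://www.google.com/searchbyimage?image_url={encoded}")
```

Students can open the printed links in a browser and compare the results, which is usually enough to expose a photobank ‘genealogy’.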

Students also examined sites which make extensive use of Oxbridge imagery or the Long Room at Trinity College Dublin. Simply geolocating the real ‘offices’ of these ‘service providers’ and having a walk around using Google pegman/Street View reveals the difference between the imagery of the website and the ‘mop cupboard’ reality of an office which, if not exactly in the ‘middle of nowhere’, is certainly nowhere near the prestigious address implied by the site. I call this activity ‘Location! Location! Location!’ To further develop students’ critical visual capacities, they take First Draft’s verification challenges on visual information (https://firstdraftnews.org/training/). They also learn the kind of verification fundamentals outlined by Silverman (2015). These can be given more theoretical rigour through a consideration of the work of Birdsell and Groarke (2007) on visual argumentation and Van Leeuwen (2007) on legitimation.
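
A similar sketch works for ‘Location! Location! Location!’. Assuming only Google’s documented Maps URL scheme (?api=1&query=), the snippet below turns the addresses claimed by a ‘service provider’ into map links, from which students can drop pegman and compare the website’s imagery with street-level reality; the address shown is hypothetical.

```python
# 'Location! Location! Location!' as a script: generate Google Maps links
# for the office addresses a 'service provider' claims, using Google's
# documented Maps URL scheme (?api=1&query=...).
from urllib.parse import quote_plus

# Hypothetical address for illustration; use the one printed on the site.
claimed_addresses = [
    "1 Prestigious Square, Oxford, UK",
]

for address in claimed_addresses:
    print(f"https://www.google.com/maps/search/?api=1&query={quote_plus(address)}")
```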

Amplification and the role of networks: Tracking the spread of dubious scientific research

Since academic integrity is also a problem for some academics, students on the module have also examined sites from deceptive or predatory academic publishers, which can seem unimpeachable because of their glossy appearance and apparently ‘serious’ content.

Using the CrowdTangle browser extension on a deceptive site allows students to see who has been spreading the ‘research’ on Twitter, Facebook, Instagram, and Reddit. The CSV data exported from CrowdTangle can then be used to produce network visualizations in applications such as Table2Net, Neo4j, or Raw Graphs, an approach that can be theoretically enriched by introducing students to the actor-network ideas of Bruno Latour (2013).
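
As an illustration of the modelling step, the sketch below converts a CrowdTangle CSV export into a source/target/weight edge list that Raw Graphs or Gephi can ingest. The column names (‘Page Name’, ‘Link’) and the filenames are assumptions; exports vary, so the headers should be checked against the actual file.

```python
# Minimal sketch: turn a CrowdTangle CSV export into an edge list for a
# network diagram. Column names ('Page Name', 'Link') are assumptions --
# check the headers of your own export and adjust accordingly.
import csv
from collections import Counter

edges = Counter()
with open("crowdtangle_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        account = (row.get("Page Name") or "").strip()
        link = (row.get("Link") or "").strip()
        if account and link:
            # Each (account, link) pair is a potential edge: an account
            # amplifying a piece of the dubious 'research'.
            edges[(account, link)] += 1

# Write a source/target/weight file that Raw Graphs or Gephi can ingest.
with open("edges.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["source", "target", "weight"])
    for (account, link), weight in edges.items():
        writer.writerow([account, link, weight])
```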

On one deceptive site, for example, students found a research paper on heart disease in the U.K. which claimed that, despite years of cuts, the country had done rather well. Unsurprisingly, this was taken up by legacy media (The Times) and then tweeted about by a former government minister. A quick trawl through Lexis Library News revealed further spread. It is likely that neither the journalist nor the MP knew the true nature of the journal reporting the research. It is, however, discouraging to note that the same publisher has recently been flagged by the Liverpool Open Access Twitter account for accepting a spoof paper unhappily entitled ‘What’s the deal with birds?’. This acceptance seems to be part of a pattern of charging outrageous article processing charges and ignoring peer review.

Results

It is difficult to say whether my long-term efforts at prebunking via inoculation theory have been successful; currently, the project and the module lack rigorous data. The focus group discussion at the end revealed that students believed they were more aware of the problems of information disorder after taking the module. Of the eighteen students who took part, none had used reverse image search in a critical way to examine the origins of images and, although they were familiar with Google pegman, none had thought of using it as a quick check on the likely veracity of a contract cheating business. All of them felt they needed to be more careful about what information they shared after learning how to use CrowdTangle and Raw Graphs to model the spread of misinformation. One student did, however, say that it was unrealistic to expect people to be constantly on their guard and that she would continue to inadvertently spread both mis- and disinformation. Another questioned the utility of the term information ‘disorder’. Was there ever a time, she wondered, when information was ‘perfect’? How could it not be ‘disordered’? The course had, after all, mentioned that propaganda was not ‘new’. Questions like these probably speak to some lack of conceptual rigour in the emerging literature. Some students were shocked by the idea that the good might not necessarily win out in a ‘marketplace of ideas’. How could your idea ‘product’ ever reach the ‘market’ if the market was so overcrowded and you could not amplify it like a state actor or large organization?

Conclusion

Discussions throughout the module and at the focus group at the end suggest that, although students may be ‘digital natives’, they are rarely critical ‘consumers’ or ‘users’ of digital material. There is an obvious need for a pedagogy which allows them to make critical use of existing technologies (like geolocation) while at the same time asking deeper questions about the versions of human flourishing currently propagated by tech companies.

References

Birdsell, D. S., & Groarke, L. (2007). Outlines of a theory of visual argument. Argumentation and Advocacy, 43(3-4), 103-113.

Bruner, J. (1991). The narrative construction of reality. Critical Inquiry, 18(1), 1-21.

Cassam, Q. (2016). Vice epistemology. The Monist, 99(2), 159-180.

Chan, M. P. S., Jones, C. R., Hall Jamieson, K., & Albarracín, D. (2017). Debunking: A meta-analysis of the psychological efficacy of messages countering misinformation. Psychological Science, 28(11), 1531-1546.

Goldman, A. (1999). Knowledge in a social world. Oxford: Clarendon Press.

Kahan, D. M., Braman, D., Gastil, J., Slovic, P., & Mertz, C. K. (2007). Culture and identity-protective cognition: Explaining the white-male effect in risk perception. Journal of Empirical Legal Studies, 4(3), 465-505.

Krasodomski-Jones, A., Smith, J., Jones, E., Judson, E., & Miller, C. (2019). Warring songs: Information operations in the digital age. Demos. https://demos.co.uk/project/warring-songs-information-operations-in-the-digital-age/.

Latour, B. (2013). Reassembling the social: An introduction to actor-network-theory. Journal of Economic Sociology, 14(2), 73-87.

Lewandowsky, S., Ecker, U. K., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131.

McGuire, W. J. (1999). Constructing social psychology: Creative and critical aspects. Cambridge: Cambridge University Press.

Nadler, A., Crain, M., & Donovan, J. (2018). Weaponizing the digital influence machine. Data & Society Research Institute. https://datasociety.net/wp-content/uploads/2018/10/DS_Digital_Influence_Machine.pdf.

Pariser, E. (2011). The filter bubble: How the new personalized web is changing what we read and how we think. London: Penguin.

Phillips, W. (2018, 22 May). The oxygen of amplification: Better practices for reporting on extremists, antagonists, and manipulators. Data & Society. https://datasociety.net/library/oxygen-of-amplification/.

Roozenbeek, J., & Van der Linden, S. (2019). The fake news game: Actively inoculating against the risk of misinformation. Journal of Risk Research, 22(5), 570-580.

Schwitzgebel, E. (2012). Self-ignorance. In J. Liu & J. Perry (Eds.), Consciousness and the self: New essays (pp. 184-197). Cambridge: Cambridge University Press.

Silverman, C. (2015). Verification handbook for investigative reporting: A guide to online search and research techniques for using UGC and open source information in investigations. European Journalism Centre. https://datajournalism.com/read/handbook/verification-2.

Thorson, E. (2016). Belief echoes: The persistent effects of corrected misinformation. Political Communication, 33(3), 460-480.

Van Leeuwen, T. (2007). Legitimation in discourse and communication. Discourse & Communication, 1(1), 91-112.

Wardle, C., & Derakhshan, H. (2017). Information disorder: Toward an interdisciplinary framework for research and policy making. Strasbourg: Council of Europe.

Westminster Higher Education Forum. (2019). Contract cheating in higher education: Prevention, detection, disruption, and legislative options. Seminar. Westminster Higher Education Forum. https://www.westminsterforumprojects.co.uk/publication/tackling-contract-cheating-in-HE-19.

Author details

Leeke, Philip

Thiruvenkataswami, S.