The Rise of Misinformation & Disinformation

By Scott Reddiex MRSV
Editor-in-Chief, Science Victoria

In this edition, we’re looking at four crises currently faced by our planet. For three of these, it’s relatively easy to see the threat: declining biodiversity is threatening the existence of species; air, water, and soil pollution are impacting the health of every living thing; and climate change is increasingly throwing the Earth’s systems into chaos. They really feel like the horsemen of an apocalypse.

The fourth of these crises – the rise of misinformation – might sound less catastrophic than its counterparts, but it is arguably the horseman who preceded the others. Understanding, curbing, and countering misinformation is vital if we are to have any hope of meaningfully addressing these planetary crises.

What is misinformation? What problems does it cause? And, most importantly, what can we do about it?

Fake news. Photograph: modified from Jorge Franganillo via Unsplash

The mis- and dis- of information

‘Information’ informs an audience about something. It is data that has been processed and given context so that it conveys meaning. The numbers 12, 18, and 90 are seemingly random data, but with the context of 12 goals, 18 behinds, 90 points, those numbers inform the audience of Collingwood’s winning score in the 2023 AFL Grand Final.

The difference between misinformation and disinformation is intent. Both refer to false, inaccurate, or misleading information; misinformation is shared without the intent to deceive, while disinformation is created and shared with the intention of deliberately deceiving or misleading others. If misinformation is like manslaughter, then disinformation is like murder – either way, the information is dead.

The spread of false information is nothing new, nor are large-scale campaigns that use misinformation and disinformation as a tool. Wartime propaganda is one example: during World War II, the broadcasts of ‘Der Chef’ (‘The Chief’) sounded like rogue German military chatter, but were in fact British black propaganda – voiced by a German émigré – designed to spread fake news and sow discord within the Reich.1

Disinformation campaigns are not limited to countries at war. Over the last century, many industries – most notably fossil fuels, tobacco, alcohol, and sugar – have run campaigns to obscure the truth, outright deceive the public, and alter health policy.

Photograph: Elena Leya via Unsplash.

Sickly sweet

Sugar intake has been associated with obesity and other diseases for more than 2000 years, but the cost and limited availability of refined sugar restricted its widespread use in ‘Western’ diets until American cane plantations enabled mass production.2,3

The 20th century saw sugar production costs decrease further, the invention of high-fructose corn syrup (HFCS), and the increased availability of processed foods and drinks containing added sugar. This coincided with a rise in cases of obesity, diabetes, and coronary heart disease, with many studies indicating a connection.4,5

In response, the sugar industry (i.e., companies including food and beverage manufacturers, and those involved in sugar crop farming and refinement) maintained a multifaceted campaign to obscure the science and undermine public health policy on sugar.6

A central part of this campaign was generating disinformation and supporting the spread of misinformation through trade associations, front groups, and PR firms. This included companies establishing in-house ‘research institutes’, which would then publish misleading studies and communications that downplayed or ignored the health impacts of sugar.

A 2008 fact sheet from the Nestlé Research Centre disputed the warnings about sugar’s connection with obesity, stating that “the data on which these warnings are based are limited”, and that “messages to reduce sugar consumption to prevent body weight gain, although seemingly plausible, are therefore contrary to the evidence provided by current epidemiological research.”6,7

Another successful tactic of the disinformation campaign was to shift the blame for obesity and other diseases from sugar to fat – which contributed to the marketing of the ‘low fat’ and ‘no fat’ (but higher sugar) products we are all familiar with.8,9,10

Our consumption of sugar is now decreasing, thanks in part to new sugar-free sweeteners and public health campaigns. However, the sugar industry’s campaign is just one example of how disinformation can become widely shared misinformation, like “fat makes you fat”.

Reasons behind the rise

Given this long history of misinformation and disinformation, why have they now risen to the level of a planetary crisis?

The University of Melbourne’s A/Prof Andrew Perfors puts it down to two fundamental reasons: the advent of the internet, and our difficulties in determining which information we should believe.11

Put simply, it comes down to technology and psychology.

The internet and social media have rapidly accelerated the spread and escalated the risks posed by misinformation. Once restricted to legacy media formats, incorrect information is now able to quickly (and repeatedly) reach its audience and disrupt societies, as we witnessed during the COVID-19 pandemic.

Facebook, Twitter, TikTok, YouTube, Instagram, LinkedIn, Google Search and Google News, Snapchat, and Apple News are key digital platforms for news and information for many Australians, as detailed by the Australian Communications and Media Authority (ACMA) in their 2020 position paper on addressing misinformation.12 As unregulated as they can seem, these platforms have at least some ability to flag inaccurate, misleading, or inappropriate content. In contrast, content shared on encrypted messaging platforms like WhatsApp and Telegram is significantly harder to address.

Social media apps. Photograph: modified from Nathan Dumlao via Unsplash

On all of these digital platforms, disinformation can be introduced by a malicious group or individual, and then shared within and across platforms by many different people – who may have no intention to mislead or deceive – as misinformation.13 This can result in a single piece of misinformation being seen repeatedly by a single viewer in a short span of time, which unfortunately makes it even harder to identify it as false.14

When we are bombarded rapidly with information across different platforms every day, we don’t take the time to critically assess the validity of every story. It’s incredibly difficult to prove or disprove the majority of what we read, and so we use mental shortcuts to quickly separate fact from fiction… but these can let us down.

The mental shortcuts we take to reach decisions are what psychologists call ‘heuristics’. Our tendency to believe a piece of information we see repeatedly is an example of heuristic decision making – we tend to favour things that we are ‘familiar’ with, even if it’s something we first read a few days ago. In other words, we tend to think that if everyone’s saying it, then it must be true.14,15

We also tend to believe things that are easier for us to process. It’s a big reason why communicating scientific expertise is most effective when the science is embedded within an engaging narrative, written with simple language and minimal jargon. It’s also why misinformation conveyed as part of a compelling narrative is more effective.16

With misinformation impacting public health,17 impairing our response to climate change,18 and sowing discord around the world,19 how can we protect ourselves and our communities?

Stemming the rising tide of misinformation

If we attribute the rise of misinformation to technology and psychology, then any response needs to address both of these factors.

Managing the technology element without blanket bans on websites requires nuance – especially when the offending sites are not hosted in Australia, nor subject to our national regulatory frameworks and jurisdiction for enforcement.

The federal government’s Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2023 was released as an ‘exposure draft’ in June 2023, with the aim of both defining misinformation and empowering the ACMA to address it.20 Consultation on the draft closed in August 2023, having received many submissions from digital platforms, media outlets (including the ABC and SBS), and the general public.21 The bill is expected to be introduced to parliament in 2024, and we will have to wait and see what powers the ACMA will be given – and how effectively they address misinformation in practice.

Addressing the ‘psychology’ aspect largely focuses on building resilience to misinformation in the population. In other words, helping people identify misinformation so that we’re all a little less susceptible to it.

At a population level, it means things like building critical thinking into education, so that students know how to effectively fact-check the claims they encounter. It’s not enough to rote-learn an answer – you need to show your working, which makes logical fallacies a little easier to spot.

A recent report from the University of Canberra’s News and Media Research Centre provides recommendations on how to support resilience to misinformation at the population level, including encouraging students to fact-check new or unfamiliar claims, and even creating a dedicated agency to build information literacy in the population.22

On an individual level, we need to be aware of the cognitive shortcuts we take every day, and understand how we can improve our own decision making. Slowing down the rate at which we are bombarded with information is a good start. Rather than swiping from one headline or social media post to the next, we can pause and process the new data.

Check the source of the information, and the author’s credentials – a TV chef might not be an authority on vaccination. Have some healthy scepticism around sensational claims (e.g., a ‘miracle cure!’ should warrant further investigation), and know how to recognise a trustworthy source of scientific information.22,23 

Best of all: when in doubt, don’t share it.

Finally, STEMM professionals need to do better at communicating their expertise to a general audience. We need to remember that the majority of high-quality, peer-reviewed research sits behind paywalls, written in very technical language. If you aren’t clearly explaining your work to everyone in accessible language, then it’s likely not being heard. Or worse – it has been misinterpreted, and misrepresented.

Misinformation and disinformation have existed as long as our ancestors have been able to communicate, and it’s going to continue existing for as long as our species persists. We therefore need to make an active effort to ensure that we are all better equipped to identify it and limit its spread, so that we minimise its very real impacts on our societies.

References:
1. Shaer, M. (2017, March 23). Fighting the Nazis With Fake News. Smithsonian Magazine; Smithsonian Magazine. smithsonianmag.com/history/fighting-nazis-fake-news-180962481/
2. Johnson, R. J., et al. (2017). Perspective: A Historical and Scientific Perspective of Sugar and Its Relation with Obesity and Diabetes. Advances in Nutrition (Bethesda, Md.), 8(3), 412–422. doi.org/10.3945/an.116.014654
3. Kunjalal Bhishagratna. (1907). An English Translation of the Sushruta Samhita Based on Original Sanskrit Text.
4. Johnson, R. J., et al. (2017). Perspective: A Historical and Scientific Perspective of Sugar and Its Relation with Obesity and Diabetes. Advances in Nutrition (Bethesda, Md.), 8(3), 412–422. doi.org/10.3945/an.116.014654
5. Faruque, S., et al. (2019). The Dose Makes the Poison: Sugar and Obesity in the United States – a Review. Polish Journal of Food and Nutrition Sciences, 69(3), 219–233. doi.org/10.31883/pjfns/110735
6. Goldman, G., et al. (2014). Added Sugar, Subtracted Science: How Industry Obscures Science and Undermines Public Health Policy on Sugar. Union of Concerned Scientists. jstor.org/stable/resrep00044, full text: ucsusa.org/sites/default/files/2019-09/added-sugar-subtracted-science.pdf
7. Roy, V., Nestlé. (2008). Food and Nutrition Communication. Nestlé. nestle.it/sites/g/files/pydnoa476/files/asset-library/documents/pdf_cartellastampa/zucchero.pdf
8. O’Connor, A. (2016, September 12). How the Sugar Industry shifted blame to fat. The New York Times. nytimes.com/2016/09/13/well/eat/how-the-sugar-industry-shifted-blame-to-fat.html
9. Domonoske, C. (2016, September 13). 50 Years Ago, Sugar Industry Quietly Paid Scientists to Point Blame at Fat. NPR. npr.org/sections/thetwo-way/2016/09/13/493739074/50-years-ago-sugar-industry-quietly-paid-scientists-to-point-blame-at-fat
10. Kearns, C. E., Schmidt, L. A., & Glantz, S. A. (2016). Sugar Industry and Coronary Heart Disease Research. JAMA Internal Medicine, 176(11), 1680. doi.org/10.1001/jamainternmed.2016.5394
11. Perfors, A. (2022, May 20). Why are we so vulnerable to bad information? Pursuit. pursuit.unimelb.edu.au/articles/why-are-we-so-vulnerable-to-bad-information
12. ACMA. (2020). Misinformation and news quality on digital platforms in Australia: A position paper to guide code development. acma.gov.au/sites/default/files/2020-06/Misinformation%20and%20news%20quality%20position%20paper.pdf
13. Bond, S. (2021, May 14). Just 12 People Are Behind Most Vaccine Hoaxes On Social Media, Research Shows. NPR.org. npr.org/2021/05/13/996570855/disinformation-dozen-test-facebooks-twitters-ability-to-curb-vaccine-hoaxes
14. Pennycook, G., et al. (2018). Prior exposure increases perceived accuracy of fake news. Journal of Experimental Psychology: General, 147(12), 1865–1880. doi.org/10.1037/xge0000465
15. Schwikert, S. R., & Curran, T. (2014). Familiarity and recollection in heuristic decision making. Journal of Experimental Psychology: General, 143(6), 2341–2365. doi.org/10.1037/xge0000024
16. Bullock, O. M., et al. (2021). Narratives are Persuasive Because They are Easier to Understand: Examining Processing Fluency as a Mechanism of Narrative Persuasion. Frontiers in Communication, 6. doi.org/10.3389/fcomm.2021.719615
17. Barua, Z., et al. (2020). Effects of misinformation on COVID-19 individual responses and recommendations for resilience of disastrous consequences of misinformation. Progress in Disaster Science, 8, 100119. doi.org/10.1016/j.pdisas.2020.100119
18. Pierre, J., & Neuman, S. (2021, October 27). How Decades of Disinformation about Fossil Fuels Halted U.S. Climate Policy. NPR. npr.org/2021/10/27/1047583610/once-again-the-u-s-has-failed-to-take-sweeping-climate-action-heres-why
19. Donovan, J. (2024, January 5). Jan. 6 was an example of networked incitement − a media and disinformation expert explains the danger of political violence orchestrated over social media. The Conversation. theconversation.com/jan-6-was-an-example-of-networked-incitement-a-media-and-disinformation-expert-explains-the-danger-of-political-violence-orchestrated-over-social-media-220501
20. Thorpe, A. (2023, June 26). Australia vs the internet: What can the government do to fight online misinformation? ABC News. abc.net.au/news/2023-06-27/misinformation-disinformation-australia-proposed-laws-crackdown/102524764
21. New ACMA powers to combat misinformation and disinformation. (2023, July 24). Department of Infrastructure, Transport, Regional Development, Communications and the Arts. infrastructure.gov.au/have-your-say/new-acma-powers-combat-misinformation-and-disinformation
22. O’Neil, M., Ackland, R., & Cunneen, R. (2023). Building resilience with information literacy and information health. University of Canberra News and Media Research Centre. doi.org/10.25916/d15n-g243
23. Wang, M. (2023, December 13). Health misinformation is rampant on social media – here’s what it does, why it spreads and what people can do about it. The Conversation. theconversation.com/health-misinformation-is-rampant-on-social-media-heres-what-it-does-why-it-spreads-and-what-people-can-do-about-it-217059