The threat of disinformation on social media in the lead-up to New Zealand’s 2023 election loomed large for the Electoral Commission and academics studying fake news.
So how bad did it really get?
As part of the New Zealand Social Media Study, we analysed more than 4,000 posts on Facebook from political parties and their leaders. Our study focused on the five weeks ahead of election day.
What we found should give New Zealanders some comfort about the political discourse on social media. While not perfect, there was less misinformation (misleading information created without the intent to manipulate) and disinformation (deliberate attempts to manipulate with false information) than many feared.
Looking for fake news and half-truths
To identify examples of both, a team of research assistants analysed and fact-checked the posts, classifying each as either “not including fake news” or “including fake news”.
Fake news posts were defined as completely or mostly made up, and intentionally and verifiably false.
An example of this type of disinformation would be the “litter box hoax”, alleging schools provided litter boxes for students who identified as cats or furries.
Originating from overseas sources, this story has been debunked multiple times. In New Zealand, this hoax was spread by Sue Grey, leader of the NZ Outdoors & Freedoms Party.
In cases of doubt, or when the research assistants couldn’t prove the information was false, they coded the posts as “not including fake news”. The term “fake news” was therefore reserved for very clear cases of false information.
If a post did not include fake news, the team checked for potential half-truths. Half-truths were defined as posts that were not entirely made up, but contained some incorrect information.
The National Party, for example, put up a post suggesting the Ministry for Pacific Peoples had hosted breakfasts to promote Labour MPs, at a cost of more than $50,000. While the ministry did host breakfasts to explain the most recent budget, and the cost was accurate, there was no indication the purpose of these events was to promote Labour MPs.
How 2023’s election compared to 2020
At the beginning of the campaign, the proportion of what we identified as fake news being published on Facebook by political parties and their leaders was 2.5% – similar to what we saw in 2020.
The proportion of fake news posts then dropped below 2% for a long period and even fell as low as 0.7% at one point in the campaign, before rising again in the final stretch. The share of fake news peaked at 3.8% at the start of the last week of the campaign.
Over the five weeks of the campaign, an average of 2.6% of Facebook posts by political parties and their leaders in any given week qualified as fake news. In 2020, the weekly average was 2.5%, meaning the increase in fake news was minimal.
Much of the outright fake news came from fringe parties. According to our research, none of the major political parties posted outright lies.
But there were posts from all political parties assessed as half-truths.
Half-truths stayed well below 10% during the five weeks we looked at, peaking at 6.5% in the final week. On average, the weekly share of half-truths was 4.8% in 2023, while in 2020 it was 2.5%.
So while the share of “big lies” – also known as “fake news” – did not increase in 2023 compared with 2020, the share of “small lies” in political campaigns grew.
All of the political parties took more liberties with the truth in 2023 than they did in 2020.
Playing on emotions and oversimplifying
More than a third of all misleading posts in 2023 (37%) played on voters’ emotions through words or pictures. Some 26% jumped to conclusions, while 23% oversimplified the topics being discussed. And 21% cherry-picked information, meaning the information presented was incomplete.
Some of the posts we identified as fake news or half-truths (18%) cited pseudo-experts: people with some academic background but no genuine expertise on the topic under discussion.
We also saw posts that relied on anecdotes of unclear origin rather than scientific facts (15%), while 7% set unrealistic expectations of science, such as expecting it to offer 100% certainty.
A further 5% claimed their authors had a silent majority behind them, and another 5% of the posts identified as disinformation resorted to personal attacks rather than engaging with someone’s arguments.
Staying vigilant
The levels of misinformation and disinformation on social media during the past two elections in New Zealand have been fairly low – and certainly no cause for panic. But that doesn’t mean it will always stay that way.
On the one hand, we need to keep an eye on the social media campaigns in future elections and, in particular, monitor the development and use of misinformation and disinformation by political parties on the fringe.
We also need to keep an eye on the major parties, as small lies might pave the way for more fake news or conspiracy theories in the future.
On the other hand, we need to resist overstating the use of misinformation and disinformation in New Zealand. Currently, there doesn’t appear to be an appetite among our major political parties or leaders to spread disinformation on social media.
This is a good thing for the health of our democracy, and we need to ensure it stays that way.