Last week, social media giant Meta announced major changes to its content moderation practices. These include an end to its fact-checking program, starting with the United States.
Meta’s platforms – which include Facebook, Instagram and Threads – will no longer employ human fact-checkers and moderation teams, relying instead on a user-sourced “community notes” model. This approach is similar to the current content moderation method on X (formerly Twitter).
Meta’s hateful conduct policy also changed last week to allow more “free speech”. Advocacy groups and experts warn this could lead to an increase in abusive and demeaning statements about Indigenous people, migrants and refugees, women and LGBTQIA+ people.
Many experts now also fear an increase in disinformation and misinformation on Meta’s platforms.
Health content has been a focus of concern about online misinformation, particularly in relation to COVID. There has been less discussion of the potential impact of Meta’s new policies on sexual and reproductive health information online – but the impacts could be profound.
A ‘town square’ for health info
Since the COVID pandemic, online platforms have been increasingly important for sexual and reproductive health organisations.
On social media, organisations such as Family Planning Australia can easily and inexpensively share factual information about sensitive and potentially stigmatising public health issues, including unplanned pregnancy and HIV.
For good or ill, Meta’s platforms are spaces where public health information can reach diverse audiences. This can be especially helpful for people in rural and regional areas, young people, and anyone not already connected with reliable health services.
Facebook and Instagram serve as “town squares” for many health service providers. But what happens when the community members most in need of contraception or sexual health information and services no longer feel safe in this town square?
Meta has claimed community notes will be compiled from multiple sources to avoid bias. Some online sex educators were initially optimistic that the policy changes might make it easier to share sexual and reproductive health content.
Internal training materials leaked from Meta reportedly show that comments like “gays are freaks” or “transgender people are mentally ill” are now permissible. This would create significant risks for users and for the healthcare services sharing information online.
From too much censorship to targeted attacks
Meta’s own Oversight Board has acknowledged the company has over-censored content related to nudity, sexuality and gender in the past. This has resulted in sexual and reproductive health content being blocked or “shadow-banned” (when the content is hidden from other users without the poster’s knowledge).
The community notes process replaces human moderators with crowd-sourcing. Information is gathered from multiple users with a diverse range of political views. A note is then added to flag misinformation. Meta is currently recruiting users on Threads, Facebook and Instagram for the US community notes rollout.
But investigations of this system on X have shown notes are often added hours after false or misleading content has already gone viral.
What’s more, the process itself can be weaponised. Research into what has been termed “user-generated warfare” has found that politically motivated users are already manipulating community guidelines to attack content creators on Instagram and TikTok.
This includes targeted attacks on women’s health organisations and LGBTQIA+ organisations as part of an “anti-rights pushback”. Globally, government and non-government groups have led organised campaigns opposing both reproductive freedom and trans rights.
Malicious tactics include false reporting of images for violating community guidelines. They can also involve coordinated pile-ons of hate speech in the comments under a social media post.
Women, trans people and other LGBTQIA+ people are disproportionately affected by these forms of social media manipulation. Sexual and reproductive health content creators have responded by self-censoring health information or removing themselves from social platforms.
Where to from here?
Evidence suggests the move to community notes has already led many LGBTQIA+ and women’s health organisations to close their X accounts.
Health service users are leaving Meta platforms, too. Some health outreach organisations are encouraging community members to stay connected through private newsletters and mailing lists.
But not everyone is comfortable sharing their email address. Social media platforms offer privacy and anonymity for vulnerable people who may not have access to other sources of reliable sexual and reproductive health information.
There is no perfect solution. Social media users are exploring new platforms, such as Bluesky. This means sexual and reproductive health service providers will need to be open to experimenting with untested platforms, too.
This may be hard for organisations that have already invested time and money in health promotion on Meta platforms. But in a rapidly evolving global political environment, business as usual is not an option.