
Imagine you share an Instagram post about an upcoming protest, but none of your hundreds of followers like it. Are none of your friends interested in it? Or have you been shadow banned?

Social media can be useful for political activists hoping to share information, calls to action and messages of solidarity. But throughout Israel’s war on Gaza, social media users have suspected they are being censored through “shadow banning” for sharing content about Palestine.

Shadow banning describes loss of visibility, low engagement and poor account growth on platforms like Instagram, TikTok and X (formerly Twitter). Users who believe they are shadow banned suspect the platforms are demoting their content and profiles, or leaving them out of the main discovery feeds. People are not notified of shadow banning: all they see is the poor engagement they are getting.


This article is part of Quarter Life, a series about issues affecting those of us in our twenties and thirties, from the challenges of beginning a career and taking care of our mental health to the excitement of starting a family, adopting a pet or just making friends as an adult. The articles in this series explore the questions and offer answers as we navigate this turbulent period of life.



Human Rights Watch, an international human rights advocacy non-governmental organisation, has recently documented what it calls “systemic censorship” of Palestine content on Facebook and Instagram. After several accusations of shadow banning, Meta (Facebook and Instagram’s parent company) argued the issue was due to a “bug” and “had nothing to do with the subject matter of the content”.

I have been observing shadow bans both as a researcher and social media user since 2019. In addition to my work as an academic, I am a pole dancer and pole dance instructor. Instagram directly apologised to me and other pole dancers in 2019, saying they blocked a number of the hashtags we use “in error”. Based on my own experience, I conducted and published one of the very first academic studies on this practice.

Why platforms shadow ban

Content moderation is usually automated – carried out by algorithms and artificial intelligence. These systems may also, inadvertently or by design, pick up “borderline” controversial content when moderating at scale.

Most platforms are based in the US and govern content around the world according to US law and values. Shadow banning is a case in point, typically targeting sex work, nudity and sexual expression prohibited by platforms’ community guidelines.

Moderation of nudity and sexuality has become more stringent since 2018, after the introduction of two US laws, the Fight Online Sex Trafficking Act (Fosta) and Stop Enabling Sex Trafficking Act (Sesta), that aimed to crack down on online sex trafficking.

The laws followed campaigns by anti-pornography coalitions and made online platforms legally liable for enabling sex trafficking (a crime) and sex work (a job). Fearing legal action, platforms began over-censoring any content featuring nudity and sexuality around the world, including of legal sex work, to avoid breaching Fosta-Sesta.

Although censorship of nudity and sex work is heralded as a means to protect children and victims of non-consensual image sharing, it can have serious consequences for the livelihoods and wellbeing of sex workers and adult content creators, as well as for freedom of expression.

Platforms’ responses to these laws should have been a warning about what was to come for political speech.

Social media users reported that conversations and information about Black Lives Matter protests were shadow banned in 2020. Now journalistic, activist and fact-checking content about Palestine also appears to be affected by this censorship technique.

Platforms are unlikely to admit to a shadow ban or bias in their content moderation. But their stringent moderation of terrorism and violent content may mean that posts about Palestine that are neither incitement to violence nor terror-related get caught in censorship’s net.

How I proved I was shadow banned

For most social media users, shadow banning is difficult to prove. But as a researcher and a former social media manager, I was able to show it was happening to me.

As my passion for pole dancing (and posts about it) grew, I kept a record of my reach and follower numbers over several years. While my skills were improving and my follower count was growing, I noticed my posts were receiving fewer views. This decline came shortly after Fosta-Sesta was approved.
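The telltale signal here is reach relative to audience size: if follower numbers keep rising while views per post fall, declining content quality is an unlikely explanation. As a rough illustration of how such a record could be analysed (a sketch only, not the method used in the study), the short Python snippet below assumes a hypothetical post_log.csv file with date, views and followers columns, and flags a sustained fall in views per follower.

# Illustrative sketch only: flag a sustained drop in views per follower.
# Assumes a hypothetical file "post_log.csv" with columns: date, views, followers.
import csv
from statistics import mean

with open("post_log.csv", newline="") as f:
    rows = sorted(csv.DictReader(f), key=lambda r: r["date"])  # ISO dates sort as strings

ratios = [int(r["views"]) / int(r["followers"]) for r in rows]  # views per follower, per post

early = mean(ratios[:10])    # average over the first ten posts
recent = mean(ratios[-10:])  # average over the most recent ten posts

print(f"Early views per follower:  {early:.3f}")
print(f"Recent views per follower: {recent:.3f}")
if recent < 0.5 * early:
    print("Reach per follower has more than halved despite follower growth.")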

It wasn’t just me. Other pole dancers noticed that content from our favourite dancers was no longer appearing in our Instagram discovery feeds. Shadow banning also appeared to apply to swathes of pole-dancing-related hashtags.

I was also able to show that when content surrounding one hashtag is censored, algorithms restrict similar content and words. This is one reason why some creators use “algospeak”, editing content to trick the algorithm into not picking up words it would normally censor (for example, by deliberately misspelling restricted terms), as seen in anti-vaccine content throughout the pandemic.

Shadow banning: what are you (not) seeing on your feed? AYO production/Shutterstock

Check if you are being shadow banned

TikTok and X do not notify users that their account is shadow banned but, as of 2022, Instagram does. By checking your “account status” in the app’s settings, you can see if your content has been marked as “non-recommendable” for potential violations of Instagram’s content rules. A shadow ban is also noticeable when other users have to type your full profile name for you to appear in search: in short, you are harder to find. In August 2023, X owner Elon Musk said the company was working on a way for users to see whether they had been affected by shadow bans, but no such function has been introduced. (The Conversation has contacted X for comment.)

The ability to see and appeal a shadow ban is a positive change, but mainly a cosmetic tweak to a freedom of expression problem that mostly targets marginalised groups. While Instagram may now be disclosing its decisions, the effect is the same: users posting about nudity, LGBTQ+ expression, protests and Palestine are often the ones to report being shadow banned.

Social media platforms are not just for fun: they are a source of work and political organising, and a way to spread important information to a large audience. When these companies censor content, it can affect the mental health and livelihoods of the people who use their platforms.

These latest instances of shadow banning show that platforms can pick a side in active crises, and may affect public opinion by hiding or showing certain content. This power over what is visible and what is not should concern us all.


