
BRUSSELS — The European Union’s effort to crack down on illegal and harmful digital content in the bloc is about to take off.

As soon as Friday, 19 very large online platforms and search engines, known as VLOPs, each visited by more than 45 million Europeans every month, will have to comply with the EU's online content rulebook adopted in 2022, the Digital Services Act (DSA).

Social media platforms in the crosshairs include Facebook, Instagram, TikTok, X (formerly Twitter), YouTube, Snapchat, LinkedIn and Pinterest. Online marketplaces Amazon, Booking.com, AliExpress, Zalando and Google Shopping, as well as Wikipedia and Google Maps, will also fall under the new regime. So will Google's and Apple's app stores, Google Search and Microsoft's Bing.

Companies will have to swiftly take down illegal content, such as child sexual abuse material, and assess and propose concrete measures to counter the major risks their platforms pose for society, including the spread of disinformation and cyberbullying. They will also have to be more transparent about how they function.

The European Commission will be able to issue fines of up to 6 percent of a company's annual global revenue. In exceptional cases of serious non-compliance, it could also temporarily ban a tech company from operating within the bloc. Brussels will be supported by national watchdogs in the countries where companies have their European headquarters, including Ireland.

The very large online platforms and search engines will also have to pay a fee of up to 0.05 percent of their global revenues to fund the Commission’s enforcement work.

Here are five key obligations these large websites and apps will face under the new rulebook:

1. Remove illegal content 

Companies like Facebook and TikTok will have to "expeditiously" remove illegal content, as defined by EU and national laws, when notified by national authorities or individuals. Online platforms will need clear and easy mechanisms for users to flag content they believe is illegal.

Platforms will have to suspend users who often post illegal content, but not before giving them a warning. 

Online marketplaces like Amazon and AliExpress will have to make their "best efforts" to vet their online traders in a bid to stamp out illegal products, from fake luxury shoes to dangerous toys. If they learn that consumers have bought an illegal product, they will have to warn them or, failing that, make the information public on their website.

2. Keep a lid on harmful content like disinformation and bullying  

In an unprecedented measure, online platforms and search engines will have to hand over to the Commission a detailed annual report of the so-called systemic risks they pose for Europeans. 

Companies from Snapchat to X will have to determine how the cogs in their systems, like the algorithms that recommend content and ads to people, potentially contribute to the spread of illegal content and disinformation campaigns. They will also have to assess whether their platforms open the door to cyber violence, undermine fundamental rights like freedom of expression, or adversely affect people's mental health.


They will then have to implement measures to limit the risks they’ve identified. These could include adjusting their algorithms; creating tools for parents to control what their children are seeing and to verify the age of users; or labeling content like photos or videos that were generated by artificial intelligence tools. 

Companies will be scrutinized by the Commission, vetted researchers and auditing firms. The auditors will go through the risk assessments and mitigation measures and either sign off on the companies' work or make further recommendations.

Social media platforms and search engines will also have to quickly assess and adapt their services to stem the spread of falsehoods in a crisis like a natural disaster, a war or a pandemic.

3. Give power to their users

Very large online platforms and search engines will need to have easily understandable terms and conditions — and apply them in a “diligent, objective and proportionate manner.”

Companies will have to inform users if they remove their content, limit its visibility or stop its monetization, and tell them why. Platforms including Elon Musk's X will also need to warn users and explain any suspension (as in the case of journalists temporarily banned from Twitter). Users will be able to challenge the platforms' decisions with the company, through out-of-court dispute bodies and, ultimately, in court.

Tech companies will have to explain the parameters behind their algorithms’ content recommendations and offer at least one algorithmic option that doesn’t recommend content based on people’s personal data.

4. End of some targeted ads 

Platforms will be banned from targeting people with online ads based on sensitive personal data, including their religion, sexual orientation, health information and political beliefs. They also won't be allowed to use children's and teenagers' personal data to show them targeted ads.

So-called dark patterns — manipulative designs nudging people into agreeing to something they don’t actually want, like consenting to be tracked online — will also be outlawed. 

5. Reveal closely guarded information about how they operate

Every six months, platforms will have to open up and provide long-guarded information, including details about their content-moderation teams, such as their size, expertise and the European languages they cover.

They must disclose their use of artificial intelligence to remove illegal content, including its error rate. They will also have to make public their risk assessment reports and the audit reports on how they have limited serious risks to society, including threats to freedom of speech, public health and elections. And they will need to maintain a repository with information about the ads that have run on their platforms.

Regulators will have access to the companies’ data and algorithms, and can inspect their offices and request sensitive business documents. Vetted researchers will also be empowered to access platforms’ internal data for specific projects.
