LONDON — Britain’s attempt to rein in the internet has turned into a political omnishambles.

The country’s masterplan — known as the Online Safety Bill — would require social media giants like Facebook and YouTube to quickly remove illegal content like revenge porn or hate speech, or face hefty penalties and even potential criminal prosecutions for tech executives who fail to act.

Yet the landmark rules, which are expected to become law by the autumn, have fallen victim to the political chaos that has seen the Conservative Party live out its identity crisis — in real time — over the last five years. The proposals have split consecutive governments along ideological lines, with politicians divided between those eager to slap new limits on Big Tech and others yearning to cut back on regulation in the post-Brexit era.

As Rishi Sunak, the country’s prime minister, counts down the clock ahead of next year’s expected general election, the Online Safety Bill has become a litmus test for what the United Kingdom stands for as it charts its own path after leaving the European Union. But there are serious doubts the legislation can deliver on the U.K.’s twin promises of creating a safer internet and promoting itself as a place to do business — all while upholding freedom of speech.

POLITICO talked to more than two dozen government ministers, backbench politicians, former and current officials, tech executives and civil society campaigners involved in shaping the proposals, many of whom spoke on the condition of anonymity to discuss internal deliberations. 

What emerged was a fight for the soul of Britain’s approach to digital rulemaking that has left few, if any, satisfied with how the legislation is shaping up.

Repeated changes in government policy — driven by four prime ministers and five digital ministers since the proposals were first published in 2019 — have created a Frankenstein bill aimed at pleasing everyone.

“It is one of those pieces of legislation that is created with good intentions in response to a demand that something is done,” said Hugh Bennett, a former No. 10 Downing Street adviser who worked on the bill under former Prime Minister Liz Truss. “But rapidly (it) becomes a highly overwrought bill that will not necessarily even achieve its aims.”

“You’re looking at 200 clauses, and it’s obviously going to be a hugely consequential piece of legislation that’s going to impact tens of thousands of businesses around the country, not just Big Tech firms,” he added.

Jeremy Wright, a former British digital minister who published the initial proposals, said the repeated delays in getting the rules over the line — because of domestic political turmoil — had led to the U.K. missing an opportunity to show “true global leadership” on digital rulemaking.

“You can’t be one of the second or third to do it,” he said. “That’s not leadership.”

But Michelle Donelan, the science and technology minister now in charge of the bill, said London was protecting children “once and for all,” and that the legislation had been framed to strip out illegal content from people’s social media feeds.

“Even other countries that have started in this space haven’t been as broad as we have,” she told POLITICO.

Self-regulation is over

Britain’s masterplan for tech regulation got its start under Theresa May.

With her inbox overflowing with Brexit matters after she took office in the wake of the referendum, she only turned her attention, in earnest, to policing the excesses of the internet in late 2018.

It marked a shift from London’s laissez-faire approach to tech — one that had made the country the golden child of Europe’s tech industry and relied on the platforms, not the government, to do the heavy lifting on content moderation.

Two events drove that U-turn — making social media frontline politics in a way that hadn’t happened before.

In 2017, 14-year-old Molly Russell died by suicide after viewing a stream of self-harm and suicide-related Instagram posts that had been served up to her via the company’s complex algorithms, according to the coroner’s subsequent report. A year later, the Cambridge Analytica scandal, in which an academic had illegally scooped up reams of people’s personal information on Facebook and weaponized it for political gain, reaffirmed Westminster’s desire to act tough on Big Tech.

Prince and Princess of Wales, William and Catherine, meet Ian Russell, the father of Molly Russell, who died by suicide at 14 years of age | Yui Mok – WPA Pool/Getty Images

“The Molly Russell case was a huge turning point,” said Lorna Woods, an academic at the University of Essex whose ideas about a so-called “duty of care,” or legal requirements for social media companies to keep users safe online, lie at the heart of the British legislation.

It was also personal. 

Many British politicians had suffered repeated online trolling with little to no support from social media giants to make it stop. With lawmakers getting abused online almost every day — particularly in partisan post-Brexit Britain — the need for action became paramount, according to four politicians and officials who spoke to POLITICO.

“I use social media myself,” said Margot James, a U.K. digital minister from January 2018 to July 2019. “It is very evident there is next-to-no comeback on people who abuse others online and orchestrate campaigns that encourage violence against people.”

At a meeting with campaigning groups like the NSPCC in 2018, James became convinced that the current rules, which allowed social media companies to police themselves, were not fit for purpose after hearing how repeated attempts at self-regulation had failed. The NSPCC has long highlighted harrowing accounts of how children are abused online or have suffered from mental health issues after doom-scrolling through graphic content.

“I was skeptical about this at the time, but was really persuaded of the need for change and powerful independent statutory regulation by this meeting,” James added. 

Buy-in across government

In the Home Office, Sajid Javid — then in charge of that ministry, who had previously pushed for age verification of online pornography as culture minister under David Cameron — was also on board.

On a routine visit to the child exploitation protection command at the country’s National Crime Agency, he saw, firsthand, just how widespread the grooming and real-time sexual abuse of kids had become, he told POLITICO.

“It was the first time I was shown the kinds of things they’re trying to deal with, and just how widespread it was, and how horrific it was,” he said.

What resulted was some of the toughest digital regulation anywhere in the Western world. 

At a launch event at the British Library attended by London’s tech glitterati and online safety campaigners, Wright and Javid set out plans for a new “duty of care,” underpinned by an independent regulator, to make companies take more responsibility for people’s online safety and tackle harm caused by illegal content or activity on their services.

That included lengthy risk assessments pinpointing where the companies believed issues would arise; binding codes of conduct overseen by a regulator; and hefty fines, including against individual tech executives, for failure to act.

“If we had moved ahead on the timetable that we had in mind when we produced the white paper, which would have meant we passed legislation several years ago frankly, I think genuinely the U.K. would have been a global leader in this,” Wright told POLITICO. 

“We would have been ahead of the European Union, and almost any other comparable jurisdiction, in having a comprehensive regime of online regulation,” he added. 

Lost momentum

That hope and ambition lost its momentum in the early days of the Boris Johnson administration.

Johnson was torn between his free speech instincts and pressure from fellow lawmakers, according to an official who worked with him, who spoke on the condition of anonymity to discuss internal deliberations. 

Johnson would “yo-yo” from being a supporter of tough rules to questioning their impact on free speech after reading newspaper articles either for or against the proposals, the official added.

In the ongoing tensions over the U.K.’s departure from the EU, digital rule-making was quickly demoted below more critical issues like Northern Ireland and a trade agreement, according to three other officials involved in those discussions.

Tory MP Oliver Dowden was former British PM Boris Johnson’s first pick as digital minister in early 2020 | Kirsty O’Connor – WPA Pool/Getty Images

When the COVID-19 pandemic hit the country in early 2020, new tech legislation also joined a second tier of priorities, based on discussions with two of those policymakers. Still, the team of officials working on the bill was “completely protected” during the pandemic, in contrast to other parts of the department, which were scrambled to deal with the crisis-hit arts sector.

Oliver Dowden, Johnson’s first pick as digital minister in early 2020, pushed on with the legislation, publishing a draft bill and setting up a cross-party committee of lawmakers to scrutinize the legislation.

But two officials working in the department, who also spoke on the condition of anonymity, said they did not detect the same drive as some of his predecessors, with another current government official saying they thought he was “almost scared of it.”  

Over his 18-month tenure as digital minister, for instance, Dowden held six meetings with outside groups like Facebook, the NSPCC and antisemitism campaigners, specifically about the online harms proposals, according to U.K. government transparency filings. That’s half the number of meetings he had over the same time about soccer-related matters like the Euros and aborted European Super League, based on the same documents.

Four tech lobbyists, who held repeated meetings with lower-level U.K. officials over that time period, said many of those discussions focused more narrowly on COVID-19-related matters like vaccine misinformation and the need to promote verified pandemic information on search and people’s social media feeds.

“It was part of the discussion, but it wasn’t the focus,” said one of those individuals, who spoke on the condition of anonymity because he was not authorized to speak publicly, in reference to the Online Safety Bill. “There were bigger things going on then than the online harms.” 

Dowden’s defenders chafe at the criticism.

He met the parents of Molly Russell and was concerned that, because children were spending increasing amounts of time online, the legislation was only gaining in importance, according to an official who worked with the minister.  

“[The bill] advanced more during his time in terms of becoming something of substance compared to any other secretary of state,” they added.

The Dorries effect

The political seesaw of Britain’s tech rulemaking then got an unlikely supporter: Nadine Dorries.

The pugnacious former nurse — who had authored a series of romance novels and appeared on a British reality TV show — wanted to expand the legislation when she became the country’s digital minister in September 2021.

Her push for tougher rules, including controversial proposals that would force the likes of YouTube and Twitter to take action against perceived harmful content that had not broken any British laws, was born of her time as the U.K.’s health minister, when she had seen, firsthand, the harmful effects on young people of what was happening online.

“I was quite determined when I got there to make it happen because it had to,” she told POLITICO. The bill, she added, “wasn’t in a brilliant place.”

There had been “no real political will previously to make it happen,” she added.

Dorries leant heavily on her personal relationship with Johnson, convincing the deregulator-in-chief to go further than ever before. That lobbying expanded London’s efforts to rein in the internet to include provisions forcing Big Tech companies to stamp out online content that praised self-harm, harassment and eating disorders.

The expansion of the rules was met with muted applause within the Office of Communications, or Ofcom, the broadcasting and telecommunications regulator that had been tasked with overseeing the country’s new digital regime.

The agency had gone on a hiring spree in preparation for its expanded role, poaching the likes of Tony Stower from the NSPCC and the ex-Google executive Gill Whitehead. Yet despite the new staff — and promises of hundreds more officials in the coming years — Ofcom was still struggling to wield its existing powers related to combating illegal content online, according to four officials who spoke to POLITICO on the condition of anonymity. 

Two of those individuals said the prospect of handling even more powers — especially those related to policing so-called “harmful, but legal” content — had many questioning how Ofcom could pull that off without angering politicians eager for results.

Still, the brutal murder of David Amess in October 2021 re-energized the beefed-up proposals after Ali Harbi Ali, a British national who killed the Conservative politician, had been radicalized, in part, by watching Islamic State videos online, according to British prosecutors.

During emotive speeches in the House of Commons following the fatal stabbing, politicians from all sides vented anger over Amess’ death and how he had voiced growing concern about direct threats against lawmakers and the increased level of toxicity within public discourse.

“He was appalled by what he called the vile misogynistic abuse that female MPs had to endure online, and he told me recently that he wanted something done about it,” Mark Francois, a Conservative backbench lawmaker, said during tributes to Amess in the House of Commons.

Johnson, ever the astute political operator, seized the moment. In the wake of Amess’ murder, he told politicians in October 2021 that the digital rules would be published in the House of Commons before the end of the year — even though ministers and department officials had not yet signed off on the draft. The proposals were eventually sent to lawmakers in March 2022.

That coincided with recommendations from a joint committee of politicians from both the House of Commons and House of Lords that suggested boosting protections for children online and forcing social media companies to be responsible for how their platforms affect society. Several of those additions found their way into the legislation.

“I have very little sympathy for the platforms,” Damian Collins, who chaired that parliamentary committee, told POLITICO. People’s “experiences are shaped by AI-driven recommendations that have been created for business reasons that exist to drive engagement. That engagement is driving people toward content that meets the criminal threshold.”

Sunak era

Johnson’s victory lap on digital rulemaking was cut short after he was forced to resign last summer over rolling ethics scandals.

Again, simmering divisions within the Conservative Party about the U.K.’s approach to policymaking quickly bubbled to the surface when Kemi Badenoch, a rising star who pitched herself in the leadership contest to the right of the party, described the Online Safety Bill as “legislating for hurt feelings.”

Truss and Sunak — the frontrunners to become the next prime minister, who had said little to nothing about the proposals — quickly pledged to revisit the plans, partly to see off Badenoch’s leadership challenge.

The short-lived Truss administration had little time to delve into the bill. But Donelan, the current science and technology minister who was appointed by Truss and then held onto her job when Sunak took over, backed changes that scrapped the Dorries-era diktat forcing the largest companies to be responsible for policing legal, but harmful, material.

In its place, the government added a requirement that users be given the option to turn off content like hate speech or self-harm posts, so they can avoid viewing such material.

Those provisions, according to three of the tech executives who spoke on the condition of anonymity, would certainly be vote-winners with the public. But, in practice, they would be difficult to implement given the complexities of how social media platforms operate, especially when Brits communicate with people outside the country.

Sunak, whose tech industry background favors policymaking focused on boosting innovation, not holding companies’ feet to the fire, also gave way to prominent Conservative backbenchers like Bill Cash and Miriam Cates, who added stronger provisions that would hold company executives criminally liable for any wrongdoing.

Other Conservative pet projects, including requirements for the biggest platforms to remove content promoting illegal immigration to the U.K. in makeshift boats from France, were also snuck into the legislation.

“There’s a huge underestimation of how easy this is to fix,” said Richard Allan, a Liberal Democrat in the House of Lords and a former senior lobbyist for Facebook. 

End game

Even as the legislation is expected to be passed by the autumn, British politicians still can’t help but fiddle with the proposals. 

In the House of Lords, which is expected to vote on the bill sometime next month, lawmakers from all sides are jockeying to add further provisions to toughen up age verification for porn sites and insert greater protections for kids, according to five lawmakers and campaigners involved in those amendments.

Beeban Kidron, who directed one of the films in the Bridget Jones franchise and previously successfully lobbied the U.K. government to boost online privacy protections for children, is leading the push for beefed-up safety measures for kids. That stems from her work, as a crossbench Baroness, supporting Molly Russell’s family in its efforts to get answers from Meta about the type of self-harm posts that were shown to the 14-year-old before she died.

“My inbox is filled with parents who are bereaved and can’t get more information from tech companies about what was going on. It’s pure torture,” she said.

Still, Kidron — a longtime supporter of tough rules for tech companies, particularly for kids — is realistic about how much can be done to digital proposals that have gone through the wringer of consecutive British governments that have all tweaked the rules for their own political aims.

“I don’t think the length [of time] it has taken, or the people it has gone through, has helped the bill,” she added.
