UK Regulator Tells Tech Companies to Do More for Women Online


London, UK – Britain’s communications regulator, Ofcom, issued a stern warning to tech companies on Tuesday, demanding more decisive action against the pervasive online abuse targeting women and girls. The watchdog unveiled a detailed action plan aimed at holding websites and apps accountable for the harm inflicted on their female users.

The plan, crafted with vital input from victims, survivors, safety experts, women’s organizations, and groups advocating for men and boys, outlines practical steps firms must implement to counter “online misogynistic abuse, pile-ons, stalking, and intimate image abuse,” according to an Ofcom news release.

Key features of the plan include:

  • Thought-Provoking Prompts: Social media platforms will be asked to display “prompts” encouraging users to reconsider posting potentially harmful content, without outright blocking them.
  • “Timeouts” for Repeat Offenders: Platforms should make it more challenging for users to misuse features for victimizing others by subjecting repeat offenders to temporary account “timeouts.”
  • Diverse Content Recommendation: Encouraging platforms to recommend a wider array of content and perspectives to prevent the formation of “toxic echo chambers.”
  • Demonetization of Abusive Content: Removing monetization from posts or videos that promote misogynistic abuse and sexual violence.
  • Mass Abuse Prevention: Implementing rate limits on posts to curb “pile-ons” of abuse and enabling users to quickly block or mute multiple accounts.
  • Streamlined Reporting: Allowing users to file multiple reports, track their progress, and bundle safety and privacy features, including making location visibility an opt-in choice.
  • Combatting Image-Based Abuse: Utilizing “hash-matching” technology to detect and delete non-consensual intimate images, blurring nudity, and providing access to support resources, including how to report potential crimes.
  • Proactive Vulnerability Testing: Expecting platforms to test new services and features for potential misuse and provide specialized training on gender-based harms to moderation teams.

Melanie Dawes, Ofcom’s chief executive, emphasized the gravity of the situation, stating, “When I listen to women and girls who’ve experienced online abuse, their stories are deeply shocking. Survivors describe how a single image shared without their consent shattered their sense of self and safety.

“Journalists, politicians and athletes face relentless trolling while simply doing their jobs. No woman should have to think twice before expressing herself online, or worry about an abuser tracking her location.”

The initiative also garnered support from Chris Boardman, chairman of Sport England and an Olympic cycling gold medalist, who highlighted the concerning rise of abuse against women in sports, which he believes deters women and girls from physical activity. “Toxic online abuse has terrible offline impacts.

“As women’s sport grows, so does the abuse of its stars, and that affects women from every walk of life. The hard-won gains in women’s sport must not be destroyed by misogyny,” Boardman said.

The guidance, issued under the Online Safety Act of 2023, is not mandatory, but Ofcom views it as an additional layer of protection that goes beyond firms’ current legal obligations. The regulator plans to review progress in summer 2027 and will make formal recommendations to the government if further legal strengthening is deemed necessary.

However, online safety advocacy groups and victims expressed concerns about the voluntary nature and timeline of the plan. Rachel Huggins, chief executive of the children’s online safety NGO Internet Matters, warned that many tech firms might not implement the guidance unless it becomes compulsory, leading to continued high levels of online harm.

Former England football player Lianne Sanderson, who has experienced death threats online, advocated for mandatory user identity verification. “There needs to be online identification to have an account because someone like myself has had death threats,” Sanderson told Sky News.

“I’m a woman, a woman of color. You know I’m gay. There’s loads of things that people can’t handle. And that’s where it comes to me. It’s not even about my job.”

Ofcom is currently considering making hash-matching technology mandatory for detecting image-based sexual abuse on an urgent basis, with consultations on its implementation already underway.
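To illustrate how hash-matching works in principle, here is a minimal sketch. It does not reflect Ofcom’s requirements or any platform’s actual system: the hash database, file names, and function names are hypothetical, and real deployments typically rely on perceptual hashes (such as PhotoDNA or PDQ) that still match resized or re-encoded copies, whereas the plain SHA-256 digest used here for simplicity only catches byte-identical files.

```python
# Minimal sketch of hash-matching against a database of known abusive images.
# Hypothetical example only: production systems compare perceptual hashes
# within a similarity threshold rather than exact cryptographic digests.

import hashlib
from pathlib import Path

# Hypothetical digests of images already reported as non-consensual,
# normally supplied by an industry hash-sharing service such as StopNCII.
KNOWN_ABUSE_HASHES = {
    "3f79bb7b435b05321651daefd374cdc681dc06faa65e374e38337b88ca046dea",
}

def image_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of an image file's bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def should_block_upload(path: Path) -> bool:
    """Flag an upload whose digest matches a known abusive image."""
    return image_digest(path) in KNOWN_ABUSE_HASHES

if __name__ == "__main__":
    upload = Path("incoming_upload.jpg")  # hypothetical incoming file
    if upload.exists() and should_block_upload(upload):
        print("Upload blocked: matches a known non-consensual intimate image.")
    else:
        print("No match found; the upload continues to normal moderation.")
```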

This move by the UK aligns with a growing global effort to enhance online safety. Malaysia recently announced it may require platforms like Instagram, Snapchat, and TikTok to verify user ages to protect children. In Australia, Facebook and Instagram have begun deactivating accounts of users under 16, with a new legal ban taking effect on December 10th.

