Social media firms must make it easier to report misogyny, says Ofcom’s chief, after the watchdog’s study found that 60 per cent of women had suffered harmful behaviour online, including harassment and trolling, in just one month.
In an interview with The Telegraph, Dame Melanie Dawes, the watchdog’s chief executive, said the “shocking” abuse that women faced online was getting worse while at the same time they had lost confidence in the ability of social media firms to remove it when they complained.
“People don’t feel that when they report something that there’ll be any action,” said Dame Melanie, who pledged that Ofcom would be “straight into the companies and asking for information” once it was formally empowered as the regulator by the Government’s new online safety laws.
She said a priority on taking over would be to ensure there were effective ways for people to report abuse. It is understood this will include enabling “bystanders” to report misogyny, harassment and trolling rather than just leaving it up to the victim to get it taken down.
The Ofcom research, based on a survey of 6,000 people, found that although men were slightly more likely to have experienced harmful behaviour online in the past four weeks (64 per cent), women were more likely to be distressed by it: 43 per cent, versus 33 per cent of men.
Women were also more likely than men to find hateful, offensive or discriminatory online content concerning (85 per cent versus 70 per cent), as well as trolling (60 per cent versus 25 per cent).
Dame Melanie said social media firms must prevent and crack down on “illegal” content such as revenge porn, harassment and stalking. “We will be going straight in there and asking for information on what they’re doing about what is already illegal,” she said.
“We would then say: ‘talk to the women on your services, understand who’s on your services and who actually is experiencing a problem. Find out what they think about the tools to report [abuse] and show them that you’re acting when something is going wrong because at the moment, there isn’t the confidence there’.”
Another key target will be the social media firms’ algorithms, which she blamed for being behind the worst online harms because they were designed to boost revenues, profits and advertising rather than to protect users.
“Some of the worst harms are caused when things get shared with hundreds of thousands of other people. That’s when trolling and pile-ons really occur,” she said.
“Algorithms are too often designed around the business and not around the user. They are often built around engagement. That’s the business model. That’s what drives advertising revenue for them.
“They are built to amplify engagement, but we know that that also means they often amplify harm. So that’s the third thing that we think the platforms need to be looking into.”
She said she wanted the social media companies to ensure their products were safe in advance. “Too often today we find that new products, whether it’s the metaverse or new services, are trialled on the public, often on quite young people as well as sometimes on children,” she said.
“And then it’s much harder to retrofit the safety features later. So we want to see this thinking much earlier on in the decision making.”
The Online Safety Bill, which will place new regulations and requirements on tech firms and social media platforms to protect their users, is currently making its way through Parliament.
As the official regulator, Ofcom will have powers to fine companies up to 10 per cent of their global turnover, block services that fail to comply with the law and bring criminal proceedings against executives who fail to comply with its investigations or requests for information.