‘Dangerous’ Facebook groups push people to ‘extreme interests’ – whistleblower

Facebook groups can be “dangerous” because they use algorithms that “take people who have mainstream interests and push them to extreme interests”, a whistleblower has told MPs.

Frances Haugen said groups on the site can become “echo chambers” that reinforce and radicalise people’s opinions.

She said, for example, that users with left-wing opinions can be pushed to the radical left, while those looking for healthy recipes can be pushed to anorexia content.

Image: MPs heard that Facebook groups can become ‘echo chambers’ reinforcing and radicalising people’s opinions. File pic

Ms Haugen, a former product manager in the company’s civic integrity unit, was giving evidence to MPs considering the UK’s draft online safety bill.

She said: “One of the things that happens with groups and networks of groups is people see echo chambers that create social norms…”, adding that people with opposing views are “torn apart”.

“When that context is around hate you see a normalisation of hate, a normalisation of dehumanising others, and that’s what leads to violent incidents,” she added.

In her evidence, she also told the committee that Facebook “unquestionably” makes online hate worse, sees safety as a “cost centre”, and leaves “critical teams” understaffed.

“If you make noise saying we need more help… you will not get rallied around for help because everyone is under water,” she said.

She said “engagement-based metrics” – which focus on how many people like, share or comment on a post – were a major problem on all social media sites.

She said they favoured polarised content and were “biased towards bad actors”.

“The events we’re seeing around the world, things like Myanmar and Ethiopia, those are the opening chapters because engagement-based ranking does two things: one, it prioritises and amplifies divisive and polarising extreme content, and two, it concentrates it,” said Ms Haugen.

The whistleblower previously appeared before the US Senate, and her testimony in the UK on Monday came as fresh stories appeared based on the thousands of documents she leaked.

She has already levelled a series of allegations, such as claiming Facebook’s platforms “harm children, stoke division and weaken our democracy”, and that it refused to act because executives put profits above safety.

Facebook boss Mark Zuckerberg has rejected her claims, saying the attacks were “misrepresenting” the company’s work, and that it “cares deeply about issues like safety, well-being and mental health”.

Lena Pietsch, the firm’s director of policy communications, has also said Ms Haugen worked for the company for less than two years, and that during her Senate appearance she said on more than six occasions that she had not worked on the subject matter in question.

The UK parliament’s draft online safety bill aims to establish a new way to regulate online content – covering not just abusive messages but all harmful material online, from bullying through to pornography.

MPs spoke to the whistleblower in the hope of making the upcoming law more effective and fit for purpose.

Ms Haugen told them that, for any social media platform with more than a couple of million users, the public should have the right to understand how it might be affecting society.

“Any tech company that has a large societal impact we need to be thinking about how do we get data out of that company,” she said.

Instagram is the most potentially harmful social media service for teenagers, the former Facebook employee added, because it is about “social comparison” and “bodies”.

For example, there have been persistent claims that it can make eating disorders worse for some people.

The photo-sharing site recently said it was pausing efforts to make a bespoke version for children, and Ms Haugen said she was “deeply worried” that it was not possible to make Instagram safe for young teens.

Among the documents she recently leaked was an internal study that suggested peer pressure generated by Instagram led to mental health and body image problems among young girls – and in some cases, eating disorders and suicidal thoughts.

Other stories from her leaks that emerged today include claims that a lack of moderators who speak local languages and understand cultural contexts has led to Arabic content being unnecessarily censored in places such as Gaza and Syria.

The leaks also claim that, for the same reason, political groups in India and Myanmar have been able to incite violence.

Documents suggest Facebook has understood these shortcomings for years but has done little about them.

A Facebook spokesperson told the AP news agency on Monday that in the last two years it had hired more staff with local dialect and topic expertise to solve such problems.

Writing in The Telegraph, Facebook’s vice president of content policy Monika Bickert said the social media giant had invested £13bn in safety and security since 2016, and that more than 40,000 people were working in these areas.

“Contrary to claims that have been made about Facebook recently, we’ve always had the commercial incentive to remove harmful content from our sites,” she wrote. “People don’t want to see it when they use our apps and advertisers don’t want their ads next to it.”

Ms Bickert said Facebook had “almost halved the amount of hate speech people see” on the platform over the last three quarters.

Ms Bickert also said it took part in more than 400 peer-reviewed studies in the past year to help “make our apps better for the people who use them”.

“Contrary to recent claims, our research doesn’t conclude that Instagram is inherently bad for teens,” she wrote. “While some teens told us Instagram made them feel worse when they were struggling with issues like loneliness, anxiety and sadness, more teens told us that Instagram made them feel better when experiencing these same issues.

“But if even one young person feels worse, that’s one too many, so we use our research to understand bad experiences and prevent them.”