LONDON – Facebook whistleblower Frances Haugen told British lawmakers on Monday that the social media giant is fueling online hate and extremism, failing to protect children from harmful content and lacking any incentive to fix the problems, giving impetus to efforts by European governments to work toward stricter regulation of technology companies.
While her statement reflected much of what she told the U.S. Senate this month, her in-person appearance drew a great deal of interest from a UK parliamentary committee, which is much further along in drafting legislation to curb the power of social media companies.
It came the same day that Facebook published its latest earnings and that The Associated Press and other news organizations began publishing stories based on thousands of pages of internal company documents she obtained.
Haugen told the UK parliamentary committee that Facebook groups fuel online hate, saying algorithms that prioritize engagement take people with mainstream interests and push them to the extremes. The former Facebook data scientist said the company could add moderators to prevent groups over a certain size from being used to spread extremist views.
“No doubt it makes the hate worse,” she said.
Haugen said she was "shocked to hear recently that Facebook wants to double down on the metaverse and that they will hire 10,000 engineers in Europe to work on the metaverse," referring to the company's plans for an immersive online world, which it believes will be the next big internet trend.
“I thought, ‘Wow, you know what we could have done with security if we had 10,000 more engineers?'” she said.
Facebook says it wants regulation for tech companies and is happy the UK is leading the way.
“While we have rules against harmful content and regularly publish transparency reports, we agree that we need industry-wide regulation so companies like ours don’t make these decisions alone,” Facebook said Monday.
The company said it has invested $13bn (£9.4bn) in safety since 2016 and claimed that it has "almost halved" the amount of hate speech on its platform over the past three quarters.
Haugen accused Facebook-owned Instagram of failing to prevent children under 13 — the minimum age for users — from opening accounts, saying it doesn’t do enough to protect children from content that makes them feel bad about their bodies, for example.
“Facebook’s own research describes it as an addict’s narrative. Kids say, ‘This makes me unhappy, I feel like I don’t have the ability to control my use of it and I feel like if I leave I’ll be ostracized,'” she said.
The company last month delayed plans for a children’s version of Instagram, aimed at those under the age of 13, to address concerns about the vulnerability of younger users. Haugen said she’s worried it might not be possible to make Instagram safe for a 14-year-old and that “I sincerely doubt it’s possible to make it safe for a 10-year-old.”
She also said that Facebook's moderation systems are less effective at catching harmful content in languages other than English, and that this is a problem even in the UK because it is a diverse country.
"These people also live in the UK and are being fed misinformation that is dangerous and is radicalizing people," Haugen said. "Therefore, language-based coverage is not just a matter for the individual, it's a matter of national security."
When asked if she thinks Facebook is fundamentally evil, Haugen hesitated, saying, “I can’t see into the hearts of men.” Facebook wasn’t evil, it was negligent, she suggested.
"It believes in a world of flatness and will not accept the consequences of its actions," she said, pointing to the company's huge, single-story, open-plan corporate office as the embodiment of that philosophy.
She argued that Facebook's culture discourages lower-level employees from raising concerns with senior executives. For many of those executives, including CEO Mark Zuckerberg, it is the only place they have ever worked, which compounds the cultural problem, she said.
It was Haugen's second appearance before lawmakers after she testified in the United States about the dangers she believes the company poses, from harming children to inciting political violence to amplifying misinformation. Haugen cited internal research documents she secretly copied before leaving her job in Facebook's civic integrity unit.
Haugen provided the documents to the U.S. Securities and Exchange Commission, alleging that Facebook put profits ahead of safety and hid its own research from investors and the public. Some stories based on the files have already been published, revealing the internal turmoil after Facebook was caught off guard by the January 6 riot at the U.S. Capitol and how it vacillated over curbing divisive content in India, with more to come.
Representatives from Facebook and other social media companies plan to speak to the UK committee on Thursday.
UK lawmakers are drafting an online safety bill that calls for establishing a regulator that would hold companies accountable for removing harmful or illegal content from their platforms, such as terrorist material or images of child abuse.
“This is a moment similar to Cambridge Analytica but potentially bigger because I think it offers a real window into the soul of these companies,” Damian Collins, the lawmaker leading the committee, said before the hearing.
He was referring to the 2018 debacle involving data mining company Cambridge Analytica, which collected details on up to 87 million Facebook users without their permission.
Haugen is due to meet with European Union officials in Brussels next month, where the bloc’s executive commission is updating its digital rulebook to better protect internet users by making online businesses more accountable for illegal or dangerous content.
Under the UK rules, which are expected to come into force next year, the Silicon Valley giants face maximum fines of up to 10% of their global revenue for violations. The EU is proposing a similar penalty.