Facebook is “unquestionably” making online hate worse and the world is witnessing just the “opening chapters” of the harm it has caused, a whistleblower has said.
Frances Haugen, a former employee of Mark Zuckerberg’s social media empire, made a series of explosive revelations in evidence to MPs on Monday.
The data engineer, who left the tech giant in May and handed thousands of documents to the US government, told how Facebook groups used “dangerous” algorithms that could “take people who have mainstream interests and push them to extreme interests”.
The ex-product manager said the Silicon Valley firm also sidelined voices calling for change over fears for shareholders’ profits, saying “anger and hate is the easiest way to grow on Facebook” and it is “easier to provoke people to anger than empathy or compassion”.
It comes amid widespread fears that social media firms are failing to stop misinformation, extreme hate and harmful content, such as material encouraging self-harm and bullying.
Asked if the world would see more violent events, such as pro-Trump protesters storming the US Capitol in January, Ms Haugen said: “I have no doubt that the events we are seeing around the world, like in Myanmar and Ethiopia, those are the opening chapters because engagement-based ranking does two things.
“One, it prioritises and amplifies divisive, polarising and extreme content, and, two, it concentrates it.”
She said that hateful content may be “hyper-concentrated” to small sections of the site’s users but it was nonetheless “dangerous”.
She also disclosed she had become “deeply concerned” that Facebook’s safety mechanisms, which were designed for US English, were being used in the UK, and said the site should be spending as much as double its £14bn safety budget.
"UK English is sufficiently different that I would be unsurprised if the safety systems that they developed primarily for American English were actually under-enforcing in the UK,” she said.
Turning to misinformation and anti-vaxxer content, Ms Haugen said the platform was "hurting the most vulnerable among us" and leading people down "rabbit holes".
"Facebook has studied who has been most exposed to misinformation and it is … people who are socially isolated," she said.
"I am deeply concerned that they have made a product that can lead people away from their real communities and isolate them in these rabbit holes and these filter bubbles.
"What you find is that when people are sent targeted misinformation to a community it can make it hard to reintegrate into wider society because now you don't have shared facts."
Ms Haugen went on to tell MPs and peers that Facebook's own research suggested Instagram , which the tech giant also runs, is dangerous for young people.
She said it allows bullying to follow children home from school and carry on in their bedrooms at night via the app.
She said the firm also has the ability to make a "huge dent" in the problem if it wanted to, but does not because "young users are the future of the platform and the earlier they get them the more likely they'll get them hooked".
"When I was in high school, it didn't matter if your experience in high school was horrible, most kids had good homes to go home to and they could at the end of the day disconnect, they would get a break for 16 hours," she explained.
"Facebook's own research says now the bullying follows children home, it goes into their bedrooms.
"The last thing they see at night is someone being cruel to them.
"The first thing they see in the morning is a hateful statement and that is just so much worse."
The former executive refused to label the actions of the company as "evil" or "malevolent", however, saying instead there was a "pattern of inadequacy".
"I cannot see into the hearts of men but I do believe there is a pattern of inadequacy that Facebook is unwilling to acknowledge its own power," she said.
"It believes in a world of flatness which hides the difference, like (that) children are not adults.
"They believe in flatness, won't accept the consequences of their actions and, so, I think that is negligence and ignorance, but I can't see into their hearts, so I don't want to consider it malevolent."
Home Secretary Priti Patel said she had a “constructive” meeting with Ms Haugen on Monday, adding that "tech companies have a moral duty to keep their users safe".
Imran Ahmed, chief executive of the Centre for Countering Digital Hate, meanwhile, accused Facebook of "denying, delaying and deflecting blame".
He said: "Big Tech companies deny responsibility. They deflect blame. They delay taking action. And now it's clear it was part of a strategy to deceive us all.
"Thanks to brave whistleblowers like Frances Haugen, we know that Facebook executives ignore repeated warnings and pleas from employees to control the deadly misinformation and hate speech that flows unabated on their platforms.”
A Facebook spokesman said: "Contrary to what was discussed at the hearing, we've always had the commercial incentive to remove harmful content from our sites.
“People don't want to see it when they use our apps and advertisers don't want their ads next to it.
“That's why we've invested $13 billion and hired 40,000 people to do one job: keep people safe on our apps. As a result we've almost halved the amount of hate speech people see on Facebook over the last three quarters – down to just 0.05 per cent of content views.
“While we have rules against harmful content and publish regular transparency reports, we agree we need regulation for the whole industry so that businesses like ours aren't making these decisions on our own.
“The UK is one of the countries leading the way and we're pleased the Online Safety Bill is moving forward."
Published on www.mirror.co.uk, October 25, 2021.