Despite internal research showing that Facebook’s platform was exploiting and exacerbating divisiveness among its users, top executives ignored findings that the algorithms were doing the exact opposite of the company’s stated public mission to bring people together.
That’s according to new reporting Tuesday from the Wall Street Journal, which, in a comprehensive dive into the platform’s capacity to divide users, found that executives knew in 2018 what the site was doing to users but declined to take action.
“The most persistent myth about Facebook is that it naively bumbles its way into trouble,” tweeted New York Times tech columnist Kevin Roose. “It has always known what it is, and what it’s doing to society.”
New from @JeffHorwitz & me: Facebook spent years studying its role in polarization, according to sources and internal documents. One internal slide laid out the issue like so. “Our algorithms exploit the human brain's attraction to divisiveness.” https://t.co/PZWLUt68rs
— Deepa Seetharaman (@dseetharaman) May 26, 2020
“Our algorithms exploit the human brain’s attraction to divisiveness,” a presentation to the company’s leaders in 2018 declared. The research further warned that if the problem was “left unchecked,” the platform’s system would feed users “more and more divisive content in an effort to gain user attention & increase time on the platform.”
Facebook's engineers confirmed it was polarizing users, and algorithms were nudging users into extremist content.
But hard-right groups were more likely to spread propaganda than hard-left ones.
Guess what happened next. https://t.co/HUfwf0jCs9 pic.twitter.com/FhRefRLGll
— Jordan Weissmann (@JHWeissmann) May 26, 2020
As the Daily Beast explained:
The social-media giant, which boasts that its mission is to “connect the world,” reportedly launched a research project in 2017, led by Facebook’s former Chief Product Officer Chris Cox, to study how its algorithms aggravate divisive and harmful content. The task force, named “Common Ground,” assigned employees to “Integrity Teams” throughout the company. The team reportedly found that while some groups united people from various backgrounds, others only accelerated conflict and misinformation.
Despite the evidence laid out in Cox’s team’s presentation, however, company leadership, particularly vice president of global public policy Joel Kaplan, rejected taking action to change the platform’s algorithms and incentives.
According to the Journal, Kaplan has outsized power at Facebook when it comes to determining the direction of the company:
A former deputy chief of staff to George W. Bush, Mr. Kaplan became more involved in content-ranking decisions after 2016 allegations Facebook had suppressed trending news stories from conservative outlets. An internal review didn’t substantiate the claims of bias, Facebook’s then-general counsel Colin Stretch told Congress, but the damage to Facebook’s reputation among conservatives had been done.
[…]
Disapproval from Mr. Kaplan’s team or Facebook’s communications department could scuttle a project, said people familiar with the effort. Negative policy-team reviews killed efforts to build a classification system for hyperpolarized content.
Journalists took to Twitter to note the timeline of the meeting and their requests for information on the platform’s incentives.
And the people who’ve been reporting on this for years are met with complete derision and sometimes outright lies by those at Facebook when we bring it up. https://t.co/2LrlmACMsZ
— Ben Collins (@oneunderscore__) May 26, 2020
“I visited Facebook HQ around this time to ask about the growing instances of mass violence linked to the platform,” tweeted New York Times reporter Max Fisher. “One executive after another looked me in the eye and said they had no reason to believe the platform itself drove bad behavior.”
The reporting, said tech accountability advocacy group Freedom From Google and Facebook, “proves what we’ve been saying all along: Facebook knows what it’s doing, intentionally continues to cause harm to increase engagement and profit, and will never fix these problems themselves.”
“Until Congress, state attorneys general, the FTC, or DOJ breaks up Facebook’s monopoly and further regulates them,” the group added, “nothing is going to change.”