What Else Do the Leaked ‘Facebook Papers’ Show? Angry-face emoji reactions carried five times the weight of a “like” thumbs-up… and more

The documents leaked to U.S. regulators by a Facebook whistleblower “reveal that the social media giant has privately and meticulously tracked real-world harms exacerbated by its platforms,” reports the Washington Post.

Yet the Post also reports that, at the same time, Facebook “ignored warnings from its employees about the risks of their design decisions and exposed vulnerable communities around the world to a cocktail of dangerous content.”

The whistleblower also argued that, given Mark Zuckerberg’s “unique degree of control” over Facebook, he is ultimately personally responsible for what the Post describes as “a litany of societal harms caused by the company’s relentless pursuit of growth.” Zuckerberg testified last year before Congress that the company removes 94 percent of the hate speech it finds before a human reports it. But in internal documents, researchers estimated that the company was removing less than 5 percent of all hate speech on Facebook…

For all Facebook’s troubles in North America, its problems with hate speech and misinformation are dramatically worse in the developing world. Documents show that Facebook has meticulously studied its approach abroad, and is well aware that weaker moderation in non-English-speaking countries leaves the platform vulnerable to abuse by bad actors and authoritarian regimes. According to one 2020 summary, the vast majority of its efforts against misinformation — 84 percent — went toward the United States, the documents show, with just 16 percent going to the “Rest of World,” including India, France and Italy…

Facebook chooses maximum engagement over user safety. Zuckerberg has said the company does not design its products to persuade people to spend more time on them. But dozens of documents suggest the opposite. The company exhaustively studies potential policy changes for their effects on user engagement and other factors key to corporate profits.

Amid this push for user attention, Facebook abandoned or delayed initiatives to reduce misinformation and radicalization… Starting in 2017, Facebook’s algorithm gave emoji reactions like “angry” five times the weight of “likes,” boosting these posts in its users’ feeds. The theory was simple: Posts that prompted lots of reaction emoji tended to keep users more engaged, and keeping users engaged was the key to Facebook’s business. The company’s data scientists eventually confirmed that the “angry” reaction, along with “wow” and “haha,” occurred more frequently on “toxic” content and misinformation. Last year, when Facebook finally set the weight on the angry reaction to zero, users began to get less misinformation, less “disturbing” content and less “graphic violence,” company data scientists found.
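The weighting change described above can be sketched as a toy scoring function. This is purely illustrative: the reaction names, the flat 5x multiplier, and the function itself are assumptions drawn from the reported figures, not Facebook’s actual ranking code.

```python
# Toy sketch of reaction-weighted engagement scoring, assuming the reported
# 2017 weights (emoji reactions = 5x a "like") and the later change that
# set "angry" to zero. All names and numbers are illustrative, not real code.

REACTION_WEIGHTS_2017 = {
    "like": 1, "angry": 5, "wow": 5, "haha": 5, "love": 5, "sad": 5,
}
# Same weights, but with "angry" zeroed out, as reportedly done last year.
REACTION_WEIGHTS_LATER = {**REACTION_WEIGHTS_2017, "angry": 0}

def engagement_score(reactions: dict, weights: dict) -> int:
    """Sum each reaction count multiplied by its per-reaction weight."""
    return sum(weights.get(kind, 0) * count for kind, count in reactions.items())

post_reactions = {"like": 100, "angry": 40}
print(engagement_score(post_reactions, REACTION_WEIGHTS_2017))   # 100*1 + 40*5 = 300
print(engagement_score(post_reactions, REACTION_WEIGHTS_LATER))  # 100*1 + 40*0 = 100
```

Under this toy model, a post with many “angry” reactions triples its score versus likes alone, which is the boosting effect the documents reportedly describe.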
The Post also contacted a Facebook spokeswoman for comment. The spokeswoman denied that Zuckerberg “makes decisions that cause harm” and dismissed the findings as being “based on selected documents that are mischaracterized and devoid of any context…”

Responding to the spread of specific pieces of misinformation on Facebook, the spokeswoman went as far as to assert that at Facebook, “We have no commercial or moral incentive to do anything other than give the maximum number of people as much of a positive experience as possible.”

She added that the company is “constantly making difficult decisions.”

Source: What Else Do the Leaked ‘Facebook Papers’ Show? – Slashdot
