Four takeaways from Facebook whistleblower's complaints



Haugen, the 37-year-old former Facebook (FB) product manager who worked on civic integrity issues at the company, revealed her identity during a “60 Minutes” segment that aired Sunday night. She has reportedly filed at least eight whistleblower complaints with the Securities and Exchange Commission alleging that the company is hiding research about its shortcomings from investors and the public. She also shared the documents with regulators and the Wall Street Journal, which published a multi-part investigation showing that Facebook was aware of problems with its apps.
“60 Minutes” published eight of Haugen’s complaints on Monday. Here are four takeaways from the complaints:
Facebook knows its core product mechanics contribute to the spread of hate speech and misinformation

Internal documents cited in the complaints show Facebook knows both that hate speech and misinformation on its platforms are having a societal impact and that its “core product mechanics, such as virality recommendations and optimizing for engagement, are a significant part of why these types of speech flourish.”
In one study of the misinformation and polarization risks encountered through recommendations, it took just a few days for Facebook’s algorithm to recommend conspiracy pages to an account following official, verified pages for conservative figures such as Fox News and Donald Trump. It took less than a week for the same account to get a QAnon recommendation. And according to documents titled “They used to post selfies now they’re trying to reverse the election” and “Does Facebook reward outrage” cited in the complaints, not only do Facebook’s algorithms reward posts on topics like election fraud conspiracies with likes and shares, but also “‘the more negative comments a piece of content instigates, the higher likelihood for the link to get more traffic.’”
One document titled “What is Collateral Damage?” even goes so far as to note that “the net result is that Facebook, taken as a whole, will be actively (if not necessarily consciously) promoting these types of activities. The mechanics of our platform are not neutral.”

Facebook has taken limited action to address existing misinformation

According to an internal document on problematic non-violating narratives referenced in at least two of the complaints, Facebook removes as little as 3% to 5% of hate speech and less than 1% of content that is considered violent or inciting to violence. That’s because the volume is too much for human reviewers to handle, and it is difficult for its algorithms to accurately classify content when context must be considered.

Internal documents on Facebook’s role in the 2020 election and the January 6 insurrection also suggest those spreading misinformation are rarely stopped by the company’s intervention mechanisms. One document notes that “Enforcing on pages moderated by page admins who post 2+ pieces of misinformation in the last 67 days would affect 277,000 pages. Of these pages, 11,000 of them are current repeat offenders pages.”

Despite Facebook’s claims that it will “remove content from Facebook no matter who posts it, when it violates our standards,” according to Haugen, “in practice the ‘XCheck’ or ‘Cross-Check’ system effectively ‘whitelists’ high-profile and/or privileged users.” An internal document on mistake prevention cited in one complaint contends that “‘over the years many XChecked pages, profiles and entities have been exempted from enforcement.’”
Internal documents on “quantifying the concentration of reshares and their VPVs among users” and a “killswitch plan for all group recommendation surfaces” indicate Facebook also rolled back some changes shown to reduce misinformation because those changes slowed the platform’s growth.
Additionally, Haugen claims the company falsely told advertisers that it had done all it could to prevent the insurrection. According to a document cited in the filing titled “Capitol Riots Breaks the Glass,” the safer parameters Facebook implemented for the 2020 election, such as demoting content like hate speech likely to violate its Community Standards, were actually rolled back afterward and reinstated “only after the insurrection flared up.”
In one document, a Facebook official states, “we were willing to act only *after* things had spiraled into a dire state.”

Facebook has misled the public about the negative effects of its platforms on children and teens, especially young girls

When asked during a congressional hearing in March whether Facebook’s platforms “harm children,” Facebook CEO Mark Zuckerberg said, “I don’t believe so.”
However, according to Facebook’s own internal research cited in one of Haugen’s complaints, “13.5% of teen girls on Instagram say the platform makes thoughts of ‘Suicide and Self Injury’ worse” and 17% say the platform, which Facebook owns, makes “Eating Issues” such as anorexia worse. The research also claims Facebook’s platforms “make body image issues worse for 1 in 3 teen girls.”

Facebook knows its platforms enable human exploitation

Although Facebook’s community standards state that it will “remove content that facilitates or coordinates the exploitation of humans,” internal company documents cited in one of Haugen’s complaints suggest the company knew “domestic servitude content remained on the platform” prior to a 2019 investigation by BBC News into a black market for domestic workers on Instagram.

“We are under-enforcing on confirmed abusive activity with a nexus to the platform,” one document titled “Domestic Servitude and Tracking in the Middle East” stated. “Our investigative finding demonstrate that … our platform enables all three stages of the human exploitation lifecycle (recruitment, facilitation, exploitation) via real-world networks. … The traffickers, recruiters and facilitators from these ‘agencies’ used FB profiles, IG profiles, Pages, Messenger and WhatsApp.”
