By Anne Sherry, J.D.
In a rare look at a process that is usually anonymous and confidential, 60 Minutes published some of the complaints filed with the SEC by Facebook whistleblower Frances Haugen. Haugen appeared on the program and testified before a Senate subcommittee to describe the harms that Facebook’s algorithms cause to the mental and physical well-being of users and others. In the complaints filed with the SEC, Haugen identified material statements by Facebook officials about the company’s policies and practices that are contradicted by its actual practices and by the effects of its algorithms.
Whistleblower complaints. The whistleblower complaints published by 60 Minutes describe a range of undisclosed or misrepresented effects of the current structure of Facebook’s platforms and their algorithms. One disclosure concerns hate speech: Haugen alleges that Facebook publicly stated that it proactively removes over 90 percent of identified hate speech, while internal records show that as little as 3 to 5 percent of hate speech is actually removed. Another complaint takes issue with CEO Mark Zuckerberg’s statements to Congress that Facebook does not harm children or profit from social media addiction, even though Facebook’s own research revealed that Instagram made thoughts of suicide and self-injury worse for 13.5 percent of teen girls, and made eating issues such as anorexia and bulimia worse for 17 percent of teen girls. According to that internal research, “We make body image issues worse for 1 in 3 teen girls.”
The other whistleblower complaints describe how Facebook contributed to global division and ethnic violence by prioritizing resources to certain countries over others; failed to address the known use of Facebook and Instagram for human trafficking and domestic servitude; misled the public about equal enforcement of its rules while in fact whitelisting high-profile users; and contributed to the January 6 insurrection at the U.S. Capitol.
According to the complaints, these misrepresentations and omissions are important to investors for two reasons. First, as the information comes to light, users are likely to use the platforms less, reducing advertising revenue and thereby harming investors. Second, some investors will not want to invest in Facebook knowing the harm that it facilitates.
Senate testimony. In her Senate testimony, Haugen repeatedly pointed to the ways Facebook’s algorithms harm users and foment misinformation and violence. For example, she said that young users looking for healthy recipes are very quickly steered to content that promotes eating disorders. Facebook also dedicates the vast majority of its content-moderation resources to English-language content because English-speaking regions tend to be the most profitable for the company, allowing for situations like the use of Facebook to incite violence and genocide in Ethiopia, where six languages are spoken. Haugen said she does not believe that content moderation is the solution; the solution has to lie in changes to the ways Facebook’s algorithms incentivize and reward outrage and misinformation. These can be simple changes, she said: for example, requiring that users have clicked on a link before they can share it.
Haugen also does not believe that Facebook should be broken up, stating that the same issues would persist at the separate companies. Instead, she argued for regulatory oversight and for transparency in the form of full access to Facebook’s research data. Even Facebook’s own oversight board does not have insight into the company’s algorithms, she said. This is “like the Department of Transportation regulating cars by watching them drive down the highway. Imagine if no regulator could ride in a car, pump up its wheels, crash test a car, or even know that seat belts could exist.” Without understanding what causes the problems at Facebook, regulators cannot craft specific solutions, Haugen said.