Last year, whistleblower and former Facebook product manager Frances Haugen shared thousands of pages of internal company documents with 17 US news outlets, the Securities and Exchange Commission, and a separate consortium of European news organizations.
These documents, called “The Facebook Papers,” formed the basis of news reporting and analysis on key issues such as how Facebook’s platform might harm children, how it may have played a role in enabling political violence, and how the company reportedly prioritizes profit over safety. Redacted versions of these papers, with the names of Facebook users and lower-level employees blacked out, were also shared with Congress for its investigation.
Since these articles were published, however, academics and smaller media outlets have argued that the papers should be made available to the wider public for independent study and analysis. Gizmodo, one of the outlets that had access to the papers last year, announced in November that it would publish them, and said it was partnering with a group of independent experts to review the documents and redact information about private individuals prior to publishing.
Today, the first set of these papers went live on Gizmodo’s site, categorized by topics such as “Papers About the Jan. 6 Capitol Attack,” “Papers Describing the Election-Related Task Force Monitoring ‘Complex Financial Organizations,’” and “Election-Related Platform and Product Updates.” While the 2020 election is the overarching theme of all the papers Gizmodo uploaded today, there are also documents detailing how Meta (Facebook’s parent company) “tackles sensitive issues like sex trafficking, disinformation, and voter manipulation.”
You can look through the internal screenshots, memos, posts, agendas, and research documents from Meta released this week here. Gizmodo will upload a new set of documents to the designated web page every week.
So what are some high-level takeaways? The Washington Post reported that the papers showed that internal research findings at Facebook often clashed with what Mark Zuckerberg, CEO of Meta, told the public. Additionally, Facebook was inconsistent in how it policed harmful content, and failed to do so effectively in most of the world. Lastly, Facebook reportedly designed its product to maximize engagement rather than promote safety or stem misinformation.
Additionally, according to AP News, the documents revealed internal conflicts at Facebook: employees raised problems with how the company regulated users and how its news feed algorithm surfaced toxic content, only for the company to ignore their concerns. Moreover, Facebook was losing many of its young adult users, who found the platform “boring, misleading and negative,” AP reported. This makes Facebook’s content ecosystem more fragile, as older adults tend to share more misinformation and disinformation. Recently, The Washington Post also found that Facebook had paid a Republican consulting firm to plant false negative stories about its competitor TikTok.
Even after a rebrand and a pivot, Facebook, now Meta, is having a rough start to the year. In February, Meta lost more than $200 billion in market value in one day as its stock plunged. That same month, the company reported that Facebook had lost average daily users in the fourth quarter of 2021. Since then, Meta has been trying to increase transparency into the inner workings of its platforms, starting with Instagram.
Currently, Meta, along with several other big tech companies, has also been working “round-the-clock” to fact-check content from the Russia-Ukraine war, The New York Times reported. These added actions on social and internet platforms have incited discussions around conflict-related content moderation across the world.