Facebook Whistleblower Exposes Platform’s Harmful Tactics

Rebekah Anastasia Ericson

During her Oct. 5 testimony, Haugen shed light on Facebook’s malicious tactics, which included intentionally promoting content harmful to teens’ mental health.

This past September, as Americans discussed COVID vaccines, the nation’s border crisis and climate change, former Facebook employee Frances Haugen was quietly preparing to expose a less visible issue involving her former employer.

At last, she decided to speak up, and The Wall Street Journal listened, publishing an investigative report called “The Facebook Files” on Oct. 1. 

The troublesome information remained shrouded in mystery until an episode of “60 Minutes” premiered on Oct. 3, introducing Haugen as the “Facebook whistleblower.”

Haugen had worked for Facebook as a product manager before quitting earlier this year. The former employee presented documents, research and other files alleging the company had knowingly been causing harm to its users and doing nothing about it, despite internal pushback and recommendations from sought-after professionals. 

Since the massive leak, Haugen has filed for protection with the National Whistleblower Center to divulge information — without legal repercussions — kept hidden by Facebook officials from the public.

According to Haugen’s documents, major flaws existed in Facebook’s algorithmic models. At a Congressional subcommittee hearing on Oct. 5, she had the opportunity to speak directly to senators about her findings.

During her testimony, Haugen alleged that Facebook engaged in many harmful business practices and continued them even after its own internal studies documented the harm. The company prioritized profitability over the well-being of its users, Haugen claimed.

“I saw Facebook repeatedly encounter conflicts between its own profits and [users’] safety,” Haugen told the Senate committee. “Facebook consistently resolved these conflicts in favor of its own profits.”

Specifically, Haugen claimed that Facebook created its algorithms to facilitate user engagement by amplifying sensational content such as hate speech, radical ideas and misinformation — all in the name of profits. 

Because these types of content can elicit strong reactions among Facebook users, the company intentionally molded its algorithms to promote such content.

Haugen claimed that Facebook prioritized profits over the well-being of its users. Photo courtesy of Deposit Photos.

According to the leaked research and data, these algorithms preyed on people’s insecurities, specifically those of teenagers. Knowing this, Facebook organized and presented content in a way that made the product intentionally addictive for young, vulnerable users.

The harm created by Facebook’s content and algorithms extended to children and teens, affecting issues like body image and mental health. 

One document leaked by Haugen showed that 32% of teens interviewed for a research study reported that Instagram (which is owned by Facebook) made them feel insecure about their bodies. Similarly, another Facebook study found that 13.5% of teens surveyed said the content presented on Instagram strengthened their suicidal thoughts.

As the future of Facebook remains in jeopardy, Haugen hopes her revelations have enlightened people regarding what occurs behind Facebook’s closed doors and can encourage conversations on how to resolve the issues.

Changes could come in numerous forms — national privacy laws, stronger safeguards for children and a reassessment of how algorithms are used are just a few approaches to resolving such a significant problem. 

It is unclear what action will follow as the issue progresses. As conversations continue and investigators delve deeper into the allegations, Facebook will undoubtedly have more questions to answer, and Frances Haugen has the receipts.