Some of Facebook’s darkest secrets have been shared with the world

On the surface, Facebook seems like a rather harmless company. It’s not a polluting industry. In fact, Facebook is even trying to go carbon-neutral. So, what is it about Facebook that makes it so dangerous? And how dark could its secrets be? Well, as we’ve learned over the years, the secrets of a company that is only trying to bring people together are far darker than one would imagine. So, without further ado, let’s meet Frances Haugen and see what she has to say.
At this point, you may be wondering: is this article even worth reading? Who even uses Facebook anymore? Well, almost everyone. That’s because Facebook also owns two of the world’s most popular social media apps: Instagram and WhatsApp.

Meet Frances Haugen

This is Frances Haugen. Until last week, Frances was just an ordinary citizen who had lost her job in the middle of the pandemic. Before that, she was a Facebook employee in California, where she worked on a team that researched “democracy and misinformation”.

By revealing Facebook’s internal data, Frances Haugen has become what is known as a whistleblower.

A while ago, however, Frances noticed something. Her team at Facebook had conducted research showing that the methods Facebook used to attract and keep users were unsafe. They found that Facebook’s leaders were prioritizing profit over safety, and, as a result, were knowingly harming their users’ mental health and the very fabric of democracy.
When no one implemented change despite Frances’ requests, she took matters into her own hands and decided to show the world what Facebook had found. She photocopied thousands of pages of Facebook’s internal, private research. Finally, armed with all the proof she needed, Frances shared the photocopied documents with one of America’s largest news publications: The Wall Street Journal.

Frances’ Concern
It’s no secret that governments around the world have been worried about the impact that Facebook’s content can have on our society. As a result, they have pushed Facebook to monitor what gets posted on its site, and that’s exactly what Facebook does. Facebook uses both human moderators and automated computer programs to sift through everything that billions of people post on its platforms. The moment violent or inappropriate content is spotted, it is removed. However, with such a large mass of content to monitor, Facebook only catches about half of the inappropriate posts put up on its platforms.
Unlike the world’s governments, Frances Haugen believes that asking Facebook to remove inappropriate posts only scratches the surface. She believes the real reason Facebook is so unsuccessful in cleaning up its platforms is, in fact, its own set of ruling principles, or algorithms.

What did the documents reveal?
The documents that Frances Haugen shared revealed some extremely harmful consequences of Facebook’s guiding principles or algorithms. Here are a few:

In Myanmar, viral fake news and hate speech about the Rohingya Muslim community spread on Facebook and eventually had deadly consequences for the people who belonged to that community.

Facebook’s content-checking software only operates in about 40 languages globally. As a result, a great deal of the false and dangerous content posted in other languages goes undetected.
Facebook’s algorithm vs Misinformation in India
India, too, is affected by Facebook’s inability to detect misinformation and harmful content across languages. Haugen’s documents showed that 40% of the content promoted on Facebook regarding West Bengal was fake and dangerously misleading. Due to the language barrier, Facebook has also unknowingly promoted pages that call for senseless violence against the Muslim community in India. The algorithms did this simply because these posts generated more reactions and kept people coming back to Facebook’s social media platforms.

“Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.”
Facebook Files, Wall Street Journal
One of the most shocking revelations from the Facebook Files was that Instagram’s internal research found that its platform is worsening mental health among teenage girls.
For Frances Haugen, the biggest problem with Facebook is that it knows what it does is harmful and yet continues to do it. However, it is worth noting that Facebook took the initiative to conduct such research in the first place. Do you think whistleblowing will discourage companies from looking into the negative impact of what they do in the future?
Whoa! That was a lot! Relax a bit, and come back to learn about what Frances wants from Facebook.
