Recently, internal Facebook documents have been leaked to Congress and the press. Around 2 billion people around the world use a product owned by Meta. The documents, leaked by employee whistleblower Frances Haugen, revealed a host of problems. These include:

  • how political extremism spread on the platform
  • how the company struggled to contain anti-vaccine content about Covid-19 posted by its users
  • how Instagram can affect teenagers' mental health

The leaked documents clearly indicated that the company was aware of serious harms caused by its products, and that in many cases it failed to address them. In a statement, a Facebook spokesperson said, “We take steps to keep people safe even if it impacts our bottom line. To say we turn a blind eye to feedback ignores these investments, which includes the over $5 billion we’re on track to spend this year alone on safety and security, as well as the 40,000 people working on these issues at Facebook.”


For many years, Congress has debated whether and how it should regulate social media platforms. Experts have warned that these platforms could have grave long-term consequences for society. Facebook has said it welcomes regulation, and after the revelations about Instagram's harm to teenagers, many Democrats and Republicans now believe that something must be done to rein the company in.

The company says it is seeking expert guidance on how to address some of its problems, and for two and a half years it has been calling for updated regulations on its business. The question that has now arisen is whether Facebook can be fixed.

Facebook may not be entirely fixable, but some of its problems can be mitigated if the company makes a few changes. The Meta umbrella covers Instagram, WhatsApp, Messenger, and Oculus, along with Facebook. Many experts believe the company's concentrated power could be defanged if it spun off these businesses, allowing smaller competitors to emerge and challenge the company to do better.

In the US, the social media industry has no dedicated oversight agency. Regulating safety standards on the internet would require creating a new agency, or at least increasing funding for the existing FTC. Section 230 is the law that protects free speech online: it shields technology companies from legal liability for content their users post on their platforms, even when that content causes real-world harm. But reforming Section 230 in a way that neither entrenches incumbents nor runs into First Amendment challenges will be difficult.

Facebook is like a black box to researchers, journalists, and analysts trying to understand how its complex and ever-changing algorithms dictate what billions of people see online. Some experts believe Facebook should be legally required to share specific internal data with vetted researchers about what information is circulating on its platforms, and that company executives should be criminally prosecuted for downplaying the harm their company causes to people or for misleading business partners.