It’s been quite a ride for Facebook. Whistleblower Frances Haugen emerged to give a shocking interview on “60 Minutes,” an hours-long outage shut down Facebook’s products, Haugen testified on Capitol Hill, and hackers reportedly offered 1.5 billion people’s public data scraped from Facebook for sale.

All of this happened within a single week, which leads me to ask some tough questions about whether policymakers might overreact.

Is it possible that policymakers would base policy on the views of two data scientists who, at least in their testimony, failed to distinguish between focus groups and peer-reviewed research? After Haugen’s appearance in Washington, another Facebook whistleblower, Sophie Zhang, testified in London on October 18th.

Will policymakers factor in academic research about social media’s positive effects as well as its negative ones? Consider research by Mizuko Ito, Candice Odgers, and Victoria Rideout. Rideout’s latest study found that 43% of teens said social media made them feel better when they were struggling mentally, while 17% said it made them feel worse. Self-critical social comparison and body image struggles have long been part of adolescence.

Is there a risk that platforms will now avoid researching the negative impacts of their own products?

This does not look like a Big Tobacco moment. Social media users are not cigarette smokers. Yes, tobacco use has social elements, but the focus of policymakers was on its physical, not psychological, impact.

Potential policy responses are difficult to design. If algorithms are regulated, regulators will have to write rules on how users are engaged and how content is recommended. Other areas to investigate include how companies leverage data, how transparent they are, the balance of human and algorithmic content moderation, and whether users of different languages in distant countries are treated fairly.

Without offering detailed recommendations for dealing with these difficult issues, let me propose two high-level solutions. Facebook should:

  • Acknowledge that it’s no longer just a corporation and structure itself accordingly.
  • Support government networks of help services for Internet users around the world.

According to the Facebook files, neither additional human moderators nor tweaked algorithms are likely ever to catch up with harmful content and behavior. On the speech side, moderators will never have enough context to make fail-proof moderation decisions: is it a parody, sarcasm, “just a joke,” a cruel joke?

Algorithms for deleting content have to be “fed” tons of data to make good moderation “decisions,” and the offline-world speech and behavior reflected in that data is nuanced and keeps changing. It’s unlikely the algorithms can ever catch up. And there are different norms and definitions all over the world to be factored in.

Governments should support or establish Internet helplines. These provide platforms with the context they need to delete context-dependent harmful content such as harassment, hate speech, and bullying. A network of Internet helplines is already up and running in Europe; the European Commission helped set it up more than a decade ago. There’s also user care provided by the eSafety Commissioner’s Office in Australia and by NetSafe in New Zealand.

Ideally, every country should have an Internet helpline, including the U.S. A U.S. helpline should be independent of both industry and government but could be partially funded by both (and by individuals), as the National Center for Missing & Exploited Children is. Each country’s helpline needs to be structured and funded as appropriate for its context.

Facebook needs to organize itself differently, in keeping with the role it has come to play. Because of its penetration into the everyday lives of people all over the world, how much data it handles, the infrastructure it provides, and so much more, Facebook has – in practice, on the ground – evolved away from being a mere corporation. It has become part utility, part social institution as well.

Facebook needs to see itself and act as a social institution as much as a corporation. Direct knowledge of its impact on vulnerable people in every culture and political system where it has a presence needs to be folded into product development, acquisitions, and every management decision. The company needs a chief safety officer in the “C suite” – an office that has real power, doesn’t fall under marketing or lobbying, supports the kind of investigative work that journalists do, and has sufficient budget to contribute to helpline operations around the world.

It feels like the ground is shifting – let’s see how much.

Anne Collier runs The Net Safety Collaborative. She serves on the Trust & Safety advisories of Facebook, Snapchat, Twitter, Yubo, and YouTube, and has received funding from some of these companies.