Facebook whistleblower says Digital Services Act can rein in social media giant

Haugen: ‘Facebook’s profit optimisation machine is generating self-harm and self-hate especially for vulnerable groups like teenage girls’

Former Facebook employee Frances Haugen

Facebook whistleblower Frances Haugen testified at a European Parliament joint hearing on Monday.

Haugen’s testimony may help lawmakers in Brussels craft policy measures with the best possible chance of reining in big tech companies like Facebook, which now hold an undeniable position of power within European economies and societies.

Her hearing comes at an opportune time, in the midst of an EU push to regulate the online space through comprehensive legislation known as the Digital Services Act (DSA).

At the time of writing, the DSA is surrounded by its own controversy, as groups representing journalists, media freedom organisations and civil society have not all welcomed the proposed legislation with open arms.

Some, such as the #OffOn coalition, are deeply concerned with its current state and have described the law’s unintended effects as an affront to fundamental freedoms, rights and democracy.

Addressing MEPs, Haugen said: “If there are only two things everyone takes away from these disclosures it should be that first, Facebook chooses profit over safety every day, and without bold action from lawmakers this will continue. The second is that Facebook has exploited its ability to hide the actual behaviour of the platform to allow our safety to decay to an unacceptable level.

“If Facebook is allowed to continue to operate in darkness we will only see escalating tragedies as a result. I came forward at great personal risk because I believe we still have time to act, but we must act now. Thank you.”

The committees on the Internal Market and Consumer Protection, Legal Affairs, Economic and Monetary Affairs, Industry, Research and Energy, and Civil Liberties, Justice and Home Affairs were joint conveners of the Haugen hearing.

“I am especially interested to learn if Frances Haugen has specific suggestions for our work on the Digital Services Act regarding the liability of online platforms and marketplaces, recommender systems and algorithms,” said MEP Christel Schaldemose.

Haugen’s 15-minute presentation was a strong indictment of Facebook’s business practices, brimming with stern warnings for the public and counter-balanced by hopeful assertions about the DSA’s potential to rectify these harms and lead to a brighter digital future.

“Facebook damages the health and safety of our communities and threatens the integrity of our democracy,” said Haugen, formerly Facebook’s lead product manager for civic misinformation and later for counter-espionage.

“Facebook repeatedly encountered conflicts between its own profits and our safety. Facebook consistently resolved these conflicts in favour of its profits. The result has been a system that amplifies division, extremism and polarization. It undermines societies around the world. In some cases, this dangerous online talk has led to actual violence that harms and even kills people. In other cases, their profit optimization machine is generating self-harm and self-hate, especially for vulnerable groups like teenage girls.”

Despite her harsh critique of Facebook, Haugen pointed out that the world is at a crossroads and that this juncture is a “once in a generation opportunity”.

“The DSA has the potential to be a global gold standard,” Haugen said, calling the law “a content-neutral approach to address the systemic risks and harms of the overall business model.”

Haugen also drew attention to three critically important points: risk assessments and access to privacy-aware data streams, comprehensive rules and standards for the business model, and the danger of loopholes and exceptions.

Interestingly, Haugen signalled her concerns over exceptions in the DSA for regulating journalistic content. “Let me be clear: every modern disinformation campaign will exploit news media channels on digital platforms by gaming the system; if the DSA makes it illegal for platforms to address these issues, we risk undermining the effectiveness of the law. Indeed, we may be worse off than today’s situation.”

She explained that the problem is that “only Facebook gets to look under the hood” in terms of accessing their own data and internal knowledge of the algorithms at play, such as the engagement-based content-ranking systems.

Haugen recommended that Facebook be made more accountable and transparent in terms of its algorithms, safety protocols and data, all of which should be disclosed and subject to public auditing, research and analysis by academics as well as governments.

“Simply put, people need to know exactly how Facebook works and governments and regulators need to be given access to Facebook’s data to concretely assess whether actions they are taking to rectify problems are really being done in good faith and having a real effect,” she said.

She also cautioned against over-reliance on artificial intelligence to monitor and police the platform, because AI’s difficulty in understanding wider context limits its utility.

Haugen compared the current state of AI used for this purpose to “asking 1,000 third-graders to solve a problem... AI simply is not developed enough yet to regulate the complexity of the multilingual online space.”

Haugen also pointed out the differential harm to non-English-speaking users of the platform, who simply do not get the same protection from disinformation and harmful content because Facebook struggles to understand and address such content in the plethora of languages used on the platform.

Haugen said the DSA could be what finally forces platforms like Facebook to take responsibility for the effects their products and services have on society.

“Even Facebook’s own internal regulators can’t access the company’s own data on public safety, much less conduct an independent audit... How is the public supposed to assess if Facebook is resolving conflicts of interest in a way that is aligned with the public good, if it has no visibility and no context into how Facebook really operates? This must change.”

Ewropej: Funded by the European Union

This article is part of a content series called Ewropej. This is a multi-newsroom initiative part-funded by the European Parliament to bring the work of the EP closer to the citizens of Malta and keep them informed about matters that affect their daily lives. This article reflects only the author’s view. The action was co-financed by the European Union in the frame of the European Parliament's grant programme in the field of communication. The European Parliament was not involved in its preparation and is, in no case, responsible for or bound by the information or opinions expressed in the context of this action. In accordance with applicable law, the authors, interviewed people, publishers or programme broadcasters are solely responsible. The European Parliament can also not be held liable for direct or indirect damage that may result from the implementation of the action.
