Former Facebook executive: ‘it is a global problem, it is ripping society apart’
Chamath Palihapitiya, a former executive at the social media giant, says he feels guilt over his work on 'tools that are ripping apart the social fabric of how society works'
Former Facebook executive Chamath Palihapitiya said he feels “tremendous guilt” over his work on “tools that are ripping apart the social fabric of how society works,” joining a growing chorus of critics of the social media platform.
Palihapitiya, who was vice-president for user growth at Facebook before leaving the company in 2011, said:
“The short term, dopamine-driven feedback loops that we have created are destroying how society works. No civil discourse, no cooperation, misinformation, mistruth.”
The remarks were made at a Stanford Business School event in November and were surfaced by the tech news site the Verge on Monday.
“This is not about Russian ads,” he said. “This is a global problem… It is eroding the core foundations of how people behave by and between each other.”
Palihapitiya’s comments last month were made just 24 hours after Facebook’s founding president, Sean Parker, criticised in an interview the way the company “exploit[s] a vulnerability in human psychology” by creating a “social validation feedback loop”.
Parker said that he was “something of a conscientious objector” to using social media, a sentiment shared by Palihapitiya, who said that he was now hoping to use the money he made at Facebook to have a positive effect.
Palihapitiya also called on his audience to “soul search” about their own relationship with social media.
“Your behaviours, you don’t realise it, but you are being programmed,” he said. “It was unintentional, but now you’ve got to decide how much you’re going to give up, how much of your intellectual independence.”
Over the past year, social media platforms and companies have faced increased scrutiny as critics link growing political divisions across the world to the handful of platforms that tend to dominate discourse.
Many attributed the unexpected outcomes of the 2016 presidential election and Brexit, at least in part, to the ideological echo chambers created by Facebook’s algorithms, as well as the proliferation of “fake news” and propaganda, alongside legitimate news sources in Facebook’s newsfeeds.
Facebook only recently acknowledged that it had sold advertisements to Russian operatives seeking to sow division among US voters during the presidential election.
Facebook has also faced criticism for its role in amplifying anti-Rohingya propaganda in Myanmar, amid claims of “ethnic cleansing” of the Muslim minority.
Palihapitiya referred to a case in the Indian state of Jharkhand, where false WhatsApp messages warning of a group of kidnappers led to the lynching of seven people. WhatsApp is owned by Facebook.
“Imagine when you take that to the extreme where bad actors can now manipulate swaths of people to do anything you want. It’s just a really, really bad state of affairs,” said Palihapitiya.
Facebook did not immediately respond to requests for comment.