Facebook Harms Its Users Because That’s Where Its Profits Are
Facebook has been the target of an unprecedented flood of criticism in recent months — and rightly so. But too many critics seem to forget that the company is driven to do bad things by its thirst for profit, not by a handful of mistaken ideas.
Thinking about Facebook and what to do about it means grappling with two conflicting sets of facts. One is that Facebook is an immensely useful platform for communication, news publishing, economic activity, and more, one that billions of people around the world rely on. The other is that Facebook is a highly addictive, profit-seeking entity that exploits and manipulates human psychology to make money, with dire results for the rest of society.
The company is in the news again, now at the end of a week from hell, thanks to explosive revelations from a disillusioned former employee who leaked a trove of its internal documents to the public. Through an ongoing Wall Street Journal series based on the leak, a 60 Minutes interview, and an appearance in front of Congress, whistleblower Frances Haugen has made the crux of her case clear: Facebook is well aware of the various harms and dangers of its platforms but has consistently failed to rectify them, because doing so would conflict with the company’s continued growth and profits.
One report revealed that company researchers had themselves determined that Instagram, which is owned by Facebook, has a psychologically damaging effect on teen girls, even as the company denied this publicly and plowed ahead with a version of Instagram for under-thirteens. Another found that the company’s 2018 rejigging of its algorithm to promote “meaningful social interactions,” or MSI, had instead incentivized posts and content rooted in outrage, social division, violence, and bullshit. Others show that Facebook was intentionally targeting kids and finding ways to hook them on the product early, and that it dragged its feet on taking down posts it knew were made by drug cartels and human traffickers.
Many solutions have been put forward to deal with the problem of Facebook, like using antitrust laws to break it up, altering Section 230 to allow tech companies to be sued for things posted on their platforms, or, as Haugen suggested to Congress, demanding more transparency from the company and creating a regulatory oversight board. But much of what’s been revealed in these documents adds more weight to arguments that the company, and other de facto monopolies like it, should be treated as a utility or even taken under public ownership.
What the documents make clear is that, as Haugen told Congress, when Facebook comes across a conflict between its profits and people’s safety, it “consistently resolved these conflicts in favor of their own profits.” Facebook knows its platforms are bad for kids, but to keep growing, it needs to hook those kids, both so they’re part of its user base as adults and so they bring their parents into the fold. “They are a valuable but untapped audience,” one internal document from 2020 states about tweens, with the company studying preteens, plotting out new products to capture them, and discussing the idea of “playdates as a growth lever.”
Facebook understands that boosting MSI might feed division, vitriol, and all manner of unhealthy behaviors among its users, but not doing so means less engagement and, therefore, potentially less profit. When one employee suggested dealing with misinformation and anger by removing the priority the algorithm gives to content reshared through long chains of users, Mark Zuckerberg, she wrote, wouldn’t do it “if there was a material tradeoff with MSI impact.” When researchers suggested the company tweak its algorithm so it didn’t send users into ever-deeper, more extreme rabbit holes (an interest in healthy recipes, for instance, soon led to anorexia content), top brass ignored them for fear of limiting user engagement.
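To make the proposed fix concrete, here is a minimal sketch of the tradeoff; the names, weights, and depth threshold below are entirely hypothetical, since the leaked documents describe the ranking change only in broad strokes, not its actual code.

```python
# A toy model (all names and weights hypothetical) of an engagement-weighted
# "MSI"-style feed score versus a variant that stops boosting posts that
# arrive through long reshare chains.
from dataclasses import dataclass

@dataclass
class Post:
    likes: int
    comments: int
    reshares: int
    reshare_depth: int  # hops of resharing between the original poster and this user

def msi_score(post: Post) -> float:
    # Engagement-weighted score: reshares count heavily, so content that
    # provokes sharing (often outrage) gets boosted the most.
    return 1.0 * post.likes + 5.0 * post.comments + 15.0 * post.reshares

def demoted_reshare_score(post: Post, max_depth: int = 2) -> float:
    # Roughly the employee's proposal: stop amplifying posts that reach a
    # user through a long reshare chain, whatever their raw engagement.
    score = msi_score(post)
    if post.reshare_depth > max_depth:
        score *= 0.1  # heavy demotion instead of a viral boost
    return score

viral_rumor = Post(likes=200, comments=900, reshares=4000, reshare_depth=7)
friend_update = Post(likes=40, comments=12, reshares=1, reshare_depth=0)

for rank in (msi_score, demoted_reshare_score):
    print(rank.__name__, rank(viral_rumor), rank(friend_update))
```

Under the first score, the viral rumor drowns out the friend’s update; under the second, much of its advantage disappears, along with, presumably, some MSI: exactly the tradeoff Zuckerberg reportedly refused to make.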
“A lot of the changes I’m talking about are not going to make Facebook an unprofitable company,” Haugen told Congress this week. “It just won’t be a ludicrously profitable company like it is today.”
Just as other companies drive sales by building devices designed to break down and stop working after a few years, it’s Facebook’s hunger for growth and bigger profits that drives its reluctance to act responsibly. It would seem a no-brainer to take these incentives out of the equation, especially as these platforms take on the status of “natural monopolies,” like railroads and telecommunications before them.
If a firm is publicly owned or simply a tightly regulated utility, it doesn’t need to work under the capitalist logic of growth and excessive profit seeking that’s fueled these issues, nor does it have to survive if its user base no longer needs or cares for it. The fact that the company is going out of fashion with the youth and is predominantly used by people over thirty might be a problem for Mark Zuckerberg, private owner of Facebook, but it’s not much of an issue for a utility that a government reluctantly nationalized because of how much its users came to depend on it. In fact, it sounds like a readymade solution for a platform that most of us agree is, at best, addicting and unhealthy.
If younger generations don’t care if Facebook survives, why should we force them to think otherwise? If people are happier when they’re persuaded to unfollow everything and empty out their news feeds, why should we retaliate, as Facebook recently did to the creator of the tool that let them do this?
Of course, there are many practical matters that would have to be ironed out. For one, Facebook might be a US company, but its utility-like services are delivered to the entire globe, so there are real questions about what a publicly owned or regulated Facebook would actually look like — questions like “Which public?” or “Regulated by whom?”
Similarly, any such scheme would have to be designed around stringent oversight and democratic control, lest the exploitation and manipulation carried out by the platform simply get transferred from the private sector to government. (Bear in mind, though, that through their surveillance programs and cyber operations, Washington and other governments are already using platforms like Facebook to collect and store data about the world’s users and to manipulate the information that circulates on them.) Perhaps in the end, the right way forward will be some combination of all of these solutions, including both breaking these monopolies up and taking parts of them under public ownership.
But if the exact solution isn’t clear yet, what is clear is that the current state of things is untenable. Beyond the issues highlighted by Haugen’s leak, we’ve long known that social media platforms and other tech innovations are mentally unhealthy for us, having been deliberately designed to be addictive to the point that the very software engineers and tech moguls responsible for them avoid using their own creations. Perhaps there’s a way to keep social media and its most useful features in our lives while getting rid of its most malignant characteristics; or perhaps the whole thing will turn out to be a mistake fundamentally incompatible with the way the human brain works. But to find out, we have to at least try something different.
Sadly, that’s not the solution that much of the Wall Street Journal series, most of Congress, and other news outlets seem to be pointing to in reaction to this leak. Predictably, the news has produced calls for intensified “content moderation,” meaning censorship, by these tech companies, as a way to prevent the spread of all kinds of misinformation or to stop platforms from “enabling dangerous social movements,” as Haugen accused them of doing.
Ironically, this is despite the fact that the documents themselves show the folly of censorship as a solution to these issues. The very first story in the Journal’s series is about how Facebook created a “whitelist” of tens of thousands of high-profile accounts, making them immune from punishment for posting the kinds of things that would get other users censored, suspended, or permanently banned. All the while, the company’s censors went after lower-level users, taking down completely innocuous posts or ones whose message they misinterpreted, including those of Arabic-language news outlets and activists during Israel’s crackdown on Palestinians earlier this year. These platforms have shown repeatedly that they can’t be trusted to moderate content accurately and responsibly, as documented just this week in a Human Rights Watch report on Facebook’s suppression of Palestinian content.
This response is part of a long-standing trend of what Olivier Jutel has termed tech reductionism: the belief that tech companies and their products aren’t just harmful and unhealthy for us but responsible for every bad thing that’s happened in the last few years. In deciding how to deal with this issue (and choosing censorship as the way forward), we’re in danger of missing the wider political, economic, and social factors driving the tumult of our current world, and ascribing it instead to the near-mystical power of social media. Was Facebook really singularly responsible for the January 6 Capitol riot? Or was it just one of a number of useful tools that allowed attendees to organize themselves for the event, attendees driven by a combination of economic dislocation and election lies peddled largely by elites and the mainstream media?
Haugen told the Journal that her motivation for coming forward was watching a liberal friend of hers be swept up by sinister delusions described as “a mix of the occult and white nationalism” after spending “increasing amounts of time reading online forums” (not Facebook, oddly), culminating in the end of the friendship. Yet people regularly encounter or consume propaganda, let alone simply use social media and the internet, without going down a similar road. Unfortunately, we never learn what underlying factors led Haugen’s friend to be sucked into this miasma of lies, nor what later led him to renounce those beliefs. Misinformation has always been rife in the world; answering those questions would help us understand why it seems particularly potent in this era.
Avoiding mass censorship efforts doesn’t mean we’re powerless to do anything. There are clear changes that could be made to Facebook’s algorithms, design, central mission, and resourcing that would bring it closer to the true public service it claims to be than the nihilistic, profit-making juggernaut it currently operates as, and none of them would threaten our right to speak freely or interfere with our ability to stay in touch with loved ones, organize events, or use such platforms’ other helpful features. Who knows — we might even feel like logging off every now and then.