Steven V. Roberts: First, do no harm

Published 12:00 am Friday, October 29, 2021

Many Facebook employees felt their own company helped instigate and organize the mob that stormed the U.S. Capitol on Jan. 6.

“Haven’t we had enough time to figure out how to manage discourse without enabling violence?” one worker wrote afterward. “We’ve been fueling this fire for a long time, and we shouldn’t be surprised it’s now out of control.”

An enormous trove of internal documents leaked to the press by a former Facebook employee, Frances Haugen, makes the answer to that question crystal clear. It is "no" — Facebook has not figured out how to encourage free speech, a bedrock principle of American democracy, while discouraging the use of its platform to undermine that same system.

Many factors contributed to the poisonous polarization that erupted on Jan. 6 — including treacherous leaders like Donald Trump — but Facebook was a prime co-conspirator. As Haugen told Congress: “Facebook’s products harm children, stoke division and weaken our democracy. The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people. Congressional action is needed. They cannot solve this crisis without your help.”

The case against Facebook boils down to two points. One: The company has been far too slow to restrict the reach of figures like Trump and his toadies, who use the platform to spread damaging disinformation — “the election was rigged, vaccines are dangerous, climate change is a hoax,” etc.

Two: Facebook doesn’t just tolerate disinformation. The company employs powerful and secret algorithms to amplify its impact by promoting posts that trigger anger and outrage. These emotional reactions lead users to spend more time on Facebook, which in turn makes them far more valuable to advertisers. That’s what Haugen means by putting “profits before people.”

At the core of this debate is the “harm principle,” articulated by the 19th-century British philosopher John Stuart Mill. The nonprofit Ethics Centre defines it this way: “The harm principle says people should be free to act however they wish unless their actions cause harm to somebody else.”

The harm done by Facebook abusers is obvious. Disinformation about vaccines, for example, can cost countless lives. Therefore, limiting how those abusers are free to act is certainly justified.

But here’s the problem: Who gets to define “harm”? What standards are used in reaching that judgment? And how is that definition applied to real-life situations?

None of the answers are easy. But they are critical to the functioning of a healthy democracy. Overly harsh restrictions on free speech can be even more detrimental than overly timid ones. So what are the options?

Platforms like Facebook could regulate themselves, but as Haugen notes, her former employer has largely failed to do that. The profit motive is simply too powerful. And in fact, the company's Maximum Leader, Mark Zuckerberg, who controls a majority of Facebook's voting shares, largely agrees with her.

Facebook hosts nearly 3 billion monthly users, and Zuckerberg has often said that he and his brainchild should not be the “arbiters of truth.” Amen to that.

“Every day, we make decisions about what speech is harmful, what constitutes political advertising, and how to prevent sophisticated cyberattacks,” he has written. “But if we were starting from scratch, we wouldn’t ask companies to make these judgments alone. I believe we need a more active role for governments and regulators.”

But is that really the answer? Call me old-fashioned, but I kind of like the First Amendment, which says pretty bluntly, “Congress shall make no law … abridging the freedom of speech.”

"There oughta be a law!" is not always the right answer to a public policy crisis. In fact, it often is not. Should the partisan politicians who run the government have the power to define what counts as harmful speech, and therefore the power to dilute free expression itself?

One promising third option is the Oversight Board created by Facebook, a panel of 20 independent experts who are empowered to make critical decisions for Zuckerberg & Co. But that concept has flaws, too. The board recently issued a report accusing Facebook of not being “fully forthcoming” about its policies toward prominent platform users.

Another reasonable alternative: legislation that would force Facebook to be far more transparent about the algorithms it employs, which can spread so many toxic falsehoods so quickly.

As policymakers grapple with how to apply Mill’s “harm principle” to the digital space, they should remember another version of that idea, contained in an adage often preached to young doctors: “First, do no harm.”

Steven Roberts teaches politics and journalism at George Washington University. Contact him at stevecokie@gmail.com.