Cokie and Steven V. Roberts: Challenges for Facebook

Published 12:00 am Saturday, August 4, 2018

By Cokie Roberts and Steven V. Roberts

Facebook founder Mark Zuckerberg told tech writer Kara Swisher how the social media platform would handle contentious political information in coming elections.

“There are really two core principles at play here,” Zuckerberg told Swisher, executive editor of the website Recode. “There’s giving people a voice, so that people can express their opinions. Then there’s keeping the community safe, which I think is really important. We’re not gonna let people plan violence or attack each other or do bad things. Within this, those principles have real trade-offs and real tug on each other.”

How Facebook resolves the tension Zuckerberg describes could have a major impact on future elections. After all, the platform has 2 billion users around the world and, along with Google and Twitter, has emerged as a primary source of information for many voters.

Facebook’s growing importance as a political player was highlighted by the recent indictment of 12 Russian intelligence officers for hacking into the computers of Democratic campaign organizations and using stolen information to help elect President Donald Trump. Russian agents also used Facebook to spread their propaganda, buying at least $100,000 worth of paid ads through 470 phony accounts.

And the Russians are still conducting their campaign of subversion, no matter how often Trump denounces the story as “a big hoax.” The president’s own director of national intelligence, Dan Coats, made that completely clear when he called out Moscow’s “ongoing, pervasive efforts to undermine our democracy.”

The “trade-offs” Zuckerberg describes are as old as the republic. Leaders have always had to reconcile two profoundly American values — the right to know and the right to be safe — and most of the time, they have tilted the balance in favor of free speech.

Zuckerberg long argued that Facebook was only a “utility,” like a phone company, that simply transmitted information and had no editorial responsibility. As late as November 2016, he said it was a “pretty crazy idea” that fake news on Facebook had influenced the election — a ridiculous statement for which he now apologizes.

Twenty months later, Zuckerberg is coming to understand the enormous power Facebook has acquired. The question is how that power is employed. Fortunately, he is rejecting calls for censorship, even against odious or hate-filled speech.

“The approach we’ve taken to false news is not to say, ‘You can’t say something wrong on the internet,’” Zuckerberg told Swisher. “I think that would be too extreme.”

But in rare cases, he concedes, such as terrorists planning an attack, or a person threatening suicide, the “trade-offs” tilt in favor of safety, and censorship is justified: “The principles that we have on what we remove from the service are: If it’s going to result in real harm, real physical harm, or if you’re attacking individuals, then that content shouldn’t be on the platform.”

Most of the time, however, the choice facing Facebook is not binary; it’s not between keeping some voices on the platform or throwing them off. Careful judgments can and should be made that preserve both values: free speech and public safety.

The answer in many cases is not to ban types of speech, and few want Zuckerberg and Co. to have that kind of power. Rather, as he puts it, Facebook has “a responsibility to mitigate the darker things that people are gonna try to do.”

For example, Facebook has partnered with a number of reputable fact-checking organizations, such as Snopes and PolitiFact, to evaluate its content and post warnings on false statements. The company is also reducing the prominence of false stories in each member’s daily news feed and making it harder for purveyors of “fake news” to sell ads on the platform.

But Facebook needs to make fact-checking a much higher priority, and a larger budget item, so consumers have more data about what they’re reading and seeing.

Legislation has been introduced in Congress that would require Facebook to disclose more about who buys political ads, the sort of transparency already required of TV stations. But the platform should adopt those standards on its own, even if doing so hurts its ad business.

Writing in The Atlantic, Yair Rosenberg makes an excellent suggestion: rather than banning “hateful information,” Facebook should encourage “counter-programming,” akin to the surgeon general’s warning on cigarette packs detailing the dangers of smoking. “The trolls would find themselves trolled,” he writes.

The answer to “hateful information” is usually more, and better, information. Then people can hear many voices — and still feel safe at the same time.

Steve and Cokie Roberts can be contacted at stevecokie@gmail.com.