
Why Facebook suddenly got tough

Fully Charged
Bloomberg

Hey everyone, it's Kurt. Last week, Facebook Inc. all but banned QAnon. Then it kept going.

The largest social-media company started pulling posts that call for people to aggressively police polls on election day. It blocked ads discouraging vaccine use, then reversed a longstanding policy that allowed posts denying the Holocaust. And on Wednesday, Facebook limited the reach of a New York Post article about Joe Biden, infuriating conservatives. 

Long criticized for being too slow and precious in its fight against potentially dangerous content, Facebook is suddenly taking a stand. Why now? The obvious answer is that the most important political event in Facebook's history, the 2020 U.S. presidential election, is weeks away. The company is trying to clean up misinformation before people vote.

A more nuanced answer is that Facebook is enforcing a policy it's had for years. If a post has the potential to cause real-world harm, it's often removed. Most other content, no matter how odious, usually stays up.

If you look at Facebook's recent decisions, and comments from executives, it's clear the company is worried that the threat of real-world harm is increasing as the election approaches. Leaders, including Chief Executive Officer Mark Zuckerberg, are worried that violence could erupt on election night or the days after. "This election is not going to be business as usual," he wrote in September. Everyone has a role to play in protecting democracy, including "taking steps to reduce the chances of violence and unrest," he added.

Banning QAnon, a group known for violent rhetoric, is a step toward preventing that. When Facebook announced the move, it specifically referenced content "tied to different forms of real-world harm."

Facebook's decision to prohibit calls for civilians to police polling sites is similar because it focuses on "militarized language." President Donald Trump and his son, Donald Trump Jr., have encouraged people to join an "army" to protect the polls. Facebook is worried comments like these might lead people to act violently.

"What this shows is growing concerns and threats of harm in the real world and our work to make sure we have policies and enforcement operations in place to address those," a company spokeswoman said.

It's good that Facebook is making these changes, but also frustrating that it took so long. Using real-world harm as a benchmark might not be enough. Would QAnon be such a real threat if Facebook had blocked the group a year ago?

Facebook may be learning its lesson. Anti-vaccination content has always been potentially dangerous, and it's wild that Facebook is only now blocking paid promotions of it. But fast-forward a few months or a year, to when there's a safe Covid-19 vaccine, and a ban on anti-vaccination ads looks more proactive.

On Wednesday, Facebook also diminished the spread of a New York Post story that contained allegedly damaging information about Biden. The company was accused of censorship by Trump and other conservatives. But if the story ends up being disproven -- at least one Facebook executive suggested it may have been planted by a foreign government -- the company will look prudent for proactively cutting down its reach.

Facebook is terrified of looking up on Election Day and realizing that content from its platform led to some kind of physical harm. Hopefully the company's recent steps are enough. --Kurt Wagner

If you read one thing

Facebook Slows Spread of N.Y. Post Biden Story to Fact-Check
Facebook and Twitter Inc. both tried to reduce the distribution of a story from the New York Post about Democratic presidential candidate Joe Biden, citing misinformation and hacked documents policies. President Trump called the situation "so terrible."

And here's what you need to know in global technology news

Amazon Workers Say Prime Day Rush Breaks Virus Safety Vows
Amazon.com Inc. has recklessly reinstated dangerous warehouse productivity quotas despite telling a judge that it was suspending them during the pandemic, workers said in a court filing.
Twitter, Like Facebook, to Remove Posts Denying the Holocaust
Twitter Inc. will remove posts that deny the Holocaust for violating its hateful conduct policy, according to a company spokeswoman.
Zoom Wants to Partner With, Not Defeat, Slack and Microsoft
Zoom Video Communications Inc. Chief Executive Officer Eric Yuan said the company will do a better job integrating office chatroom products from Slack Technologies Inc. and Microsoft Corp., betting that cooperation is better than competition for the software maker's growth.
