Facebook Announces Plan to Protect Democracy in Run Up to 2020 Election

The company has analyzed over 80 potential scenarios in which its platform might be used to compromise the results


Social media is, at this point, integral to our lives both socially and politically. It has given people a voice, even those who maybe shouldn’t really have one, and it is often many individuals’ primary source of news.

After the Cambridge Analytica scandal – when Facebook carelessly allowed a political consulting firm to harvest the personal information of millions of users and their friends by paying a much smaller group to take a survey, with a major impact on both the UK Brexit referendum and the 2016 US presidential election – and an influx of fake news and disinformation campaigns in recent months, I think it’s safe to say that, on some level, social media poses a threat to democracy.

In the run-up to the 2020 election, though, it seems social media giant Facebook is on the path to redemption. Last week, Facebook CEO Mark Zuckerberg published the company’s plan to “protect democracy”.

“The US elections are just two months away, and with Covid-19 affecting communities across the country, I’m concerned about the challenges people could face when voting,” the post reads. “I’m also worried that with our nation so divided and election results potentially taking days or even weeks to be finalized, there could be an increased risk of civil unrest across the country.”

In light of such worries, Zuckerberg has set out a framework for fighting misinformation, encouraging people to register and vote, and helping the country cope with the novel experience of voting in the midst of a pandemic.

Combating Misinformation

Citing the Cambridge Analytica scandal (and some other instances of foreign interference) in all but name, Zuckerberg acknowledged the threat of both foreign and domestic groups in spreading disinformation and propaganda to impact voter decisions. “…four years ago we encountered a new threat: coordinated online efforts by foreign governments and individuals to interfere in our election,” he wrote. “However,” he adds, “we’re increasingly seeing attempts to undermine the legitimacy of our elections from within our own borders.”

In order to combat meddling from both inside and outside of the US, the company will be implementing a series of policies and procedures during and after the election campaign.

Namely, Facebook will ban new political and issue ads during the final week of the campaign (ads were a central aspect of the Cambridge Analytica scandal). Facebook’s reasoning is pretty solid: “…In the final days of an election there may not be enough time to contest new claims,” Zuckerberg wrote. The new rule means that every ad running in the week before the election will already have been vetted and will be publicly available in Facebook’s Ad Library, so fact-checkers and journalists can scrutinize it properly and catch any misrepresentation of facts.

The company has already started working with fact-checkers, and will continue to do so. These partners will identify and label misinformation about polling conditions, voter rights and suppression, and any use of Covid-19 to scare people away from voting for fear of getting sick.

Facebook has also brought to Messenger a policy already rolled out on WhatsApp during a wave of Covid-19 misinformation earlier in the pandemic: limiting the number of chats a message can be forwarded to at one time. This is intended to reduce “the risk of misinformation and harmful content going viral.”

These measures will stay in place in the days after election day, “since some states may still be counting valid ballots after election day [and] many experts are predicting that we may not have a final result on election night.”

Mitigating Violence

It’s starting to seem like Facebook and violence go hand in hand: the platform has been a major facilitator of violence in recent years despite having policies in place to prevent it. But after the killing of two protesters by an alleged vigilante was closely linked to a Facebook group that the company had failed to take down despite complaints from other users, Facebook is adjusting its policies again, particularly to try to mitigate the unrest that delayed election results might spark. “This could be a very heated period,” wrote Zuckerberg, “so we’re preparing the following policies to help in the days and weeks after voting ends.”

To protect election officials, Facebook will include them in its definition of high-risk users. Zuckerberg says this will help prevent attempts to pressure or harm them, “especially while they’re fulfilling their critical obligations to oversee the vote counting.”

As for Facebook groups such as the Kenosha Guard and those tied to dangerous conspiracy theories, the company says it has already “strengthened [their] enforcement against militias, conspiracy networks like QAnon, and other groups that could be used to organize violence or civil unrest in the period after the elections.” Thousands of these groups have already been removed or blocked, says Zuckerberg, and thousands more have been pulled from recommendations and search results. Facebook plans to ramp this up even further in the coming weeks and months, hopefully enforcing stricter policies that close the grey areas which have previously spilled over into real-life violence.

Avoiding the Delegitimization of Results

Earlier in the summer, President Trump took to social media more than once (as he always does) to question the legitimacy of mail-in voting. He claimed that it would negatively impact the Republican vote (a claim election experts have debunked), that votes could be lost and/or stolen, and, in June, he tweeted that mail-in ballots could be printed by foreign countries aiming to rig the election.

Just last month, Mr Trump tweeted: “Mail ballots are very dangerous for this country because of cheaters. They go collect them. They are fraudulent in many cases.” Zuckerberg had to acknowledge that, although this was uncharted territory, the company might have to crack down on the President’s own social media posts, especially those peddling lies.

Speaking to the New York Times, Zuckerberg said: “there’s a high likelihood that it takes days or weeks to count this — and there’s nothing wrong or illegitimate about that.” He has now committed to combating any posts that might work to delegitimize the election results.

“We’ll use the Voting Information Center to prepare people for the possibility that it may take a while to get official results,” reads the recent announcement. “This information will help people understand that there is nothing illegitimate about not having a result on election night.”

The company will also partner with Reuters and the National Election Pool to “provide authoritative information about election results.” Facebook will update its Voting Information Center with information about whether official results are in, and will notify people “proactively” as and when results become available. On top of this – and in a markedly positive move for Facebook – the company is gearing up to shut down any false claims by candidates about whether or not they have won: “Importantly, if any candidate or campaign tries to declare victory before the results are in, we’ll add a label to their post educating that official results are not yet in and directing people to the official results.”

Facebook has long been wary of limiting free speech on the platform and has often failed to act for fear of being accused of censorship and “anti-conservative bias.” Given the real possibility that Facebook will have to label some of President Trump’s posts, this looks like a big step forward for the company.

Rallying Voters

In a refreshingly positive policy, Facebook will leverage its platform to encourage eligible people to register and vote in the 2020 election through the largest voting information campaign in American history. The company’s ultimate goal is to help four million people register and vote, and it has already driven 24 million clicks to voter registration websites.

Facebook will also keep “authoritative” voting information at the top of both Facebook and Instagram in the run-up to the election. According to the post, this will include “video tutorials on how to vote by mail, and information on deadlines for registering and voting in your state.”

Given the impact Facebook has had on past elections, these clear policies are both refreshing and reassuring. Operating in an increasingly digital world poses risks that can’t always be predicted, but after examining over 80 possible threats to the 2020 election, Facebook hopes to mitigate almost any issue that could delegitimize the results or mislead voters about what they are actually choosing. Maybe this election, we’ll see social media truly help – not hinder – democracy.