Facebook, Twitter Take Action to Maintain Election Integrity as Voting Continues
With the US Presidential Election hanging in the balance, both Facebook and Twitter have been working to stay on top of potential misinformation in election-related discussion, in order to keep people informed of what’s happening in the process.
Facebook, as it had outlined last month, is now displaying a prominent banner in the news feeds of all US users, on both Facebook and Instagram, which states that the votes are still being counted, in order to counter premature claims of victory.
Facebook also says that, as soon as President Trump raised questions about the vote-counting process early this morning, it began automatically applying labels to both candidates’ posts to keep people informed of the current state of the count.
This is an important measure, which could help to quell public unrest. Facebook had been preparing for such an occurrence over the past couple of months, given the President’s long-held criticism of the voting process, and by reminding all users that the vote count is still underway, the prompt could help to counter rising angst over the pending result.
But Facebook’s election day didn’t go off without a hitch.
Over on Instagram, many users saw an incorrect prompt in their feeds yesterday, which stated that ‘Tomorrow is Election Day’ when voting was, in fact, already underway that day, which could have confused some voters.
Instagram explained that this was a caching issue, which impacted a small subset of users.
As explained by Instagram chief Adam Mosseri:
I don’t know the number, but we’ve restarted the app for everyone in the US to get as many people to see the accurate “It’s the last day to vote” banner at the top of Instagram as possible, and we’ve sent an “It’s Election Day” notification to everyone for good measure.
— Adam Mosseri (@mosseri) November 3, 2020
Of course, it’s impossible to know what the impact of such an error might be, but hopefully, the overwhelming election coverage countered any potential misunderstanding in this respect.
Facebook has also been forced to clarify the specifics of its stance on premature claims of victory – in Florida, Governor Ron DeSantis claimed victory for President Trump in the state ahead of any official announcement, which Facebook allowed because DeSantis was referring to the state poll only, not the Presidential election.
Which seems an odd qualifier to make – as Facebook explained to The Wall Street Journal, its existing policy against premature declarations of victory was only intended to apply to the overall result of the presidential election, not to individual state polls. But the final result will come down to state-based declarations, and if the Trump campaign is allowed to amplify premature claims made by state representatives, the practical outcome is essentially the same.
Either way, Facebook is allowing this exception within its approach to premature claims.
Twitter, meanwhile, has been furiously adding warning screens to tweets claiming victory, or questioning the process, as the final counts continue.
President Trump’s tweets have been a key focus in this respect.
But Trump’s tweets are not the only ones under scrutiny – Twitter says that it’s added warning screens to many tweets from various candidates.
As per Twitter:
“Last night, we took quick action to limit engagement on a number of Tweets that may have needed more context or violated the Twitter Rules. Our teams continue to monitor Tweets that attempt to spread misleading information about voting, accounts engaged in spammy behavior, and Tweets that make premature or inaccurate claims about election results. Our teams remain vigilant and will continue working to protect the integrity of the election conversation on Twitter.”
We won’t know the impact of any of these efforts until after the results are known, and in the wash-up, we may be able to get a better idea of the overall influence that social media activity had on the outcome this time around.
But you can definitely expect more questions. Reports yesterday, for example, indicated that YouTube’s efforts to reduce misinformation within its ‘Up Next’ recommendations ended up favoring content from Fox News more heavily, which is strongly biased towards the right.
Did that influence voter behavior leading into the poll?
It does seem that, for the most part, the additional efforts from the social platforms to reduce misinformation and voter manipulation have had an effect. The increased voter turnout – likely the largest in 100 years – also suggests that the efforts to get more people to the polls have produced results, which should mean that the final outcome is more reflective of the broader US citizenry.
But with social media platforms, and potential manipulation efforts, largely removed from the equation this time around, that also leaves the US with nowhere to point the finger, and significant internal division to address moving forward.
We’ll no doubt hear more on both elements in the coming months.