January’s riot at the U.S. Capitol showed the damage that can result when millions of people believe an election was stolen despite no evidence of widespread fraud.
The Election Integrity Partnership, a coalition of online information researchers, this week published a comprehensive analysis of the false narrative of the presidential contest and recommended ways to avoid a repeat.
Internet companies weren’t solely to blame for the fiction of a stolen election, but the report concluded that they were hubs where false narratives were incubated, reinforced and cemented.
I’m going to summarize here three of the report’s intriguing suggestions for how companies such as Facebook, YouTube and Twitter can change to help create a healthier climate of information about elections and everything else.
One broad point: It can feel as if the norms and behaviors of people online are immutable and inevitable, but they’re not.
Digital life is still relatively new, and what’s good or toxic is the result of deliberate choices by companies and all of us.
We can fix what’s broken. And as another threat against the Capitol this week shows, it’s imperative we get this right.
A higher bar for people with the most influence
Kim Kardashian can change more minds than your dentist. And research about the 2020 election has shown that a relatively small number of prominent organizations and people, including President Donald Trump, played an outsize role in establishing the myth of a rigged vote.
Currently, sites like Facebook and YouTube mostly consider the substance of a post or video, divorced from the messenger, when determining whether it violates their policies.
World leaders are given more leeway than the rest of us, and other prominent people sometimes get a pass when they break the companies’ guidelines.
This doesn’t make sense.
If internet companies did nothing else, it would make a big difference if they changed how they treated the influential people who were most responsible for spreading falsehoods or twisted facts — and tended to do so again and again.
The EIP researchers suggested three changes: create stricter rules for influential people; prioritize faster decisions on prominent accounts that have broken the rules before; and escalate consequences for habitual superspreaders of bogus information.
YouTube has long had such a “three strikes” system for accounts that repeatedly break its rules, and Twitter recently adopted versions of this system for posts that it considers misleading about elections or coronavirus vaccinations.
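For illustration, the escalating-consequences logic behind a strike system can be sketched in a few lines of Python. The thresholds and penalties below are hypothetical, chosen only to show the shape of the idea, not any company’s actual policy:

```python
# Hypothetical sketch of a "three strikes" escalation policy.
# The strike thresholds and consequences are illustrative only,
# not the real rules of YouTube, Twitter, or any other platform.
from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    strikes: int = 0

def record_violation(account: Account) -> str:
    """Add a strike and return the escalating consequence."""
    account.strikes += 1
    if account.strikes == 1:
        return "warning"               # first offense: warn and label the post
    elif account.strikes == 2:
        return "temporary suspension"  # repeat offense: time-limited lockout
    else:
        return "permanent ban"         # habitual offender: remove the account

acct = Account("example_account")
print([record_violation(acct) for _ in range(3)])
# → ['warning', 'temporary suspension', 'permanent ban']
```

The point of such a design is that consequences scale with repetition: a first-time violation is treated differently from a pattern, which matches the researchers’ suggestion to escalate penalties for habitual superspreaders.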
The hard part, though, is not necessarily making policies. It’s enforcing them when doing so could trigger a backlash.
Internet companies should tell us what they’re doing and why
Big websites like Facebook and Twitter have detailed guidelines about what’s not allowed — for example, threatening others with violence or selling drugs.
But internet companies often apply their policies inconsistently and don’t always provide clear reasons when people’s posts are flagged or deleted.
The EIP report suggested that online companies do more to inform people about their guidelines and share evidence to support why a post broke the rules.
More visibility and accountability for internet companies’ decisions
News organizations have reported on Facebook’s own research identifying ways that its computer recommendations steered some users toward fringe ideas and made people more polarized.
But Facebook and other internet companies mostly keep such analyses a secret.
The EIP researchers suggested that internet companies make public their research into misinformation and their assessments of attempts to counter it.
That could improve people’s understanding of how these information systems work.
The report also suggested a change that journalists and researchers have long wanted: ways for outsiders to see posts that the internet companies have deleted or labeled false.
This would allow accountability for the decisions that internet companies make.
There are no easy fixes to building Americans’ trust in a shared set of facts, particularly when internet sites enable lies to travel farther and faster than the truth.
But the EIP recommendations show we do have options and a path forward.