
Seven types of election misinformation to watch out for

by Abby Ohlheiser, from MIT Technology Review (#59X52)

There was a time when misinformation was thought of as something that fought its way from the fringes into the mainstream, as if it lived in a darker parallel reality that was waiting to invade our own. That's never been quite right, but in 2020 it's an obvious misconception.

Cornell researchers recently identified President Trump as the biggest driver of covid-related misinformation. The president's Twitter account has become a hub for falsehoods, both his own and those from the conspiracy-tinged accounts he's made a habit of retweeting. And as the Trump administration repeatedly shares and promotes misinformation about mail-in voting, the mainstream media has become its biggest amplifier.

Misinformation" and disinformation" aren't even great terms for what's going on anymore, as the same words can include systemic campaigns by governments to suppress voters, harmful conspiracy theories with connections to real-world violence, and dumb hoaxes that are pushed into the river of online information for no good reason.

But we do know that there will be bad information, coordinated campaigns, and attempts to amplify harmful content as far as possible through the election and beyond. So here's a list of some of the things you might encounter this week.

Unsourced rumors from polling locations

At the start of Election Day, bad actors will be trying to keep voters away from the polls. There's already a decent amount of misinformation out there about poll watchers, but experts are worried about the effect those recruited by the Republican Party to watch for "fraud" could have on minority voters in Democratic-leaning districts. Angelo Carusone, the president of Media Matters for America, a left-leaning media watchdog group that has been tracking far-right misinformation campaigns, said last week that he was particularly worried about what happens if those poll watchers start issuing misleading or false reports from polling locations themselves. "The whole idea of being a poll watcher is that you are credentialed in some way to sound the alarm," he said. "It has a better chance of breaking through the media, especially local media."

Lyric Jain, the founder of the UK-based verification app Logically, said their researchers will be tracking a few types of rumors about polling locations, including those related to the pandemic.

"What we're likely seeing is reports of fake covid outbreaks at polling stations," he said. He also expects unsourced rumors of inadequate safety procedures at local polling places. Other experts we spoke to were a bit more skeptical of the likelihood of these rumors having much impact on Election Day, in part because these rumors have been present during early voting without really catching a bigger audience. However, Jain guessed that "we're likely to see a lot more of that on Election Day itself," and they'll be harder to fight in real time if they do catch some online traction.

Fake and out-of-context screen shots, videos, and images

Misinformation doesn't have to be entirely fabricated. Experts have warned voters to keep an eye out for decontextualized media too. Videos, images, and even news articles can be removed from their original context and deployed to make a questionable narrative sound more credible.

So, for instance, a recent tweet from Richard Grenell, a former ambassador to Germany and former acting director of national intelligence under Trump, claimed to show Joe Biden wearing a mask outside but not wearing one on a plane, in order to prove his hypocrisy. The two photos tweeted by Grenell do indeed show Biden on a plane without a mask ... in 2019, before the pandemic.

Other uses of decontextualized media include sharing article after article of voter fraud stories from years ago as if they're happening now, in order to give the impression that the election results are compromised by widespread fraud that doesn't exist. Or videos of long lines and unrest at polling stations when the images are from different elections, or from incidents that are unrelated to elections at all. Screen shots are also easy to fabricate and repurpose, so be skeptical of friends posting screen grabs of conversations with their friend of a friend of a friend who is a poll worker.

If you're hell-bent on sharing something, there are a few questions you can ask yourself about a video or photo you're seeing on Election Day: Is this video from a reporter at an outlet you trust, and is the reporter at the polling place in question? Can you find other videos or reports of the same incident from people who appear to actually be there? Have news sources or fact-checking organizations you trust been able to verify that this happened?

For photos in particular, try running the content through a reverse image search service like Google Images, which allows you to figure out whether it's been used before elsewhere.
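If you'd rather not do that check by hand every time, it can also be scripted. Here's a minimal sketch in Python that simply opens a reverse image search in your browser; it assumes Google's "search by image" URL format, which isn't an official API and may change, and the image address used is purely a hypothetical example. Pasting the photo into images.google.com manually works just as well.

```python
# Minimal sketch: open a reverse image search for a photo you've seen online.
# Assumes Google's "search by image" URL format (not an official API; may change).
import webbrowser
from urllib.parse import quote

def reverse_image_search(image_url: str) -> None:
    """Open a browser tab searching for other places this image has appeared."""
    search_url = "https://www.google.com/searchbyimage?image_url=" + quote(image_url, safe="")
    webbrowser.open(search_url)

if __name__ == "__main__":
    # Hypothetical example: a photo circulating in a viral post about polling-place lines.
    reverse_image_search("https://example.com/viral-polling-place-photo.jpg")
```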

Rumors from private groups

In 2016, people were primarily concerned with misinformation's ability to go "viral." But in this election, private online spaces are a popular place for misinformation to spread, and much harder to track.

"It makes for very different problems that a lot of apps, a lot of platforms aren't quite ready to deal with," Jain says. Private Facebook groups had helped health misinformation find an audience for years before the election, and those groups remained popular into the pandemic.

Nina Jankowicz, a disinformation fellow at the Wilson Center, told NPR over the summer that private groups are particularly susceptible to breeding misinformation because of the ways in which those groups succeed as communities. "The moderators of groups use the community that they build there to create a sense of trust," she said. In some cases, these are "really polarizing environments," full of content that is "really indoctrinating there."

QAnon's success over the summer in reaching the mainstream demonstrated just how well this can work in spaces that are unrelated to politics: mom groups and wellness communities were particularly susceptible to some of QAnon's more mainstream-friendly campaigns.

At the end of October, Facebook announced that it was suspending recommendations to users to join groups themed around political or social issues. Researchers have long warned that algorithmic group recommendations (basically, suggesting you join x or y group based on your interest in z) play a part in bringing users deeper into conspiratorial and extreme thinking.

Repeat offenders with big followings

The Election Integrity Partnership, a coalition of researchers working to combat election-related misinformation and disinformation in real time, has identified a number of "repeat offender" Twitter accounts with large followings that have regularly shared or engaged with misleading narratives about the election. Those accounts, which include @realdonaldtrump, are largely familiar faces in the pro-Trump universe, like Charlie Kirk and Sean Hannity. The list also includes the handles of several figures connected to pro-Trump media with a history of amplifying misinformation, and Breaking911, a viral news account with a weird history that no one should trust.

The repeat offenders identified by EIP are particularly adept at reframing things that were originally true by quote-tweeting them and adding their own narrative, the report notes. That effort is aided by write-ups from outlets like Gateway Pundit and Breitbart, which pick up on misleading narratives about real incidents and help spread them to larger audiences.

Local news, kind of

As local newspapers diminish their coverage or shut down altogether, there's some pretty compelling evidence that bad actors and PR firms are exploiting the news vacuum that creates. The New York Times reported in October on a large network of sites that, at first glance, appear to be local papers with names like "Maine Business Daily" or "Ann Arbor Times." In fact, however, those sites are part of "a proliferation of partisan local-news sites funded by political groups associated with both parties," the report said.

So know the reliable local news sources in your area-and watch out for new sources that suddenly appear. If you're curious about whether a hyperpartisan site is pretending to be a newspaper in your area, Nieman Lab published a map of such sites over the summer. While it doesn't appear totally up to date, it's a good starting point.

Media Matters said it is also concerned about Sinclair Broadcast Group, the conservative company that owns local television stations across the United States. The company has, in the past, ordered anchors to read a script on air about the mainstream media that echoes Trump's views about "fake news." Eric Bolling used an episode of his Sinclair show to spread misinformation about covid. The show, which had already been posted online to multiple Sinclair station websites, was edited to remove some of those claims before it aired on television.

Trump's campaign

You shouldn't rely on any political campaign to give you the results of the election, but Axios reported this weekend that Trump's campaign may be planning on declaring victory on election night if the results at that moment are favorable to him, despite knowing that uncounted ballots in key states could shift the results.

Is saying this partisan? As a journalist, whenever you write about political misinformation, someone's going to tell you you're showing your bias by scrutinizing the misinformation coming from the Trump campaign more than the Biden campaign. But the situations are plainly not equivalent. One campaign in this race has embraced misinformation as a political tactic and repeatedly spread falsehoods about mail-in voting, and the other has not.

Your well-meaning friends

Well-meaning people are also fully capable of spreading misinformation. Whether it's re-sharing a racist or otherwise awful meme in order to condemn it, or getting caught up in the hype of believing that every glitching voter machine is a sign of a hacked election or every slightly off thing on Twitter is a Russian bot, it's possible that some of the people sharing bad information in your feed today will be people you know and trust.

We've gone into some of these traps (and how to avoid them) before. While it's depressing to see people who should know better share misinformation on the internet, the good news is that, experts believe, this is exactly the sort of online terribleness that you, the individual, can do something about. If you feel up to it, respectfully providing context or skepticism in response to one of your friends' suspicious posts can help slow the spread of a false narrative.
