How Should Social Media Handle Election Polls That Turned Out To Be Misinformation?
It appears that the various election polls predicting Joe Biden would become the 46th President of the United States eventually proved accurate -- the current President's temper tantrum notwithstanding -- but that doesn't mean the polls did a good job. In fact, most people now recognize that the pollsters were wrong in many, many ways. They predicted a much bigger win for Biden, including in multiple states that easily went to Trump. They completely flubbed many down-ballot House and Senate races as well. Pollsters are now trying to figure out what went wrong and what these misses mean, coming on the heels of a similar set of bad predictions in 2016. It's likely there isn't any single answer, but rather a variety of contributing factors.
However, what interests me is a simpler point: the major polls turned out to be widely shared misinformation -- incorrect information about the election, spread all over social media -- some of which almost certainly had the potential to impact voting behavior.
Now, to be clear, I'm not saying the polls were disinformation deliberately spread with the knowledge that it was false. I'm saying they were misinformation. Information that turned out to be false, but was spread, often widely, by those who believed it or wanted to believe it. And, it was exactly the kind of misinformation that had a decent likelihood of impacting voting behavior.
But that raises a big question. Many people (including many in the media and a few legislators) are demanding that social media websites "crack down" on "misinformation," especially with regard to elections -- and polling that turned out to be misinformation presents something of a challenge to that demand. I think most people would say it would be crazy to argue that social media shouldn't allow polling information to spread (or even go viral). Yet, with so many people calling for a crackdown on "misinformation," how do you distinguish the two?
Some will argue that they only mean the kinds of misinformation that are spread with ill intent, though that quickly shades into disinformation, or requires social media companies to be the arbiters of "intent," which is not an easy task. Others will argue that polling is more "well meaning" information, or that it's merely a prediction. But lots of other misinformation could fall into those categories as well. Or some might argue that accurately reporting on what the polls say isn't misinformation -- since it's accurate reporting, even if the results don't match the predictions. But, again, the same could be said for other predictive bits of misinformation.
In short: almost any test you might use to distinguish these polls can be applied right back to other forms of misinformation.
I raise this issue primarily to ask that people think much more carefully about what they're asking for when they demand that social media sites moderate "misinformation" -- especially with an incoming Biden administration that has already suggested that one of its policy goals is to target misinformation online. It's one thing to call for that, but it's another thing altogether to define misinformation in a manner that doesn't sweep in plenty of perfectly legitimate information -- such as these misleading polls. At the very least, we should start by distinguishing the important differences between misinformation and disinformation.
Perhaps, rather than demanding that the first response to misinformation be removal, we should think about ways to add more context around it instead.