Facebook Declares BBC Article About French Political Polls 'Unsafe'
Lots of people have reasonable concerns about platforms like Facebook, which not only provide an avenue for free expression but also have the power to suddenly decide they won't allow certain forms of expression. Admittedly, there's always a line to be drawn somewhere. People are happy that Facebook tries to keep out spam and scams, but it's still worrying when the company seems to want to filter out perfectly legitimate news stories. On Sunday, Nadim Kobeissi tweeted that Facebook wouldn't allow the sharing of a BBC article on the latest political polling in France:
Facebook, you're going full Orwell. Never go full Orwell. pic.twitter.com/wsBpQNhMqE
- Nadim Kobeissi (@kaepora) December 6, 2015
I wasn't sure I believed it, so I tried to post that link to my own Facebook page and got a similar message.
Now, it's possible that there's a concern over rogue, dangerous ads on the BBC site -- though for many people the BBC displays no ads at all. It's also possible that Facebook's algorithms interpret news about the National Front party (politely described as "far right," but perhaps more accurately described as nationalist-to-racist) as somehow dangerous. But the mere fact that Facebook is magically declaring a news story "unsafe," without offering any details about why or how, is tremendously concerning.
And, again, this comes just after we've seen American politicians calling for Facebook and others to magically determine how to block "bad" content that might inspire terrorists. And it comes just as Google's Eric Schmidt argued that these kinds of filters should be more common. Yet examples like this show just how problematic such filters can be.
The more pressure put on companies like Facebook to do that kind of proactive filtering, the more likely it is that perfectly legitimate information and news stories, like the BBC story here, will get blocked. And that should be seen as immensely problematic if you believe in free expression and the ability to share ideas freely.