John Foreman on Facebook's data mining and manipulation

in science on (#3Q1)
Yes, everyone is angry that Facebook manipulated 700,000 users' timelines to help researchers test whether emotions are contagious. But John Foreman isn't just annoyed about it. As a data scientist, he's thought deeply about where this is headed, and it's not good. He brings up the Kuleshov Effect, first described by Soviet filmmaker Lev Kuleshov: individual images matter less than the sequence in which they are stitched together. And he concludes:
In this particular study in PNAS, we can see that the promise of data modeling at Facebook is not to "let humans be more human." It's not to "free their minds."

All of that machine reasoning isn't trying to make us more human so much as it is trying to make us more sad and predictable. And just wait until deep learning applied to image recognition can recognize and stream my selfie at Krispy Kreme next to a tagged photo of me and my love handles at the beach. Data-driven inferiority complexes for all!

The promise of data modeling at Facebook is to place us in chains made from the juxtaposition of our own content. We'll be driven into pens made of a few profitable emotional states where marketing content waits for us like a cattle gun to the skull.
This is an interesting read. Not surprising, but disappointing nonetheless. Time to ditch Facebook and spend more time on Twitter? He's got thoughts about that too.

Re: And? (Score: 1)

by skarjak@pipedot.org on 2014-07-07 13:15 (#2C4)

We have plenty of laws in place to protect people from their own idiocy. That's not always such a bad thing: while you may not feel empathy for them, it's probably best not to reward companies for exploiting others, because you might end up being their next target.