John Foreman on Facebook's data mining and manipulation

in science on (#3Q1)
Yes, everyone is angry that Facebook manipulated 700,000 users' timelines to see if they could help researchers prove emotions are contagious. But John Foreman isn't just annoyed about it. As a data scientist, he's thought deeply about where this is headed, and it's not good. He brings up the Kuleshov Effect, first demonstrated by Soviet filmmaker Lev Kuleshov: individual images matter less than the sequence in which they are stitched together. And he concludes:
In this particular study in PNAS, we can see that the promise of data modeling at Facebook is not to "let humans be more human." It's not to "free their minds."

All of that machine reasoning isn't trying to make us more human so much as it is trying to make us more sad and predictable. And just wait until deep learning applied to image recognition can recognize and stream my selfie at Krispy Kreme next to a tagged photo of me and my love handles at the beach. Data-driven inferiority complexes for all!

The promise of data modeling at Facebook is to place us in chains made from the juxtaposition of our own content. We'll be driven into pens made of a few profitable emotional states where marketing content waits for us like a cattle gun to the skull.
This is an interesting read. Not surprising, but disappointing nonetheless. Time to ditch Facebook and spend more time on Twitter? He's got thoughts about that too.

Re: And? (Score: 2, Insightful)

by zafiro17@pipedot.org on 2014-07-07 09:16 (#2C2)

You've got a good point, sort of. You mean they didn't put all this hardware and software online for my benefit only, so I could share pictures of my kid vomiting with friends and family?

There's a good XKCD about it: http://xkcd.org/1390/ "Research Ethics"