John Foreman on Facebook's data mining and manipulation

by
in science on (#3Q1)
Yes, everyone is angry that Facebook manipulated 700,000 users' timelines to help researchers prove emotions are contagious. But John Foreman isn't just annoyed about it. As a data scientist, he's thought deeply about where this is headed, and it's not good. He brings up the Kuleshov Effect, first demonstrated by the Soviet filmmaker Lev Kuleshov: individual images matter less than the sequence in which they are stitched together. And he concludes:
In this particular study in PNAS, we can see that the promise of data modeling at Facebook is not to "let humans be more human." It's not to "free their minds."

All of that machine reasoning isn't trying to make us more human so much as it is trying to make us more sad and predictable. And just wait until deep learning applied to image recognition can recognize and stream my selfie at Krispy Kreme next to a tagged photo of me and my love handles at the beach. Data-driven inferiority complexes for all!

The promise of data modeling at Facebook is to place us in chains made from the juxtaposition of our own content. We'll be driven into pens made of a few profitable emotional states where marketing content waits for us like a cattle gun to the skull.
This is an interesting read. Not surprising, but disappointing nonetheless. Time to ditch Facebook and spend more time on Twitter? He's got thoughts about that too.

Re: And? (Score: 0)

by Anonymous Coward on 2014-07-07 16:11 (#2C6)

Again, Facebook did nothing wrong legally, morally, technically, or contractually. If people want to introduce legislation on how companies mine their own data, let them do so. For now, simply exposing FB as an evil festering pile of dung is sufficient.