
Facebook’s vaccine stance is part of a familiar pattern, says author and NYTimes journalist

by Connie Loizos

Today, in a new report about "coordinated inauthentic behavior" on its platform, Facebook states that it last month removed hundreds of accounts across its Facebook and Instagram platforms that were tied to anti-vaccination disinformation campaigns operated from Russia. In one campaign, says the company, a newly banned network posted memes and comments claiming that the AstraZeneca COVID-19 vaccine would "turn people into chimpanzees." More recently, in May, the same network questioned the safety of the Pfizer vaccine by posting "an allegedly hacked and leaked AstraZeneca document," says Facebook.

The company publishes such reports as a reminder to the public that it is focused on "finding and removing deceptive campaigns around the world." Still, a new New York Times investigation into Facebook's relationship with the Biden administration suggests that the company continues to fall short when it comes to tackling misinformation, including, currently, vaccine misinformation.

We talked about that reported disconnect earlier today with Sheera Frenkel, a cybersecurity correspondent for the New York Times and recent co-author, with New York Times national correspondent Cecilia Kang, of "An Ugly Truth: Inside Facebook's Battle for Domination," which was published in June. Our conversation has been lightly edited for length.

TC: The big story right now about Facebook centers on its shutting down the accounts of NYU researchers whose tools for studying advertising on the network violated its rules, according to the company. A lot of people think those objections don't hold water. In the meantime, several Democratic senators have sent the company a letter grilling it about its decision to ban these scholars. How does this particular situation fit into your understanding of how Facebook operates?

SF: I was struck by how it fit a pattern that we really showed in [our] book of Facebook taking what seems like a very ad hoc and piecemeal approach to many of its problems. This action they took against NYU was surprising because there are so many others that are using data in the way that NYU is, including private companies and commercial firms that are using it in ways that we don't fully understand.

With NYU, the academics there were actually quite transparent about how they were collecting data. They didn't hide what they were doing. They told journalists about it, and they told Facebook about it. So for Facebook to take action against just them, just as they were about to publish research that may have been critical of Facebook and may have been damaging to Facebook, seems like a one-off thing and really gets to the root of Facebook's problems around what data the company holds about its own users.

TC: Do you have any sense that investigators in the Senate or in Congress may demand more accountability for more recent industry indiscretions, such as the events of January 6? Typically, there comes a point where Facebook apologizes over a public flap . . . then nothing changes.

SF: After the book came out, I spoke to one lawmaker who read our book and said, 'It's one thing if they apologized once, and we saw a substantial change happen at the company. But what these apologies are showing us is that they think they can get away with just an apology and then changing really surface-level things, but not getting to the root of the problem.'

So you brought up January 6, which is something that we know Congress is looking at, and I think that what lawmakers are doing is going a step beyond what they usually do . . . they're taking a step back and saying, 'How did Facebook allow groups to foment on the platform for months ahead of January 6? How did its algorithms drive people toward these groups? And how did its piecemeal approach to removing some groups but not others allow this movement known as stop-the-steal to really take off?' That's fascinating because, until now, they haven't taken that step back to understand the whole machinery behind Facebook.

TC: Still, if Facebook is not willing to share its data in a more granular way, I wonder how fruitful these investigations will really be.

SF: We reported in the New York Times that Facebook, when asked by the White House for prevalence data on COVID misinformation (that is, how prevalent COVID misinformation is on the platform), couldn't give it to the White House because they didn't have it. And the reason they didn't have it is that when their own data scientists wanted to start tracking it over a year ago, at the start of the pandemic, Facebook did not give them the resources or the mandate to start tracking the prevalence of COVID misinformation. One thing lawmakers can do is pressure Facebook to do that in the future and give the company firm deadlines for when they want to see that data.

TC: Based on your reporting, do you think there's a reporting issue within Facebook, or are these unclosed information loops by design? In the book, for example, you talk about Russian activity on the platform leading up to the 2016 elections. You say that the company's then chief security officer, Alex Stamos, had put together a special team to look at Russian election interference relatively early in 2016, but that after Donald Trump won the election, Mark Zuckerberg and Sheryl Sandberg said they were clueless and frustrated, and that they didn't know why they weren't presented with Stamos's findings earlier.

SF: As we were doing reporting for this book, we really wanted to get to the bottom of that. Did Mark Zuckerberg and Sheryl Sandberg avoid knowing what there was to know about Russia, or were they just kept out of the loop? Ultimately, I think only Mark Zuckerberg or Sheryl Sandberg can answer that question.

What I'll say is that early on, about a week or two after the 2016 elections, Alex Stamos goes to them and says, 'There was Russian election interference. We don't know how much; we don't know the extent. But there definitely was something here and we want to investigate it.' And even after being told that startling news, Mark Zuckerberg [and other top brass] didn't ask for daily or even weekly meetings to be updated on the progress of the security team. I know this is the chief executive of a company and as the CEO [he has] a lot on [his] plate. But if your security team said to you, 'Hey, there was an unprecedented thing that happened on our platform. Democracy was potentially harmed in a way that we didn't foresee or expect,' you would think that as the head of the company, you'd say, 'This is a really huge priority for me, and I'm going to ask for regular updates and meetings on this.' We don't see that happen. And that lets them, months later, be able to say, 'Well, we didn't know. We weren't totally up to date with things.'

TC: In the meantime, industry participants remain very interested in where regulation goes. What are you watching most closely?

SF: In the next six months to a year, there are two things that are fascinating to me. One is COVID misinformation. It's the worst problem for Facebook, because it's been growing on the platform for close to a decade. It's got deep roots across all parts of Facebook. And it's homegrown: it's Americans who are spreading this misinformation to other Americans. So it challenges all of Facebook's tenets on free speech and what it means to be a platform that welcomes free speech but hasn't drawn a clear line between what free speech is and what harmful speech is, especially during the time of the pandemic. So I'm really curious to see how they handle the fact that their own algorithms are still pushing people into anti-vaccine groups and are still promoting people who, off the platform, definitely spread incorrect information about COVID.

The second thing for me is that we're going into a year in which there are a lot of really important elections to be held in other countries with populist leaders, some of whom are modeling their use of Facebook after Donald Trump. Now that Facebook has banned Trump, I'm very curious to see how it deals with some of these leaders in other countries who are testing the waters in much the same way that he did.
