
A Conversation With EU Parliament Member Marietje Schaake About Digital Platforms And Regulation, Part I

by Flemming Rose
from Techdirt

We are cross-posting the following interview conducted by Danish journalist, Cato Institute Senior Fellow, and author of The Tyranny of Silence, Flemming Rose with European Parliament Member from the Netherlands, Marietje Schaake -- who we've discussed on the site many times, and who has even contributed here as well. It's an interesting look at how she views the question of regulating internet platforms. Since this is a relatively long interview, we have broken it up into two parts, with the second part running tomorrow.

Marietje Schaake is a leading and influential voice in Europe on digital platforms and the digital economy. She is the founder of the European Parliament Intergroup on the Digital Agenda for Europe and has been a member of the European Parliament since 2009, representing the Dutch party D66, which is part of the Alliance of Liberals and Democrats for Europe (ALDE) political group. Schaake is spokesperson for the center/right group in the European Parliament on transatlantic trade and digital trade, and she is Vice-President of the European Parliament's US Delegation. She has for some time advocated more regulation and accountability of the digital platforms.

Recently, I sat down with Marietje Schaake in a café in the European Parliament in Brussels to talk about what's on the agenda in Europe when it comes to digital platforms and possible regulation.

FR: Digital platforms like Facebook, Twitter and Google have had a consistent message for European lawmakers: Regulation will stifle innovation. You have said that this is a losing strategy in Brussels. What do you mean by that?

MS: I think it's safe to say that American big tech companies across the board have pushed back against regulation, and this approach is in line with the quasi-libertarian culture and outlook that we know well from Silicon Valley. It has benefited these companies that they have been free from regulation. They have been free not only from new regulation but have also had explicit exemptions from liability in both European and American law (Section 230 in the US and the Intermediary Liability Exemption in the E-commerce Directive in the EU). At the same time they have benefited from regulations like net neutrality and other safeguards in the law. We have been discussing many new initiatives here in the European Parliament, including measures against copyright violations, terrorist content, hate speech, child pornography and other problems. The digital platforms' reaction to most of these initiatives has been, at best, an offer to regulate themselves. They in effect say, "We as a company will fix it, and please don't stifle innovation." This has been the consistent counter-argument against regulation. Another counter-argument has been that if Europe starts regulating digital platforms, then China will do the same.

FR: You don't buy that argument?

MS: Well, China does what it wants anyway. I think we have made a big mistake in the democratic world. The EU, the US and other liberal democracies have been so slow to create a rule-based system for the internet and for digital platforms. Since World War II, we in the West have developed rules on trade, on human rights, on war and peace, and on the rule of law itself; not because we love rules in and of themselves, but because they have created a framework that protects our way of life. Rules mean fairness and a level playing field with regard to the things I just mentioned. But there has been a push-back against regulation and rules when it comes to digital platforms due to this libertarian spirit and the argument about stifling innovation, this "move fast and break things" attitude that we know so well from Silicon Valley.

This is problematic for two reasons. First, we now see a global competition between authoritarian regimes with a closed internet and no rule of law, and democracies with an open internet and the rule of law. We have stood by and watched as China, the leading authoritarian regime, has offered its model of a sovereign, fragmented internet to the world. This alternative model stifles innovation, and if people are concerned about stifling innovation, they should take much more interest in fostering an internet governance model that beats the Chinese alternative. Second, under the current law of the jungle on the internet, liberal democracy and the democratic rights of people are suffering, because we have no accountability for the algorithms of digital platforms. At this point profit is much more important than the public good.

FR: But you said that emphasizing innovation is a losing strategy here in Brussels.

MS: I feel there is a big turning point happening as we speak. It is not only here in Brussels; even Americans are now advocating regulation.

FR: Why?

MS: They have seen the 2016 election in the US, they have seen conspiracy after conspiracy rising to the top ranks of searches, and it's just not sustainable.

FR: What kind of regulation are you calling for, and what regulation will there be political support for here in Brussels?

MS: I believe that the E-commerce Directive with the liability exemptions in the EU and Section 230 with similar exemptions in the US will come under pressure. It will be a huge game changer.

FR: A game changer in what way?

MS: I think there will be forms of liability for content. You can already see more active regulation in the German law and in the agreements between the EU Commission and the companies to take down content (the code of conduct on hate speech and disinformation). These companies cannot credibly say that they are not editing content. They are offering to edit content in order not to be regulated, so they are involved in taking down content. And their business model involves promoting or demoting content, so the whole idea that they would not be able to edit is actually not credible and factually incorrect. So regulation is coming, and I think it will cause an earthquake in the digital economy. You can already see the issues being raised in the public debate about more forceful competition requirements, about whether emerging data sets should also be scrutinized in different ways, and about net neutrality. We have had an important discussion about the right to privacy and data protection here in Europe. Of course, in Europe we have a right to privacy. The United States does not recognize such a right, but I think they will start to think more about it as a basic principle as well.

FR: Why?

MS: Because of the backlash they have seen.

FR: Do you have scandals like Cambridge Analytica in mind?

MS: Yes, but not only that one. Americans are as concerned about the protection of children as Europeans are, if not more. I think we might see a backlash against smart toys. Think about dolls that listen to your baby, capture its entire learning process, its voice, its first words, and then use that data for AI to activate toys. I am not sure American parents are willing to accept this. The same with facial recognition. It's a new kind of technology that is becoming more sophisticated. Should it be banned? I have seen proposals to that end coming from California of all places.

FR: Liability may involve a lot of things. What kind of liability is on the political menu of the European Union? Filtering technology or other tools?

MS: Filtering is on the menu, but I would like to see it off the menu because automatic filtering is a real risk to freedom of expression, and it's not feasible for SMEs (small and medium-sized enterprises), so it only helps the big companies. We need to look at the accountability of algorithms. If we know how they are built, and what their flaws or unintended consequences could be, then we will be able to set deadlines for companies to solve these problems. I think we will look much more at compliance deadlines than just methods. We already have principles in our laws like non-discrimination, fair competition, freedom of expression and access to information. They are not disputed, but some of these platforms are in fact discriminating. It has been documented that Amazon, the biggest tech company and a front runner in AI, had a gender bias in favor of men in its AI hiring algorithm. I think future efforts will be directed toward the question of designing technology and fostering accountability for its outcomes.
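
To make the idea of algorithmic accountability a little more concrete, here is a minimal sketch of the kind of disparate-impact check an auditor or regulator might run against a hiring model's decisions. The data, group labels and the 80% threshold are illustrative assumptions, not anything drawn from the interview or from how any company actually audits its systems.

```python
# Hypothetical sketch: checking a hiring model's decisions for group bias.
# All data, labels and thresholds below are illustrative assumptions.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: list of (group, shortlisted) pairs; returns the shortlist rate per group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [shortlisted, total]
    for group, shortlisted in decisions:
        counts[group][0] += int(shortlisted)
        counts[group][1] += 1
    return {g: hired / total for g, (hired, total) in counts.items()}

def disparate_impact_ratio(rates):
    """Ratio of the lowest to the highest selection rate across groups."""
    return min(rates.values()) / max(rates.values())

# Illustrative model output: (applicant's gender, was the applicant shortlisted?)
decisions = [("female", True), ("female", False), ("female", False),
             ("male", True), ("male", True), ("male", False)]

rates = selection_rates(decisions)          # selection rate per group
ratio = disparate_impact_ratio(rates)
print(rates)
print(f"disparate impact ratio: {ratio:.2f}")
if ratio < 0.8:  # the common "four-fifths" rule of thumb used in US employment audits
    print("Potential adverse impact: flag for review and a compliance deadline.")
```

A check like this says nothing about why the model is skewed; it only makes the outcome measurable, which is the precondition for the compliance deadlines Schaake describes.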

FR: Do you think the governments in the US and Europe are converging on these issues?

MS: Yes. Liberal democracies need to protect themselves. Democracy has been in decline for the 13th year in a row, according to Freedom House. It's a nightmare, and it's something that we cannot think lightly about. Democracy is the best system in spite of all its flaws; it guarantees the freedoms of our people. It can also be improved by holding the use of power accountable through checks and balances and other means.

FR: Shouldn't we be careful not to throw out the baby with the bath water? We are only in the early stages of developing these technologies and businesses. Aren't you concerned that too much regulation will have unintended consequences?

MS: I don't think there is a risk of too much regulation. There is a risk of poorly drafted regulation. We can already see some very grave consequences, and I don't want to wait until there are more. Instead, let's double down on principles that should apply in the digital world as they do in the physical world. It doesn't matter if we are talking about a truck company, a gas company or a tech company. I don't think any technology or AI should be allowed to disrupt fundamental principles, and we should begin to address that. I believe such regulation would be in the companies' interest too, because the trust of their customers is at stake. I don't think regulation is a goal in and of itself, but everything around us is regulated: the battery in your recording device, the coffee we just drank, the light bulbs here, the sprinkler system, the router on the ceiling, the plastic plants behind you (so that if a child happens to eat one, it will not kill them as fast as it might without regulation), the glass in the doors over there (so that if it breaks, it does so in a less harmful way), and so on and so forth. There are all kinds of ideas behind regulation, and regulation is not an injustice to technology. If done well, regulation works as a safeguard of our rights and freedoms. And if it is bad, we have a system to change it.

The status quo is unacceptable. We have already had manipulation of our democracies. We just learned that Facebook paid teenagers $20 to get access to their most private information. I think that's criminal, and there should be accountability for that. We have data breach after data breach; we have conspiracy theories still rising to the top of searches on YouTube in spite of all their promises to do better. We have Facebook selling data without consent, we have absolutely incomprehensible terms of use and consent agreements, and we have a lack of oversight over who is paying for which messages and how the algorithms are pushing certain things up and other things down. It's not only about politics. Look at public health issues like anti-vaccination hoaxes. People hear online that vaccinations are dangerous and do not vaccinate their children, leading to new outbreaks of measles. My mother and sister are medical doctors, cancer specialists, and they have patients who have been online and studied what they should do to treat their cancer, and they get suggestions without any medical or scientific proof. People will not get the treatment that could save their lives. This touches upon many more issues than politics and democracy.

FR: So you see here a conflict between Big Tech and democracy and freedom?

MS: Between Big Tech with certain business models and democracy, yes.

FR: Do you see any changes in the attitudes and behaviour of the tech companies?

MS: Yes, it is changing, but it's too little, too late. I think there is more apologizing, and there is still the rhetoric of "Oh, we still have to learn everything, we are trying." But the question is, is that good enough?

FR: It's not good enough for you?

MS: It's not convincing. You make billions and billions tweaking your algorithm every day to sell ever more ads, yet you claim that you are unable to determine when conspiracies or anti-vaccination messages rise to the top of your search results. At one point I looked into search results on the Eurozone. I received 8 out of 10 results from one source, an English tabloid with a negative view of the euro. How come?

FR: Yes, how come? Why would that be in the interest of the tech companies?

MS: I don't think it's in their interest to change it, but it's in the interest of democracy. Their goal is to keep you online as long as possible, basically to get you hooked. If you are trying to sell television, you want people to watch a lot of television. I am not surprised by this. It was to be expected. However, it becomes a problem when hundreds of millions of people use only a handful of these platforms for their information. It's remarkably easy to influence people for commercial or political purposes, whether it's about anti-vaccination or politics. I understand from experts that the reward mechanism of the algorithm means that sensation sells more, and once you click on the first sensational message it pulls you in a certain direction where it becomes more and more sensational, and one sensation after another is automatically presented to you.

I say to the platforms, you are automatically suggesting more of the same. They say no, no, no, we just changed our algorithm. What does that mean to me? Am I supposed to blindly believe them? Or do I have a way of finding out? At this point I have no way of finding out, and even AI and machine learning coders tell me that they don't know what the algorithms will churn out at the end of the day. One aspect of AI is that the people who write the code don't know exactly what is going to come out. I think the safeguards are too vague, and it's clear that the impact is already quite significant.
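
The feedback loop Schaake describes can be sketched in a few lines of code: a recommender that is rewarded only for clicks will, over time, drift toward the most clickable (often the most sensational) items, even though no one told it to prefer them. This is a toy illustration under assumed item names, click probabilities and a made-up learning rate, not a description of how any real platform's system works.

```python
# Toy sketch of an engagement-optimized recommender drifting toward sensational content.
# Item names, click probabilities and the reward multiplier are illustrative assumptions.
import random

items = {  # item -> how "sensational" it is (0..1), which here drives click probability
    "measured policy analysis": 0.1,
    "celebrity scandal":        0.6,
    "outrage conspiracy":       0.9,
}
weights = {name: 1.0 for name in items}  # the recommender starts out neutral

def recommend():
    """Pick an item with probability proportional to its current weight."""
    names = list(weights)
    return random.choices(names, weights=[weights[n] for n in names])[0]

random.seed(0)
for _ in range(2000):
    shown = recommend()
    clicked = random.random() < items[shown]  # sensational items get clicked more often
    if clicked:
        weights[shown] *= 1.01                # reward engagement, and nothing else

total = sum(weights.values())
for name, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{name:28s} {w / total:.0%} of recommendation weight")
# After enough iterations the sensational items dominate the recommendation weight,
# purely because the system optimized for clicks.
```

The point of the sketch is the one Schaake makes: without access to the weights and the reward rule, an outside observer has no way to verify a platform's claim that "we just changed our algorithm."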

I don't pretend to know everything about how the systems work. We need to know more because they impact so many people, and there is no precedent for a service or product that so many people use, with no oversight, for such essential activities as accessing information about politics, public health and other things. We need oversight to make sure that there are no excesses, and that there is fairness, non-discrimination and free expression.


