Europe vs. Facebook
“It is essential to me not to spread apocalyptic sentiments. It is about calling on people, as I do in my book, to see that improvements can actually be achieved and that we, the citizens, are in no way helpless when it comes to our rights.”
The story of Max Schrems is one of a long, hard struggle that reached a pivotal moment in October 2015, when the Court of Justice of the European Union ruled that the US can no longer portray itself as a ‘safe harbour’ for the data trails of European citizens.
On 26 June 2013, the law student turned privacy activist filed a complaint against “Facebook Ireland”, the international headquarters of Facebook Inc., with the Irish Data Protection Commissioner. Schrems argued that the transfer of customer data to the US, where the data were processed, constitutes a “transfer to a third country,” which is only legal in the European Union if the receiving country can guarantee adequate protection of the data. Because the data are forwarded from Facebook Inc. to the NSA and other US authorities for mass surveillance programs, the core claim was that personal data transferred to the US are not adequately protected once they reach the United States. About one year later, the Irish High Court referred the case to the European Court of Justice.
Max Schrems and his thousands of supporters did not give in. On 6 October 2015, the Court of Justice of the European Union ruled that the regulation of data transfers under the ‘Safe Harbour’ agreement between the European Union and the US, which allowed tech firms to transfer European users’ data to the US, was invalid. The court followed Schrems’ interpretation, stating that Facebook and other digital operators do not provide customers with protection from state surveillance, and concluded that the US thus “does not afford an adequate level of protection of personal data.”
In a first reaction, Schrems stated that “this case law will be a milestone for constitutional challenges against similar surveillance conducted by EU member states.” He also paid tribute to the bravery of ex-security analyst Edward Snowden, whose revelations about mass surveillance had played a pivotal role in the Luxembourg court’s decision. In this interview with award-winning technology journalist John Kennedy, he provides background information on the case against Facebook, on how end users’ lack of technical knowledge fosters their lack of necessary mistrust, and on how business interests outrank the question of legality.
- Date of recording: Wed, 2015-01-28
- Language(s) spoken: English
00:06 Background information on the case against Facebook
JK: Hey Max, welcome to Ireland. You first crossed our radar when, as a student, you took a case against Facebook’s Irish operations over privacy. Can you tell me about your motivation then, and the current class action suit now, and just what we have learned in that whole process?
MS: My motivation? I don’t know, I think the whole privacy thing is kind of the debate of the next 20 or 30 years. I see the whole thing like the environmentalist movement in the 60s, where, like, the first people came up and said: there are dead fish, we should do something about it… And that is kind of where I come from, I guess.
And on Facebook: it was really that I was studying in the US for half a year, and there were guys from Facebook and other companies as guest speakers at my university. And they were pretty much saying: “You can fuck the US … ah the American … ah the European rules, nothing is ever going to happen if you break them.” And, quite honestly, nothing does ever happen if you break them. We usually pride ourselves on all these privacy laws in Europe, and point fingers at the US for not having them, and for being, like, the badass spying people, but the reality is that we are not really enforcing these laws. And it is really interesting how, I do not know, every parking violation is enforced, but if you just suck up the data of millions of people illegally, the worst thing that can happen to you - for example in Ireland - is an enforcement notice, which is a piece of paper saying: “Dear company, don’t do that anymore. Kisses, your Data Protection Commissioner.” That is the worst that can realistically happen to you.
But that is not necessarily an Irish issue; we see that in a lot of other countries too. Like, in Austria the maximum fine is 25,000 Euro, which usually means that getting a lawyer to tell you what the law says, and to be fully compliant, is more expensive than just breaking it. And that is an overall problem I find really interesting, because we are talking about fundamental rights here, not just some consumer rights…
01:50 Shifting responsibilities to end users
JK: What will tech companies do? As people, we were sleepwalking into this mess, and we still are.
We publish stuff about ourselves and about others, but at the same time we did not realize how well protected our information was going to be. Would you say, then, when it comes to policy and governments, that we have all been sleepwalking into this kind of situation?
MS: I think the biggest problem we have is that it is not tangible for the average person. Like, if you talk about Big Data analytics as an example: that is something where even the representatives of the companies cannot really tell you what these guys are doing there. And that’s the biggest problem of the whole privacy debate, just like Chernobyl and the atomic power debate: it’s so complicated that the average user just doesn’t get it, and therefore ignores it. We don’t shift this responsibility to the average guy in any other field. I usually compare it to building codes: we expect that in a modern country, buildings are not just collapsing and falling on our heads. None of us has checked whether this building is correctly built; we just expect it. And the same goes for hygiene laws: if you go to a supermarket and buy an apple, you have the expectation that you can eat it without throwing up.
In the privacy field, however, we feel that the individual consumer should know about all these things and make these decisions, even though it’s impossible for the average guy to do that, because no one has a lab in the basement to figure out what is, I don’t know, on the apple, or to check on the building. Yet whenever it comes to apps on your cell phone, which are much more opaque than these analog things, suddenly the user should have the responsibility, and that’s something I think is really interesting.
03:24 Irish policies and their implications for data protection
JK: Interestingly, you took the case against Facebook’s Irish operations, because that’s where the international headquarters are… How did you find the experience of pursuing this in Ireland, in terms of dealing with the Data Protection authorities here, and, you know, where do you see this going from here, as a strategy with the European courts?
MS: You pretty much have a safe haven here when it comes to privacy, and that is something that is seriously criticized outside of Ireland: that we have this tech hub here, and legally the authorities are responsible for it, but they are not taking on the responsibility of enforcing things. I know that things have changed now with the new Commissioner, that at least the office got many more people, many more resources, which is really necessary. I do not know if the actual approach has changed by any means; that is something I have no idea about either way. So far, I still see that people who make complaints are all turned down. If you look at the statistics of Billy Hawkes’ time (ed. note: the former Data Protection Commissioner of Ireland, 2005-2014), at least, only two to four percent of the complaints were even decided; all the others were not decided.
Most of them - in my experience all of them - actually got an email saying: “We are not taking on this complaint,” even though they have a legal obligation to go after every complaint. And in the end, pretty much, they said: “If you want us to do our job, you have to go to court and sue us,” knowing that this costs an insane amount of money and makes it impossible for anyone, especially outside of Ireland, to force them to do it. So, what we see internationally now, also with the new regulation, is that the European Union tries to pull responsibility out of Ireland and towards some form of cooperation, because there is just this feeling that certain data protection authorities are not doing their job properly. Unfortunately, that is oftentimes connected with a kind of – how do you call it – “nice business environment.”
In fact, the representative of the DPC in the High Court case on the PRISM mass surveillance actually pointed out that, if the DPC were to enforce things like that, it would harm the businesses in Ireland, which pretty much says that businesses are more important than your fundamental rights.
05:28 Technical possibilities vs. political questions
JK: I was watching something interesting yesterday: I think the Electronic Frontier Foundation in the United States had put out a master plan for ending mass surveillance. And one of the points - I think it was one of the first points - was getting the tech companies themselves to harden their approach to protecting data. On one hand, you can say: look, the technologies evolve faster than we can ever regulate them, and we are all carrying computers in our pockets. But on the other hand, they are making vast amounts of money out of this infrastructure they have created. What kind of leadership position should the tech companies take, or are they willing to, in your opinion?
MS: I think that is a hard question because, to be fair, one problem they really have is that they have to stick to different jurisdictions. So, for example, in the US they have to give out all this information under the law; under European law, that is prohibited. The problem is that, as international companies, they try to take advantage of, I don’t know, the tax regime here, or the access to the stock market in the US, and all these kinds of different things, and now they are under a lot of jurisdictions that are actually in conflict. I think the overall problem here is something that the tech companies cannot necessarily deal with themselves. That is something that has to be solved politically.
What they could do, however, is change their systems in a more privacy-friendly way. For example, Facebook keeps all your deleted information, arguing that it is a centralized database and, if person A deletes a message… If we have a conversation in common and I delete my side of it, they still need to keep the conversation displayed to you. Now, the problem is that they can still centrally get all the messages I have ever written, and there is no way I can practically delete them. That is a systematic failure. If you have separate in- and outboxes, if you have them spread out over the network in a way that you cannot recover them anymore – which is the typical case for email: if I delete my inbox, they would have to go after hundreds of thousands of other email providers to really get the information – these are things that you could possibly solve technically. The same with encryption, where the keys are only with the individual users: the company can say, “We don’t have the keys, I am very sorry, you have got to get them from somewhere else.” These are things that they can solve technically.
Legally, the reality is that if they have access to the information and there is a legal procedure to get it, they have to give it out. What we were going after in the Facebook case was exactly this difference in jurisdiction. So we sued Facebook Ireland for participating in the PRISM program by forwarding the data to Facebook US.
And to get them into the trouble of these two sides. Because if we win the case by any means, you can totally see Silicon Valley lining up with the White House and being like: “We have got a serious problem here, because either we get rid of the surveillance laws over there, or we get rid of, like, the servers over here.” And that is really getting them into the position of having to solve this conflict. So far, this conflict was mainly solved by Europe not really applying the law.
08:18 The relation between mistrust and knowledge
JK: When you look at PRISM, at what Snowden has done by revealing all that information, you know, it is almost possible to say that in technology every second story has something to do with trust and privacy. And it was debated last week at Davos by Marissa Mayer and salesforce.com’s CEO that we are talking about, basically, zero trust when it comes to the most important things in the next while. Would you be optimistic that this issue will one day be resolved, or is it a long road ahead?
MS: I don’t think the trust issue is that big. For mistrust, it would be necessary that the average guy on the street knows what is going on. And that is the basic problem: the average guy on the street just doesn’t remotely understand what is going on. I have seen in a lot of cases that even the privacy freaks who come to the kind of speeches I give – as soon as I show them the actual datasets from Facebook, telling them that, I don’t know, there is a category called “last location” where they have your exact GPS coordinates even though you never shared anything with them, stuff like that… If you show that to them, they are outraged. And these are even the people concerned about privacy who come to a conference like that.
And that is, I think, the underlying problem: there is not as much mistrust as would be necessary, or would be reasonable, just because of a lack of understanding and knowledge. That is one of the biggest problems in the whole privacy debate: you have a black box where all of this is happening. It is some server in the US – you roughly know what is going in, you roughly know what is coming out. Sometimes they make mistakes, so you get a better understanding of what is going on. Sometimes you have companies like Facebook that, because of “internal communication problems,” ship you a bunch of data out of it, so you can get a rough idea of what’s going on.
This debate is too complicated to really get through on arguments, on really understanding everything. I usually compare it to the atomic power plant thing. None of us understands how an atomic power plant works; we just roughly know: “Do I want to have that in my front yard, or not?” We get a rough idea of what the consequences can be, we have a couple of cases where things went wrong, and then we can have a debate, in a way, and make a decision.
And unfortunately, we will very likely end up in the same place on the privacy thing: people just have cases like Snowden, where they see, “Oh my God, that’s the amount of data they have!” In my case, it was these 1,200 pages that all the media went after, because they were, like, physical and tangible. I think on this basis you can possibly get people to shift and think about this, but the idea that people really fully understand what thousands of companies hold about them, what they do in the background…
10:48 Business interests outrank the question of legality
JK: …the words that come to my head are “playing with fire” here, because you’re gathering all this data, and ultimately it is still in the hands of human beings who can make mistakes.
MS: Yeah, I am notorious for making mistakes, yeah…
JK: Exactly, and hardly a week goes by without some breach happening, or some nefarious activity by a hacker, for example, who breaks in, because that is what they do, they try to break in. Does the tech industry have a moral obligation to try and figure out how to do things better and more safely?
MS: I think there are two things that oftentimes get mixed up here. One is really data security, where you are usually in the same boat as the industry, because neither side wants to have a breach. That is something where you usually don’t have too much of a conflict between the company and the person; of course people are outraged because company X didn’t secure things well enough, but usually you are on the same side: you want to have it secured.
In a market-driven society, as a big, especially US-based company, with the cultural approach from there, you try to make as much money as possible; that is your obligation, no matter what the laws are. And that was one key difference between the US perspective and my legal training: we did not talk about what is legal or not legal, we talked about how likely it is that you are going to be uncovered, that you are going to be prosecuted, how much you are going to pay, and whether that is cheaper than sticking to the law. That was the question.
So it is a very business-style approach, and if the only consequence we have in Europe is a couple of thousand bucks that you may pay, the answer is: let us break the law and make money. And this is something that Europe has to understand and realize. Right now, we are more in this whiny position of: “But why are they not sticking to our laws, are they not nice?”, which is just totally, like, naive if you look at the realities out there.
More details on the CJEU ruling, the subsequent ruling of the Irish High Court, and the ongoing class action suit in Austria can be found at http://europe-v-facebook.org/EN/en.html
image source: http://europe-v-facebook.org
video compiled from the following sources: