SOP Workshop 160: Global Trends to Watch: The Erosion of Privacy and Anonymity and the Need for Transparency of Government Access Requests

Sixth Annual Meeting of the Internet Governance Forum
27-30 September 2011
United Nations Office at Nairobi, Nairobi, Kenya

September 29, 2011 - 09:00AM

***

The following is the output of the real-time captioning taken during the Sixth Meeting of the IGF, in Nairobi, Kenya. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record.

****

>>KATITZA RODRIGUEZ: Hi. The Twitter hashtag where we are going to be tweeting is IGF160, just in case you also want to tweet the discussion.

So the hashtag is, again, IGF160. Yes, IGF160.

Okay. Welcome, everyone, and thank you for coming to this workshop today. It is nice to see familiar faces and new ones as well.

Today we are here to share this space and to discuss some crucial policy issues surrounding privacy.

Let me introduce myself. I am Katitza Rodriguez and I work as international rights director at the Electronic Frontier Foundation, and I will be your moderator today.

For those who don't know EFF, EFF is a not-for-profit organization based in San Francisco, but we have 14,000 members around the world in 67 countries.

We were founded in 1990. EFF foresaw that more and more people would be using the Internet and that this new online space would become a huge part of people's lives.

We were founded with one simple goal, with recognition that more and more people were going to be moving online. We want -- we wanted to ensure that our fundamental rights would be respected in the online world.

Since then, we have handled many major online challenges, and we have continued to work on it until this day.

We anticipate that as more and more people get online, unfortunately, this will pose more and more challenges. The Internet is a dynamic and changing landscape, and that is a feature, not a bug.

It's a landscape that allows a lot of innovation, a lot of trying new things, a lot of reordering of social arrangements. This often results in one side trying to shut down people's rights. We see ourselves on the other side of the fence, trying to keep things open and free.

So digital privacy is of paramount importance now that we regularly turn to search engines, social networks, and other intermediaries to find information online, record our emotions on blogs, share personal data with friends, store private and sensitive information such as e-mail, use mobile devices to connect and interact on the Internet, write collaborative documents with our friends, and save vast amounts of our information to the cloud.

Today we have a distinguished panel. Each speaker will have a five-to-seven-minute presentation. Then we will open the floor for questions and foster an interactive dialogue among us.

The format of the panel will be as follows. We will start with Christopher Soghoian. He is a Ph.D. candidate in the School of Informatics and Computing at Indiana University.

Chris, would you like to start here, start the presentation?

Thank you.

>>CHRISTOPHER SOGHOIAN: There we go. Thank you.

So as Katitza said, I am a research fellow at Indiana University. My research is focused on the relationship between Internet companies and law enforcement agencies. In particular, I study the way that these companies facilitate or resist surveillance of their customers. And so what I would like to talk about for at least five or six minutes this morning is the way that cloud computing has changed the landscape for law enforcement: how it has made surveillance easier and much, much cheaper.

So as Katitza said, we are all using cloud computing now in a big way. To borrow a phrase from one of our colleagues, we treat the search engine as our most trusted advisor. We tell it things we wouldn't tell our spouses, our doctor, our rabbi. Google knows that you are sick before you go to the doctor because you are searching for information.

But we're using these services to store our most private files online: our photographs, our movements. Just yesterday, Amazon announced a competing mobile platform in which their Web browser will actually be sending every Web page you view through Amazon's servers, and Amazon will, in fact, be keeping a list of every Web page you view for 30 days.

So we are using these services and as consumers have shifted to the cloud, so, too, has law enforcement because law enforcement will follow the data wherever it is. In this case it's actually even easier for them to get this information.

So as a thought experiment, think about law enforcement investigations five or ten years ago. If the police wanted to follow someone, they would have to get, you know, two or three vehicles of people, two or three agents each following the person. Every few minutes they would have to change which vehicle was immediately behind the car so that you couldn't see them in the rear-view mirror.

For every additional person they wanted to surveil, you would need an additional two or three teams of vehicles.

Now, one police officer from the comfort of their desk can track 20, 30, 50 people, all through Web interfaces provided by mobile companies and cloud computing companies.

Likewise, you know, we think of the police seizing documents. In the old days, they would come to your house at 2:00 in the morning and kick down the door and then you would see the police walking out with boxes of computers or documents or papers. Now they just submit a request to Google, to Yahoo!, to Amazon, to Facebook. And the marginal cost, the cost of surveilling one more person is now essentially approaching zero.

So Google charges $25 to hand over your inbox. I know because I have the invoices that I have obtained via the Freedom of Information Act. Yahoo! charges $20 plus the cost of a stamp. Facebook and Microsoft don't even bother charging because they say it's too difficult to get compensated for this.

And so it's very, very easy for law enforcement to get this information.

Now, in addition to the ease and flexibility of law enforcement investigations, cloud computing also really changes the status quo, because the consumer is no longer in the loop, right? So in the old days, when the police came to your house and served a search warrant on your premises, you knew that you were being surveilled. You knew because they'd kicked down your front door and because they had taken all your documents and files.

Now when your Facebook posts or your private communications with Google are obtained by law enforcement, you don't know.

So all these telecommunications companies, they all have large teams. Sprint for example, the third largest wireless carrier in the U.S., has 100 employees working full-time on surveillance. We know that other companies have similarly large groups, and they do nothing but respond to requests from the government and, in some cases, civil litigants, too, like divorce lawyers.

Now, you may ask, well, why are these companies not protecting our privacy more?

And the answer is that it's very, very difficult to deploy privacy protective policies with the current business model of ad supported services.

So as an example, many in the privacy community would like companies to deploy encryption; right? If the data is encrypted on your device, if the police seize it they cannot get any of the data.

Unfortunately, it's very difficult to monetize data when you cannot see it. And so if the files that I store in Google docs are encrypted or if the files I store on Amazon's drives are encrypted then they are not able to monetize it. And of course we have all seen the ads on the right-hand side of a Gmail window. They are analyzing the content of your e-mail to show you ads, and there's not really a privacy preserving way for them to target those ads to you without seeing your data. And unfortunately, these companies are putting their desire to monetize your data over their desire to protect your communications.

Now, this doesn't mean that Google and Microsoft and Yahoo! are evil. They are not going out of their way to help law enforcement. It's just that their business model is in conflict with your privacy.

And given two choices, one of which is protecting you from the government and the other which is making money, they are going to go with making money because, of course, they are public corporations. They are required to make money and return it to their shareholders.

So what are some options? We have heard over the last couple of days that many companies and businesses are shifting to the cloud, and in that case, when you are paying Google $50 per user per year, the company, at least, has an incentive to protect your data. So maybe we will see Google deploying encryption for its corporate users and for its educational users.

There are some, you know, cloud-based services that, in fact, do encrypt your data. While Dropbox, the online backup service, doesn't encrypt your data, there are competing companies like SpiderOak that do encrypt your data in the cloud.

It's not impossible to have privacy in the cloud, but it's very difficult when we have an ad supported business model.
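The architectural difference Soghoian points to, between a Dropbox-style service and a SpiderOak-style one, is simply where the encryption key lives. A deliberately toy sketch of the client-side pattern, in which the provider only ever sees ciphertext (the XOR keystream below is for illustration only and is not secure; a real client would use an authenticated cipher such as AES-GCM):

```python
import hashlib
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream derived from SHA-256 (illustration only, NOT a real cipher).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_before_upload(plaintext: bytes, key: bytes) -> bytes:
    # The key stays on the user's device; only this opaque blob goes to the server.
    nonce = os.urandom(16)
    stream = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, stream))

def decrypt_after_download(blob: bytes, key: bytes) -> bytes:
    nonce, body = blob[:16], blob[16:]
    stream = _keystream(key, nonce, len(body))
    return bytes(a ^ b for a, b in zip(body, stream))

key = os.urandom(32)                      # never leaves the device
blob = encrypt_before_upload(b"private diary", key)
assert decrypt_after_download(blob, key) == b"private diary"
```

Because the provider stores only `blob` and never holds `key`, a request served on the provider yields ciphertext; this is exactly why the panel notes that such a service cannot mine the content for advertising either.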

And the other big issue that we have right now, just to wrap up, is that we don't know what's happening.

So Google is actually the most transparent company in the space, which is unfortunate given how untransparent they are. Google has a Web site where they release statistics every six months detailing the number of requests they get from law enforcement agencies, and as of the most recent update, they actually say how many they comply with.

But we don't know what the requests are for. We don't know if they are for e-mail or for search history. We don't know how many people the requests are for. So it could be one request for 5 million users or it could be one request for one user. We don't know.

We do know all these companies do receive requests and we do know that they hand data out on a regular basis.

Now, it's very difficult to have a public conversation with someone from Google and not have them tell you about that one time five years ago when they fought the Department of Justice's request for search records. And it's true, they did fight in a very public way once. But the norm is for Google to hand over data, and in fact for Yahoo! to hand over data and Facebook and Microsoft. And the reason for that is they are required by law.

Again, these companies are not evil, but if they have the data, they are required to disclose it.

And so this is the big problem with cloud computing. When you give your data to a third party, you lose control over it, and the government can come in whenever it likes with a valid court order, which is relatively easy to obtain, and get your data.

So there isn't an easy solution, but I think many of us who have concerns about privacy should maybe be considering whether we want to trust our private data to services like these.

Thank you.

>>KATITZA RODRIGUEZ: Thank you.

Our next speaker is Amr Gharbeia. He is technology and freedom program officer from the Egyptian Initiative for Personal Rights.

Please, Amr, we would like to know what's going on in Egypt these days.

>>AMR GHARBEIA: Thank you, Katitza. I will try to contextualize what Chris has started upon in an Egyptian context.

But let me first give you a stack of four layers that stand in the way of retaining a person's privacy and anonymity on the Internet.

The first of those, in a country like Egypt, now going through so much of a transition, is actually the basic issue of the rule of law.

It can't be ignored. It has been ignored for so long, and the process now under way for building a country that has respect for the rule of law will require the reinvention of the enforcement agencies, from the police to the judiciary, in ways that make them capable and able to perform without excessive violations of privacy and personal safety.

The second layer would be transparency. Actually, we don't know what's happening, just like Christopher said.

Going into the state security headquarters last March, we saw leaked documents showing surveillance taking place at the edges of the network, but also at the very center of the network. Trojan horses like FinFisher, by the U.K.-based company Gamma, and other systems that live in the center of the network have been found.

The only way to find out what surveillance operations are going on in a security apparatus is if you actually break into it. There is no transparency.

Freedom of information laws are being drafted in Egypt and Tunisia, and other countries are going through transformation now, but the legal ecosystem still isn't there.

The third layer would be national security itself and how broadly or narrowly it is defined. I, again, come from a country where everything, everything can become an issue of national security, and it's a matter of tolerance rather than actual boundaries known to everybody. The now reinvented national security agency is interfering in people's right to assembly, in the appointment of university professors, in the declaration of newspapers and NGOs. So it is still very much in control. And that broadly or narrowly defined national security, in comparison with the ordinary criminal process, makes it much more difficult for human rights defenders to find a platform, to find an argument to defend the people they are working for.

The fourth layer would be treating the Internet as a special domain in itself, even in cases where the criminal code is applicable, not just the exceptional state-of-emergency cases.

The Internet is being treated as a special domain in itself, so crimes committed outside the Internet are treated in a different way than they are on the Internet. And this also has to do with the issues of data retention, of anonymity, surveillance, and the right to encryption.

It's actually illegal for you to use any encrypted transmission. So basically everyone who is logging on to their Facebook or Twitter account in the morning is actually violating the law in Egypt.

The companies are required to keep logs for indefinite periods of time and to hand over those logs without any clear process.

And there is the right to communication itself: access to the radio spectrum and the freedom, basically, to be social on the Internet and to build your own networks, so that, just like Christopher said, you can rely on your own right to build Wi-Fi mesh networks and host your own services instead of relying on servers offered in the cloud.

These four layers have to be tackled simultaneously and some of them are being changed now in a post-revolutionary Egypt. Some of them will require more of a technical solution, and the coming together of not just efforts in one country but as a worldwide movement.

Thank you.

>>KATITZA RODRIGUEZ: Thank you, Amr, for your presentation. It's interesting to hear about the decentralization of networks. We heard at the last Chaos Computer Club event in Germany about projects on the decentralization of cloud service providers, mixing peer-to-peer with encryption technologies so people can have decentralized services.

Now we would like to introduce Vint Cerf, who is vice president and Chief Internet Evangelist at Google. Many of you already know him, and he is well-known as the father of the Internet.

Thank you for coming to our workshop.

>>VINT CERF: Thank you very much, Madam Chairman, and good morning, everyone.

I had some points I wanted to make but now I feel compelled to rebut Mr. Soghoian, at least in some respects.

I'm only looking for precision here. We don't use docs for ads. If you bring up the docs page, you don't see any ads. But we do absolutely generate ads with your e-mail.

You mentioned that Google knows you're sick. Actually, all we know is that there are statistics for how many people are searching for a particular topic, and we do try to mine that information purely statistically in order to track potential trends. Flu Trends, for example, is one of the applications.

We do encrypt your access to the search engine, for example. You can go to Google search under encryption. However, I can tell you that trying to implement encryption with cloud-based systems is difficult, especially if all the crypto has to happen in your browser.

So I have actually looked in some detail at that.

I think you're quite right, however, that we couldn't run our system if everything in it were encrypted, because then we wouldn't know which ads to show you. So this is a system that was designed around a particular business model.

Finally, with respect to sharing of information with law enforcement, we only do so in the event that we're presented with proper court orders.

I would like to mention a few other things that I had originally planned to talk about. One of them is that our privacy is being eroded by technology. We all carry mobiles, and mobiles can take pictures, record video, and record audio. We can upload things from mobiles into the net. The network allows rapid and very, very far-flung distributed sharing of information. Social networks are being used by us to share information with a fairly broad range of people. So in some respects, it's our choices that are creating some of the erosion of privacy.

One thing that we might have to consider is a shift in social norms so we can decide whether and when we want to share this kind of information. Providing tools to limit who has access to the information you choose to share may turn out to be important. Google+ has Circles, which is intended to allow you to say something about who should get which information that you choose to share, but I suspect we have not seen the end of the evolution of those kinds of norms.

As far as governments go, it's pretty clear that if the information is available and public, and the government feels the need to protect the citizens, they are going to take advantage of whatever they can find in public. So we have little choice; if things are shared in that way, governments are going to go after and use that information.

With regard to retention of information, I was actually quite surprised to see data published recently showing that, at least in the United States, there's quite a variation in the amount of time that certain kinds of data about your communications are kept. Some of it is kept for up to four or five years or more. I had thought that it went away in a year or two, but that's partly a function of law and partly a function of practice.

Finally, at Google, anyway, we don't share any of the information that's in the system with any third parties except under the legal constraints that we're required to abide by. It's true that we use a lot of information to generate, select, and display ads, but we don't share that information with third parties. Some people misunderstand the way the system works. The information stays in the environment.

The last point I want to make is that e-mail has the property that it only works if there's a place where all the e-mail can come together and then be accessed and redistributed. The cloud is not the only case. Every e-mail server, regardless of how it's implemented, has that property; every time-sharing system has that property. The only way that you could inhibit that would be to make sure that every piece of e-mail is encrypted at the source and only decrypted at the destination. Many of you, I'm sure, use PGP and other kinds of systems. I can tell you that while they work, they are not always convenient, and there's a place where some serious technology effort would be worthwhile, and that's to find ways of making it easy to encrypt mail and share it only with the parties intended.

I'll stop there, Madam Chairman, and I am looking forward to the rest of the discussion. Thank you.

>>KATITZA RODRIGUEZ: We have now Katarzyna Szymielewicz. She is a director of the Panoptykon Foundation. The Panoptykon Foundation is also a member of the European digital rights NGO, EDRi. Welcome, Kasia.

>>KATARZYNA SZYMIELEWICZ: Thank you, Katitza. For those who are Polish speakers, it's Katarzyna Szymielewicz, but I do not expect anybody else to pronounce it. You can call me Kasia.

Just a brief intro about our foundation. We work on surveillance broadly. We deal with surveillance as a problem, seeing that information about us can be misused to control people, by both corporate entities and the government. So that's our broad interest, and within that, we work a lot on the Internet.
Because we are based in Poland and we work in the EU, I will try to put what we heard so far into more of a European perspective, and maybe show you two big problems that are also pending debates in the EU when it comes to how to regulate the Internet. And to be fair, I will make one about corporate entities and the other about governments, so it will be equal.

So the first problem we see is profiling. We heard a lot from Chris and Mr. Cerf about collection of data, which often happens at our own request. I mean, it is people who seek cloud-based services who want to use technologies to have comfortable access to their data. So certainly it's not business forcing us to do so, but once the data is collected, the data might have a second and third life, which sometimes we do not realize. It happens behind the screen. And this is an increasing problem, especially given that so many young people go online. The online environment is natural for them. Not considering the mechanisms at work might be to their disadvantage. I am also obviously talking about cookies and the whole ecosystem of behavioral advertising, which is entirely unregulated at this point.

Cookies are not perceived as pieces of personal information, so, at least in Europe, everybody can store and use them without any protection.
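To make the tracking mechanism concrete: a third-party cookie works because the same ad server is embedded on many publisher sites, so a single identifier it set once links visits across all of them. A schematic simulation (the class, the `uid-` cookie format, and the page names are all hypothetical, not any real ad network's behavior):

```python
import itertools

class AdServer:
    # Hypothetical third party whose content is embedded on many sites.
    def __init__(self):
        self._ids = itertools.count(1)
        self.profiles = {}          # cookie id -> list of pages seen

    def serve(self, cookie, page):
        # The browser sends back whatever cookie this domain set earlier;
        # on a first visit (no cookie) the server sets a fresh identifier.
        if cookie is None:
            cookie = f"uid-{next(self._ids)}"
        self.profiles.setdefault(cookie, []).append(page)
        return cookie

tracker = AdServer()
cookie = None
# One browser visits three unrelated sites that all embed the tracker.
for page in ["news.example/politics", "shop.example/shoes", "health.example/flu"]:
    cookie = tracker.serve(cookie, page)

# A single identifier now ties together a cross-site browsing profile.
assert len(tracker.profiles) == 1
```

The point of the sketch is that no "personal information" in the traditional sense is ever stored, yet the resulting profile is exactly the kind of behavioral record the consent debate is about.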

And the debate recently started by the European Commission is how we should approach that problem. One of the ideas is whether we should require the consent of the user for the use of a cookie, unless the cookie is necessary to provide the service. As privacy defenders, human rights defenders, we see that could be a very interesting move, although we are fully aware that consenting to every single cookie might well kill the Internet and kill our activity online.

So we really need to think about innovative solutions and we need the help of the companies like Google and others, who are famous for their innovative potential, to think of maybe other ways, more comfortable ways, for both user and business to ensure transparency, to ensure that users have control over what happens with the information online, that users understand them, and it's not done by the means of long lengthy terms and conditions, but we need some other measures to do so.

And the problem we see in Europe is that, unfortunately, the businesses we meet and encounter are not very willing to get into this debate. They are not willing to innovate and think with us. Rather, they tend to say, "You simply cannot do this. The Internet is based on monetization of private data. That's the only model that exists, and we shouldn't seek another."

We privacy defenders think that's wrong and that we do need to seek other models.

The second trend is about data retention. We heard today a lot about the collection of data by private entities and the access of governments to this data.

That's obviously the same problem in Europe. We have many companies operating in Europe that are based in America, so the same problems that Chris was describing will probably apply there.

In Europe, I would agree with Mr. Cerf: access is normally based on a court warrant, so access to data in the cloud is not automatic. But the governments said, "Well, we want to make things easier. We need another mechanism to access users' data more freely, without all these warrants and all the red tape." So after the London bombings and the Madrid bombings, the whole War on Terror paradigm was adopted in the E.U., and the E.U. came up with the proposal to introduce data retention, which essentially means that companies are forced to store, collect, and keep data for longer than they would like to. It concerns now only telecommunications companies, so essentially Internet access providers, not companies like Facebook or Google.

But already the discussion is going around whether we should extend the obligation to everybody who provides any service online.

The idea is to make a huge collection of data that is easily accessible for various law enforcement services.

In my country, in Poland, we have the best -- sorry, the worst -- the worst law in Europe regarding data retention. We are trying hard to make it the best one. At this point, the data is stored for two years and can be accessed by anybody from law enforcement without any warrant, without any justification at all.

We have over 1,400,000 requests per year, so it's pretty privacy intrusive, and we see many cases of abuse. We see how negative an impact it has on privacy, especially when you realize that the data stored by telecommunications companies says a lot about your routines, a lot about your social network, a lot about where you go and what your location is.

So law enforcement can not only trace you back, but can also predict your future behavior.
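The inference Kasia describes needs nothing more than counting. A toy sketch of how retained connection records reveal a routine (the records, tower names, and hour windows are fabricated for illustration):

```python
from collections import Counter

# Hypothetical retained records: (day, hour, cell tower) per connection.
records = [
    ("mon", 8, "tower-A"), ("mon", 13, "tower-B"), ("mon", 22, "tower-A"),
    ("tue", 8, "tower-A"), ("tue", 14, "tower-B"), ("tue", 23, "tower-A"),
    ("wed", 9, "tower-A"), ("wed", 13, "tower-B"), ("wed", 22, "tower-A"),
]

def likely_location(records, hour_range):
    # Most frequent tower seen during the given hours, across all days.
    towers = Counter(t for _, h, t in records if h in hour_range)
    return towers.most_common(1)[0][0]

home = likely_location(records, range(20, 24))   # night-time connections
work = likely_location(records, range(9, 18))    # day-time connections
assert home == "tower-A"
assert work == "tower-B"
```

With a stable weekly pattern like this, the same counting predicts where the person will probably be next Monday at noon, which is exactly the "predict your future behavior" concern.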

So that's why we see that data retention is extremely privacy intrusive. And it's not only us: Peter Hustinx, the European Data Protection Supervisor, has said openly that it's the most privacy intrusive instrument ever adopted in the E.U.

And right now, the battle is -- is running to change that law. I'm afraid we might lose it and the law will stay with some modifications.

What I want to point out here about retention is that it's extremely important what data protection principles we craft today, because online services are developing. Now we have the Internet as we know it, but in 10 years we might have new services. Probably we will have the smart grid. The Internet of Things might become more popular and accessible.

We will have more geo-location services.

So the more our life moves to the cloud, the more it is important to think now about the principles of data retention for both commercial purposes and law enforcement security purposes.

If we allow everybody to collect everything today, we might have a huge problem stopping that process from -- from getting even worse in the future. Thank you.

>>KATITZA RODRIGUEZ: Thank you, Kasia.

Now we have Alexander Seger. He's the head of the Economic Crime Division of the Council of Europe, and we would like to hear the law enforcement point of view. Thank you.

>>ALEXANDER SEGER: Thank you. Good morning.

There is famine in the north of Kenya and we all live in rather expensive hotels, and the cost of one night for each of us in the hotel here can easily bring relief to a family for one month or more in some of these famine-stricken areas.

I'm saying this to underline that we have here a responsibility. We have a responsibility to deliver and make progress and find constructive solutions to very complicated issues, and therefore, we should refrain, perhaps, from engaging in polemics in this discussion.

The fact that I'm here to represent the law enforcement point of view points also at a major problem that the IGF has and has had for the past five years, ever since it came into existence.

I'm a Council of Europe bureaucrat, and now I have to talk about law enforcement. I have to talk about law enforcement because law enforcement and the criminal justice community is not represented at the IGF, and it's also not represented in discussions on cybersecurity at many international fora.

The result is that we get solutions, proposals that are technologically sound but very risky from a human rights and rule of law perspective.

Criminal justice is a very tricky process, and I think Kenyans watching television every day, they are witness to that, how risky that is, how tricky it is, and how important evidence is, chain of custody for evidence, when it comes to criminal proceedings.

So in the future, I think the IGF, we all have to make an effort to bring the criminal justice community also here to engage in exchanges like this, so that I have a quieter time next time.

To come now to the topic of today, I fully support much of what has been said by previous speakers. Technology undermines privacy. Let's be clear on that.

And the risks that our lives are taken over and controlled by private sector companies or governments are real. However, there's also cybercrime. In my personal life, as an individual, my life and my privacy are more threatened by cybercriminals right now than by governments. I'm not talking about companies right now.

It is clear that cybercrime threatens fundamental rights. In particular, privacy. And therefore, I'm not so sure that we just have to look at how to protect ourselves against governments and companies. We have to protect ourselves against criminals, and very much so.

The European Court of Human Rights has ruled on several occasions that governments have a positive obligation to protect the privacy of citizens against criminal intrusion.

There's a positive obligation by governments to protect citizens and people against crime.

From the Council of Europe perspective, there are two tools I would like to mention here that do not only apply in Europe but they also apply outside. One is the Budapest Convention on Cybercrime that has been discussed in a number of previous IGFs and a number of workshops here. There is also the Convention 108 on data protection which also applies outside Europe, and we are very happy that recently Uruguay was invited as the first non-European country to join the Convention 108 on data protection.

So both of these treaties -- Budapest Convention on Cybercrime, Data Protection Convention 108 -- are designed for implementation anywhere in the world.

With regard to the challenges we are discussing today, I would like to speak a bit more about the Budapest Convention now.

The Budapest Convention provides for substantive law. You know, it defines what is a criminal offense in a country. It provides for procedural powers to allow law enforcement to investigate, search, seize computer systems and data, intercept communications, all the bad things you have just been mentioning.

But the objective is to promote human rights and the rule of law.

When you look at the substantive criminal law provisions, the types of conduct to be made a criminal offense, there are all sorts of safeguards there to prevent over-criminalization.

For example, with regard to interception, it clearly says that interception should be limited to serious offenses, not applied to just any offense. You have similar provisions and safeguards throughout a number of other substantive criminal law provisions.

However, conditions and safeguards limiting law enforcement powers are, of course, specifically designed to address the procedural law powers, the investigative tools of the treaty.

The investigative tools -- I mentioned some of them -- include the expedited preservation of data, search and seizure, production orders, real-time collection of traffic data, and interception of content data.

The Budapest Convention does not contain a data retention provision, which means service providers are not asked under the Budapest Convention to preemptively retain data.

It's expedited data preservation, for specified traffic or content data, and it's specific: there is a specific situation, a specific investigation, and specific data that have to be preserved under the Budapest Convention.

Now, with regard to procedural powers, that's clear. But with regard to the conditions and safeguards limiting such powers, it's very difficult to define the details in an international treaty.

An international treaty is always a minimum. It's the consensus reached between those that negotiate a treaty, so you have a minimum of measures there. The conditions and safeguards are very complicated and I'll explain why in a second.

However, the Budapest Convention contains Article 15. Katitza writes a lot about Article 15 -- Katitza, maybe you'll listen now, and maybe then you'll write differently in the future, if you hopefully understand it a bit better -- sorry, I will not get into polemics here.

Article 15 of the Budapest Convention establishes some general principles with regard to safeguards and conditions and also makes reference to international human rights standards. So if a country is a party to an international human rights treaty -- for European countries, that's the European Convention on Human Rights with all its protocols; for other countries that may be the 1966 International Covenant on Civil and Political Rights, among others -- then you also implement the obligations you have under such international agreements and apply them with regard to investigative measures.

So as I said, it's impossible to define all the details in an international treaty, but there are some principles that we can identify in Article 15, in the preamble, and in some of the other provisions of the Budapest Convention.

A general principle is: There shall be no punishment without a law, right? You cannot be punished for something that is not defined as a criminal offense in the law.

Therefore, it is important that through the Budapest Convention, countries clearly define the conduct that is to be made a criminal offense, so that we avoid the situations we have in many countries where investigations take place into something that is not even a criminal offense.

The second thing, the second general principle, is, everybody has the right to a fair trial, including the presumption of innocence.

Theoretically, we all believe in that.

The third general principle is that any interference in the rights of an individual can only be done in accordance with the law and is necessary in the public interest -- and public interest includes crime prevention -- or the protection of the rights of others.

This means that investigative measures -- in particular if they entail intrusion into individuals' rights -- must be prescribed clearly by law. Therefore, it is important that countries implement the procedural law provisions of the Budapest Convention very clearly, because then you will see that law enforcement is able to act based on law and not arbitrarily.

The rule of law, by the way, in short, means avoiding arbitrariness in government action.

Another general principle is that anyone whose rights are violated must have the right to an effective remedy.

So if you think your rights have been violated, you need to be able to go to a court and go through different court systems and have the right to appeal and everything.

So the right to an effective remedy, very important.

The court system, the judicial system, also points at another general principle: that governments need to put in place a framework that allows the different interests that are to be protected to be reconciled.

Now, there are some specific provisions, there are some more specific provisions in the Budapest Convention as regards the procedural law powers.

One of them is the principle of proportionality, which means that the power or the procedure must be proportional to the nature and circumstances of the offense.

If you have a minor offense, a judge will hopefully not agree that your communications be intercepted. If it's a serious offense, a judge may decide that, yes, it is proportional, and police should be able to intercept the communication. There must be judicial or other independent supervision, which means that before law enforcement searches or seizes a computer system or takes any of the other measures, somebody has to approve that. Depending on the level of intrusion, you need an investigative judge or a prosecutor, but for a more intrusive measure, you need a judge to authorize it.

There are some other principles that are more difficult to apply. Namely, you need to have clear grounds when applying a certain power, and there must be a limitation on the scope or duration, which means you cannot get an open-ended interception warrant. There must be a time limit. There must also be a clear limitation on what you can and cannot search on a computer system, for example.

And then there is the principle of reasonableness, a very difficult one.

That means that powers and procedures must be reasonable, considering their impact on the rights and responsibilities of third parties, and that they should limit the burden on the interests of third parties. Here the Budapest Convention also makes reference to the interests of Internet service providers.

As I said, the details of all of this are left to domestic legislation. The Council of Europe is not in a position, and the Budapest Convention cannot prescribe, what should be done in detail in your countries, or how a judge should interpret whether something is proportional or not.

If you're unhappy with the laws in your country, hopefully your country is democratic enough that you can challenge those laws, or adopt new and better laws, through democratically elected parliaments.

For us at the Council of Europe, we have to ask ourselves a question. In the 47 member states of the Council of Europe, we have 205 treaties, I believe. Many of them deal with criminal law measures. Many of them deal with human rights issues.

And member states of the Council of Europe are also subject to the European Court of Human Rights. If the rights of an individual in Europe are violated and you don't get justice in your domestic court system, eventually you can also go to the European Court of Human Rights. Very important.

The question is, when you work outside Europe, how does that apply?

Now, we have a choice. We can say we will wait for the perfect human rights situation in a country before we start working with that country on cybercrime. That's one option. It's the easy one.

The other option is to say we engage with countries. We see that countries are prepared to work with us. We work with countries on criminal legislation, but also --

Oh, I have more time than I thought. Thank you.

>>KATITZA RODRIGUEZ: Yes.

>>ALEXANDER SEGER: We can engage with countries. We can help them establish proper law to define what is criminal conduct and what is not. We can engage with countries to define procedural law measures. But we can also engage, and we should engage, with countries on Article 15 questions, to see how conditions and safeguards can be put in place, and hopefully also improve the rule of law and human rights situation in third countries.

We cannot, in that way, change everything in the country. That's very clear. We cannot be that ambitious.

But we can engage in a dialogue, and that's what we are trying to do in order to help countries take measures against cybercrime, but also improve human rights and the rule of law in any country. Thank you.

>>KATITZA RODRIGUEZ: Thank you. Since I think I have been directly pointed out in the previous speech, I think I have the right to reply quickly.

I just would make a few -- one remark and then we'll open the floor for discussion, because I think everyone is eager to start talking.

Regarding the Cybercrime Convention, at EFF we have a concern about some of the (indiscernible) criminal provisions that are there, and I will just point out one concrete example.

For instance, many security researchers need to access computer systems without authorization, especially when they are doing security research on, for instance, e-voting machines. When there are not enough exceptions under which you can do that legally, many security researchers are not able to publish their results. There was one case, for instance, involving an e-voting machine in India, where a security researcher was criminalized for disclosing the security failures of the voting machine. So we are concerned that in this part of the law, legitimate activities can be criminalized because it's not clear.

But our concern is greater when the treaty is promoted outside the European Union. And I'm not talking about authoritarian regimes; I'm talking about democratically elected governments. There are many countries -- take my own country: I am from Peru, in Latin America -- where we sometimes don't have a solid judiciary or separation of powers, and there are many cases of illegal wiretapping. When you promote a treaty with such overbroad powers and just one article of three paragraphs on legal safeguards -- whose implementation is left to the national level, it's true -- it gets implemented in very different ways.

Many of those countries also don't have data protection laws like those in the European Union. They have the Cybercrime Convention, and Convention 108 has been promoted alongside it, but countries are not adopting those laws. Some of them are only now adopting them, and in many cases those laws do not apply to the law enforcement area.

So there are a lot of challenges when you promote those rules outside Europe, in countries where the systems are completely different and the legal safeguards in place are different.

And so, as civil society tries to educate the general public and governments, we try to highlight those challenges -- more than mistakes -- the overbroad powers and the lack of legal safeguards that can infringe on our rights. We are focused on this narrow question of how to keep our rights safe and not have them undermined by a bad implementation of a treaty that is being adopted by countries that do not share the legal culture of those who promoted the convention.

And there are more things. I see hands on both sides of the room, so I will give the floor first to Chris Soghoian, and then to the gentleman there -- I don't know your name, sorry.

First is Chris Soghoian. Then we open the floor for discussion.

>>CHRISTOPHER SOGHOIAN: So I just wanted to respond to a couple things that were said by some of the other speakers.

So Alexander said some interesting things, in particular, that if you're unhappy with your own country's laws, you have the opportunity to change them.

And that may be true. So let's just assume for a moment that you have the ability to control, via -- you know, via the ballot -- or via your elections, your own country's surveillance laws. And let's say you even believe that your own country's surveillance laws are just and proportional and that they will not be abused and that your police and intelligence agencies will do the right thing.

Well, your data is in the hands, in more cases than not, with U.S. companies, right?

And so not only do your country's police and intelligence agencies have access to your data, but my country's police and intelligence agencies have access to your data. And my country tortures people. They hold people in jail for long periods of time without prosecuting or charging them. We have the death penalty, yes. We spy on our own citizens in large numbers. Millions of U.S. citizens have been spied on. And our laws specifically state that foreigners don't have a right to privacy, right?

We have the Foreign Intelligence Surveillance Act, which allows warrants to be issued to spy on foreigners -- on politicians, on political groups, and on other foreign persons.

You, as non-U.S. persons, don't have a right to privacy in the U.S., and so you're extremely foolish to give your data to companies in the U.S., even if your data is held in Europe, right?

If you put your data in Google's cloud and it's in a Google datacenter in Finland or in Switzerland, Google's U.S. employees can and will be forced to reach into those datacenters, pull your data out, and give it to the NSA, right?

And I think -- you know, I, at least, have some protections under the law in the U.S., but you don't.

And so it's insane that people would give their data to U.S. companies, given what we know about what the U.S. Government does. Think about what we don't even know that they're doing; we know that there have been abuses.

Alexander also said at the beginning of his presentation that he was more worried about cybercrime than about these abuses of surveillance powers, and it's interesting that some of the same techniques that can protect you against cybercrime also protect you from surveillance powers. A great example of this is SSL by default, right?

So Google, in fact, is brilliant in this particular area: the company is using SSL encryption by default for Gmail. Vint said that they offer it as an opt-in for search, which is true, but no one is using it. But in the area of e-mail and Google Docs, they have done the right thing and protected their customers by default.

I do try and say nice things when companies deserve it.

Now, unfortunately Facebook and Twitter and Microsoft and Yahoo! haven't followed Google's lead, and we're seeing that governments around the world are not happy about the fact that Google has deployed SSL by default. And in fact, Google was the target of a very sophisticated man-in-the-middle attack performed in August by the Iranian government. 300,000 Iranian users' e-mail communications were intercepted to get around the encryption that Google is using.

But the encryption that protects you from the Iranian government also protects you from the hacker sitting next to you at Starbucks who is using Firesheep.

And so if the Council of Europe, and if law enforcement agencies and governments, are really concerned about protecting consumers against cybercrime, we would see them pushing for SSL by default.

We would see them pushing for encryption on hard disks in operating systems by default, so when your laptop is stolen, your data doesn't go out the door with it.

We would see them pushing for automatic security updates. I suspect that everyone in this room who has a smartphone is running vulnerable software, because the operators and operating system vendors like Google are not giving consumers timely security updates. On the desktop it's fine, but on mobile things suck. And we don't have governments pressuring these companies.

So if we do care about cybersecurity and cybercrime, we would be seeing governments pushing for real security instead of just expanding their powers. Thank you.
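(As an illustration for the reader of the "SSL by default" practice Soghoian advocates: the server-side half of it is simply refusing to serve anything over plain HTTP, redirecting every request to HTTPS, and setting an HSTS header so browsers remember to use TLS. A minimal sketch in Python; the hostname is invented for illustration.)

```python
# Minimal sketch of "SSL by default": an HTTP endpoint that never serves
# content in the clear. It redirects every request to HTTPS and sends a
# Strict-Transport-Security header so the browser uses TLS from then on.
# The hostname "mail.example.com" is a placeholder, not a real service.
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectToHTTPS(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)  # permanent redirect to the TLS endpoint
        self.send_header("Location", "https://mail.example.com" + self.path)
        # Tell the browser to use HTTPS for the next year without asking.
        self.send_header("Strict-Transport-Security", "max-age=31536000")
        self.end_headers()

    def log_message(self, fmt, *args):
        pass  # keep the sketch quiet
```

A real deployment would of course also need a properly configured TLS endpoint to redirect to; the sketch only shows the redirect-and-HSTS side.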

>>KATITZA RODRIGUEZ: Let me open the floor for discussion. Please, yes.

>>ERIC JOYCE: Thank you very much. My name is Eric Joyce. I'm a member of Parliament in the U.K. I'm just -- after that, I'm just scared. I'm scared to go outside this room. Something terrible is going to happen. I've been putting stuff on the cloud and something terrible is going to happen, surely.

I mean, it seems to me that, as Vint said, it's certainly true that we expose ourselves to a degree of risk and an erosion of our own privacy through the choices we make, but one has to be cautious about sinking into what, to be perfectly frank, is a mad, hysterical perspective on the world.

There was one thing you said earlier, and I've forgotten your name but it was a very interesting presentation. But you mentioned something about Google willy-nilly handing over inboxes and so forth for 25 bucks, and other companies doing it for nothing. Surely that's simply an admin charge where a court order has been presented and in fact all those companies are doing is complying with the law.

And one has to be a little bit cautious. I'm a fairly skeptical MP and, you know, we do get the chance to legislate from time to time. We do live in a democracy. At one point you said, even if you think that the laws of your land are all (indiscernible) -- even so, we shouldn't necessarily think that. I mean, there's an element of common sense that has to be applied alongside healthy, skeptical behavior towards Internet companies. But I think one has to be very careful about sinking into hysteria.

>>CHRISTOPHER SOGHOIAN: But you live in the U.K., which has criminalized encryption, right? In the U.K., if you do not hand over your encryption keys when the police ask you to, you go to jail. So your laws are not just, and they're not reasonable in this area.

>>KATITZA RODRIGUEZ: Sorry. I need to moderate.

I'm going to the next speaker next to him.

>> Thank you. Thank you very much. I mean, you have a conspiracy theory about the U.K. but we're used to being misrepresented.

For instance, in this place, somebody has said on a couple of occasions, I gather, that the U.K. tried to close down social networks during the course of the riots in August.

Now, firstly, the riots were not political protests, they were merely theft and breaking into places. And secondly, the U.K. didn't try to close down social networks.

I sit on the Home Affairs Select Committee, and we asked about the issues, and we asked whether the police had even thought about closing down social networks. And the answer was, "No. We thought about it for about a minute and said that would be crazy. You can't do that."

And what happened instead was that a lot of the use of social networks was quite creative. For instance, there was one piece of communication suggesting to people that they should gather in a particular place to break into a particular group of shops, and the Twitter response was the police saying, "Yeah, that's great. We'll be there too."

In other words, engagement with what is a place of social intercourse is something that law enforcement needs to get involved in and embrace, rather than thinking that they can control and close things down.

So I just wanted to put that point on the record.

One of the issues about the information available on Google, if I can point this to Vint Cerf, is the question of the quality of information. And I think there's a reputational issue coming up for Google now: if you seek information, do you actually get the right information, the core information?

I mean, just to take a very simple example, during the course of this conference, there was a lot of reference in one of the workshops yesterday to the consultation by the Council of Europe. There were also references to the things that Neelie Kroes is proposing in relation to Europe. I tried Googling both of those. In both cases, the search did not take you to the core document; you went to what somebody had commented on, or to a news report. Surely there is a need for reputational dependability, to be able to take people to the core document in relation to any particular issue, and if Google isn't doing that, it seems to me that the standard of information being given to people is declining.

>>KATITZA RODRIGUEZ: Thank you. We should remember, too, that in the U.K., the minister can authorize searches where they are related to foreign, non-U.K. citizens. So it's just a minister, and there is not a higher threshold when it's related to anti-terrorism and intelligence. So it's similar to the U.S. on that front.

We could go maybe to -- I am talking about the interception powers for foreign investigations, which have historically been vested in the executive; under its anti-terrorism legislation, the Home Secretary may authorize interception of foreign communications. That's what I'm referring to.

>> Yeah, but the Home Secretary -- I think you've got a problem. It may be a linguistic problem, but if the starting point for your view is "government bad" -- sorry, then we have got something that is nothing to do with the governance of the Internet. It's to do with basic human rights in different countries. And one of the problems, it seems to me, is that legislation and rules need to be neutral as to technology. Where it's a question of protecting human rights, or where it's a question of regulating behavior, that behavior may be on the Internet or it may be elsewhere. We mustn't get muddled, or have political polemic -- which is what this sounds like -- getting in the way of an intelligent discussion about the governance of the Internet.

>>KATITZA RODRIGUEZ: This is very relevant in relation to cloud computing, because as citizens our data is now in the cloud, moving between different locations. And when we learn that legal regimes can ask for data under lower standards for non-nationals, it's a concern that we can discuss. It's not about one country or another; it's just an example that I was pointing out.

But I would like to continue the discussion. Thank you very much, sir; you are very kind. And we would like to pass the discussion to -- Oh, okay.

>>VINT CERF: There. Thank you.

I'm sorry. I would like to respond to your request about the quality of the search response. This is a long story. I will keep it very short.

This is hard. The mechanism that we currently use involves how many things are pointing to the thing and so on. And so it's not so simple to figure out what's good quality and what isn't. We try really hard. If you have some ideas, come and visit us in Mountain View.

>>KATITZA RODRIGUEZ: Thank you.

There are two -- Yes. You can speak and then the guy in front of him.

>>SABASTIANO RWENGABO: Thank you. Sabastiano Rwengabo from the Singapore Internet Research Center. I think this question is a human rights concern, but I also have some observations to share.

We know that states violate our privacy, they access our personal data, but if we compare the extent of these violations to the violations done by cyber criminals, then we are possibly in a dilemma.

I would think that the (indiscernible) of privacy and anonymity is not done by governments alone. Criminals are the first violators of such privacy and anonymity, so we need international instruments, such as the Budapest Convention referred to; we need governments' commitment to respect the principles and provisions of such instruments; we need national legislative frameworks and laws that ensure justice and good practices. But above all, we need law enforcement authorities with the capacity to transparently and legitimately fight cybercrime and promote cybersecurity, at the same time without violating our rights and our privacy. These are the concerns.

Therefore, we seem to be operating in a contested space, a space contested between cyber criminals and states' law enforcement agencies. And I think we need to understand the balance between these two contestations or contestants. I thank you.

>>KATITZA RODRIGUEZ: Thank you.

And the guy who is in the back, please.

>> Hi. I would just like to point out two conceptual errors. I am nit-picking, so please don't anyone take it personally.

First, I heard from Mr. Cerf that Google doesn't share information with third parties. I'm not pointing to anything that scares me, but I have a Google Analytics account. If you visit my Web site, I can see what key words you searched to get to that Web site, and I can see what browser you are running. So I'm a third party there. Google is sharing non-personal information with me. And Google Analytics users can, in turn, share that information. I don't have a problem with it; I am just making a conceptual point.

Another conceptual point. The British gentleman in the pink shirt said that the recent protests or riots, whatever you want to call them, in Britain were not political. I think that, "A," it is conceptually wrong, and he should possibly open an encyclopedia of political science and find out what this thing called politics is. And I think, secondly, it's quite disrespectful to the concerns of your people.

Thank you.

>>KATITZA RODRIGUEZ: Thank you.

I need to move -- Wait a minute. Yes, please.

>>HENNING MORTENSEN: Thank you very much. Henning Mortensen, Danish Industries.

From the panel's first round of interventions, I think you had a rather pessimistic view of technologies. I think all of you more or less mentioned that they would undermine privacy. And there also seems to be an underlying assumption among all of you that citizens are always identified when their data is processed, for instance for profiling.

I think that technologies can also be positive and be used to improve privacy.

We have new concepts like privacy by design. We have a group of technologies called privacy-enhancing technologies, including, for instance, pseudonyms. So I would like to ask the panel: what do you think of these new methods and technologies as a possible way to improve citizens' privacy?
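(To make the "pseudonyms" example concrete for the reader: one common privacy-enhancing technique is to replace a stable identifier with a keyed hash, so records remain linkable to each other but cannot be tied back to the person without the secret key. A sketch; the key and e-mail addresses are invented for illustration.)

```python
# Sketch of pseudonymization, one of the privacy-enhancing technologies
# mentioned: replace a stable identifier with an HMAC so records can be
# linked to each other but not back to the person without the secret key.
# The key below is a placeholder; a real deployment would keep it secret
# and manage its rotation carefully.
import hmac
import hashlib

SECRET_KEY = b"placeholder-key-held-by-data-controller"

def pseudonym(identifier: str) -> str:
    """Return a short, stable pseudonym for the given identifier."""
    mac = hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256)
    return mac.hexdigest()[:16]
```

Because the mapping is keyed, whoever holds the key can still re-identify a record if a legal process requires it; destroying the key turns the pseudonyms into effectively anonymous tokens.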

>>KATITZA RODRIGUEZ: Yes, please. And then Vint Cerf.

>>VINT CERF: So, first of all, I am in agreement with you that technology should be our friend here in addition to being a problem, and that's why I'm a big fan and was very much in favor of using SSL and other things to limit visibility of material that's going through the network.

I think that the biggest problem I can see right now is that, just in the case of using cryptography, it's not very convenient. It's not easy to use. We have to work hard to make that a lot simpler.

We also have to give people more tools to limit what happens to their information. And I think this is still -- once again, this is a user interface problem. If we interact with thousands of people, having dials and switches, one for each of the thousands of people we interact with is probably not workable.

So one of the big challenges I see is one of scaling: figuring out how to characterize the information that we want to share and with whom we wish to share it.

A small example: in Google Docs, you can list by Gmail identifier who is allowed to see a document, and we enforce that.
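(The sharing model Vint describes is, at its core, an access-control list keyed by account identifier. A toy sketch of the idea follows; it is not Google's actual implementation, and all names and addresses are invented.)

```python
# Toy access-control list keyed by account identifier, in the spirit of
# the document-sharing model described above. Not Google's real code;
# all names and addresses here are invented for illustration.
class SharedDocument:
    def __init__(self, owner: str, content: str):
        self.owner = owner
        self.content = content
        self.readers = {owner}  # the owner can always read

    def share_with(self, requester: str, reader: str) -> None:
        # Only the owner may extend the access list.
        if requester != self.owner:
            raise PermissionError("only the owner may share")
        self.readers.add(reader)

    def read(self, requester: str) -> str:
        # Every read is checked against the access list.
        if requester not in self.readers:
            raise PermissionError("access denied")
        return self.content

doc = SharedDocument("alice@example.com", "meeting notes")
doc.share_with("alice@example.com", "bob@example.com")
```

The point of the sketch is that enforcement happens at every read, not just at share time: an identifier absent from the list gets nothing.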

>>CHRISTOPHER SOGHOIAN: So I would like to push back a little bit on what Vint said with regard to this being a usability problem, this being one of user interfaces. I think the problem is actually about defaults.

So prior to January of 2010, Google had always offered SSL as an option, but it was hidden in the configuration menus. And when Google decided to make SSL the default, every single Google mail user received that degree of protection without having to do anything. They didn't have to go into a menu. They didn't have to tweak any settings.

Companies can protect their customers by default, if they choose to do so. Now, it's true that if you log into Facebook there are hundreds of buttons and dials to change your privacy settings, but the system is stacked against you. The default with Facebook is to make your information public, which benefits the company, and they offer a myriad of confusing settings that you can use to claw back some of your privacy but that almost no one actually uses.

So the gentleman is correct that there are privacy enhancing technologies but as I said in my opening remarks, these are fundamentally in conflict with the business models of many companies.

If you are paying a company for a service, then maybe they will deploy some more privacy enhancing technologies, but when the company is monetizing your data, to provide you with a free and useful service, it's going to be really difficult for them to justify not saving any data by default or deleting IP addresses the minute they come in the door. Those are going to be tough decisions to get past the marketing team and other teams within the company.
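(One concrete form of the data minimization mentioned here -- "deleting IP addresses the minute they come in the door" -- is truncation: zeroing the host bits of an address before it is ever logged, so stored records no longer identify a single machine. A sketch of that idea:)

```python
# Sketch of IP-address truncation as a data-minimization default:
# zero the last octet of an IPv4 address before it reaches any log,
# keeping only the /24 network prefix. Stored records then point to a
# neighborhood of hosts rather than one machine.
import ipaddress

def anonymize_ipv4(addr: str) -> str:
    ip = ipaddress.IPv4Address(addr)
    # Keep the /24 network prefix, drop the host part.
    truncated = ipaddress.IPv4Address(int(ip) & 0xFFFFFF00)
    return str(truncated)
```

This is the kind of default the speaker suggests is hard to sell internally when the business model depends on the full data.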

>> Thank you very much. Christopher, I fully agree with what you were saying about the importance of having privacy by default. Encryption is today something where you can have full hard drive encryption by pressing one button. So I don't necessarily agree with what Vint said.

But what I think we need to do is balance two things. On the one hand, we need privacy; on the other, we need to be able to investigate in cases where society believes it is important to do so.

And this needs to be done within a global discussion. We cannot discuss this only at a national level.

The U.K. can discuss this on the national level, and we have seen what has come out of it: the Regulation of Investigatory Powers Act, which, as you just mentioned, gives the authorities the power to ask you to disclose your password if you are the suspect in an investigation. In most countries in the world that would be a violation of fundamental principles of human rights: if you are a suspect, you don't have to participate in your own conviction.

However, we don't need to discuss this just in the U.K., or in the Council of Europe, or in Europe. This needs to be a global debate. And we need to find the right tools for protecting privacy as well as the right investigative instruments, and that is something that needs to be updated. We have the representative of the Council of Europe in the room. The Council of Europe convention addresses neither the issue of anonymous communication, which can be a benefit as well as a disadvantage, nor encryption technology. So we see countries popping up with their own solutions. The same goes for interception of communication: no safeguards are clearly defined.

So what we need is we really need to speak about what do we accept as a society with regard to interference with privacy.

>>KATITZA RODRIGUEZ: Thank you, Cynthia.

>>CYNTHIA WONG: Hi. Cynthia Wong with the Center for Democracy and Technology. For those of you not familiar with CDT, we are a Washington, D.C.-based NGO that works on civil liberties on the Internet. And I just wanted to second that last intervention very strongly. I think in many of these issues, finding the right way forward to protect privacy, on the one hand, and also to meet legitimate law enforcement needs on the other, these are cross-border issues. And I think we all need to push for stronger privacy protections in all countries and work together to do so, in order to make sure that the privacy protections that we had in the law previously, before many of our communications moved online, still have some meaning in the digital age.

Just to highlight one effort that we're doing in the U.S.: CDT, EFF, and Google are part of a coalition called the Digital Due Process coalition, and the goal of our coalition is to update U.S. communications privacy law to ensure that privacy protections still have meaning when we move all of our e-mail to the Web, when we move all of our social networking to the cloud, et cetera.

And so I just wanted to highlight that and really say that I think we all need to work together because these are very much cross-border issues.

>>KATITZA RODRIGUEZ: Okay. Please, those two ladies in the back.

>>KAREN REILLY: Karen Reilly from the Tor Project, a U.S. nonprofit organization that provides privacy, security and anonymity to many sectors of society.

We talk to law enforcement and groups of people who need security to deal with issues that are both the responsibility of law enforcement and NGOs. It's important to note that we're not just talking about citizens versus governments here. We are talking about citizens versus fellow citizens.

One example: survivors of domestic abuse, who are often isolated from society, have found good support systems online. But technology that gives governments access to the information of private citizens can also be exploited by criminals.

The Internet is a tool for forming support networks, but stalkers know this. They target phones, Internet connections, e-mail accounts, and the systems of survivor support organizations.

So I ask anyone seeking to expand the power of the state to first ask themselves whether such a law will have a negative impact on vulnerable populations, especially when law enforcement agencies are not empowered or, frankly, not inclined to help these citizens.

>>KATITZA RODRIGUEZ: Thank you.

Yes, please.

>> Thank you. My name is Samu Ma (phonetic) from Kenya. I also teach electronic business.

I think one aspect that has been left out of all the discussion is education. Having users who are informed and who know how to use all these technologies will go a long way toward solving some of these problems. Whatever systems we come up with, somebody will always invent something better.

So let's educate our users. Let's engage them, and I think that will go a long way in helping resolve some of these issues.

Thank you.

>> My name is Francis. I am a project manager at Assistance Limited. We are deeply concerned about the rate at which pornographic material is spreading on the Internet, and we are worried about the little kids who go on the Internet every now and then.

What can be done about it? I am happy that we are now discussing Internet crime, and I believe that those who post this kind of material on the Internet are, to my mind, committing a crime. How can they be stopped, so that our kids can grow up in a better, healthier way and forget about this kind of rubbish, which distracts them from their education?

>>KATITZA RODRIGUEZ: Thank you. We will take one more question from you, because we are running out of time now, and then we will have a quick wrap-up.

>> My name is John Carribio (phonetic). I am from Brazil, (indiscernible) to freedom on the Internet.

My question is for the European Commission, about the Budapest Convention. That convention was developed under the shadow of September 11, when the world's population was terrified by the attacks. In Brazil, we are fighting against a bill based on the Budapest Convention which would install a surveillance (indiscernible) state, affecting citizens' rights and citizens' privacy. I understand the convention was developed ten years ago, which in Internet time represents 70 years. Why has this convention not been revised and adapted to address the crimes without affecting our privacy rights, as I told the gentleman before me? That's the question.

>>KATITZA RODRIGUEZ: We are running out of time, so we will have a final intervention from each of the speakers.

>>ALEXANDER SEGER: Should I respond to this question first? Between 9/11 and the opening for signature of the Budapest Convention, there were exactly 14 days. You cannot draft a convention like that in 14 days.

The convention had 14 years of preparatory work. It is pure coincidence that it was opened for signature at that moment. The decision had been made long before, and I don't think the Council of Europe was privy to the terrorists' plans for exactly when to strike.

With regard to Brazil: Brazil has been working on a cybercrime law since 1997, and it is still not there. If I were Brazilian, I would be very concerned about that. We can talk in detail after this meeting. I would really like to know from you where you see the Budapest Convention intruding on your privacy rights. I don't see it, but let's talk after this meeting.

And since -- Can I use this for my final word? Sorry to hijack the thing.

Okay. The point is that the comment made about the reform of ECPA in the United States is, I think, an illustration of a process in a democratic society, and I hope this will lead to something positive in the future.

Indeed, I don't feel safe if my data are stored in the United States. And, Vint, you will be very disappointed, I am not using Google for that reason.

And with that I come to another argument. It's not just a question of the ballot box deciding whether laws are adopted. If cloud services in the United States are not privacy-proof, the cloud services will move to other countries where there is better privacy protection. So eventually there will also be an economic argument, and hopefully the United States of America will, in the not-too-distant future, also become a party to Data Protection Convention 108. The United States is already a party to the Budapest Convention, but we would like it to be a party to Convention 108 as well.

Thank you.

>>KATITZA RODRIGUEZ: We go this way.

>>KATARZYNA SZYMIELEWICZ: Thank you. I would love to respond to the questions, but there is no time, so just two quick points referring back to ideas I heard from the audience. One was the trust in technology, especially privacy by design. The other was trust in our own governments to do good things for us and protect our rights.

Well, this is the same problem with both of these.

I mean, we cannot really entrust our privacy to entities that have a vested interest in infringing that privacy. I am not saying that every company or every government will do that. But in both cases there is the potential for abuse of our personal information. So whether we use technology to protect ourselves, which certainly can be the case, or we want our governments to behave well, unfortunately we need certain safeguards, checks and balances. Democracy is designed in a way that provides such checks and balances. Now we need to think about other safeguards for companies in international settings.

So it's not good news for us, but unfortunately we have to develop international principles, which is extremely difficult but that's why we meet here.

So I would not like us to leave here simply trusting technology in itself or government in itself.

Thank you.

>>VINT CERF: Very briefly, we have a social contract, I think. And that is that we form governments in order to make ourselves safer in our societies. And we have to give up some of our freedoms in order to do that.

Plainly, we don't want to give up more than is necessary. And so finding a way to protect people from harm and, at the same time, protecting their privacy has to be balanced in some fashion.

Technology may have a role to play, but so do the legal systems.

We at Google are very conscious of that. And I hope that we will be able to find a way forward, as was mentioned, with the modification of ECPA and other constraints on the use of these technologies.

>>AMR GHARBEIA: I would like to quickly respond to the point that technology hinders or undermines privacy and freedom. It depends on how you look at it. Technology can also create freedom and create privacy for you. So technology is neutral; it depends on how you use it.

The discussion has generally been going in the direction that a democratic society would, with due process, protect people's privacy and the anonymity of their data. But even with that measure, let's remind ourselves that the majority of the population of this planet does not live under democratic regimes. And even inside the democratic realm, let us remind ourselves that, for example, in 2006, according to Amnesty International U.K. director Kate Allen, Microsoft handed over the details of Mordechai Vanunu's Hotmail account on the allegation that he was being investigated for espionage. Mordechai Vanunu is a pacifist and an anti-nuclear activist. This happened before a court order had been obtained.

So which side of the democratic line you are on is not necessarily a guarantee.

>>CHRISTOPHER SOGHOIAN: So I have some bad news for those of you in the audience: the bill the Digital Due Process Coalition has been supporting and the reform it has been pushing are about reforming law enforcement access to data, not intelligence agencies' access to data. This is largely about protecting U.S. persons from U.S. law enforcement.

So for the gentleman from the Council of Europe: he is, by his very job, an agent of a foreign power, and it would be very trivial to get a FISA order to surveil you.

So whether or not the Digital Due Process Coalition is successful, your e-mails, if you ever decide to give them to Google, will be easy to access.

And Google likely receives lots of requests from the FISA court, and there is nothing it can do to say no. You cannot single out Google on this point; there is nothing any U.S. company can do to say no.

I live in Washington. I understand the political realities of D.C. You cannot expect U.S. politicians to fight for legislation to protect the privacy of foreigners. It is not going to happen. Politicians do not want to be accused of protecting the privacy of terrorists or foreign persons.

So as long as your data is accessible by a U.S. company, you are toast.

Thank you.

>>KATITZA RODRIGUEZ: Thank you, everybody. With this, we close the session.

Thank you.
