WS 308 PRIVACY AS INNOVATION: RETHINKING PRIVACY AS AN AREA OF OPPORTUNITY

FINISHED TRANSCRIPT

  

EIGHTH INTERNET GOVERNANCE FORUM

BALI

BUILDING BRIDGES-ENHANCING MULTI-STAKEHOLDER COOPERATION FOR GROWTH AND SUSTAINABLE DEVELOPMENT

24 OCTOBER 2013

09:00

WORKSHOP NUMBER 308

PRIVACY AS INNOVATION: RETHINKING PRIVACY AS AN AREA OF OPPORTUNITY

  

 

********

This text is being provided in a rough draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.

********

 

   >> GRY HASSELBALCH LAPENTA: Okay. Good morning. We are going to start now, because we are -- already the clock is ticking. I'm from the Danish Media Council for children and young people and representing the Insafe Network here.

   This is privacy, innovation, 308, in case anyone was wondering. We have a large panel. Ten people up here. I'll start by taking five minutes out of the debate to say a few words about the objective today, which you could say in a very simplistic way, it is to rebrand privacy, with the help from these young people here and here.

   And if you look at the history of online privacy, it has so far lived through three stages. An early stage, where anonymity was described as a unique opportunity to experiment with identity, to challenge -- under its protection, of course -- established forms of power, and to constitute new market models. And this description was of course also based on the perception of the Internet as a new free territory, where the usual social and market rules don't apply.

   Now that stage has been followed by a second stage, where we could claim that online privacy has in some sense gained a bad reputation, meaning that it's been blamed for a lot of things, from being a cover for illegal activities to being an obstacle to innovation, juxtaposed to everything that is social, public, and shared. And for that reason it has been contrasted with open innovation in big data, cloud services, social media and so on. And some, in this age of social sharing, have even declared privacy dead. I'm not mentioning any names here.

   And yet, under these changing cultural conditions, we have heard several vocal voices consistently reminding us why privacy, or individual empowerment, for both youth and adults has remained a basic human demand and a foundation for the spaces where creativity, free thinking, and indeed innovation can thrive.

   Now, in the past few years, and here we come to the main point of this discussion today, we see a third stage emerging, one where users, while they embrace online social media, still demand to set their own boundaries, to create circles of inclusion and exclusion. This is a very complex, deeply humanly rooted ability to set the context of our interactions.

   So privacy in this late stage is becoming a market demand, at least I think so. Because surveys from the last couple of years show that users increasingly need to trust the services they use, and that a lack of trust will actually affect their practices.

   And also, if you look at this year's reports about the rising popularity of services that, for example, have multimedia messages that disappear after receipt, like Snapchat, or social media services that promise privacy and confidentiality to users, such as, for example, Path, you can also see that a paradigm shift is starting in what privacy means today.

   So this -- and of course you can't have a privacy debate without mentioning Snowden -- including Snowden's revelation of blanket Internet surveillance, is creating a momentum for a new definition, maybe a different business model.

   So today's panel is about starting this discussion to talk about how we might rebrand privacy as something that may exist in an open, social public space, as an area of opportunity and innovation and a basis for new great inventions.

   So again, as I said, we have ten panelists, even more panelists on the floor. I think we also might have some remote participants, which Mikiah Gordon from the UK youth IGF is part of.

   We were told to keep the interventions short, so I won't go on now. I have a clock and I'm not afraid to use it.

   Just to introduce the panelists: we have representatives from the industry, one of them being Max Senges from Google.

   We have from Microsoft the chief online safety officer, Jacqueline Beauchere.

   We have a representative from academia, associate professor at the IT University Gitte Stald, who is also the person with whom I organised this workshop.

   And we have a representative from Governmental institutions, Claus Hjorth.

   And we have Malgorzata Steiner, who is head of the Department for analysis and public education in the Ministry of Administration and Digitization of Poland.

   Of course we have our five youth panelists. Jack and Matthew are from the Youth IGF project in the UK.

   We have Bastiaan Zwanenburg -- you are there, you were sitting too close -- from the Netherlands Youth IGF project.

   And we have Luis Ivan Cuende next to me, who is from Asturix and Holalabs. And we have the youth Ambassador over here.

   So I want to start the discussion by asking the youth, because I think you might be able to tell us about the trends, the future use, and maybe a different meaning of privacy. So how could we think about privacy and issues of control? What do you need? Is anyone jumping in to start?

   Luis, to you?

   >> LUIS IVAN CUENDE: Well, I think we have to get to the point in which privacy is part of our human condition and we need it. You know, since the recent scandal, we're starting to think about new kinds of business models for start-ups. So instead of having a centralized Facebook with a big database, and other companies and so on, I propose a new model where the service is peer-to-peer. Instead of giving away our data to companies that have, you know, centralized servers and a centralized situation, I propose a peer-to-peer Facebook. So I own my data, I have my data on my own computer, and I share it with the rest of the network. But I don't give away my information to a single company so that they can sell it.

   So from a technical perspective, it's easy to just, you know, code it, and we can do it. We have the resources and the tools to develop peer-to-peer applications, like a peer-to-peer Facebook. So why not? Let's do it. Let's build a peer-to-peer Facebook that respects our privacy as users, not a centralized company that goes through the whole network.

   So that's my point of view.

   Thanks.
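   [Editor's note: below is a minimal, hypothetical Python sketch of the peer-to-peer model Luis describes -- each user's data lives only on their own node and is shared per friend. The class and method names are illustrative, not any real protocol; a real system would exchange these requests over the network rather than by direct method calls.]

    # Each Peer holds its own data locally and decides, per peer,
    # what to share. No central server ever stores anyone's data.

    class Peer:
        def __init__(self, name):
            self.name = name
            self.data = {}   # my data lives only on my own node
            self.acl = {}    # which peers may read which keys

        def post(self, key, value, share_with=()):
            self.data[key] = value
            self.acl[key] = set(share_with)

        def request(self, other, key):
            # Ask another peer for an item; in a real network this
            # would be a message to their node, not a method call.
            return other._serve(requester=self.name, key=key)

        def _serve(self, requester, key):
            if requester in self.acl.get(key, ()):
                return self.data[key]
            return None  # not shared with this peer

    alice, bob, carol = Peer("alice"), Peer("bob"), Peer("carol")
    alice.post("status", "At the IGF in Bali", share_with=["bob"])

    print(bob.request(alice, "status"))    # "At the IGF in Bali"
    print(carol.request(alice, "status"))  # None: Alice never shared it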

   >> GRY HASSELBALCH LAPENTA: Thank you. Is anyone -- yes?

   >> MATTHEW JACKMANN: I'm Matthew from the Youth IGF. When I think of privacy, I think of two channels: the stuff I want to protect, and the practical element of how I do that, the security.

   You're right, I feel there is a huge demand for privacy and a demand to protect. And that shapes the way we use the Internet and how much we trust each site. And I think the trust element in the ISPs is massive. I trust Facebook because of their feature where they say "view as a friend," you know, increasing that trust, because I can see my profile from the outside perspective: what am I actually giving away?

   And there need to be more applications like that, on mobile devices especially, given how the Internet is being used now -- you know, apps that increase the trust in the ISPs. So apps, not just on Facebook, that allow you to view your profile from different aspects, from the complete stranger to the friend to yourself. And maybe an app that rates each site and then sends that to the ISP, so they see how well they are protecting each and every person.

   Well, privacy, the word itself, means something individual, free. It's about what I'm protecting, and I want to trust the site that I'm giving my information to.

   >> GRY HASSELBALCH LAPENTA: Thank you.

   >> JACK PASSMORE: I'm Jack Passmore from the Youth IGF project.

   For me, privacy means that we have to keep what we know to ourselves. So, protecting the data and the media that we have: as long as we keep it to ourselves and we're not sharing it, it shouldn't be a problem. There shouldn't be any problem that arises from it. I use Facebook and Twitter. Twitter is sort of my more day-to-day one, where I tell everyone what I'm doing. And Facebook is sort of more media sharing, photos and music and things.

   As for the privacy settings on there, on Facebook, mine are more lenient. I censor what I put online before I put it online, and then the settings do the rest on there. I don't need to share with everybody. As long as I have my friends and me online, that is all that matters.

   With regard to Twitter, there is not a lot of privacy on there. So I put only what is appropriate for all age groups and for everyone to see.

   I'm a bit of a gamer. And with regard to settings on that, I control them myself. So when I'm sort of chatting online during a game, as long as I'm not giving away my personal information -- my age, address, things like that -- then there is not an issue there.

   I only put data into sites that I trust. So, like, Amazon I'll give my banking details and my address. Online banking, they will need that.

   But yes, it's sort of personal censorship of my data, and then using the privacy settings accordingly on social media sites.

   >> GRY HASSELBALCH LAPENTA: Thank you.

   >> BASTIAAN ZWANENBURG: Good morning. I work at two companies. One is a telecom provider for young people and the second one is a shop where we sell sunglasses. And I feel that everyone sees privacy just as a threat. But I think it can also be an opportunity.

   And for example, with Willify, we use Facebook to show things to our users. With targeted ads, I can say I want to reach people from the Netherlands who are between 18 and 25 years old, and we advertise to all those people. And most of them liked the advertisement that they saw. So they saw an advertisement based on who they are, but they actually liked it. Some of them bought a pair of sunglasses, and they are happy with the pair of sunglasses. So it's also an opportunity, because it makes people happy.

   And it also offers me a way of, like, earning 200k in 7 months. It's an opportunity, and we shouldn't forget that. There is a down side to this. Sunglasses are harmless; it's not a harmful product. If it's a harmful product, like medicines, you shouldn't do that. But it can also be an opportunity, and we shouldn't forget that.
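   [Editor's note: a toy Python illustration of the demographic targeting described above. The data and field names are made up; real ad platforms apply such filters server-side.]

    # Select the audience the speaker describes: Netherlands, aged 18-25.
    users = [
        {"name": "u1", "country": "NL", "age": 22},
        {"name": "u2", "country": "NL", "age": 40},
        {"name": "u3", "country": "DE", "age": 20},
    ]
    audience = [u for u in users if u["country"] == "NL" and 18 <= u["age"] <= 25]
    print([u["name"] for u in audience])  # only u1 sees the sunglasses ad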

   >> GRY HASSELBALCH LAPENTA: Thank you.

   >> Hi everyone. My name is Gian and I'm an ambassador from Hong Kong. To me, privacy is my personal information: name, date of birth, address. And to me, privacy is something where I have a choice whether I want to share it with other people or not. And, admittedly, the youth today are the most common Internet users. So we're more prone to Internet threats, especially if we lack the knowledge to responsibly use the Internet.

   And the crucial thing is that we don't really know who to trust, like where we should share our personal data or not. So many privacy issues have been arising, like apps sucking up our data, or hacking. 25 percent of free apps, like, don't really have secure privacy policies. And because of that, I think the youth also have an increasing demand for better privacy and security settings.

   And I think it's crucial to have the choice to pick what we want. For example, when it comes to ads, I think we should have a choice about what kind of ads we want to see. Instead of, you know, them taking our search histories and predicting what kinds of ads to show us, I think it's necessary that what we see is what we want.

   That's all. Thank you.

   >> GRY HASSELBALCH LAPENTA: Thank you.

   The next panelist on the floor here is Gitte Stald from the IT University. And Gitte has worked for 15 years on youth trends in mobile and social media. Last time, you were the Danish coordinator of the online survey, which was looking at 18,000 young people all over -- 25,000? Oh, wow. And you are publishing new results from a European study that has gone mobile now.

   So can you see trends? Have you seen trends in behavior among youth? You told me, for example, that you see a trend of young people increasingly asking for control, and I think that's what we heard from the young people here as well. Can you elaborate on that?

   >> GITTE STALD: Yes. And thank you for mentioning that, because it's hard to come after the personal voices of the youth. But I've been studying and talking to a lot of young people. We have the youth databases and I've been doing comparative studies. I did the same study five times over ten years. So I have a sense of continuity and development.

   One of the main results we have is that it's very difficult to give one answer, to give one kind of picture of what the situation is. One thing that very much shapes experiences of, and attitudes towards, things such as privacy is, of course, age and gender and background. But it is also very much which country you come from, which part of the world, and which history and media your part of the world has, because awareness of issues such as privacy depends very much on experience. And we can hear that these young people base their attitudes towards privacy very much on their own personal experiences, at the exact moment in time when they experience these things, and in the context where they are. These young people are from Europe and from Hong Kong. But in other parts of the world, the history and the media would give a completely different picture.

   Another thing, of course, is the general attitudes -- I mean, the issues of trust and risk in society, which seem to be very important as well. In Denmark, for instance, people trust the system, because that's what we do. And that has an impact on how you trust the digital systems. In other parts of the world it's different. So that is one main thing.

   But the other important thing here are the issues of trust and risk that we heard about, and also the fact that young people, just like everyone else but perhaps especially, are somehow caught between this need for being constantly updated, constantly connected, constantly having access to information and all the different services that you have through digital media, and, on the other hand, being in control to a certain degree. And as far as we can see in our survey, it's not so much a matter of having the right applications or technologies and so on. Those are technical solutions that are kind of invisible sometimes, and if we ask about them, well, we're not impressed by the level at which young people use the settings and these opportunities.

It's much more about the feeling of being in control. The fact that you want to control who you talk to, in which situation, and who knows where you are at which times.

   For instance, in Denmark, very few young people use things like location-based services and so on, because they prefer actually still text messages, because it's so much easier to be in control of who you talk to.

   So it's very much about this personally based control rather than the technical solutions. So I suppose that, well, again, it's a very diverse picture. But it's a combination of these very complicated things: what do the systems and the technologies provide, how openly and visibly are they implemented in everyday life, what are the attitudes towards being in control, and how do young people act on that.

   And just one final comment. It's obvious that there is something going on over time -- for instance, the attitude that, well, it's also an opportunity that we don't have privacy, because the systems can provide you with all the offers you want, personalized information and so on. That's quite different from attitudes 20 years ago. So it's very obvious that attitudes also change over time. And I think that is part of the need for innovation in how we deal with these issues.

   >> GRY HASSELBALCH LAPENTA: Thank you, Gitte.

   So now I'll go to the industry participants from Microsoft and Google. It's clear now, when we hear both Gitte and the young people, that choice, a personalized sense of privacy, and control are important when you talk about privacy.

   Now, data is one type of thing, but apparently trust is another thing. And if you look at services, you can see that lately there has been an increase in, for example, people deselecting services: people are not using applications they don't trust. So I just want to hear your opinions about, first of all, how do you build strategies to build user trust? And also, how do you innovate in privacy at Microsoft and Google? I don't know who would like to go first.

   >> Ladies first.

   >> JACQUELINE BEAUCHERE: Thank you. And thank you to Gry for inviting us to participate in this panel today.

   I work at Microsoft, and my specific area of concentration is online safety. I would say that trust is a very emotional construct, and it takes a lot for us to build trust with each other. It probably takes more time to build trust with an organisation. It's a long established relationship, and certain attributes have to be in place. We need to feel safe, we need to feel secure -- or let's say safer and more secure. We want to build those more trusted online experiences.

   In terms of strategies from industry, I think there are some strategic considerations that we have to take into account from a variety of stakeholder groups. So we're talking about consumers and companies, organisations, even Governments.

   Individuals and organisations are asking important questions about how their privacy is being protected. And privacy, as many people have said so far, means very many different things to very many different people. I think we run the gamut from privacy doves to privacy hawks, and every flavor in between. So companies need to make sure that they're offering individuals the ability to be where they want to be on that scale, and to control what they distribute, what they put out there online, and what have you.

   Data collection and analysis, I think, are going to be core to innovation moving forward, and personal data increasingly has that economic and social value.

   Companies are also increasingly collecting, using, hosting, and sharing privacy-impacting data. It's core to their business models.

   And we're also seeing heightened scrutiny from regulators, researchers, media, and of course the public and others. So of course it comes back to a need for discussing the common good. And I think we need sort of a collective viewpoint on how we use data.

   Thank you.

   >> GRY HASSELBALCH LAPENTA: Thank you. Max?

   >> MAX SENGES: Thanks for inviting us, too. There have already been quite a number of interesting aspects; I hope to add a bit. I work in Google's Berlin office. As many of you know, Germany has a particular sensitivity and interest in good data usage. So, yes, we believe there is quite some privacy-relevant innovation coming out of Germany, and I'll go into that in a second.

   But let me start by putting out two stories or quotes. One is from Wilson, who says: "Everything that comes into the world when you're young seems natural and just the way it is. Everything that comes into the world after you're 30 seems somehow crazy, and we should better watch out that it's not destroying the world and the good way it used to be."

   And I think a lot of what we're seeing is that a lack of expertise always causes insecurity and, basically, in the end, calls for better control, and, you know, we have to gauge this. And therefore I think it's a really appropriate setup that you have chosen for this panel, to talk to the youth and to hear the way that they are using and really benefiting from the technology. If you look at danah boyd's work that you quoted in your Article, I think it's absolutely correct.

   You know, I remember when I got on the Net, I figured it out. I looked around, maybe I got a bloody nose or two, but this is the way you learn. You have to experiment, and then you find out the way things work for you. And boyd describes how kids are super aware of what they are sharing and how they are sharing. They do have a sense of privacy, even though they share a lot.

   So I think there are many misconceptions among well meaning parents and experts who try to, you know, basically exercise their parental control. And I'm not 100 percent sure that that is the right approach for finding solutions that are appropriate and normative in the sense of going forward.

   And that leads me to a number of dichotomies that I would observe when it comes to privacy and the way that we talk about it and try to get it right.

   The first one, and I think it's probably the most important one, is the conflation between privacy and security. In many aspects and debates, what I do as a private person -- what I want to keep to myself and my social contacts, et cetera -- is conflated with data theft, surveillance, and security breaches.

   The second one is between the traditionalists and the progressive thinkers. So, people who want to conserve the way it used to be, where nobody knew what I was doing -- why are you sharing all of this? -- a view where you basically want to conserve the past. And people who think, okay, there are all these new tools; how far can we benefit and get it right?

   The third one is between people who are Mazzian thinkers, who try to find solutions, and people who are just privacy professionals, basically hired guns who argue this way or the other and are not actually trying to solve something, but to make money or to build a career out of this.

   Now, a point that I think is really important to make in this context: it's absurd to think of Google as a data-collecting, privacy-destroying monger. Because what Google wants to do is deliver a good service for its users. That is the only reason we're out there: to make money by providing a good service and building trust with the user.

   We have more than 100 people at Google working on this stuff, doing user research, trying all kinds of controls and trying to figure out what works. Interestingly enough, we have put in all kinds of settings and people are not using them. You can have the most sophisticated privacy setting, but it may be too complicated for people to use. Privacy is one of those subjects where, when you ask people in a survey, they say it's really, really important; but when you ask what they are doing about it, nobody really does anything.

   So we have to look at the interest and willingness to go the extra mile to safeguard your privacy, if you want.

   It might show through that I have a philosophical approach to this -- I have a Ph.D. in philosophy. So I'll spare you most of the private versus intimate and private versus civic participation distinctions. But let me dwell on the last one a little: private versus civic participation. It's actually a civic duty to participate in the public sphere as a private person, in contrast to one's professional duty. When I'm sitting here with Google, I'll make a different argument. That is the distinction between privacy in the sense that I can act as a private person, and my professional participation in a discussion like this.

   So, just some food for thought for how we can construct this. And I'll give you a very short list, because you asked for that, of how Google is innovating in that space.

   We see it as an opportunity -- privacy made in Germany -- to test with some of the most sensitive people in the world on these subjects, to work there and work with the people around there. There are more than 30 people working on that in Germany.

   The dashboard -- many people don't know about this -- Google Dashboard. You can see the privacy settings for all the Google products that you are using, and you may have signed up for more Google products than you remember. So when you go on there, you'll see: oh, I never used that product for ten years. You can quit the account or you can change the settings the way you want.

   I think Circles in Google Plus is actually a very smart analogy that makes it easier for people to understand with whom they are sharing. It's not such a new thing -- there used to be lists in Facebook and other products -- but the way it's built, from a usability point of view, Google Plus is actually smarter, I think, than many other products.

   >> GRY HASSELBALCH LAPENTA: And, Max, I'll use my watch now. If you could --

   >> MAX SENGES: I have two more. Incognito in Chrome is something that I recommend for browsing without there being any record of what you are doing -- unless you do want there to be a record in order to get better services. And I'll leave it there.

   >> GRY HASSELBALCH LAPENTA: Thank you. Moving on from the industry perspective, it's also interesting to hear the policy perspective on this. Because -- you can feel this, and you hear it a lot at IGF -- you need innovation in privacy policies as well. It seems that we have a privacy self-management system; I'm not sure it's enough. But I don't know: how do we feel about privacy and data protection and Regulations? I was thinking myself that maybe you could build privacy policies into innovation policy, but I don't know.

   Should we do the ladies first again? Mrs. Steiner, why not?

   >> MALGORZATA STEINER: I work for the Polish Government and we are active on this topic right now. And we are active because we feel this is what our citizens expect from us. And of course we heard the voices of the youth participants here, so thank you to them.

   We know surveys that show people are increasingly worried about their privacy. As Max said, they might not yet put that into action, but increasingly they are actually taking into account the settings of the products and trying to adjust the privacy settings as much as they can.

   So we think that the awareness raising part is certainly crucial, and no policy will actually substitute for awareness raising. Because we do believe that you should get products that fit your personal preferences, not products that are unified by policies. So we think awareness raising is the most important part, and this is something that we should do together with companies.

   And next week we will start a coalition for privacy. It's going to include companies and public sector representatives, and they will be educating people about privacy as a positive concept. We are not interested in depicting privacy as a threat; we don't think that is a useful approach. We think it should be seen as an opportunity. But to do that, we need the cooperation of all the stakeholders.

   And here I come to the policy question. This cooperation is not only needed in coalitions about awareness raising. It is also needed in terms of finding ways to regulate, or to adjust our regulations to the current state of affairs. And I think it's a little bit courageous to talk about innovation in policy, because actually our policy needs to catch up. We need to be aware that Europe is actually considered to have a pretty high standard in terms of privacy. But the directive that regulates privacy -- which has been transposed into the laws of all Member States -- is from the year '95. And I believe Mark Zuckerberg was 9 or 11 when this directive was written. So it's been a while.

   And there is no way that this law is adapted to the current developments. I do not believe that law should preempt future developments, because leaving that room is the way to innovation. But I do think there is a need to adjust to the new context given to us by digitization and the potential uses of data in the digital world.

   So, currently, the EU is working on a new general Data Protection Regulation -- it's going to be a Regulation, not a directive. And this is going to be a basic piece of legislation that will, in my opinion, actually adapt the current framework to the digital reality.

   For example, by expanding the definition of "data." Previously, only a name, a surname, and other things that would lead to direct identification of a person were considered personal data. Right now you can easily identify a person based on two random pieces of information using the Web. So there is a need to expand the definition of what personal data is, because, simply, you need less data to find out who is who. So we think it needs expanding.
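   [Editor's note: an illustrative Python sketch of the re-identification point above -- two innocuous attributes can single a person out when datasets are joined. All data below is invented.]

    # An "anonymized" dataset with no names, and a public register.
    anonymized_health = [
        {"zip": "2100", "birth": "1984-05-14", "diagnosis": "asthma"},
        {"zip": "2200", "birth": "1991-11-02", "diagnosis": "diabetes"},
    ]
    public_register = [
        {"name": "A. Jensen", "zip": "2100", "birth": "1984-05-14"},
        {"name": "B. Larsen", "zip": "2200", "birth": "1991-11-02"},
    ]

    # Join on two "random" attributes: no name is needed to identify anyone.
    for record in anonymized_health:
        for person in public_register:
            if (person["zip"], person["birth"]) == (record["zip"], record["birth"]):
                print(person["name"], "->", record["diagnosis"])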

   Similarly with profiling. Of course it's a basis for many businesses to function, and we don't want to forbid profiling or make it too hard to be viable, but we do want to set some rules for it. Because we believe that people should be informed that they are being profiled and that their characteristics are used to give them a special kind of service, and they should have a choice. And we believe that profiling shouldn't lead to legal consequences for a person.

   So we feel that citizens don't want to live in a world where, for example, you can be prevented from getting credit because of the friends you have on Facebook, or because of the environment you come from, your family background or whatever.

   So this is something that is really important, and it is currently scarcely regulated. There is no way it could have been regulated well in '95.

   So these are examples of the changes that will come with this law. We don't believe it should stifle or inhibit innovation. That's why in Poland we work with private sector representatives and NGOs to figure out which elements of the Regulation are acceptable for all sides, and which would go too far. And it is really important for us to make sure that this Regulation doesn't go too far in either direction.

   So it will not take care of everything, and we still believe that user awareness will be the core of privacy in the future. But I think it will be helpful, and it will give us a framework for people to at least be fairly informed about what choices they are making.

   And I want to stress one more point. Some research shows that even right now, the market for data and for using data in digital services is worth something like $330 billion a year. This is an enormous amount. And for the public sector, too, using personal data and data in general is very useful. The Government needs it to improve our services. I'm not talking about spying, but to improve our services -- let's say health services for citizens -- we can very well use data. Data helps us give people personalized services. And of course we are also under pressure: if Google is improving its products all the time, citizens, when they come to the Government, are not expecting a 19th-century type of service. They learn from Google, from Microsoft, from other companies what the standard is, and they expect similar things from the Government.

   And I wouldn't like to be the public servant who then needs to say, well, sorry, we are the Government, so we will stick to the 19th-century type of service, which will not be personalized, which will not take into account your personal preferences and so on. So we want to use these opportunities. But we believe the analogy to currency is very good in this sense: it's really dependent on trust.

   So I think the private and public sectors are in the same boat on that: we really need trust to continue improving our services, as the Government and as private sector companies. And to have this trust, we need to have a clear legal framework, which we are currently working on in the EU, and we need the awareness raising that we will all work on. So, one boat at the end of the day.

   >> GRY HASSELBALCH LAPENTA: Thank you.

   Claus, your area is not privacy regulation but media policies, and that's why you have been brought here: it's interesting to have a person with a different perspective but still with experience in the area of policymaking. So what would you like to add to this?

   >> CLAUS HJORTH: Just a small comment. Actually, I think we are living in amazing times. Because, seen from a parent's perspective, we are used to being able to trust the Government to regulate the way children and young people use media.

   I want to talk about film classification. Because of the nature of film classification -- censorship from the Government -- if we put down an H rating for a specific film, parents know that you cannot go to the cinema to watch that movie. So parents and teachers are used to the trust they can put in the Government. But in this field, when we talk about privacy, when we talk about the new media, we are putting the trust in the companies. So there is a shift.

   And you can say the role of Government in this respect is twofold. One is, of course, regulation. But another is to explain how these processes actually work to teachers and parents, who are, you could say, lost in cyberspace when it comes to raising children in these matters. Because they don't know how. They don't know the technologies. They don't know what kind of trouble young people can get into if they have a long or a bad track record on Facebook or whatever.

   So I think this is the role of Government. And I very much agree with you from Poland that we have a dilemma, because as a Government agency we are always behind on what is happening out there. First of all, the young people are running very fast, and the companies are running very fast after the young people. And then we are sitting back there -- sometimes we feel we are sitting back there. What should our role be? Should we be on top of this? Should we be the watchdog? Or should we just say, well, this is, as it used to be, a question of Government ruling? I don't think we should go down that track, but it's the dilemma for Government in these times.

   Thank you.

   >> GRY HASSELBALCH LAPENTA: Thank you, Claus.

   Now we are actually opening it up to the floor. But just to sum up: you can see that we are discussing a kind of definition of privacy here, with the young people emphasizing control and a personalized definition, and with trust being an issue as well.

   From the industry, you mentioned some dichotomies between different stakeholders and different people having different opinions, and that we need to meet in the middle to find some practical solutions to some of the privacy threats that I also hear about.

   So I'm just going to leave it out in the open and then open to the floor. But one thing I was thinking, and it's a question I'll just leave out in the open: you often hear that privacy is a construct, a cultural construct. But that doesn't mean it doesn't have a meaning in everyday life, or that we don't have a right to privacy, for example. Time is a cultural construct too.

   So the question is: do we need a new standard definition of "privacy," like we defined standard time at one point?

   But it's not -- does anyone have a question from the floor? Yes? Down there, Larry.

   Corey, do we have a microphone?

   >> AUDIENCE: A reaction to the gentleman from Google, from Germany: I appreciated your comments, especially about the privacy professionals. Because I do think it's true that in this field we have professionals who are representing industry, representing NGOs, and representing themselves. And there has developed sort of this -- I actually recently wrote an Article called "Beware of the Internet safety industrial complex," which is a play on words on a speech that former President Dwight Eisenhower made in the '60s, where he said that the defense contractors got together to scare us into spending more money to defend ourselves.

   I feel that we have had panics, starting with the predator panic. We were all convinced that there was a predator behind every keyboard and that if kids went online they would be abused. It's possible, but unlikely.

   And then we saw studies claiming that 82 percent of people are bullied. Well, that is not true. The credible studies show a much lower level, and that many children handle it with a certain amount of resilience -- not to diminish the importance of it.

   Now we have the privacy panic, where we hear how vulnerable we are. I was talking yesterday with Jacqueline about the whole NSA situation. Companies like Microsoft and Google are legally not allowed to disclose the number of queries they got from the Government. But to the extent that we have seen data, and some of it has gone out, the numbers are very, very small. So while it's an important consideration -- no American, or anyone, should be spied upon by the government -- millions of people feel they are being spied on, when perhaps it's only hundreds of people in a population of billions.

   One abuse of anybody's rights is one too many. But the notion that privacy is out the window -- I think there are people with a vested interest in perpetuating that. And there are people in industry who have a vested interest because they want to control and limit regulation. So it's really up to people to look at this critically: put on the same critical thinking skills that we tell children to use, and not overreact to some of the allegations.

   >> GRY HASSELBALCH LAPENTA: Thank you.

   Max, are there any hands here? No? Max.

   >> MAX SENGES: I'd like to quickly respond to that. I disagree on the surveillance stuff. I think it's important that we have the conversation right now and get the checks and balances in place. However bad people feel about it, if we get the right consequences out of this -- and that means we find a means to develop more transparency and to get that thing under control -- then fine, it's good if it wasn't as bad as it's been blown up in the media. But it's more important that we use this opportunity to get it right.

   On your first point: I think, you know, many times, even when the concerns are very legitimate, everybody who has a hammer sees nails all over the place. And I just wanted to point out that, next to finally getting the legal framework right, there are many times smart technology solutions. Let me take one example from another field, which is copyright.

   You know, it's very, very difficult to control the flow of information online. And you can write as many laws as you want, and restrict people and restrict the freedoms and the potential of the technology in many ways, and try to get it right that way. But we don't have problems with the content industry anymore in terms of YouTube, because we developed Content ID, which hashes the content and detects when somebody is uploading copyrighted content. That is not a legal or policy solution; it's a technology solution. And I think in many cases we should get experts to think about this and reframe it until we get it right, rather than limiting people's freedoms and the benefits of using the technology by trying to find the right laws.
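   [Editor's note: a toy Python sketch of the fingerprint-matching idea behind a system like Content ID. Real systems use perceptual fingerprints that are robust to re-encoding; the plain cryptographic hash here is a deliberate simplification, and all names are hypothetical.]

    import hashlib

    def fingerprint(content: bytes) -> str:
        # Toy stand-in for a perceptual audio/video fingerprint.
        return hashlib.sha256(content).hexdigest()

    # Rights holders register reference fingerprints in advance.
    registry = {fingerprint(b"<frames of a copyrighted video>"): "Studio X"}

    def check_upload(upload: bytes) -> str:
        owner = registry.get(fingerprint(upload))
        if owner:
            return f"match: rights held by {owner}, apply their policy"
        return "no match: publish normally"

    print(check_upload(b"<frames of a copyrighted video>"))  # match
    print(check_upload(b"<original home video>"))            # no match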

   >> GRY HASSELBALCH LAPENTA: Thank you.

   >> Okay. So I will also need to react to the surveillance thing. I wouldn't see it as harmless. It is huge. There are statistics that oftentimes get mixed up. One is the number of inquiries that Google, Facebook, or other companies get. The other is whether something beyond that is going on. And, well, all the evidence that we have shows that there was something beyond it. And actually the thing that makes us uncomfortable is that there weren't clear enough statements from the U.S. Government explaining that.

   And this is the thing that is really -- well, take Europe, for example, and that's the role of legislation again: there are actually clear rules on data retention. The retention period can be from six months to two years, depending on the country -- a European country can choose whether it's six months or two years. But definitely after this time, the data stored by telecommunications companies needs to be destroyed. What we are worried about is that no one has told us that the data gathered by the NSA will be destroyed at any time. And this is worrisome. Of course, it's not like little people sit and read every e-mail that we are sending; it's not like that.

But this data is gathered and stored somewhere, and no one gives us a guarantee that it will be deleted after due time. And that is a problem. Right now many of us would think, okay, I don't have that much to hide, so what is the problem here? But if at some point in our lives we want to pursue a political career, or take a courageous step that is not popular with some Government, we don't want that Government to have information about us going back 30 years.

   So, we believe, if the negative scenario is true, that's a long-term threat to human rights and to Democracy. If something like that were to be out of control, of course it's a threat to human rights, and it needs to be stated clearly and it needs to be explained.

   In terms of reacting to what you said, Max, about technology innovation: we are waiting for that with open arms. But history shows that sometimes setting standards, legal standards, helps innovation instead of preventing it. For example, research by Michael Porter, the Harvard Business School professor who has done a lot of research on competitiveness and companies, shows that in countries where environmental standards were introduced quite early, there was more innovation in energy and efficiency than in countries that didn't have those standards.

   So we are introducing standards that we think are necessary, also because we are worried about current market developments in some sense.

   Let me just mention the market for applications. There are currently an estimated six million applications that we can download for our telephones and so on. Six million. It's supposed to be growing right now by 30,000 applications a day, which is a lot. So in 200 days we will have like 12 million, if this estimate is true. And those applications many times have terms of service that a normal user would not be very happy with. From my own experience -- I'll not name names -- you can download something for your iPad and it will ask you to give it access to your contacts. What for? No idea. So now that I pay attention to these terms of service, I wonder if this is fair for the customer.
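   [Editor's note: a quick Python check of the app-growth arithmetic as quoted above.]

    # Figures as quoted by the speaker: 6 million apps, growing 30,000/day.
    apps_now, per_day = 6_000_000, 30_000
    print(apps_now + per_day * 200)  # 12,000,000 after 200 days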

   Or you download an application that tracks your physical activity -- how fast you run and so on -- and it will ask you to agree that it will give your data to a third party, which is sensitive data. Being relatively fit and healthy, I could think it's okay, I have nothing to hide. But actually it's potentially very sensitive data, because, as we all know, information about health can get very sensitive at some point.

   So I would try to balance those things. It's important to have transparency, both about surveillance mechanisms and about the normal mechanisms that we use in our everyday lives to get services.

   >> Could I respond quickly? I wasn't in any way defending the National Security Agency of my own country, nor was I underestimating the impact. What I'm saying is there was a lot of misinformation. Early on, there was a story about a back door. And unless Google, Facebook, Microsoft, Apple, and all the other companies are lying, that back door didn't exist. Yet that became part of the mythology of the story. And people have to get a handle on what is or isn't true. I realise there is much that we don't know, but let's not spread false rumors. Because if there were a back door, believe me, I'd be out there marching in the streets, and probably against Microsoft for allowing that to happen, along with the other companies that allegedly did that -- or didn't do that, fortunately.

   >> GRY HASSELBALCH LAPENTA: Gitte?

   >> GITTE STALD: Yes. I just wanted to come back to one of your comments, on cyberbullying, as a case where, well, we didn't always get the problem right -- or the right response.

   As a researcher, I think it is always very, very important to consider which questions we ask -- to ask the right questions, and in which context. Because very often we have a presumption of how things are, shaped by public debates and other things. And that heavily impacts how we frame the questions that we ask.

   And in that, I can see that we are discussing multiple things here, but at least two main perspectives on privacy. In this discussion about how to innovate the perception of privacy and how we work with privacy issues, on the one hand we discuss company matters, regulations, data issues -- all these overall things about all the data that flows around, all the footprints that we leave that get compiled into databases, and how we can control that.

   And on the other hand we have the very personal experiences, and we need to know much more about those personal experiences and how they work in everyday life for young people, but also for all of us.

   I mean, it's very difficult as an individual user to do anything about the big data issues; in a way, they come back to you in terms of the offers that you get from your systems, and the algorithms that control which information you get, and so on.

   But on the other hand, your own experience and your own way of being in control of what you're doing also has a huge impact on how these things work.

   So coming back to that, I think really we need to start discussing the questions that we ask.

   >> GRY HASSELBALCH LAPENTA: Jacqueline Beauchere and then Claus.

   >> JACQUELINE BEAUCHERE: I hope we can still hear from the youth, because I want to hear from more than the panelists and adults coming to the table. I'm in favor of awareness raising and education around all of these issues, very much in favor of technology solutions, and of collaborative efforts among industry, Government, NGOs, and all other parties that can come to the table.

   It's important to keep in mind that the decisions that we're making -- we hope they're right, but they are the decisions for right now. When we lead with something like regulation or legislation, that can have long-term effects, and we don't have the nimbleness and ability to rejigger if we need to. So we have to keep that in mind.

   I agree with Google that there might be a technology solution, or at least a technology approach, to lead with. We should look at those holistic frameworks and multi-pronged approaches before we lead with something more permanent and more lasting that we might not be able to get out from under.

   >> CLAUS HJORTH: I think panic arises because maybe we are excluding a very important part of the discussion; namely, society. What is society in this respect? Society is the teachers and parents who, by tradition, have the responsibility to raise the upcoming generation in order to let society continue.

   And they have such a huge lack of knowledge about what is going on. The relationship in this discussion is between the young people and the companies, without focus on the role of teachers and the role of parents. And they -- well, I have to say it again -- they don't know a lot about this stuff.

   We made a study in the media council half a year ago about the dilemmas for teachers. One of the dilemmas they have: can they be friends with young people on Facebook? Can they go Twittering with them on the Internet? Is this interfering with the privacy sphere, or is it what teachers have always done, keeping a finger on the pulse of young people in the school yard? Well, this is a big dilemma. We have to solve this as well, because if we do not, then where is the role of society, here represented by teachers and parents?

   Thank you very much.

   >> GRY HASSELBALCH LAPENTA: Thank you.

   I would love to hear from any of you. Yes?

   >> Well, I heard other people talking about transparency. Larry mentioned that there were a lot of rumors. Sorry -- there are a lot of things going on that are not transparent, but I think a lot of things are fairly transparent. Take the example of the flashlight app which asks for your contact information. If someone came up to you and asked, can I see your agenda with all of your contacts, can I make a copy -- you would never allow that. So why do you do it in an app? It's common sense, I think, and awareness is very important.

   >> GRY HASSELBALCH LAPENTA: Thank you. I see a number of hands. I will remember you, but I just want to hear, are there any remote participants with questions? No. Okay.

   So ladies first.

   >> I'm Helena from the Netherlands. I'm a student. And I was listening to what you were saying about cooperation between the stakeholders, which is important, and about trusting the Government and trusting the businesses. But then I was thinking about what Gitte said about how important the local context is, and there are a lot of oppressive Governments that I don't think can be trusted in that way. And I was thinking about how businesses should act in those countries.

   Because there are laws there. Should businesses take a political stand, in a way, and choose the privacy of their customers, or should they follow the laws of the countries they work in? So I was wondering about that.

   >> GRY HASSELBALCH LAPENTA: Anyone care to answer? Max?

   >> MAX SENGES: There's an interesting institution called the Global Network Initiative, which came about after the Shi Tao case, when Yahoo! handed over information to identify a blogger who was a dissident. And basically, yes, that's the goal of that institution, which is a multi-stakeholder place with Human Rights Watch and the Berkman Center and other non-private-sector members as well as platform providers, and they are trying to work on that. And it's a very difficult edge, where you say, of course, when you operate in a country you have to abide by the laws of that country. But at the other end of the spectrum, if that country has laws that are really against your principled convictions, then you have a problem. So I guess that's an ongoing and very difficult discussion that we are having.

   >> GRY HASSELBALCH LAPENTA: Anyone else? Yes?

   >> I'm Bianca from NetMission. So, maybe since the copyright example came up -- I mean, it's a global world and all the services that you provide are global. So how do you take a stand on the legal standards, and what do you treat as privacy? Do you have a clear line? And how do you kind of put an outline on what people think is privacy? Because obviously you see a lot of youth, and they all think differently. And people might judge personal data in a different sense than everyone else.

   So it might be good to see how you see privacy as well: how you defined it, and, since your services are global and cultures differ as well, how do you see it?

   >> I guess it's an industry question, so --

   >> MAX SENGES: It's a very difficult one. Let me give you one analogy. Basically, the answer is yes, we have a definition, and that is our overall privacy policy. For example, across all the different products that Google provides, we had, I think, more than 70 different privacy policies, and then we consolidated them all into one privacy policy, exactly to reach a point where people understand: okay, this is how my data is handled when I work with a Google service.

   We were criticized: how can you take a complex matter and conflate it? The same thing happened when we introduced Street View in Germany. It was a big debate. We had to have a solution, and we did that. Then we got thousands of letters, some saying, oh, my God, I want you to put it back on, and others asking, how can you allow my house to be on there? So you'll never get it right.

   But yes, we have one understanding, and we try to define that, and we have a discourse on how it should evolve and get better.

   >> GRY HASSELBALCH LAPENTA: The gentleman down here.

   >> I'm Robert, from the Netherlands as well. And I would like to comment on two things. The first one is about the role of the Government and how the Government can apply pressure. Of course, when we look at the NSA scandal, we all look at the big corporations -- Google, Facebook, Microsoft. They're the bad guys. But we forget there is a gun to their heads from the Government. And when you look at privacy, you don't have just one identity online.

   I mean, for example, one of your friends knows way more about you than other friends, because you tell them more. And when you look at companies, you should look at them as a relationship, where you share information with the company, for example Facebook, and you decide what information you're going to share with whom. That is the kind of relationship you have with such a website.

And when it comes to the Government, I mean, I would be happy to share some of my information with the Government. But then it should also be a relationship based on trust. Like, I say okay, maybe I can look into the Government with my Facebook account and select some settings for what they can see as well. That is the way the Government should think about privacy and information as well.

   Okay, we have a relationship with the people in our country, and we should be clear about what we actually gather. And it should never, never be through another relationship, like Facebook or Google.

   And the second thing I want to mention: if you look at data, there are so many possibilities we have with data. For example, I think in the near future we will all be wearing a device that measures the quality of our blood. And if we hand this information in so that scientists can discover certain trends in the health of the country, they can really act on this information. This will increase the quality of life hugely.

   And what I want to mention is: if you know the purpose of why you're sharing certain information, you actually will be more at ease and you'll share it more easily.

   And even though it's very valuable information, if we go back to the present, that's what we don't do. We don't think of our information as really valuable. We just share it with different companies. And you know, people like to say that you end up paying Google with your data -- they use your data and you're the product. It sounds very bad. But I think people are always underestimating that your information is valuable. And if you feel that your information is valuable to a company, then what you get -- being able to search on the Internet or connect with your friends on Facebook -- that is a return. Then you have a relationship with the companies based on these values.

   And if that's the way you think about your information, about your privacy -- as a very valuable thing -- then you decide much better what you share. And when it comes to awareness, I think this is the key thing that we should teach people.

   I mean, you're valuable. The information is valuable. And please, you know, you're making a deal with the company using your information, so make sure they care about it. Thank you.

   >> GRY HASSELBALCH LAPENTA: Down here.

   >> My name is John Leprees. I'm a professor at Northwestern University. I'm really concerned by the panel's reliance on law in this discussion. And the reason I'm concerned is that if we think about the next billion users of the Internet, many of those users come from countries where the rule of law is weak at best, or arbitrary and capricious.

   Could you please respond to the role of law in these kinds of countries, with all the new users coming online, and to the idea that the law may not be such a good or useful partner in addressing these issues.

   Thank you.

   >> GRY HASSELBALCH LAPENTA: Thank you. It's a good comment.

   There was one here. Anyone else who has a question, please raise your hand.

   Okay. Yes. Thank you.

   >> I'm Allegra from the Philippines, from an NGO, and also a member of APC.

   When I first read the title about innovation and privacy, what immediately came to my mind is the great innovation in the surveillance industry, the R&D and all of these things that help Governments, good or bad, develop the tools for what they do.

   In the Philippines we had this big scandal with our President: a wiretap, her cell phone was tapped. It was about ten years ago, and at the time it used very sophisticated over-the-air sniffing tools.

   And when we dug deeper into the dynamics of it, it was all these companies going to our military establishment, presenting and pitching so that these technologies would be bought.

   I was wondering, in regulatory environments more advanced than ours, whether there are regulations about the transparency of companies like this, in terms of reporting what their technologies are capable of and who they deal with, especially Governments. Because when my Government buys something like this, I think I should have the right to know. So I don't know if that regulatory angle is being explored in international contexts.

   >> GRY HASSELBALCH LAPENTA: Thank you. And thank you for bringing up that comment, because it's true. When you talk about innovation and privacy, there are great innovations in surveillance technologies and great innovations in big data. That's what we're trying to do with this panel: ask how we can look at privacy itself. We need to even the balance.

   >> MALGORZATA STEINER: So I'd like to talk about how those countries, and the users in those countries, can handle their privacy.

   Actually, there is no good answer to that. There are many attempts to help countries improve in terms of the quality of democracy, the rule of law and so on, and that's the only sustainable long-term solution.

   But it makes me think of a recent article by Timothy Garton Ash, who talked about surveillance, as in the NSA scandal, being in effect a private/public partnership: neither purely private nor purely public, but a partnership between the two. That's a negative example.

   The other example is based on the same phenomenon: private companies in those countries can play a big role in setting the standards, and consumers in countries with the rule of law and perhaps higher awareness about privacy can also play a bigger role. There is talk, and I'm not sure whether you mentioned this already, that privacy protection, and privacy in general, is a new green movement: a movement in which consumers will demand the services with the highest standards of privacy protection and will actually put pressure on the companies. They might be a more efficient factor of change than legal instruments.

   I'm not sure whether this is true; for now you cannot tell. There is some data that seems to support the idea that things could go in this direction, and some data that doesn't. The data that supports it is that the number of people worried about the amount of data about them on the Internet is clearly growing: in 2009 only about 33 percent of people were worried about their data on the Internet, and now it's 50 percent.

   More and more people are starting to care and worry about their data, which could be an indication that more attention will be paid to this, and maybe we will have something like a green movement. I'm not sure about that. But there will certainly be consumer pressure on companies, on the Googles and the Microsofts, so that they get even more innovative.

   We all know that they have amazing human resources and can deliver more innovative solutions than they do now; we all believe in their capacity and possibilities. So by putting positive pressure on them as consumers, we are setting the standards for services in our own region, but maybe also in countries that don't have such a good rule of law for protection.

   And I mentioned an example: the Data Protection Directive from 1995 that Europe had, which at the time was considered a progressive, high-standard data protection legal instrument.

   It actually spread around the world pretty quickly, because companies in countries outside of Europe quickly understood that if they wanted to sell their products to European citizens, they would need to comply with the same standards. So you get this phenomenon where companies apply the highest standard from the most difficult market they are active in, and in that case it was the European market. Companies even outside of Europe started to adjust their data protection standards to what we had in Europe. And this is another kind of influence, a soft influence, that I think good regulation in Europe can have.

   But as I said, it must be a combination: consumer pressure on companies so that they stick to good standards, in Europe but also outside of Europe, and legislative standards that spread in the same way. I'm talking about Europe because I'm from Europe and I know the situation there, sorry for that. That combination is what I think could help.

   >> GRY HASSELBALCH LAPENTA: I saw you, Max, but I would like to hear from the youth first.

   >> BASTIAAN ZWANENBURG: Well, I want to follow up on this discussion. I think it's impossible to enforce a law, or to try to solve this worldwide problem with laws. We tried to solve the cookie issue across the entire European Union with a law, and the law is different in every country. It's very difficult for companies, and so it's difficult for users to understand what is going on.

   So I don't think we should try to solve problems in terms of law or regulation. I think we should try to find a solution in terms of innovation or awareness.

   I mean, you can innovate at a worldwide level or raise awareness at a worldwide level; you can't enforce law or regulation at that level. So I think that's very important.

   >> GRY HASSELBALCH LAPENTA: Thank you.

   Actually, we were -- yes? Well, before I ask a question, I'll take --

   >> I was going to say that I completely agree. I wanted to pick up on something that was said by the person from Microsoft about scale and degree. I think you're right: it shouldn't be given to the Government, it should be given to the individual. The choice, the range of choices. The scale might be complicated and might distract people, but if you give it to the person who is actually using it, the best thing, as you said, is to give me the decision on my privacy.

   >> GRY HASSELBALCH LAPENTA: I'm just curious, because that is kind of what we have now, a privacy self-management system. And in a sense some people are talking about a consent dilemma. I may be talking down to you, but can you manage that choice? Can you manage that privacy? Many will say that it is difficult to see the consequences of the correlation of data and so on. And it's a question to all of you as well.

   >> I was going to say, you know, I think I can manage it, yes. But more than that, we can manage it, society can manage it, and that's important. I'm not alone on the Internet; I don't think I'm by myself. Together my peers and I can see these things. So together we can work this out and together we can manage it, yes.

   >> GRY HASSELBALCH LAPENTA: Gitte would like to hear: do you think about big data? Do you think about the surveillance issues and the correlation of data? This is a question to all of you. Maybe?

   >> LUIS IVAN CUENDE: Yes, I'm a bit of a conspiracy theorist, you know, in the sense of what the NSA is doing with all the data.

   There is a theory that says the NSA is building the biggest database in the world and knows every move we make. In the long term that could mean, for example, that they could know what I'm going to do with my life when I don't even know it myself. Because they have all of my data, all of the movements I have made, all of my online activity. I'm worried about where that leads.

   When you use a service, you are trusting that service; you are choosing to use it. I can choose to use Gmail, for example, but apart from going through my own Government, I cannot tell the US to stop collecting my data from Gmail and so on.

   So that is my worry: you can opt out of a certain service, but not out of the Government, the ISPs and so on.

   >> GRY HASSELBALCH LAPENTA: Two comments from the panel and one question on there, and I think Max was the first to raise his hand.

   >> MAX SENGES: Thanks. I think it's a pretty good conversation. I'd like to come back to the gentleman from Holland, who made an excellent point, I think, and where we are really at the heart of this.

   Is it data minimization that we want, so that we collect as little data as possible? Or do we see data as a resource, something rich? We are data rich, not data minimal, and then we can decide how we use it.

   And all of the discussion says we have to be smart about it, and so on and so forth. But this is really the core: do we want minimal data, do we want people to switch things off and try to be minimalistic about it? Or do we create a rich world around it, produce new services, and be smart about how we use it?

   And again, I felt it was a wonderful argument. Many times you get the argument: you can build self-driving cars, you can fly balloons in the stratosphere, so you can get the data privacy thing right. But some requests from Governments and activists, like the right to be forgotten, are simply impossible. You cannot take back something that you have said.

   So even if we try our best, it comes down to the decision, as you said: do you want to put this online or not? Once it is out, you cannot take it back. It's just like what I'm saying right now goes on the record, and I will not be able to take it back.

   We have a very clear case in Germany that I think puts this to the core. There was a bouncer on the Reeperbahn in Hamburg who was known under a nickname that was, if you like, racist. But he liked it. It's a word that I'm not going to repeat here. People were rapping about it and he felt it was cool. At one point he decided he didn't like it anymore, so he went to court and had the online archives remove that name. It was a very wrong decision from the court, because if it's out, it should stay out. Otherwise we are rewriting history, and that's dangerous.

   So data protection needs to have limits. And you cannot expect Google and other companies to be normative in this field. It's dangerous.

   >> GRY HASSELBALCH LAPENTA: Thank you. We are opening up the discussion: the right to be forgotten.

   >> CLAUS HJORTH: Also referring to what you from Holland said, and what others have said: I don't believe we can regulate this from the Government's side. But I think the Government has a very important role in initiating, and maybe also in helping society work out what our basic rights are. Because I think this is the confusion for a lot of people out there: they don't know what rights they have in relation to companies, or to public institutions either. What can you do with my data? Before we know what our fundamental rights are in this new world, where you cannot delete everything and so on, it's very difficult for people to engage in this discussion, unless you are young people who are experts, or a company that is also an expert. So we have to know our fundamental rights.

Thank you.

   >> But they are the same rights. I mean, there is a right to privacy and everybody knows that. There are no new rights. I don't understand.

   >> Hello. My name is Ludo Geiser; maybe you can guess where I'm from, given where I'm sitting.

   I just want to say that I would like to keep the right to give up my privacy completely. Thank you.

   >> GRY HASSELBALCH LAPENTA: Okay. That opens a whole new discussion here at the very end of the debate. But Malgorzata Steiner, you can join in.

   >> MALGORZATA STEINER: Point well taken. But I really don't want this discussion to be oversimplified, and I have the impression that on some points it might be going in that direction. Our dilemma is not whether we want minimal or maximal data; that's not the true dilemma. The true dilemma is whether people can control data and make a choice, and that is what we are aiming for. We are not aiming for minimal data or maximal data. We are aiming for consumer choice, for consumers being able to control their data. And all the surveys show that this is what consumers want, this is what citizens want.

   It's a little bit different for the public sector, as many of you, including our Netherlands corner, mentioned. It's different because you have no choice about whether you give your information to the Government or not; the Government simply has certain data. So there we have special rules that say you can only use minimal data, and only for certain purposes. But that is for the public sector. Otherwise we aim for a good balance between minimal and maximal. And then there is the right to be forgotten.

   And to be honest, I don't know anyone serious who is currently involved in the discussion and still advocating the right to be forgotten. That is the success of the current dialog we're having, the dialog around the General Data Protection Regulation.

   Yes, there were organisations and Governments that at first wanted this right, because it sounds great. It is tempting. Every one of us has made mistakes in the past; everyone has a picture from a party on the Internet that they might not be very happy about. Offering a chance, or at least suggesting there might be a chance, to erase those mistakes from your past is tempting, right? Every one of us thinks, yeah, okay, if it were possible it would be great. But it's technically not possible. It was discussed for several months with the technical community, with NGOs, with privacy protection officers and so on, and now I think everyone in the discussion agrees that it is not the way to go.

   So let's not mock this discussion; let's keep to what we agreed. Nobody really wants the right to be forgotten, I believe, from what I see in the dialog right now. No one is really advocating implementing it in current law, not because we don't want it, but because we agreed that it's technically too hard to achieve.

   >> GRY HASSELBALCH LAPENTA: Sorry, I have to cut you off. We are almost done and we have questions from the floor. Do we have remote participants? Sorry, but I would like to take a few people who want to comment.

   You had a question down there?

   >> Yes, hi. I'm a privacy professional, so don't all throw your bottles at once. But it worries me a bit, because there have to be limits to choice; choice is not unbounded. And it worries me that we keep talking about compliance with laws, when this isn't a legal compliance issue. The law can only go so far.

   You know, this is about the fact that the ecosystem, and we talked about mobiles, is architected for disclosure by default. The mobile devices you carry in your pocket are complex, and the networks that underpin them are global. They cross borders. Your privacy and mine doesn't stop at the English Channel; our data can end up somewhere like Nigeria for somebody to use for advertising.

   It requires us to ask how we can create user experiences that make it simple, that make it easy for you to make simple choices on your devices as well, parking the NSA and all of that other stuff.

   And I don't agree with some of the comments made earlier. But parking that: how do we design for trust? I haven't heard that conversation yet. It has to be made simpler for people to understand.

   I can show you a social network tool on my device that, I would argue, may be architected and designed to restrict choice, because the frustration of choice supports the business model, which may not serve my privacy interests.

   >> GRY HASSELBALCH LAPENTA: Well, thank you for pointing that out, because that is what we started this discussion with: designing for privacy and innovating in privacy.

   We have a comment from Bastiaan.

   >> BASTIAAN ZWANENBURG: You said it's important that we gain more trust, and that we have to figure out how we can gain trust. But a lot of people my age don't know what is going on or how it works. And how can you trust something when you don't know what is going on? So first we have to raise awareness, and then trust. I agree with you: awareness first.

   >> GRY HASSELBALCH LAPENTA: We don't have much time left. I would love to continue this discussion and get more concrete about the definition of privacy, which, as we can all see here, is a cultural construct but still important, and about innovation in privacy and looking at privacy as an opportunity.

   Is there anyone here on the panel who is sitting and thinking I would like to say something here in the end?

   No?

   One last thing, is anyone on Twitter? Can anyone tell me what has been retweeted the most, if anyone has tweeted from this session?

   >> The Internet is down most of the time, so it doesn't really work. Personal notifications get through but you cannot load your timeline.

   >> GRY HASSELBALCH LAPENTA: That's a pity, but it's something to build on. Thank you to all the panelists. It's been interesting and wonderful to hear the different perspectives. Clearly we can see that trust has been shaken, maybe more for adults than for youth. And maybe the way to rebuild trust is by innovating in privacy and thinking of it as an opportunity, as an area we have to explore, where we have to balance the advances of innovation in other areas against innovation in privacy.

   So I just want to thank everyone. It was an interesting discussion, and I hope you have a wonderful rest of the IGF.

   Thank you.

   (Applause)

   (End of session 10:30 a.m.)

  

 
