Open Forum: Internet Openness and Privacy

IGF 2010
VILNIUS, LITHUANIA
OPEN FORUM:  INTERNET OPENNESS AND PRIVACY
SESSION OF
COUNCIL OF EUROPE PARLIAMENTARY ASSEMBLY (PACE)
SUBCOMMITTEE ON THE MEDIA
0900
15 SEPTEMBER 2010
ROOM 1


Note: The following is the output of the real-time captioning taken during Fifth Meeting of the IGF, in Vilnius. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record.

********



>> JONATHAN CHARLES:  Ladies and gentlemen, good morning.  Thank you very much for joining us at this session, which is being organised by the Council of Europe Parliamentary Assembly, Subcommittee on the Media, as a Council of Europe session.  It is about openness on the Internet and privacy.  
My name is Jonathan Charles.  I'm a BBC world news presenter, and I will be moderating today's session.  We have our distinguished panel -- two panels in fact.  And we will be hearing from them.  
The whole issue of privacy on the Internet, as we know, is growing in importance.  If I can give you one tale that I was hearing the other day:  we are living in a day when the technology you carry around and the networking sites you belong to mean that people can pinpoint where you are.  A journalist on the Guardian newspaper in the UK joined a networking site called Foursquare, which is being used increasingly.  It is one of the new geolocation networking sites.  It allows your friends to know exactly where you are, using GPS.  You just put in, in effect, "I've arrived here," and then any friends in your area can come and join you in the bar or wherever you happen to be.  The trouble is, it allows lots of other people to know where you are as well.
They can work out where you live and where you work, and this Guardian journalist decided that, to test the question of privacy on the Internet, he would become a cyberstalker for a day.  He managed to connect with a woman called Louise on the Internet.  He didn't know who she was.  Through her Foursquare account he could get into her other sites, and he discovered that she was in a bar in central London.  He could see that from the locator.  And he managed to track her down.  He went into the bar.  He couldn't see her, although he had seen her picture on the Internet, and he had access to her Twitter and Facebook accounts because of the way all of these things interconnect.  He looked again at her Foursquare page, which said she was in the bar at a private function.  The journalist asked:  Is there a private function going on?  Oh, yes, downstairs in the room below.  He went downstairs and he was able to confront this woman and say:  I don't know who you are, but I knew exactly where you were, even though I had to come halfway across London to do it.  
The more we interact, the more we give away of ourselves.  Whether we do it knowingly or unknowingly.  These are questions that are raised by Internet privacy.  We want to be open, but what are the issues that come about as a result of that?  That's what we will be discussing here today.  
Let's see a film called "Freedom to Connect," which has been made by the Council of Europe.
(Video)
(Music)
Well, there is a little flavor of the issues that we will be discussing.  I'm surprised that it's only 50 percent of your time spent on the Internet.  I suspect for some of us, it's a great deal more.  
We have our distinguished panel.  I'll introduce them one by one.  At the end of the panel session and in today's session, it's split into two bits.  There will be a chance for you to engage with the panel, for you to ask them questions.  We hope to get debate.  And a lot of people are watching us on the Internet and they will have a chance via the remote moderation to put their questions as well to the panel.  
Let me start by introducing the Deputy Secretary General of the Council of Europe, Maud de Boer-Buquicchio.  We will discuss the issue of children on the Internet, but her brief is wider than that.  

>> MAUD de BOER-BUQUICCHIO:  Thank you.  Is this working?  Thank you for the introduction, and welcome to all those who have chosen to be with us today.  The IGF is devoted to the development of the future together.  I believe that to achieve a meaningful result, we have first to share a vision of what we want that future to be.  As a human rights organisation, the Council of Europe's vision of the future can only be a world in which fundamental rights and freedoms are respected both online and off line.  
To make this possible, the Council of Europe strongly believes in the need for a multi-stakeholder approach.  Several institutions and bodies in our organisation are mobilized around the issue of Internet governance.  
And I would like to pay tribute here to the role of our Parliamentary Assembly, and more particularly today to my dear friend the late Andrew McIntosh, who had planned to be here with us today.  The Council of Europe is working with partners to, first, ensure that on the Internet people have a maximum of rights and services, subject to a minimum number of restrictions and a level of security which they are entitled to expect.  And, second, keep looking ahead to make sure that it is a space in which we can work, learn, play, and communicate with confidence and trust; a space where people are neither threatened nor discriminated against.  
The Internet's openness and privacy are key elements in building trust and confidence, and in this context I would like to stress, first, everyone's freedom to connect and thereby to access Internet services.  Actually, already a few countries in Europe are granting their citizens a legal right to access broadband Internet.  Examples are Finland, Spain, and Switzerland, thereby recognizing the openness of the Internet to promote positive change for our democracies, economy, and general well-being.  
Second, everyone's right to freedom of expression and access to information which should be enjoyed without interference and regardless of frontiers, which includes the freedom to communicate and create.  
For the 47 Member States of the Council of Europe, the openness of the Internet as a public resource and its public service value mean not subjecting people to any licensing requirements.  
And encouraging the reuse of Internet content with respect to intellectual property rights.  
Allowing service providers to operate in a regulatory framework, which guarantees them nondiscriminatory access to national and international telecommunication networks.  
Promoting public domain information accessibility via the Internet, which includes government documents, allowing all persons to participate in the process of government.  
And, finally, making sure that governments in cooperation with other actors do not damage the connectivity, stability, security of the Internet in other states.  
On the last point, the Council of Europe is currently examining the viability of an international treaty setting minimum standards on the roles and responsibilities of governments to deter interference with, and attacks on, the Internet, and to promote cooperation between states and other actors to protect and preserve Internet freedom, in order to achieve greater unity across borders and to preserve the openness and universality of the Internet and the transboundary flow of content.  
For the protection of people's privacy and personal data transmitted via the Internet, the Council of Europe governments have agreed on the need to protect users from the unlawful storage of their personal data, including data which are inaccurate or have been abused or disclosed without authorization; to regulate the international transfer of personal data to states which do not have an adequate level of data protection; and to facilitate cross-border cooperation in privacy law enforcement.  
The Council of Europe is committed to countering the Big Brother phenomenon of the Internet, in particular the silent surveillance, tracking, and profiling of people.  Being watched can impair people's openness on the Internet.  So it is essential that they have minimum standards of protection.  
We are responding first by providing guidance on the practice of profiling, making sure that the Internet is open while at the same time private when we choose it to be so.  
Second, by modernizing the Council of Europe Convention on data protection so that everyone has a global benchmark on minimum standards and principles.  
Ladies and gentlemen, I believe that the fate of our democracy is at stake if we fail to achieve the proper balance between potentially conflicting rights and freedoms.  The boundaries of freedom of expression, of the right to privacy and of the right not to be discriminated against may sometimes dissolve in a gray area, which holds some dangers.  
In this context, we should accept that freedom of expression is not an absolute right.  The need to fight hate speech and to protect the safety, privacy and dignity of other people, in particular children, may justify limitations on the use of the Internet.  
We should avoid overregulation of the Internet and resist the imposition of restrictions that censor, block and filter what we see.  Any attempt to block or take down content from the Internet should be duly justified and carried out with full information to the public, because they have a right to know.  
We should not lose control of our personal data accumulated in the context of profiling; we should be able to erase and delete personal information, and privacy should be built into the design of Internet services.  We should not inadvertently give away our rights on the Internet, in particular by ceding them in exchange for free services, nor should we be obliged to pay a fee to protect them, because they are not for sale.  
Last, we should inform and educate ourselves and empower our children in order to deal with the challenges of the Internet, including managing our online identities.  
Albert Einstein once said that a perfection of means and a confusion of aims seems to be our main problem.
We all know that the best technology can be the worst of inventions if it doesn't serve the right purpose.  Let us focus on the aims that are worth achieving.  The means will follow.  And humankind will eventually benefit from their perfection.  
Thank you.  

>> JONATHAN CHARLES:  Thank you.  Thank you very much indeed for that intervention.
(Applause)

>> JONATHAN CHARLES:  Next, I'll call on Markku Laukkanen, Chairman of the PACE sub-committee on the Media, Member of the Finnish Parliament, and a former broadcaster who can see things from both the media angle and the parliamentary democracy angle.  
I should say that the other thing that we're doing here is collating everything that is discussed today, including your interaction, and the whole idea is that this is the first time, really, that the Parliamentary Assembly of the Council of Europe has been open for everyone to contribute to, right from the very beginning of the process, because at the end of the process we are going to be drawing something up.  But right from the ground level, this is the chance for everyone to play a role in trying to shape the outcome.  

>> MARKKU LAUKKANEN:  Thank you, Mr. Chairman.  I used to say that there is no new media without old media.  

>> JONATHAN CHARLES:  I'm happy to hear that.  

>> MARKKU LAUKKANEN:  Let's remember that.  And when I looked at this film in the beginning, I felt that this guy had some addiction to the Internet.  But when I came last night to Vilnius and I lost my cell phone --

>> JONATHAN CHARLES:  You no longer --

>> MARKKU LAUKKANEN:  -- I felt that I have strong -- I felt I had a strong dependence on that.  So we all have an addiction to the Internet.  
Dear colleagues, the World Wide Web was developed as a concept at the nuclear research centre in Geneva 20 years ago.  It has since spread all over the world, and its ownership and governance are now global.  
But we politicians have a special responsibility for regulation, as Maud said.  It may be helpful to develop standards first within Europe, for instance, but it is essential to have the support of all the stakeholders worldwide, because the character of the Internet is global.  
On cybercrime, standards were developed through the work of the Council of Europe years ago.  States around the world have applied those standards and many have signed the Budapest Convention on Cybercrime.  
We at the Council of Europe have been working on other issues relevant to implementing human rights and freedoms in cyberspace, and I am pleased that with this event of the Subcommittee on the Media of the Council of Europe Parliamentary Assembly, we are launching new work with all stakeholders through the IGF this year.  
The Internet is also a political issue.  We need to remember that.  And that's why we shall prepare parliamentary reports with policy recommendations on privacy and Internet freedom by next year.  
And those recommendations will be based on the European Convention on Human Rights:  the right to the protection of private life and the right to freedom of expression and information.  
And, let me say, there are still countries in Europe where the Internet is controlled by the state.  It is only half an hour from this place to the border of Belarus.  Human rights are universal, and parallel norms exist at the UN level.  However, different places in the world have different experiences and put different emphasis on issues.  Both are necessary conditions to ensure that the Internet remains the prime tool for information sharing as well as for many other interests in our society, be they public, private or personal.  
And, therefore, I am very grateful to the experts on this panel as well as to the participants in the audience.  Through you, we politicians will be able to make political recommendations which reflect the multitude of people and the diversity of approaches.  And that's why we are here listening to you.  
And let me thank you in advance.  Thank you very much.  

>> JONATHAN CHARLES:  Thank you very much.
(Applause)
Markku has to go into another meeting.  He will be back in an hour.  We have split today's events into two panels.  The first is a panel on privacy and the management of private information online, and I'll introduce the speakers one by one.  Listening to everything is Peng Hwa Ang, from Singapore, who is making notes and at the end will draw together all the conclusions of the session.  Peng is a great expert on the Internet and no stranger to the IGF, having attended previous IGF meetings.  
Let me first of all ask Andreja Rihter to speak.  She is a rapporteur for the Council of Europe committee on culture, science and education, and a member of the Slovenian Parliament.  

>> ANDREJA RIHTER:  I'm pleased that we at the Council of Europe have opened the floor, because it will be much easier for me, and especially for the administration, to prepare the report.  
The right to privacy is a concept which was developed 120 years ago, when print media and photography expanded rapidly.  It was a new experience to see one's own picture in a newspaper or film.  People were not necessarily comfortable with this situation and demanded respect for their privacy.  
Today, we are filmed by video cameras in streets and shops.  Everyone can take a picture with their mobile phone.  People are ready to put private pictures and details on their own Web presence or include it in the Web content of others.  Individuals might nowadays accept to a greater extent the private details about them are in the global public domain on the Internet.  But this development has also created new risks and dangers.  
Under the European Convention on Human Rights, states are obligated to protect the private life of individuals.  People can waive these rights, but such waivers require informed consent by an adult.  
There was a development in the Convention on the automatic processing of personal data, looking at the interrelation of technology and privacy.  Privacy is protected differently in different parts of the world.  It may be difficult to harmonize national approaches around the world.  In some countries it's left to the private sector to agree on common standards with their customers or users.  In other countries, privacy intervention can be a criminal offense.  Nevertheless, I believe Internet users will have common interests in protecting the privacy, irrespective of who they are.  
This might include pictures of minors or personal details about one's private life, such as health information.  Most people are also worried that private data will be exploited.  It is therefore important to share common concerns as widely as possible and to raise awareness globally regarding privacy risks and rights on the Internet.  
I would raise, for example, the following questions.  Privacy laws should apply in a technology-neutral manner, because in principle it does not make a difference whether your private picture is published in a newspaper or on the Internet.  It may make a difference, however, whether your privacy was used for commercial purposes by a newspaper or Web site trying to sell better, or by a private person with no financial interest publishing in a chat room.  
A publication on the Internet may stay on the Web for an indefinite time, while a printed publication will be out of circulation after a while and pictures broadcast on television will be visible for a moment only.  
Images of child pornography victims have, for instance, been circulating on the Internet and the victims or their families could do little to stop this.  
The Web also allows everyone to open up his or her own privacy, for example by publishing personal data or even naked pictures.  Can the right to privacy be waived, for instance, by minors?  Does human dignity prohibit such a waiver?
For many, it has become normal to Google persons by their names before they meet.  Commercial firms are offering profiles of individuals via the Internet, going from professional backgrounds to criminal registrations, from private information to family and business relations, from birth notices to death notices.  
Individuals, are they an open source or should they be protected?  Does anything published on the Internet become part of the public domain or is it still private?
The World Wide Web allows us to act anonymously or use different names and even identities.  On the other hand, each move on the Web leaves an electronic trace.  And anonymity may be a safeguard for one's privacy, but it may also pose a higher risk of criminal activity.  In any case, the traceability of Internet users is much greater.  
Thus, does privacy include a right to remain anonymous on the Internet?  A technology-neutral approach may also be helpful in this case.  On a physical highway you are not obligated to identify yourself until you violate the law or pursue an activity with potential liabilities.  The same may be applied to the data highway.  The right to privacy includes the right to secrecy of correspondence.  How can this cover e-mail as well as private Internet postings?
Since users on the Web create their personality, they can create their social rules among themselves.  Those self-imposed rules may be stricter in protecting the privacy of the users.  This is helpful for chat rooms for minors.  
The right to privacy applies not only to individuals, but also to legal entities such as commercial firms.  This is particularly relevant with regard to client and business information of such firms and commercial privacy may be worth a lot.  
The Web may also be affected by intrusions and hacking.  This is regarded as a criminal offense under the international Cybercrime Convention produced by the Council of Europe.  This forum gives you the opportunity to express your views on the subject.  And I am happy, really, to listen to you all in order to reflect your contributions in my future parliamentary report.  
And again, thank you very much for your cooperation this morning.  

>> JONATHAN CHARLES:  Thank you very much.  And --
(Applause)
-- as she is saying, she is the Rapporteur, so she will be drawing up a lot of this and reporting on what is said here.  
Let me introduce Catherine Pozzo di Borgo, representing the government view.  She is the Deputy French Government Commissioner to the National Commission in France on IT and Liberties, the CNIL, in Paris.  Vice Chair of the consultative committee of the Convention for the protection of individuals with regard to automatic processing of personal data.  

>> CATHERINE POZZO di BORGO:  I'm here representing the consultative committee of Convention 108 of the Council of Europe.  The work of this committee is exclusively dedicated to the application of Convention 108.  At the moment, the committee has mandated its bureau to carry out an expert appraisal of the Convention in order to make sure that it is still relevant to the new technologies.  
This work should start very soon and continue next year.  In order to do so, the bureau has decided to take into consideration the various aspects involved, and we will bring together the opinions of the various interlocutors of civil society and consumers as well.  
Now, how do we manage personal information online to make it the most beautiful world, as we were told yesterday with music?  I think that, at its core, the Internet raises new issues that may not be accounted for in the existing framework.  Indeed, existing off-line measures to protect privacy may seem obsolete when they are applied to the Internet.  But this does not make them useless.  It just means that we have to think together about how to reactivate them, or to find new data protection rights, in order to provide effective protection in the Internet world.  
I would propose three points to deal with this question of the management of personal information online.  First of all, we should think about the reinforcement of the rights of data subjects.  Secondly, the enforcement of the duties of data processors.  And, thirdly, the creation of new principles, rights or obligations.  
As to the reinforcement of the rights of data subjects, we can see that all data protection laws give the individual the right to protect his or her data through the right of access and the right of rectification.  However, to make sure that these rights are still effective, they should be examined against the new needs created by the Internet.  
When doing so, at the same time, technical tools must be thought of in order to make these rights effective.  I can give you some examples.  
An online tool could check the identity of the person requesting rectification.  
Or, the files should be structured in order to make the right of access easy to do.  We can think, too, about making the access enlarged not only to individuals' own data, but also to information related to the flow of such data within the network, the categories of data used, and by whom it is finally used.  
Another example would be to organise the online right of rectification, which may require the creation of a clearly defined authority to deal with this kind of complaint.  
The second point for the data subject's rights would be to make the Internet user's consent compulsory, valid, and prior to the processing of the data.  
So once it is ruled that consent must be prior and compulsory to any processing, we must ask what we mean by consent in the Internet world.  Because on the one hand, you have what people accept to give to a company.  And on the other hand, you have companies asking for less privacy in exchange for cheaper services.  So there is no real balance of power.  And the answers to these questions are not straightforward.  That is the reason why the discussion about consent may be the next topic in the Internet security and privacy debate, and this topic may need intervention and supervision by the authorities.  
Indeed, it will be necessary to clarify the application and use of consent.  What kind of consent is necessary?  Should it be explicit?  Should it be used only for specific sensitive data only or only in specific situations which are harmful to privacy?
We must think, too, about some safeguards in order to make sure that the consent is valid.  For instance, the consent should not be forever, but it should be valid for a certain period of time after which consent would need to be obtained again if necessary.  
Furthermore, it should be possible for the data subject to revoke it easily.  Or maybe new consent should be given when data are being used for a different purpose.  Technically, consent may be given in different ways.  But it may be necessary to consider whether the technique adopted for gaining consent meets the requirements of valid consent.  For instance, that is what the Article 29 Working Party did in the case of online behavioural advertising.  
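A minimal sketch of how such time-limited, purpose-bound, revocable consent could be recorded is given below.  The field names, the one-year validity period and the purposes are illustrative assumptions for this example only; they are not drawn from Convention 108 or from any national law.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """Illustrative record of one data subject's consent for one purpose."""
    subject_id: str
    purpose: str                                  # e.g. "newsletter"
    granted_at: datetime
    valid_for: timedelta = timedelta(days=365)    # assumed validity period
    revoked_at: Optional[datetime] = None

    def revoke(self) -> None:
        """The data subject can withdraw consent at any time."""
        self.revoked_at = datetime.now(timezone.utc)

    def is_valid(self, purpose: str) -> bool:
        """Consent counts only if unrevoked, unexpired, and given for this purpose."""
        if self.revoked_at is not None:
            return False
        if purpose != self.purpose:               # a new purpose needs new consent
            return False
        return datetime.now(timezone.utc) < self.granted_at + self.valid_for

# Usage: before any processing, the controller checks for a currently valid consent.
consent = ConsentRecord("user-42", "newsletter", datetime.now(timezone.utc))
assert consent.is_valid("newsletter")
assert not consent.is_valid("behavioural advertising")   # different purpose
consent.revoke()
assert not consent.is_valid("newsletter")
```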
The second topic about managing personal information online would be the reinforcement of the obligations of data processors through two points:  compulsory information to be given to the data subject, and security.  Valid consent can be given only if clear, fair and visible information is provided at the right moment by the data processor.  So the information must give at the very least the details of the data processor, the data collected, the purpose for which they are collected and how long they will be stored.
The second point would be security, because existing texts such as Convention 108 or national laws ask data controllers to take appropriate technical measures to maintain security.  We may, however, wonder about the technical applicability of this principle, because those texts do not state clearly what is meant by "appropriate technical measures" in the context of the Internet.
So the situation is a bit uncertain.  
So more precise requirements may need to be thought of in order to achieve security on the Internet.  For instance, social networks could be required to ensure that restricted-access profiles are not discoverable by internal or external search engines.  Or they could be required to obtain the user's consent before any profile is made accessible to third parties.  
Some technical devices are being developed, such as obfuscation devices, which allow individuals to disguise their personal data in order to make them impossible to use.  But the development of such means would require cooperation between governments, industry and providers in order to define the best security tools to assure privacy.  
And the third point for the data processor would be to minimize the data.  The principle of restricting the amount of information collected to that which is needed to provide the service should guide the design of any system which involves the collection and storage of personal information.  That could be part of the privacy by design toolkit.  
And the last point we should think about regarding the management of personal information online would be the creation of new principles, rights or obligations, and we could think of two of them.  One is the creation of a specific right to be forgotten.  The other is privacy by design as a tool for privacy.  A lot has already been said about the creation of this right to be forgotten.
And the idea that information is not permanent is the real topic to be considered.  However, this idea needs to be examined carefully on the Internet.  Data are already covered by one of the main principles of data protection:  data must be kept for a limited time only, and afterward they must be destroyed.  But this principle is no longer effective on the Internet with the use of search engines.  So it is necessary to find a balance between privacy, the right to be forgotten, and the right to information.  We may think about technical solutions such as the creation of negative files which would permit a person not to be referenced by a search engine.  
And then the last thing would be to consider privacy by design as a tool to guarantee data protection and privacy.  On the Internet, the solution is clearly not to try to eliminate all the risk, but to find a way to take into account privacy and data protection rights from the very beginning of the design stage to the ultimate use and disposal.  
So the idea of privacy by design may offer tools to guarantee privacy and data protection as it incorporates data protection principles into the architecture of information and communication systems.  
So it would permit the definition of specific tools like encryption, the minimization of data, the definition of access rights, and the elimination of data which are no longer needed.  But this would need to bring together designers and users, manufacturers of systems and software, electronic communication service providers, data controllers and authorities, to define together better protection and privacy safeguards.  
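As a small, hedged illustration of the "minimization" and "elimination of data no longer needed" ideas mentioned above, the sketch below stores only a whitelisted set of fields and purges records after a retention period.  The allowed fields and the 90-day retention period are invented for the example and do not come from any Council of Europe text.

```python
from datetime import datetime, timedelta, timezone

# Assumed whitelist: only the fields the service actually needs are ever stored.
ALLOWED_FIELDS = {"display_name", "email"}
RETENTION = timedelta(days=90)   # assumed retention period for the example

def minimise(submitted: dict) -> dict:
    """Drop everything the service does not need before it is stored."""
    return {k: v for k, v in submitted.items() if k in ALLOWED_FIELDS}

def purge_expired(records: list[dict], now: datetime) -> list[dict]:
    """Delete records that have outlived the retention period."""
    return [r for r in records if now - r["stored_at"] < RETENTION]

record = minimise({"display_name": "Ana", "email": "ana@example.org",
                   "birth_date": "1980-01-01", "location": "Vilnius"})
record["stored_at"] = datetime.now(timezone.utc)
# 'birth_date' and 'location' were never stored; the record is purged after 90 days.
```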
To conclude, I would say that a technical problem does not simply require a technical solution.  Indeed, new technical tools and devices are necessary to guarantee privacy on the Internet.  However, we may look at the same time for new provisions in legal instruments, new definitions and guidelines, or self-regulation to do this.  
Thank you.
(Applause)

>> JONATHAN CHARLES:  Thank you.  Thank you very much.  And I think you're right.  That issue of whether we give active or passive consent to what we're doing, actually, will be a key measure in the future of how we approach the Internet.  And while she was speaking, Maud wrote something down that says:  What we need is a digital information shredder.  Just like we have a paper shredder.  
Richard Allan is representing Facebook.  He is the Director of Policy for Facebook.  And I should declare an interest here.  I was telling him, I declined Facebook suicide this year because I was worried about privacy.  I was stalked every day by hundreds of viewers of my television programme who wanted to be my friend online.  The difficulty is that if you're not very good at protecting your privacy on Facebook, and there are ways to do that, letting these people in would have meant giving them huge amounts of my personal information.  

>> RICHARD ALLAN:  I've offered Jonathan a one-on-one class on setting up profiles for people in the public eye.  
I'd like, really, to follow on from the previous speaker in terms of the complexity of what we're dealing with, and then spend a few minutes talking about where some of the keys to getting it right are, from an industry perspective.  The complexity comes from the way the Internet has developed.  20 years ago, even in the old days of the Internet, publication of personal data was something that large organisations did with very, very expensive Web sites.  Then, perhaps ten years ago, you started to get the advent of personal publishing tools, but those were really restricted to very technical people.  A small number of people using tools like blogging were able to produce personal publications online.  In the last five or six years, through Facebook but also through a whole range of user-generated content sites where you post videos, photos and text, every citizen being a publisher has become a reality.  That creates complexity.  We have neat notions of data controllers and data subjects.  You have notions like the household exemption, because things are either commercial or not commercial.  And yet we have moved into a world where a lot of things are both personal and commercial and where most people are both data controllers and data subjects.  That creates complexity, so it's timely that we try to navigate through it.  
I'd like to add to some of the previous comments.  I think a critical one is around trust.  Some of that is about brand trust.  A lot of it is about the responsiveness of people, organisations, that create the platforms when things happen.  
Looking to my left, I think of the massive and, I think, hugely successful efforts that Microsoft puts in to make sure that it responds to user concerns around trust and security; hugely successful with their user base.  In Facebook, when people raise issues, you feel that, and you feel that unless you are responsive, trust will be lost.  
The other part of the equation is your relationship with regulators and policymakers.  So, for example, we're engaged in active discussions with data privacy commissioners right across the world, not only in the EU but in Canada and Australia as well.  And we take the view that we have to engage in good faith in discussions with those regulators, irrespective of technical questions around jurisdiction, because our trust depends on the fact that we behave in a responsible manner when a regulator who is an expert brings concerns to us.  
So we may not always reach perfect agreement, but it's essential that we engage in that dialog, because privacy regulators are there as experts who generally and genuinely have good points that we should be responding to.  
And there are schemes, and I know there are different views around schemes like Safe Harbor, but they are an expression by organisations of whether or not they wish to be seen as a trusted partner.  
The second part is transparency.  There the biggest challenge lies in two words, "comprehensive" and "clear," and frequently those two conflict.  So the response we have taken is to develop a comprehensive privacy policy, because we need to tell people absolutely everything, and if we don't, that is a problem.  But, at the same time, we need to tell people the most important things in a simpler way.  So if you look at facebook.com/privacy now, you'll see a very simple graphical representation of the most important privacy settings for a Facebook user, and then you can click through and see the great screens of text that have to be there if we are going to be fully comprehensive in all of our disclosures.  
The third element is the tools that you provide.  Again, there are very strong views as to whether or not Facebook's tools are effective; we happen to believe they are.  We have iterated them, being very honest, in the face of user pressure:  users have said to us, we don't like the way this works, it's too complex, we need you to produce better tools.  We have iterated around those tools and we believe that they are now at a place where they are able to meet the requirements of users who want to share widely and of those who want to share on a much more restricted basis.  And that is our challenge, to achieve that.  
We announced recently the launch of a service around checking into locations called Facebook Places.  Again, we had a lot of feedback even before we launched it about what good tools would look like for that.  And we have built in particular safeguards.  For example, minors cannot share their location beyond their immediate friend circle.
There are notifications so that people are told when somebody has checked them in, and they are able to accept or reject that.  
And then the final area is that of teaching, and being able to educate and inform people.  As platforms, we have a responsibility because we are touching people's lives on a daily basis, but there is also a broader education needed around protocol, in respect of me publishing personal information about you.  Those issues can be resolved through legal means and through a platform complaint system, for example.  
But, there is something more immediate, which is if you published a photo of me and you are my friend, what are the common social understandings between us about my rights in respect of not having that photo published and your rights in respect of publishing that photo?  There is a lot of social development around understanding now that we are all publishers, how we should behave to each other.
So we are interested in the Council of Europe approach to promoting openness and connectedness.  That is a Facebook value as well.  We hear the areas of significance that you have set out for your report.  And we look forward to you helping to guide us as we try to develop popular and widespread technologies that take privacy concerns into account.  

>> JONATHAN CHARLES:  Richard, thank you very much.
(Applause)
I think that shows just how complex the issue is, apart from anything else, how many areas it covers.  
I've got Katitza Rodriguez here representing International Privacy Programme, right?  No you're not.  Sorry, just the EFF.  

>> KATITZA RODRIGUEZ:  Thank you.  My organisation is the Electronic Frontier Foundation.  It has members around the world, in 67 countries, and its mission is to protect civil rights online, to enable access to knowledge, to empower consumers and to foster technological innovation.  
On privacy, EFF's major concern is data that is stored on the Internet:  if data is stored on the Internet, it can be compromised.  For instance, a major concern is the issue of security breaches, what we call data spills, or the spillage of data.  
The second is the issue of government requests.  We see, especially in the United States, how governments want to access citizens' data hosted by third parties:  social networks, companies, search engines and others.  
Another major concern, or priority of work for us, is the need to make sure that these parties do not give easy access to this data.  When we analyze privacy law, we take a holistic approach.  We are not only thinking about privacy, but also about other fundamental human rights, freedom of expression, and digital due process concerns.  
We believe that companies should ensure that users can exercise the right to protect their personal data.  Companies must ensure that consumers have control over the disclosure and use of their data.  
I will make some comments on the conversation so far.  I would say that, for us, many consumers are not aware of how their personal information is collected, used and disclosed.  
Transparency in data collection and data processing is quite important in order to be able to give meaningful consent to the processing and disclosure of our own personal data.  
But how can we exercise this control if there is a lack of ways to manage our own information?  Privacy policies are still very complicated to understand.  Many people just don't read them.  
Many policies are extremely opaque, sometimes long, detailed documents.  Do I want to read 50 pages of my contract?  Even a lawyer will not read that.  Real consent should be freely given and obtained through fair means, and that's important.  
At EFF, we have articulated a set of social network users' rights, and among them we believe there is a right to informed decision-making.  Users should have the right to make informed choices about who sees their data and how it is used.  Users should be able to see who is entitled to access any particular piece of information, and not only other people:  also advertisers, applications, Web sites and others, and government officials when they request the data, in which case we should be notified.  
Whenever possible, users should have an opportunity to respond when governments or others seek information about them.  
We know, for instance, that if we review the previous problems of Facebook (not the latest changes; they have improved the interface), we can see more generally the problem of software design and the complexity, for a user, of making even a minimal choice about which privacy or control settings should be enabled.  What are the privacy settings by default?  The defaults are not necessarily what people want, but what the company wants, which is to disclose information.  There are provisions, like those in Canadian privacy law, that require certain forms of consent, and that might be the right direction of policy for our consideration.  
An example of a bad interface could be the discussion around Google Buzz, when the launch made Gmail users' frequent contacts public, for instance.
Regarding Convention 108, there are some things that we would like to see in that review.  We would like to see not only how the principles have changed and how the principles themselves can be improved, but also how the Member States of the Council of Europe have implemented and enforced them in their national legislation:  how enforcement has been working or has not been working.  That would give a very good starting point to see whether or not there are effective privacy protections.  Data protection principles won't matter if we don't have a commitment of resources and political weight to enforce the standards.  So enforcement is also very important.  
That's everything I have to say.  Thank you.  

>> JONATHAN CHARLES:  Thank you very much indeed.  
Well, now it's your chance to ask any questions that you might have.  And if you don't have some, I do, and also of course our remote moderators may have some questions, but they will let us know if they do.  
Please, anyone here would like to have a question for the panel after that discussion?  Yes.  
Sorry, just for the transcription, if you say who you are, of course, every time you stand up.  And there is a microphone there and there is a microphone over there as well.

>> AUDIENCE:  Good morning.  Katarina Syke from the University of Leeds.  Thank you for your presentations.  I only have one question:  Where is Mr. Allan?  

>> JONATHAN CHARLES:  Yes.  That's a good question.  Maybe he will return.  He seems to have been called out.  That's all I know.  But he will be back.  
Anybody else want to ask a question?  Yes?  Preferably not one for Mr. Allan until he comes back.

>> AUDIENCE:  I'm Serge from the European Forum.  I have a question:  Why do you refer to individuals as data subjects?  I understand you want to stress the point that the off-line and the online environments should be treated the same.  So why do you introduce this terminology, referring to individuals with rights as data subjects?  

>> JONATHAN CHARLES:  It's depersonalizing it in some ways, isn't it?  Data subject, rather than people.  

>> CATHERINE POZZO di BORGO:  Sorry.  I never thought about that question.  

>> JONATHAN CHARLES:  Interesting thought.  

>> CATHERINE POZZO di BORGO:  Yes.  It's very interesting.  

>> JONATHAN CHARLES:  Remote moderator.

>> I'm Ginger, remote moderator for the remote group, and I'm speaking now for the remote hub in Burundi.  They indicate that they appreciate the opportunity to participate remotely and that you are taking them into account.  They particularly commended Katitza's intervention on privacy.  

>> JONATHAN CHARLES:  Thank you.  I have a question.  Maybe people here might like to think about it.  It strikes me that we talk a lot about data and about Internet users as though people are quite good at the way they use the Internet.  But the truth is that most people are not very good at protecting their privacy.  Even when they think about it, they are often inept at it, and we have to think about protecting people against themselves.  That's one of the issues.  We have to assume that most people actually need protection.  

>> KATITZA RODRIGUEZ PEREDA:  I think that young people are aware of that.  You could see that when Facebook changed its terms of service:  the big reaction came from young people, who wanted to push back against changes that infringed their privacy.  Some data was at first shared only with your friends, and then it was made available publicly to third parties.  There was a big protest, so you could see that people are reacting to that.  But it's also true that people believe in freedom of expression.  So I have some concerns when regulations try to put obligations on us.  Speaking in my personal capacity, I favor the European model because it strikes a balance between the power of companies and individuals.  But when you try to put those obligations on individuals in the same way as you put them on companies, I think it's not proportionate.  Because you also like to talk, to discuss, and to use all the social networks and technologies that give you a way to exchange opinions.  So I would have concerns.  I would want a clearer evaluation of how to deal with specific problems, addressing not only the privacy problem but also freedom of expression and due process concerns.  

>> JONATHAN CHARLES:  Maud?  

>> MAUD de BOER-BUQUICCHIO:  Yes.  I agree that it's important to teach individuals how to protect themselves, and, as you mentioned earlier, I'm very concerned about children's rights and the protection of children.  Precisely in order to enable children, starting from a very young age, to become aware of the risks of using the Internet, we have designed an online game called "Wild Web Woods."  WWW, but in this particular case it stands for Wild Web Woods.  It has been translated into 24 languages so far.  It's online on our Web site, and we have, I think, something like 3 million visitors a month.  It's a game which consists of a trip to an imaginary city where children travel through the woods and come across dangers.
And they are asked to identify themselves.  But they are being taught that this has certain consequences and risks and it leaves traces and so on.  So I think there are ways to teach children from a very young age about the risks of the use of Internet.  

>> JONATHAN CHARLES:  Maybe it's harder to reach older people.  You can reach young people, but people my age and older, it's more difficult to protect them because they are not as adept at using it.  

>> MAUD de BOER-BUQUICCHIO:  One of the children was 99 years old.  We checked the ages of the users.  

>> JONATHAN CHARLES:  A quick intervention, yes.  Daniel Dardailler.  

>> DANIEL DARDAILLER:  Yes, I'm from the W3C.  I wanted to give a short update on the privacy technical area.  For us, every layer of the Internet has to care about privacy, of course.  But as far as the Web is concerned, we have been working for a few years on a platform for privacy that starts by defining a common vocabulary.  You have to start with the basics, which is:  what is it that you want to protect?  Your name, your picture, your bank account.  And what sort of actions can people take on that?  You can revoke access, you can grant access.  All those data types and that vocabulary have to be common.  Otherwise, nothing is going to happen.  
So we are at the level now where people are thinking about the policy language that would use those words, the vocabulary, as objects.  This is very difficult because it is close to a human language.  You want to express something in English or in Chinese; you want to say it's okay to use my picture if you want to count brown eyes, but if you want to associate my picture with my name, then it's not okay.  It's complex.  And to some extent it reminds me of 15 years ago, when I started at the W3C and we were working on the protection of children.  The same thing happened.  It was very hard to define a basic language for describing harmful content.  In some countries, seeing frontal nudity of a woman is considered indecent.  In France, on the beach, you see that all day.
So, how do you deal with that?  You have to define objective criteria to start with.  What is nudity?  What is frontal?  Those are anatomical things; they are not Web things.  So there comes a point, the same for privacy, where the specialists in privacy have to come to the technical working group and define the vocabulary with us.  We are at this level.  
And it's difficult because, just as for child pornography, there is no single definition.  Some countries will disallow making a montage picture, things that never involved any human being; you can draw something and it can be illegal in one country and legal in another.  The same goes for privacy.  
So we're faced with the same issue at the policy level:  we have to find a language that expresses all of the possible policies.  It's going to take some time.  
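To make the idea of a shared vocabulary plus a policy language concrete, here is a hedged toy sketch.  The data types, actions and rule format are invented purely for illustration; they are not the actual W3C vocabulary or any standardised policy language.

```python
# Toy vocabulary: the kinds of data and the actions a policy can talk about.
DATA_TYPES = {"name", "picture", "bank_account"}
ACTIONS = {"view", "aggregate", "associate_with_name"}

# A rule says whether an action on a data type is allowed for a given purpose.
RULES = [
    {"data": "picture", "action": "aggregate", "purpose": "count_brown_eyes", "allow": True},
    {"data": "picture", "action": "associate_with_name", "purpose": "*", "allow": False},
]

def is_allowed(data: str, action: str, purpose: str) -> bool:
    """Evaluate the user's policy; unknown terms or no matching rule mean deny."""
    if data not in DATA_TYPES or action not in ACTIONS:
        return False
    for rule in RULES:
        if (rule["data"] == data and rule["action"] == action
                and rule["purpose"] in ("*", purpose)):
            return bool(rule["allow"])
    return False

# Mirrors the speaker's example: counting brown eyes is fine,
# linking the picture to a name is not.
assert is_allowed("picture", "aggregate", "count_brown_eyes")
assert not is_allowed("picture", "associate_with_name", "newsletter")
```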

>> JONATHAN CHARLES:  Thank you.  We will hear more from you in just a moment.  One question I would have asked Richard Allan if he were here:  there is a contradiction here, isn't there?  The more private we want to be, the more inconvenient that is for companies like Facebook, because the more data they can share, the more profitable it is for them, potentially.  And this is where one of the conflicts lies.  It's not necessarily in the company's best interest to promote privacy.  

>> KATITZA RODRIGUEZ:  I have one comment on that.  We believe that we need to promote competition, for instance among social networks and other services, because then a user can, if they so decide, move from one place to another.  And to be able to do that, you might need what is called data portability, which means that you are able to take all of your data and move from one service to another.  But of course this works only if there is another player.  
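A minimal sketch of what data portability could mean in practice is exporting a user's own records into a neutral format that a competing service could import.  The record layout below is a hypothetical example, not any provider's actual export schema.

```python
import json

def export_account(profile: dict, posts: list) -> str:
    """Bundle a user's own data into a provider-neutral JSON document."""
    return json.dumps({"profile": profile, "posts": posts}, indent=2)

def import_account(blob: str) -> dict:
    """A competing service reads the same document back in."""
    return json.loads(blob)

blob = export_account({"name": "Ana", "email": "ana@example.org"},
                      [{"date": "2010-09-15", "text": "Hello from Vilnius"}])
assert import_account(blob)["profile"]["name"] == "Ana"
```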

>> JONATHAN CHARLES:  We will stop this part of the panel and we will change to our second panel now, because we have 40 minutes or so left of this session.  
Again I want to give you a chance to comment at the end of the second panel.  Thanks to this distinguished panel.
(Applause)
Perhaps you can move to the side chairs and the next panel can move up.  
Let me say where everyone is sitting, just for the captioning.  Daniel, you're there, and Cornelia, you are here.  And Kurt is not here.  Great.  Thank you.  Yes.  Peng, why don't you sit there.  You can sit at the side if you'd like, Katitza.  
So thank you very much.  Let's move on to the second stage of our discussion today, which is about Internet freedom:  structural and operational openness online.  The format will be the same.  Again we will have several speakers and then a chance to debate all of this.  And I'd like to welcome back Markku Laukkanen, our Finnish Member of Parliament and Chairman of the Subcommittee on the Media of the Council of Europe Parliamentary Assembly.  

>> MARKKU LAUKKANEN:  Thank you very much.  When we talk about Internet freedom, we need to talk about regulation and human rights, first of all, because the Council of Europe is concerned with human rights issues.  Article 10 of the European Convention on Human Rights guarantees the right to freedom of expression and information, irrespective of national borders.  The right to freedom of expression and information is the foundation of a democratic society and is one of the basic conditions for its progress and for the development of every individual.  
And that is why I want to stress that it is one cornerstone of Western democracy.  When we talk about Internet freedom, this right, which is also in Article 19 of the UN declaration, applies to traditional media and the Internet alike.  However, it is an abstract right which has to be transposed in a particular way into our rapidly changing media environment.  
Internet media have revolutionized this freedom by enabling virtually everyone to communicate with an unlimited number of individuals around the world.  And that's why we politicians need to follow this development very carefully, for example in China and other countries which want to control Internet freedoms for individuals.  
Print media depend on a physical distribution system, bookstores and kiosks; television and radio depend on radio frequencies and satellite access; and communication depends on cables and radio frequencies as well as on universal access.  In those areas, the standards have been established over the years.  
The structural requirements for Internet media are more complicated, because they depend on software and hardware as well as on telecommunication infrastructure.  
Besides those technical requirements, Internet media also require operational openness.  Users must have access to content and be free to contribute their own content.  The amount of content uploaded onto the Internet today is overwhelming.  Internet freedom and openness become even more important with the steady increase in audiovisual media on the Internet and the majority of users shifting from traditional media to Internet media.  We have a lot of ideas today.  We need good cooperation between policymakers and end-users, and good guidelines and engagement for these targets.  And we also need a good understanding of what Internet freedom really means.  Internet freedom also means accountability, trust and a commitment to avoid a digital divide, to give opportunities to everyone, helping to build better economies and societies.  
We also have to remember that Internet freedom is a very public but also a very private issue.  We need rules to respect all human rights, such as free expression and privacy, while fighting cyberfraud, protecting children and fighting terrorist crimes.  We all remember what happened in Estonia in 2007, which was a very organised attack on their cyberworld.  
The Council of Europe Committee of Ministers decided in 2009 to start work on the protection of the cross-border flow of Internet traffic and the protection of critical Internet resources, which are essential for the functioning and borderless nature of the Internet and for connectivity.  
New parliamentary reports, initiated by our late and respected colleague Andrew McIntosh, will probably be released next year.  And that's why we are here, listening to your views on this topic.  
Looking at this issue, I see that Internet-based media will soon have the same role in ensuring the freedom of expression and information necessary in a democratic society as television and newspapers had in the past.  It is therefore important that we identify and tackle the structural and operational restrictions on Internet freedom which may endanger the freedom necessary for every individual and for democratic states in the 21st century.  This second panel will help to collect information and contributions for the new report of the Parliamentary Assembly.  I appreciate your cooperation in this process.  Thank you.
>> JONATHAN CHARLES:  Thank you very much, indeed.  Let me now welcome again Daniel Dardailler, Associate Chair for Europe and Director of International Relations and Offices, World Wide Web Consortium.  

>> DANIEL DARDAILLER:  I wrote a short paper which gives more detail, but here I want to describe my vision of the technical community of the Internet:  the W3C, but also the IETF and the Unicode Consortium.  People there work with a basic concept of openness that actually produces a larger sense of openness and freedom for the user.  
So, in fact, let's look at freedom from the point of view of giving rights to people.  The first right, which I think everybody is using today, is the right to choose their platform.  There used to be a time, 20 or 25 years ago, when if you bought a Novell, Microsoft or Apple computer, you could only connect to a similar computer.  That was only 20 years ago; things move very fast.  Nowadays you can buy anything you want and connect to anybody you want.  That is something that came about because the Internet community insisted on it.  It's called interoperability.  No matter what you run at the software level, you should be able to talk to somebody else, because the standard, the language that you have to speak, is free and public, and everybody can use it.
And everybody can participate in its design and in its evolution.  So that is the second right that I think is important, the right to participate.  It's not so much for the user; it's more for the developer.  But in a world like the Web, with Web apps, everybody becomes a developer.  You can write an application for the iPhone, or for a Web page with Google as the platform; it's very easy.  So everybody is a developer, and everybody has something to say about, say, the HTML platform.  Everybody uses HTML as an author, so everybody should be able to say what should go into the next version of HTML.  That's why bodies like the IETF and others are open to participation, because we need people to tell us what the next technologies are going to be.  It's just a matter of mathematics.  
There are perhaps 50 to a thousand of us in this community, and we serve billions of people.  There is no way we can get it right if we do it alone.  So we have to get people on board, and so we give them the right to participate.  Once people have that right, they actively participate.  
So the third right, which is also very important, is the right to access.  We talk about the right to access at various levels.  There is access to the infrastructure, which is not a given in a country where the cost of access is basically the cost of living.  In developed countries it's a commodity; it's not a real cost, it's something people have already absorbed.  That's not something the W3C can do much about; it's more a digital divide issue.  We created the Web Foundation to start projects to that effect, but it sits below the layer where we work.  
So the problem we focus on is access for people with disabilities.  The technology exists to transform text on the screen into voice; it's very easy, it costs 50 bucks, and everybody should have it.  But the page itself has to be designed so that it is actually text friendly.  If you have a page with a lot of 3D, a lot of graphical and visual information, you have to think about how to serialize that so it can be conveyed by voice.  Those things are possible; people are just not educated to do them.  
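One small, concrete piece of that "text friendly" design is making sure every image carries a text alternative that a screen reader can voice.  The following is a minimal sketch in Python; it is not anything from the speaker's remarks, the sample page fragment is invented, and real accessibility work covers far more (structure, labels, captions, ARIA).

```python
# Minimal illustrative check: flag <img> tags with no usable alt text,
# so a text-to-speech tool would have nothing to read for them.
from html.parser import HTMLParser


class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []  # sources of images lacking a text alternative

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if not (attr_map.get("alt") or "").strip():
                self.missing.append(attr_map.get("src", "<unknown>"))


# Hypothetical page fragment, used only for this sketch.
page = '<p>Accident map:</p><img src="map3d.png"><img src="logo.png" alt="City logo">'
checker = AltTextChecker()
checker.feed(page)
for src in checker.missing:
    print("Image with no alt text:", src)  # prints: map3d.png
```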
So that is a right to access as well.  And the same thing applies to everybody: the right to access useful data.  If you cannot access your government's data, which you paid for, by the way, with your taxes, that is in a way denying your rights.
So a lot of countries are starting eGov linked data projects.  That's good.  I'm waiting for the day when, in my town, I can spot all the bicycle accidents on a Google map.  I'm a cyclist.  That is happening in some countries already.  We have to promote this kind of thing.  It's the right to access data.  
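As a sketch of what such an open data project makes possible, the snippet below downloads a city's accident records and counts them per district, a first step towards putting them on a map.  The URL and column names are hypothetical; they stand in for whatever a real open data portal would publish.

```python
# Illustrative only: fetch a (hypothetical) open-data CSV of bicycle accidents
# and count records per district.
import csv
import io
import urllib.request
from collections import Counter

DATA_URL = "https://opendata.example-city.org/bicycle_accidents.csv"  # hypothetical


def accidents_by_district(url):
    """Download the CSV and return a Counter of accidents per district."""
    with urllib.request.urlopen(url) as response:
        text = response.read().decode("utf-8")
    return Counter(row["district"] for row in csv.DictReader(io.StringIO(text)))


if __name__ == "__main__":
    for district, count in accidents_by_district(DATA_URL).most_common():
        print(f"{district}: {count} accidents")  # could then be plotted on a map
```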
And I think that's all for me, so I just want to conclude with something that, I believe, a Frenchman, Jean d'Arcy, said 40 or 50 years ago: with the advent of telecommunications and wireless everywhere on the planet, suddenly it was clear that everybody was going to be connected, because the technology was coming.  So the idea was to start thinking about the right to communicate.  It's not just the right of freedom of expression; it's not abstract.  It's actually the right to be connected to the network, and that's something we talked about earlier.  Some countries are, you know, passing broadband access laws.  And that's very important as well.  
So, I'm done.  

>> JONATHAN CHARLES:  Daniel, thank you very much.  I think there may be such a thing as a bit too much government data online.  Yesterday I was accessing my tax data.  That was not pleasant.  
From the industry side, I'll welcome Cornelia Kutterer, from Microsoft, who is the Senior Regulatory Policy Manager in EMEA in Brussels.  

>> CORNELIA KUTTERER:  Thank you for inviting me to this plenary session.  I would like to focus on one aspect of Internet freedom, which is freedom of expression, and I would like to talk mainly about how we address this issue, which is increasingly important for companies; they are increasingly faced with tensions around freedom of expression.  There are basically two projects in which we participate that I would like to go into in a bit of detail.  And, as it happens, Microsoft recently had some very bad press about an incident, and I would also like to comment quickly on that at the end of the presentation.  
The starting point is that freedom of expression is a human right and a guarantor of human dignity.  It can only be restricted by governments in very narrowly defined circumstances, based on internationally agreed laws and standards, such as the International Covenant on Civil and Political Rights.
And another starting point for the discussion should be that online service providers have a vital role to play in ensuring free speech protections in the online space.  
We do believe that, while governments are principally responsible for promoting and protecting human rights and fostering the safety necessary for the information society to flourish, Microsoft and other online service providers likewise have a responsibility to promote respect for human rights and international humanitarian law.  And I'd like to refer in that context to a recent report undertaken for the UN by Mr. Ruggie, who explored this in much more detail.  
Consistent with the Council of Europe guidelines on law enforcement and Internet service providers, and with Microsoft's participation as a founding member of the Global Network Initiative, we are committed to respecting and protecting the freedom of expression and privacy rights of our users.  So we do recognize that international law and standards allow, and sometimes require, governments to limit certain types of speech in certain narrowly defined circumstances.  
As a founding member, we have helped establish principles and guidelines that guide us in addressing freedom of expression and privacy in information and communications technologies, and they provide an important forum for stakeholder feedback and training.  
Members consist of companies, human rights NGOs, academics and investor organisations.  
And, in addition to these principles, we have agreed on implementation guidelines that give detailed guidance to ICT companies on how to put these principles into practice.  
Just to list a number of the operational commitments Microsoft has undertaken in that context: to employ human rights impact assessments to identify circumstances where freedom of expression and privacy may be jeopardized or advanced; to train employees on procedures to protect freedom of expression and privacy when faced with government demands and restrictions; to request that governments follow established domestic legal processes when they seek to restrict freedom of expression; and to consider challenging governments in court or other formal forums when faced with restrictions that appear inconsistent with domestic law or with international human rights laws and standards on freedom of expression and privacy.  
And to establish a high level of transparency with users when required by governments to remove content or limit access to information and ideas, and about the circumstances in which we might be required to disclose personal information.  
I would also like to mention in this context that the transition to cloud computing certainly magnifies the importance of protecting freedom of expression online.  Cloud providers should therefore implement policies and practices for responding to government requests for user data, and inform individuals of the circumstances in which data will be provided in response to law enforcement requests.  
First of all, I would like to thank the Council of Europe for having recognized the Global Network Initiative with a reference on its Web site.  And we hope that European policymakers will recognize these efforts and foster the internationalization of those standards in Europe, with European companies in particular.  
We also look into technology and future developments.  Microsoft is part of a consortium called Trust in Digital Life, funded by the European Commission, which looks in particular into how to enhance and maintain privacy, security and human rights in the context of technology, including hardware, infrastructure and software.  
And, finally, we have internalized respect for these human rights by developing a process called the Security Development Lifecycle for our software and services, which ensures that developers assess the impact on privacy and security at defined points.  
I would like to refer you to a blog post published yesterday by our general counsel, Brad Smith, on an incident in Russia which was reported in the New York Times on Sunday.  Basically, it reported that the Russian authorities have used intellectual property enforcement, invoking Microsoft's IP rights, against NGOs in Russia, and we have taken that very seriously.  It's not good, and it's not nice to read something like that in the press.  Our antipiracy efforts are legitimate, but we do not accept that they be misused by governments.  So we are taking very concrete steps to address the situation in Russia, and again I would refer you to the description of those steps in the blog post by Brad Smith.  

>> JONATHAN CHARLES:  To clarify that, the Russian authorities were using that to identify users, is that what is happening?  

>> CORNELIA KUTTERER:  No.  What happened, as reported in the New York Times, was that material and computers were seized at the premises of that NGO.  So some of the concrete steps are legal assistance to the NGOs, which we are deploying, and a licence offering: basically, extending our donation programme for free software licences to NGOs and independent newspapers.  And we are looking at how we can actually deploy that on a wider scale.  
And just to mention that the response was very quick -- I can assure you we were on the phone worldwide, all Monday, to address the situation.  
We have also reached out to human rights organisations in that context.  And I would say that this has been facilitated by being a member of the Global Network Initiative; that forum helps in addressing these issues.  

>> JONATHAN CHARLES:  Thank you very much indeed.  Before Peng gives his conclusions, it's your chance to have your say.  Sadly, we don't have Richard Allan with us.  

>> MARKKU LAUKKANEN:  Cornelia, that was a very important talk, and I listened closely to what you said.  I was very happy with your paper; I have read it.  It shows that Microsoft has really thought about this issue, and I feel that you carry a very high responsibility on this question of Internet freedom.  
And the example you gave about the Russian government was also very interesting.  Because when you go there, or to China, it's not only business for you; it's also a profoundly ethical question.  That's why I'm happy that you recognize this question as an ethical question, and why I hope that the whole industry, the whole sector, could produce papers like this which show that they recognize the issue and that it is not only business.  Thank you.  

>> CORNELIA KUTTERER:  Thank you.  

>> JONATHAN CHARLES:  Our remote moderator?

>> We have a comment from the remote hub in Burundi from a gentleman whose name is, I hope, pronounced Mrida Samwel.  He directs his question to Daniel Dardailler and comments that the emerging issue of cloud computing is somehow not yet relevant for developing nations, which have big accessibility problems.  Since W3C does not address the problem of access, are you able to do anything or make suggestions about capacity building, both on application issues, such as data protection within cloud computing, and for developing regional resources?  

>> DANIEL DARDAILLER:  Well, with respect to cloud computing, it is a new form of computing that relies heavily on the Web, so it is something that is going to go everywhere, in Africa as well.  It's just a way of looking at data storage and data exchange; it's nothing very different, just the same technology applied in a particular way.  
The thing about the digital divide is that it involves not just Internet traffic and the cost of exchanging data.  I have read things that are really bad, for example that countries in Africa have to pay for the data going to Africa but not the other way around; I have little knowledge about that.  At the level of the application, which is what I work on, the Web, we basically have all the technology to make data accessible to everybody with low bandwidth.  But what we need are projects that actually use the platforms available in those regions.  Most likely that is going to be the mobile platform.  That's why we have a very important mobile Web activity at W3C: because people have mobiles.  I think there are three to five times more mobiles than computers.  So we need to rely on that platform, knowing that the devices people have in Africa or in other countries are two years behind the ones we have here.  We have to adapt the technology for platforms that are two years old, which we no longer have here.  This is complex, which means we have to work much more than usual with people on location, with the companies and NGOs that know those platforms better than we do.  There are projects starting at W3C, with help from various grant organisations and from the Web Foundation, which I mentioned, that will focus specifically on that.  
I don't know what else we can do for the lower layers.  That is not our area; that gets into the workings of the Internet itself.  Everybody has their own role.  The difficult thing is that when you look at a problem below you, as we do every day at W3C, we see problems with IP, IPv6, IPv4, about which we can do nothing; we sit on top of that.  All we can do is make the Web more flexible, able to work on any kind of network, not just a high speed network.  And that's what we're doing.  

>> JONATHAN CHARLES:  Thank you very much.  I want to throw something into the mix here now.  There is a lot of talk of the Web moving back towards walled gardens.  We're not moving back to the old days of AOL, but we are moving to a position which is away from the current one, backwards in some ways, or forwards, as some companies would say.  Walled gardens may be good for privacy, but bad for openness.  


>> DANIEL DARDAILLER:  No, they are bad for both privacy and openness.  
And think of the current situation: I don't think they are going to last.  Nobody wants to give their data to a company that may not be there in a couple of years, and that's what we're doing today with all of the social networks.  And there is a movement coming from the bottom that says: let's revisit the entire architecture of the social network, using open technology.  I'm not going to go into the technical detail, but it is possible to make a completely privacy-safe social network where you control things.  It needs a lot of education for people to move to such a platform, and an understanding of why it's good for them to do so.  But I think it's a temporary situation we are in today, much like the networks before the Internet, when we had dozens of networks and everybody wanted to be on the same network.
Here we are, we are on the same network.  But on top of that, we have applications that are different.  That's okay; a social network application is just one application.  
So, we're not calling for one single application, but for one platform for those applications to actually interoperate.  
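A toy way to picture that "you control your data" architecture: instead of posting to a central service, you publish a small feed from a machine you control and decide who may read it.  Everything below (the token scheme, the port, the feed format) is an assumption made for illustration, not an existing protocol or product.

```python
# Illustrative sketch of a self-hosted, access-controlled feed: the owner of
# this server decides who gets a reader token, so no central company holds
# the data or the relationship.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

FEED = [{"posted": "2010-09-15", "text": "At the IGF in Vilnius."}]
READER_TOKENS = {"token-given-to-a-friend"}  # handed out personally by the owner


class FeedHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.headers.get("X-Reader-Token", "") not in READER_TOKENS:
            self.send_response(403)  # unknown readers get nothing
            self.end_headers()
            return
        body = json.dumps(FEED).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("localhost", 8080), FeedHandler).serve_forever()
```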

>> JONATHAN CHARLES:  Thank you.  I don't know whether anyone else wants to comment on that.  Yes, a question over there.  If you could identify yourself, that would be great.

>> I'm a member of the Irish Parliament, but also a member of the Council of Europe committee.  I might be off base or on the wrong track.  The first comment is about social networks.  We talk a lot about social networks, but are we becoming more antisocial the more social networking we do?  Are we going through life with our BlackBerry, telling the rest of the world what we're doing, as opposed to experiencing what we are actually doing?  
But the comment on which I was looking for information, if the panel have any thoughts on it, is the Gallo report that is going to be debated in the European Parliament next year, around the whole issue of enhancing the enforcement of intellectual property rights in the internal market.  As a musician by background, I'm always very interested in European creative artists, and also very interested in everybody having the right to access information.  But the big question I hear, and this is inviting the views of the industry, is: who will pay for the creative artists of the future if we open up more and more not only the regulated sharing of information but also the illegal sharing of information?  How are we going to pay people in the future to be the creative artists?
One man wrote 11 books, and someone scanned them onto the Internet.  Everyone has access to them, but he has no income anymore.  It's probably not the topic here, but I would appreciate the views of the industry.

>> JONATHAN CHARLES:  Copyright is a messy issue for everyone.  

>> CORNELIA KUTTERER:  I'm not following that dossier, so I wouldn't be able to comment on that specific report; it's outside my scope.  

>> MARKKU LAUKKANEN:  Just briefly, I want to say that we have talked about this copyright issue many times in the Committee on Culture of the Council of Europe.  It's not enough to have some local copyright deals; we need global deals, the same laws all over, because the Internet is global.  I have seen that this is a very difficult question for many copyright organizations, and that's why we have invited them to the same table, to try to find some solutions to this very, very big issue.  But I can't answer when this report will be discussed in the European Parliament.  Maybe there is a member of the European Parliament here who can answer that.  

>> JONATHAN CHARLES:  The gentleman there.

>> I'm Uri Guspri.  I'm a member of Parliament from Finland and a colleague of Mr. Markku Laukkanen.  I have a comment and two questions.  The comment is about the importance of regulation.  The problem is that regulation is only as good as the people you are regulating.  That is why, for example, in Finland we have started data protection days, where we try to inform people so that they behave in a more sensible manner.  
But my two questions concern the presentations here.  First, there was a very good point about access for disabled people: we have to transform materials into forms that are accessible to them.  But are we going to have the kind of changes in our copyright system that make this possible?  Because we can do it nationally.
But if a Finnish blind student would like to access material that has been converted in Britain, he cannot do that because of copyright restrictions.  
And the other one is that, as you mentioned, service providers have a key role in protecting free speech and freedom of expression.  But are the companies not actually becoming more and more cautious?  Take the Apple App Store: they are sensitive about what you can put there, and they are taking cartoons out, not only adult content.

>> CORNELIA KUTTERER:  Let me answer the second part, the topic with which I'm more familiar.  I think this is a really difficult question, and our experience in the Global Network Initiative is that it provides a platform where we can discuss these difficult questions.
There is also currently, at least in draft form, a recommendation from the European Commission on a public/private partnership for taking down illegal content, and some trade associations have raised concerns about its impact.  So even if you address these issues through a public/private partnership or self-regulation, we believe that the impact on freedom of expression needs to be taken into account.  
But it is very clear that the multi-stakeholder approach is a key part of moving forward and coming to reasonable solutions, and the kind of examples I mentioned can certainly be discussed openly in such a forum.  

>> JONATHAN CHARLES:  On that other issue, it doesn't matter how good regulations are, it's up to companies as well to regulate themselves, alongside the regulations.  If they don't do that, then the regulations are often meaningless.  

>> CORNELIA KUTTERER:  Absolutely.  And maybe just one point on the comments we received online on my speech.  In the end this is also a corporate social responsibility discussion, which is emerging now just as other areas, like employment rights, were topical in that context a couple of years ago.  If you look at reactions to incidents that are widely reported in the press, this definitely also has an impact on the economic value of a company, so the company has an economic interest in not having these types of issues arise.  That is another very important reason to be part of the GNI: to be able to address issues that arise quickly and appropriately.  

>> JONATHAN CHARLES:  Very quick comment from Daniel before we go to conclusions.  

>> DANIEL DARDAILLER:  We can spend hours on digital rights management.  There are rights on the work as well, so it's important to talk about, but it's different.  You cannot set openness against rights.  We have copyright at W3C; our Web site is confidential at some levels, with parts open to our members and parts open to the public.  So they are not opposites.  You can manage people's right to use something with technology, but the thing that is always needed is a social contract.  I can access all the private data of the W3C, but if I start publishing it here, I'll be breaking a social contract.  It's not the technology that forbids me from doing that; I cannot do it because I'd be fired from W3C.  That's the thing people have to understand: there is a need for more social contract on top of the technology, which facilitates those kinds of transfers.  

>> MARKKU LAUKKANEN:  Good regulation is very small regulation, minimum regulation.  It's like an enabler.  And that's why, when we talk about illegal content or how to protect children, of course there we need regulation.  And all stakeholders need to be involved in this work.  
And I'm happy that this copyright issue came forward for discussion now, because copyright should never be a restriction on services.  Never.  We were ready to offer some TV services through mobile, but we couldn't do that because of copyright, and that is not what copyright is ideally for.  That's why, of course, I want to ask and invite all copyright organisations into this debate.  We need to solve this question.  

>> JONATHAN CHARLES:  It might be one for the WTO as well in their discussions on a global basis.  Yes, small regulation, that is an interesting one.  That's what we used to say in the UK of course about the way that we regulated our banks, and that didn't turn out so well.  
Peng, you've been listening to all of this the whole two hours.  What do you make of it?  

>> PENG HWA ANG:  Okay.  Have you ever tried summarizing a chat room?  I'll try.  I see a couple of points here.  I see the effort by the COE, the Council of Europe, as building trust.  This was mentioned a few times, and it is very important.  Trust has been found to correlate more strongly with Internet penetration than income does.  It's also interesting because, having done work in this area, there is internal trust and external trust.  Germany and Japan have quite high internal trust, so their Internet traffic within Germany and Japan is relatively high compared with traffic going outside Germany and Japan.  Trust has been found to have a high correlation, and so trust is very important.
I see that the work has gone to some level of technical detail, talking about consent and the definition of consent: what is consent?  I thought consent meant yes, but it's not just yes; it's yes to what, and why, and how, and so there are a lot of details.  
So we have made very good progress, because people are concerned and they want to know about consent.  But the big-picture question is not clear: what are the rights?  Daniel mentioned three rights.  I call them the three Ps: the right to a platform, the right to participate and the right to plug in.  These are rights that we must consider.  
And then we talked about the right to forget.  Jonathan talked about digital information; my friend at the Oxford Internet Institute talks about it, and I'm sure you have heard of or read his book.  You can talk about the right to remove information, because the Internet retains information forever.  
So there is a right to consider as well, as part of your openness: is the right to delete an open right or is it a closed right?  And I see people smiling, because this is about the freedom to connect.  That is not the problem; the problem is the freedom not to connect.  
I want to throw away my Blackberry sometimes.  So is there a freedom not to be open and a freedom not to be private?
And on the issue of freedom of expression, every state has got a freedom of expression clause.  I just checked for Korea: Article 67, citizens are guaranteed freedom of speech, of the press, of assembly, demonstration and association.  Article 67 in the Korean constitution.  That is quite far down; elsewhere it is Article 14 or Article 19, and here it is Article 67.  Article 10 in the UN, right?  But still, it is there.  
So the issue of freedom of expression is not the right itself; it's the issue of implementation.  And I think it was mentioned, by Jonathan and Cornelia, that laws only go as far as how they are implemented, not just by companies but by individuals, governments and various entities.  
The final point I'll make here is that we assume that rights are forever, that a human right is a right forever.  But can rights change?  A classic one is privacy.  I have no idea what these young people are doing these days.  I was young only yesterday, some of you are today, but I have no idea.  Why would you put information online so that you can be stalked and people can get into your house and steal your things?  Why would you do that?  So it's interesting that Jonathan's colleague was able to track this lady down, as a cyberstalker for a day.  
I do have some things that are expensive, and they would be of some interest for somebody to take from me.  So is this right of privacy changing as we speak?  There are articles talking about the right of privacy; is it changing right under our feet?  Is this generation different?  I gave one of my students who graduated a housewarming gift.  She bought a condo; I urged her to buy it, and it went up 20 percent.  She invited me to the housewarming.  The gift was a fan.  She took a picture and uploaded it to Facebook.  I just gave her a $50 fan.  It's cheap.  Everybody sees that.  What is going to happen now?
So, is this notion of privacy changing?  Right now, I would say there is not enough research on this.  
So to my daughter, and to yours, I'm saying: don't put things up online.  There were burglaries -- again, too bad that Richard isn't here -- there were burglaries of homes in Australia using Facebook and foursquare.  And just as we speak, there is a case of home burglaries going on in New Hampshire.  So this research will keep us occupied for decades.  But we are facing problems that we want to address: we want to build trust, we want to protect privacy.  I think the Council of Europe is the perfect organisation for this, and I don't mean that sarcastically.  
This is a fine example of really international work.  Europe is diverse enough that you have different cultures; there is diversity.  Often, once something is worked out in the COE and you come up with an agreement, you can extrapolate and scale it globally, because it already takes national differences and cultural differences into account.  
But I would say that not everybody sees the same problem at the same time, and in order to solve an international problem, you have to have everybody in the same room seeing the same problem.  And I have a solution for that: Vilnius is a perfect venue.  Everybody should go -- maybe you should have the next meeting there -- to the KGB museum.  It's called the Museum of Genocide Victims.  It's a depressing place; that's why you have to go.  When you go, I think you understand the problem, and you can better appreciate the need for privacy, for openness, for freedom of expression and for building trust.  Thank you.  

>> JONATHAN CHARLES:  Thank you very much.  I hope you realise --
(Applause)
-- that even now that picture of you with a 50 Euro fan is being used by manufacturers.  
Thank you to our distinguished panel, and thank you to you for taking part.  All of this information will go into the Council of Europe report.  Thank you to everyone who is watching online; your input will be welcomed by the Council of Europe.  And I want to thank the Council of Europe for organising this; it was good of them.  
And a reminder to you that around the room you'll find these documents from the Council of Europe.  They tell you more about the issues of Internet access, privacy, and of course you're welcome to take those.  Thank you very much.
(Applause)
(End of session)

********