
FINISHED COPY

NINTH ANNUAL MEETING OF THE
INTERNET GOVERNANCE FORUM 2014
ISTANBUL, TURKEY
"CONNECTING CONTINENTS FOR ENHANCED
MULTI-STAKEHOLDER INTERNET GOVERNANCE"

05 SEPTEMBER 2014
11:00
DYNAMIC COALITION ON FREEDOM OF EXPRESSION
AND FREEDOM OF THE MEDIA ON THE INTERNET

 

 ***


This is the output of the real-time captioning taken during the IGF 2014 Istanbul, Turkey, meetings.  Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record. 


***


    >> GABRIELLE GUILLEMIN:  Good morning, everyone. My name is Gabrielle; I work for an international free speech organisation. Thank you very much for coming. As you can see, we have a little problem: our panel has pretty much run away and our chair isn't here yet. 
    So I would like to suggest two options. One option is to cancel this session, because the Coalition is not being very dynamic right now. Otherwise, well, there's me, and we have a couple of questions to get us going. I can tell you a little bit about Article 19's work on Internet freedom and some of the issues that are of concern to us. But what I would really like is to make this as interactive as possible, so that we are all panelists, more like having a discussion. 
    So give us your thoughts: if you would like to carry on with the session, please raise your hands. And if you would rather cancel, please raise your hands too, so we can see whether we should keep going or not. 
    Okay. So, well, thanks for being here again. I'm not sure if you're all familiar with Article 19: it's an international free speech organisation that works for freedom of expression and freedom of information around the world. We have our headquarters in London, but we have regional offices in many countries around the world, such as Brazil, Mexico, Kenya, Senegal, Bangladesh and Tunisia, among other places. So we're very well placed to get a good sense of what's going on around the world when it comes to freedom of expression. What we do is a lot of standard setting, and we also engage in strategic litigation: we intervene in cases to tell the courts about the important public interest issues that they should take into account when examining them. 
    Today the topic for the Coalition was the battle for user generated content.  So what I'm going to do is tell you a little bit about the work that we do on two topics. 
    The broad topic is intermediary liability. What I would like to talk to you about is, first of all, Web site blocking and filtering, and secondly what is usually known as notice and takedown, which is very relevant to user generated content. So first of all, thinking about Web site filtering and blocking, one of our main concerns is that this is increasingly becoming common practice in several countries around the world. 
    Usually it's in relation to certain types of content: for example, for the purpose of protecting children, for copyright reasons, and many others. We have a number of concerns with these practices. First of all, a lot of the time there's no legal basis for them; we very much oppose mandatory blocking and filtering in any event. What you also see is Web site blocking orders, and the problem for us is that a lot of the time, instead of being ordered by a court, they are ordered by the executive, so there's no balancing of freedom of expression against other considerations. As you know, freedom of expression is not absolute; it can be restricted. Nonetheless, what we are seeing is that in the name of protecting all of these other interests, freedom of expression weighs very little in the balance. So that's one of our concerns.
    Another concern that we have is that very often these measures become voluntary. Sometimes the state acts as a broker and encourages Internet intermediaries to filter content. That's one of the problems we've had, for example, in the United Kingdom, where first there was talk of filtering child pornography, which is a well-established exception under international human rights law, and there's no question that this kind of content needs to be prohibited. 
    However, what happened is that from child pornography, the Government all of a sudden started talking about pornography full stop. But pornography was not, and is not, illegal. And then what we started seeing is that instead of child pornography sites being blocked voluntarily by Internet service providers, a lot of Web sites that were providing information about sex education, for instance, were being blocked. 
    And now the real issue has become that we don't know if other types of content have been added to the list, because this is being done voluntarily. The fact that it's voluntary means, first of all, that there hasn't been any transparency as to the exact types of content being blocked, and there's no real remedy to challenge a site being blocked. So if I have a sex education site and I discover that my site is being blocked, there's no way of complaining about it. And I think that's an interesting trend that we're also seeing in relation to freedom of expression: the more notice and takedown there is, the harder it is to challenge, because it's against a private actor. When your information is being, well, I'll say censored, when your content is being blocked or removed, it's really hard to challenge. There's a real problem in terms of remedies, due process, and transparency.
    The only caveat in the case of the United Kingdom is that in principle people are able to ask their Internet Service Provider not to put the filters on. The problem is that people usually don't really know how to use the software, or it takes forever. I don't know if you have ever tried to get in touch with your mobile provider to get something fixed; we all know it always takes forever, it's really complicated, and sometimes you just give up because it's too much hassle. So here the filters are being put on by default, and people have to actively ask for no porn, please, or no -- we don't even know what. 
    So that's one of the concerns I wanted to talk about. 
    The second thing I wanted to talk about is notice and takedown. Are you all familiar with notice and takedown? Do you know what this means? Maybe a show of hands. 
    All right. So, as you all know, people can post comments and share their content on various platforms such as YouTube and many others; they generate their own content. 
    And here you have different interests that come into play. For instance, very often user generated content will be copyright protected, so the copyright owner will want to complain about the material and will want it to be either filtered or taken down. Most of the time it won't be very effective for them to go after the user, so it's more interesting for them to go to the intermediary. In a lot of countries the intermediary is protected, has an immunity from liability, so they are not responsible for the content, because they are only providing the platform. However, in a lot of countries, when they are notified that there's content that is illegal, they have an incentive to remove it, because if they don't, they may be held liable for it as if it were their own content.
    So this creates problems, because very often what we have seen under what is called the notice and takedown regime is that anybody can complain about anything they don't like, essentially. They give very little information about the nature of the complaint, and then, because the intermediary risks being held liable, they just remove the content without examining it. 
    And when you think about the vast amount of content that is available online, what happens is that an awful lot of material disappears without any real process and without anybody really noticing. 
    So that has a chilling effect on freedom of expression. 
    Now, I'm going to give you a little example of a case we were involved in before the European Court of Human Rights, called the Delfi case. 
    In that case, there was an article published on the Delfi Web site, which is a news site in Estonia. Very popular. It was an article about an issue of public interest: ice roads in Estonia. Don't ask me what that means, I'm not sure, but it was certainly of interest to people in Estonia. 
    There were comments about the various companies involved in putting in those ice roads: when people reacted to the article, some of them used abusive language to describe the company or some of the people involved. So there was a complaint asking for the comments to be removed, and Delfi removed them almost immediately, without even looking at whether the content was in fact unlawful or defamatory under whatever the law was in Estonia. 
    But still, the person sued Delfi for damages, despite the fact that it had actually done what you could have expected in the circumstances. 
    And so we got involved in the case, because we generally don't like notice and takedown: we think that intermediaries are not best placed to make these determinations. In fact they don't make them; they just remove the content, which has a chilling effect on freedom of expression. But here Delfi had almost played by the book, and nonetheless they still got sued for it. 
    At first instance, the European Court of Human Rights agreed with the Government and said yes, it was right that Delfi was held liable. In fact, the court said Delfi should have done more than it did: it took too long, it should have put filters in place, it should have foreseen from the main article that these types of comments would follow. Can you imagine, every time you put something out, having to think about what people will say? And the court even recognized that the article was in the public interest. 
    Fortunately, we all got together in a very dynamic coalition with nearly 70 newspapers, media associations, Internet service provider associations and so on, and explained to the court: look, we think you got it wrong here, there's a problem. This is also stifling for innovation, because newspapers are going to die. I don't know if some of you are involved in the newspaper industry, but personally, as a user living in the United Kingdom, I read newspapers, and I usually tend to go for, oh, what is the article that got all of these comments? There must be something interesting there. Some of these comments are actually pretty hilarious, as well. So comments create a really vibrant platform for public debate. Fortunately, we managed to convince the court to reconsider the case, and we're now waiting to hear the outcome, which may take about a year.
    So I'm going to stop here now, because I've already done quite a bit of talking, and I would like to hear what the concerns are in your countries and what you think a coalition like this might be good for. So please raise your hand and I'll give you the mic. 
    Anybody?  Please. 
    >> RENATO LEITE:  Hi, I'm Renato from Brazil, from the Center of Technology and Society. I do know the work of Article 19 and I follow it, not only in Brazil but elsewhere. I guess everybody heard Brazil, Brazil, Brazil during the IGF, but I don't see anybody speaking about the impact of censorship in Brazil when it comes to online content, and Article 19 has written about it for years. Although Brazil is a democracy and is innovative in regulating the Internet, for years it was first in the number of requests to Google to remove content, and last year it was second. And right now we are in an electoral period over there, and more and more content has been taken down. Several people were arrested just because they put some content online criticizing politicians. That's one of the things we didn't see here at the IGF, and maybe Article 19 could do something -- I don't think we have time anymore, it's Friday already -- but show the other side of Brazil. Not only the good influence of NETmundial and of a law that tries to limit the liability of intermediaries and gives them almost total immunity, but also that the judiciary branch has still given a lot of room for censorship of Internet content over there. 
    >> GABRIELLE GUILLEMIN:  Thank you. One thing to bear in mind is that even if you have a framework for intermediary liability, there are always terms and conditions, and content can be removed under those terms and conditions. One of the problems that's been discussed in other sessions is that although the transparency reports are already a great initiative from the companies, unfortunately they don't really give a full picture, because they include requests from governments and court orders but don't say much about the content that's removed under terms and conditions. My understanding is that this can be tricky to report, because a lot of complaints come in and can be a bit like spam. 
    So it's a high volume. But nonetheless, the reports still don't give a very accurate picture of how much content is actually removed. 
    Anybody else want to share their concerns and their questions? 
    >> NORA ABUSITTA:  Yeah, thank you. This is Nora from Turkey. First of all, I would like to thank you for not leaving the battlefield and for sharing your ideas with us. During the last three days of sessions, I got the idea that blocking and filtering mechanisms are among the measures used for the protection of minors and children and for copyright reasons. Governments say they want to protect the public interest, so they put blocking and filtering mechanisms in place; but on the other side, the youth and the public are not very interested in accepting or sharing the same feelings as the governments. You mentioned child pornography, and child pornography is forbidden everywhere in the world; but adult pornography is allowed in some countries and not in others.
    It is a very tiny detail, maybe, but how can we come together in the world? I just attended a session yesterday on a charter for the protection of minors' rights, and this morning there are nice guidelines prepared by UNICEF and the ITU, again for child protection. There are many guidelines, many regulations, soft laws. 
    But how are we going to apply them together, I mean, across Developing Countries and Developed Countries? Thank you. 
    >> GABRIELLE GUILLEMIN:  I'll take another question and then I'll get back to you. 
    >> AUDIENCE:  Hello, I'm Siliam (phonetic). I'm from Turkey, and I'm a lawyer. I just want to give an example of a removal request, a takedown request. Maybe you followed the news: they blocked and censored Twitter here, and it was not about child protection. It was about a group of people who revealed politicians' records and voice recordings. 
    I don't know which content was harmful, or for whom, but they just censored Twitter and YouTube. So I think, okay, maybe child safety is a reason for takedown requests, but at some point this rationale is just misused, and lots of politicians are misusing it. The extent is getting broader and broader, so where are we going to put the limit? Thank you. 
    >> AUDIENCE:  Hi. I'm John from Denmark. We have a huge dilemma we have been trying to solve for the last year and a half. We have a hotline against child sexual abuse images, also called child pornography, and each day we take reports, try to find the pictures, and use notice and takedown. But a year and a half ago there was a huge trend of young people taking naked selfies of themselves and putting them online, for instance on Instagram, and they were actually under the age limit, so it actually was child pornography. We don't know what to do with that, so if anybody has an idea, I would really like to listen. 
    >> GABRIELLE GUILLEMIN:  I'll offer a few thoughts on this and then move on to the other speakers. 
    I think one problem we see time and again with Web site blocking, which has already been mentioned, is the slippery slope: you start by blocking one type of content, and the next thing you know you're blocking something else, which in a way doesn't even have a legal basis because it was not provided for. 
    The other problem we see with Web site blocking is that filters, although they must be improving as technology develops, either underblock or overblock. So what you end up with is, in a way, a sense of false security, because you look like you're doing something about the problem without actually solving it. It looks like the filters are taking care of the problem, but in fact the content is, as I was mentioning, underblocked, or other content, legitimate content, gets blocked along the way. 
    So that's obviously an issue. In terms of international human rights standards, the UN Special Rapporteur has been very clear, and there's been a case in the European Court which says very clearly that you need a legal basis and you need safeguards when you have a blocking order. It should be ordered by a court, and it should be strictly tailored to be proportionate. So far as European jurisprudence is concerned, any blanket ban will be found to be disproportionate. So that's one thing. 
    The other thing is that when you're talking especially about child pornography, at the end of the day what you need to do is prosecute these people rather than just block content. What you really need is prosecution, if you have the evidence against these people. That's the real measure that should be taking place. Of course, it's often harder. 
    So that's also an issue. 
    Our position, based on international standards, is that to the extent you want to have filters, let users choose, and let parents take responsibility for their children. Otherwise it's like the state saying: we're going to put a little policeman everywhere you go online to check whether or not it is safe. And that doesn't make people responsible for what they do. In the case of children, we think it should be their parents. If you want a filter, you put it on your device; it's your choice, and at least it's not affecting what others can get access to. 
    In the UK, the reaction was that all of a sudden everyone would be limited to looking at what a five-year-old can have a look at. 
    To go back to the other point that was made about child pornography: more recently we have heard a debate around revenge porn, sexting and those types of things. Again, there's usually a lot of outrage, and certainly we agree that what you're talking about is very often a breach of privacy with little free speech value. At the same time, the problem is the means used to address it. 
    Usually we would say: look at what's already available on the statute book and see whether the circumstances fall within that. For example, does it amount to harassment, or a threat of violence?
    Now, in the case of child pornography it's a tricky one. For instance, in the UK, which is the system I know best, this is something that would fall to the prosecution, which would have to exercise discretion. And that still doesn't give a very strong guarantee, because they have to exercise this discretion wisely, and we've seen in other cases how very broadly drafted laws have been used to prosecute individuals for what they said online when, in fact, it was innocuous, things that would not normally have been prosecuted if they had been said in the offline world, down at the pub or in a cafe. So this is where I think it's very important to have various tools to educate users, and children in particular, about what they might find online and how to protect themselves.
    It is true that the real name policy of organisations such as Facebook is said to be good for giving a measure of control and civility on the platform, because people don't lash out completely when it's under their real name. At the same time, it also creates problems in its own right: for example, where children are concerned, if you have prying adults trying to lure children online. I'll stop here and take some more questions.
    >> AUDIENCE:  Thank you. I would like to share some of the latest trends and some of my questions. I'm Sarah from Hong Kong; I'm a journalist, also working for the International Federation of Journalists. First, on what you mentioned about the European court's standards: if someone wants a Web site closed or blocked, they need a court order. But that only happens in European countries. In other countries, there's no way to insist on this, and governments just consider that administrative law, the administration's power, is more important than anything else. This is one important thing we are always facing, and sometimes I have to say we feel powerless. The other thing I would like to share is about a trend.
    In particular, because we focus on monitoring press freedom violations: authoritarian regimes have started to cooperate with the business sector under the good names of best practice, or agreements signed by different sectors, which makes them a little bit harder to contend with. 
    And the other thing is that governments have started not only to block Web sites that are so-called involved with sensitive issues; they also engage the social networking platforms and use them as their mouthpiece to disseminate information, while at the same time shunning other voices. This has also happened in Hong Kong: quite a number of Government officials like to use the Internet to convey their messages instead of coming out to face the press and take questions. So this is one of the difficulties we are facing. 
    So I'm really looking for whether there are any solutions to deal with it. Thanks. 
    >> GABRIELLE GUILLEMIN:  Anyone else? And then we'll get back to your point. 
    >> BISHAKHA DATTA:  I actually wanted to share a little about the Indian experience with freedom of expression and then make some broad points. My name is Bishakha; I work for a non-profit in India called Point of View. We are facing a really complicated situation right now: there's a case in the Supreme Court, as well as a petition in Parliament, asking for a ban on online pornography. The complexity is that we feel not only is it completely technically unfeasible, but it would obviously achieve nothing and would be a serious onslaught on freedom of expression. 
    The argument being used in the court case is that we've had a series of high-profile sexual violence cases recently in India, and that if you ban online porn, violence against women will go down. There's no evidence for that anywhere in the world. But one of the problems we face is that policymakers don't understand the technical aspects; they don't understand the space in many ways. 
    And it's become increasingly difficult to actually make arguments. The other problem is that even when we pitch it as freedom of expression, we are likely to be pushed into a morality frame, and that's very hard with a right-wing conservative Government in India. So we're actually trying to strategize on what to do. The other thing I wanted to say, and I would like to ask what people think about it, is that it's not just governments that are threats to freedom of expression online. We face a lot of private censorship from platforms, and there have been many cases recently on Facebook and Instagram where people who are not actually violating the Terms of Service, if you examine them, are getting thrown off. So this is an arbitrary way in which regular people, in countries where we don't have blocking by the Government, are still facing day-to-day censorship.
    >> GABRIELLE GUILLEMIN:  Thank you.  Just one more. 
    >> DAFNE SABANES PLOU:  Just briefly: I'm Dafne from Argentina. I work for the APC women's rights programme, and we're working now on a project on violence against women and the use of ICTs. We find that when there is harassment and belittling of women online, it also affects their freedom of expression, because they decide to leave the site: for example, they close their Facebook or Twitter accounts or stop blogging because they feel harassed. So I think there's a big problem there: hate speech against women is also limiting their possibilities to express themselves and participate online. 
    >> GABRIELLE GUILLEMIN:  Thank you. Just to reflect very briefly on what you were mentioning about pornography: it's well known that it's extremely hard to define. If you look at works of art dating from antiquity, you have a lot of statues of naked men and women that could well qualify as pornography. And I think what this illustrates is that even if you prohibited your teenage boy from looking at pornography, he may well find a way of finding it. 
    So really, I think the problem with these filters is that they allow the Government to look like they are doing something, once again, but the filters themselves are not effective. And one of the things that we see time and again is that if we're talking about criminals looking at, say, real child pornography, they are not really googling it. They are much more likely to be in other networks where it's harder to catch them. So ultimately it's the general public, which is less tech savvy, that loses access to content which is legitimate and lawful. 
    Just to go back briefly to the point made earlier about how governments also use social media for their own purposes: I think it's a very tricky one, because depending on how you view it, it's used either to manipulate or to direct the debate. We had this problem in Tunisia, for instance, with knowing how and when Government officials can use social media at election time. For some of these more discrete issues, I think it would probably be better to start by regulating the parties themselves in terms of their own use. We see regularly how the use of social media in the workplace is more constrained. So it's not impossible to think, especially when you're talking about election time, that this could be kept within certain limits, and those limits would have to be checked against the usual principles as far as we are concerned: does this have a legal basis, is it necessary and proportionate? We should think about these kinds of solutions.
    Now, just to think briefly about women: I think it's a tough one because, as with the whole debate about cyberbullying, we don't really have a definition. It does have a chilling effect on their freedom of expression. At the same time, it's really difficult: some people have a very robust and, frankly, despicable way of expressing themselves. However, as a free speech advocate, when you look at the balance, before you apply a restriction on someone's speech it has to comply with a number of requirements. And if it doesn't reach that level of, for instance, a threat of violence, or if it's not enough to amount to harassment, then depending on the measures taken against those trolls, it may well be disproportionate: it's not because someone is being really abusive that they should go to prison, for instance. So I think again, here it's a lot of dialogue, and probably education on how to respond to that kind of behavior, to be better armed to deal with those issues.
    I think we had someone else who wanted to speak at the back. 
    >> AUDIENCE:  I'm a journalist from Radio Netherlands, and I have something more positive to share with everybody here. I heard people talking about censorship from the state and from different spectrums, but I have seen some really exciting developments in China. Some Civil Society groups there have been developing a technology called Collateral Freedom, which basically uses cloud services to make Web sites that have traditionally been blocked in China unblockable, so people in China don't need to use VPNs or proxy servers to access these Web sites. The funny thing is, Google has been blocked in China, and they made a mirror of Google called Free Google, so people in China can still use Google's services. 
    >> GABRIELLE GUILLEMIN:  Great.  Thanks very much.  We need good news. 
    Anyone else want to talk about or share their experience and try to think about solutions? 
    One of the things I've been wondering about personally, especially thinking about the right to be forgotten, which may not be of direct concern to all of you here, is this: when individuals want their personal data to be erased, Google now has an obligation to consider their request, and that's under data protection law. 
    If Google refuses to comply with a request and refuses to erase the data from its index, the person can complain to the data protection Commissioner. 
    If, on the other hand, your content is deindexed, you have no such remedy. So first of all, you have to count on Google to notify you that someone has asked for the link to your content to be removed. 
    Which in itself can be a problem, because if you are, say, a business, it arguably interferes with what you're trying to convey to the public. So you have to count on Google, first of all, and then you can't really complain to anyone; you have to hope that Google will do the right thing. And if your content still gets removed, you don't really have a means of complaining. 
    So one of the things I've been wondering about is whether we should perhaps think about new causes of action, possibly; I'm speaking in legal terms here, so that you have a remedy when you have that kind of interference with your freedom of expression. 
    So I don't know if anyone else has any thoughts on this, or would like to share their views on how we could try to solve some of these problems. 
    >> JO ELLEN GREEN KAISER:  My name is Jo Ellen and I'm with the Media Consortium in the United States. I think one of the reasons I'm sort of stuck here is that a lot of these problems are happening in a legal framework. I represent journalists who are working in a journalistic framework, so it's hard for us to grapple with the legal issues, and even to find representation and understand exactly where to act. There's also the fact that these are global issues: the right to be forgotten is global. So it's hard to even know exactly where we would intervene. A lot of U.S. journalists are very upset about the right to be forgotten, but it's not clear to us how we would intervene in that debate. 
    >> GABRIELLE GUILLEMIN:  Well, I certainly understand that it is a challenge in terms of how global these issues are. At the same time, thinking about the right to be forgotten, it's interesting that first of all a link would have to be established with the EU. So I imagine that if, for instance, you had content whose link is erased but which still has readership in the EU, it's not impossible to imagine there would be enough of a connection. But again, you would still have an issue about complaining that your content has been removed; that's a general problem. 
    In terms of which law is applicable, you have some relatively straightforward scenarios where everything is in one country, so it's not too much of a problem. 
    Usually one of the difficulties, something we have actually talked about in the Council of Europe, is what happens when a state orders a measure and it has an effect on access to information in another country where the content itself is lawful. I think here, quite possibly, the people affected in the other countries should be able to complain about the measure in the country where it was ordered. And arguably, it raises sovereignty issues as well. 
    The other thing we see very often: it's not so much a problem for courts to order injunctions, but what's really difficult is for them to be enforced. 
    So for instance, if a court in France says that all of the images of Mr. Mosley should be filtered, the question becomes, first of all, the problem I mentioned; but it's also not clear that once it reaches, say, Google in California, you wouldn't have an issue of enforcement, because there's no equivalent mechanism to enforce the judgment. 
    So obviously these are all issues that we're still grappling with.  And I don't think there's any sort of simple answer unfortunately to these questions. 
    Any more questions or experiences you would like to share?  Yeah. 
    >> JONATHAN SSEMBAJWE:  Thank you so much. Mine is a question. Jonathan Ssembajwe, from a foundation in Uganda. We have a media programme in Uganda, and my question to you is: how can we empower children and young people to use the Internet, especially social media, to share productive information among their peers and maybe the entire public? Some of them have good information to share with the rest of the people, their peers, but especially in Africa many don't know that the Internet and social media can be used in that productive way. Thank you. 
    >> GABRIELLE GUILLEMIN:  I think the first thing that comes to mind is education, and maybe getting subsidies to be able to run training courses for these groups of people. Don't you think it's probably easier for children, because they are born and raised with these devices, and in a lot of countries they already have them? It might differ between countries; maybe in Africa it's via mobile phones. But certainly, I'm sure we all have elderly parents, for instance, who struggle with trying to send an e-mail, and there what we're talking about is really trying to educate them to use those tools. And training. 
    And hopefully having support to sponsor some of that, by way of subsidies, for instance. 
    Okay. So if there aren't any more comments -- no? You would like to comment? Okay. 
    >> JONATHAN SSEMBAJWE:  Yeah, I can comment on the youth thing. There are so many possibilities with the Internet, and many know about it, which is really, really great. But one of the things that can create a barrier to using it for innovation and for making a change is if there are too many controls, too much blocking of sites, because people are so afraid. So many adults are so afraid that the Internet is such a dangerous place. Yes, it is, but it's also really, really great, and I think a lot greater than it is dangerous. So when we talk with parents about the dangerous things on the Internet, we also have to talk about the positive things, so they see that the Internet can also make a difference. 
    >> GABRIELLE GUILLEMIN:  Thank you very much for these positive thoughts on all the benefits that can come from the Internet. I think it's actually very important to bear that in mind, because it's always easy to focus on the problems and all that's wrong; but if you looked at the proportion of problems versus all of the benefits we've been getting from the Internet, I think we would see that the benefits far outweigh the problems. 
    All right. So if there aren't any more comments, of course I would be very happy to continue, but otherwise I would suggest that we take questions from remote participants, if there are any. Yes? Okay. We would be very happy to address them. 
    >> REMOTE MODERATOR:  We have two questions from two people.
    >> GABRIELLE GUILLEMIN:  Okay.
    >> REMOTE MODERATOR:  The first one is from Davis Onsakia. The Right to be Forgotten is great, but is it applicable to the whole world? From a Google perspective, is Google supposed to enforce it only in Europe? I stand to be corrected. That's the first question. 
    >> GABRIELLE GUILLEMIN:  No, I think there's still a bit of unclarity as to how it should be applied. So far, what we have heard is that Google is applying it to Google.fr for France, or Google.es for Spain, 
    but only for the EU country domains and not beyond -- not to Google.com. This has been contested by the data protection Commissioners, who have said that it would defeat the purpose of the measure, because instead of doing your search on Google.co.uk, for instance, you only have to go to Google.com. But it's still something which is being discussed. 
    Second question? 
    >> REMOTE MODERATOR:  And the second question, from Alexander Saduc (phonetic): what about setting legislation to make the most of ICT provided for users in schools, traffic and content to children? 
    >> GABRIELLE GUILLEMIN:  I'm not sure I understand the question. Generally speaking, as far as international law is concerned, there are standards on access to the Internet that seek to promote measures to empower people to get better access. What you also see, which is a completely different aspect, is that the use of the Internet is more strictly regulated in certain places, for example the workplace but also schools, which raises a host of other issues. And I'm not sure which of those two issues the question is addressing.
    >> REMOTE MODERATOR:  School service. 
    >> GABRIELLE GUILLEMIN:  Well, normally, when you're looking at, for example, the use of the Internet being more constrained in schools, very often what you need to look at is whether the measure is, first of all, provided by law, and then whether it's necessary and proportionate, whether it's reasonable. 
    I think filters in schools are usually regarded as acceptable, because they apply to a particular place and schools have a duty towards children. But if it goes beyond schools, it raises far more issues, because you have to think about the broader public. 
    Okay.  Well, thank you very much, everyone, for staying for this session. 
    We have heard some great thoughts and got some very positive energy about all the benefits of the Internet. I hope you've heard interesting comments that will help you think more about these issues and try to find some solutions. And once again, thank you very much for staying with us. 
    (Applause)
