The following are the outputs of the real-time captioning taken during the Tenth Annual Meeting of the Internet Governance Forum (IGF) in João Pessoa, Brazil, from 10 to 13 November 2015. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record.
***
>> JUTTA CROLL: So, once again, good morning, everybody. I think we are now complete with our speakers, those on site and those that participate remotely.
My name is Jutta Croll, and I'm the managing director of the German Center For Child Protection on the Internet. I'm glad to have our distinguished speakers here on the panel to take part in the meeting and also you as the audience and the remote participants.
We are speaking today on a topic that has been on the agenda of the Internet Governance Forum since at least 2007, as far as I know, which is the abuse and sexual exploitation of children on the Internet. But today we want to put a special focus on the so‑called grey areas, which means we would like to talk with you about what sexual exploitation and abuse is beyond what has previously been called child pornography; and there are many grey areas around that.
And my colleague Gitte Jakobsen from Save the Children Denmark will go into more detail on that topic.
First, let me introduce you to the speakers.
I begin on my right side with Marie-Laure Lemineur from ECPAT International.
Then we have Marco Pancini from Google.
We have Gregory Mounier from the European Cybercrime Center at EUROPOL.
Natasha Jackson representing the GSMA, the mobile operator association.
And Amy Crocker from the INHOPE Foundation. And we will explain what the role of all the organizations is in combatting child sexual abuse.
I would like to introduce you to the network No Grey Areas which was set up in Germany one year ago, and you all have the reports on your table so if you want more information, it's in the reports that are spread out.
We have a few slides to introduce you briefly to the work of the network.
When we are talking about depictions of abuse and sexual exploitation of children in the grey area, these images are characterized mostly by international dissemination, which is why it's so important that we are talking about this today at the Internet Governance Forum. They are easy to find and access, and it depends on national legislation whether they are considered illegal or not.
So we will hear more about that later on.
When we started the network in Germany, we had the objective to develop an international strategy to combat this type of depiction of abuse and sexual exploitation.
We have agreements with platform providers, so far with Google ‑‑ and, Marco, I think you will tell us a little more about what Google is doing in the network. And we have agreements with the hotlines operating in Germany on how they deal with reports on this grey area imagery.
It is very important to understand that the hotlines take reports from the people out there. And most people who report that they have found something disturbing, something that is of concern to them, would not know whether it is really illegal material or just disturbing to them.
So the hotlines already get the reports on the grey area imagery and they deal with that. And that's the role they have in the network No Grey Areas.
To help you understand which images we are talking about, we have already set some criteria.
An image can fall into the grey area in itself, because the depiction is harmful to the child who is shown in the image.
But in addition, it could also be harmful to children who are confronted with these images, for example, in the process of grooming.
And then, what is really important to mention: the context in which these images are provided also matters. It could be the number of images that are shown in a certain area of a website or in a forum. It could also be the surroundings in which the images are shown: when images of children are placed on a website where other pornographic content is shown, that can also make them grey area imagery.
And, also important, it could be sexualization merely through the comments attached to otherwise innocent images of children, where people describe what they would like to do with the children depicted in the images.
And we know it's very difficult to assess the context of the image and the context of its provision, but we will come to that point later, when our colleague from the German Research Center for Artificial Intelligence will tell us about methodologies for how the image itself and the context it is provided in can help to determine whether it is grey area imagery.
So the network No Grey Areas, which has now been working for one year in Germany, is built on three pillars. On the one hand, it is a competence center that gathers all the knowledge around the images, for example which search terms are used by perpetrators to get access to them.
Then we have the platform providers ‑‑ I'm very happy to have Google on board the project. We are also talking about others like Facebook and Twitter. So far they have not joined the network.
And then, as mentioned before, we have the hotlines, which have a very important role in taking the reports and then considering, assessing, and deciding what they can do about the images, whether they are illegal or not.
For now, I'll leave it at that, just to give you a short understanding of what the network No Grey Areas is doing.
Now, I'll hand over to Gitte Jakobsen from Save the Children Denmark. She is joining us remotely, and we have her presentation.
Can you put up the next presentation, please. And let us hear what Gitte has to explain with her presentation. Thank you.
Gitte, can you hear us?
We don't have your voice on the loudspeaker, but I heard you faintly.
Gitte?
We can't hear her.
Could someone tell her to stop because she's speaking, and we don't hear her.
We were very much encouraged to have remote participation and speakers, so I hope the technology will help us make it work.
Okay. The technicians are ‑‑ do you hear it ‑‑ she was ‑‑
(No audio)
>> JUTTA CROLL: Go on, please.
(No audio)
>> GITTE JAKOBSEN: ‑‑ on the area. You need to change the slide, please.
So we are looking at the illegal content on the one side knowing that what we have there is pretty clear and simple for us to understand.
Next slide. Please move on.
Can I do that myself?
So on one hand ‑‑ this is going to be fast. Can I do anything on the remote here? No?
Okay. There's a problem with the slides. You are moving too fast for me. I'll just go by my own slides here. See if you can follow along.
We have on the one hand the illegal content. We are not allowed to show you. We don't want to.
In the middle, between the normal images ‑‑ images of any child growing up ‑‑ and the illegal content, we have what we call images in the grey area. We did a report on that in Denmark two years ago, looking at probably around 20,000 images, because we do run a hotline dealing with illegal material, reporting to police, to the international network, everything around that. But we found that a lot of images were like modeling images. And if you have the slide showing the front page of the report, you also have one example of an image in the grey area, which is obviously grey area: in Denmark, this would not be legal; in some countries, it would be.
But what you have when you look at images in the grey area ‑‑ and we're moving to the slide with the black‑and‑white image ‑‑ is that the sexual or erotic nature of the images lies both in the objective qualities of the material itself and in the mind of the collector. That is an old quote and a very old image.
But really what it's saying is that you could have images that are in the grey area or any image exploiting children in a context that is highly sexualized.
So let me give you the definition that we work with on images in the grey area. If you have that on the screen now, my slide number 7.
Images in the grey area are still or moving images of children's bodies where the photographer's staging of the child and/or the use of the material in an erotic and/or otherwise sexualized context gives grounds to presume that the purpose of the recording or the display is to satisfy a sexual interest in children. This is the definition that we came up with. We're talking about images of children's bodies and the way that it is being exposed. I will give you a few examples of that kind of images in a few minutes.
There is a commercial aspect to this that is very important for us to highlight as well. That is the next slide: financial profit made from the sexual exploitation of a child ‑‑ trafficking children for sex; sex tourism; and, the latest development, live cam digital sex tourism, where you don't even have to travel anymore, you just sit down behind your computer and order the sexual abuse of children; child prostitution; and then the production and sale of child abuse images, including the sale or advertising of pornographic and other types of websites with child sexual abuse material.
Moving on: the next slide is a bit tricky if you click through it there, because several images will appear as you click. So on the one hand, we have the illegal material. Really, it's based on the Taylor and Quayle categories, 1 through 10. You can read about that in the report if you don't know it yet.
You have basically everything from normal images, any image of any child in any circumstance that you would show to anyone, all the way through to the most gross assault, sadistic, and bestiality images. And all that is marked in red is illegal content. It is pretty obvious what that is.
And then on the left‑hand side, you have images of the happy child, the nudist and art images, and then the voyeuristic ones: anything that an adult with a sexual interest in children could look at for their own sexual gratification.
In between, we have the posing and the erotic posing. If you click through, you will see the images appear, and there should now be a yellow circle on erotic posing. That is where you find the images in the grey area. That concerns the images themselves; they are there, and I'll show you examples. But we've drawn the grey area from the erotic posing to the normal posing, because that part is about the context.
So grey area could also be how the images of the children are being exploited.
The next slide gives you details on the research we did.
You can have a look at that later, but basically it's saying that in some of the websites you find one image, but in a lot of them, you find more than a hundred images. When you scroll down, it just keeps coming up, and lots and lots of images.
Children, boys, girls, age group interestingly 9 through 12 years old. Most of the images show children in that age group.
So we have the illegal content on the one hand ‑‑ if you scroll down now ‑‑ and then comes the report again, and three images from the report give you an idea of what kind of images we are looking at. These show very obvious staging of children as models.
These images would not be illegal in Denmark, maybe in some other countries.
One of them shows girls in gym suits, but the way they are being exposed and the way they are posed comes closer to erotic posing.
And the next image gives you an example of even more of this kind of images.
I have highlighted one image of a boy and a girl. They could really be just sister and brother, but it's put in a context of highly sexualized images.
But on the left side of this, if you had the same slide in front of you that I have now from my computer, you have an example of hundreds of images, just that narrow sort of band with lots of images coming down.
The next image I want to show you is more specific on the erotic posing and the absolute grey area exploitation. Now, for some of you, the next slide will show illegal images. But still, they are children who are most likely victims of commercial exploitation.
I've covered part of the image on the right there. You see it in front of you. But, really, if I took away that grey cover, that is the one image. It's a girl, but you know what she's dressed like.
On the left, you see another image that could be just any ordinary image of a girl, but the context, with the little penis drawing in it, definitely makes it grey area.
This is just one small example, the easiest, you could say, or the simplest example of grey area.
There are more that are much more explicit.
So, coming to a close: next slide. Images in the grey area are definitely exploitation of the child. Did the child give his or her consent to the photographer? I don't think so.
Did the child give permission for the distribution?
They probably didn't even know that this was about to happen, that it would be distributed.
And did the parents give their consent is the question we can ask, and maybe, yes, they did because they also got money from it or maybe they were forced into it, maybe they sold their child. We don't know that.
There is probably a lot of grooming going on, but it could be any image of any child, taken from any webpage, that is displayed in a grey area context ‑‑ a sexualized context. Or it could be an image that is definitely made for that purpose.
So just stating again the definition of the images in the grey area, we are not talking about the obvious illegal. We're not in a sense talking about the normal any picture of any child, but the grey area that is there in between this.
So we have images in the grey area: still or moving images of children's bodies where the photographer's staging of the child and/or the use of the material in an erotic or otherwise sexualized context gives grounds to presume that the purpose of the recording or display is to satisfy a sexual interest in children.
So, just the next slide, to end at least on a happy note with the happy, smiling children, because that's what it's supposed to be. But just bear in mind that these could also be images brought into a sexualized context and thereby become, in their own way, grey area.
Last slide is just saying stop the images in the grey area.
Thank you for listening.
>> JUTTA CROLL: Thank you, Gitte, for describing what we will be talking about in the next 60 minutes.
And I would like to ask the floor: does anybody have a direct question to our colleague Gitte about what she was talking about?
If that's not the case, then due to time I would say we move directly forward, because the last question Gitte raised was what we can do about this phenomenon; and that also means understanding the concepts, understanding what industry can do, and especially what industry can do on a voluntary basis, because we are not, as Gitte explained before, speaking only about illegal content.
And I give the floor to Marco Pancini. Thanks a lot.
>> MARCO PANCINI: Thank you very much.
Let me introduce myself. I'm from the Google office in Brussels. I have been dealing with these issues together with my colleagues for a few years now. And it's great to be at the IGF, where we can have a discussion on these kinds of issues at the multistakeholder level, bringing together all the different actors.
One problem that we sometimes have in Brussels is that these conversations happen in a bilateral way, for example between two kinds of stakeholders, the industry and civil society. But it's very hard to get all the different parties involved into the same room, including the experts. So this is a great opportunity. Actually, we should try to think about how we can replicate the same approach in Brussels.
So, quickly, let me go into our approach to these very terrible issues. We are very committed to fighting any kind of abuse involving this kind of images online.
I will walk you through our commitment and our approach on this issue, in particular our reporting and removal tools, and then give an update on the next challenges that we are trying to face.
In terms of Google's commitment, we're talking about hundreds of people working on child protection across all the different departments at Google ‑‑ legal, policy, enforcement, but in particular engineering, because for a technological company like Google, the answer is first of all a technological answer to the problem.
That means 60‑plus teams across our whole structure, and it also means strong investment in supporting the child protection NGOs in fighting this phenomenon. We're talking about, globally, $24 million in the last year.
Our approach is based on three pillars.
First, remove the child abuse imagery and report the abuser as soon as we know about this kind of content.
The second approach is to invest in the best technology to detect this kind of content.
And the third is to work with industry, law enforcement, and the NGOs to make sure that we keep pace with the developments of the criminal organizations that are exploiting and distributing this kind of content.
So let me focus first on the removal aspect.
So I don't even have to mention that our policies are absolutely in line with European laws: we do not accept this kind of content on any of our platforms, with no grey area.
That applies across all the different platforms and in a very strong way, which goes beyond what is legal or not legal in a given jurisdiction, as we heard.
In terms of the proactive removal process, we have a very streamlined process where we receive flags from users on our platforms. On YouTube, for example, a user may find something that is not appropriate and can report it to us. And the same is very, very important for NGOs.
For NGOs, for example, we have a specific program called YouTube Trusted Flagger that allows them to have direct contact with our team and to report very quickly any kind of illegal content they may find.
We are working with the Internet Watch Foundation. We are working with NCMEC. We are working with FSM and with other NGOs.
All the content that they report to us is also hashed. I'm not going to go into more detail on this, but, basically, together with these partners we build a database of known child abuse images, and we add to these images a digital fingerprint, a hash, which makes it possible to identify when the same image is uploaded to one of our platforms and, therefore, to prevent the same image from going back online.
So it's not just receiving flags and removal requests from users and NGOs and reacting to them very promptly; it's also deploying a specific technology to make sure that the same image or the same content cannot go up again.
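To make the hash-matching idea concrete, here is a minimal sketch in Python. It is not Google's implementation: production systems use perceptual hashes such as PhotoDNA that survive resizing and re-encoding, whereas the plain cryptographic hash below only catches byte-identical copies. All names in the sketch are illustrative.

```python
# Minimal sketch of hash-based blocking of known abuse imagery.
# A real deployment would use a perceptual hash (e.g., PhotoDNA)
# robust to re-encoding; SHA-256 here matches exact copies only.
import hashlib

# In production this set would be built from databases curated with
# partners such as NCMEC and hotlines; here it starts empty.
KNOWN_ABUSE_HASHES: set[str] = set()

def fingerprint(data: bytes) -> str:
    """Return a hex digest used as the image's digital fingerprint."""
    return hashlib.sha256(data).hexdigest()

def register_confirmed_image(data: bytes) -> None:
    """Record a confirmed image so that re-uploads can be blocked."""
    KNOWN_ABUSE_HASHES.add(fingerprint(data))

def should_block_upload(data: bytes) -> bool:
    """Reject an upload whose fingerprint matches a known image."""
    return fingerprint(data) in KNOWN_ABUSE_HASHES
```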
User reports can reach us in many different ways. We want to put the reporting tools very near to the content, and this happens across all the different services we provide. For example, we provide the possibility to report content on different products through our help centers. On YouTube, you will find near each video a tool that allows you to report content that you think is not appropriate.
So, again, let me summarize what happens when we receive a removal request. Our team reviews the request, and we take action against the content when it is child abuse content. We then also report this content to NCMEC automatically, to make sure there is the possibility to investigate the content and the person who uploaded it.
Let me also focus for a second on our matching technology.
This technology is very powerful because it is not only about the content we receive notices on from our users and from the NGOs. We also take into consideration the database of images previously reported to NCMEC. So it's a really effective tool.
And the true effect of this is to prevent the redistribution of known images and to make sure we can follow up with law enforcement on investigations against the individuals who try to share these known images as well.
There was a recent case in Germany about that, and it's actually another very important tool.
We also regularly share information with other hashing technologies, like PhotoDNA, for example, which shows how the industry is working together on this.
Let me now move for a second to what we are doing on search, quickly, but it is very, very important, because search is actually the core of Google's services.
We started to work with the U.K. Government and now we are expanding these across all the different platforms in Europe and elsewhere. And our approach on search is based on different tools that we are applying.
The first is warnings. We identified more or less 1,000 keywords and terms that are related to child abuse material or to dangerous activities in this area, and every time these keywords are entered into our search engine, we show a specific warning informing the person making the search that the content they are searching for is illegal, and that the activity of searching for and disseminating this content is illegal.
We also provide links to NGOs that can support this person in confronting their problem, if there is a problem with child abuse material.
So, again, the first approach is making sure that every time someone searches for these specific keywords, this person receives a clear notice that the activities are illegal and that there is a way to help them get over these illegal activities.
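Purely as an illustration, the keyword-warning step might look like the sketch below. The real term list (around 1,000 entries, per the talk) and the exact warning shown are of course not public, so the entries and the URL here are placeholders.

```python
# Illustrative sketch of a keyword-triggered warning on search queries.
# WARNING_TERMS and the help URL are placeholders, not real data.
WARNING_TERMS = {"placeholder-term-1", "placeholder-term-2"}

HELP_MESSAGE = (
    "Child sexual abuse imagery is illegal. "
    "Support resources: https://example.org/get-help"  # hypothetical URL
)

def check_query(query: str) -> str | None:
    """Return a warning banner if the query contains a flagged term."""
    tokens = set(query.lower().split())
    return HELP_MESSAGE if tokens & WARNING_TERMS else None
```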
Second, we also provide a removal notification. If someone is looking for a specific link and this link was removed because it was notified to us by NGOs or by law enforcement, we clearly state on the search page that this link was removed because the content was illegal; and, again, there is a way to learn more through our partners and resources. This happens for specific links to child abuse material.
And the third tool that we implemented, and we think it is very, very powerful, is to work with our engineers on search quality to make sure that related searches are covered as well. So, coming back to the grey area: all related searches on child abuse material are demoted on Google, so that this content does not come up at the top of the search results. That is, again, to make sure that we address not just the content that is illegal, but also the content that we want to treat much as we treat the very worst things online, like malware, for example, or other illegal activities: we demote these results.
And the results we are seeing are actually great. Traffic to the pages we targeted by demoting them has dropped by a factor of eight. So today it is much, much more difficult for someone who is not deliberately looking for this kind of content to end up on these pages.
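The demotion idea can be pictured as a simple re-ranking step, sketched below. The real ranking signals are proprietary; the Result type, the penalty factor, and the is_flagged hook are assumptions made purely for illustration.

```python
# Toy sketch of demoting flagged results in a ranking function.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Result:
    url: str
    score: float  # base relevance score from the ranking stage

def rerank(
    results: list[Result],
    is_flagged: Callable[[Result], bool],
    penalty: float = 0.01,  # illustrative demotion factor
) -> list[Result]:
    """Sort results by score, strongly demoting flagged ones."""
    return sorted(
        results,
        key=lambda r: r.score * (penalty if is_flagged(r) else 1.0),
        reverse=True,
    )
```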
That has also had an impact, for example, on the number of notices that we are receiving, which is actually positive.
All this is to say that it is a combination of tools and processes that we need to put in place. But on top of all of these tools, I think the technological answer to this problem is still very important and still where a company like Google has to focus. There are plenty of things still to be done, but through the collaboration with NGOs and with law enforcement, I think and trust that we can provide a strong answer against these issues.
(Applause)
>> JUTTA CROLL: Thank you very much, Marco Pancini.
This was not only the industry perspective I would say, but it was really what industry can do about the phenomenon. So thank you very much for explaining this.
Natasha, would you like to comment on the industry perspective as well, from your point of view?
>> NATASHA JACKSON: Thank you.
I am Natasha Jackson from the GSMA which is the international association of mobile phone operators so the 850 or so mobile operators around the world are our members.
And we took action against child sexual abuse content five years ago now when we launched a voluntary initiative called the Mobile Alliance Against Child Sexual Abuse Content.
As I said, it's a voluntary initiative and focused specifically on the issue of child sexual abuse content.
It was discussed and agreed at board level where operators were absolutely adamant that their networks shouldn't be abused by people who are looking to exploit children online.
And as members of the alliance, the operators who join make very strong commitments to undertake specific actions.
The first action is to have hotline reporting mechanisms. They work with hotlines in their countries, or with international ones where national hotlines don't exist, to make sure their customers can easily report any instances of child sexual abuse content that they come across.
Secondly, they make sure that they have their own house in order, with notice‑and‑take‑down processes enabled within their organization to rapidly remove any content that they may find.
There are also terms and conditions and acceptable use policies which address this illegal content in order to reserve the right to take action.
Other examples of internal processes reflect the knowledge that repeated viewing of these images is itself harmful. So there are processes to make sure that any employee's viewing of any of those images, for example when handling a report from a customer, is minimized or avoided altogether, and that material is always passed on to a hotline so that the appropriate agencies can assess and view it.
So there is absolutely no assessment of any images or viewing of that as part of the process within the organization.
And the third commitment is to undertake technical mechanisms to prevent access to this content.
And, once again, this is a really important point and very different from content companies, Google and others who host content. The operators, of course, will not interfere with communications, and there are regulations around what they are allowed to do. They are access providers, not content hosts.
So they will implement technical measures to restrict access to this illegal content, but they do it on the basis of a list that is provided to them. That list could come from an appropriate agency in their country; that may be law enforcement, and depending on the country, it could vary.
They will take that list, without reviewing its contents themselves, and implement blocking mechanisms.
Importantly, they don't make decisions on the content themselves; and that's a really important principle for access providers.
If a mobile operator does host content, that's a different issue; but in most cases, that's not what we're talking about.
So for us, while we very much support this area and recognize the impact and the need to protect children, the line between what is legal and what is illegal is quite firm, and that makes it difficult for us. Obviously, in countries where some of the posing images that we saw earlier, or the staged ones, are illegal, they go onto the list. For instance, if an image were illegal in Denmark and on the police list, then access to it would be restricted from mobile networks. But because this works through the lists, we need legal clarity on what belongs on them.
The lists that our alliance members use cover only child sexual abuse content; and in many cases, they take either a local list or, where that is not in place, one from the Internet Watch Foundation or perhaps from Interpol.
So, from our perspective, we have the additional objective of always looking at what future issues are coming our way and how we can help collaborate with all the stakeholders. So we look at issues like mobile payments.
If mobile payments become part of the mechanism by which people get access to these catalogs of posing images, then that's something we will look at as well.
But, really, we encourage national debate on this; the more people understand it, the better. And the more legal clarity we can get on these images, the more readily they will fall straight under the existing commitments of the operators.
(Applause)
>> JUTTA CROLL: Thank you very much, Natasha, for your comments and for explaining a little bit more what the role of the mobile access providers could be.
And I'm really grateful that you also mentioned mobile payments, because we'll come to that point later on, talking about the role payment and money play in this, the financial and commercial interests.
At this point of time, we also want to look at what can technology really do. Marco Pancini has already explained how Google employs technology to detect and to remove such type of content.
And as I mentioned before, the context of the images, the context of the provision of the images, and also the comments, that means text, matter when it comes to categorizing whether something is grey area imagery or not. And we have to look at new technologies to better understand and to help categorize the huge number of images out there.
Our research colleagues from the German Research Center for Artificial Intelligence (DFKI) are collaborating with us in the project Network No Grey Areas. And Christian Schulze from DFKI will now present what they are doing: which technologies are already in use, and how they are trying to categorize the context more exactly.
So do we have Christian on WebEx now, because he's also a remote speaker to us today.
I hope the technology will work.
We have the slides already there. Christian, are you there?
>> CHRISTIAN SCHULZE: Yes, I'm there. Can you hear me?
>> JUTTA CROLL: Seems we need to have some patience, but maybe you have questions in between. But maybe we can answer right now.
(Discussion off microphone)
>> JUTTA CROLL: Okay. But we don't hear him.
>> CHRISTIAN SCHULZE: So I'm here. Can you hear me?
>> JUTTA CROLL: Great. Hello.
You just should start, and I will go on with the slides.
>> CHRISTIAN SCHULZE: Okay. So I try to share, but I was asked to follow the slides. So that will be fine with me.
Can you get up the slides?
>> JUTTA CROLL: You should just start. We have the problem statement slide now.
>> CHRISTIAN SCHULZE: Okay. Yes.
So there are multiple ways for perpetrators to seek and find this illegal content on the web: searching different sites, forums, blogs, and social media channels, not only for the content itself but also for pointers to that content. In this case, a specific coded vocabulary is usually utilized.
In addition, the vocabulary may change over time as law enforcement gains awareness and is able to take down these offers, and trends form within communities around this vocabulary. From a technical point of view, this all turns out to become a natural language processing problem, where opinions and sentiment are to be drawn from these texts. And this all requires a deep understanding of how language works, which is technically still posing a big problem.
Next slide, please.
So one way of tackling this issue would be so‑called multimedia opinion mining, a topic that DFKI has looked into closely. Drawing the opinion from text is a specific natural language processing problem.
We need to understand the sentiment of the text on the one hand, and also, in order to make a decision, what this text is targeting. So we need to get insight into how text is being used by people.
So, actually, this all aims at analyzing search terms and search phrases in order to prevent troubling results that might be available on the web.
So in order to do so, we can go and analyze a bunch of text documents that are possibly provided by a search engine given a certain query.
So the text drawn from these documents can be turned into descriptive features. For text, these are typically term frequencies, that is, the occurrence counts of certain words and word sequences, so‑called n‑grams. The resulting features can then be used to train particular classifiers for detecting particular search phrases or queries.
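A minimal sketch of this feature-and-classifier idea, using scikit-learn with TF-IDF-weighted word n-grams. The training data shown is purely hypothetical; a real system would be trained on vetted, labeled corpora of search phrases.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder training data; real labels would come from expert review.
texts = ["benign placeholder phrase", "flagged placeholder phrase"]
labels = [0, 1]  # 0 = acceptable query, 1 = flagged query

# Word n-grams (unigrams and bigrams) weighted by term frequency
# (TF-IDF), feeding a linear classifier.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(),
)
model.fit(texts, labels)

print(model.predict(["some new search phrase"]))  # -> array with 0 or 1
```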
Due to recent developments in this field, it is now possible not only to analyze certain segments of text, but also to take other modalities, like images and videos, into account.
These might be linked from the documents we find for our query, or they might even be embedded. And with respect to the whole posing‑image problem, from the legal point of view, the legal nature of the imagery is often defined by the context it appears in, which in this case refers to the text surrounding a particular image or video.
To do so, techniques are needed which actually allow describing the visual or even audible content by a textual description, that is, understanding, from a technical point of view, what is visible in the images and videos.
A major step forward towards being able to do this are the novel deep learning techniques that are currently under heavy research in the community worldwide, which seem to do exactly that.
So the next slide: this is basically a system sketch of how to analyze search phrases with the aim of finding illegal content.
What can be done is to pass the search phrase on to a search engine, which will query different channels, social media, blogs, and other sources within the Internet, resulting in a set of documents, images, and videos. Given feature extractors built on deep learning, for instance the approach that is being developed, we are able to generate the necessary features for all of these modalities. These are fed into separate classifiers for text, visual information, and audible information, which then, in turn, make it possible to decide whether a particular search phrase, or any text that is passed on to the search engine, aims at finding illegal or non‑illegal content. That is basically the arrow going back to the search phrase: we are able to decide whether this is an acceptable phrase or not.
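Schematically, the pipeline just described could be organized as in the sketch below. Every function body is a stub standing in for a trained deep model or a search back-end, and the maximum-score fusion is only one of several possible choices; only the control flow is meant to mirror the talk.

```python
# Schematic sketch of the multimodal search-phrase analysis pipeline.
from dataclasses import dataclass, field

@dataclass
class Retrieved:
    texts: list[str] = field(default_factory=list)
    images: list[bytes] = field(default_factory=list)

def retrieve(phrase: str) -> Retrieved:
    """Stub: query search channels and collect linked documents/media."""
    return Retrieved()

def text_score(texts: list[str]) -> float:
    """Stub: classifier estimate that the texts point to illegal content."""
    return 0.0

def image_score(images: list[bytes]) -> float:
    """Stub: classifier estimate based on visual features."""
    return 0.0

def is_acceptable(phrase: str, threshold: float = 0.5) -> bool:
    """Fuse per-modality scores; a simple maximum is used here."""
    r = retrieve(phrase)
    return max(text_score(r.texts), image_score(r.images)) < threshold
```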
So this would basically conclude the technical aspect, but I want to mention another development. Just yesterday, there was a meeting with some people from Google in Zurich, people from SafeSearch and YouTube, who are interested in technologies for classifying images and videos for illegal content. Technology previously developed at DFKI is able to detect explicit content so far; given appropriate training data in the area of posing images, these technical solutions can also be extended towards detecting that kind of illegal content on the web.
And by that, if deployed by the platform hosts, it would basically be possible to prevent the distribution and the finding of such material as well.
>> JUTTA CROLL: Thanks a lot, Christian. I think we are running a little bit out of time, but there might be questions on this highly technical explanation that you've given.
I thought it was very useful that you made the link to the work you are doing on analyzing the search terms that are in use, because this is a precondition for the efficiency of the mechanisms employed by Google: the warning should show up when a search term that has previously been categorized as typical for a perpetrator is entered, so it's built on the relationship between the two.
So are there any questions from the room regarding these technical aspects? I see someone ‑‑ you need to come over to the microphone that we give you.
Could you please introduce yourself.
>> AUDIENCE: Sorry. I did not know I had to come to this side.
Yes. I want to say it's important that researchers are going to great lengths in explaining how they look at phrases, meaning, and interpretation, not only what a meaning is but also the context.
Telecommunications companies have voluntary initiatives on how to stop this kind of content. That is good.
(Technical audio difficulties)
And law enforcement could come and say how it's their duty to protect children. So our governments have the role to ensure children are protected.
But I want to pose a different question.
Is it the role of government to raise children?
Because I want to ask that question by posing that this is a question for debate.
And at the end of the day, it's the parent who has the duty to raise the child. Now, we can look for support from the technical community, government, and everybody else; but at the end of the day, the primary role in raising children belongs to the parent. All of this is semantics and, at the end of the day, useless if the parent is not empowered.
So I think the missing voice in this conversation, which should be amplified, is the role of civil society. I'm happy to see Google saying how they are involving NGOs, because this is a free speech issue, censorship of content online. And if we allow government, or we allow companies, to censor content in the name of protecting children, there is a bigger issue.
So what we must do is ensure that, alongside the parent, there is more civil society advocating for free speech, so that child protection is not used as a pretext to censor the Internet.
So I want to welcome everybody's intervention, including technical; but let's not forget whose role it is to actually protect the children.
And in the interest of disclosure, I am implementing Webrangers Kenya, which is a Google‑supported child online protection program. So I do know the issues, and I am from civil society. Thank you.
>> JUTTA CROLL: Thank you for the statement. I think we will get back to it when we have the debate at the end of the round, because the education of children is a big issue; and there is a saying that you need not only the parents but the whole village to raise a child. We are the whole village whose purpose is to protect children on the Internet. And we will come back to that afterwards.
I would now like to turn to the commercial aspects of the phenomenon that we've already mentioned, and I'm happy to have Gregory Mounier from the European Cybercrime Center in The Hague.
Thank you.
>> GREGORY MOUNIER: Thank you very much, Jutta. And good morning, everyone.
So I'm from the European Cybercrime Center which is one of the units of EUROPOL which is the European law enforcement agency. Just a few words about EUROPOL.
We don't have any executive powers. We support the 28 EU member states' law enforcement authorities to tackle organized crime, terrorism, and child sexual abuse online and elsewhere.
In terms of child sexual abuse and exploitation, we have a team of about 15 specialists and criminal analysts working full time on that issue.
We work on the basis of criminal information that is sent to us by law enforcement agencies around Europe and beyond, actually. We work closely with the FBI and the Australian Federal Police. We focus on all forms of criminal behavior against children.
On the specific aspect of the criminal, commercial distribution of child abuse material, we are part of the European Financial Coalition, which is a broad multistakeholder coalition including NGOs like INHOPE and the Internet Watch Foundation, Google, and a number of other partners. We've just issued a report on the commercial sexual exploitation of children online, and I invite you to have a look at our website to read the report.
The first point I want to make is really about the volume of child abuse material that can be found online. I mean, it's enormous, huge. Those 15 or 20 specialists working in that field are handling about 20 ongoing major international operations. We have had several successes, like Danfor (spelled phonetically) recently on P2P networks. But still, the amount of material we're talking about is really big.
About two or three years ago, one of our law enforcement agency partners mentioned that in one of the biggest EU member states, their estimate was that more than 50,000 individuals were involved in exchanging and sharing child abuse material online. And every time we run an operation, we get new, fresh material, much more than we can really process, actually. And I really want to stress the limited resources of the law enforcement community to target these types of horrific crimes, which leads us, of course, to prioritize our targets. We go after the biggest administrators of the forums, after the most active criminals. Of course, we can't tackle everything, which leads me to the point of the grey area.
From a purely professional and police perspective, the grey area is important, of course; but because of the prioritization we have to do, we go after the clearly illegal child abuse material.
Of course, in the end, after the analysis, it's down to the member states. We provide them targets and say: in your country, you have 3,000 targets that are active, so please investigate, please arrest them. And then it's down to the member states to say: actually, this is borderline according to our national legislation.
So, of course, we support the initiative of Germany, for instance, where there is an amendment to the criminal code to explicitly draw the line on this kind of child abuse material, and we would like to see something like that at the European level because, of course, we would catch more people.
But again, my point, like Marco and Natasha said, is that from the operational perspective, there is no grey area: it's either illegal or not, and we go after everyone.
Now, I was invited to focus on the various payment measures and methods that are used to distribute commercial child abuse material. I think from that perspective, there are two principles. The first one is those offenders that are involved in those types of crimes are extremely IT savvy.
We see that this particular community of criminals is using cutting‑edge encryption technologies and all the new tools, technologies, and methods that can obfuscate their activities online. And they are very, very concerned about their anonymity and their security.
So whenever they have to use payment methods, they will naturally go towards alternate payment methods. That's what we call them. So that means money transfers, digital wallets and virtual currencies.
The second principle is that it really depends on the modus operandi. If you take, for instance, the currently booming trend of live streaming of abuse online, you will see that they are using money transfers, because it's just more convenient.
Money transfer companies have bureaus around the world, particularly in South Asia and Africa and everywhere that crime tends to take place more often. And so they will use that because it's convenient.
But when we talk about darknets, or hacked websites online and so on, then they might be using virtual currencies more. That, too, falls under the second principle.
But, overall, there is a massive trend, a shift or migration, from traditional payments like credit card payments to these alternative payment methods, not just because of robust law enforcement action, but also because of what the private sector is doing, whether in the financial and banking sector, the payment industry, or among the electronic service providers.
Also, they are shifting to these alternative payment methods because it's less traceable. And for them, it means it's less vulnerable and provides more anonymity.
In terms of volume, trying to give a quantitative assessment is very difficult; but on the basis of information we received from partners such as INHOPE and the Internet Watch Foundation, and by analyzing that information, we have come up with a very indicative assessment.
So I think about 50 percent of, for instance, the websites and URLs that are providing commercial child abuse material are using money transfers.
Thirty percent are still using credit card payments, and then 20 percent digital wallets and virtual currencies.
Again this is completely indicative. It's more or less the feeling from the law enforcement perspective.
Now, to focus more on virtual currencies. There is definitely a big trend toward increasing use of cryptocurrencies, but I have to say we have limited evidence.
We find that they are using virtual currencies mostly on the darknet, but also sometimes on the open web. Recently, we had an operation where we saw a hacked legitimate website that was providing child abuse material. It exclusively accepted Bitcoin payments.
But, again, I don't want to single out Bitcoin, because they are using any virtual currency; it can be Litecoin, Dogecoin, it can be anything.
Another example is that in 2013, during the takedown of Silk Road, the illegal online marketplace that was distributing drugs, firearms, and everything, we found, or investigators from the FBI found, that there were links to child abuse material in the blockchain transactions related to the Silk Road marketplace. So these are just indications that they are using virtual currencies to distribute child abuse material.
But, again, those users, especially the ones on the darknets who are interested in child abuse material, are very IT savvy, and they have a very good understanding that any use of payment methods is a vulnerability for them. It is a risk. So they will try to avoid that.
And we are monitoring some discussions on Tor network forums, where participants are discussing whether it's really worth trying to produce material for financial gain, because they know that it's a tradeoff with security.
And we also see discussions on alternate methods of trying to generate new materials.
As a conclusion, I would say we need to keep monitoring the new trends, the migration from traditional payment methods towards new alternative methods. We need to work with all the stakeholders; that's very, very important. Every time, we are setting the bar a bit higher for those offenders. We need to make it more difficult for them to distribute child abuse material.
And I think I will conclude on this. Thank you very much.
(Applause)
>> JUTTA CROLL: Thank you, Gregory. I think you made a very important point: you mentioned the huge amount of images that we're talking about and the need to prioritize. And that also shows that we need these different strands of mechanisms and activities to address the problem. It cannot go only one way. That's where the hotlines come into their role, because on the one hand, when hotlines discover material that is considered illegal, it goes straight to law enforcement so they can deal with it.
But in the Network No Grey Areas, when the hotlines receive reports on material that, even under national law in Germany, could not be considered illegal, they turn to companies like Google and still ask whether it's possible to remove the content.
And I'm pretty sure that all the industry partners, they know about freedom of expression, and they will be very careful how they react to these reports.
But when it comes to the trustful relationship within the network, you can be pretty sure that the hotlines have already assessed and considered whether they can turn to the provider asking for removal of the content.
We have another remote speaker, Akio Kokubu.
He joins us across a big time difference from Japan, from the Japanese Internet Association, which is running a hotline as well. And Japan has already had an amendment to its legislation with regard to posing images.
So, Akio, are you there?
>> AKIO KOKUBU: Can you hear me?
>> JUTTA CROLL: Yes, we can hear you. Wonderful. The slides, please.
>> AKIO KOKUBU: Can you show us the slide.
>> JUTTA CROLL: Yes, the slides are here.
You can just start, please.
And very quickly.
Please go ahead, Akio.
>> AKIO KOKUBU: Yes. Before talking about hotlines, I want to introduce the regulation of child sexual abuse material in Japan.
Model photo albums are sold in stores and online.
The depictions can be too provocative, such as sexually suggestive poses, partially clothed or in very small swimsuits.
They are illegal in Japan if such a photo corresponds to the child pornography law, Article 3, as shown ‑‑ we still don't have the slide; there are examples in the slide.
Several people involved in selling and printing were arrested very recently over such distribution of child sexual abuse material.
Can you show us the slide? Can you hear me?
>> JUTTA CROLL: Which slide do you mean? The role of Internet hotlines?
Can you go ahead, please.
>> AKIO KOKUBU: Yes. Although the police crack down on child sexual abuse material that is thought of as grey area, whether a case leads to prosecution varies. To improve this situation, organizations including ECPAT Japan and other child welfare organizations have called for changes to the child welfare law rather than the child pornography law.
Still we don't have slide.
I'm going to introduce the Internet hotline Japan.
The hotline has been operated under contract with the National Police Agency since 2006. We receive reports anonymously from Internet users through a web form and provide the information to the National Police Agency.
We also request the removal of illegal and harmful information from ISPs and website administrators.
Seventeen staff and two managers are currently working in the hotline.
The number of reports from Internet users was about 150,000 in 2014.
That meant for such a huge number of reports for us ‑‑
>> JUTTA CROLL: Akio, we can hardly understand you.
So if you don't mind, I would like you to come to the end of your presentation, because it's very difficult for the audience to understand you.
>> AKIO KOKUBU: Yes. Can you show us the slide.
>> JUTTA CROLL: I think we have the last slide now.
>> AKIO KOKUBU: Last slide. Okay.
This is the last slide. With illegal information received from Internet users, the hotline will generally carry out the following:
Number one, inform the police.
Number two, submit notice‑and‑take‑down requests to website administrators and ISPs.
Number three, inform related institutions.
Number four, handle illegal information through international cooperation such as INHOPE.
The context of the information is confirmed by human eyes. In the past, I tried to develop an automatic content rating system for the Internet with image recognition technologies. It was good for screening, but confirmation by human eyes was still needed, because such technology did not take into account the context of the information. This is the reason why we employ so many staff, as I mentioned.
Lastly, hotlines primarily pass on information; namely, they do not remove such information by themselves, to avoid criticism of censorship.
Public understanding of child sexual abuse material is necessary to receive such reports.
Awareness activities for users are very important.
I thank you very much for listening to my talk.
Can you hear me?
>> JUTTA CROLL: This will be commented on by Amy Crocker from the INHOPE Foundation, who can explain from the broad experience of the more than 50 hotlines that are organized in INHOPE.
>> AMY CROCKER: Thank you very much.
Yes, just a clarification.
I represent the INHOPE Foundation but also the INHOPE Association, which is an international membership organization. So I should clarify that INHOPE is not a hotline, but we have members in 45 countries, 51 members who operate national services. We've just had a very good presentation from one of our members in Japan, and I'm very happy to have two of our leading members in the room, from the U.K. and the Netherlands, so if I say anything wrong, they can tell me off.
But you'd like me to speak to the role of hotlines in this area? Yes.
Again, because we're a membership organization, I have to make it clear that each hotline operates under its own national legislation and will have processes in place to receive, analyze, and refer onwards child sexual abuse material for removal at source. That means that, through the INHOPE network, using the international exchange platform that we have, reports of confirmed child sexual abuse material received in one country are sent to the country where the material is confirmed to be hosted, for removal in that country.
And we operate a global mechanism that all our member hotlines have access to and cooperate through on a daily basis to have that material removed.
The role of hotlines in relation to material in the grey area is distinct from the role of INHOPE. We would defer to our national members, who would make a decision at their national level, based on their legislation and on the relationships they have with industry and law enforcement, about what they would do to potentially take action against material that is within the grey area.
But I think definitions are important, and I do pick up on Natasha's point, Marco's point, and actually Gregory's point as well: this idea that we need clear definitions and need to be acting based on what is confirmed to be illegal in any particular country.
That said, there are examples ‑‑ Marco from Google gave such an example ‑‑ of companies that on a voluntary basis will take action against material that may be questionable and may be concerning. But it's not the case that every hotline would be able to do that or would have the capacity. And I very much pick up on Gregory's point about being mindful of law enforcement's capacity to act. We need to manage expectations in this area about what is possible.
But at the same time, we absolutely support an initiative that is trying to raise awareness and sensitize the general public to the nature of this material: not just the material that is confirmed to be illegal, which is the explicit sexual exploitation and abuse of a child, but also material that could be used for sexual purposes.
So I hope that gives some insight into where INHOPE sees this.
And I would point out that, with 45 countries in our network, there are countries where there is now legislation in place that covers what is sometimes referred to as child modeling or child erotica, while it is not illegal in other countries. So there is huge diversity within our network.
One of the things that INHOPE is also doing, with a new platform that we've developed over the last two years, which is primarily a content classification and hashing tool providing information to victim identification law enforcement networks via Interpol, is creating categories for material that would be defined as not illegal by the hotline in the reporting country but might nevertheless be illegal in the hosting country.
So there are things that can be done from the hotline perspective, but we have to be mindful that we have to defer to national legislations and national procedures on this issue.
But from a child welfare perspective, absolutely I think there's huge importance in making this issue of grey area material ‑‑ you know, putting it into the public debate.
>> JUTTA CROLL: Thank you for going a little bit more in ‑‑
(Applause)
>> JUTTA CROLL: Thank you, Amy, for explaining a little bit more about the role of the INHOPE Association, the INHOPE Foundation, and what the single hotlines can do.
And you've already referred to the role that child welfare organizations, child welfare advocates can play or have to play in that issue.
And now it is time to turn to ECPAT International, one of the most important players in this field. Marie-Laure Lemineur will introduce you to a recent activity that 35 child protection organizations around the world have undertaken to put more focus on this issue and to broaden the understanding of what is unacceptable to be displayed on the Internet.
Please, Marie-Laure.
>> MARIE-LAURE LEMINEUR: Thank you, Jutta.
My name is Marie-Laure Lemineur. I work for an organization called ECPAT International. We are based in Thailand, and we are a network of 85 member organizations based in 77 countries.
So my job today is basically to speak about the communique that has been issued by the network Jutta mentioned, the network on grey area images that was launched in 2014. First, a bit of background information.
The network first worked for a few months at the national level in Germany. Then, in the framework of the German G7 presidency, the network decided to organize a roundtable towards the end of October. The roundtable was aimed at discussing the issues around grey area images and at adopting a communique. The aim of the communique is to help contain all forms of sexual exploitation of children online and to declare a policy of zero tolerance towards exploitative images, such as the grey area images that are online.
So, basically, I believe you have the text of the communique somewhere. You should be able to see the slide so I won't read it. It's quite boring. You can have a look yourself and ask questions about the content.
It's divided into two sections: first a brief introduction with definitions, and then the postulations.
The communique has been signed so far and endorsed by 34 organizations. It's still open for endorsements, so whoever is here representing an organization and is interested in joining in, please do approach Jutta and her team to discuss it.
It has been endorsed by several national and international organizations, such as ECPAT Germany, ECPAT Sweden, the Canadian Centre for Child Protection, Child Focus, and the European NGO Alliance for Child Safety Online.
And speaking of next steps, the idea is to officially issue the communique on November 18, which is the new European Day on the Protection of Children against Sexual Exploitation and Sexual Abuse.
So that would be the day that the communique will be officially launched.
That would be it. Thank you so much.
>> JUTTA CROLL: Thank you very much, Marie-laure Lemineur, for introducing us briefly to the communique.
Due to the technical problems we have been running a little short of time, but I think there is still room for two or three questions.
If there are people in the room who are interested in the work the network has already been doing, in how the idea of the network can be transferred to other countries, in how you would like to take up the idea, or in how it can be supported by technology, please feel free to come over to the microphone and put your questions to the panelists and speakers.
>> AUDIENCE: Hi. Thank you for this presentation. My name is Maria Garcia. I'm an activist from Mexico, and I'm part of the Youth at IGF Program.
One of the concerns that we as youth have is that in this fight against child exploitation, many times teens are criminalized because they are sharing images that end up in these filters. Or legislation that is used to attack something horrible is also used to attack teenagers.
So I wanted to know, in your experience, what cases you have encountered, or what actions you have taken, to prevent this kind of joint effort from being used to criminalize teens.
Thank you.
>> MARIE-LAURE LEMINEUR: Yes. There is a workshop this afternoon about youth self‑produced content that would address specifically this issue. So I would invite you to join in. It's room ‑‑ it's at 4:00 p.m., but I don't remember the room. You can look on the schedule.
>> JUTTA CROLL: I just want to add that if you have a look at the communique, you will find that we have explicitly not been talking about self-produced content, because in the communique we want to address the responsibility of the other stakeholders that are involved. We did not want to place responsibility on youth who have produced images themselves.
Although we know that self-produced images can also end up in that abusive area and can be exploitative, it should not be the sole responsibility of the youth. So it's not about criminalizing youth; it's an attempt to involve the other stakeholders.
I think we have a remote participant, and then we have Patrick Curry as another questioner.
>> REMOTE MODERATOR: Gitte Jakobsen wants to comment on that question.
Please, Gitte.
>> MARCO PANCINI: I have to leave for another meeting; but if there are any pressing questions, I'm happy to answer.
>> GITTE JAKOBSEN: Yes. Jutta, I think you said that very well. This grey area is not about self-exposure material, although it could include it.
But that's not the issue.
When you look at the images I showed you, you know exactly the difference when children are exploited for someone else's purposes.
I also want to address Gregory's presentation. Thank you, I liked it, also from the enforcement perspective.
I agree with you that it is a matter of police priorities whether the grey area should be on top or not.
But in our experience, some grey area images can actually be the last missing piece in the big jigsaw puzzle that leads to the final investigation of perpetrators. Sometimes grey area images hold even more information for the identification of victims, places, and perpetrators, because offenders seem to be less careful about what they put in images that are not completely illegal.
So, for that reason, including grey area images in what counts as illegal makes it possible to enter the data into law enforcement databases, and also into the Interpol databases that we in the INHOPE network also feed data into.
>> JUTTA CROLL: Thank you for that explanation.
And I see some kind of confirmation from law enforcement from my left side.
Would you like to add something? Then we will take another question.
>> GREGORY MOUNIER: No, I don't have much to add to that. I agree with you, Jutta. And I would even say that grey area images sometimes lead investigators to other hidden websites behind which victims are found. Yes, it's very, very important from an investigation perspective because, as you say, these images give more evidence.
>> AUDIENCE: Thank you. My name is Patrick Curry. I'm from an organization called BBFA.
I'd like to make a point first of all.
The requirements to be able to identify, detect, and prevent the different kinds of activity we're talking about here are many, and much of this is already advancing in other related areas. So I would make a plea for greater cooperation and coordination on the cross-cutting activities.
I'm also here on behalf of the EU project MAPPING, which includes Interpol, and we've had exactly this discussion in Interpol about how we can use these capabilities and techniques.
We are seeing governments looking for more regulation, particularly the United Kingdom, around age verification for online access to adult content, as it's called: 7.5 million websites that we know about. How the United Kingdom is going to enforce some kind of age verification legislation for that, none of us really knows at the moment.
But I would suggest there is a lot of crossover that can take place here.
And I'm particularly looking at some of the advances in blockchain technologies for assurance, in identity management, and in some of the more advanced video technologies.
We haven't mentioned video, but in the adult world the balance of content generation today is moving from images to video, simply because it is now possible.
So that presents significant additional challenges in that area.
So I realize we're running out of time, but I'd like to make a plea for greater cooperation. And I will be very happy to discuss in detail what that means.
And the second point ‑‑ my question is: How important is video to you?
>> JUTTA CROLL: As I'm not representing a hotline and I've never analyzed the content, I cannot answer that question completely.
What we know is that there is evidence, for so-called child pornography, that grey area imagery is also embedded in video material, but we can't tell to what extent.
Does anybody else on the panel have experience with that?
>> NATASHA JACKSON: Video is important, and I totally acknowledge your point on the hotlines; we have hotlines in the room. I've also worked with law enforcement, and there has been a huge migration to and increase in video because of greater storage capacity, broadband, all of these things.
It's very important and there are several ongoing efforts. Google ‑‑ who have left the room ‑‑ are currently developing and testing the video technology.
>> AMY CROCKER: The tool that I mentioned, which INHOPE has developed over the last years, uses a video hashing solution in order to process videos and provide that information to Interpol, to law enforcement. So, absolutely, it's important.
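As a rough illustration of what a video hashing approach can involve, the sketch below samples frames at a fixed interval and computes a simple perceptual hash for each one. The sampling interval, the hash function, and the comparison strategy are assumptions made for illustration only, not the solution actually used by INHOPE or Interpol.

```python
# Rough sketch of frame-level video hashing: sample frames at a fixed
# interval and compute a simple perceptual hash for each one. Interval,
# hash function, and output format are illustrative assumptions only.

import cv2  # OpenCV (opencv-python), used here for frame decoding


def average_hash_frame(frame, size: int = 8) -> int:
    """Simple 'aHash' of one frame: robust to re-encoding and resizing."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    small = cv2.resize(gray, (size, size))
    avg = small.mean()
    bits = 0
    for p in small.flatten():
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits


def hash_video(path: str, every_n_frames: int = 25) -> list:
    """Return one perceptual hash per sampled frame of the video."""
    hashes = []
    cap = cv2.VideoCapture(path)
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:  # end of stream or decode failure
            break
        if index % every_n_frames == 0:
            hashes.append(average_hash_frame(frame))
        index += 1
    cap.release()
    return hashes


# Two videos can then be compared by counting frame hashes that lie
# within a small Hamming distance of each other, so a re-encoded or
# trimmed copy of known material can still be flagged.
```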
And I think you would also agree with me that no one has yet solved that problem. But there's a lot going on in that space because we have to respond to the reality of what we see around us.
And this is not speaking to grey area material. This is speaking about the problem in general.
So I'd like to get an answer on the content side.
Marie-laure?
>> MARIE-LAURE LEMINEUR: Very briefly: I think it's very clear to all of us in the community who are trying to combat child sexual exploitation online that when we speak about child abuse images, we cover both still images and moving material.
And we are also very aware, based on reports from law enforcement, that the share of videos among the collections is growing. But you're obviously aware of that.
>> REMOTE MODERATOR: We have one brief comment from Gitte Jakobsen from Denmark.
>> GITTE JAKOBSEN: Yes. Just to say, we certainly do include video in this material, also from a content analyst perspective. Sometimes within the abuse material we see images that are thumbnails from videos.
But hashing of videos is going on, and it's an important part of this. Video is also included in the definition we give of the grey area material.
And then there is a new aspect of the whole field: sex tourism no longer requires you to travel, because there is a lot of digital or video-based sexual abuse going on, part of it illegal, but also in the grey area, because for some, that's what they want.
So video is also on the table for this.
>> JUTTA CROLL: Thank you, Gitte, once again, for stepping in.
I think we have one last question.
>> AUDIENCE: Thank you. My name is Arsene Tungali. I'm from the Democratic Republic of the Congo. I'm the ISOC ambassador for this year.
I don't have a question but comments.
The issues of online sexual abuse of children look very different in different countries, and child online protection in developing countries, if we can say it like that, faces different challenges.
Take hotlines, for instance: in countries like mine, there are no hotlines that can support children in reporting cases of violation.
So I ask whether any of your organizations supports African countries, or other less developed countries, in dealing with these issues of child online protection.
Thank you.
>> NATASHA JACKSON: Yes. On the GSMA's activities, we've been focusing a lot on Africa recently, providing capacity building in the area of children and mobile phones: not just around child sexual abuse content and exploitation, but also around children's safe and responsible use of the Internet.
We've also participated in multistakeholder workshops with some of the participants who are in this room, so we can start building an understanding of some of these issues and of best practices and examples from around the world, so that African countries can deal with this more swiftly than perhaps we have previously.
>> AMY CROCKER: And just in relation to your points about hotlines and helplines; sometimes there is confusion with the terminology. But I absolutely agree with your point. And, actually, one of the things that INHOPE is doing through the INHOPE Foundation is trying to support start-up initiatives that would like to establish exactly that kind of national reporting mechanism in their countries.
And the Internet Watch Foundation, who are also in the room, also have a solution, and they work very much internationally to try and support countries to do a similar thing. So feel free to speak to either INHOPE or the Internet Watch Foundation, and I can give you more information.
It is a challenge.
I would also say that I haven't been to your country, but I agree there are very different challenges in different countries. What we also see, and have seen, is that as Internet connectivity increases and access spreads, there are more opportunities for the exploitation of children. And there are similarities in the way that children can be reached and then exploited online.
So in that sense, you know, we can have some common solutions and certainly from a technological point of view, the solutions can be similar.
And, yes, we've been working very closely with GSMA, and they're doing great work on the continent to try and support activities in this area.
>> JUTTA CROLL: So I think it's time to conclude the session.
And I would like to make use of the opportunity to come back to the question that was raised regarding freedom of expression.
Those of you who have been looking into the program for the Internet Governance Forum might have seen that this workshop was categorized under the subtheme of human rights. That was not done by chance but on purpose, because we think it's very important to talk about the human rights of children and to ask whether the protection of the dignity, privacy, and physical integrity of children can be balanced with the right to freedom of expression. That is one of the questions we will bring forward from this workshop to the main session on human rights, which will be held on Friday morning, I think at 11:00.
So please feel free to come to the main session on human rights as well, where we can continue the discussion about the balance between freedom of expression and human rights.
It's time to thank all of you for being so patient, for staying so long with us for the debate. And it's time to thank all the speakers. Unfortunately, Marco Pancini had to leave.
If some of you have questions for Marco, you can come right to me and I will pass them to Marco. He will answer them afterwards.
My last comment. The workshop that was mentioned on self‑produced content is in Workshop Room 3 at 4:00 in the afternoon. And maybe we will see some of you there.
Feel free to come with your questions to ask.
We will be outside right now for the coffee break. Enjoy the break and your stay in João Pessoa. Thank you.
(Applause)
(Session concludes at 10:47.)