IGF 2019 – Day 1 – Convention Hall I-D – OF #19 Human Rights And Digital Platforms

The following are the outputs of the real-time captioning taken during the Fourteenth Annual Meeting of the Internet Governance Forum (IGF) in Berlin, Germany, from 25 to 29 November 2019. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

>> MODERATOR: Good morning, ladies and gentlemen.  And welcome to this provocatively titled open forum on whether platforms and human rights are a contradiction in terms.

My name is Joe McNamee.  I've had the honor and privilege to work on Council of Europe working groups on the roles and responsibilities of intermediaries and, most recently, as co-rapporteur on a draft recommendation on automated decision making.

The past few years have been very exciting from a data protection perspective, with the entry into force of the reformed Convention on Personal Data, Convention 108+, as well as the first years of operation of Europe's General Data Protection Regulation.

In this panel, we will look with esteemed colleagues at the expansion of data protection rules internationally and the roles and responsibilities of businesses in relation to privacy and data protection.

In particular, we will look at our direction of travel.  How close are we to a global framework that works in practice as well as on paper?  A global framework that ensures respect for fundamental rights to privacy and data protection.  What is our destination and how close are we to it?

We need to be very conscious of the fact that without privacy we cannot have security.  Without privacy, we cannot speak freely, anonymously, or without fear of retribution.

Without the freedom to speak and move freely, we cannot associate freely.  If we cannot speak, associate, or move freely, we cannot hold power to account.  If we cannot hold power to account, our human rights are at the whim of authority.  Lack of privacy is therefore the very antithesis of human rights and democracy.

We will start with a round of introductory comments, and hopefully we'll be able to open up to some questions and answers after that.

Our speakers today are Jan Kleijssen, who is Director of the Information Society and Action against Crime Directorate of the Council of Europe.  Alexandria Walden is Global Human Rights and Free Expression Policy Counsel at Google.  Fanny Hidvegi is European Policy Manager at Access Now.  Rami Efrati is Senior Cyber Fellow at Tel Aviv University and former head of the Civilian Division of the Israel National Cyber Bureau in the Prime Minister's Office.  And Florence Raynal is Deputy Director, Head of the Department of European and International Affairs, at the CNIL.

So I will keep as quiet as I can about a subject that I'm very passionate about, and hand over the floor initially to Jan Kleijssen from the Council of Europe.

>> JAN KLEIJSSEN: Thank you very much.  Good morning, everyone.

In recent weeks, we have seen two very powerful statements amongst others on the issue that is the title of this session:  Human Rights and Digital Platforms --

One came from a person we would normally associate with these issues, namely the United Nations Special Rapporteur David Kaye, who came out with a report pointing out very strongly that respect for human rights on digital platforms leaves much to be desired.

The second statement came from someone we would not normally associate with these issues, a British actor by the name of Sacha Baron Cohen, who made a remarkable speech just a few days ago which has been widely circulated, and I think deservedly so, on the internet and social media.  And for those of you who haven't read David Kaye's report or haven't seen Sacha Baron Cohen's speech, I would very much encourage you to do so.

There was one phrase in the statement by Sacha Baron Cohen which struck me particularly, I must say, and which I would describe as debunking the myth that freedom of expression covers everything on social media.  And I think he rightly pointed out that those that deny or justify the Holocaust on social media are not offering an academic point of view.  They are preparing the next one.

On this note, I'm sitting here representing the Council of Europe, an international organization founded specifically to prevent the horrors of World War II from repeating themselves.  Our founding fathers include Winston Churchill and Konrad Adenauer.  We celebrate 70 years this year.  And in the course of those 70 years we have adopted, for our 47 member states, now some 850 million Europeans, some 200 treaties, including the European Convention on Human Rights, which a number of you may be well aware of.

It is a binding and enforceable treaty, and it covers human rights whether they are violated online or in the real world.  It applies to both.  And here I refer to the very nice booklet that we all received, the agenda for the 2020s.  One of the opening statements is by Vint Cerf, with whom I usually agree 100%, but in this case I would like to differ slightly, because he points out that in the field of privacy and internet governance we need enforceable treaties but that they are not yet there.

There I would slightly disagree.  These treaties do exist.  On cybercrime, there is the Budapest Convention which binds nearly 70 countries worldwide.  It originated in the Council of Europe but has gone far beyond Europe's borders.  And last year we were active with capacity building in the fight against cybercrime in more than 130 countries.

On data protection, there is Convention 108, which you already mentioned.  Convention 108 is a binding international treaty, enforceable by the parties to the treaty through a committee of the parties called the T-PD, in which now also nearly 70 countries cooperate.  There are nearly 60 parties, but more countries than parties already cooperate, preparing themselves for accession; that is about half of the countries in the world that have data protection legislation at all.

So there is a binding international treaty.  And, therefore, I would take this opportunity to encourage all of you from countries that have not yet adhered to this treaty, which was, by the way, recently modernized with input from civil society, academia, and other international organizations, to consider cooperating with us or acceding to it.

And I'd like to close, in order not to abuse my speaking time, with a reflection.

Law enforcement and data protection.  When we think about those two issues, we usually consider the restraints and the conditions under which law enforcement may use data.  This is, for instance, an issue being discussed at the moment in Strasbourg, where we speak about a protocol on cloud evidence to enable law enforcement to have easier and quicker access to evidence held in the cloud, because nowadays most cybercrimes go unpunished.  There is virtual impunity.  I think not even one case in a million actually leads to a conviction.

However, there is another issue regarding criminal law and data protection.  Our societies, and I speak here from the perhaps privileged position at the Council of Europe of someone responsible both for freedom of expression and for the fight against crime in our member states, our societies criminalize behavior that seriously harms individuals or society.

Is it therefore perhaps not time to start considering whether those who deliberately breach data protection regulations, who deliberately sell data or violate privacy provisions, should not also be held criminally responsible?  I leave you with this question, and I thank you very much for your attention.

>> MODERATOR: That's a very interesting final question.  I hope we come back to it in the discussion later.  Next we have Alexandria Walden from Google.

>> ALEXANDRIA WALDEN: Thank you.  Thank you for including us in the conversation today.

My expertise is in human rights, and I come from a background of working on civil and human rights and social justice issues.  And I bring that work and that experience to what I do at the company.

And so while there are thousands of people who work on privacy every day, my role is to look at how we approach human rights across the business.

So I just wanted to back up a little bit and talk about how Google approaches human rights.  From our perspective, we believe in technology's power and potential to have a profound effect and positive impact on the world.  In everything that we do, we're guided by internationally recognized human rights standards, and we're committed to respecting the rights enumerated in the Universal Declaration of Human Rights and its implementing treaties.

An important part of that for us is the UN Guiding Principles on Business and Human Rights and the Global Network Initiative Principles.

That informs the way in which we operationalize these commitments across our business.  In addition to actively harnessing the power of technology to respect and advance human rights and create opportunities for people across the globe, we are committed to responsible decision making around emerging technologies.

This approach includes several important pieces in terms of the way we integrate these issues across the business.  One piece that I think is critically important to the way companies address these issues, both in how they design products and in how they engage with governments and contribute to thinking around policy, is executive commitment to human rights and to engaging on these issues.

Another important piece is internal processes for conducting human rights due diligence and human rights impact assessments.

Lastly, it is important for companies, and specifically for Google, to do external engagement and consultation with experts around how we develop our policy positions, our products, and their features.

And so if you take that as the foundation for how we approach these issues, I would like to just kind of harken back to what Jan said about some of the key issues that we're focused on and facing in the world today, both in the realm of privacy and in the realm of free expression.

We come to the table to engage with stakeholders around the way these problems are actually sort of emerging and evolving to ensure that what we are doing with our products is actually addressing the problems as our users are experiencing them, and as governments are experiencing them as well.

So, just -- I guess I will say in closing that I think it is important for us as we talk about what companies are doing in this area and how companies are maintaining their commitment to human rights to always tie that back to the UNGPs and ensure that we are having a conversation around the UNGPs that is evolving alongside the way that we are viewing these issues in the world.  Thanks.

>> MODERATOR: Thank you very much.  We will pass the floor straight to Rami Efrati from Tel Aviv University.

>> RAMI EFRATI: Shalom, good morning.  My name is Rami Efrati, I'm coming from Israel.  I'm the self-nominated cyber ambassador of Israel.

And since I don't know how many of you have been to Israel, I would just like you to raise your hands if you visited Israel because I'm going -- excellent.

So, speaking about privacy: whenever you come to Israel, the first thing you find, coming into any supermarket or any cinema, is a guard looking at your basket, or looking at your clothes, or looking at you, in order to find out whether you are a terrorist or not.

So I would like to discuss very quickly and very briefly the main question, which is: what is the right to privacy in data protection terms?  Is it also valid when we are speaking of terrorist activities, and what is the right way to communicate with the digital platforms?

Just to make it very, very clear, in Israel we have two main organizations dealing with cyber and privacy.  One is the Privacy Authority; the second one is the Israeli National Cyber Authority.  They go together.

Speaking about privacy cannot be done unless you are also dealing very well with cyber.  Therefore, the government of the State of Israel also decided to start with what is called a cyber law, because without a cyber law that also takes care of these privacy issues, we believe you cannot work in the right way.  We look at ourselves as a leading country both in cyber and in privacy.

And you'll be surprised, but the GDPR came to play a very important role in our life.  But our life is totally different from that of most of you.

We just heard from Jan what Sacha Baron Cohen said.  We can speak about security; we can speak about the role of the digital platforms when we are speaking about antisemitism; and we can speak about it when we are speaking about money laundering and pedophiles as well.

So what is the question and what is the way that we should deal with it?

The digital platforms play a very important role not only when you have to protect yourself but also if you are a terrorist.  And, unfortunately, we found out that terrorists are using most of these digital platforms against privacy.

When a terrorist is using a digital platform, he knows very well that it is open to the public.  And when it is open to the public, he also understands that it is difficult for the law enforcement agencies to deal with it.

What I would like to highlight is the question of what tools the government has to give to law enforcement agencies in order to deal with terrorism or antiterrorism when the main platform is a digital one.

And I will be more than happy to answer questions about this one later.  Thank you.

>> MODERATOR: We have some very well-behaved panelists who are staying well within their time, so great.  That's not to put pressure on Fanny.  Fanny Hidvegi from Access Now.

>> FANNY HIDVEGI: Thank you very much.  I will behave, too.

Thank you very much for being here.  My name is Fanny, I'm the European Policy Manager of Access Now.  Access Now is a global human rights organization.  We work at the intersection of human rights and technology so we are engaging on topics like privacy and data protection, freedom of expression, artificial intelligence, cyber security and more.  So this couldn't be more timely for us.

And I'm based in Brussels.  One of our key topics in the past few years has been the adoption of the General Data Protection Regulation, and my colleagues work on it, so I'm really glad that the panelists are addressing that topic.

Going back to the title of the session for a second, in contractual terms: in our view, contractual terms are not enough to provide adequate prevention, mitigation, and redress even for normal users of platforms and services like Facebook, for instance, much less in the event of misuse and abuse.

We need incentives and we need business models that respect human rights.  Companies have the responsibility both to know about the impact of their products and services on human rights by conducting due diligence and working with outside stakeholders, but also to demonstrate that they are taking meaningful measures to prevent and mitigate these adverse effects.

On the government side, we often talk about the obligation not to interfere with fundamental rights.  But we have to mention the positive obligations as well: states need to create an environment for the full enjoyment of human rights.

This panel focuses on business models of platforms mostly, but when we mention human rights and companies we must also account for different types of violations.  So I want to highlight that companies such as the NSO Group and The Hacking Team make it possible for repressive regimes to target those who oppose them in order to stifle dissent.

The covert nature of targeted spyware makes it the tool of choice for authoritarians.  On the other hand, we see the role of the big companies taking actions like the WhatsApp litigation that they brought in a California court.

When we talk about governments' and companies' responsibilities and obligations at the moment, and maybe that ideal scenario that you asked for, we are failing on both ends.

The session description rightly mentions the Cambridge Analytica Facebook scandal as a key moment.  That scandal was a foreseeable consequence of a business model of harvesting and exploiting data and not respecting privacy.  It created a momentum, maybe, to mainstream the urgent need for the enforcement of privacy and data protection rules, or the adoption of comprehensive frameworks in areas where they don't exist.

In contrast to the scale of the revelations, however, it has not yet led to meaningful reforms.  It has translated into political talking points about addressing disinformation, mostly through self-regulatory measures, but no systematic reform strengthening safeguards against micro-targeting.

To bring in the European example of the way such revelations helped move the needle in the adoption of the GDPR: it was just last week that we pronounced the e-Privacy reform dead, or a zombie at best.  The European Union must follow through and complete the reform after the GDPR to provide protections against online tracking and to ensure the confidentiality of electronic communications.

I'm looking forward to discussing all of these topics at once in 60 minutes, and how we will solve them.  And thank you very much, once again.

>> MODERATOR: Thank you very much.  And finally, we have Florence Raynal from the French National Data Protection Authority, CNIL.

>> FLORENCE RAYNAL: Hello, everybody.  I'm Florence Raynal, I'm working for the CNIL.

I'm very honored to be with you today.  Just in case, let me recall what the CNIL is: it's the French Data Protection Authority.  We regulate data protection in France.  Our role is to advise companies but also public bodies on complying with the GDPR and French law.  And we also enforce the law on our territory.

Digital platforms are truly an interesting case study from the privacy point of view.  Indeed, in today's world it is technically possible to collect enormous amounts of data, but there are privacy risks associated with this.  Accumulation can offer greater opportunities for data combination and a big potential for profiling, which we need to be very cautious about.

In certain cases, new techniques are used, such as artificial intelligence, facial recognition, and other automated systems, which must be carefully framed because they raise privacy challenges.

These can lead to blacklisting, discrimination, and abusive decisions for people.

We also see phenomena such as the creation of big data reservoirs where companies can pick and choose, leading to the development of a huge data market without much control.  This needs to be done with a legal basis, in a transparent way, and with possibilities for people to control this collection and the reuse of their data.

It also raises other privacy issues such as data retention and security.  Another issue is the qualification of the responsible parties, which is crucial in order to identify who is responsible for what.  The GDPR provides tools for people to better control their digital life: tools to exercise their rights, the right to object, the right to be informed, the right to erasure, and the right to portability.  Portability is a very important right that also helps to rebalance the asymmetrical relationship with companies.

The GDPR also imposes duties on companies in the way they process data, and I would just like to mention here Article 22 on profiling, which is a very important provision with respect to data combination and the use of new technologies to create profiles.

As the GDPR gives a robust framework to these new practices, it also provides a way for digital platforms, and more generally for public bodies and companies, to develop policies that correspond to users' expectations on their privacy.  In the end this can also be good for business, providing trust to customers and, ultimately, a good business model.

With respect to all these types of processing that we see happening, we are issuing guidelines and tools to accompany businesses in complying with the GDPR.  For example, we recently developed a DPIA tool that can really be seen as a success as a compliance tool.  And we are also taking enforcement action.  Maybe you have seen some actions with respect to Facebook and Google concerning data combination and lack of transparency, but also other platforms such as BlaBlaCar and 18 websites where we found security, transparency, and consent issues.

There is also an important factor linked to the geography of some of these platforms and big internet players that are not necessarily located in the EU, and where data is transferred, stored, and reused.  In these cases, the GDPR also brings a very clear policy message with Article 3 and the territorial scope of the GDPR.

To summarize this, if you do business in the EU, you must respect EU rules.  Either because the company is established in the EU or because the business is targeting the EU markets.

And it is a very important provision which in a certain way puts EU and non-EU actors on an equal footing if they target European markets.

The EDPB, the European Union body gathering the 28 national Data Protection Authorities, has recently issued some further guidelines with respect to this territorial scope.  But we need to go further than that; we need to be more ambitious, because global issues in fact need global solutions.

In that line, we truly support Convention 108+ as a possible instrument to resolve conflicts of laws and to offer a common solution.

As Jan said, it is the only binding instrument that exists today at regional level that is open to third countries.  So it can really be seen as an international instrument and not only a regional instrument, covering both the public and private sectors and also intelligence processing, which is very important, as we have seen today.

It is also well articulated with the GDPR, including the adequacy decisions taken by the Commission, and it creates a great forum of cooperation between data protection authorities and governments at international level.

Going back to the title, Contradiction in Terms: we really think that we can have a kind of win-win situation where privacy is good for business because it is good for people.  Privacy should be seen as a chance, an opportunity to create trust and to improve the quality of services, because in the end, again, it respects the expectations of users on privacy and fundamental rights.  Just to finish: we think we have a common interest in avoiding the contradiction.

>> MODERATOR: Thank you very much.  One of the questions that we were asked to consider in preparing this panel was what should governments do?

And I think we had the full range of possibilities proposed by our five speakers.  Jan pointed to Convention 108 and the GDPR as strong pieces of international legislation, wished for broader and continued take-up of Convention 108, and reflected on the possible need for criminal sanctions.

Fanny wanted the law enforced more effectively and enhanced with e-privacy rules.

Florence pointed to the reinforcement of the GDPR with tools and guidelines.

Rami talked about reinforcing privacy by stopping criminals from abusing privacy online.  And Alexandria pointed to non-governmental, multi-stakeholder engagement to achieve our goals.

I would like to ask our panelists if they want to come back on any of the comments made by their fellow panelists before looking to see if there are questions from the audience.

Okay.  The floor is now open to the audience.  We start with questions straight in front of me.  If you could introduce yourself, please.

>> STEVE DELBIANCO: Thank you.  Steve DelBianco with NetChoice.

The question is framed equally to the digital platforms represented here, Alexandria, as well as to the governments; and if there were courts here, I would love to understand their view, too.

When platforms adopt the Universal Declaration of Human Rights, how shall a platform balance two rights that are in conflict with each other?

And the example I would give you is the right to be forgotten, which is sort of an exercise of Article 12 of the Declaration, on privacy, against Article 19, which says that humans have the right to seek and receive information and ideas through any media.  So I seek to know whether to lend money to an individual, but that individual is using the right to be forgotten to deny me the ability to know that they have been bankrupt.  Or a doctor whose license is revoked, or a childcare provider who has had a criminal conviction.

You get my drift, I'm trying to come up with an example and ask for help on how to balance human rights that are in conflict.  Thank you.

>> MODERATOR: I think I would be interested in Alexandria coming back on that.  And in particular, whether you think it's likely that Google would choose to implement a right-to-be-forgotten decision in cases where somebody would be harmed because the requester didn't have a proportionate claim to be forgotten in the first place.

Facebook -- I'm sorry, Google -- needs to deindex or delink content in situations where it would be unfair on the individual to have certain search results come up.  And a harm like the one the gentleman just described would not be compatible with that.

So do you see a challenge in that for you?

>> ALEXANDRIA WALDEN: Well, as with all things in human rights, there are oftentimes not simple answers or solutions.

Especially when rights are, or appear to be, in tension.  But I think that's a little bit of the beauty of the UN Guiding Principles on Business and Human Rights: they refer back to the government's duty to protect human rights, and they refer to a company's responsibility to respect them.

And so ultimately what that does is point out that there are certainly actions that governments should take, and that companies have to think about how they respect the law in the countries where they do business.  In addition, companies have to do their own balancing, their own human rights due diligence, to understand how their products are actually operating in the real world.  And we ask whether there are ways we should be thinking about how we design our products and the features they include, to ensure that the way users engage with them enables them to have choice and control, so that we can be rights-respecting on the company side.

But all of that requires, on the government side, strong rule of law, clarity, and respect for human rights, and it requires companies to do the same.  And then we can sort of work together to deal with these issues.

I think it has been interesting to see what has happened in Europe around the right to be forgotten.  Google did challenge that; we have a long history of challenging that issue in court.  And when it became clear that that was going to be the law of the land for Europe, we respected the rule of law and complied, and we created a process by which we comply with the law and allow users to appeal directly to us as part of that mechanism.

So I do think that where the issues are most challenging, it requires significant multi-stakeholder dialogue, with governments, companies, and society all at the table.

>> MODERATOR: And I'm going to force myself not to comment because it is a subject that I'm very passionate about.

Do we have -- oh, we have one, two, three -- okay, we will take them in order then.  That way, and then Rami.

>> BEEBAN KIDRON: Good morning.  My name is Beeban Kidron, and I'm from the House of Lords in the UK, where I sit as an independent.

I'm just interested to hear the panel's views about the failure to uphold children's rights in online situations.  Not least the fact that a child is a person under the age of 18, and yet we have a de facto age of adulthood online of 13, based on a piece of very old-fashioned law in the U.S., COPPA.

And so I'd really like to know how you imagine that children's rights could be normatively observed by the platforms.  And I should declare an interest, in that we are currently undertaking a general comment on the Convention on the Rights of the Child for the digital world.  Thank you.

>> MODERATOR: That's a very interesting question.  I look forward to the answers.

There were two further questions behind you and further back.

>> ALEXANDER MALKEVICH: Thank you.  Alexander Malkevich from the Civic Chamber of Russian Federation.

I wanted to mention the problem of censorship on the big social networks such as Facebook, Twitter, and Google.  You know that they perform extensive blocks or even delete accounts whose owners are just trying to express their political views.

And how can we demand from Facebook, from Google, from Twitter that they publish the relevant stop lists of words that are forbidden on these platforms and, in general, put an end to politically based censorship?

>> MODERATOR: Thank you very much.  Next in line, and then Rami.

>> WOLFGANG BENEDEK: Wolfgang Benedek from the University of Graz.  My question would be a good follow-up to the previous one.

I wanted to have your views on the recent Glawischnig case against Facebook at the European Court of Justice, according to which harmful content, defamation or hate speech, has to be removed, including identical and equivalent content, and this on a worldwide scale.

So on the one side, that can be considered a big move forward in forcing platforms to adopt policies against hate speech and to protect the reputation of politicians.  On the other side, there are obvious downsides if this is interpreted as going towards censorship, as the colleague before just mentioned.

If you allow me a second question: the Council of Europe was quite successful in developing guidelines on the liability of platforms and intermediaries.

My question would be how is the process of implementation of these guidelines being monitored?  Is there any form of monitoring possible?  Thank you.

>> MODERATOR: Thank you very much.  I think we have about three sessions' worth of questions just there.

I think we should answer those before coming back to another round.  I particularly like the last question.  But does anybody want to go first or should we take it in order of speakers if nobody is jumping in?  Okay.  We will start with Jan then.

>> RAMI EFRATI: I just wanted to respond to your question, which I think was excellent.  It's all a question of laws.  Would we like to forget the Holocaust, or to let it go?  Or maybe we would like to delete the picture of a child who was abused?  Should we look at a terrorist who posted his will on a digital platform and influenced hundreds of other people who are going to kill any one of us?  80 people were killed in Nice, France, by a truck driver who was influenced in this way.

So, first of all, it is a question of norms, which I know is not easy to talk about, and it is not easy to come up with an idea.

I believe that the most important way to deal with Facebook, Google, and the others is to try to arrive together at agreed norms, in order to come up with a solution for how to do it.  Meaning, if you want to delete something, and you have the right not to do it as well, let's arrive at the norms together with these organizations.

If you do it only by regulation, by law, or by somebody in any one country, it will never work.  We have to team up and work together.  I think this is the best way, at least from what I found when I started to deal with these companies on cyber and other things.

Because when you are dealing with a terrorist, the same question also arises for the digital platform: are we dealing with the content, or are we trying to make sure that it is a sanitized network?  So this is what I can say on that.

>> MODERATOR: Okay.  Jan?

>> JAN KLEIJSSEN: Thank you very much.  Perhaps 10 seconds per question, given the time.

First of all, on the balancing of rights by platforms, I would like to point out that there are, of course, standards.  There are legal standards with clear case law.  And in the end, it will be for a court to decide if there is a dispute.

Children's rights: I fully agree there is much that needs to be done.  Censorship on social platforms and political propaganda: one answer could certainly also be for governments to promote much more public service internet, so that citizens would be able to rely on quality journalism and investigative journalism on the internet, supported by our governments.

Harmful content to be removed, the case by the ECJ: you raised the question there of censorship, but if it is the implementation of a court judgment, in this case by an international tribunal, I think that at least guarantees that it is not just unwarranted censorship.

As to the necessity to remove certain content, I would again refer to the speech of Sacha Baron Cohen.  On the liability of platforms, the guidelines are being assessed at committee level, but they are also considered by the European Court of Human Rights, and I would expect there will be cases where this recommendation, these guidelines, will be brought up as a sort of soft law.

>> MODERATOR: Thank you.  Florence and then Fanny.

>> FLORENCE RAYNAL: Thank you.  I would like to react to the question related to children, because we think it is a very, very important topic.

In the GDPR there are specific provisions with respect to consent given by children to the processing of their personal data.  That is linked to a certain age, which needs to be defined at national level, but it shows that there is for sure a maturity aspect.  Very important on that point.

We have done huge work at the CNIL level on children's rights, but also at international level with the International Conference of Data Protection and Privacy Commissioners.  We have done, for example, some analysis of different legislation on the exercise of rights, and we have also produced a kind of reference model for the education community, in a certain way train-the-trainer material, in order to help people in contact with kids make them aware of their rights.

It raises a lot of issues.  It raises the issue of the scope of the exercise of rights by kids: how far, you know, what kind of rights can children precisely exercise?  It also raises the question of the role of authorities, whether or not they can handle complaints by kids, and at which age.  And there is the problem of verification: mechanisms to verify age and how companies can put that into practice.

And we definitely think that there is huge work done by other international organizations like the OECD, the Council of Europe, and the UN that we are trying to follow as much as possible.  And we hope that we will be able to influence the discussion at UN level in order to better frame the exercise of rights in the digital environment by children.

>> FANNY HIDVEGI: Let me respond to the Facebook case question.

And so I know it is quite well known, but maybe it is worthwhile to offer a little bit of context for the people who have not read that case, just so they know what it is about.

As a starting point, it is not about hate speech but defamation.  The case started back in 2016, when a Facebook user posted an article featuring the photo of a very well-known Austrian politician and used slurs calling her a corrupt oaf, a lousy traitor, and a member of a fascist party.

What happened throughout the litigation is that, in the end, Facebook was ordered to remove the post, but there was a huge legal question to decide: whether Facebook should remove only identical content going forward, or also equivalent content.

And that is the major legal question that is now being discussed.  The Court opened up the whole possibility of a general monitoring obligation and the use of automated tools in that context, which is the most problematic part of the decision.

And in the European context, we will see how this is being addressed in the potential opening of the E-Commerce Directive or in the DSA, the new Digital Services Act.

The problem we see here is that this case might have really negative implications for freedom of expression, but also for the freedom to form an opinion, which is, by the way, an absolute right, and that has quite different legal implications.

Automated tools have been demonstrated not to work at picking up on context and making the kind of human rights determination that is already problematic even for courts and people.

And it also might create a general monitoring obligation that violates universal human rights.

>> MODERATOR: Thank you.  We have time for maybe two more quick questions, and we've already got four.  Those two hands went up first.

>> ANKE DOMSCHEIT-BERG: My name is Anke Domscheit-Berg.  I'm a member of the German Parliament and internet policy spokesperson for The Left Party.  I have a two-fold question.

The first is that we have really big issues in Germany because of the lack of national responsible contacts for the digital platforms.

I'm not talking about nice buildings, PR offices, and events taking place there; those do exist.  But there is no address where you can deliver a court order or where a lawyer can send official letters.  They just don't take them.

And then they refer you to Ireland or even to the United States headquarters, and you never hear from them again.  With Twitter, for example, we have had an issue since May: Twitter blocked a Green politician in the middle of political campaigning for a joke he made on Twitter.  He is still blocked, although two court rulings have already ordered that Twitter has to reopen the account; they just don't do it.  And they refuse to take the court letters.

So how do we deal with this, and is there an option to make European legislation forcing them to have a legally binding delivery address?

And the second is about, for example, digital violence against women on those platforms.  The first issue we have is that those platforms don't take this seriously enough.  But the second issue is that we have a serious lack of capability in law enforcement, in all countries I know of, definitely in Germany.  It is also a largely unpunished crime, mainly because you don't have police and justice personnel who know what to do.

So what could be done to help this?  In Germany we are talking about creating a specialist police authority where at least you have some trained people to deal with these issues.  Are there other ideas and how is it dealt with in other countries?  Thank you.

>> AUDIENCE: So our question is about -- our concern is that tech companies are developing policies and practices with states that are known to be violating the human rights and digital rights of some of the world's most vulnerable, including occupied peoples.

And we want to know how companies like Google are working to ensure that policies and practices they develop do not enable states to engage in illegal occupation and to commit further human rights abuses and war crimes.

>> MODERATOR: And finally, if you can -- yep.

>> AUDIENCE: Thank you very much. (Speaking non-English language)

>> MODERATOR: Thank you very much.  The last intervention, for those of you who have made the terrible life choice of not learning French, was a request for translations of the interventions of our excellent panelists in order to help decision makers in Chad, who are working on a legal framework and who, the speaker said, feel somewhat delayed in addressing these challenges.

So quickly, if we can make a final set of responses from the panel.  And then I will try to do a wrap-up of this very diverse discussion.

>> ALEXANDRIA WALDEN:  Working our way back down from this end now.  So I will address a few of the questions that came up.

With respect to enabling states to commit war crimes, I think that gets back to what I was saying at the start with respect to the Universal Declaration and the UN Guiding Principles, and to companies embedding them: really figuring out how to operationalize them in ways that allow them to do due diligence across their business, both in how they respond to law enforcement requests and in how they launch products and who they sell them to.

On the topic of digital violence against women, I can tell you that platforms do take this issue seriously.  At Google, we have a variety of policies and products that seek to address some of the ways these harms can manifest.  It is certainly not a panacea, but our policy against hate on YouTube includes gender and gender identity.  That means that when someone is inciting hatred based on that protected characteristic, content is removed on that basis.

We also have a policy against harassment, which is focused on when the content targets or threatens an individual.  And we have been clear that we are currently evaluating that policy to see if there is a need to revise and tighten it up.

And then lastly, I had some comments for the gentleman who raised censorship issues.  I just wanted to clarify that we as companies are committed to freedom of expression, especially as it is articulated in Article 19.

There is freedom of expression and there are legitimate restrictions on that right.  We ensure that we follow the way the law is playing out with respect to freedom of expression where we do business, and that we understand how the courts work and what the rule of law looks like in a given country, so that we can understand how we operate there.

That is with regard to how we respect the law.  But we also have policies in place to ensure that our users understand our policies.  We have to be clear about those.  We have to have appeal mechanisms in place.

And then lastly, we need to be transparent about what we remove both with respect to government requests to remove content and with respect to what we remove under our own policies.

>> FANNY HIDVEGI: Thank you.  As a wrap-up, I just wanted to highlight that there is an overarching demand for all of these rights implications, whether privacy or data protection, to be addressed in a systematic way, because this is a business model question and an underlying core question.

And whether it is competition and markets, various export control measures, or content governance, including moderation but also creation and design, we need to ensure that companies follow human rights norms.

>> MODERATOR: In a predictable way.

>> FANNY HIDVEGI: Yes.

>> FLORENCE RAYNAL: Very, very quickly, because I know that we are late.  Just to answer your question, although it will not be a full answer, because it will be just on the GDPR aspects, on representation, not necessarily on the removal of content.

The GDPR organizes a coordinated way of responding to infringements of the GDPR on EU territory, built around a lead DPA that instructs the case in coordination with the other authorities.  It's called the one-stop-shop system.

So for the companies that are established, that have an establishment in the EU, we have a system of cooperation among ourselves in order to address violations and infringements and to sanction them.

We are putting that into practice right now.

>> JAN KLEIJSSEN: Given that we are nearly out of time, one sentence as a wrap-up: we are certainly not short of standards, but we certainly must also do a lot better to ensure they are implemented.

>> MODERATOR: And thank you very much to the panel.  I personally find it kind of shocking that we are asking quite fundamental questions in 2019.

I think that the first question and one of the last questions are at the core of a lot of these issues: when are the platforms doing too much, and when are they not doing enough?

The Council of Europe recommendation on the roles and responsibilities of internet intermediaries is a very important document that should be the first step in trying to find answers to this.

It is not acceptable that legal content is removed; it is not acceptable that a legitimate parliamentarian is taken offline.  And we need to dig into the basic principle of international law that restrictions have to be predictable.  Why, in 2019, are we still talking about unpredictable decision-making?

It is bewildering.  But I think we can leave this room with at least this: if you know you have a problem, you can start finding a solution.  If we recognize this as a problem that needs to be addressed, and build on the Council of Europe's very good work in this area, then maybe we are finally heading in the right direction.

Thank you very much to all of the questioners.  Apologies to the questioners who didn't get to ask their questions.  Thank you to a very good panel with very good insights.  Thank you very much.  And see you soon.