FINISHED COPY
NINTH ANNUAL MEETING OF THE
INTERNET GOVERNANCE FORUM 2014
ISTANBUL, TURKEY
"CONNECTING CONTINENTS FOR ENHANCED
MULTI‑STAKEHOLDER INTERNET GOVERNANCE"
04 SEPTEMBER 2014
9:00
WS 158
PROMOTING PLATFORM RESPONSIBILITY FOR CONTENT MANAGEMENT
***
This is the output of the real‑time captioning taken during the IGF 2014 Istanbul, Turkey, meetings. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record.
***
>> MODERATOR: Good morning, welcome. We're going to start in a couple of minutes. Welcome. Hello.
Good morning, welcome. Don't feel you need to sit at the back. Come on up to the front. We don't bite, really.
Okay, shall we begin? Shall we get started? Thank you, Nicolo.
>> NICOLO ZINGALES: Good morning, everyone. My name is Nicolo Zingales, and I want to welcome you to this workshop, which is going to address a concept that I think is very important in the development of the social and legal norms that apply to online platforms. The reason I talk about social and legal norms is that this workshop is going to look beyond the strict legal requirements that online platforms must meet in order to escape liability for user‑generated content. It is going to look into the social expectations that people have regarding the role and responsibility that these platforms should play, and the role that other stakeholders play, in the management of online content.
So for example, some of the questions that go together with the responsibility of platforms are: should platforms play a role of proactive enforcement with regard to a particular set of rights? Should they provide a balanced system to adjudicate conflicting rights? And should they ensure the protection of fundamental rights at all times? Obviously these concepts are closely connected with the idea of intermediary liability, not least because the way intermediary liability legislation is drafted influences the behavior of these online platforms. If there is no specific intermediary liability legislation, platforms usually have the discretion to adopt Terms of Service policies that might not be seen as responsible with regard to a particular set of rights; likewise, if there are due diligence obligations in the intermediary liability legislation, platforms can be required to fulfill those due diligence requirements in order to obtain immunity from liability.
So in these two ways, intermediary liability can influence responsibility, but the two concepts should be seen as distinct. That's why we are going to focus on what Terms of Service can do to shape the responsibility of online platforms. In this regard we have a packed lineup of speakers, and we want to hear what they think, from their quite diverse perspectives given the different organisations from which they come, about the role and responsibility of stakeholders.
So we're going to start from my right. We have Konstantinos Komaitis ‑‑ oh, okay. First we have Robin Gross from IP Justice. Then Konstantinos Komaitis from ISOC and Paolo Lanteri from WIPO. On my left we have Marco Pancini from Google, Janine Moolman from APC Women and Nick Suzor from Queensland University of Technology. And we have a co‑moderator, Joy, from APC.
So I guess we're going to start looking into the issue of copyright, and then we're going to slowly move into another area of online content management, which is what companies, what online platforms do with regard to potentially offensive speech, so we're going to start first on my right.
>> ROBIN GROSS: All right. So on the subject of copyright and intermediary liability, from the perspective of the users of the platform, I think it's important that we don't have rules that require the platform itself to be responsible for policing and controlling the activities of the users on the system. When we shift that burden over to the Internet service provider or the content provider ‑‑ or excuse me, the content platform provider, we risk chilling innovation, as businesses are going to be less willing to try new business models if they're going to be held legally responsible for what their customers are doing. I think it can also chill freedom of expression to have this kind of policing and controlling of users' activity going on. So from the perspective of the user, I think it's important that we leave some space for innovation and for freedom of expression, and not go too far with requiring intermediaries to police and control the activities of their users.
>> NICOLO ZINGALES: Thank you. I forgot to mention that you have to be short. Thank you for doing that spontaneously. Let's move to Konstantinos. As I understand, you don't want to focus just on ‑‑
>> KONSTANTINOS KOMAITIS: Hello, good morning. It's great to see so many of you even though it's this early in the morning. I'm going to pick up something that you mentioned about social expectations, by which essentially we mean user expectations. If we expand that, and start trying to come to a clearer understanding of the role that platforms, whatever their function is, perform within the Internet, we need to understand that over the years the Internet has assigned roles to each and every stakeholder, and these roles have to be respected and have to be very clear. Robin mentioned policing. Platforms should not be policing the Internet. I have to take a step back and say that users have certain expectations of all the actors participating in this thing that we call the Internet. In the context of platforms, they are expecting that they will be protected. Now, some might be thinking of privacy, some of free speech, and others of the way their kids are protected. So when we're talking about platform responsibility, I want us to put it a little bit in the context of the things that platforms have to employ, whether technical or social, the norms that make sure the Internet functions in a balanced way and that don't upset the dynamics that have already been established. So let me make only that point and then we can deliberate on it. Thanks.
>> NICOLO ZINGALES: Thank you. So Paolo?
>> PAOLO LANTERI: Good morning, everyone, thank you very much, Nicolo. Let me start by disappointing you and all the audience in several ways.
First of all, I was not planning to be that short, and the reason is that WIPO comes here with very few answers and a lot of questions. WIPO is a member‑driven organisation: we have 157 Member States that decide what we should do, and I don't think it would surprise anyone that governments are not ready, don't feel comfortable, putting the issue of ISP liability in a framework such as WIPO. So I don't have a position, I don't have an opinion. Nevertheless, I can share with you several considerations on how the framework in place can influence this debate and can be useful in seeing a way forward.
So if you allow me, I will go with this; otherwise, I'll make it as short as possible. First of all, the language of responsibility and the concept of platforms are something we look at with great interest, and it's a similar approach to the one WIPO has taken from the beginning: at least 15 years ago we were already talking about responsibility instead of liability. We see responsibility as a broader concept that includes liability, so maybe in this way it's slightly different from the focus of this workshop. Also, the concept of intermediaries is very broad and can be similar or identical to the concept of platforms, which raises, by the way, the first question we have: what do we consider a platform to be, for the focus of this debate?
So what does WIPO do in this area? Whatever our Member States ask us to do. The only guiding principles we have in trying to answer a specific policy question are the international treaties that we administer, international treaties on copyright, of course; this is very much focused on that. What guidance do those international treaties give us? I need to walk you through some boring legal steps. There are five or six major aspects. First of all, we have a long list of rights that we consider need to be granted to right owners under copyright. At the same time, possibly the most quoted provision is an agreed statement on the same Article that recognizes the right of making available over interactive networks, a right that is often infringed on peer‑to‑peer and sharing platforms. That agreed statement says that if a third party merely provides physical facilities for making this communication to the public possible, this action does not in itself amount to a communication to the public. This may fall outside the specific debate, but it gives you a clear indication of the hands‑off approach of these treaties: they are not focusing on intermediaries. Then we have provisions on enforcement. Again, they are flexible and allow Member States to set up their own legal frameworks. In fact, all the systems currently in place, like the European directive's horizontal approach, the vertical approach in the U.S., and the various graduated response schemes, are all in line with the Internet treaties.
However, back at the beginning of the 2000s, the majority interpretation of these provisions provided some clear guidance: immunities are indispensable, cooperation among players is possibly the best solution, and at the same time we need to maintain incentives for creation. And this is on the side of the rights.
On the other hand, we have important provisions on limitations and exceptions. Limitations and exceptions are the area where copyright already builds in a balance between different public interests. Many of those public interests reflect human rights, and several provisions in national copyright legislation already provide a balance for the benefit of persons with disabilities, access to knowledge, and education and research. Furthermore, there are also specific indications that technological protection measures and rights management should not hinder the effectiveness of that free access to content, thereby indirectly protecting public interests and fundamental rights. So, summarizing: there is great flexibility in how to implement the treaties; enforcement should also take account of limitations and exceptions; stakeholder agreements are the ideal solution; and technology needs to be part of that solution. Agreements and technology have proven very successful in other fields, and we can further discuss that together with the panel.
Finally, Member States have been discussing several issues in another stream of work focused on enforcement, the Advisory Committee on Copyright ‑‑ sorry, the Advisory Committee on Enforcement. This committee's mandate is exclusively to exchange information. However, in this debate there is a growing interest in education, alternative dispute resolution and other softer kinds of enforcement measures. If we have time later, I found last night some very important inputs coming from documents that have been shared by governments and discussed in the context of WIPO quite recently, which talk about the drafting of a possible policy recommendation.
So in that debate, some of the issues we are discussing here have already been raised; for instance, there is a clear trend towards voluntary remedies, and here I want to focus only on two aspects. There are problems, of course: those voluntary remedies are not perfect, as already mentioned. They are broadly similar in their requirements, and the remedy, usually the takedown of the content, which can be done by removing it or blocking access to it, is a burden on the platform, while the request for the remedy is the burden of the right owners.
So the focus here is: what is pushing a platform to move towards voluntary agreements? Well, according to some Member States in our committee, the motivation can certainly be education, it can certainly be the integrity of the business, but at the end of the day it's mostly still about avoiding liability. So do we need law, is law required, or can we just get stakeholders agreeing on a concrete solution?
Sorry for not being short.
>> NICOLO ZINGALES: Thank you, Paolo. Actually I'm happy that you made two very important points. One is regarding exceptions and limitations: I think the responsibility of platforms is not only to ensure enforcement of rights, but also to maintain the same system of exceptions and limitations that is in place outside the platform, in the traditional copyright system. And the second point is regarding technology, and how technology today is used to enforce the terms and conditions of these platforms. This is also a nice segue to the next speaker, who is from Google. Google has in place a very effective system for the detection and assessment of the legality of content, and for enforcement, which is Content ID. In this way they do what in the traditional setting would require several steps, and they do it very efficiently, while at the same time allowing right holders to profit from situations where an infringement is identified. So how can we ensure that these systems also take into account limitations and exceptions and further the public interest?
>> MARCO PANCINI: Thanks, Nicolo. For those who don't know me, I am Marco Pancini from the Google office in Brussels. Let me start with two important things we cannot forget before we get into the details. First is the free flow of information, which is the reason why we are all here and the reason the Internet is such a successful medium; all the positive impact of the free flow of information is there before our eyes. And second is that a company like Google wants to be a responsible player in the ecosystem, towards our users, towards our partners, towards all the other actors in the ecosystem. If you look at this debate through these two pillars, you can see that there is not a tussle, but for sure there is a balance that we need to find. And I think technology can help a lot in finding this balance, because technology, especially when used intelligently and with a clear target, as in the case of Content ID, can be very beneficial. It basically allows us to use the algorithms that we use in other contexts in order to provide maximum flexibility to the right owners, so they can protect their content, monetize their content, or simply monitor what's going on with their content on a platform like YouTube.
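To make the matching‑plus‑policy pipeline described above concrete, here is a minimal, purely illustrative Python sketch: uploaded content is matched against a database of right‑holder references, and the right holder's chosen policy (block, monetize, or track) is applied. Every name in it is hypothetical, and the exact‑hash matcher is a stand‑in; the real Content ID relies on proprietary perceptual audio and video fingerprinting at a very different scale.

```python
# Purely illustrative sketch of a Content ID-style matcher.
# Real systems use robust perceptual audio/video fingerprints,
# not exact hashes; every name here is hypothetical.
from __future__ import annotations

import hashlib
from dataclasses import dataclass
from enum import Enum


class Policy(Enum):
    BLOCK = "block"        # remove the matching upload from the platform
    MONETIZE = "monetize"  # leave it up, route ad revenue to the right holder
    TRACK = "track"        # leave it up, report viewing statistics only


@dataclass
class ReferenceEntry:
    owner: str
    policy: Policy


def fingerprint(segment: bytes) -> str:
    """Stand-in for a perceptual fingerprint (here just an exact hash)."""
    return hashlib.sha256(segment).hexdigest()


class ReferenceDatabase:
    """Right holders register reference content and choose a policy."""

    def __init__(self) -> None:
        self._index: dict[str, ReferenceEntry] = {}

    def register(self, segment: bytes, owner: str, policy: Policy) -> None:
        self._index[fingerprint(segment)] = ReferenceEntry(owner, policy)

    def match(self, segment: bytes) -> ReferenceEntry | None:
        return self._index.get(fingerprint(segment))


def handle_upload(db: ReferenceDatabase, segments: list[bytes]) -> str:
    """Apply the strictest policy among all matched segments of an upload."""
    matched = [entry.policy for s in segments if (entry := db.match(s))]
    if Policy.BLOCK in matched:
        return "blocked"
    if Policy.MONETIZE in matched:
        return "monetized for the right holder"
    if Policy.TRACK in matched:
        return "tracked"
    return "published normally"


db = ReferenceDatabase()
db.register(b"registered-song-audio", owner="LabelCo", policy=Policy.MONETIZE)
print(handle_upload(db, [b"registered-song-audio", b"original-footage"]))
# -> monetized for the right holder
```

Note what a matcher of this kind cannot do: it identifies copies, but it cannot judge whether a particular use falls under a limitation or exception such as parody or quotation, which is exactly the gap Nicolo's question points to.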
But this cannot be the end of the story, because if you look at it purely from that perspective, it's a voluntary action based on a legal framework, which is the DMCA; but then what about the limitations debate, what about the debate in Europe around the e‑commerce directive, and jurisdictional issues, and monitoring issues, and so on? That's why I really liked what was said by Paolo from WIPO about a multistakeholder debate, sorry again for repeating this word, but I think in this context it makes sense. When I say multistakeholder debate, I mean really involving all the different parties that need to give an opinion and guidance on how to strike the balance I mentioned before between the free flow of information and the need to be responsible players. Sometimes we find ourselves in the middle of this space, and in the legal and ethical debate around voluntary measures that is not the best place to be, because there should be more and more discussion between Civil Society, right owners, law enforcement and enforcement bodies more generally on how to approach all that was mentioned before. I would like to make a couple of examples. One is the e‑commerce directive, which is under permanent discussion but still remains one of the core pillars of the European framework, and from a lot of parties we hear more and more that the e‑commerce directive is a privilege. I think it's not a privilege. I think it's the rule of the game, and it's a very clear and strong rule, which says: when you have content online, when you have direct contact with the content online, you are responsible for this content, and being responsible and liable for the content means you need to take action when a public authority, or an interested party following due process, asks you to take action on this content.
And I think it's a fair balance. It means that intermediaries are developing platforms that allow people to express their opinions, to do business, to realize whatever specific goal they want to achieve online, but at the same time, in the moment when they provide these services, intermediaries are held liable or held responsible if they are informed about the illegality of the content.
So all of this is to say that I think the legal framework has found the balance between the free flow of information and the need for intermediaries to be responsible, especially in the European legal system, and especially, I would say, in the European context. I think the real debate should not be about the "privilege" of the e‑commerce directive; the real debate should be, first, how we can use the e‑commerce directive in the global debate as a best practice vis‑a‑vis jurisdiction, and second, what kinds of measures can be put in place in order to improve and strengthen the e‑commerce directive. For sure the relationship between the different parties of the ecosystem, when something wrong is found on a platform, could be better specified. The whole area of voluntary measures could be better specified and better regulated by multistakeholder agreement. For example, there are a lot of discussions in Europe around follow the money, which we believe is an interesting solution for enforcement. All of these discussions should be first of all inclusive of all the different parties, and then be put under the umbrella of clear agreements which set the terms and conditions of how to implement them. So all this to say that I think there is no conflict between the free flow of information and being a responsible citizen in the online environment. The real question, the real problem, is in the weeds: how to make sure we respect the balance, the rule of law, and full inclusion of all the parties. These are still open areas, but I think we have made a lot of steps towards a positive solution, and actually I'm very curious to hear what the audience thinks about that.
>> NICOLO ZINGALES: Thank you, Marco. You mentioned the M word, and I think it's important that we engage in a multistakeholder discussion. We'll have a meeting this afternoon with the Dynamic Coalition at 2:30, and I think there is a possibility to find some agreement on the terms and conditions of this social contract.
But another thing that you mentioned that I think is very relevant is how we can improve the safe harbor. When we're talking about safe harbor, I'm thinking about the debate there is in the U.S. regarding the very broad safe harbor in Section 230 of the CDA, which essentially gives almost complete immunity to Internet intermediaries for whatever choices they make with regard not to copyright but to other types of content. This is an issue that comes up often in relation to online harassment, and I would like Janine to introduce us to this debate and say what the responsibility of platforms should be in this context, and how we can frame legislation or social incentives in a way that protects the rights of women online, and also of other disadvantaged individuals who might be targeted by cyberbullying. So I leave it to you. Thank you.
>> JANINE MOOLMAN: Thanks, Nicolo. Thank you. I often feel very anxious when I start talking, because nobody else has mentioned any of these words. I work in the APC women's programme, and our main concern is violence against women online, because it is real: death threats that women receive, Facebook pages devoted to violence against women, responses to sexual violence against women. It's very clear that the violence women experience offline has extended into the online world. One of the effects is on women's freedom of speech and freedom of expression, which Robin alluded to earlier, but we don't often talk about how violence against women is leading to the chilling of a particular kind of speech. So to me it makes very little sense, although it's not surprising, that conversations about intermediary liability rarely if at all talk about online harassment; there was a workshop yesterday afternoon where that gap was very obvious. Rather, the concern has been around Intellectual Property, which has a very particular set of political and economic imperatives, and yet those are not applied to half the world's population when we talk about violence against women online. This is because we live in a world where violence against women is normalized and invisibilized, and because of that it's not an issue that's raised often in any space.
And as a result, the hundreds and thousands and millions of women who experience online violence don't know what to do. Where do you go? Who do you ask? What kind of help can you get? So APC has just completed some research in seven countries in Africa, Asia and Latin America, looking at the domestic legal remedies and the remedies that women can find through Internet intermediaries in response to the violence that they experience. The study covered a number of companies, ranging from Facebook, Twitter, Google and YouTube to Internet service providers in the seven countries that we're working in. All of the things that we found related to content, to particular kinds of content. Some of these are: the creation of imposter profiles of women, often to discredit and defame them; the sharing of private and/or sexually explicit photos with the intent to harm and blackmail; pages, comments and posts targeting women with gender‑based hate, misogynistic slurs, threats of rape and torture, and other kinds of humiliation; and also the publication of personal information identifying where women are, where they live. So when a threat like "I know where you live and I'm coming to find you and rape you" happens, you know it's real: you know they have your address and you know they can find you.
So this is the context that many women online live with and experience every day. What we found in the Terms of Service of the companies that we surveyed was the following, and again, it's not surprising.
Most of the Terms of Service refer to illegal issues that involve a violation of copyright, financial fraud, extortion or child porn, but don't mention any Human Rights issues, especially those based on gender, sexuality or related grounds. Even when there is a clear definition of unacceptable use in the Terms of Service, it's not usually accompanied by a clear and accessible procedure to deal with complaints. So even if you complain, you don't know what happens to your complaint or how those decisions are made.
We also found that most companies don't apply their disruption‑of‑service policies to abusive behavior or acts of stalking, harassment and threats; suspending services is confined to instances when they are owed money. If the law of the land does not take abusive or coercive messages sent via phone seriously, such as in the cases of Mexico and Bosnia, it's unlikely companies will take any helpful action on behalf of women or any other vulnerable group being targeted. Most companies do have mechanisms in place to respond to these violations, but there's very little public information about how they work, as I said earlier. We also don't know what kind of training staff receive and what values inform the decisions that are being made. And lastly, and this is very significant for us, there's a tendency to shift the burden of dealing with this kind of harassment to the state or national governments where the operations are located, or to the individuals facing harassment. The responses we get are that law enforcement is responsible for addressing violence against women, that a court order is essential to protect privacy rights, and that women and girls should take steps to keep themselves safe. This is exactly the narrative that exists offline, where the burden of proof to show harm rests with the victims. Here we have an opportunity to change the conversation, to talk about a set of norms and values that says the burden of proof to show the harm we experience as a result of the actions of others is not ours, and I think we are missing out on an important opportunity to begin to shift things.
To close, so we can speak some more later on, I'll just give you some of the recommendations that we are making. The first is that policies need to provide adequate protection to vulnerable communities, and I think often when we use the language of protection there is an anxiety, because the language of protection is often used to close down spaces: you can't have access to this content because it's harmful to women, we're doing it to protect you. So we really need to have a nuanced conversation that allows us to surface these particular issues and find solutions. Companies should also invest in capacity building for personnel and have procedures that relate to online violence against women. And the last recommendation that I'll speak about is that agreements between the user and the company, such as Terms of Service and privacy policies, should make explicit reference to online violence against women, and not only a general prohibition against illegal use, and they should provide a transparent and easily accessible process for filing complaints where rights are violated. Thank you.
>> NICOLO ZINGALES: Thank you, Jan. I think one of the points that you touched on is very, very controversial, and that is the tension between preserving anonymity and preventing harassment. This is something that recently came up in Brazil, where two weeks ago a judge issued an order requiring Apple, and I think Google, to delete apps that allow anonymous comments, precisely because they would facilitate online bullying and there would be no remedy against it, because you wouldn't know who the perpetrator was. The same issue came up in a different way in the Estonian case, where one of the reasons why the court held the platform liable was that it allowed anonymous comments, and its review system, according to the court, was not sufficient to deal with the potential issues raised. These are issues that cut across the areas of intermediary liability, and I think Nick is in a good position to talk about the parallels between these two areas, given that he recently organised and moderated a workshop at the Australian IGF comparing the mechanisms for dealing with online content management in these two different areas.
>> NICOLAS SUZOR: Thank you, Nicolo. I think this is a really important panel and a really important project, actually, because we're talking a lot about the responsibility of intermediaries and we don't really have a common dialogue yet. What that means is that in a lot of sessions that I've seen this week and last week, we're talking at cross purposes, because we don't quite understand what each other wants and what the different trade‑offs might be, and it is a legitimately hard case. We're talking essentially about a tension between, on the one hand, the efficiency with which we can enforce our laws, given the trouble we've had enforcing our laws in online contexts, and on the other hand a really classic liberal commitment to due process and the full procedure of law, the rule of law.
So the rule of law requires a number of different things. It requires transparency and public rules, it requires that laws are predictable and certain, and it requires that they're enforced in a way that has appropriate safeguards for due process. So we're stuck here with a trade‑off where we have to decide to what extent we're willing to give up due process protections in order to gain efficiency in enforceability, and that's a hard choice that we haven't got consensus on. This is what worries me the most about intermediary liability, or platform responsibility: I worry about market deals that don't necessarily have the legitimacy of public law. For law to be legitimate, it needs to be created through a process that is legitimate, and we don't have that certainty at the moment. In most of the cases when we're talking about platform responsibility, we're dealing with deals between industry representatives; we don't have the procedural guarantees, we don't have the substantive guarantees, to be sure that these rules represent the best outcome for the public, for users.
I think there's a couple of things that have come up over the last week. One of the key ways we can try to increase the legitimacy of these processes is through greater involvement of public interest groups, but for that to happen we need user representatives who have a seat at the negotiating table and who can actually enforce the norms that are agreed in that negotiation process. And most of all we need transparency; that's what we don't have at the moment. When we're asking private actors to do regulatory things, when we're asking private actors to take on public functions of regulating behavior, we don't have the standards of transparency that we require to trust that the law is being applied in a way that is legitimate, clear and predictable.
I think this leads us to a second big problem, and I'll raise these two points before we open it up to discussion. Essentially, I don't think we have the data yet; we don't understand enough, and there are assumptions being made on both sides that we don't have the empirical evidence to support. We need much more information on what is currently happening and also on what is possible, on what the trade‑offs really are. Particularly, I think, we often get stuck on these assumptions: we say, for example, that website blocking or takedown notices can't be 100% effective, and we get focused on that 100% efficacy and don't necessarily look at how a measure might be practically effective.
On the other hand, we make claims about whether or not a particular measure will break the Internet or segment the Internet into different regional jurisdictions, and to an extent these are really legitimate concerns that we need to work through, but we need more data in order to make these claims with a little more predictability about what the actual effects are. Similarly, on the other side, when we're talking about imposing due process, we need more information to be able to tell what sort of trade‑offs we're making between the protection of individuals and vulnerable populations on the one hand and the costs to speech on the other. These are legitimately hard balancing acts that we just don't have consensus on, and we don't have the ability to work out the effects of proposed measures.
The last point is this: I think Malcolm from LINX summed it up really well yesterday, that in a lot of cases whether or not you like intermediary responsibility or liability depends on whether you like the substance of the law, and I think we need to work quite hard, when we're talking about general principles, to step back from the law at hand. To give you an example, I don't think that intermediaries should have greater responsibility than they currently do to enforce copyright norms, but I think that primarily because I think copyright is too strong and tips the balance away from a desirable system; I worry about the effects on innovation that we spoke about earlier, and I worry about the effects on due process for users.
But importantly, I don't think that it's necessarily illegitimate for intermediaries to enforce public law, and I think there's a mistake we're making in a lot of these discussions. Because copyright takes over most of the discussions, and there are very heated debates about it, and because of that focus on the substance of the values and norms being enforced, I think we lose something of the broader question: when should intermediaries be responsible, and when should individuals be responsible for their own behavior? And I think that's a completely different question from the substance of the law at hand.
>> NICOLO ZINGALES: Thank you. Joy, do you want to ‑‑
>> JOY LIDDOCOAT: Thanks. I'm actually at this point going to try to sum up some of the things from this extraordinary range of panelists and open it up to the audience, and I'll bring the microphone down and see if we can get some of your questions on this topic. Some of the things coming out of this are, firstly, around which rights we are talking about when we talk about platform responsibility for rights online. Obviously we have rights around authors' rights, copyright, intellectual property owners, but we've also got a whole range of other rights: privacy rights, the right to be free from discrimination or violence, innovation and content creation. And one of the difficult questions in the context of platform responsibility is: are all rights equal, or are some rights more equal than others? Are we prioritizing certain rights because we think they're easier to enforce, or to avoid liability, and if so, why? Is it any easier to monitor and police intellectual property laws, with their allowances for satire and parody, than to police violence against women online? I think we also have the multistakeholder tensions that some of the panelists have mentioned, and the questions of which remedies are adequate and whether there is room for innovation in remedies. But I would really like to open up to the audience and get some feedback and thoughts about this concept of responsibility. Is there another microphone? Can I just pick this one up? While panelists think about that, we'll get some responses. Do you have a question? Okay.
>> AUDIENCE: Thanks. This is not so much a question as a reflection on what has been said so far. My name is Manu Sporny, I'm the Chairman of the Web Payments Community Group at the World Wide Web Consortium. My background is primarily in finance and banking, and the reason I'm interested in drawing a parallel between finance and banking and what the panelists have said this morning is that we can see what happens when you overregulate a particular industry. Banking and finance is one of the most heavily regulated industries on the face of the planet. It is also one of the least innovative: even when you look at some of the innovations that have come out of banking or finance in the past half decade, it's arguable that some of the things labeled as innovations are really a joke. Everything is built on top of a very heavily regulated framework, and therefore the types of innovations are not the same types of innovations that we're used to seeing come out of the Web. So I think we can use that as a measure of how much we should be regulating these particular practices, and I'm not saying all regulation is bad, not even close, but once regulations are put in place, it takes a lot of work to undo them. If we're overzealous in the way that we regulate some of these practices online, it can be incredibly damaging: multiple decades' worth of damage, as far as innovation is concerned, to a particular industry.
>> JOY LIDDOCOAT: Thanks. I'll just get several comments and then we can ‑‑ yes?
>> AUDIENCE: Yes, my name is Bishakha Datta and I work with a nonprofit organisation in India called Point of View, so I wanted to make a couple of comments and ask a question to the last two panelists. One is, I think, that when we are talking about due process and online threats and coercion, I don't think any of us are arguing that gender rights should be placed outside due process, because that would be completely counter‑productive. We actually want due process to address these things the way it does other Human Rights, so that's very important.
The second thing is the way this gets framed: it's as if, to address online threats and coercion, we have to clamp down on free expression and speech, and I think that's very troubling to many of us who are very strong advocates of free expression and speech. So again, I honestly don't think we are going to get anywhere through this binarized, polarized approach. We're seeing it in many parts of the conference: yesterday we saw net neutrality versus access, we're seeing privacy versus the right to be forgotten, and if we stand on opposite sides saying no, I'm going to cling to this position, we're not going to be able to get to a realistic situation which meets the needs of users.
So what I want to actually ask is, one, in terms of speech: I'm curious about whether we can look at certain forms of online threats and coercion as forms of hate speech as defined by the UN Special Rapporteur, and whether we can expand our understanding of free speech to say that if somebody's free speech shuts down another person's free speech, then that's also a violation of free speech. And the other thing is: how can we work towards something where we are not saying it's only intermediaries that have to take liability, which I don't agree with, but where we clearly have some sort of approach?
>> JOY LIDDOCOAT: Okay, so I'll take one more, we've got this one, then we can go to the panelists. Yes.
>> AUDIENCE: Yes, mine is not so much a question but maybe more of a comment. I heard among the panelists that we should make the e‑commerce directive an example, a best practice for the whole world. I just want to say that in my view the e‑commerce directive can't be taken as a best practice if it can't even be applied in the European Union: out of 27 Member States, more than 15 have misapplied the e‑commerce directive, so it can't be taken as a best practice if even Member States of the European Union aren't able to apply it. We've had cases in Italy. We have cases in Latvia, Slovakia, Poland, Estonia, where intermediaries, regardless of fulfilling all the conditions of the directive, are still treated as strict liability subjects, and this has been seen as a just approach. So I think the e‑commerce directive does have the right idea inside it; however, I am convinced that it is fundamentally flawed.
>> JOY LIDDOCOAT: Comments from the panelists perhaps before we go to the next one?
>> MARCO PANCINI: I can absolutely take the last one, and I agree that there are challenges in the implementation of the e‑commerce directive. But if you look at the general trends, for sure the courts can find liability for intermediaries, and frankly speaking the most worrying trend is when courts started to build a new concept, the "active" host, which moved away from the core of the e‑commerce directive and created a new category of intermediaries sitting in the middle between the intermediaries and the content providers, and that created some confusion. But I think, for example, in some of the latest Supreme Court decisions in France and Italy, and we hope in the European Court of Human Rights, the principle will be enforced that intermediaries are protected, provided the conditions remain the same. But I agree the implementation is not perfect.
>> KONSTANTINOS KOMAITIS: Thanks. Very briefly, I want to go back to the gentleman who was talking about payments and regulation. Yes, I think it has become clear over the years that overregulating anything is not good, especially when it comes to the Internet and technology that changes so fast, and it has also been proven that the safe harbors, this let's say fair approach to the regulation of platforms, have been beneficial for innovation. However, at the same time we need to make sure we understand that those safe harbors should not be seen or interpreted as any sort of immunity. Platforms have responsibilities. They set normative principles that deal with issues of speech and issues of privacy, and this is not something we haven't seen in the past. Various communities have come together to create such social norms; I am thinking, for example, of the lex mercatoria back in the 16th century, and those norms were really bottom‑up. In a lot of cases what we're seeing now is platforms creating top‑down rules and imposing them on those communities. In parallel, anyone can say you can leave the community if you don't want this, but we've also seen that exit from those communities is not as easy as one might think.
So there needs to be this balancing act, because a lot of the norms that are being set determine user behavior, and we need to be very careful as to what sort of behavior we're encouraging or discouraging within these communities.
>> NICOLO ZINGALES: Paolo?
>> PAOLO LANTERI: Yes. Very briefly, I would like to make two comments. One is an obvious statement regarding the appropriateness of reforming regulation. The law is slower than technology, and there is also very good reason for that. It's just normal that, in a fast‑moving environment like this one of Internet intermediaries, from time to time we need to address the opportunity of modifying the rules.
And the other comment is regarding regulation, okay? We all agree overregulation is bad. But if regulation in general is not needed, is not the way forward, can someone tell me what would be the main driver, the main force, that will make platforms want to become responsible? Because at the end of the day these are commercial players; they can't be driven by the general good alone. Some sort of guidelines should be provided by a neutral third party, which could be a public authority.
>> NICOLO ZINGALES: Thank you. I actually want to say that that's exactly what we plan to do with the recommendations. And you touched on the point that technology also matters: when you regulate, if you decide to regulate, you draft the norms in such a way that they can be shaped by judges interpreting them in accordance with evolving technology. But I think Rebecca has a burning question there. Do you have a microphone?
>> JOY LIDDOCOAT: This woman here.
>> AUDIENCE: Nick, you had said that online there's a trade‑off between due process and efficiency. I am troubled by this, because from my understanding of the law, an absence of due process can lead to arbitrary violations of rights, and this can happen with impunity. I would like it if you could elaborate on what safeguards you think can replace due process as a principle, especially if efficiency is the value that we're choosing online.
>> AUDIENCE: Thank you. Can you hear me? Thank you very much. I think it's been a very interesting panel, and it seems like there's a lot of conversation about the need to find the right balance between rights and responsibilities. I think there are some clear‑cut cases where you have sites that are basically dedicated to illegal activity: pirate sites that, as we heard earlier this week, are also a major source of dissemination of child abuse images, of advertisements for prostitution, et cetera. These sites are just illegal actors, right? And how can we go about handling those? Then, as Google mentioned, you have to drill down into the weeds on a lot of these issues. We made some progress in the OECD principles and in the NETmundial principles in highlighting how to balance these at a very high level; now we need to drill down, and I think we need to think about this as a continuum. If there are sites that are truly dedicated to illegal activity, those should be easier. Then there are going to be sites, and I don't mean to single it out, like YouTube, where most of the content is legitimate, and you've done Content ID; there are ways that you have to deal differently with different types of actors in the ecosystem. If we can take those dedicated to illegal activity and focus on those in the first instance, maybe we can make some progress.
>> AUDIENCE: Hi, I'm Andrew Bridges from Fenwick and West. To the Google comment about follow the money: is that perhaps just an expression of "not in my backyard", saying go chase somebody else? Because let's look at the questions of efficiency, due process, administrability and competence. Why are the financial intermediaries in a better position than the online intermediaries? I've had five clients be threatened with the cutoff of their payments simply because of the industry they're in, and one can argue with the banks, but there is no six strikes policy for MasterCard and Visa, there's no graduated response mechanism. What do others propose to be the due process for the follow the money approach?
>> JOY LIDDOCOAT: Thanks, we'll take one more and then go back to the panel.
>> AUDIENCE: Thanks, I'm Rebecca Kenna. I thought I would bring up one thing for those not going to the meeting. When we speak about platform responsibility and what that means, I come from a rights perspective, and I thought I would mention the United Nations Guiding Principles on Business and Human Rights as perhaps part of the framework for thinking about what platform responsibility means. According to the UN Guiding Principles, states have the primary responsibility for protecting Human Rights, and that of course means that laws and regulations should be consistent with Human Rights norms, which is, of course, a separate problem. Part of the challenge for platforms is that sometimes they are subject to laws that are not sufficiently consistent with Human Rights norms, or whose Human Rights impact has not been properly thought through in the enforcement of other interests. So that's one of the challenges that platforms face in responding to laws that may or may not be consistent with protecting Human Rights.
But then, secondly, businesses have their own responsibility to respect Human Rights, and obviously that's challenging when they're operating within inconsistent state frameworks. Nonetheless, the UN Guiding Principles lay out a set of things that companies should be expected to do, including Human Rights due diligence. What this would mean in the context of Terms of Service is that, as you are constructing your Terms of Service and putting together your processes for enforcement, you conduct Human Rights due diligence about what the Human Rights impacts of these terms and processes are going to be, and not just on freedom of expression and privacy, but on other Human Rights that may be implicated.
So I think one interesting thing to discuss this afternoon might be how we can help companies put together these due diligence processes, which, thanks to the GNI and some other initiatives, are more advanced when it comes to Government requests; there are clearer principles around how companies can handle Government requests in a Human Rights compatible manner, but there is very little discussion, and I think no consensus, about how that should work when it comes to private enforcement or co‑regulatory mechanisms and so on. So I think there's a lot of useful fruit for discussion this afternoon.
>> JOY LIDDOCOAT: Can we go back to the panel?
>> NICOLAS SUZOR: Thank you so much for all these questions. There are really good questions here on the conflict I was talking about between efficiency and due process, and I think one of the key points is that we get stuck, you're absolutely right, when we treat it as an either/or choice. I don't think that's necessarily correct. You asked particularly about limits to free speech: absolutely there are limits to free speech, we all accept that. The difficulty is in the hard cases, where you have a borderline judgment; you have a really tough time if you're an intermediary making that judgment call, and that's a risk to legitimacy where we have private organisations that are not accountable outside of the judicial process. That's when things get tough. But by focusing on those hard cases I think we do get bogged down, and what seems to be coming out here is a really good suggestion: that there is a bulk of easier cases that might be worth investigating first. Now, they're not as simple, I think, as we've necessarily been talking about, but one of the key points is that this conflict between legitimacy and efficiency is not really an either/or, and the reason I think that is that we can imagine different mechanisms. It doesn't have to be either state or private regulation. So instead of thinking of it as a continuum between private on the one hand and state on the other, think of it more as a pyramid, where we build due process safeguards into regulatory solutions that see primary decisions being made at the intermediary level, perhaps with appropriate provisions for users who feel aggrieved by a particular decision to seek more legitimate redress through a territorial judicial process in court.
>> MARCO PANCINI: We all agree, and I didn't mention illegal content because we agree the worst of the worst of the Internet has to be taken down, so from this point of view we are absolutely on the same page. I was talking about going into the weeds because even on the worst of the worst there are some questions raised. I just want to mention the recent case of enforcement in which our automatic detection systems identified a pornographic image exchanged by a repeat offender who was already under investigation by the FBI, and the discussion that this raised around technology that can detect the worst of the worst. On this we are all on the same page, but even here, I think it's important, for the sake of continuing to enforce against the worst of the worst, to keep the debate with Civil Society open and to be accountable in explaining why and how we go after the worst of the worst. So, just to clarify, we agree that the worst of the worst has to be taken down across all the different cases.
On the other question, on follow the money: I'm sorry, for the sake of time I didn't express our position in a more complete way. Our position is not to shift responsibility by going after the financial institutions; the financial institutions will look into the solutions that are available to them, and I agree with you about Visa and MasterCard, about the importance of the rule of law and of making sure that cutting off the sources of revenue of criminal organisations is done in a way which is, again, under the rule of law. I was talking more about the initiatives that we can take on our advertising network, to make sure, going back to the worst of the worst, that nobody who is selling or trading bad stuff online, illegal content online, can use our advertising network to do that. So it's a responsibility that we take on first, on our own platforms, on our own properties.
And on the discussion of possible due diligence on terms and conditions under the Human Rights principles and guidelines, I think that's a very important area to explore, and we need to look into it.
>> JANINE MOOLMAN: Thanks. I just want to comment on Bishakha's points. I think the opposition between freedom of expression and violence against women is definitely a red herring: it makes it seem like we aren't on the same side, when actually we are. We want to create the kinds of conditions where all people are able to exercise free speech, and I think it's dangerous to create that binary, which doesn't take us any further anyway. I think it also raises the issue of why multistakeholderism is important, and the question to ask is who has a stake in what, what that stake is, and what that means in those conversations. And lastly, on the point that Rebecca made: one of our key recommendations from this research really is around the Guiding Principles on Business and Human Rights.
>> NICOLO ZINGALES: Konstantinos, do you want to respond?
>> KONSTANTINOS KOMAITIS: Very briefly, I want to respond to Dave's point, which was interesting, and it goes back to what I was trying to say: a lot of these intermediaries are really bad actors, disseminating child abuse images, promoting prostitution, selling drugs that can kill people, et cetera, et cetera, and this is where the problems begin, because these platforms are using the immunity of the safe harbor to sustain themselves. The big challenge is, of course, how you differentiate between the legitimate actors and the actors that are promoting illegal activities. I really do not have an answer, but part of platform responsibility, especially for the established platforms, the ones actually engaging in legitimate and legal acts, is to differentiate themselves and to start saying: you guys cannot be abusing something that we have managed to use to innovate, and we have worked together with many other actors in order to try to find solutions.
So I really do not have an answer as to how this is done, but for me it is very important that we start contextualizing: these are the platforms that we can work with, and these are the illegitimate actors. And then there is the issue, for example, of the way follow the money is used: it's used across the board. Again, we need to be targeting those actors that we know are engaged in illegal acts, and we should not be creating general rules for everybody and questioning, for example, the safe harbors in a general context. We need to find a way, and possibly that will be done through cooperation, and I'm going to use the M word again, multistakeholder cooperation, in order to identify how this can be done.
>> NICOLO ZINGALES: Thank you. Let me try to wrap up before we take a last question. Something that comes out strongly is that a way to overcome the imbalances might be exactly to have this multistakeholder discussion, not only to involve the users, who are sometimes excluded, particularly when there are industry agreements regarding, for example, voluntary measures or possible codes of conduct, but also because some of these communities don't talk to each other: you have people that deal with copyright, people that deal with online harassment, women's rights, privacy, and they don't interact that much. So I think the IGF is a viable place to do that.
But in the end, the essence of the problem, I think, is that there is a tension between due process on the one hand, and on the other the efficiency of the procedure that is adopted, and the extent to which the freedom to innovate can be hindered by a procedure which entitles everyone to be heard in every possible instance where there is a decision to be made. In a case of online harassment, where the danger of harming someone might be very high, it can be problematic to wait for a full judicial procedure before acting. So this brings me to the question that motivated this panel, the concept of platform responsibility and the Dynamic Coalition: how can we enable a market‑based solution which takes Human Rights into account, which ensures that the mechanism in place also has the safeguards to respect due process, freedom of expression and privacy? Can we provide transparency so that users can make an informed choice, choose the right platform, choose the right privacy settings and follow the companies that are adopting platform responsibility? So I would like to ask each of the panelists to say what they think about how we can promote this concept of platform responsibility.
>> NICOLAS SUZOR: That's a really tough question. I think the key, first off, is transparency. One of the problems with Terms of Service at the moment, one of the problems with getting companies to compete on the quality of their processes, is that we just don't know what's going on in the vast majority of cases. Some intermediaries are doing a much better job than others of explaining what they're actually doing, but a lot of the time we don't know what's going on, so there isn't the competition that would let consumers make an informed choice. I think in order to reflect public values we need, one, transparency; two, participation in the negotiation process; and three, some form of accountability when companies breach those agreed norms, to make sure that the way those private decisions are being made is in line with our public expectations.
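[Editor's illustration: a minimal sketch of the kind of machine-readable record that would support the transparency being called for here, letting outsiders compare how platforms actually handle content decisions. All field names and categories are hypothetical; this is not any platform's real reporting format.]

```python
from dataclasses import dataclass
from datetime import date
import json

@dataclass
class ModerationRecord:
    received: date        # when the complaint arrived
    basis: str            # e.g. "copyright", "harassment", "court order"
    source: str           # e.g. "user report", "government", "automated flag"
    action: str           # e.g. "removed", "restricted", "no action"
    counter_notice: bool  # was the affected user given a chance to respond?

def transparency_report(records: list[ModerationRecord]) -> str:
    """Aggregate individual decisions into a public, machine-readable summary."""
    totals: dict[str, int] = {}
    for r in records:
        key = f"{r.basis}/{r.action}"
        totals[key] = totals.get(key, 0) + 1
    return json.dumps(totals, indent=2, sort_keys=True)

print(transparency_report([
    ModerationRecord(date(2014, 9, 1), "copyright", "user report", "removed", True),
    ModerationRecord(date(2014, 9, 2), "harassment", "user report", "no action", False),
]))
```

Published regularly, aggregates like these would give consumers a basis for the informed choice between platforms that the panelist says is currently missing.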
>> JANINE MOOLMAN: It's very difficult to come after something so succinct, because I think what Nick is saying matches exactly some of the findings and recommendations that we have in our research. The one other point I want to make is that when we are talking about the kinds of solutions that are necessary, we also need to recognise, if we're looking at women, that there are different kinds of women, and that reporting and transparency mechanisms suitable for women in the global North might not be suitable for women in the global South. For example, a reporting mechanism could be very clear, but if you have to submit the report in English, it doesn't make any sense to someone who doesn't speak English. So we really need to think about differentiated kinds of responses for different contexts.
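[Editor's illustration: one small, concrete version of the point about English-only reporting. This sketch picks an abuse-report form language from the user's locale preferences, with a fallback; the supported-language set and function names are invented for illustration.]

```python
SUPPORTED_FORM_LANGUAGES = {"en", "fr", "pt", "sw", "tr"}  # illustrative only

def report_form_language(user_locales: list[str], default: str = "en") -> str:
    """Pick the first user-preferred locale that the report form supports."""
    for locale in user_locales:
        lang = locale.split("-")[0].lower()  # "pt-BR" -> "pt"
        if lang in SUPPORTED_FORM_LANGUAGES:
            return lang
    # Falling back to a language the user may not speak is exactly the gap
    # described above; a real system should flag these cases for follow-up.
    return default

assert report_form_language(["sw-KE", "en-GB"]) == "sw"
assert report_form_language(["zu-ZA"]) == "en"  # the fallback still excludes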
>> MARCO PANCINI: Three actions from our side. One is digital literacy: we need to invest in content online that informs users about their rights and about the terms and conditions of the services, and that gives them information on how to report content and how to deal with controversial content.
Second is transparency. We believe transparency is very important because all the action we can take to improve enforcement, to improve the reporting and takedown of illegal content online, has to be counterbalanced by full transparency, across the board, about our actions on content online. Lastly, I think there is an opportunity to talk together with all the different parties and get ideas on how to make things better. It can be at the level of the way we write terms and conditions; it can be at the level of how we inform users about the terms and conditions. We have tried videos, we have tried interactive material, and we have even tried, together with rights owners, videos with puppets, trying to explain copyright, and how we enforce copyright, in an easy way. There are plenty of ways we can experiment with this.
>> PAOLO LANTERI: On Human Rights in the platform community: first of all, to me at least, this process of platform responsibility implies that we are dealing with responsible platforms in the first place, so I believe we can focus on the kinds of actors that are willing to engage in this debate. Secondly, I believe we should reflect on the focus. We are talking here about many different rights, and there are different ways to enforce them, for instance limitations and exceptions in copyright, so I'm not 100% sure we can bundle copyright, Human Rights and many other things into the same practical, technical solution. Still, we should try, and we should think more about how to channel these different streams in the same direction. Finally, in looking for a solution, process is key; it's already been mentioned. Of course that means participation of different stakeholders, but process is key from the very beginning: I have seen many remarkable initiatives that tried to solve the problem but started out by forgetting one piece of the puzzle, and were therefore never accorded the credibility they deserved. Those are my key points, and I suggest them for future work.
>> KONSTANTINOS KOMAITIS: Thanks. Transparency has been mentioned, and I think there is a clear need for it: users need to be able to understand what they're signing up to and what they're getting themselves into. Then a clear and robust accountability mechanism; the way I see accountability in this context is safety. Users need to know that there is a redress system out there, that when they speak they are not going to be excommunicated from the platform just because they say something. Communication is key: regular updates as to what is happening. And user empowerment: allow users to ask questions and challenge those platforms, and possibly even shift the direction in which those platforms are moving.
And finally, I think that platforms should be asking all the time what users want. Without users, those platforms are empty vessels; without users, we don't even have the Internet. Yet we never actually ask users what they want. So it is important that we keep asking users: what is it that you want, what is it that you don't like, and why do you want this to happen?
>> ROBIN GROSS: Thank you. Yes. When I was first asked to be on this panel, it was called responsibility for online content management, and it seemed you could go in two different directions with this: the platforms' responsibility to their customers, or their responsibility for their customers' activity. I mean, these are really two different ways to look at it. Are these platforms going to be responsible for the potentially illegal activity of their customers, or is their primary responsibility to their customers, to protect them, save where some sort of illegal activity is found? I think that is an important distinction to make when we talk about responsibilities.
I also think that the way forward is really just to stop making it so difficult to be a legitimate customer; I think that would be the number one way to discourage piracy as well. And in the case of these ISPs, really the only way I think you can fairly hold them responsible for the activity of their customers is if they are actually encouraging illegal behavior, they have knowledge of that illegal activity, and they are profiting from it. If we go further than that, I think we risk going too far in the other direction, and we will ultimately chill freedom of innovation and freedom of expression. Thank you.
>> NICOLO ZINGALES: Maybe there's time for a last question. Is there a remote question? Just a ‑‑
>> JOY LIDDOCOAT: Is there a remote?
>> NICOLO ZINGALES: The remote moderator.
>> JOY LIDDOCOAT: She's been waiting patiently.
>> AUDIENCE: My name is Emile Laotia, I work with the Association for Progressive Communications, APC, and my question is directed to Konstantinos from ISOC. You said regulation by a neutral party would be the ideal situation, but my question is this: in Africa we have the African Union Convention on Cyber Security and Personal Data Protection, which requires ratification by at least 15 AU Member States, and it contains provisions on cyber security and on intermediaries, but it does not provide safe harbor for intermediaries. So I want to find out where the responsibility of online platforms sits. Is it within this framework? And maybe, something Robin has also mentioned, is it a responsibility to the governments, because they ratify this and would want to create new laws to work with the AU Convention? Would it be responsibility within this law, or responsibility to us, the users, or just responsibility to their bottom line? I just want to hear ISOC's view.
>> KONSTANTINOS KOMAITIS: Okay. I don't think I mentioned neutral regulation. But going back a little bit: the automatic reaction of any Government is to regulate. This is what they do.
And also, when it comes to responsibility, I would really like us not to think only of the responsibility of one party, but of responsibility in the sense that everybody is responsible for what is happening, especially when it's happening on the Internet, because we all have interests in the Internet and we all take from the Internet what we want.
In the context of platforms, we see increasingly that the more we regulate, the more problems we have in terms of understanding how these platforms function, their ability to innovate, and their ability to be the actors they want to be and hold the position they want on the Internet. This is me speaking, not the Internet Society's position. So when it comes to regulation, I think self-regulation is a very interesting way forward, but self-regulation should not be seen as a solution to everything, because it needs to come with certain safeguards. And I think this goes back to what Paolo was saying earlier: let's identify a way whereby we do not need governmental regulation, but at the same time we make sure that due process and accountability and all the rest of the boxes are ticked. This is where cooperation starts, and we can already see various areas where platforms are engaging in self-regulatory frameworks with other actors. We need to capitalize on the experience these actors have within the ecosystem, but at the same time we need mechanisms that allow these actors to be held responsible, and I think that's the point I was trying to make.
>> AUDIENCE: This is more a comment. I'm from Europe. I wanted to build a little on what has been said about platforms needing to care about what users want. If we are to envision and develop solutions that allow platform users to protect their rights, users should also be able to find those solutions. The problem is not just to take care of what platform users want, but to enable them, to empower them, to find these tools. That is one of the reasons why, and by the way, I'm one of the evil minds behind the platform responsibility Dynamic Coalition, what we were thinking about is to link these model contractual provisions to a sort of visual identity, some labels like the Creative Commons labels. Why do people choose Creative Commons labels? Because everyone can identify them, and everyone can immediately know what their rights are when they see a certain badge.
So if we develop these model contractual provisions together, and platforms know that when they put them alongside their Terms of Service users will understand that their rights will be respected, then platforms will have an incentive to adopt these provisions, because they will have more clients, more customers, because customers will trust them to respect their rights. Thank you.
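[Editor's illustration: a minimal sketch of what a machine-readable "badge" for Terms of Service might look like, by analogy with the Creative Commons labels mentioned above. No such standard existed at the time of this session; the vocabulary, badge identifier and function names are all invented for illustration.]

```python
import json

KNOWN_COMMITMENTS = {  # hypothetical vocabulary, not an existing standard
    "due_process", "transparency_reports",
    "notice_before_removal", "data_portability",
}

def tos_badge(platform: str, commitments: list[str]) -> str:
    """Emit a machine-readable claim of which model provisions a platform
    adopts, suitable for publishing alongside its Terms of Service."""
    unknown = set(commitments) - KNOWN_COMMITMENTS
    if unknown:
        raise ValueError(f"unrecognised commitments: {sorted(unknown)}")
    return json.dumps({
        "platform": platform,
        "badge": "model-contractual-provisions/v0-draft",  # made-up identifier
        "commitments": sorted(commitments),
    }, indent=2)

print(tos_badge("example.org", ["due_process", "transparency_reports"]))
```

As with Creative Commons markup, the value of such a label would come from a fixed, widely recognised vocabulary: a user or a crawler could check a platform's claimed commitments at a glance, which is precisely the trust signal the commenter describes.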
>> NICOLO ZINGALES: That's a very nice closing, and I suggest that we take this debate to the afternoon session. If you haven't managed to ask a question, you can come there at 2:30 and we'll have another hour and a half of discussion, with two keynote speakers and very interactive participation. You are very welcome. Yes, please give a round of applause for the panelists. Thank you.
(Applause).
(The session ended at 10:26)
***
This is the output of the real‑time captioning taken during the IGF 2014 Istanbul, Turkey, meetings. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record.
***