FINISHED TRANSCRIPT
NINTH ANNUAL MEETING OF THE
INTERNET GOVERNANCE FORUM 2014
ISTANBUL, TURKEY
"CONNECTING CONTINENTS FOR ENHANCED
MULTISTAKEHOLDER INTERNET GOVERNANCE"
04 SEPTEMBER 2014
14:30
DYNAMIC COALITION A PLATFORM RESPONSIBILITY
***
This is the output of the realtime captioning taken during the Internet Governance Forum 2014 Istanbul, Turkey, meetings. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record.
***
>> NICOLO ZINGALES: We will start in one minute. One minute.
Okay. Please someone close the door. Okay. Thank you very much.
Welcome everyone. Thank you for coming back again for a session on platform responsibility, for those who attended the morning session.
I'm Nicolo Zingales. I'm currently a fellow at the Centre for Internet and Society in Rio. And I'm very pleased and honoured to welcome you to the first meeting of the Dynamic Coalition on Platform Responsibility.
What we want to emphasize at the start is that we coined this concept of platform responsibility to focus on something more than the legal requirements that legal systems impose on online platforms and their behavior. We want to focus on the social role that these platforms have, and in particular the expectation that people have that the platforms adopt procedures that comply with human rights.
As you know, human rights as contained in treaties applies to States. There are no specific hard law provisions that apply to corporations. But we had three years ago some principles issued at the UN level on business and human rights that created a framework on the basis of which States have the responsibility to protect human rights. Corporations have the responsibility to respect human rights. And they both have a joint duty to provide an effective remedy.
So by focusing on the aspect of responsibility, we want to move the discussion beyond intermediary liability, even if we recognize that the two concepts are not unrelated.
Just to mention two ways in which they are related: often, in the reform discussions that are taking place regarding notice and takedown procedures, arguments are being made that there is a need to include due diligence requirements for companies in order for them to benefit from immunity or to escape liability.
And on the other hand, there is a problem that if you don't have specific rules on intermediary liability, or you don't have specific requirements to respect fundamental rights, companies through their terms of service can enjoy wide discretion and not necessarily bring to bear that responsibility to respect human rights that the UN Guiding Principles put forward.
So terms of service are particularly important today, given what companies essentially do: usually we are talking about Internet companies based in the U.S. that wholesale export around the world the policies they apply in their own jurisdiction. It is therefore particularly important to ensure that those terms of service contain the minimum safeguards that allow the responsibility to protect and respect human rights to be expected continuously, in all jurisdictions.
So, basically, now we are going to hear two different perspectives from two different organisations about what initiatives are there now, and to what extent we can differentiate ourselves from these initiatives. I think my friend Luca wants to make remarks about this space and the extent to which we can bring this debate forward in a normative way.
>> LUCA BELLI: I work at the Council of Europe. I'm one of the co-founders of this Dynamic Coalition on Platform Responsibility. Before the two keynote speakers give their introductory remarks: we know that platforms, as corporations, have a responsibility to respect and promote human rights according to the UN Guiding Principles. However, we know that at the end of the day the spectrum of rights and remedies that a platform user enjoys is defined by the terms of service, which are an essential tool to regulate cloud-based global services provided through transnational cyberspace. But these terms of service are frequently difficult to comprehend or even to read entirely, can vary from one cloud provider to another, can be unilaterally modified by the cloud provider, and may provide some remedies for specific violations, such as copyright infringement or some kinds of online harassment, but those remedies are not standardized, and they are frequently put in place, and decisions taken, by private entities without a transparent framework.
So our goal, our initial idea here, was to elaborate model contractual provisions that could be put into terms of service to standardize these mechanisms and to promote the full, concrete enjoyment of platform users' rights. And we want to build this coalition on the experience of another Dynamic Coalition, the Dynamic Coalition on Network Neutrality, which last year produced a model framework and showed that interaction and cooperation among stakeholders can go beyond simple dialog and achieve concrete outcomes. Those outcomes can be efficient or not, can be improved or not, but concrete outcomes are achievable. We want to discuss with you today how to achieve those outcomes together, and to have an understanding of which direction the dialog can evolve in.
We have two keynotes. The first is by Jan Kleijssen, Director of Information Society and Action against Crime at the Council of Europe. And the second will be by Rebecca MacKinnon, director of the Ranking Digital Rights project at the New America Foundation.
Jan, please.
>> JAN KLEIJSSEN: Thank you very much. Good afternoon. A few words about the role of the Council of Europe in this Dynamic Coalition, which we very much encourage and welcome. The Council of Europe is an organisation bringing together 47 Governments, representing some 800 million people. It is perhaps best known for the European Convention on Human Rights, and to many of you, notably its Article 10 on freedom of expression will ring a bell. We also have other Conventions in the fields being discussed at the IGF, such as the Cybercrime Convention and the Data Protection Convention, which indicates that these Conventions are all interstate treaties.
What we have done, however, since our very early days, which go back to 1949 when the organisation was founded, is to involve Civil Society in the elaboration of our legal standards. That is a tradition that has been carried on for many years.
What is relatively new is to address companies, to deal with business as a partner. We started to include a direct reference to business in our 2007 Convention Against Sexual Exploitation of Children, where there is a direct call on the finance industry and travel industry to cooperate in the fight against the sexual exploitation of children.
I'll give an example on the travel industry: passengers who travel to countries where child prostitution is common are warned by a leaflet in the cabin that they should refrain from doing certain things. And if they notice certain behavior, there are telephone numbers they can call in order to help protect children also outside of Europe's borders.
Now, for that Convention I can assure you that Civil Society and children's rights organisations were of course heavily consulted, but we then started to introduce this multistakeholder approach in our standard setting as regards the Internet. The first time was in 2008, when a soft law instrument, as we call it, recommendation guidelines in this case, was adopted as regards Internet Service Providers and the games industry. That was a text drawn up by Governmental experts together with representatives from industry.
We had then in 2013, in fact in the week before Mr. Snowden came out, a declaration by the Committee of Ministers regarding surveillance technologies and the risks relating to them, which stressed the issues of privacy by design and privacy by default measures, both by States and, very importantly, nonstate actors.
And then this year, we had a side event yesterday, a workshop, regarding the new recommendation on a Guide to Human Rights for Internet Users, and I recognize several of you who were there yesterday as well. The Guide is a compilation of existing rights within the Council of Europe area, found in the European Convention on Human Rights as interpreted by the courts, so it takes in the latest case law. And the recommendation not only asks States to guarantee these rights, as States are of course under an obligation to secure human rights for their citizens, but also calls upon private actors to do so. And that is of course what we will be discussing here today.
It is clear that because of the globalization of the Internet, the global nature of the Internet I should say, no single Government and no regional organisation can alone ensure full respect for human rights on the Internet. And so, as was said, it's not naive to think that human rights should be ensured. What is naive is to think that this will happen automatically. So we need partners. Governmental organisations need partners, and of course platform providers are one of the first categories of partners that come to mind.
We very much hope that the guide that was discussed yesterday will be taken up by platforms. We also hope that we will get support from leading companies to raise awareness of this guide among Internet users, because it's addressed really to the users. It's not written for lawyers. It's written for the average user, saying you have the right to do this and you are entitled to this and that, and we hope that platforms will help us.
What we hope platforms will do is examine their own terms and conditions, and my usual question to any audience is: how many of you read Google's or Facebook's terms and conditions before using their services? Is there anyone who read them before using the services? Is there anyone who has read them at all?
(Showing of hands)
This is a particular audience. This is the largest number of yes --
(Laughter)
I've never ever had more than one hand raised.
>> AUDIENCE: The question is how many read it for their personal enjoyment or professional careers?
>> JAN KLEIJSSEN: Absolutely. But it shows, I think, that there is a need for users to be informed in a more clear and concise way about what their rights are. And we very much hope that platforms will do so.
I think there is also an interest for the companies. At a recent conference I attended in Rome, and I had occasion to say this yesterday, so I apologize for repeating myself, the University of Malta presented interesting research, a survey. The percentage of respondents that did not trust Governments with their data was more than 50 percent. What was more interesting in the study was that more than 70 percent didn't trust companies to protect their rights. So I think there is really something to reflect upon by the platforms, especially, I think, the companies that do take corporate social responsibility seriously: how they can enhance the trust, and that's what we're talking about, the trust of the users, that they do take this seriously.
So without further ado, we look very much forward to your suggestions and any proposals that you may make on how we can assist you in ensuring that human rights are really respected on the Internet.
Thank you.
>> NICOLO ZINGALES: Thank you, Jan.
We will move over to Rebecca.
>> REBECCA MacKINNON: I'm very glad that you took the initiative to create this Dynamic Coalition. A couple years ago I published a book in which I referred to global Internet companies as the sovereigns of cyberspace because their private rules, their code, their design choices, have such an impact on what people can and cannot do on the Internet. What you can see, how you interact with society, and with other people.
And a number of the companies complained to me: we're not sovereign. We can't put people in jail. We can't torture people. We don't have militaries. We can't tax you. And we are subject to Government demands ourselves, so you're not being fair. But nevertheless, and I think this is one reason why we're here, there is a form of private sovereignty. Obviously it interacts with state sovereignty, and I'm very glad that you both referenced the UN Guiding Principles on Business and Human Rights, which recognize that yes, businesses do have some human rights responsibilities: Governments have the duty to protect human rights, businesses also have an obligation to respect human rights, and obviously this is a complex dynamic with a great deal of interplay.
And one initiative that relates to the work of this coalition, but which I think this coalition can really supplement because that initiative has not dealt so much with private rules, is the Global Network Initiative, which launched in 2008 with a set of principles on freedom of expression and privacy for companies to commit to, and a set of implementation guidelines. But this really deals almost completely with Government demands. In an ideal world, all regulations and laws would be fully compatible with international human rights norms; we don't live in such a world. And so, given that companies are faced with demands all around the world that compel them, in the view of at least some, to do things that are not compatible with human rights, how should companies deal with this?
How can companies minimize human rights harm and maximize the freedom of expression and privacy of their users in relation to Government demands? So the GNI developed a set of guidelines that some companies have agreed to follow and submit to an assessment process to actually demonstrate the extent to which they are actually trying to put these guidelines into place.
But the GNI is not dealing with terms of service, with the private rules that companies set up, or with consumer privacy particularly, except where those interact very directly with Government demands. It's really an initiative dealing almost completely with the intersection between the companies' operations and what state sovereignty demands of the companies affecting user rights.
And so we have also seen in the past several years, with some of the other Dynamic Coalitions, a great deal of consensus evolving around what Internet users' rights should be. But we don't have much stakeholder consensus, or clear conversations happening yet, about what the private governance of Internet platforms should look like if it's going to be human rights compatible.
And in the UN Guiding Principles that you referenced earlier, there are three basic things within the protect, respect, and remedy framework. The first is that companies must make a policy commitment to meet their responsibility to respect human rights. So to what extent are Internet platforms today making a public policy commitment? In fact, we're only seeing that happen with a few companies, very recently, and sometimes in bizarre ways. Like, "we're the free speech wing of the free speech party." Is that a commitment to human rights? You can guess which company I'm talking about. But how is a commitment articulated? Is it in any human rights framing, or articulated just in their own corporate culture framing? And to what extent is it important that that commitment be articulated in a way that maps to human rights language and human rights standards? So that's one question.
Also, the business and human rights framework recommends that companies develop a human rights due diligence process to identify, prevent, mitigate, and account for how they address their impact on human rights. And that shouldn't just be in relation to Government demands. There should be a due diligence process both in the drafting of terms of service and in the formulation of how terms of service are enforced, and continued due diligence around that enforcement process and the human rights issues that are being raised by stakeholders. This requires stakeholder engagement in order to carry out the due diligence process in any meaningful way.
What does that look like for private enforcement of terms of service? We have a clearer sense of what it would look like for government demands, because the GNI companies at least have been building that due diligence process. But for private enforcement it's very unclear. There has not been a lot of discussion.
And then third, the UN framework recommends that companies should initiate processes to enable the remediation of any adverse human rights impact that they cause or to which they contribute in some way. This was talked about a little in the morning's workshop as well. What are the complaint mechanisms? What are the remediation processes? What efforts are companies making, not only in terms of complaints against abusive behavior by other users, but if your content is taken down in a way that you feel is unjust or violates your right to freedom of expression, is there any right to appeal? And what is the process? I hear a lot of comments about the lack of process for how problems are appealed.
And, basically, my own personal observation has been that when activists around the world get their social media accounts deactivated or certain content taken down, and it seems like it was a mistake or a misunderstanding by the platform, if they're lucky enough to know me or someone at EFF or someone who has contacts with the companies, they find a way to have it looked at a second time and get it restored. But if they don't know those people, then they are just plain out of luck.
So what are the mechanisms to help improve that? That's another question to be looked at. Talking to the companies, I haven't heard a lot of good ideas for how to improve that problem solving beyond very ad hoc personal networks.
So I think there is a lot of work that this Dynamic Coalition can do that really hasn't been done by other organisations, while at the same time there are certainly a lot of organisations that have been working on different aspects of these problems. CDT is here. CDT has done work on account deactivation and content removal in terms of service enforcement and has published documents on best practice in that regard. I'm not going to use this as a platform for promotion of my own project, but we are working on Ranking Digital Rights, to create benchmark standards for freedom of expression and privacy criteria, and this includes not just how companies handle Government demands but also how they handle their private enforcement: what should the standards look like, and how do we measure companies? And this is another project that contributes to this effort but does not duplicate it.
We have APC in the room, with the women's principles and some important work on how to deal with gender-based violence and threats in a way that is mindful of freedom of expression concerns, but also how to get companies to engage with stakeholders on these issues, in a manner that makes the platform something that men and women and people of different persuasions can all use safely. And that's a big challenge ahead. So it seems like there are a lot of gaps that this Dynamic Coalition could potentially help to fill.
I don't see a whole lot of companies here -- yes, one, here.
Okay. Yes. So we do have a few. But we should definitely find ways to have a fairly practical discussion on what the way forward is. And, I'll stop talking in a moment, but there was some discussion in the workshop this morning about how stakeholder engagement can work in a more effective way. And there is a question of whether there are certain models. Should companies be having stakeholder councils? Is there a way for user representatives to be brought into discussion to reflect different perspectives and different concerns from different stakeholder groups, which can help companies with their due diligence around terms of service, how things are enforced, how complaints are structured, how remedy is pursued? There could be some interesting models or mechanisms from other multistakeholder processes that companies might be able to adopt elements of. Or perhaps not; perhaps something new needs to be invented. But again, that is an interesting set of questions to explore here. So thank you.
>> NICOLO ZINGALES: Thank you very much for a very complete picture of the framework, not only from the Council of Europe side but also looking into what initiatives are taking place in the field.
We would also very much like to hear what you are doing, why you see the concept of platform responsibility as important, and in what way your organisation has contributed to that end.
So we want to understand in this first meeting what the main challenges are that need to be analyzed if we want to promote responsible terms of service. What are the main issues that we should tackle in this exercise?
So again, I can start just by mentioning one. I think one of the big problems that we will increasingly be exposed to is the rise of automated enforcement. There will be algorithms, essentially, making decisions over us that have a substantial impact on our rights. And one of the big challenges I see is how and to what extent we can incorporate safeguards into these algorithms so as to respect our human rights. I think this can be done by including the safeguards in the terms of service, so that at least if the algorithm doesn't respect the terms of service, we can hold the companies to the standard they set there.
So this is just the start of the discussion. We wanted to hear essentially what you see as the main challenges and what you propose this coalition should do to tackle them.
So with this I open the floor.
>> LUCA BELLI: Do we have roaming mics to pass? No? Well, we can open the floor and we can start with some remarks. Jan, you wanted to add something or we can...
>> JAN KLEIJSSEN: Since there are several companies here and the question was asked, it would be very interesting, I think, to now give them the floor and to see what is being done on the issues that we just mentioned.
>> NICOLO ZINGALES: Does any volunteer want to start with any practical questions?
>> JOHN CARR: John Carr, from the European NGO Child Safety Online. And I apologize for arriving at this conversation perhaps later than many people in the room, so I'm not up to speed on all of the things discussed in your opening remarks, but just a rather obvious point. Children and young people, that is to say legal minors, are people too, and they have human rights, too. And I wonder to what extent that dimension would be incorporated in the directions that, say, you are referring to, or that others have taken into account when considering this question.
And these rights go in both directions. There are rights to be protected, but also rights of access. It is a distinct and particular dimension, and it has legal dimensions to it. They are explained in the (inaudible) Convention and also in the UN Convention on the Rights of the Child. But there are basics here that I wouldn't want to be overlooked.
>> NICOLO ZINGALES: I think Rikki has a question or suggestion.
>> AUDIENCE: Thank you. So my name is Rikki Jergenson. We do human rights monitoring nationally, and we also do research where I'm located, and we have done work on human rights and business for many years, but only recently in relation to the ICT sector. First of all, I think the creation of this coalition is very timely. And as you may have noticed, there are really a lot of workshops in the programme here at the IGF addressing this topic from various angles. I've been working on it from different angles for a long time. But I think one of the points that Rebecca made was really spot on in terms of what the core issue is at this particular point in time: you have the Ruggie guidelines that a lot of companies are referring to, and it seems that a lot of Internet companies in particular are framing their activities and their discourse in terms of human rights compliance.
But it's very much, like you said, in relation to States. It's very much in relation to potential State actions that violate international human rights standards, and the discussion is not nearly as advanced when it concerns corporate policies or practices that are not related to a State, but simply related to business interests: exchanging user data, terms of service. It could also be other moral or ethical causes they have taken on.
So I think that's really very core. It took me a while to figure out that when I was talking, for instance, with participants from the Global Network Initiative, they have all these nice norms and guidelines, and they talk about human rights compliance and freedom of expression and access to information, and at the same time you can point to a lot of concrete practices, terms of service maybe being the most well known of them, that are clearly not in line with international freedom of expression standards and that are driven by other agendas. Really getting that up on the table and being able to address it in an open, frank, forward-looking way, frankly, I think is very important. Plus now we have the Ruggie framework that we didn't have a few years back.
>> LUCA BELLI: Excellent. There was another comment from the lady there?
>> AUDIENCE: I think -- sorry, I have a cough. Just to emphasize the point that has been made already: another thing we have to focus on is the issue of transparency. What kind of information is available? Do we know how decisions are made, who makes those decisions, what kind of training people get? I think that's the first thing.
And the second thing is to really think about users as differentiated. Often, even where processes and mechanisms are clear, they make assumptions about the ability of people to respond, whether in terms of language or in terms of access. People who don't have good access to the Internet experience particular violations. So we have to think about that.
>> LUCA BELLI: Do we have immediate reaction from the keynote speakers or do we want to open the floor for reaction from other stakeholders? Rebecca.
>> REBECCA MacKINNON: Sure. Just briefly, to address your remark, John Carr: there is always this kind of tension between free expression advocacy and child protection. But this is where stakeholder engagement, by all stakeholders concerned with all the different aspects of human rights, in formulating terms of service and in formulating enforcement practices, is so important. And actually, that type of stakeholder dialog around very specific problems and questions of a particular platform, trying to get it right, is perhaps an opportunity to bring together different stakeholder groups that are sometimes at odds with one another, around solving a particularly difficult protection question, which is how do we not treat adults like children, and all of those difficult questions. But it's tough, and it shouldn't be ignored.
>> AUDIENCE: Thank you. I'm the editor of Netopia. We are a Brussels-based web magazine and think tank. First of all, let me echo that I also appreciate that this topic is being discussed at the IGF. My previous background is in the games industry, and I would like to share an experience from working with child protection in the games industry.
There is a system in place which some of you may be familiar with, called PEGI system. Pan European Game Information system. And there are some mechanisms there that I think would be a good inspiration for this process.
So, first of all, there is control at the point of sale or point of delivery. You have to buy the boxed game at the retailer, and the retailer will see if the person buying it is underage. So there is a mechanism for control there.
The system itself has an oversight committee, the PEGI advisory board, which holds a sort of annual meeting to which the board and the operators are accountable. That speaks to the conversation I have heard over the last couple of days about accountability: having an annual meeting like that fixes a lot of those issues, and that is part of the PEGI system.
And there is oversight by the European Commission. Every couple of years the PEGI people have to go see the commissioners and answer questions about what they are doing to increase knowledge of the PEGI system among the general public, etc.
Bear in mind that this is an entertainment product. It has nothing to do with trafficking, child abuse, personal photos being distributed, or big data tracking of personal information. So the challenge is less severe than what we're talking about for Internet services, but the control system is so much more developed. So I think these three mechanisms could be a good inspiration for this conversation going forward.
Thank you.
>> LUCA BELLI: I would like to highlight that the goal is not only to gather suggestions; the goal should be to develop model contractual provisions that define these mechanisms and that can be implemented, inserted directly into the terms of service. And as was also said this morning, I made this comment then, so sorry for repeating myself, what is essential, what is key to restoring trust in online services, is that users can have a connection, a visual connection, with the provisions.
So, for instance, little labels or badges, like the Creative Commons badges, would allow the user to understand that a certain mechanism is really in the terms of service without having to read four or five or six pages of description and without having to understand everything written inside. So what is essential is to model contractual provisions and to link them to specific labels that allow the user to understand their existence, and so restore trust in the services that use them.
So do you have any other remarks maybe on this specific topic or on your personal experience in your companies, in your usual activities with regard to platform responsibility?
>> NICOLO ZINGALES: This is an important clarification: the final aim is to produce a document. We cannot focus on all human rights within the next few months. So hopefully we will continue this conversation on a regular basis on the mailing list, to which everyone is invited to subscribe. But I think one of the key challenges and key objectives of this day is to identify which particular rights and which particular issues we want to focus on, precisely because we want to have a practical focus, as opposed to just enunciating principles, which maybe the Dynamic Coalition on Rights and Principles has already done.
>> AUDIENCE: I'm Emma Llansó, Center for Democracy and Technology. I just wanted to offer a bit more information about the report that Rebecca mentioned, which CDT and the Berkman Centre did several years ago. We took a look, after several high-profile incidents of major social networks taking down activists' profiles or deleting their content, at cases where the reasons weren't really very clear, or where the reasons could be found in the terms of service but did not seem to be enforced consistently across the board on the platform. What we found as an important first step for protecting individuals' freedom of expression rights as they use these platforms is this point of clarity, consistency, and comprehensibility of terms of service: making sure that what the platform decides, and what is not allowed on the platform, is clearly identified in a way that users can understand. Because many of them are not employed in the reading of terms of use like many of us in this room.
And, also, consistent enforcement when there is a suspected violation of the terms of use, so you don't have an ad hoc response where some flags on content are responded to and others are not. Otherwise it becomes unclear how bound by the terms of service the platform considers itself to be, and it undermines the opportunity for redress and for appeal that has been discussed. And along with that, one of the issues to guard against is abuse: whatever the key concern is, whatever is articulated in the terms of use as allowed or disallowed content will then be used by others to silence, which is not a new concept for anyone in this room. But these are some of the things that we identified as crucial issues when looking at how platforms implement the terms of service they develop.
>> NICOLO ZINGALES: It seems like the ability to appeal and to obtain an effective remedy is definitely one of, if not the most, important issues in this space.
Any other?
>> LUCA BELLI: Gabrielle?
>> GABRIELLE: Just a note on how to draft the model terms and conditions.
We work a lot on conflicts of rights: conflicts between freedom of expression and the right to privacy, or hate speech online. And what we find is that a lot of the time there is a lack of clarity. So, of course, it's great to get standards or provisions that are as clear as possible. But the problem is that ultimately you always end up with a conflict of rights, and a lot of the time you are in a gray area, so you might fall on either side of the line. So you may end up with someone who disagrees with the decision made based on the terms and conditions.
I think it would be useful to refer to the international standards. For example, in relation to the conflict between freedom of expression and privacy, there is fairly well developed case law at the European Court of Human Rights on the sort of criteria that can be used. And I think it would give quite a lot of legitimacy to these appeals processes if they relied on these fairly well-established criteria. You can even imagine a sort of screening checklist, just to make sure that the person who is assessing compliance is looking at these different factors and trying to get to a fairer decision.
Thank you.
>> NICOLO ZINGALES: Is the microphone somewhere? Maybe it's better to use the mic there.
>> AUDIENCE: I'm Sasha. I'm with Greenhost. And I'd like to push the discussion a bit back to the platforms themselves. I think effective remedy is the major point we have to solve. But it's also a very hard point to get across to companies. So I think we have to come up with something that sounds to them, and to us, like a solution and not like another problem.
I think in other fields, for instance insurance, companies have actually teamed up to have some kind of appeal board that functions for multiple organisations. I think that could be an interesting thing to explore if we can have some kind of appeal mechanism that is actually outsourced in the sense that it's not a company's problem.
Then we need to have appropriate feedback loops, especially because a lot of the takedown actions are automated, driven by algorithms. Those algorithms will have bugs. If you look at the gap between what they are trying to achieve and what they achieve in real life, we need to see how we can make sure that over the years those algorithms will be improved, et cetera.
So I think that we, as a Dynamic Coalition, really need to look at how we can come up with tangible solutions that make the life of those platforms easier and at the same time solve the problems we find very urgent to solve.
>> CARMEN: I have more of a comment.
I think it's timely to have this, because I just read a paper last week where the researchers suggested that reading all the privacy terms of the services you use during one year would take 65 working days. And none of us has 65 working days.
So you could have a privacy label based on these criteria, and then a different set of labels for how complaints are handled: notice and takedown, what kind of system do you use, an automated system? I don't think you can start out very broadly with the terms of service as a whole. However, it makes sense to start first with privacy, because there you can actually see alternatives for every single problem: it's either this or this. Do you store data? How do you encrypt it? What do you do with it?
So I think it would also be very valuable for Internet users to open a page and see at the bottom that, okay, it has a blue privacy label and a green notice-and-takedown label. So that would be it.
>> NICOLO ZINGALES: Yes. I just want to mention that there is already something similar called Terms of Service; Didn't Read. But that, as I understand it, has a broader focus because it's a consumer protection oriented mechanism. Instead we want to focus on human rights, so we will have a more concrete focus on what the mechanisms to ensure the protection of human rights are.
There is a question here by Nic. But it is an excellent suggestion.
>> AUDIENCE: I want to follow up a little bit on both of those comments. And you'll have to excuse me, I'd like to try something that I don't normally do: I'm leaping to the defense of platforms. Because I don't think we want due process, not in any way that we understand that term. When we look at terms of service, they are always drafted to give maximum discretion, to enable platforms to control what happens on their networks. And if we take the sovereignty metaphor further, that is incompatible with rule of law values, which are fundamentally a restraint on arbitrary power. So the problem, or a large part of the problem, is that we don't actually know what we want. And by that I mean there are significant costs to due process: there are barriers to entry to the market, and there are costs to innovation as well as static costs. And without understanding the costs that we're talking about imposing on intermediaries, on platforms, I think it becomes difficult to talk about the abstract things that we want.
Because in the end it will be detailed and very highly context specific in order to understand how we might want to see this space. Because a legitimate space that has no innovation or services is also not really a space that we want to create.
>> AUDIENCE: Hi. David Ferris with 21st Century Fox. Sorry for those of you who were at the morning session, but I think it's important to reiterate the point that I made there.
I recognize that oftentimes there is a balancing act that needs to be done between or among different human rights, and there is some significant jurisprudence around that. But at the same time, there are clear cut cases where sites are dedicated to illegal activity. And we need to be clear that it's laws defining illegal activity that are consistent with democratic principles; we're not talking about political censorship or things like that.
For example, the pirate sites that I mentioned, which are facilitating child pornography; the same sites also host ads for prostitution and malware. There should be easy solutions for those. And there is a responsibility across the entire ecosystem to work together to address that type of clearly illegal activity. I don't know how it fits into the terms of service work that you are doing in this Dynamic Coalition, but there has got to be some way it can be brought in. And I'm talking about our own interests, but then there are also the fake pharmaceutical sites and a whole host of sites that are dangerous to individuals.
>> NICOLO ZINGALES: Just to clarify: clear cases of obviously illegal websites would probably be something that falls under intermediary liability, if you don't act to protect the rights of the people who are being harmed in those cases. So it wouldn't fall under the terms of service framework that we are trying to design here.
>> AUDIENCE: There could be links to those sites dedicated to illegal activity on platforms, et cetera. So that would seem to me to fall in some way under the terms of service of those platforms.
>> REBECCA MacKINNON: Isn't that a law enforcement issue or is it a private enforcement?
>> AUDIENCE: I think all -- the entire ecosystem probably needs to work together, because law enforcement isn't going to have the resources to do all this.
>> NICOLO ZINGALES: There are cases where the conflicts are not that problematic; it's clearer. I think what we can do here is, indeed, on the basis of the human rights jurisprudence, identify what those clear cut cases are. But in the case of blatantly illegal activity, there is no need to worry, because that would be covered by standard liability laws. But maybe others have other ideas.
>> LUCA BELLI: Let's not be over ambitious and try to solve any kind of cybercrime problem with terms of service.
>> AUDIENCE: I want to go back to the point about using Creative Commons style icons or something that is easy. The truth is, if you look at Terms of Service; Didn't Read or whatever the website is called: I read the terms of service, and in many cases I don't think the platforms want users to really know the extent of the flexibility their language has been written to allow. So putting up a logo saying "we can take you down at our discretion and we can do this and that" is probably not in their best interest. And this is probably going to have to be voluntary, and it's going to be a tough ask.
What we might make is a higher level ask, which would be: in your terms of service and your general operating procedures, maybe you can commit to the Universal Declaration of Human Rights, as good corporate citizens. So if you are a search engine and you're linking to illegal pharmaceutical websites, even if by the letter of the law what you are doing is not illegal; in the case of Google it cost them $500 million in the United States.
There's a next level down, which is: are you doing the right thing? And I think some corporations will step up and say yes, we want to be good corporate citizens and we have tried to design our terms of service around that. That may be a way to try to lure some corporations into this voluntary process. I don't know.
>> AUDIENCE: This is an exercise that is useful and interesting, and I come here more to listen than to give input; I would really like to see how we can solve this problem. Because, again, I'm sure that we can have icons, we can have shorter terms and conditions, we can have clearer wording, but I'm sure that there will be challenges in any case.
I mean, I can tell you that we have normal lawyers, we have normal people at Google. We try to solve the problem of simplifying the way that we express our terms and conditions. But I can tell you that whatever conditions we come up with raise controversy. So we have more to learn from this process than input to give. This is why I'm here.
On the comment on illegal content, what solution is there and what can we do in terms of responsibility for pharmacies and technical measures? Again, that's another area where we should probably take a step back. We fight because we want the users on our platform to have the best experience, and we try to do our best to make this happen: closing accounts, collaborating with authorities, making sure there is no way for females to be exploited.
But at the same time, the question of the balance between becoming the voice of the Web and not doing anything is out there. Probably we need to listen more and do things to make sure that we can improve. But the solution is not easy to find.
>> NICOLO ZINGALES: Just a question, since you are speaking. The right to be forgotten was mentioned, and Google will have a huge challenge to deal with in its implementation. Will this end up in the terms of service? Do you know if there is anything that the Dynamic Coalition could do, for example, to facilitate a human rights compliant implementation of this?
>> AUDIENCE: Sure, ideas are welcome. If we can find wording that can be used to make the implementation of the right to be forgotten effective, that is going to be terrific. I mean, we are not coming here saying that we have all the answers. Actually, in the last months we have shown that we want to do our best to comply, for example, with the decision from the court. But we also want to hear whether what we are doing is in line with expectations. That's why we have the advisory committee in place, this is why we are having public hearings, and this is why we are listening to everyone. Feedback is actually very welcome on this.
>> NICOLO ZINGALES: Was there another question? Yes?
>> AUDIENCE: So I don't want to derail the conversation, but this is the second time I have heard prostitution and sex work and all of these things thrown about as unproblematic areas of legality or criminality and then used as a way to sway toward strong-arm responses and takedown procedures. I would like to say that these things are complicated. Sites struggle. In fact, sex work is not that simple: solicitation may be legal while the act of sex work may be illegal, and what counts as sex work and prostitution is still being contested. The same goes for something like abortion: for example, Google does not allow abortion-related content in some countries, but in other countries abortion is legal. I don't want to get into that conversation; I just want to say that it is complicated.
So my actual point to add into this conversation is that I really quite like the idea of trying to push for service providers, or social media companies at least, to have some kind of, how do you put it, a statement of principle: we adhere to principles of human rights, and that's your basic standpoint; before we do anything else, these are the principles that we try to respect and adhere to, and here are the different mechanisms for trying to deal with these things. And maybe there is peer ranking that can happen between different social networks: to what extent are we doing the job that we set out to do, according to these principles that we are trying to address?
I like Rebecca's idea of getting the different human rights sectors to be part of the conversation; human rights isn't just one monolithic thing.
>> AUDIENCE: Paola Anteri from WIPO. I'm sorry, but I need to ask you some more questions, just to get a better understanding of what we are really discussing here.
The first question is really about the scope; it's an extremely ambitious project. Are we talking about all human rights? Some human rights? Are we talking about other rights, like copyright, which relate to human rights as well, or not? What is the scope? That's the first question.
The second one is really on the process. The Dynamic Coalition has the objective of drafting recommended terms of use. How is that going to happen practically? Who is going to do the first draft? How is the multistakeholder approach going to work in practice? How do you make sure that all players are contacted and welcomed to contribute? And maybe this is not the purpose.
And, finally, a specific question: would this draft include or provide for something like dispute resolution when there is a case, for instance a human rights violation case, say, of privacy? Would it include a dispute resolution mechanism? Do you foresee something like that? And if you do, how do you build in the concept of neutrality? Who would be the neutral third party to take a decision on whether there is an infringement or not?
>> LUCA BELLI: Those are excellent questions that lead us to the second part of the meeting, which is to establish a roadmap on how to do this practically, and we are very open to all of your suggestions. I think those are exactly the questions we need to address.
>> ELLEN BLACKLER: Related to that, I think it would be helpful to think about the different kinds of platforms. In the end, how this would be applied would be different for a cyberlocker versus a social media platform. So it would be good to start parsing that out.
And also, about the point David was trying to make: I don't know what point he was trying to make, and he can speak for himself. But I think it's complicated, and that's one of the reasons it can't be simple. I'm not sure it's going to be easy in practice to separate what is a law enforcement matter and what is not, and that will continue to be an issue to talk about as we go forward.
>> AUDIENCE: I think what I was saying is that the terms of service and the operating procedures of many corporations are based on the letter of the law. They follow the law. And it would be nice if they took the position, as this young lady just said, that perhaps our terms of service and our corporate operating procedures are in line with the Universal Declaration of Human Rights. And even if the law is written so that we can get around it, we won't, because we want to be good corporate citizens. That was sort of the message I was trying to convey. And start at the highest level and then work down, because if we start at the bottom, we're not going to get anything out of this group.
>> LUCA BELLI: Allen.
>> AUDIENCE: Allen Bart with the Human Rights Project. I want to get to the second part of the meeting because we don't have much time left. And I would encourage us to add to the concrete output that you're foreseeing about terms of service provisions: we can also use this platform here at the IGF, in a more structured way and also a governance way, to encourage discussions of different kinds of initiatives to promote how well platforms protect the human rights of their users. So I hope that will be a clear objective of this Dynamic Coalition as well.
>> LUCA BELLI: One last comment by Sasha.
>> AUDIENCE: So I just wanted to bring forward something that we have done in the Netherlands about notice and takedown, where, after some case law, intermediaries had to take down content that was either illegal or unlawful if they were requested to, and they saw the content and felt they had to do it. What we did as a sector was create guidelines that providers could implement to streamline this process, because sometimes it's very hard for people to see if a complaint is actually a valid one. We have had complaints to take down books whose copyright had already expired, like 100 years ago, et cetera.
So it's good to have these kinds of practices, because if these kinds of instruments are in place, they actually save a lot of time for platforms, because you have structured ways to handle this type of information. So I think that would be very beneficial, maybe not for the big five platforms, but at least for the long tail.
And I think we should also make an instrument that would be useful for the long tail, because the long tail today might be one of the big platforms in the future.
>> LUCA BELLI: Exactly. That is a good point, and one of the reasons for developing these model provisions is maybe to help a startup gain a market, because that startup is protecting its users' human rights. So that is the next point.
So, coming back to the second part of our debate, we have to identify the key challenges or the key rights. As Nicolo said before, it's not possible to elaborate a complete set of provisions in a couple of months. It will probably take from now until next year to elaborate a couple of provisions that have effective mechanisms inside. So I would like to open the floor for suggestions: which are the two, maybe three, key provisions, key challenges, that have to be faced at the very beginning of this project?
>> REBECCA MacKINNON: I just wonder if I could make an alternative suggestion, which you are welcome to reject out of hand. Just having observed some efforts to draft sort of ideal terms of service and so on, and knowing how difficult that is, I wonder, depending on the interest of the group, whether another approach might be, I don't know if "draft" is the right word, to make some recommendations on what company due diligence ought to look like on terms of service. Make some recommendations on what stakeholder engagement around rights of users in private enforcement could potentially look like. Or make some suggestions for experiments or pilot efforts by particular platforms in carrying out due diligence or improving the human rights grounding of their terms of service. That would be an alternative way to go. I mean, if you have a group of people that are really primed to do drafting of terms of service, then please go ahead.
But if there is also interest in perhaps -- maybe there is a group of people who want to draft ideal terms and conditions, and there might be another group of people who might be more interested in perhaps suggesting some processes or some -- just engaging in discussions with companies about what an engagement process on some of these issues might look like or what a due diligence process might look like. That might be an alternative stream.
>> NICOLO ZINGALES: Yes, I think those are indeed the two alternatives. Either we go for very specific terms of service, maybe on one or two specific rights, or we go for a broader, higher-level view, perhaps through some sector guidance, as you suggested.
But I want to mention that we are just starting, at CTS, the university where I'm working, a review of terms of service, which will enable us by the end of this calendar year to come out with what the best practices are across a high number of platforms. That's why we thought we could use this sort of empirical look to inform the process of drafting.
And given that we have the Dynamic Coalition, we can maybe circulate the results, and then on the basis of that see what people suggest and come out with better, even better practices or a system of scoring. So that was the idea there.
But of course this shall be decided by the Dynamic Coalition; it will be a collective effort to decide our roadmap. I think we will expand on this also through the mailing list, which everyone is invited to sign up to. It's [email protected].
But given that we have another ten minutes, any other suggestions?
>> AUDIENCE: I was going to make a similar point to Rebecca's. Just personally, I spent three or four years of my life trying to come up with a set of substantive values that might be applicable. Coming up with substantive values is hard, which is why we often turn to process instead of substance. Perhaps that might be a way to get better buy-in. We might be able to come up with a set of principles, but as the point was made before, if they are not practical then they are of very limited significance.
And I wonder if I might suggest focusing on some of the procedural issues: consultation, transparency, those sorts of aspirational things that can actually add clarity. If you are looking for a base set of substantive rights, then things like the unfair terms directives probably give you the closest thing to a base set that you can actually use: things like the limits on absolute discretion and on termination of a contract, which are process based rights rather than substantive rights.
>> NICOLO ZINGALES: Is that the human rights perspective? That's a question.
>> AUDIENCE: Yes, I think there is a human right to certainty about how the rules apply to you; I think there is a clear human right in living according to the rule of law. So it can be brought into human rights, yes, but the difference from consumer protection and the commercial directives needs to be taken into account.
Gabrielle.
>> AUDIENCE: I agree that if we're going to do terms and conditions, one section would be on appeals or redress mechanisms at some point.
But on the substantive part, personally, I just think it would be really hard to draft something. Even if you look at the Universal Declaration of Human Rights, it's broad; there are certain restrictions, but it's the case law that has said, well, in those circumstances this is lawful under international law or not.
So personally, what I would see as being useful, even as a user, is having guidance on how the particular terms are applied. Maybe by referencing the criteria in this area, but also having case studies and examples: this is how we would approach it; we don't guarantee this is how it will be in your case, but just giving examples.
>> LUCA BELLI: I think one of the first steps, regardless of the direction in which we are moving, is to have a compilation of all the different kinds of approaches that have already been put in place.
>> AUDIENCE: It could also be useful to have meetings with, or to hear from, lawyers who have worked in the past on terms and conditions, to hear the challenges that they were facing. Because I'm sure there was some kind of pressure to fit things into the boxes that we need to fit when we work for a company.
I've done this exercise before, not for Google but for another company, and I can tell you that it's not easy. There are specific challenges, even when you talk about Article 19. Article 19 allows restrictions for defamation, and God only knows what you can do with defamation online and offline. So there are specific challenges in drafting terms and conditions. Even before and beyond the due diligence check, I think it would be good to understand more of the challenges that you face when you draft terms and conditions as a corporate lawyer.
>> AUDIENCE: Particularly on this: we have talked about both the big five platforms, or the very large international companies who host a lot of content, and also a real interest in providing guidance for the long tail. And I love both of those ideas. I think those two buckets of platforms will probably end up having different implementation issues. For the massive platforms that receive hundreds of thousands of requests and host millions and billions of pieces of content, scale, and how you implement at scale, is a huge driver of what you put in the terms of service. So being able to have those conversations with both the big and small platforms, and really draw out how implementation questions affect what they are willing to put into the terms of service, would be hugely informative for this group as we talk about what kinds of recommendations would be effective and useful to the platforms.
>> AUDIENCE: And another way to think about dividing up the work, in addition to the split Rebecca made between process issues and drafting: even within the drafting, getting an understanding of the substance and this issue of how to present it to the customer are pretty separate. I would do the substance first and then worry about the way we present it to the customer. We have been around this horn on the privacy side for a long time, and it gets confusing if you're trying to do both at the same time, since both are difficult to solve.
>> LUCA BELLI: So --
>> AUDIENCE: Maybe I missed what you were saying earlier. But if we want to remain engaged in this process going forward, what is the mailing list or how do we do it?
>> LUCA BELLI: You can find information on platformresponsibility.info. And if you want to join the coalition, you have to send an e-mail to [email protected].
So, in order to decide how we proceed, whether we go towards a set of recommendations or start by analyzing a couple of provisions: we could try to do both, but that would, in my view, be quite ambitious, unless we decide to first do a set of recommendations and then some concrete provisions, which could be a third option. So, to be democratic, I think it would be better to ask the audience, maybe by raising hands: how many people here think the best way would be to start with, let's say, two concrete provisions that could be inserted into terms of service, as opposed to starting with recommendations, as proposed by Rebecca? Could you please repeat your proposal on the recommendations to make it clear?
>> REBECCA MacKINNON: I don't know that I was presenting it as a binary, that we have to do one or the other. Maybe there is one group that is interested in drafting and another group that is interested in recommendations. And I doubt that everybody would want to be involved with both, because of different sets of expertise and different concerns and interests. And to get maximum participation: if you just have drafting, what is going to happen is that a small group will go away and draft and then bring it back and just ask everybody yes or no, give us some track changes or something. And that's great.
But if you want to be a Dynamic Coalition, that is not giving everybody an opportunity to participate with their different competencies and concerns. So it would be great to have a couple of different ways that people can contribute. But, you know, I may be an outlier in this view, so feel free to do something else.
>> LUCA BELLI: In the coming days we can use the mailing list of the Dynamic Coalition to identify which individuals or organisations are keen on undertaking the development of the provisions and who is keen on drafting the recommendations.
>> AUDIENCE: My question is: what is the exact scope of the recommendations? The point is, I think right now there are a lot of projects represented in this room that are aiming to come up with a set of policy recommendations related to intermediary liability. My understanding of this project was that it was supposed to produce a set of model clauses; recommendations are something different and are going to overlap with all those projects. So I'm trying to understand.
First of all, what is the scope of these recommendations, if you are going to move in that direction? And I'm trying to understand whether it makes sense to join forces with those projects and then use a set of recommendations that may be produced elsewhere as a basis for the model clauses. So the first question is: what is the exact scope of these recommendations?
>> REBECCA MacKINNON: So I think this is something that would have to be hashed out on the mailing list. There were a number of ideas that were just aired. Perhaps I came to this meeting with a wrong understanding; I didn't realise that there was already a plan in place that people were just going to implement. I thought this meeting was to discuss what the Dynamic Coalition wanted to do, and generally you don't hash out the scope in one meeting. But, again, maybe I totally misunderstand the purpose. If it's really just to support your project and get multistakeholder approval for it, then that can be the purpose.
But again, the reason I was very interested in this Dynamic Coalition is that there is a gap: terms of service and private enforcement are not being handled well on a number of levels. One is the language. Another is the whole set of practices around the language and its enforcement. And I think there is a certain set of people in this room today who are very concerned with that set of issues.
Now, I did not come with sort of a detailed work plan prepared to present to the group, because I thought we were going to discuss it and then find out who is in the room and then continue the conversation to decide what the scope is around practices. But if I'm mistaken, I'm happy to --
>> AUDIENCE: No. But my question wasn't addressed to you. It was addressed to the Dynamic Coalition. Because now I see that there is this new option on the plate. And I want to try to understand if the scope of this recommendation is potentially going to be different from what we discussed. And I want to understand whether it makes sense to have a specific set of recommendations specifically related to this project. That's just what I want to understand.
>> NICOLO ZINGALES: I just wanted to clarify. It's a fair point. Thank you for bringing that up. I only mentioned the project that we are starting as a possibility. We have the capacity, and it is still very much being shaped. So on the basis of what we decide here, it could be oriented towards one solution or another. So there was in no way a predetermined plan. The only final idea or ultimate objective that we had was to have eventually some model clauses, but that may well have to go through earlier sets of recommendations, engagement, talking to people and finding out what their challenges are before coming out with the actual model.
We have time maybe for another two questions.
>> LUCA BELLI: Very short and concise, not a question but remarks.
Also, because we are almost running out of time, for the sake of clarification, we have a mailing list. This is an inception meeting. Obviously we cannot solve anything today. We have a mailing list exactly to discuss this. And I think that there will be some wrapping up with some clarifications after these remarks. But obviously we cannot resolve every problem today with this meeting.
Sasha.
>> AUDIENCE: Yes, I just wanted to quickly say that I do like the idea of having a set of recommendations and then using that recommendation to have like a model implementation of the recommendations as something that people can just use. It makes it concrete but it gives people the opportunity to implement their own versions of it if they want to. So that is the thing that we should try to do.
And I would urge really ending up in the end with a model contract, or model terms of service, as one of the outcomes. Because the drop-in stuff is important, especially if that solves problems for upcoming new players, et cetera.
>> LUCA BELLI: Excellent. Was there another comment? Rikki? No? Do you want to say something? No. Okay.
So I think that just to wrap up a little bit, and then I will leave the floor to Nico to wrap up definitively. To me, the best way to proceed would be to, first of all, create a sort of compilation of all the projects that have already been undertaken, because that is a basic first step.
Secondly, start developing a set of recommendations on the basis of which a set of model contractual provisions could be developed. To me that is the most logical approach. I would like to know what the other members -- not members of the coalition, but attendees think about this. And Nico if you have any remarks?
>> NICOLO ZINGALES: No, I wanted to say this effort is important, and what differentiates it from others is that it is a multistakeholder effort. So it would be particularly important to have, as Rebecca mentioned, more of the private sector and more specifically more platforms involved, so that, you know, we can have more recognition in the market. So I was pleased to see someone from Google. We had someone from Facebook who signed up as a member of the Dynamic Coalition but was unable to attend. So in the future, if you come across platforms that might be interested, that would be very much welcome. And we hope to spread the word and have a broad membership. This would benefit everybody: the IGF and the debate on platform responsibility.
So with this I think --
>> LUCA BELLI: We can decide by acclamation the plan, which would be, I repeat, a compilation of projects, recommendations, and an elaboration of model contractual provisions, discussing this on the mailing list. Who is in favor?
(Showing of hands)
Well, quite a few people are in favor. Okay. We are running out of time. Do you have anything else?
>> AUDIENCE: Just one thing to clarify, because I'm not sure. Before, you said that what you wanted to do is to draft those recommendations and from that work on those model provisions. That is not exactly what we discussed before, where we said we have different ideas and we will decide which of these people want to volunteer for and be involved with. So I suggest that you leave it that way and leave in more flexibility. Otherwise I don't think we will have the necessary support to work in that fashion, with people saying this is where I want to be, this is where I want to be.
>> LUCA BELLI: That's an excellent suggestion. I think we can work with this. I'm perfectly in favor of both alternatives. Maybe this one allows more flexibility, so I think this one could do better. Is everyone in favor of this?
>> NICOLO ZINGALES: I think we should keep it open. Sometimes things work out of inertia. If someone proposes something, then we can have this proposal consulted and eventually decide upon it. So for now this is the proposal. If we want to change, someone else has the burden of coming up with another proposal.
>> AUDIENCE: A final suggestion. In my view, I think it would be best to avoid fragmentation of the drafting of the recommendations. So in order to be as ready as possible, we should be able to join up the projects that are now on the plate and the draft recommendations. Now I see a new drafting for policy -- there is another project on the plate and another one that we discussed the other day. This is my point of view. And I think fragmentation should be avoided. We should be able to join up those projects.
>> LUCA BELLI: Thank you very much for all of your extremely fruitful and interesting suggestions. Thanks a lot, and let's keep on discussing this on the mailing list.
Thank you.
(End of session, 4:05)