IGF 2017 - Day 2 - Room XVII Plenary - High Level Thematic Session 'Impact of Digitization on Politics, Public Trust, & Democracy'


The following are the outputs of the real-time captioning taken during the Twelfth Annual Meeting of the Internet Governance Forum (IGF) in Geneva, Switzerland, from 17 to 21 December 2017. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 




>>  Sound check.  1, 2, 1, okay.  Ladies and gentlemen, welcome.  Thank you for being here today.  Welcome to this high level session.  It's a pleasure and honor, of course, to discuss such an important and complex matter as the impact of digitization on politics, public trust, and democracy.  There's not a day that goes by without this issue making the news headlines.  It's everywhere ‑‑ the benefits and challenges, of course, of digitization. 

During this session, we'll be discussing both aspects:  The benefits in the first group of panelists and the challenges in the second group of panelists.  We want this session to be as interactive as possible.  That's why I ask our panelists to be as short and direct as possible so we all get a chance to debate. 

I'll be asking you, the audience, of course, if you have some comments or questions.  And, of course, I'll be moderating this with Katharina Hoene, our remote moderator.  She will be letting us know what is happening outside this room with the participants that are following this high level session on the net, and she will let us know if there are some comments to be brought in. 

So thank you so much for being here.  What are the ways of strengthening the benefits of digitization ‑‑ how can digital tools promote democratic participation?  On the other hand, what are the risks and responsibilities of the stakeholders?  These are the questions we're going to debate today. 

Before we start the debate, I'm very happy and it's a pleasure for me to introduce our host and chairperson, Mr. Philipp Metzger, Director General of the Swiss Federal Office of Communication.  He is a privileged observer of the digitization of society, especially in the field of communication and information, and in particular in a country that values enormously direct participation in democracy.  Philipp Metzger, we're happy to hear your intervention as an introduction to this debate, please. 

>> PHILIPP METZGER:  Excellencies, ladies and gentlemen, distinguished colleagues, panelists.  Good morning to everyone.  I'm delighted to be here with you today and to welcome you on behalf of the Swiss government to this high level session of the United Nations Internet Governance Forum 2017.  The topic of today's session is particularly dear to my country as Switzerland has a long tradition of direct democratic processes, active citizenship, and engagement in dialogue. 

The digital space can be a great enabler for more inclusive democratic discourse and participation as well as more inclusive policy making.  At the same time, the misuse of the digital public policy space can lead to information disorder, mistrust in public information, and misrepresentation or manipulation of public opinion. 

But let me first talk about the benefits of digitization that can be brought to democracy.  Digitization, especially social media platforms, empower citizens.  Today Civil Society movements and campaigns are much easier and cheaper to organize, and information flows faster. 

The financial power of corporations and interest groups can be outbalanced by the power of crowd funding.  Furthermore, social media are giving a voice and a rallying tool for citizens to demand for more accountability. 

Authorities are under closer scrutiny.  They can also benefit from the information provided to them by the citizens.  The digital space also impacts democracy through the perspective of human rights.  Access to a free and open Internet means easier access to information, knowledge, culture, and education, among other things. 

The Internet can also facilitate the exercising of fundamental rights, such as freedom of expression and freedom of assembly allowing individuals to become active participants in the digital society. 

Now, in spite of its empowering potential, digitization also brings about new challenges to political processes, public trust, and democracy.  Social media platforms have given voice to the many, but who can distinguish false facts from true ones, and how, when opinions and feelings are being manipulated?  That is a real question.  Democratic stability and an open society are not self‑sustaining but must be defended again and again under changing conditions. 

We are witnessing today the return of threats that we believed we had left behind us.  Liberal open societies come under pressure.  Despite the most modern means of communication, the time we live in is not only characterized by growing mutual understanding but also increasing insecurity and growing fears. 

We all know the discussion about so‑called fake news.  The spread of information disorder through manipulated or manipulating online communication has raised growing concern among governments, end users, and intermediaries.  We also know that it is reported to be influencing democratic elections.  Furthermore, there is a danger of echo chambers, as people are increasingly only talking to like‑minded people.  Many of these phenomena are not new but have existed in various forms in all periods of human history; social media, however, are increasing their pace and reach. 

Digitization is also leading to a profound transformation of the media ecosystem.  In the past years, information providers have come under pressure.  Online services, as well as changes in the way citizens communicate and consume news, have provoked a major structural change which affects the traditional print and broadcasting media and challenges their business models. 

Young people, in particular, are trusting traditional media less and relying on information and opinions shared by their friends on social media.  So what is the way forward?  Against the backdrop of the opportunities and challenges I've outlined, the question arises as to how citizens, in particular young people, can use ICTs to participate in our democratic societies in an informed manner. 

How can they form an independent political opinion based on verified facts and trustworthy information, and how can they be empowered to orient themselves in an increasingly complex world with more diverse voices, some of which are trying to fuel anger, fear, and even hatred or violence? 

In this regard, in our view, education and digital literacy are crucial.  It is clear that the more digital tools permeate the practice of traditional democracy, the more urgent the need to tackle the issue of literacy in the digital sphere. 

In Switzerland we have trust in Civil Society.  If citizens have access to a wealth of diverse information and learn to deal with it with a critical mind, then we do not have to consider fake news as a threat to democratic opinion and decision making. 

Promoting education and critical digital literacy alone will, however, not be sufficient.  There is also a responsibility among media actors and platform operators to support citizens and end users in finding out whom they can trust and how to identify manipulated information.  It is therefore to be welcomed that different stakeholders have recently started to engage in fact‑checking activities. 

Also, many governments and intergovernmental institutions have realized that they have a responsibility in creating an enabling environment for a trustworthy public sphere and more democratic processes.  Concrete steps include, for example, increasing the technical security of voting mechanisms and providing support for quality, reliable journalism. 

When we talk about responsibilities of governments, let me stress that it is Switzerland's conviction that restricting freedom of expression and information is not the solution to secure stability and to fight against voices that try to stir anger, violence, and hatred.  Rather, governments, in our view, should take steps to earn the trust of their citizens.  This can be achieved by being transparent and accountable, by providing an environment for open and constructive public debate, and by supporting free, quality media. 

Citizens' voices must be heard, and it needs to be ensured that democratic processes will lead to results that improve their lives.  This way, empowered and well‑informed citizens will ignore the voices of those who try to incite unrest, anger, and violence. 

Let me conclude.  What is needed here is a joint effort of governments, the media, the online industry, Civil Society, and academia to shape public dialogue and truly democratic processes that will produce the results that citizens deserve. 

I am looking forward to our discussion and wish us a fruitful exchange.  Thank you very much. 


>>  NATHALIE DUCOMMUN:  Thank you very much.  We're going to start with you, Mariya Gabriel.  You're the European Commissioner for Digital Economy and Society.  We've had a few highlights of how we should strengthen some processes and use some tools to maximize the impact of digitization on society.  What is your view? 

>> MARIYA GABRIEL:  (Speaking non‑English language) Thank you for giving me the opportunity to address you today, Minister, Excellencies, participants, ladies and gentlemen.  I'd like to say that we are all aware of the importance of platforms and social networks.  In many ways they are engines of growth.  But at the same time we have to accept that they can increasingly be seen as a threat, and there is increasing distrust.  Recent surveys have shown that Internet sources of information are considered the least reliable.  One out of two users in the EU limits their online activities for reasons of security, safety, and also to protect privacy.  So respect for privacy and the protection of personal data is a fundamental right for us. 

I think together we need to really work so that we can protect the interests of citizens.  Some things have already been said.  The impact of digitization on citizens' trust and democracy has been considerable over the last few years.  I would like to stress that trust and confidence are not something that is just given.  We need to win them back. 

We've seen that the Internet can have a very positive effect on democratic mechanisms, representation, and inclusivity, but today we've seen a spread of fake news on the Internet which is contributing to, and also reflecting, social divisions and a degree of social unease.  And that can also go so far as influencing political outcomes. 

We believe in the EU that we need to combat fake news, because the stakes are too high.  It's already been said: this is not a new phenomenon.  What is new is the speed at which this is happening and also the scale it may take. 

Fake news is not just a technological problem.  It's true we need to understand how the algorithms work, but it is a political as well as societal issue.  It has to do with our democratic values.  It attacks what is closest to us as a society, and we're trying to defend these principles.  I believe that reliable information is key for democracy. 

However, many people believe that it's very difficult to distinguish between real and fake news, hence, the dangers.  I really think this is what is at stake.  We need to ensure that people have skills and competencies.  The red lines being freedom of expression and the right of access to information. 

We also need to make available to citizens instruments which will enable them to make informed choices ‑‑ informed choices and democracy go together.  Our ambition in the European Commission is to gauge the whole magnitude of this phenomenon, including its economic dimension, and we need to look at it all as a whole.  We need to have a multistakeholder approach.  That can be effective.  That is how we will be in a position to find effective remedies. 

We need tangible, short‑term solutions without overlooking the long‑term solutions.  In the short term we need to stimulate technological innovation and cooperation involving all concerned stakeholders: information platforms, civil society, and public authorities. 

In the Commission we see four major challenges we need to address.  First of all, how can you ensure transparency?  We know that citizens place their trust in the democratic model and in institutions based on trust.  Trust comes from transparency.  So we need to have more information on financial flows as well as sources of information, production methods, and distribution methods of such fake news. 

I never get tired of saying this, but light of day is the best thing.  Let the light of day shine in order to combat this lack of transparency.  Secondly, diversity of information that feeds into a critical spirit.  That's a challenge as well. 

Thirdly, credibility of information.  Reliability and credibility that should be evident for citizens because it's not sufficient to have access to information.  Citizens need indications as to the reliability and credibility of information. 

Fourthly, inclusion.  You cannot have lasting, long‑term solutions without involving all parties concerned.  I would like to focus my attention on two points which were reflected.  We need to strengthen quality journalism and rebalance relationships between the traditional press, broadcasting, and the social networks and platforms. 

One of the consequences of digitization is that advertising revenue has shifted from publishers to these platforms.  It's important to find a balance in this new context and to see how all can be involved in this reflection. 

Secondly, stepping up media education and training.  That is paramount.  It's a key element.  Social media users need to be able to improve their own ability to critically assess the content of information. 

So that is also where we're enabling them to be more aware, more conscious of the consequences of their own online behavior.  This is something that has been discussed ever since the start of this forum.  Make users responsible.  And I think that is also a long‑term solution to combat this scourge of fake news. 

I would also like to share with you our most recent initiative in the European Commission at a European level.  We have started a reflection on fake news.  On the 13th of November I launched a public consultation in that regard, inviting all experts and journalists to take part.  For the high level group of experts, we received 349 candidates in just one month, which testifies to the cross‑border interest in this.  The group will be meeting in January and will be submitting a report to me in spring 2018, to establish whether we can agree today on a minimum set of criteria ‑‑ because we don't have common definitions of fake news ‑‑ and to identify good practices.  By pooling our efforts, we can limit the consequences of fake news and then identify a common approach and common measures to do this. 

There's no miracle solution to this challenge, of course.  We don't want to impose this choice on people when it comes to the use of information because these are fundamental rights.  It's a red line.  We need to look at the consequences that this phenomenon can have on our democratic model and values.  Therefore, we need to do something. 

I would like to end my speech with that appeal to you all.  I think it's our shared responsibility.  It's for our citizens.  It's for future generations, and I do believe that we need to send out a message of support to them but also one of action. 

>> NATHALIE DUCOMMUN:  That was a clear message from the European Commissioner.  Thank you for giving us some examples of how you wish to tackle this main problem, namely fake news and information disorder, which we will be discussing very much, especially in the second group. 

Now that we have had the introductions, let's start debating, listening to all of your interventions ‑‑ as short and concise as possible, so we can all react to one another.  Minister Hasanul Haq Inu, thank you so much for being with us.  You're the Minister of Information of Bangladesh.  Excellency, you gave us a few examples of the benefits and what you're trying to accomplish in Bangladesh for inclusion and for protecting the Internet as a human right.  What would you identify as being the main benefits, notably in your country, of digitization on democracy and politics? 

>> HASANUL HAQ INU:  Thank you very much.  I will try to be brief.  I'll try to come up with certain observations on the digitization of the political process and democracy.  Having said that, Excellencies, ladies and gentlemen, the impact of digitization can be most simply yet effectively understood when we see where power is generated.  Power used to grow from the barrel of the gun.  Then it came from the ballot.  But now power is emanating from the Internet.  The Internet has empowered people. 

It is the Internet that empowers people the most.  Democracy is not only vote centric.  It is a multidimensional concept which encompasses various components of fundamental human rights.  If we look at our constitutions ‑‑ not only ours, but those of many countries ‑‑ the fundamental policy of the state does not only include democracy but also fundamental human rights.  But the trend of politics is still practicing (?) in the name of democracy. 

So people's participation is not ensured.  That is why politics in many countries is either nontransparent and nonaccountable, or transparency is not 100%.  Having said that, I must mention a few impacts of digitization.  Number one, digitization is a glass house we live in.  Two, it is shedding light on the nontransparent aspects of democracy.  Three, it is sharing more information.  Four, digitization is widening the participation of people in the state mechanism.  Five, it is bringing more accountability.  Six, it has a booming effect on mass media, which is becoming a vibrant watchdog.  Seven, digitization is democratizing society in wide terms and wiping out the dividing social gaps. 

The question is how to harness the positive impacts of digitization for a better world for all.  For that I propose every country should acknowledge Internet as a fundamental human right, and right to Internet should be enacted by law and should ensure free Internet for all.  We cannot talk about a digitized globe until it is free for all.  Then all people will be digitally empowered. 

For that, if we can do that, then autocratic bureaucracies can be abolished.  People will become closer.  People will have the capacity to hold information on (?), and hence, every person will be digitally enlightened. 

Having said that, let me conclude.  Digitization will bring the ideals from the pages of the constitution to the real life of the people and will make politics and democracy truly participatory, inclusive, and built on public trust.  Thank you all.  

>>  NATHALIE DUCOMMUN:  Thank you very much.  Thank you, your Excellency.  That's a strong call for the recognition of the Internet as a fundamental human right.  I would be interested to know if all of you around the table agree.  Now to the Secretary General of the Inter‑Parliamentary Union.  Hello, Mr. Martin Chungong.  You've dedicated your professional life to promoting democracy worldwide.  Of course the Internet is a fantastic tool for you. 

>> MARTIN CHUNGONG:  Thank you for giving me the opportunity to participate in this debate.  Yes, we spend a lot of time promoting democracy.  But let me start with the message I want to convey to this meeting.  I want to say that when we look at an assessment of democracy, we seem to think that democracy is losing ground, that it is under threat and assault. 

But I hasten to say that this is only a perception.  It is not democracy as a system of values and principles that is under threat.  It is the institutions that are being challenged daily.  I approach the debate from the point of view of regaining trust.  I think it's something the previous speakers mentioned. 

How can we restore trust in democratic institutions?  I take an institution I know very well: Parliament.  When you do an assessment of Parliaments, you'll see they're very low in popularity ratings for a number of reasons.  They're not delivering on people's expectations.  They're not seen to be transparent in the way they function and legislate.  It's important that Parliaments embody democracy. 

I think that digital technology and access to the Internet give Parliaments the opportunity and tools to regain their legitimacy.  That is a must, especially among the youth.  For this to happen, we want to see Parliaments embodying the core values of democracy, such as representativeness.  Parliaments should be seen as representative, not only in terms of numbers but in the issues they address.  We want them to be accountable ‑‑ not only asking governments to be accountable, but being accountable themselves.  We want them to be accessible to the people and delivering on the expectations of the people. 

We do think that these are all things that digital technology can help Parliaments do.  It brings them closer to the people, especially young people, women, and those marginalized in society.  It can promote the accountability of Parliaments.  The world today is evolving very rapidly. 

Institutions of governance have to react and respond in a timely fashion.  I think that technology allows Parliament to do this.  I've mentioned legitimacy of decisions because the decisions will be informed by the interests of the people as expressed through the media, through Internet, and the like.  Of course, it is important that government and Parliaments be seen as accessible to the people, and digital technology allows this to happen. 

A question was raised as to what stakeholders should do.  I think that Parliaments should not only be expecting technology to happen upon them.  They should be promoting the responsible use of technology.  My colleague here mentioned misuse of the media and fake news.  It is for Parliaments to establish the legal and legislative frameworks that would help us streamline the use of technology to good effect. 

I think it is very important.  There is ample evidence out there, from my point of view, that Parliaments are embracing these new technologies and using them to carry out transformations.  I'm also looking at how they relate to citizens, civic society, and young people, and how they're using this to improve their working methods.  I think this is very crucial. 

Again, I want to conclude at this stage that Parliaments have a great opportunity to regain legitimacy, to regain trust from the people. That's the message I want to convey here.  Of course, we're going to be discussing later on and I can provide more examples of how Parliaments are using these technologies. 

>> NATHALIE DUCOMMUN:  There have been some examples of what has been done.  Thank you very much, Martin Chungong.  Mr. Hossam Elgamal, you're the chairman of the Egyptian Cabinet Information and Decision Support Center.  We were talking about fake news just earlier on, and you were saying you have a really interesting example.  You have set up an application to check the rumors about the activity of the government.  Is that it? 

>> HOSSAM ELGAMAL:  Hearing what's been said, there are a few examples, certainly in developing countries, that show the benefits of digitization on the democratic discourse. 

We won't talk about the challenges now, but the benefits.  First of all, the Egyptian Cabinet Information and Decision Support Center is a hub where information is gathered for the Egyptian Cabinet and then analyzed, and then decision support suggestions go to the Cabinet.  Along with that, and with the digitization era, it was important to be able to give citizens the opportunity to voice themselves. 

In fact, digitization serves governance, financial inclusion, social inclusion, and reliable information sharing.  For this we have a few solutions that we have implemented that might be useful to share.  We have an application on mobile smart phones, called Positive, that engages citizens to voice themselves, to give their opinion, and also to report any complaint, any rumor, or any problem happening, even in general. 

This is gathered in the hub, which is a neutral body within the government, and then reviewed with the different authorities in order to see if the rumor is right, if information is correct or wrong, and then we provide feedback through mass media to correct information if it is wrong. 

If there is a complaint, then we follow up with the authority to make sure that they handle the complaint on time.  But more importantly, it allows us to detect any problem of corruption, or any issues that need to be handled by implementing new policies, et cetera.  So once again, an application. 

We also have our website solution for complaints.  It is a centralized e‑complaint system that is reviewed not just by the Cabinet but even by the Presidency, in order to have governance of each authority and to make sure everyone is moving on implementing solutions where the citizen is the central focus. 

But more than this, I would say that we have ‑‑ we use digitization to achieve social inclusion because in fact with the information that is gathered, we are able to identify the underdeveloped villages and communities.  We are able then to communicate back with those communities and identify their exact needs so that we minimize the social divide.  I can stop here. 

>> NATHALIE DUCOMMUN:  We'll go back to your example.  Thank you very much, Hossam Elgamal.  Of course, it goes back to public trust as well, because as we heard earlier on, public trust in Parliaments and governments is not that high.  There's an application to try to communicate with Civil Society.  We need that trust, of course. 

Please welcome Mrs. Malavika Jayaram.  You're the executive director of the Digital Asia Hub, which is an independent research think‑tank incubated by the Berkman Klein Center.  Thank you for being with us.  What do you know about this public trust, and what are the main benefits we have to capitalize on and maximize today in terms of digitization?  Please. 

>>  MALAVIKA JAYARAM:  You seem to have married me off prematurely but that's okay.  Fake news. 


>> MALAVIKA JAYARAM:  I'm happy to be here.  It's the closest I will be to ‑‑ it's like being in the "Star Wars" Federation, looking out across a room like this.  Sometimes when you listen to the views of different people across different cultures and values, it does feel like they come from different planets.  I don't think it's an inaccurate analogy.  Thank you for having me. 

I wanted to say as a preliminary point that there's a binary (?) we've divided the panel into in terms of benefits and challenges.  I think it's a good forcing function for some of us who tend to work on one side of the spectrum rather than the other.  I'm much more comfortable on the dark side, not to carry on the "Star Wars" thing any further.  As with every technology, (?) observation changes the phenomenon.  Looking at technology as beneficial actually makes you see things differently than if you look at it through all the challenges and things that go wrong. 

At the Berkman Center we play this wonderful game of panics and progress narratives, where we take exactly the same technology and throw it out into the room.  Half of the room talks about all the different moral panics around, say, self‑driving cars ‑‑ all the things that can go wrong.  The other side comes up with the progress narratives that the tech industry promotes, that it's the best thing since sliced bread.  And then you swap.  That's what we're doing here, which is great. 

I think for me, we have to see this issue as one of digital citizenship.  The second we think of it as a handout, subsidy, or entitlement ‑‑ something we're giving other people who are targets of development ‑‑ I think we've failed.  Unless we acknowledge users, consumers, subjects, whatever you want to call them, as co‑participants and co‑creators in the future, I think we're going to fail.  That's the first point I want to make about digitization. 

We often see digitization as something a government rolls out as a benefit, as an end goal in itself.  It's not necessarily seen as a process, as something that has to be embedded as part of a sociotechnical system.  I was happy that the Commissioner said that we should see these as systems and not just as particular points and processes. 

I think that's a really key point, that we don't see them as targets.  In all of this work about digitization and benefits and developing people, I'm always reminded of the very often quoted line that a society will be judged on the basis of how it treats its weakest members.  Half the room is going to say that attribution is another piece of fake news, but it isn't: it was Pope John Paul II.  There are certain business models that optimize for certain outcomes that benefit certain actors and the incumbents, the governments in power, and that actually enable and amplify existing power structures and inequalities. 

If we don't focus on the margins and outliers, I think we run a risk in our decision making and data‑driven policy making.  Digital data can tell you any story you want.  There's nothing neutral about data.  That's one of the first lessons we're learning as we see systems fail and break down. 

I think if you don't actually act to counter the harm that can happen to the outliers and the weakest on the margins, that's a real problem.  Actually, one of the good benefits I see about technology is that, with all of the focus on bias and discrimination and algorithmic fairness, it has forced us to have a conversation about how flawed technology is and how flawed society is.  The more we talk about fake news and the things that algorithms do wrong, or the way they amplify homophobia, the more they reflect our ugliness back at us. 

In the same way technology can be used to exacerbate those problems, we could harness the same tools to optimize and nudge towards good outcomes.  The same tools that can engender bias are the tools that, if you know certain kinds of data sets are not representative, let you correct for them, so that the outcomes are actually better than you would have offline. 

>> NATHALIE DUCOMMUN:  Do you have an example of these tools?  If you could use the same tools, what could they be? 

>> MALAVIKA JAYARAM:  He talked about how so many polls got it wrong about the U.S. election.  One of the polls that was actually really accurate was from the Xbox.  You would think that's bizarre because it's a narrow demographic of 18 to 22‑year‑old men.  Why would anything on the Xbox be representative?  How could that possibly represent people?  It turned out that because they knew they were dealing with a small slice of the population, they corrected for it.  When you know that a particular population sample is skewed, when you know that the data you're dealing with is not representative, you can use machine learning tools to compensate for that. 

In similar ways, when you're dealing with poverty ‑‑ with fewer schools or temples or churches, or less financial aid or fewer banks ‑‑ when you know that's what you're dealing with, the algorithm can compensate, rather than working the way banks lend money to people who already have money and won't lend to those who need it.  That's something technology tends to do: it tends to favor people who already have.  If we know that, we can compensate for it.  If we're dealing with 18 to 22‑year‑old men, let's normalize across that and standardize it by including women and different age groups to make it more representative.  That's one little example. 

I think the other benefits of technology are the ways in which it can close skills gaps and participation gaps.  A lot of emerging communities have a history of looking at technology as a savior, a way to leapfrog stages of development.  We didn't have 2G; we went straight to 4G.  We didn't have an industrial revolution but went straight to something new.  But we haven't had the time to form the institutions, the rights, and the values in society that should go with that leapfrogging of technology.  You go straight to a new tool or a new paradigm of the future without having learned from the mistakes of what went wrong with other technologies. 

I think sometimes going for the shiniest or the newest is not the best solution.  By the time governments get through their endlessly bureaucratic processes, as with buying fighter planes, the technology is already outdated; it's redundant.  I think there is some value in having the time to look at the impact of something, to study pilots.  That's something technology providers don't necessarily do, especially when the prevalent values (?).  Governments don't "fail faster and break things" because that failure will reflect in the next election.  They have the right incentives to perform well or be voted out of power. 

I think with a lot of these problems the solutions will come from a multiplicity of actors.  The promise of digital transformation isn't something that governments can deliver in and of themselves, because they're often behind the curve on technology.  There's the statement about how we've got technology 4.0 being dealt with by policy 2.0.  There's always that gap, and I think that's where Civil Society plays a good role and can bridge it. 

Technology companies, when they try to do good, can fill that gap and help equip governments with skills.  And it takes Civil Society to push back and ask:  what are the incentives?  Who are we optimizing for?  Who is this going to benefit?  Did you ask them if they wanted this?  Did you take consent?  Was it meaningful?  Was it informed?  Technologies like artificial intelligence completely disrupt the idea of consent, the idea of personally identifiable information.  When everything is done in the aggregate and without people's active participation or consent, that's a real problem.  But equally, artificial intelligence does hold a lot of promise of scale. 

>> NATHALIE DUCOMMUN:  We'll be able to discuss that later.  Benefits and challenges. 

>> MALAVIKA JAYARAM:  Thank you. 

>> NATHALIE DUCOMMUN:  The roles and responsibilities of multistakeholders in shaping opinion.  Ms. Mijatovic, you're an international expert on human rights and media freedom.  What is the multistakeholders' role in securing these benefits?  Is it working?  Are the relations strong enough to work together? 

>> DUNJA MIJATOVIC:  Thank you very much for inviting me.  That wasn't exactly what I have to talk about here about the stakeholder ‑‑

>> NATHALIE DUCOMMUN:  Just react to what we've heard.  Please deliver your message. 

>> DUNJA MIJATOVIC:  The main issue here, and I'm sure many colleagues can speak to this, is that the multistakeholder process is of crucial importance when we discuss today's topic.  It seems to me that we are somehow suffering from this virus called fake news.  If we look at media reports, it would seem that we are discussing a new concept, something completely new.  Fake news has also become a political and popular obsession around the world.  We are even stating that the multistakeholder approach can fix fake news. 

As a label, the term has also been invoked by political leaders.  This is something that we should not forget:  it was invented to discredit legitimate journalism and to brush aside accountability.  There have been calls to ban or restrict fake news through the adoption of legal measures.  The last thing we need in a democratic society is an overreaction, and I think we are overreacting at the moment. 

I know that many will disagree with me, but not a single day goes by that we do not hear from international organizations and governments around the world, democratic governments as well, about fixing fake news by establishing certain agencies, high‑level panels, wise men, sometimes women as well, trying to fix the problem that we are all facing as societies. 

Fake news is not new; it's as old as humanity, I would argue.  The potential impact of fake news increased after the invention of the printing press and the introduction of film, radio, and television in the 20th Century.  In the Internet era we have boundless capabilities:  dissemination, speed, everything is changing, and we're more exposed as a society. 

My question for our discussion here would also be:  why, as a citizen, would I trust any international organization, not to mention a government, telling me what is right and what is wrong?  It's almost like filtering our minds, not to mention attempts to block.  Instead of pooling funds and financial aid for public service broadcasting, for media, for new online portals, for international organizations working on Internet literacy, we are seeing this sudden movement of establishing agencies and organizations that deal with fake news. 

There are NGOs that are fact checking.  That is something we should welcome, and it's of extreme importance.  But I'm very much against potentially shifting the responsibility from the state to the intermediaries.  This is happening as we speak in too many countries at the moment.  I do not think search engines and intermediaries can carry this responsibility.  It's like shifting onto them the responsibility of the state and the judiciary, and the role they have in our societies. 

We should not forget this.  We already have tools at our disposal, particularly in democracies, to fight it.  The way things are moving at the moment is affecting journalism a lot.  We should take that into consideration and really do as much as we can to protect real, investigative journalism, which is also suffering because of these global changes that we are all facing. 

There is an urgent need, I would say, for a smart answer to disinformation and propaganda, as the damaging effects are visible to all of us.  But there is also an urgent need for a smart answer to the attempts to fix the problem that our societies are facing.  Since yesterday I have heard many here saying, yes, we will organize this, or we will fix it.  It's somehow as if we all suddenly became so stupid that we cannot make our own decisions and decide what is right and what is wrong. 

>> NATHALIE DUCOMMUN:  Are you calling for individual responsibilities of all of us? 

>> DUNJA MIJATOVIC:  Yes, I claim individual responsibility.  We are people capable of making our own decisions.  But I think we have failed as societies in the attempt to educate, particularly when it comes to media literacy in the process of digitization and in the face of more and more information.  We have even failed starting from kindergarten, with the really small children, who need more education on our digital life.  This is something I would like to see governments doing:  putting more funds and effort into this, and not organizing new agencies to try to fix something that I do not think is fixable in this way. 

>> NATHALIE DUCOMMUN:  Thank you very much.  Thank you very much.  Ms. Nanjira Sambuli, you're the digital advocacy manager at the Web Foundation, where you lead advocacy efforts to promote digital equality in access to and use of the web.  We've been talking about the benefits in terms of inclusion.  Of course, that's one thing that you observe in your work.  Let's hear about it. 

>> NANJIRA SAMBULI:  Thank you.  At the core of this discussion around political processes is a question of power:  who has it, and how it is being redistributed to all citizens.  As Malavika was saying, I would like to reflect on a couple of ideas about what has happened with the Internet and spaces for discourse.  Even once we leave rooms like these, discussions go on; they become normalized once we walk out the door. 

The politics of representation has been one great stride we've seen, especially where gender dynamics are reflected, and in representation between the global north and global south.  These discussions are being taken beyond rooms like this and into spaces occupied by citizens.  I'll give a practical example of a big challenge that I would love to hear Mr. Chungong respond to.  In Kenya, women, who constitute 52% of the population, have been using digital platforms to make an issue in the media of the constitutionality of our Parliament, our constitutionally appointed bodies, and a cabinet about to be appointed:  do they adhere to the two‑thirds gender rule?  In public discourse, the discussion of gender has been ignored and normalized away. 

Women occupying digital spaces, whether it's just Twitter or Facebook or exchanging emails, have the access to say:  we have to make this an issue, a thorn in the flesh of the political system.  It's going to be wonderful to see what happens as even more women who are not yet connected gain access and bring their own perspectives on how they should be represented, how they want their own voice to come up. 

I want to draw the link back to the fact that political voice and processes, as we see them through digitization, are a function of how they're understood offline.  What we're seeing with people coming online is a two‑pronged approach.  We're trying to negotiate our place in the digital world while claiming democratic values that for many countries have existed only on paper, through dialogue and argument, fake news included, to figure out what we mean by democracy and by the values enshrined in it.  At the end of the day we all have a role to play in making sure that digitization enhances political voice rather than denying people power and creating new divides. 

>> NATHALIE DUCOMMUN:  Thank you for the list of benefits you gave us.  Robert Strayer, hello.  Thank you for being with us.  You're the Deputy Assistant Secretary for International Communications and Information Policy at the U.S. Department of State.  Let's hear you on the benefits. 

We've been talking about fake news and the problem of public trust.  I know it's a big, big issue in the States as well, of course.  We're all witnessing it, because you have a President who is very active on Twitter.  We know all about him.  Tell us what your view is on this. 

>> ROBERT STRAYER:  Thank you for the introduction.  It's a privilege to be with these distinguished panelists and all of you participating in the IGF.  Democratic process is the foundation of everything we do, and any public policy we create must be grounded in the experience of all of our fellow citizens. 

I'm pleased this high level session is devoting time to appreciate the positive dimensions of digitization before focusing on the related challenges.  In considering the best way to address the challenges we must remain vigilant not to undermine the benefits that technologies have brought. 

In the United States and around the world citizens are assembling online and organizing their advocacy both in the virtual and physical worlds.  We also remember that these online spaces are even more critical in parts of the world where democracy is not practiced, where political activity is not tolerated, and where public trust is replaced with fear of reprisals. 

But on the positive side, we have seen people able to start participating in their democracies, in public debate, in determining their own futures through being online. 

I would like to offer two practical steps that we can take to strengthen the positive aspects of digitization.  First, given the centrality of freedom of expression and assembly to the democratic process, we must ensure that people are able to exercise these fundamental rights online.  It's primarily the responsibility of governments to ensure that their citizens are protected online. 

But it's also true that other stakeholders should recognize and be aware that the same rights individuals have offline must also be protected online.  So they also have a role. 

Second thing I want to highlight, there are many processes for a collaborative and participatory involvement in making decisions.  We're in the early stage of thinking about how to have multistakeholder participation.  All of us here and all stakeholders should seek to identify ways that we can further improve and find the best process for a participatory decision making process. 

We should have robust, inclusive, and deeply participatory multistakeholder models that we can both (?) for the benefit of what we do on the Internet and for democratic institutions around the world. 

In that regard, I would like to point out that the IGF is a valuable resource for collecting positive stories for how technology has positively shaped democratic participatory outcomes.  I've heard stories this week particularly from Civil Society.  I hope we have time to discuss more of those.  Thank you. 

>> NATHALIE DUCOMMUN:  Thank you very much.  We're running a bit late on our schedule.  I would like to have a little exchange among our panelists before we go to the audience.  You wanted a reaction from Martin Chungong earlier ‑‑

>> NANJIRA SAMBULI:  (Off mic)  I'm curious.  Many Parliaments, through the Inter‑Parliamentary Union, are being urged to be more citizen‑facing.  How would he encourage, say, the Kenyan Parliament to recognize the voices that have been raised?  Because engagement and making your voice heard starts on these platforms, but being invited into dialogue and process is a whole other matter.  I would be curious to hear, in his view, how that power can be put behind it, if you will. 

>> NATHALIE DUCOMMUN:  Thank you. 

>> MARTIN CHUNGONG:  It happens that the Kenyan Parliament is one we're working with when it comes to the issue of gender equality, and that includes the two‑thirds gender rule that my colleague mentioned.  It's something that has been at the fore of the discussions we've had with the Parliament.  With the previous Parliament, before the one just elected, we started that reflection.  We have provided tools for them to ensure that the Parliament is constitutionally legitimate by respecting the gender rule. 

This is premised on the fact that, yes, it is important to give everybody online access to contribute to democratic processes.  But at the end of the day, how are the decisions being made?  Are they made online or in the formal structures of Parliament?  For us it is important that women be at the table in rooms like this one, where the decisions are ultimately being made in Parliament.  So you have online participation, which is important to inform decisions, but you also need the presence of women in Parliament in order to participate in the decision‑making processes. 

So this is something that we are engaging with the Kenyan Parliament to try to fix.  It's not easy, because there's a lot of resistance, including among men.  So, my colleague, I think it is something you and I can discuss to see how we can be most helpful to the Parliament. 

>> NATHALIE DUCOMMUN:  Thank you very much.  Would someone like to react? 

>> HASANUL HAQ INU:  I was listening to my fellow panelists on the media and fake news and fake information.  In Bangladesh we are victims of this provocative, instigated fake news, which I call information terrorism.  Information terrorism is a tool of (?) dual action.  On the ground they are attacking government installations and religious institutions, temples, and other things, and on the Internet there is information (?) going on.  They are pitting religion against democracy.  That is a challenge to the institutional democratic situation:  how to combat information terrorism so that ‑‑

>> NATHALIE DUCOMMUN:  If I may, how do you draw the line as a government between trying to tackle the problem and balancing into censorship?  You want people to have freedom of expression.  That is also a human right.  Yeah.  How do you draw the line?  And how do you act? 

(Off mic).

>> HASANUL HAQ INU:  As I said yesterday ‑‑

>> NATHALIE DUCOMMUN:  Speak a bit closer to the mic. 

>> HASANUL HAQ INU:  Thank you.  We are not going for censorship, because freedom of expression is guaranteed by the Constitution.  There is a clause in Article 39 of the Constitution where free speech and (?) are protected, alongside other human rights. 

My point is we're not going for censorship.  We are trying to combat this info terrorism by enacting a broadcast commission and a broadcast law, in line with human rights and freedom of expression.  But we have to understand that fake news, okay, is one problem, but info terrorism is a different thing.  It is coupled with the terrorism on the ground.  There is info terrorism waged against Christianity or Islam, and then on the ground (?) mosques, temples, and ‑‑ these are coupled things.  They are criminal activities, but with a political dimension. 

>> NATHALIE DUCOMMUN:  Thank you. 

>> HASANUL HAQ INU:  And there is the digitization process; either one is very important.  In our country we are digitizing the whole government.  Government has been giving poor services.  When you digitize the whole government, (?) that ensures more (?) to the poor, better service.  I'll come back around.  Thank you very much. 

>> NATHALIE DUCOMMUN:  Thank you very much.  Katharina, should we take a few comments to our high level session online? 

>> KATHARINA HOENE:  Thank you.  We had an interesting discussion online which touched on a lot of topics:  fake news, citizens, Parliament.  I would like to share two comments.  The first one comes from a man from Kenya.  He said, for example, that fake news will always thrive in the absence of a strong value system, and that we should study the correlation between fake news and fake data. 

He also commented on Ms. Mijatovic's point about responsibility and the value system that we need to support and embrace.  Being a Kenyan, he agreed with Ms. Sambuli's comment about citizens in Kenya. 

Another comment comes from Christian from Nigeria.  He was very interested in fake news.  He reminds us that there are technical solutions for fake news that we need to look at. 

>> NATHALIE DUCOMMUN:  Technical solutions ‑‑ before I get back to you on solutions, Mr. Elgamal:  Ms. Mijatovic, you say we're overreacting, but obviously it's a main concern for everybody.  Perhaps one example of those tools? 

>> DUNJA MIJATOVIC:  The overreaction I'm talking about is the overreaction to certain news that we can also call stupid, provocative, or vulgar, something that we have to accept, or I thought we had accepted, as a price of living in a democracy.  I do not want my right to be shocked or disgusted by certain views to be taken away from me, with somebody else telling me what is right and what is wrong. 

What we are discussing here is also the problem of propaganda, of information wars, with many societies facing propaganda from different states and information used as a tool.  There are also criminal acts that should be dealt with using everything we have at our disposal as a society, our infrastructure and institutions:  incitement to hatred, calls to kill people, matters of national security.  That is not a debate or discussion about the free speech agenda.  That is anarchy or crime, and it has to be dealt with ‑‑

>> NATHALIE DUCOMMUN:  The values we have in common. 

>> DUNJA MIJATOVIC:  Absolutely.  I think we are also mixing these things in this discussion about fake news with what actually affects our right to receive information that we may not like.  That is the problem that I see growing. 


>> HOSSAM ELGAMAL:  We have been facing fake news and manipulated information in Egypt, among other countries, especially in areas of intense debate.  We have faced this a lot.  One thing I want to highlight is that social media and digitization moved very fast, and we need to accommodate them in our society and life in a regulated and institutionalized way, if possible. 

With the problem of manipulated information, I see at least several challenges.  One of them is related to privacy.  It is not only about fake news or information that affects the general direction of political decisions; it can sometimes cause issues for a patient somewhere.  In many countries we don't have privacy regulation.  This is extremely important in relation to the diffusion of information, because I do not expect to find my health file published and manipulated on social media. 

So one of the great challenges is privacy and how we can put in place regulation that makes sure privacy is not violated.  Another one, strangely, is that manipulated information is becoming an industry.  It is not just a coincidence that we are facing manipulated information or fake news. 

The problem is that diffusing manipulated information on the world‑wide web or social media is in the hands of the financially capable.  Even if you have the right information, you don't have the same balanced means to correct it, whether you are a government, an NGO, or a citizen.  But you can easily find on social media sponsored and boosted manipulated information backed by thousands of dollars; that is the only way you can spread it all around.  And the correct, legitimate body is not capable of defending the information as it is.  We need to identify ways of handling this. 

In all cases, the simplest way we are now handling it, through our information support center, is at least for government:  whenever there is manipulated information circulating through social media and the mass media, we review with the right body the exact information regarding the project, the policy, or the plan, and we bring the correct information back to the citizen in a clear way.  In our opinion, we do not censor.  We provide the right information, and we try to provide it as much as possible through the mass media within the country. 

Again, we face this imbalance of financial capability that boosts such manipulated information, whether from a terrorist organization or from anybody else; it can be there.  Just to add to the challenges:  we certainly have a problem with cybercrime policy.  We don't have a common framework on cybercrime.  When manipulated information or fake news causes a problem and disruption, who is to blame? 

>> NATHALIE DUCOMMUN:  Let's try ‑‑ if I may, let's go back to benefits.  There must be something about human nature.  We keep going back to challenges.  Probably the room can help me.  Share with us some of the benefits of digitization of society on public trust, democracy, participation in politics.  Are there any comments, experiences you would like to share?  Yes, please.  Just press on the button of the mic and it will go red.  There you are. 

>> AUDIENCE:  Thank you very much.  I come from India.  As far as numbers go, we have a billion connections in India, and yet only 28% of people in India are online.  The potential for women to work from home and to experience agency and voice is tremendous; it is incomparable when you're looking at a society that has cultural bias, social bias, (?).  At the women's college we know what young girls can do when they're able to actualize their potential online.  We're looking at moving toward a billion‑plus digital economy. 

I also ‑‑ I think the Internet is an empowering tool.  It has a way of changing lives.  For people who are far from education, there is that direct experience of literacy; 200 million people are keypad literate.  I can't emphasize enough what it means to experience a (?)

I want to end with a short question, if I may.  We've been talking about barriers and challenges.  When people say that the Internet should be a fundamental right, it's music to my ears.  But when I hear statements about government‑led initiatives that want to enshrine mechanisms to dispel misinformation and disinformation, what has been of concern for us in developing countries is that when incidents happen online, the reaction of governments is control. 

As senior journalists, we have objectivity; fairness and balance are things that come with the craft.  What we've not seen from governments is use of the same power of the Internet and all the institutional resources they have to engage in the same public sphere to dispel misinformation.  It takes about three days to see that something is a fake piece of news or a three‑year‑old video.  Rather than saying that processes of self‑regulation should not come into being, let's counter speech with more speech, and let's not go down the slippery slope of more regulation, because the Internet will not like or cherish that.  Thank you. 

>> NATHALIE DUCOMMUN:  Thank you for the first part, where we heard some of the benefits for the digital economy.  I think we have Bruna Santos, a representative of the young ‑‑ I don't know.  No.  Let's ‑‑ I think we have a comment here, and then we'll go further into the room.  Just press on your microphone.  Let's try to stick to benefits that we can highlight. 

>> AUDIENCE:  Yes, benefits.  Unfortunately some gaps.  But definitely. 

>> NATHALIE DUCOMMUN:  We'll get to gaps and challenges. 

>> AUDIENCE:  I'm speaking for the academia group and for the foundation (Speaking non‑English language).  I'm pleased that this topic was mentioned several times on the panel.  We have many very good practices in media and information literacy.  For us the Internet has been an opportunity to upgrade people's skills at the digital level and in digital citizenship. 

We've worked with the Council of Europe and UNESCO and have been extremely active.  It's been a golden age for NGOs like these to provide instructional opportunities across education, across culture, and across media.  There are many examples.  There's been a mapping done of Europe; I would call for a mapping to be done for the rest of the world to show the abundance we now have instead of the scarcity we had before the digital era. 

For us educators this has been a plus.  Yet nothing is happening at a national level.  Media literacy is always treated as a kind of lip service to what the new literacies are about.  I'm extremely preoccupied with pushing this.  I would say thank you, fake news, because fake news allows us to have a crash course 101 on what it is to be media literate:  how to read what's happening online, how to go back to the sources, how to see who is saying what with what agenda, advantage, or hidden agenda, et cetera, et cetera. 

So let's not use fake news to disengage the states, because that's what is happening, and I agree with Ms. Mijatovic about that.  Let's use fake news to reengage with states, because what is missing is the engagement of the states at a national level, in schools and out of schools, with real money, not sprinkling, not just parsley on a salad:  really going into the deep uses we can make of media and information literacy. 

So this is what I would really call for, and this is something for everyone:  we have been too soft on these issues.  We're still treading on (?); we're not engaging.  We should be much more disruptive as a constituency. 

We need to claim ‑‑ we need different learning and knowledge in the digital age, new cyber literacies.  We need to say to schools that learning the old way is not going to help young people today with the democracy they need to create.  And digitization is disrupting democracy as we know it. 

>> NATHALIE DUCOMMUN:  Thank you very much for the point on cyber literacy and on going back to the practices of journalism; we'll talk later about the practices of journalists facing fake news as well.  Yes, madam, please.  You've just switched on your microphone.  Yes. 

>> AUDIENCE:  So my name is Elena.  I'm from the Institute for Technology and Society.  One of our missions is to use technology to strengthen democracy.  We believe that the solution is more democracy, not less.  We created an app (?) which is a way to amplify democratic participation by enabling citizens to create (?) online.  It uses blockchain technology, which is much more secure than the paper process used until now. 

So this project has been used by some legislative houses.  It's a specific example of something that can bridge this participation gap and really create a channel for people to participate. 

>> NATHALIE DUCOMMUN:  Thank you very much for that.  Unfortunately I only have time for a last intervention.  Yes, sir?  You can switch your microphone on. 

>> AUDIENCE:  Can you hear me? 


>> AUDIENCE:  I wanted to comment.  I'm the CEO of ConnectSafely.org.  We have a parent and educator guide to media literacy and fake news that deals with the root causes of fake news from the standpoint of why we consume it, why we accept it, and why we pass it on.  It's advice to parents and teachers.  We have an element in there about emotional literacy, or emotional intelligence, which is also very important.  I would invite anybody here interested in reading or localizing it to visit connectsafely.org/fakenews. 

I want to pose one question to the panel, and perhaps to the gentleman from my own country.  What do we do when our highest political leaders share fake news and try to delegitimize legitimate news, as well as institutions such as the judiciary and law enforcement?  How do we as citizens respond to that type of leadership? 

>> NATHALIE DUCOMMUN:  I think that's for you, Mr. Strayer. 

>> ROBERT STRAYER:  Democracy is a panoply of voices.  There are a lot of opinions expressed online.  It's not the role of the State to restrict those voices.  The best way, if one disagrees with their leaders, is to speak out online themselves.  There's been nothing more democratizing than the Internet itself.  Before, you could have your pamphlets and newspapers, but now you have online resources at your disposal.  Something like Twitter makes everyone able to interact and engage. 

>> NATHALIE DUCOMMUN:  Thank you for your response.  We'll be continuing on that subject, of course, because I think we've got the message that fake news is a great concern. 

I want to thank our panelists.  Thank you very much for your interventions this morning.  I would invite you to step down; for the audience, you can remain seated.  We're going to welcome the second group of panelists on stage to continue the discussion around the challenges.  But of course our first group is staying in the room and will also be able to interact with the second group.  So just stay with us.  Thank you.  Your name plates will be changed. 


>> NATHALIE DUCOMMUN:  Let's remain seated and take our places.  I've invited the second group of panelists to take their places on stage so we can continue the debate.  This is a high level session about the impact of digitization on politics, on public trust, and on democracy.  We have highlighted the benefits that we've witnessed so far thanks to the Internet.  But as I said, human nature always catches back up, and we're discussing a lot about challenges.  We'll continue to do so, but hopefully we also have solutions to these challenges.  If you have solutions, I encourage you to share them. 

So as we did previously, I would kindly ask you to be as brief and direct as possible in your interventions so we can have enough time afterwards for debate.  It's really a good thing when you can all respond to each other; that really helps the debate. 

I will start ‑‑ I have to get used now to my new panel.  (Laughing) I have one missing.  Okay. 

So Mr. Bobby Duffy is with us first because I'm still waiting for Nighat Dad, who is not here yet.  Let's start with you, Mr. Duffy.  You're the managing director of public affairs in the UK and of the Ipsos Social Research Institute.  You have also worked a lot on social exclusion, I know.  And social exclusion will definitely be a big challenge, notably in terms of artificial intelligence.  We'll probably be talking about that. 

But what would be your main message today about the challenges of digitization, please? 

>> BOBBY DUFFY:  Thank you.  I'm delighted to be here.  There is so much we could say about this hugely important topic, as reflected in the excellent briefing paper from IGF, which is available online if you haven't seen it; it covers the ground so well.  I'm going to focus on public perceptions because that's what we focus on. 

We've done work on the impacts of digitization on elections, public dialogues, concerns about AI and machine learning, the ethics of using social media and data, and lots, lots more, which we can get into during the discussion.  I'm going to limit my opening comments to the challenge of misperceptions and the role of digitization in that, mainly because I'm finishing writing a book on it over Christmas with a January 1st deadline.  Your reflections would be very useful for that too. 

There are so many misperceptions.  One type is the pure fake news.  We did a survey for Buzzfeed on, for example, the fake story of the Pope's endorsement of Donald Trump and his candidacy.  1 in 5 U.S. online citizens saw it.  But the scary thing is that two‑thirds of those believed it, despite the refutations and all those things that followed after that. 

There are key social issues that this can affect.  We did a study that shows only 42% know that the claim that vaccines cause autism in healthy children is false.  That's kept alive online and is affecting real people's decisions.  There are crazy beliefs about the nature of populations in the coverage we see.  In that same study people think the murder rate is increasing when it's actually decreasing.  In the Netherlands people think 51% of the prison population are immigrants when it's only 19%. 

I want to make two points on this.  There are two elements as to why we're wrong.  One is what we're told, which is a lot of what we're focusing on today, what we see.  The other is how we think.  Misperceptions have always been with us.  One of my favorite quotes is from Francis Bacon from 1620:  the human understanding, once it adopts an opinion, draws all things else to support and agree with it.  Though there be a greater number to be found on the other side, these it either neglects and despises, or by some distinction sets aside.  This is Francis Bacon writing in 1620. 

From Leon Festinger's work in the 1950s about the psychological pain that we suffer from cognitive dissonance, there are great experiments that show we have this confirmation bias where we discredit information that we don't agree with or that doesn't fit with our already formed views. 

There's a great story about how Charles Darwin, as he was getting closer to finalizing his theory, went out and positively looked for information that contradicted what he was trying to prove.  We can't all be Charles Darwin.  We're not like that. 

The key is that this is not new for us, but the environment has changed hugely.  Filter bubbles are real because that's part of our nature, not an evil of technology.  It's hard wired in how we think.  We need to recognize that.  Left alone, it won't correct itself naturally in the way we hoped in the early days.  We need intervention. 

The second point is I'm a big supporter of fact checking.  We do a lot of work with fact checkers.  But the impact is inconsistent.  If we're thinking about purely correcting facts once they're out there, there are potential backfire effects when drawing attention to them.  We need to change our views on what fact checking is.  It's more than correcting misperceptions; there is also a deterrent effect.  The third generation fact‑checking approach is really important. 

The final point really is that facts are clearly not enough.  We need to recognize that, in so much of our work, so many misperceptions we see are emotional and tied up with identity.  It's not just about mythbusting through facts.  It's about creating a story and narrative, particularly because, again, of the way our brains work:  negative information takes up more cognitive space.  We give it more weight just because that's the way our minds are wired. 

Evolutionarily, negative information is what you need to act on.  We store it more readily.  The challenge is to counter that.  It is a significant challenge to get the positive stories out there with those facts.  Really difficult to do, but this is the challenges section, so I feel at liberty to leave it there. 

Can I end with a little bit of hope?  Just to clarify, I'm not at all saying that people can't change their minds and facts don't matter.  We run a whole series of deliberative events on all sorts of difficult subjects, online and offline, with people.  There are great examples here and in the stands outside, and we've talked about Taiwan and (?).  There is some hope there too. 

>> NATHALIE DUCOMMUN:  So, how to counter the effects of fake news rather than fake news itself, as we heard earlier on as well.  Ms. Farida Dwi Cahyarini, thank you for being with us.  You're Secretary General of the Ministry of Communication and Information Technology of Indonesia.  Okay.  Is fake news one of the main challenges you have to face in Indonesia? 

>> FARIDA DWI CAHYARINI:  Thank you very much for having us here.  I would like to speak to the challenges ‑‑

>> NATHALIE DUCOMMUN:  Talk into the microphone. 

>> FARIDA DWI CAHYARINI:  The challenges of the impact of digitization on public trust.  (?) According to the Internet association, Indonesia has 132 million Internet users out of its 256 million population, which means roughly half of Indonesia has Internet access.  Our Constitution of 1945 provides (?) that everyone has the right to freedom of association, assembly, and expression.  It seems that the step for the public to express opinion (?) organization. 

However, contrary opinions should not violate the Constitution or disrespect the rights of others.  In terms of democracy in the ‑‑ (?) social network, in which people used to have face‑to‑face meetings, with the impact of ICT development democracy now moves through social network sites.  Through this medium the public can submit their ideas to (?) on the condition that the community has access to telecommunication or Internet facilities. 

(?) of the digital era to democratic development is the delivery of political messages through online media or social networks on the Internet in a fast, wide, continuous, effective, and transparent way.  On the other hand, in fact, Indonesia is still suffering from an uncomfortable situation.  (?) in 2007 and the government (?) the disagreement between supporters of the parties, and then the rise of hate speech as well as how to over(?) social media. 

In August 2017 Indonesian police arrested three members of the hate speech group Saracen.  It is an organized fake news industry.  Its political targets included our president, Joko Widodo.  (?) it spreads customized fake news through social media.  They have 800,000 social media accounts.  The group doesn't only make new accounts but also hacks other accounts and (?) news. 

From January to October 2017 (?) from the public about online content contained 51 million submissions, where the first position of the reports was hate speech and (?).  However, stakeholders in Indonesia do not remain silent.  The movement for digital literacy is getting stronger as they develop more positive online content to suppress the negative things on the Internet. 

For example, the Internet cyber‑wise initiative that won a WSIS award in 2017.  Then there is the impact of the (?) movement that collaborates with communities, and there are also the Indonesian ICT volunteers, who number around 10,000 and actively encourage digital literacy through stakeholders around Indonesia. 

Recently (?) joined the Indonesian multistakeholder movement for digital literacy, Siberkreasi.  Siberkreasi now has more than 65 members from government, associations, organizations, the technical community, and the private sector.  (?) it was launched three months ago.  Siberkreasi has facilitated and collaborated with around 33 stakeholders in activities with a number of participants reaching 57,000 people. 

Digital literacy is the approach that the government prioritizes when it comes to online content governance.  It relies on education and knowledge development, while law enforcement access (?) upon requests from institutions is conducted based on prescribed (?). 

Thank you very much for this session.  Maybe we will discuss ‑‑

>> NATHALIE DUCOMMUN:  Yeah, we will.  Thank you very much for these figures, which show that the spreading online of negative content, or what we commonly call fake news, is also a concern and a challenge in Indonesia. 

Mr. Frank LaRue, thank you for being with us.  You're Assistant Director‑General at UNESCO.  You were Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression at the UN Human Rights Council.  That's the question I wanted to start with:  freedom of speech on one side, fake news on the other.  What can we do about it?  What is the challenge here? 

>> FRANK LARUE:  Thank you very much, and I appreciate the opportunity to participate in this excellent panel.  Let me begin by making one important statement that I have made almost everywhere I have spoken on this. 

Number one, I don't like the term fake news because I think there's a bit of a trap in it.  We are confronting campaigns of disinformation.  So we should talk about information or disinformation. 

Why is fake news a trap?  Because it mixes two different concepts:  it links fake information to the concept of news.  I think that is the trap.  It tries to dissuade us from reading news and trusting news and defending freedom of the press, which we believe in.  We can agree or disagree with the news, or with the media, or with the focus that media has.  I think Dunja Mijatovic mentioned this clearly before.  There can be mistakes by a journalist.  That doesn't necessarily present us with the phenomenon of a campaign of disinformation that we could call dissemination of fake news. 

What we're confronting really is something intentional, with malicious intent:  disinformation campaigns.  This is truly happening.  And it is not new either; it has occurred throughout human history.  The difference with the Internet, though, is that now it can obviously reach many more people in a faster way.  Because of the interactive nature of the Internet, oftentimes the public itself, the users, will pick up that disinformation, reproduce it, forward it, or reconsider it in different terms.  That's why it's having a bigger effect.  I think it is something we should consider because it may pose serious challenges, especially to the political process and to our democratic model. 

There are obviously other types of campaigns that are of criminal intent.  But those are in a way already defined in Articles 19 and 20 of the ICCPR.  There are forms of expression that should be prohibited by states, and legitimately so.  I won't go into detail now, but they are well defined.  I think this is important to mention. 

When it comes to a systematic campaign of disinformation, especially in the political world, it can totally distort the process of elections, or decision making, or the definition of policies.  This is a very serious problem.  We always thought the Internet, as was said by the first panel, was going to be the facilitator, and in a way it is, of citizen participation, of the voices of citizens.  But I think we have discovered that just raising the voices of citizens is the first step, a very important step, but not the only one. 

I think we have to go beyond the citizens to see who the interlocutors are and whether they're listening, whether it be a Parliament, a government authority, health authorities, or any other type of authority setting public policies.  This is where it becomes a serious problem.  The Internet should be an instrument of dialogue, of everyone raising their voice, but an interactive dialogue. 

Here's where I put the challenge on the positive side.  Governments have to discover how to use the Internet to interact with their citizens.  It's not only the fact that the Internet exists and everyone can put up their daily activities and ‑‑

>> NATHALIE DUCOMMUN:  Not just communication but interaction. 

>> FRANK LARUE:  It's a great system for consulting on a policy, for consulting on decision making.  And the other positive development is that it's a great way to inform citizens of what government is doing.  This is where we need transparent government, e‑forms of government.  An appropriate way to regain the trust of citizens is precisely to use this instrument of communication to effectively show the public what is being done in their name, because government officials, and especially members of Parliament, are elected by the citizens, and the citizens have a right to know what they are proposing.  What budget is being used?  More importantly, what policies are being pursued and what decisions are being made?  This is the positive side that is often neglected.  On the one hand we have campaigns we have to look at.  On the other hand we have the fact that governments have not fully used the extent and possibilities of the Internet. 

One last word of caution on campaigns of disinformation:  we want to show different alternatives.  These campaigns can have serious consequences, including in health, as was mentioned on autism, and in other campaigns against vaccination or related issues.  We would like to have a multistakeholder dialogue on this. 

I don't think, and it was already mentioned in the last panel as well, that the easiest solution is to put it in the hands of the big platforms.  The big platforms are acting as businesses.  Obviously that is their logic.  There's nothing wrong with that.  But it's not going to be the logic of all the different sectors of society.  The response to these campaigns has to be a response that has been thought through in a strong multisectoral dialogue. 

It's essential, and we have the Internet universality framework that UNESCO provides (?) everything should be discussed and debated.  In that multisectoral dialogue, this possibility of dialogue means media and information literacy.  It's very important that we take this opportunity to show that the Internet is wonderful but has dangers and pitfalls, and that those should make us react and develop a more critical mind from the earliest ages of childhood, so everyone can use the Internet safely, not because they're being told what to think but because they're given an opportunity to have their own critical perspective and their own critical minds. 

>> NATHALIE DUCOMMUN:  Thank you very much for showing us both the benefits and the challenges, and thank you for the definition.  You said you didn't like the term fake news.  When we prepared this session, we had a discussion on that matter.  I would like to share with you an expert study that proposes these definitions.  Disinformation:  information that is false and deliberately created to harm a person, organization, or country.  That's closer to what you were saying.  Misinformation:  information that is false but not created to harm.  Malinformation:  information based on reality but used to inflict harm on a person, organization, or country.  We'll come back to the effect of this information on democracy. 

Let's continue with Ms. Claudia Luciani.  You're director of Democratic Governance and Antidiscrimination at the Council of Europe.  You're currently working on reforms in the area of democratic governance.  So what would be your introductory remarks on the challenges we're facing today?  You just have to press on the button. 

>> CLAUDIA LUCIANI:  Okay.  Thank you very much for this opportunity.  We're particularly glad that the Council of Europe is associated with this discussion because of the collective approach our organization takes, based on shared definitions.  That's exactly what we're trying to do here, I believe:  finding common definitions and parameters. 

We have recently tried to understand a little better what those challenges are when it comes to the impact of the Internet, notably on public trust and citizens' trust in parties and democracy.  We did so at the World Forum for Democracy in November this year.  I just want to share with you some of the conclusions that we came to. 

First of all, as has been said before, the Internet is a huge enabler.  Technology has facilitated civic movements and the use of deliberative democracy and participative democracy initiatives.  We've heard about them before.  However, when we took a closer look at the impact, which is what we wanted to see, the impact of those initiatives, we found important questions to be answered. 

First of all, the nature of those initiatives:  private versus public, their sponsorship, their transparency, their frequent inability to deliver what was promised to the citizens.  In other words, there was a risk that those Internet platforms disappoint citizens who were already disappointed with party politics and traditional democracy and had turned to other platforms that, in fact, were not able to deliver.  That is because of the question that my colleague raised before:  who gets to decide in the end?  I can share and participate online, but the decision making part is fundamental. 

These conclusions have led us to call for a better understanding of how those initiatives work, to make sure that they themselves are democratic, that they respect the fundamental parameters of a democratic initiative.  In other words, that their qualities are really such that citizens can trust them and engage in a genuine fashion and not be disappointed.  For us, democracy online should not be democracy with weaker rules; it should be democracy with rules that guarantee to citizens that these initiatives can be trusted and can deliver.  I would stop here for the first round of comments and come back to social media. 

>> NATHALIE DUCOMMUN:  Okay.  If you have solutions on how we can strengthen that and allow people to participate in decision making, that will be interesting.  Thank you very much. 

Mr. Gonzalo Navarro, you're from Chile, and you've been acting as permanent representative of Chile at the Internet Governance Forum, notably.  Of course you have witnessed the evolution of digitization and its impact on democracy.  What are your first remarks today? 

>> GONZALO NAVARRO:  Thank you.  Thank you very much.  It's a pleasure to be here today.  Thank you for the way you organized the panel.  Because we started with the positive aspects of the Internet, and sometimes we forget those positive aspects while we're addressing issues like this one. 

I would like to see the glass half full and focus on the potential that the Internet has.  It's quite good to have this kind of dialogue in a multistakeholder way because it's the only way to learn about the different aspects, address these issues, and try to reach common solutions, because I think that's the ultimate objective:  to try to find some solutions to the issues we're talking about here. 

So let me start by saying that our societies, institutions, and communities were built in a world in which this global network didn't exist.  People (?) and representatives to access group services and make decisions.  And the digital economy has changed that, and with change obviously come challenges.  Online companies, as part of this ecosystem, obviously have some responsibilities too.  They have a stake in it, and they have to create a trustful environment, which is vital for the users. 

Empowering users is a fundamental tool and is how many platforms find their success and creativity.  Most online companies rely on relationships to develop trust.  (?) the voluntary initiative to address the issue of counterterrorism, which was one of the things mentioned in the first part of this panel. 

We also said at the beginning, in the first session, that the Internet is an enabler of inclusion.  These commercial communications platforms act as enablers.  That means benefits for democratic participation and inclusion.  That's precisely what we're addressing in terms of the opportunity the Internet is bringing to people, especially in terms of democratic exercise through human rights like freedom of speech, for example. 

I think that one of the elements that we have not discussed here up to this point, but which is important, is to create a framework to address intermediary liability, in terms of how intermediaries behave.  This kind of framework (?) you have a couple of cases or places where countries are taking steps towards this.  It gives predictability to citizens, specifically to know what to do and how the Internet is going to respond to content that may be inflammatory or fake. 

>> NATHALIE DUCOMMUN:  Best practice on how you should ‑‑ it's like education on how to use ‑‑

>> GONZALO NAVARRO:  Education is an essential tool.  That was mentioned before.  The lady was saying this is disruptive and it's creating a crash.  That crash is also an opportunity to address the issue through education, which is fundamental here.  But coming back to the legal framework, (?) rules are absolutely necessary.  They create a framework on which the platforms, as well as citizens and governments, can rely in order to know what to do and how to do it. 

A case in Latin America that is useful here is Brazil's Marco Civil da Internet, which contains a set of rules and principles applying to intermediary liability and citizens.  Argentina is taking legislative steps to approve (?) on the same issue, and the same provisions that were included there must be found in some ‑‑ (?) or not the negotiation nowadays.  That's part of the things. 

I mentioned at the beginning that I was happy to be here and to be learning about different perspectives, because I think that a possible solution and a way to address this issue is a multistakeholder approach in which we can see, learn, and share views about what's going on and how to address it. 

So education is fundamental; legal frameworks that balance rights and (?) are a super effective tool; and dialogue, multistakeholder dialogue.  Thank you very much. 

>> NATHALIE DUCOMMUN:  Thank you very much.  That brings us back to what Frank LaRue told us earlier on.  Thank you for being with us today, Jean‑Paul Philippot.  You're the director of the European Broadcasting Union.  We've been talking about fake news, which is a great issue in newsrooms at the moment.  We've been talking about media and their responsibilities and roles.  What are your main challenges today in terms of digitization? 

>> JEAN‑PAUL PHILIPPOT:  Thank you.  Delighted to be here.  I'm going to stick to talking about media issues, obviously.  I agree, and I also have difficulty with the term fake news.  It's been used by so many people in so many different ways that I'm not even sure I totally understand what it refers to now.  It's become so complex.  I prefer the references to misinformation and disinformation. 

In line with what Dunja was saying earlier, I absolutely agree that a lot of the discussion around this has been exaggerated.  We have had misinformation, disinformation, and sensationalistic information for a long time. 

What is absolutely different now is the reach.  We have never in history had such a small number of companies whose platforms reach so many people.  And that is a fundamental shift. 

In terms of the trust issue that arises out of some of this misinformation and disinformation, in all the categories, I think we need to remember that the public haven't lost trust in media.  We did a survey in 33 countries in connection with the Eurobarometer last year.  There was 59% trust in radio; for public service radio that's 80% today in quite a few countries.  In television the figure was 50%; in some countries it was lower, in others much higher.  What we found was that when it came to social media platforms, the figure was 21% and falling.  So you have the platforms that now have the biggest reach actually getting the lowest trust figures, and they are also attracting the younger audiences.  That is a huge problem for all of us:  the platforms that are attracting more of the younger audiences are the ones where the trust levels are so low. 

A second issue for all of us is consolidation.  I absolutely agree with the positives that have been spoken about digitization.  I absolutely praise the pioneers and the tech companies that are leading the way in innovation and what they have done for society.  But we do need to stand back and say, once again, we have never seen consolidation like this.  Last year 50% of digital advertising revenues world‑wide will have been taken by two companies.  That's new.  It's over 60% in the U.S. 

So we have a consolidation of reach and a consolidation of revenue.  The difficulty is that we still don't know, we can't predict, because this is relatively new, where this goes in terms of the impact on the rest of the media ecosystem.  Where does it go in terms of the ability of companies, both public service and commercial, to invest in content?  They're still the big consistent investors. 

We're in a bit of an experiment here.  We've never seen this before.  It's still relatively new.  We're not standing back enough.  Where are we going to be in five or ten years if this continues?  Other issues that arise are around standards and around approach. 

Social media is driving a revolution in terms of how other media cover stories.  We in the media need to question ourselves around this.  At the very least we need to acknowledge that we are shifting our rules, shifting our own approach:  we are reporting things that even 12 or 24 months ago we would not have reported.  That's because they're widespread in social media.  It's out there, so you begin to see the relevance.  I don't think we talk enough about how our own approach is being influenced and shifted in that regard. 

Investment is also absolutely being affected, as advertising revenue, particularly in the digital sphere, is being affected.  I don't just mean general investment.  Public service media, for me, is in a very privileged position because we receive public funding.  And that puts responsibilities on us to invest in things that others don't. 

Without us, those things will not be invested in.  The pressure on all companies, both public service and commercial, from the consolidation of revenues and the fragmentation of audiences is jeopardizing a lot of that content.  Remember, an awful lot of it is European content.  87% of EBU members' investment is in original European content:  18 billion every year.  That is a unique contribution to European society. 

I think that that is a problem.  Coming quickly to how this can be addressed:  I think there's a responsibility on everybody.  There is a responsibility on the state, I would argue coming from the EBU, to ensure that there is a properly funded ecosystem and that public service media is not restricted from investing in the digital world. 

Like some speakers earlier, I'm nervous about widespread strong regulation of fake news as a reaction, and about what the wider implications are for freedom of speech.  But the state does have responsibility around hate speech and a whole range of things that are effectively criminal, and those responsibilities need to be taken more seriously.  And we in the media have a responsibility, particularly in public service media, to invest in investigative journalism, to invest in our journalism. 

As journalism chases the 24‑hour news cycle and the pressure to get it out there immediately, I do believe there are standards that fall.  We need to invest in training, in our journalism, and in elements like investigative journalism.  We need to prioritize those. 

With the social media platforms, we're beginning to see some kind of response, as we've seen from Facebook in the last six months.  Google made an announcement yesterday.  I don't think it's enough, but there is some movement.  They have a lot of revenue and they could do more.  But there has been some movement. 

Finally, I think where we all have a responsibility is around education:  around training, the lifelong learning experience that the state can help with, the training and educational programming and so on that broadcasters can emphasize, but also the training of our own staff, talking about the issues around quality and discussing them openly, and how our own approach is changing through the influence of social media and the impact it's having. 

I think those are the areas we should be looking at. 

>> NATHALIE DUCOMMUN:  Thank you very much for that.  Mr. Sebastien Soriano, you're the chairman of the French National Regulatory Authority for telecoms and posts.  You're also the 2017 chairman of BEREC, the Body of European Regulators for Electronic Communications.  Let's hear your point of view about the challenges. 

>> SEBASTIEN SORIANO:  Thanks to Philipp Metzger for giving me the opportunity to speak to you today.  What I would like to do is put on the table a testimony, a testimony from a regulator, that is, someone working day to day with the industry to lead the industry in a good direction. 

So I've been working in competition and telecoms for 15 years, working with media players and Internet giants.  I would like to take a bit of a helicopter view and not talk specifically about fake news. 

I think the biggest challenge for governments and public authorities with digitization is the acceleration.  In our history we have never seen waves of innovation coming so close after one another:  Internet platforms, smartphones, the Internet of things, big data, artificial intelligence.  All of these innovations are going so fast.  That's the main challenge.  This leads first to education, of course, but that is not my area.  I will talk about the challenges that I see for policymakers, for legislators, when they have to define solutions to fix the problems. 

What is impressive with this fast‑changing environment is that every day there is a new problem.  We had the right to be forgotten, we have the fake news, and we have artificial intelligence, new challenges every day.  We also have big companies that appear in a very short period of time.  It's like empires that are less than ten years old.  This is also part of the challenge.  Maybe the most difficult part is that there is no certainty about the future. 

Three or four years ago many people were saying, talking about education, you have to teach your kids how to code.  Basically everybody was keen on that:  teach your kids to code.  What do we hear now?  Artificial intelligence will do the coding.  Don't do that.  Teach your kids poetry and art and anthropology, the things the machine will never be able to do. 

So the main challenge is: how do we face the unknown? Regulating the unknown is quite difficult. In that frame, and to be concrete, I would advocate two directions, coming from my experience as a regulator. First, rely on long-term principles. When you define a principle, be generic. Don't change it every two or three years.

When I look at what is happening in the United States regarding net neutrality, for instance: you can be for or against net neutrality, but what is striking is that they changed their mind three or four times in ten years. That is not good. Principles must be laid down for a long time.

And the second principle I would advocate is flexibility when you implement the principles. This is where I think we not only have to improve ourselves; I think we really need a revolution, because for governments and regulators it used to be so easy: you define an obligation for the company, and when they don't comply, you declare an infringement and impose a fine. Now that is no longer workable, because by the time you have defined the problem and defined the rule, things have already changed because of the acceleration.

So we need to totally rethink how we regulate markets. And what I see from my experience is that sometimes it is more efficient to nudge the market than to regulate it in the classical way with legal tools.

>> NATHALIE DUCOMMUN:  You have a concrete example? 

>> SEBASTIEN SORIANO:  Yes. For instance, an example from my telecom experience: something that is very difficult is to put pressure on telecom operators to invest in networks. You have competition; competition is good; but sometimes it's not enough, especially in rural areas.

So what we did is a program we called the unbundling of data. Classically, a telecom regulator focuses on the network. We asked operators to give us very detailed data about the coverage of their mobile networks, and we published this information as open data. Now we are working with startups that are launching (?) services. Thanks to such a service you can say: I live here, I work here, I take this road, I take this subway, here is my gym club. Then the service tells you which operator has the best coverage for you — not the average coverage of the country, but the coverage just for you. It's tailor-made information.

Thanks to that, we are nudging the market to invest more and more, and not only to compete on prices. It's important to have good prices in the market, but also competition on investment.

So that is nudging by using data. We have a comprehensive program that we call regulating with data. It's a very concrete direction that I really recommend working on to find solutions. Thank you for your attention.
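[Editor's note] The coverage-comparison service Mr. Soriano describes could be sketched roughly as follows. This is an illustrative toy, not ARCEP's actual system: the operator names, places, and coverage scores are hypothetical, standing in for a regulator's open coverage data.

```python
# Sketch of "regulating with data": given open coverage data published by a
# regulator, rank operators by how well they cover the places a specific
# user actually frequents, rather than by a national average.
# All names and scores below are hypothetical examples.

# coverage[operator][location] -> quality score (0 = none, 3 = excellent)
coverage = {
    "OperatorA": {"home": 3, "office": 1, "gym": 2, "commute": 2},
    "OperatorB": {"home": 2, "office": 3, "gym": 3, "commute": 1},
    "OperatorC": {"home": 1, "office": 2, "gym": 1, "commute": 3},
}

def best_operator(user_places, coverage):
    """Return the operator with the best average coverage over the
    user's own places — tailor-made information, not a country average."""
    def avg(op):
        scores = [coverage[op].get(place, 0) for place in user_places]
        return sum(scores) / len(scores)
    return max(coverage, key=avg)

# A user who mostly cares about office and gym coverage:
print(best_operator(["office", "gym"], coverage))  # -> OperatorB
```

The nudge is that operators who under-invest in particular areas become visibly worse in the rankings users actually see, without any formal obligation being imposed.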

>> NATHALIE DUCOMMUN:  Thank you very much. Let's have a little bit of dialogue. Sorry, you were missing earlier. Ms. Dad, you're the executive director of the Digital Rights Foundation in Pakistan. Thank you for being here; I didn't see you arrive. You're one of the pioneers who have been campaigning for access to an open Internet in Pakistan and globally.

We've been talking about reach through the Internet and an inclusive way of seeing it, and that's one of the benefits that I think most of you agree we need to protect. What are the main challenges you face today, please?

>> NIGHAT DAD:  I'm sorry, I had a conflicting workshop, so I was a bit late. I think when we speak about the challenges in question, we have to take into account the violations of human rights that are occurring at the individual level online, gender-based or otherwise; at the organizational level, for example the promotion of violent extremist ideologies that endorse (?); and the human rights violations at the official level in the form of censorship, shutdowns, and the passage of legislation that stifles and criminalizes dissent on national security grounds.

One challenge that digitization can bring — and most of the panelists have talked about it — is the danger of false news narratives, or fake news, being unchecked and allowed to hijack the democratic accountability process I referred to. We have seen it occur in India with social media during and after the Indian elections. This can tie into the criminalization of freedom of expression by the state. This happens when (?) are not allowed to be cultivated and practiced enough.

The goal is not just to bring in technology and increase the number of connected people, but to take on board democratic criticism from the public, who have in turn elected the ones they are criticizing. What I mean is that government officials were elected into office and have not made good on their campaign promises, hence the criticism. The greater the avenues people can use to make their voices heard, the more the public can make sure that the elected are held accountable in a constructive manner.

Furthermore, it becomes problematic when the penalties are draconian and overly broad. Many are unclear about where the line of democratic discourse begins or ends in the eyes of the state. We're left with a situation where democratic voices can be silenced — as we have seen this year and in years past — by accusations of terrorism or blasphemy.

I speak here in the context of Pakistan. It's important to push forward the idea of giving someone access to technology; just as important, if not more so, is the ability to make responsible use of that access without it being criminalized by the State in the name of security — the ability to use that skill set without being penalized by the State. (?) people are made aware of the consequences of misusing technology as well as the rewards of the inclusive public space that can come about. To put things in perspective, in a space that is global and blurs the concept of (?), people, and only people collectively, are responsible for imagining and striving for spaces that are safe and inclusive for everyone.

What governments must do is ensure that their citizens are able to utilize digital literacy skills to their fullest potential without fear of being penalized for exercising civil discourse, which is a key part of democratic society and must be upheld online and offline alike.

>> NATHALIE DUCOMMUN:  Thank you very much for that.  Katharina, let's listen to comments from participants ‑‑ I was told I must speak in the mic for that.  Do we have comments from people following online? 

>> KATHARINA HOENE:  The discussion online has been homing in on this question of fake news, and the comments and questions have gotten a lot more concrete, which is quite interesting. Let me highlight two sets of questions. The first comes from Desiree, who says she very much agrees with the assessment on fake news. Then she asks a question that goes to the issue of regulation we've been addressing: how should platforms and governments deal with filter bubbles, and what kind of policies can they employ? What policies should search engines employ, and how? She also asks what the speakers think of WikiTribune, Jimmy Wales' initiative to have professional journalists and the community produce fact-checked global news stories.

There's a second set of questions focused on fake news, from Sophia. She picked up on the argument that we can counter fake news not by fact checking but by producing a positive story. She's wondering: are there examples of this, and what channels should we use?

>> NATHALIE DUCOMMUN:  Thank you.  Would somebody like to answer a few of these questions?  Mr. Soriano? 

>> SEBASTIEN SORIANO:  This goes back to what I said about regulating with data and how you can nudge the market — and nudge people. For instance, you have these connected devices that count your number of steps. When you open your smartphone, the application says: oh, you have walked only 3,000 steps today; this is not good, sir, you should walk more than that. Why not have the same thing for the filter bubble? Sir, yesterday you were really in your bubble; you should open your mind. You could go through an application that will give you a different point of view.
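[Editor's note] Mr. Soriano's step-counter analogy could look something like the toy sketch below. The viewpoint labels and the 70% threshold are hypothetical choices made for illustration; this is not a description of any real product.

```python
# A toy "filter-bubble pedometer": score how varied a user's reading was
# yesterday and nudge them if one viewpoint dominated.
from collections import Counter

def bubble_nudge(articles_read, threshold=0.7):
    """articles_read: list of viewpoint labels, one per article read.
    Returns a nudge message if one viewpoint exceeds the threshold share."""
    if not articles_read:
        return "No reading recorded yesterday."
    counts = Counter(articles_read)
    label, n = counts.most_common(1)[0]     # dominant viewpoint and its count
    share = n / len(articles_read)
    if share > threshold:
        return (f"Yesterday you were really in your bubble: {share:.0%} of "
                f"your reading came from '{label}' sources. Try another view.")
    return "Nice balance yesterday - keep reading across viewpoints."

print(bubble_nudge(["left", "left", "left", "left", "center"]))
```

Like the step counter, the point is a gentle behavioral nudge based on the user's own data, not a rule imposed on platforms.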

>> NATHALIE DUCOMMUN:  But you have to trust the bubble app to believe what it says about your bubble — are we going back to public trust? Thank you for the example, of course. Mr. Navarro?

>> GONZALO NAVARRO:  We may agree, and we may disagree respectfully. I would like to offer an alternative vision to what Mr. Soriano is mentioning here. First of all, I'm not afraid of the future. Second, the Internet is a positive element for development. We are forgetting that: we are talking about trust problems and issues and not addressing them in a positive way.

I work with platforms. We tend to think that all the relationships run through the big giant companies (?), but we are forgetting the 99% of companies that also provide services online through platforms. They're going to be affected by how regulation shapes those relationships. It's important that we take a deep breath, not forcing regulation at the beginning, but thinking and being flexible about how we apply the principles we agree on — democracy, freedom of speech, and the public goods that usually stand behind them — and how we can work toward a solution to the problem.

I mentioned this in terms of fake news: how are we going to address it? The multistakeholder model could be effective, and we're starting to think about these issues there. Obviously solutions are not going to come overnight, but eventually we're going to find them. The thing is — and here I also agree with you — everything is super fast now. We are dealing with issues, good and bad, every day.

But if we are not able to talk about this, if we are not able to think and consider other ways — unlimited ways, not only in terms of technology but also (?) — to address these issues, we are not going anywhere. Thank you.

>> NATHALIE DUCOMMUN:  Thank you.  We have Mr. Duffy and then Mr. ‑‑ please. 

>> BOBBY DUFFY:  To pick up on those examples, two points: one general approach and one specific from my experience. The BBC has a program of solutions-focused journalism going on now, which they're just starting to embed within the BBC. The important part is that it's not the fluffy happy news stories you get at the end of bulletins. It's addressing real issues: you acknowledge a challenge and then work through to the solution, pointing out how it can be addressed. While we have this focus on negative information, people also look for solutions; we're hard-wired to look for the way through a problem. That fits with that. It's not distracting people with fluffy news stories.

We're working on a series of different stories together about how diversity can actually make connections between people rather than fragment them. Telling those stories is really key.

Then one specific example is from the immigration sector, which we focused on very much in the UK. The campaigners in that sector learned early on that you can't counter the misperceptions about the scale and nature of immigration in the UK by firing statistics back at people, because that misdiagnoses the problem. People don't engage with the general but with the specific. That's why all the campaigns that we saw in the UK and in other countries were about individual immigrants telling positive stories about immigration. It's not covering up; it's not trying to distract with facts and figures. It's telling the individual story, which is what we naturally engage with as humans.

>> NATHALIE DUCOMMUN:  Thank you for these examples. 

>> PANELIST:  Just on the last two questions, around fact checking and positive journalism. Everyone is responding to this to some degree — newspapers, media organizations, the EBU — all setting up fact-checking units. We need to see over time just how effective they are. I think they're effective on the stories they work on; the question is how many are slipping through regardless. That's something we're going to have to see over time.

In terms of positive news and approaches, there are a couple of new initiatives. There's a constructive journalism initiative which is taking hold — though I've had misgivings about the title "constructive journalism": does it mean the rest is destructive? But there are elements in it where journalists and editors need to question the priorities they're putting on certain stories, and the number of those stories they cover within the overall mix of a news bulletin and news output. There are important questions we need to ask ourselves.

Similarly, in the EBU we're starting a quality journalism initiative, tied in with constructive journalism, which is as much about posing questions to ourselves as it is about anything else.

>> NATHALIE DUCOMMUN:  Thank you.  Yes, please. 

>> FARIDA DWI CAHYARINI:  I think the fact checking is very good, and it goes hand in hand with communities doing it. It is therefore not only government; it is the stakeholders together that make the fact checkers. Thank you.

>> NATHALIE DUCOMMUN:  There are quite a lot of initiatives at the moment around fact checking and how to counter this problem of misinformation. In some of the examples, it seems that asking for the participation of the public — what are your problems, and how can we address them — helps. Whether it restores public trust would be one of the questions. Ms. Luciani?

>> CLAUDIA LUCIANI:  (Off mic) The fact-checking initiatives that worked best — this was tested during the French presidential campaign — were those that relied on different media outlets from across the spectrum of views. Voters trusted the fact checking more when they saw that there was contradiction within the partnership: different views were being considered. They trusted a lot less the fact checking that came from a single media outlet; it had less of an impact. These are some of the conclusions that we reached.

As to the filter bubbles, there is an app we reviewed that actually helps fight them. It's called Reading Across the Aisle: you are automatically made to look at a news feed that comes from across the aisle from your normal feeds. Again, such tools tend to have a limited impact on fighting these echo chambers and reinforcing messages, but they do exist, and this is the one I recommend.
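[Editor's note] The finding Ms. Luciani reports — that voters trusted fact-checking more when it came from a partnership of outlets across the spectrum — can be illustrated with a small aggregation sketch. The outlet names, verdict labels, and the unanimity rule are hypothetical illustrations, not a description of any real partnership's methodology.

```python
# Hypothetical sketch: only publish a rating on a claim when several
# independent outlets from across the spectrum reach the same verdict;
# otherwise flag the claim as disputed rather than assert a rating.
def partnership_verdict(verdicts):
    """verdicts: dict mapping outlet name -> 'true' or 'false'.
    Returns (label, number of outlets agreeing)."""
    values = list(verdicts.values())
    if len(values) < 2:
        # a single outlet is not a partnership: withhold a rating
        return ("insufficient", len(values))
    for label in ("true", "false"):
        if values.count(label) == len(values):
            return (label, len(values))  # unanimous across outlets
    return ("disputed", 0)  # outlets disagree: flag, don't assert

claims = {
    "claim-1": {"OutletLeft": "false", "OutletCenter": "false",
                "OutletRight": "false"},
    "claim-2": {"OutletLeft": "false", "OutletRight": "true"},
}
for claim, verdicts in claims.items():
    print(claim, partnership_verdict(verdicts))
# claim-1 ('false', 3)     -- unanimous, publishable
# claim-2 ('disputed', 0)  -- contested, flagged instead
```

The design choice mirrors the reported result: agreement across ideologically diverse outlets carries more weight with readers than any single outlet's verdict.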

>> NATHALIE DUCOMMUN:  Thank you. Do we have other examples? We're talking about the multistakeholder perspective — and then we'll go to the audience — on how to encourage initiatives that gain more public trust. Should there be an infrastructure, a form of governance, about how to regulate — the word is too strong, of course — but to have a sense of oversight of what is happening on the net, a sort of control of the Internet? Should we go that far? I don't know if someone wants to answer that. Mr. LaRue, you were talking about how to encourage dialogue among the different stakeholders.

>> FRANK LARUE:  UNESCO just published a study on multistakeholder dialogue looking at specific cases: Brazil, Kenya, South Korea, and the IGF itself. It was interesting because one of our conclusions is that the multistakeholder dialogue has to be a systematic exercise in order to have an impact. It cannot be a one-time event. It has to be clearly linked to the different interests and perspectives of all the different stakeholders.

The case of Brazil, which I always point out, has an additional feature: it has been institutionalized. Institutionalizing something is always a way of making it more permanent — though it can also limit it in some cases if it's not well done.

I think the case of CGI in Brazil is interesting because they now have an institute independent of the State where they carry on their multistakeholder dialogue. They were able to establish principles, which were made public. Out of the principles came the policies, and out of those came the law that they follow for the Internet.

So clearly this is a multistakeholder dialogue that had a very specific impact on policymaking and legislation in their own country. As I said, this is part of the media and information literacy that has to be done, because the people participating represented different sectors but studied their positions thoroughly before coming to the dialogue and took very concrete proposals forward. That is what I think made it successful.

>> NATHALIE DUCOMMUN:  Thank you.  Thank you very much. 

>> PANELIST:  In terms of the multistakeholder approach, I think we've heard it today. If you listen to what the commissioner said earlier and what other panelists have said around this issue, themes are coming up around transparency and pluralism, and around skills and education. So I think there is common ground.

The differences between media organizations, perhaps, and states will be over the amount of direct regulation, and over the fact that regulation could be misused, particularly in some countries or under some governments, to restrict freedom of the press. But even in what we discussed today there's a lot of common ground, and the multistakeholder approach works where there is that much common ground.

>> NATHALIE DUCOMMUN:  As we said earlier on as well (?) hands are being raised — I suppose there are a lot of comments. Yes, let's start with you. Just press on the microphone.

>> AUDIENCE:  Thank you. (?) I would like to continue with this line of questioning with regard to the multistakeholder approach — thank you to Mr. Navarro for touching upon this. I would like to ask the floor's opinion on whether there is a common basic agreement between the different stakeholders. For me this is visible in the title of this session: are we really talking about the impact of digitization on politics, public trust, and democracy? Or are we talking about politics, public trust, and democracy in a digital world?

I would say the majority of governments and traditional media are pursuing the first approach, where the focus is on digitization either as a tool or a threat, which then brings forward questions of security, control, and privacy. On the other side you have the private sector and the users in the digital world pursuing the second approach, where the digital world is a reality.

So we actually have to come up with new definitions, or new framings, of politics, public trust, and democracy. The rest of the users are somewhere in the middle. If we're talking about a multistakeholder approach and these are completely different views — on engagement, on producing content, on information and misinformation — where are we in this area? Thank you.

>> NATHALIE DUCOMMUN:  Mr. Soriano? 

>> SEBASTIEN SORIANO:  Yeah. This comes to the question of threats and opportunities of the digital. It's about government as a platform. Thanks to digital tools you can totally reinvent how governments act, and that is maybe also a more positive way to see it. That is mainly something discussed on the platform of the Open Government Partnership.

When I talk about regulating with data, as I mentioned, this is typically one part of it. So you're right that in the end, the question is not how we react to something, but how we put the digital into the DNA of governments. Yeah.

>> NATHALIE DUCOMMUN:  Thank you.  Yes, sir, you were asking for the microphone. 

>> AUDIENCE:  Oops. Joe Calla. To complete this picture, this discussion about the impact the Internet is having on public trust, democratic processes, politics, and government in general: I think we mustn't overlook — it hasn't been mentioned yet, at least that I've heard — the way political advertising is paid for, both in relation to specific political campaigns and to broader, cause-related campaigns. In the old days there was a comfortable, and comforting, degree of imprecision about most forms of political advertising. It was there; we knew about it; but it didn't work for everybody in quite the way it can now, where, because of the large amounts of data the platforms have collected, people can microtarget specific demographics in particular seats, marginal seats, marginal states. Let's not forget Donald Trump did not win the popular vote but the (?). In Brexit, if 600,000 people had voted a different way, there would have been a different result. I wouldn't say Donald Trump or Brexit can be attributed to the microtargeting I alluded to.

There are requirements in the United States and the United Kingdom about (?) who pays for it. It's a very, very important part of the total picture, and it can feed filter bubbles and disinformation, because it goes to confidence in our political processes.

>> NATHALIE DUCOMMUN:  Would you call for regulation of that, and how? Because if you want an open Internet, the risk is that everybody can take part and everybody can deliver their message. Regulation?

>> SEBASTIEN SORIANO:  We've always had rules about foreign money influencing elections.  We need full transparency about any political paid‑for advertising and limits as well. 


>> FRANK LARUE:  I think the last contributions from the public have been very important, because, yes — when we talk about trust, are we talking about trust in democracy as a system, or are we talking about trust in the Internet and communication, which was raised before? I think this is a crucial question. I have a feeling that even democratic models are in a way being challenged by the new systems of communication. We really have to take a look at ourselves in general.

In a way, democracy was built on the idea that everyone should be able to participate and contribute. But everybody participating and contributing means everybody should be well informed. If the information is designed and almost channeled in a personalized way, because of the amount of analysis of big data, then how much is induction and how much is information? That's a crucial question. There are serious questions being raised.

I think this is a reflection to be had. On the opposite side are the opportunities: to have transparent government, as we were saying; to have openness about what is being done by all authorities, whether local, city-level, Parliament, the executive, even the judiciary. My fear is that one of the key actors in all this will now, more and more, be the judiciary. People are turning more to the courts to define these issues, which is why we have been working with judges on issues of freedom of expression and access to information, which in a way were relatively new for judges.

One element that I think is important to keep in mind is that, as we were saying before, access to information is the first step — but also, who is the interlocutor? Who are people speaking to? Is communication being used in an interactive way or not?

This is crucial to say because we also see more restrictive laws now, precisely because people are scared of disinformation, or invoke national security or other issues. There are more and more restrictions. At the same time, we see more fear in how this is handled: we see more shutdowns of the Internet happening, for instance — more than 71 shutdowns this year alone. Oftentimes the problem is the lack of information, not the disinformation.

>> NATHALIE DUCOMMUN:  Who should be held responsible for the spreading of this mis- and malinformation, for these political messages we were talking about? We're talking about Facebook, Twitter — should they in some way be held responsible for the content?

>> CLAUDIA LUCIANI:  I agree with Mr. LaRue's comment, which obviously everybody understands. To go back to political parties and to trust in democracy and democratic processes: there's no doubt that the Internet has been a true democratic tool used across the world. It has helped improve traditional parties' functioning, because it has exposed them to more transparency and allowed them to be investigated a lot more.

Having said that, we need common definitions of what democracy is all about and of what counts as democratic. To go back to what I said: online democracy cannot be weaker than offline democracy. The citizen cannot be less protected online than offline.

Obviously, the answer, as has been said before, is a multistakeholder governance discussion about having common definitions of what those concepts mean today and how they will evolve. We need governments, members of Parliament — and, something that hasn't been mentioned, the local level, where democracy happens — Civil Society, and businesses.

We have recently signed an exchange of letters with a number of Internet companies because we want to engage in that discussion. We want to discuss with them how we can bring about those common definitions. They need them too, because they need to know that when they operate in different countries they are not going to be subject to different systems — different courts shutting them down and asking for content to be removed. So the need for common, shared definitions is as important as ‑‑

>> NATHALIE DUCOMMUN:  Yeah. We're going back to governance and a multistakeholder perspective on that matter rather than straightforward regulation. We're unfortunately getting close to the end of this panel. Only time for one last intervention. Yes, please. I'm sorry — you already spoke before.

>> AUDIENCE:  I'm Sylvia. I'm the head of the Media and Internet Division at the Council of Europe, and I service the steering committee on media and information society. This very committee is discussing all these issues and is assisting Member States to come to grips with them. I invite you to read a very important report called Information Disorder — I think we should have called it information pollution. It will help you contribute to an informed debate, and it will help the Council of Europe move the issue further. What's on the menu for the next biennium is work on how to support quality journalism, online and offline. I'll stop here — please read it, to contribute in an informed manner, as Frank LaRue has said.

>> NATHALIE DUCOMMUN:  Thank you. We heard from the European Commissioner. For reasons of time we've focused on misinformation. What we covered less — that will probably be for next time — is artificial intelligence, which will also be changing our social interactions online and offline. I would like to thank you all for your participation.

We're getting close to the end of this session, and we want our host chairperson, Philipp Metzger, to share the points that he finds important for us to take away today. Philipp Metzger?

>> PHILIPP METZGER:  Thank you very much. Ladies and gentlemen, it's been fascinating sitting here and listening to the discussions we've had. I think we had a very informed discussion, with really high-quality contributions and interactions. So my prevailing sentiment is: there is hope. Maybe I'm saying that from inside my own personal bubble. But listening to all that has been shared, I had the impression that ultimately the bottom line is that the opportunities outweigh the challenges. Of course, there are a number of points we have to work on very hard. I find it quite interesting.

We had this segmentation of the session into opportunities and then challenges. I found it interesting that many of the panelists' contributions, and contributions from the audience, in the first half were often about challenges — and vice versa.


That demonstrates the complexity of the matter we're dealing with. Again, there's hope: in my personal assessment the opportunities outweigh the challenges. I would like to start with the question of access. I think it was raised by the Minister of Bangladesh. Access remains fundamental; without access we wouldn't have what we're discussing today. This means the actual connection, but also the information. It's something that gives us a tool to include everybody.

I think it has been said that, ultimately, the stronger societies are those that include everybody, including the weaker members of society. In that sense, the Internet is an extremely effective tool for reaching those who are outside. As a member of the Broadband Commission for Development, we're trying to figure out very concrete policy and regulation recommendations to give that access to everyone. I think that's the basis; without it, there is nothing further to discuss.

From that (?), I think the question that was prominent here was ultimately that of public services. I think it's quite a good way to phrase it (?). It's quite clear that there the governments are dependent on the stakeholders. Governments can learn from the stakeholders and improve the public services they're rendering — not just for their own sake but for the stakeholders as well. That's key, so that government doesn't stall at a certain level (?) and moves on to a more modern modus operandi.

I look at myself as well: we have the same challenges in my government service. That's something we have to constantly remind ourselves of, and we're reminded by the stakeholders as well; that's why we're here together. (?) related to transparency and accountability, which will increase through digital technology and the Internet.

We heard from the parliamentary point of view what could be improved there, and I think that's one aspect: Parliament is a key player in democratic processes. Again, I think it applies across the board.

Then, of course, if you talk about Parliament, you really come to the key issue of participation and of the exercise of your fundamental rights. The example I heard from Brazil about the e-petition project — an e-complaint system, a complaint hub — I found to be a telling story of the potential that the digital world and the Internet are unleashing, one that concretely demonstrates the benefits we can reap from digital tools.

Then, of course, the bottom line for all these aspects — infrastructure, public services, and engagement in the public sphere — is education, education, education. It's quite clear that we have massive challenges there. Of course we have opportunities, but we have to provide proper education to everyone, basically, so that we have full engagement.

That, for me, is the one big aspect: an overall view of infrastructure, services, and interaction for political purposes — and by that I mean democratic processes of taking decisions with other stakeholders. Then, of course, we had a strong focus on the question of information versus misinformation. I'm using that term now because, I think quite correctly, there were many reactions to the term fake news. I found it quite interesting that the moderator offered a temporary conclusion at some stage as far as fake news is concerned.

What resonated with me more than fake news as a concern was actually the statement by one of our panelists that we shouldn't overreact to fake news. For me that was a key aspect: certainly, not overreacting is one thing.

The other fundamental thing for any democratic society is not to censor.  Don't get into situations where, because you have a significant concern about misinformation, you overreact by censoring.  I find that very key.  There were a number of voluntary efforts by various stakeholders which were mentioned, of course, the intermediaries and the NGOs that are fact checking.  But there as well, a key conclusion I heard was that we could use that challenge. 

We should use the challenge of misinformation to reengage with the governments and with the state, and the governments should not then ultimately disengage and delegate their responsibilities away when it comes to assuring proper information of the public.  So I think that is also a beacon of hope.  If that is an opportunity to reengage with the state, with the governments, and together with the population, I think that is something which is healthy. 

I'll come to values again.  It boils down to the question of values as well.  Of course, public trust will be key in this.  I found the figure of 59% trust in the radio quite interesting, and it's quite clear that we have the tools.  We have public broadcasting systems in a number of countries which can also support that effort of building and rebuilding trust. 

Again, I think when it comes to that, it's education, education, education.  Somebody used the phrase cyber literacy.  I think that may reflect well what we have to look at going forward, because at the national level we find it difficult at times to engage, for instance, with young citizens, because they are on the wrong channels.  I think that is something we should (?) The question is, where does that leave us going forward?  We have indeed a situation where we're grappling with speed.  We are hugely challenged by the speed of the developments we're dealing with.  Everything happens so fast, and there's no certainty about the future. 

We have concentration phenomena which affect our investment in the infrastructure and services that are relevant for building that trust and for the democratic processes.  We clearly have disruption with big data and artificial intelligence, where we don't have ready‑made recipes to find solutions just like that.  And somebody mentioned that we don't have an international cyber crime or cyber policy framework. 

So if you look at those challenges going forward, I think an important statement was made that we have to look at the long term and, to the extent possible, find main principles that allow us to set a certain base, a certain frame, to then find solutions in the cases at hand going forward. 

To that principle‑based basis for future digital cooperation belongs, I think, the question of cross‑border enforcement.  We clearly have an issue with trust by the public if you can't find the perpetrators of a crime that happens online in our own courtyard, in our home country, because they are somewhere else.  So we have a clear challenge to also focus on cross‑border enforcement of the existing rules, because we have so many.  As a panelist said, the international laws and rules that apply in the analog world also apply in the digital world.  I think that calls for better enforcement across borders. 

So maybe I'll finish on an almost philosophical level.  I think this is about transferring our human nature into the digital space and, how shall I say, using the digital space to the best of our possibilities, even though it is a phenomenon we don't really know yet how to handle. 

I think there is hope again in the long term. (?) A book was mentioned:  in the 15th Century it didn't travel very far and was not necessarily accurate.  It could be wrong.  People learned as well.  Of course, we have the speed issue.  The question is how much that will test our physical and human limits.  It boils down really to a strong value system.  That was mentioned in the discussion we've had today as well.  There are strong values amongst all of us that allow us to have the debate we're having.  I think the task going forward is to find that value system under new circumstances, in a digitally‑challenged world, and to reinvigorate that system as well, because sometimes we have had strong values and have lost them partly, be it through the generations, be it through other challenges such as integrating certain groups in our society.  That, of course, is ultimately the D.N.A. of the multistakeholder approach:  to find common solutions. 

I liked what Kathy Brown said yesterday on the high level panel with our president:  she was identifying agreed outcomes on the panel.  I think agreed outcomes could almost be a guiding goal for our IGF here and also the coming ‑‑ (?) to finding solutions.  That's certainly my role as a government official.  That's what I'm challenged to do:  find concrete solutions and try to help our society going forward. 

I would like to thank everyone here on the panel very much.  It's been a great pleasure and an honor to chair this debate and to be able to listen to it very carefully.  I would very much welcome further discussions this week and, of course, in the many more arenas to come.  Thank you also very much.  Outstanding job. 

Thank you very much, Ms. Ducommun.  I wish you a safe return home and all the best.  Thank you. 

>> NATHALIE DUCOMMUN:  Thank you, Mr. Metzger.  It wasn't an easy task to wrap up the content of this panel.  It was so rich and such a complex theme.  So thank you ever so much. 

I would like to address our apologies to the translators because I have been told it wasn't always easy to catch what everybody was saying because of the microphones.  So sorry for that, because you do fantastic work.  Apologies.  Thank you very much for your participation and have a lovely day.  Thank you. 


(Session concluded at 1:07 p.m. CET)