IGF 2017 - Day 1 - ROOM XXVII - WS157 What digital future for vulnerable people?


The following are the outputs of the real-time captioning taken during the Twelfth Annual Meeting of the Internet Governance Forum (IGF) in Geneva, Switzerland, from 17 to 21 December 2017. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 





>> ENRICO CALANDRO:  Those include cybersecurity, cyber trends, what is the level of awareness among users of cyber threats.  Freedom of expression.  Do users always feel comfortable sharing information about themselves on the open Internet, or would they prefer, maybe, to share it only in closed groups?  And does gender impact that in any way, or does the level of education have an impact on how people share information online?

So, those are some of the questions that we will try to answer today and to discuss with the panelists.

So, I would like to start with a short presentation by our researcher and communications manager, Chenai, who will give us some information on the After Access surveys.

We are still at the beginning of the analysis of the survey.  We just completed the first round in seven African countries.  We have weighted our results, so this is actually the first time that we are presenting our findings publicly in a public forum.

And then I would like to briefly introduce our panelists.  We have Wakabi Wairagala, the director of CIPESA, a think tank and democracy organization working in Eastern and Southern African countries on digital policy issues.

We have Alexandrine Pirlot de Corbion.  She is Advocacy Officer at Privacy International.  I apologize for my pronunciation.

And, then we have Ursula Wynhoven, who is the ITU representative to the United Nations in New York.

And, then we have Yatanar Htun, who is the director at Myanmar ICT Development Organization.

And, then we have -- sorry.  We have another speaker.  Jorge Vargas is head of strategic partnerships for the Latin America region at the Wikimedia Foundation, and Elonnai Hickok is co-director of the Centre for Internet and Society in India.

Okay.  We will start with the presentation, so Chenai, you have the floor. 

>> Chenai:  Okay.  While the presentation is coming up, morning everyone.  So, thanks Enrico for the introduction.  My name is -- ah, we've got it.  The presentation almost disappeared (Laughter).

So, thank you, everyone.  I'm just going to run quickly through this presentation as we're actually late.  So, I'm going to start from the second slide.  The purpose of the research for us, as Enrico has pointed out, is trying to understand whether, with all this move towards access, we have actually got a developing digital rights divide: on one side, people with access who understand what their rights are online, and on the other, people now coming online who don't understand the extension of human rights offline to the online space.

Next slide, please.

Sorry.  Could we do full screen, please.  Okay.

So, that slide presents the high levels of difference in Internet use between urban and rural areas.  The total represents the level of Internet use within each country, and when you look at the rural figures you can find that, of the rural population that we interviewed, in the case of Ghana only 15% were making use of the Internet.  As you look further, you can find that most people who actually make use of the Internet are online in urban areas.

Next slide, please.

And then, once again, one of the big issues has been the gender digital divide.  It's still a persistent issue in the countries that we surveyed, and we found that, in the case of Rwanda for example, only 5% of the female population we interviewed made use of the Internet.  So, we still see striking differences in levels of Internet use within the African countries that we surveyed.

Next slide, please.

So, while we are fully aware that there is a gap between rural and urban and male and female, we posed questions around privacy concerns, where we wanted to find out whether one of the barriers that might limit Internet use could be privacy concerns.  What we found is that most respondents did not see privacy as a big barrier to Internet use: in South Africa, only 0.47% cited privacy as a concern.  The exceptions were Kenya and Rwanda, where respondents did cite privacy as a barrier to Internet use.  So, we can also see the gender digital divide playing out when it comes to privacy concerns.

Next slide, please.

So, this slide actually just highlights the aggregate that we found from Kenya and South Africa in terms of challenges of Internet use, and I think only in Rwanda and South Africa can we see a slightly different percentage of people, below 10%, who were concerned with privacy.

Next slide, please.


So, another issue that we looked at was freedom of expression, in terms of what content people were comfortable putting online, as well as what content people were comfortable discussing.  What we found interesting is that in Kenya, people were comfortable talking about sexual orientation, whereas in South Africa, where the legislation is open for people to be comfortable with their sexual orientation, only 11% were comfortable talking about their sexual orientation online.

Once again, South Africa is also a significant case, where one would consider the environment to be open for everyone to talk about political issues: only 28% of the people we interviewed were willing to express their political views online.

Next slide, please.

So, then we thought about self-censorship that leads to people not talking about some issues online, in particular religious matters, political views, health issues, and we threw in an interesting one, gossip among friends.  I call it interesting (Laughter).

So, what we found was that in specific countries, for example in Nigeria, people were least comfortable talking about financial matters, at 68%, followed by political matters, and health and sexual matters.  Religious matters were the only ones likely to be discussed in public, and by public we mean in any public forum; they didn't have to be in a specific group.

And then gossip among friends is likely to be discussed within closed groups online, so on specific Facebook pages or WhatsApp groups. 

Then, in Rwanda the issues people were least likely to talk about were political matters, health matters, and financial information.  Once again, gossip and work-related issues were more likely to be discussed in closed groups.  So, this raised an interesting point for us: it seems that, yes, to some extent people are comfortable talking about issues online, but in a moderated space, a space that they feel comfortable with, not just any public platform.

Next slide, please.

So, I don't know if you can lower it a little bit so people can see the title of that slide.  Thank you.  So, once again we looked at how comfortable are you to discuss gossip between friends in all of these countries, and this is correlated with levels of education.  And, what we found was that people with lower levels of education, that is either no education at all or primary education, were least likely to be comfortable talking about certain issues online.  In this particular case it was gossip.

Next slide, please.

The second slide in this set looks at comfort discussing professional, work-related information.  People were not likely to discuss that publicly.  But what we can see, once again, is the trend of being able to talk about issues in closed groups rather than on a public platform.

Next slide.

And then religious matters were something that was interesting for us as well, and what we found is that, once again, when we related it to educational levels, people with a lower level of education are unlikely to feel comfortable talking about religious matters online.

Next slide, please.

So, one of the questions that we also asked was: have you been a victim of online bullying?  To explain this one quickly: the blue represents males, the red represents females, and the two categories in each country are either no or yes.  What we found was that there weren't high reported incidences of online bullying in the surveys; asked whether they had ever been a victim of online bullying, most respondents would say no.  Interestingly, in the cases of Tanzania and Ghana we see quite a high level of yes responses in comparison to the other countries, and in Ghana it seems as if it is the men who are reporting online bullying.  So, that is something we need to unpack further, to see what level of education and what age group is experiencing this level of online bullying.

Next slide, please.

So: have you ever been confronted with unwanted, offensive or inappropriate content?  This is a question that emerged from an earlier qualitative study that we did, where we had asked people what issues they faced online, and a lot of them talked about offensive, inappropriate or bad content.  They did not really specify what exactly bad content is, but they described it as anything from pornographic content to anything that made them feel uncomfortable or went against their values.

And, in this instance we looked at male and female -- sorry, I had to change my slide.  And, you can see that a lot of the time, in some instances, as in the case of Rwanda, women were more likely to say no, they hadn't experienced any unwanted content, while men were less likely to say no; men actually had experienced more exposure to unwanted content.

Last slide, please.

So, in essence, what we've just done is present a teaser of some of these emerging issues.  They still need to be unpacked to really understand who exactly is saying that they are experiencing these issues online, and perhaps the other question around it is also trying to understand the way in which people define privacy concerns or bullying online, because what may be described in a quantitative survey may be different in a qualitative survey.

Thank you.

>> ENRICO CALANDRO:  Thank you.  Thank you very much, Chenai, for this interesting presentation that sheds light on how people really perceive some of their digital rights online.

Most of the time these kinds of discussions are informed by a legal perspective: when we discuss privacy, freedom of expression and cybersecurity or cyber threats, we normally address these issues from a legal and regulatory perspective.  It is always difficult to bring in the evidence from users.  There is some anecdotal evidence of a chilling effect, but it has never really been measured.

My first question is for Wakabi.  You work on digital rights issues in Eastern Africa and Southern Africa, both conducting research and acting as an advocacy group.  So, from your perspective, how can this kind of evidence inform policy makers, and how can it bring users' voices into policy-making debates on digital rights? 

>> Wakabi Wairagala:  Thank you, Enrico.

Advocacy is really difficult to do if you don't have evidence to back it up.  In our areas of operation there is a shortage of evidence to inform the kind of advocacy which is going to have an impact with policy makers, but also with others who might want to make such efforts to drive advocacy.  So, it is really important to establish what the status is of the issues that we are advocating for, and also to come up with possible solutions, before we are able to do the advocacy.

Research on these matters, I think, has not always been holistic in terms of addressing the more traditional access issues, in terms of pricing and affordability, and linking those to more mainstream digital rights issues such as privacy, free expression and access to information.

I think what this research does, which would be very useful for people working in our field, is that it brings in both of those aspects: these are the costs, this is what is hindering people, this is where they feel they cannot speak, this is what is, you know, offending those who are online.

So, I think once we have both of those dimensions, then it becomes easier for us to be able to advocate.

If I may say a little bit more on that.  We have a lot of broadband policies, for instance, in our region.  Most of the countries actually have broadband policies, with very fine (?), but in reality they have done next to zero in terms of making those policies achieve what they stated.  There is little that has been done to reduce the cost of data, the cost of airtime.  There is little that has been done in terms of getting marginalized groups to (?) benefits and use the technology and afford it.  There are gender targets, but in reality nothing is done in many of these countries to enable more women to access and gainfully use technologies.


So, I think this kind of research can also be used in advocating for policies, such as access policies and broadband strategies, to address some of these issues that are being brought out by this research.

Data protection, again, is an area where there is, as mentioned, a shortage.  Chenai has told us that many of the people did not feel that the lack of protection of their data was a hindrance to use, but there are obviously those for whom it is an issue, and this protection is needed from governments, from companies, from individuals, because there are breaches, huge ones, arising from all those actors.  Yet among the countries, we have just about ten or 12 that have data protection policies.  Without data protection policies, again, we obviously have issues of free expression, of offensive content, of people not being comfortable speaking about certain issues, which may hinder them from being able to gainfully use ICTs.

So, in sum, we need research that speaks to the various issues of affordability, rights and livelihoods.  That is the evidence which we think we can use in order to gain the attention of other actors in this space, but also of policy makers.


>> ENRICO CALANDRO:  Thank you very much, Wakabi.

And Alexandrine, again from the Privacy International perspective: we are expecting that by the end of May 2018 African countries, or at least those doing business with Europe, will need to comply with the new European regulation on data protection, and there have been many debates on how small and medium enterprises, and the African economy more broadly, will be able to be compliant in order to protect the privacy of citizens.  But, from what we are actually seeing in this presentation, it seems there is a very low level of awareness among the majority of the population of what privacy is.

So, can you see any risk in that?  And, what do you think: from the European regulator's perspective, has the user perspective been taken into account?  Do they know what the situation is for the majority of these populations, namely that there is little awareness of privacy?  And what kind of repercussions do you think it will have once we arrive at the day when we will all be required to be compliant? 

>> Alexandrine Pirlot de Corbion:  Thank you very much to the organizers, first of all, for inviting us to be part of this conversation.

Also, a short disclaimer: at Privacy International, we're based in London, but we work with an international network of partners, and we have five partners in Africa.  It is the work that we do with them that I'm going to be reflecting on in this discussion.

In terms of the different questions posed: even though people may be surprised to see the low level of concern with regard to privacy in the findings presented, it is actually something we see quite a lot, not only in Africa, but in Europe and other regions as well.  And, I think it all comes down to how we talk to people about privacy and data protection, because we can be doing surveys and asking, you know, do you care about privacy, or do you know what privacy is, but unless we break it down into things people can relate to, they won't be able to answer that question fully informed.  And, like I said, we work with an international network of around 20 partner organizations, and their work really reflects that people are concerned about privacy.  All these organizations are doing incredible work in their different countries, because people are responding to some of the things they're seeing, but they're not sure how they can respond and what they can do about it.  Another element behind the low levels of awareness that we're seeing comes down to some of the systemic problems around the lack of transparency and accountability of both governments and corporations.  How is an individual to be concerned about something if they're not informed, if they don't even know it's happening?

And, particularly when it comes to some of the policies that we're seeing creep up in different countries, and not just in Africa again: expansive surveillance laws in the name of terrorism, in the name of national security, but also, you know, policies and practices such as SIM card registration and Wi-Fi registration.  These are all things that are having an impact and interfering with the right to privacy, but individuals are not informed of the way this interferes with their right to privacy; they're told, you need to do this so you can access this service.  And, I think we need to challenge the way in which we're having these discussions, and put into perspective what the implications are of such practices and policies.

Another challenge that privacy advocates face a lot when undertaking their work is that, when it comes to privacy, data protection and surveillance, you often only see the implications, the harm, once it has occurred.  And, that's when people start caring, usually, and it shouldn't get to that.  We shouldn't have to demonstrate the harm of something for policy makers to take into account the needs and the perspectives of users.  We should be working in a more preventative way rather than waiting for it to go so far down the line that you actually have victims.  Then, in terms of the GDPR kicking in, it's been really interesting to have these different conversations with international actors and some of the policy makers working on drafting data protection laws, because, as said by our colleague from CIPESA, there are only around 14 countries in Africa with data protection laws.  A lot of them are looking at what is happening in Europe, but also elsewhere.  Some see a colonization 2.0, in terms of a European entity imposing a framework for how they should be running their affairs, but at the same time, I think it also gives an opportunity, even though there are still problematic areas with the GDPR, of showing what is currently the highest standard in terms of data protection: particularly when it comes to informing citizens, more transparency from both government and private actors on issues that are not visible in this sphere, be it profiling or tracking or the use of algorithms in decision making, and particularly the emphasis that the GDPR has placed on giving more rights, or at least providing mechanisms to enforce the rights, of data subjects.  
And, these are all good practices, I think, that we can be integrating into the work that we're doing at the national level: in terms of reforming laws that already exist in these 14 African countries where the level is not at the standard we would want to see, but also in countries like Uganda and Kenya, where there are still discussions and drafting processes, to integrate the highest standards we can.

We could be using the GDPR language and saying, you know, if you don't adopt the GDPR standards, you will not be able to trade with Europe, you will not be able to do all these things, but I think at the same time we should focus on what the GDPR brings, rather than focusing on the legal instrument itself, and say: these are the fundamental principles you should be adopting when drafting and implementing a data protection law in your different countries.

And, maybe just a last point, as well.  What we're seeing in our engagement with our partners is how the issue of data protection and privacy is framed as a human rights issue.  Very few national human rights institutions are actually integrating issues of data protection and privacy within their work, within their mandate, within their national reports.  As far as I know, it's only the human rights commission in Uganda which last year had a chapter on digital rights with elements of privacy and surveillance, and we are trying to get them to integrate a chapter on privacy and data protection as well.  I think that all informs the discourse in terms of how people are perceiving these issues: is it a human rights issue or not?  If we can frame it in that way, it will alert people that data protection and privacy are on the same level of concern as freedom of expression, access to education, health care, non-discrimination and other fundamental rights.  I think that framing can really help push the discourse, but also get people more engaged, from civil society but also from different stakeholder groups, if we're able to frame it in that way.

Then, just a final point.  I don't know if this is something that was addressed in the survey, but it's also about the choices and alternatives that are given to users in different countries.  A lot of the time these are not accessible, or people are not informed that they have alternatives to the main platforms, which may have more exploitative business models.  And are those alternatives always practical?  When it comes to hardware, it becomes more of a luxury to be able to choose which hardware you can use.  So, we need to make those alternatives more accessible, in terms of services, hardware and software, so that users can choose and make an informed decision as to what they want to use, based on how they're using these different platforms.

>> ENRICO CALANDRO:  Okay.  Thank you very much, Alexandrine.  What you are saying, I think, also highlights the fact, specifically related to this kind of research, that there might be within the population a substantial group that is just not informed, or does not have enough skills to understand issues around digital rights.  So, on the one hand there may be a portion of the population that is more educated, that has more knowledge of the differences across platforms or hardware and knows how to protect themselves at a first level of cybersecurity, and is therefore able somewhat to enforce these kinds of rights at the individual level.  But there is a big portion of the population that might instead be completely left out of that, and for them there might be real risks of vulnerability.  So, those are new and emerging debates that historically have not really been discussed.  But, for instance, Elonnai, from your perspective: the Centre for Internet and Society works on these issues in Asian countries, in India in particular.  You might have this portion of the population as well that might not be prepared to enforce their own rights online.

So, how can a group like yours try somehow to include them in the debate, make them more aware of what their rights are and how to protect them online, even if they have less than enough knowledge of what the platforms and alternatives are, and might not know how to avoid the cyber risks that are out there? 

>> Panelist:  Thank you.  Thank you for that presentation, as well.  That was very interesting.

So, a couple of things.  Just to echo a little bit of what Alex was saying, I think the language that we use is really important when we talk about privacy and we try to ask people whether they care about privacy online or offline, because it can be understood in a variety of ways, especially something like privacy, because it can be so personal.

It's a little bit different when you're talking about access.  If you're taking a survey and trying to measure whether people have access to the Internet, that is a very easy thing to do a survey on.  We have always found that doing a survey on something like privacy or freedom of expression is difficult, because there is such a wide variety of interpretations and reasons why somebody might respond a certain way.  So, we have looked at surveys that have been done in the Indian context and tried to contextualize them with legal frameworks and political situations, and then also see how those surveys can inform policy.

India is at an interesting place right now with respect to privacy in particular.  We just recognized privacy as a fundamental right, and they're in the process of drafting privacy legislation.  And, one of the debates that is emerging is whether India needs to take a more paternalistic approach given, perhaps, the low levels of awareness and the variety of awareness that you'll find across sectors in India.

And, so, that's really interesting to me, because in some ways it's taking this kind of data, looking at levels of awareness, and then trying to say: what is the very high-level approach to privacy legislation that we need to take, and that would be appropriate for India?

I think these numbers can also be useful when you're looking at education gaps.  With any right, enabling it takes a massive amount of education.  When India recognized the right to information, they had huge campaigns about, you know, enabling people: how to use this right, what does it mean, what can you ask the government.  And now it is a very robust right in India, but it took a lot of groundwork.  So, I think it would be interesting to dig into something like awareness around privacy and really start to understand how people are using services and where the education gaps are, and use that to guide, you know, very large amounts of education.  I think that could be really interesting.

Maybe I'll go ahead and stop there.

>> ENRICO CALANDRO:  Thank you.  And, Jorge, the Wikimedia Foundation represents one of the main platforms to access and to share knowledge, but I think what we can see from this platform is that different groups of people feel comfortable, or not, with sharing certain kinds of information.  And there may be wrong perceptions of how the platform works, and therefore they might feel more comfortable contributing some content rather than other content, and as a result, I believe, this might actually shape what kind of knowledge is produced and how it's shared.

Does Wikimedia do any work on inclusion, on the inclusion of different groups, and on making them feel comfortable and safe while they produce online content and access local content, given that some content can be considered inappropriate according to different cultures and beliefs?  What is the role of Wikimedia in that respect? 

>> Jorge Vargas:  Well, first of all, allow me to introduce a little bit the Wikimedia Foundation and how Wikipedia works.  I hope that you are all aware of the existence of Wikipedia, a free online encyclopedia that has been around for over 15 years, and is fully created by millions of volunteers around the world who research, edit and curate the content.

The role of the Wikimedia Foundation is basically to be in the background: keep the sites running, maintain the websites, maintain the applications, do the partnerships, do the legal work, but at the same time support our community.  Supporting our community definitely means taking into consideration these different issues, like how to make people feel safer in order to be able to create content, and how people can avoid privacy concerns when they're accessing certain kinds of content.  Because at the end of the day, we care about access to knowledge, and access to free knowledge, and as people start going online or start using the Internet in general, in order to have real access to knowledge, we think that they need to feel safe, they need to feel that their privacy is being considered, and they also need to feel that they can be part of knowledge creation.  Sometimes there is the perception of knowledge being created from above and not something that one can create oneself to share with the world, and that's definitely one of the big tenets that we in the Wikimedia movement try to foster and push with all of our communities.

A little bit of a plug to talk about our movement and the communities in Africa, and how this research and all of this information will be very relevant for them, and for us.  First of all, I should congratulate you on this wonderful research.  I think that it's a very innovative and new approach to start asking these kinds of questions, that we need to learn from people on the ground rather than making assumptions sitting at a desk in San Francisco, in order to be able to actually serve our community.  So, this is wonderful information.

To talk a little bit about the work that we have in Africa: we have a thriving and amazing community emerging there.  For instance, our annual celebration, our big annual conference called Wikimania, which changes venue every year, is going to take place next year in Cape Town, so all of our community there is very excited.  We have a lot of work being done in Nigeria, in Ghana, in South Africa, and emerging communities in Tanzania and other places that are starting to care more and more about knowledge creation.

So, because these communities are starting to foster this local content creation and getting involved in access to knowledge, I can share a little bit of what we in the Wikimedia Foundation do in order to protect those rights, for people to feel safe.  Our first and most important instrument is the privacy policy that we worked out at the Wikimedia Foundation.  We take privacy very seriously, in the sense that we want people to feel that anonymity or pseudonymity is okay when posting something or when reading something, and for this we have very minimal data collection.  We don't require people to sign up, and we don't need people to share their personal information if they want to create content or read any content.  As for the content itself, as long as it has proper research and proper citations and follows the rules of neutrality and verifiability, it doesn't matter who is posting it or who is reading it.

So, this privacy policy was created with a lot of the volunteers, who basically started stating what their privacy concerns were, what their privacy needs were, and how the Foundation should create its privacy policy based on that, and not a privacy policy built from a unilateral perspective by the Foundation in San Francisco.  So, that's one of the big things that we've done there to protect privacy and data protection.

At the same time, in order for people to feel safe, we try to foster different programs that aim to address issues of harassment, and for this we have several training programs around anti-harassment.  We try to have codes of conduct and friendly space policies both online and offline at different kinds of events.  But still, all of these are issues that emerging communities need more awareness of, right?  All these policies are clearly informed by people who have been online and on Wikimedia for over ten or 15 years, and some people are not necessarily aware of the issues that could exist around privacy or safety when being on Wikipedia.  That's why we try not only to care from the beginning that they are in the safest environment possible, but also to educate and raise awareness, and for this we do a lot of work with our community.

Finally, just to leave a final teaser and allow room for more time, we also do a lot of partnership work.  What my team, partnerships in the Global Reach team, does in the Foundation is to find partnerships in order to foster the inclusion of those people who are being left behind for different reasons, right.  It could be access barriers, or it could be cultural barriers, and for this we look into different programs that address, for example, the gender gap and the creation of content for women and by women.  We support different efforts and partnerships, not only those run by international organizations, including the ITU with EQUALS or the Web Foundation, but also initiatives in the communities, like a recent contest that I love sharing.  It's called (Not translated), or "the women that you have never met."  It is an idea that was started in Latin America by our community to encourage people to create content about relevant women in their countries who were not already in Wikipedia, to also address the gender gap, not only of people participating, but of content available about women and relevant to women.

I'll leave it there, and thanks again for this invitation.

>> ENRICO CALANDRO:  Thank you very much, Jorge, for this very interesting intervention on how Wikimedia is trying to create awareness on one hand and a very inclusive space on the other hand.  We have also presented some research results on gender differences in terms of digital rights, and our sister organization in Asia has conducted similar research in southern Asian countries.  We have with us Yatanar from Myanmar, who has conducted research especially on gender differences, and Yatanar would like to share some findings from a Myanmar perspective on the gender gaps and perceptions of these rights. 

>> Yatanar Htun:  So, I have two points here.  But before that, I want to introduce myself.  I'm Yatanar from a local organization called the Myanmar ICT Development Organization; together with LIRNEasia we conduct research around digital rights.

So, based on two nationally representative surveys of ICT use, where we surveyed about 12,000 people in Myanmar in 2015 and '16, the gender divide in mobile ownership is significant in Myanmar and has been around 28% in both years.  Sorry, I don't have a slide here.  So, yeah.

So, 28% in both years.  So, that means that nothing changed in the gender divide within one year.

And, it's surprising, because Myanmar is a society that gives a strong place to women, unlike other countries in the region in that aspect.  And, our qualitative work from 2016 shows that women are indeed the final decision makers in their family, so everyone in the family gives them their money or earnings at the end of the month, and she has to allocate the budget.  So, she will indeed decide how much to spend on the phone and when.  But, because she doesn't have the technical knowledge or network to get this information, she has to rely on men, such as friends or her son or husband, to go to the shop and buy the phone.  And then the phone travels with the man, or with the son or daughter of the family who is working or studying outside of the home.  So, the woman, who mostly works at home, doesn't have access to it.  And so, even in a society like Myanmar, where women are more empowered compared to India or Bangladesh or Pakistan in the region, women are still at a disadvantage in having access.

Then, we also talked to (?) people who are current users of the Internet.  We did some focus groups and interviews with them.  We interviewed men, women, and transgender people; these people are included in our sample.  And, what we saw is that even when online, women face different types of problems, and therefore have to behave differently from men because of those problems.  For example, women were harassed online; they would receive comments from strangers like, can you lift your skirt, take a picture and send it to me.  And, some women repeatedly talked about how images of women are photoshopped.  So, as often happens, photos are photoshopped into nudes, editing their bodies, and then posted online.  So, those people we interviewed said they don't post their photos online, or sometimes post only a photo of their face, assuming that these are (?) to photoshop.  So, self-censorship is very common in our Country.  Men never mentioned this as a problem, I mean, according to our research.  And, some of the women we interviewed said they often use a male identity and adopt a (?) persona, and they check male in the gender box if gender is asked, especially on social media and Facebook.  So, they think this is the way to participate online and access information while not being harassed based on gender.  And, the same as the previous point, men never mentioned this as something they do.  And, women often post a photo of their husband and kids as their profile picture, even when they identify as female, because this is a way of signaling to the world: I'm a woman, but look, I have a family, I'm married and have a child.  They see it as a way to get harassed less.

And, sometimes women don't even bother to open an account; they just use their husband's account to access Facebook, because they feel it is safer.  So, these are some of the points we found from our research.  I mean, men, of course, also had problems, but those problems were mostly related to their phones or to their accounts; none of the gender-specific problems were experienced by those men, I mean, according to our research.

>> ENRICO CALANDRO:  Thank you very much, Yatanar, for these very enlightening findings, because they really show that, based on these kinds of differences in risks depending on whether you are a man, a woman, or a transgender person, there might also be a real need to develop different tools, strategies, or advocacy campaigns to meet the needs of different groups of people.

My last question, last but not least, is for Ursula.  Wakabi briefly mentioned that, you know, one of the relevant issues in ICT policy, especially in Developing Countries in Africa, has historically been broadband access, and the ITU has done considerable work in that respect.  In particular, I would like to mention the (?) Measuring the Information Society report and its very important contribution to trying to understand levels of access and use of ICT across the population, both from a demand-side and a supply-side perspective, to develop informed points of policy intervention based on this evidence.  But, as we see more and more people move online, there are new emerging and relevant issues, for instance, on cybersecurity, privacy, freedom of expression, and so on and so forth.

So, what do you think, would it be relevant to include some of these new digital rights, let's call them that, as indicators in the work that the ITU has done on measuring the Information Society?  Would that be relevant or not?  Is this actually terrain that is not covered by the ITU, and maybe other organizations might do it better?  What do you think?  What is your view on that? 

>> Ursula Wynhoven:  First of all, thank you so much for inviting us.  We are really pleased to participate, and congratulations on the research that was done.

One of the challenges in all of these areas is a lack of data, including sex-disaggregated data, so that, as you mention, there can be more targeted policy interventions, et cetera.

For those that don't know the ITU, we're the International Telecommunication Union, a specialized agency that has been around since 1865, when the "T" stood for telegraph, so we've been around a long time.  Specifically, what we do is really focus on trying to bring online the half of the world that's not yet connected.  And, as we all know, the Sustainable Development Goals set out the different goals and targets for 2030; one, however, which is set for 2020, just three years from now, is universal connectivity.  So, we're very focused on that, and in particular on looking at issues of digital divides, including the gender digital divide, and also trying to unpack what the rationales behind these divides are, which hold people back from accessing ICTs and all the benefits they can bring across the full array of the Sustainable Development Goals.

So, for instance, some of the issues that have been found behind the digital divides include things like cost and relevance of content, cultural norms, location, literacy levels, and, importantly for the work you're doing, also safety and security, time and even competing priorities, lack of electricity in a lot of cases, and lack of self-confidence to use technology and digital resources, and of course, concerns about safety and security can also impact that.

And, these kinds of issues can disproportionately affect women and girls, and so remain important determinants for benefiting from the full array of opportunities that the Internet provides.  Some of these concerns, as other speakers have highlighted, include issues like harassment online, and even violence or harassment that women may experience when accessing or purchasing a phone: if they have to buy a phone from a man, they may be asked for passwords, et cetera, and there are reported instances of women having negative experiences in those kinds of contexts.

So, there is definitely exciting work also going on, for instance, providers specifically working with women franchisees and promoters, so women can then engage with other women and reduce these concerns, which can bring a lot more women online.  A key statistic around this is that at the moment there are 250 million fewer women online than men, and in a lot of cases, even when they are online, they're using it less.  So, there are definitely a lot of issues around this.  So, just to underscore again, having more disaggregated data, having more data, is incredibly important to be able to have these more targeted approaches, and is really useful for all societal actors concerned.

I would also just mention that, as you know, the UN system, of course, consists of many different entities with complementary mandates.  So, some of the themes that you talked about and flagged, such as freedom of expression and privacy, et cetera, are of course also worked on in the context of the Human Rights Council and the Office of the High Commissioner for Human Rights, and by the Special Rapporteurs looking at these issues and at the questions these issues raise in the digital world.

And Governments themselves, in a resolution that has just recently passed in New York called ICTs for Development, underscore that people must have the same rights online that they have offline.  So, sometimes what we're seeing is kind of a paradox: ICTs, of course, bring so many opportunities for the realization and enjoyment of rights and the improvement of livelihoods, et cetera.  In some areas, such as gender equality, challenges that women face in the analog world can be accentuated online, but at the same time, ICTs also offer the opportunity to actually help address some of the gender equality challenges that women face in the analog world, including through access to education, livelihoods, financial services, and building new communities, which can make these a lot easier to access than in the analog world.

So, there are those different dimensions, and certainly, as some of the other speakers have mentioned, as we see new technologies becoming available, like AI, the Internet of Things, Big Data, et cetera, some of these opportunities and challenges can also be accentuated.  So, one of the things that happens in the ITU context is a lot of standards development in a multistakeholder context, where Governments and others come together to discuss key issues, like cybersecurity and the technologies that can enhance it, and similarly on the privacy piece, too. 

So, thanks again for inviting us.

>> ENRICO CALANDRO:  Thank you.  Thank you very much, Ursula, and thank you also for highlighting the importance of good quality and relevant data for targeted approaches, and, actually, the value of demand-side data: it allows us to disaggregate these kinds of indicators across different demographics, gender, level of education, level of income.  It wouldn't be possible to do this in any other way.

I would like to open up the microphone to the floor, or maybe to somebody remotely.  Is there any question or any comment on what we discussed this morning?  Any contribution?  Yeah, please briefly introduce yourself as well.  Thank you. 

>> Audience:  I'm Chu, from Fiji, and I represent the small island developing states in the Pacific.  The research is really good; I just wondered, since your focus was on vulnerable people, did you focus a lot on persons with disabilities, one, and on communities like the agriculture sector and the health sector?  Did they participate in this research?

And the other -- can I ask another one?  In terms of best practices and good practices, I think I'm a bit short on good practices and best practices: have you done any outreach or engagement plan on data protection policies?  Like, you mentioned something about making end users understand what it really means, but have you got any best practices on that?

>> ENRICO CALANDRO:  I would like to take maybe another one or two questions, and then we will ask our panelists to answer.


>> Audience:  This research is very, very good, and we can see, like, differences between men and women, but in terms of vulnerable people, in your research, have you included LGBT groups?  Because in gender, we have not only men and women, but other groups as well.  And, it is also important to include a human rights perspective.  So, we would like to learn whether there are any findings on this in the research.

Thank you very much.

>> ENRICO CALANDRO:  Is there one last question?  Okay. 

Maybe, Chenai, would you like to --

>> Chenai:  Okay.  If I miss something, please do it (?) on to me. 

So, thank you very much for the questions with regards to who we were interviewing.  I think what I've just presented is high level, where we didn't actually do a cross-cut to see whether we interviewed people with disabilities, but I think, Enrico, did we have a question on people with disabilities in the survey?

>> ENRICO CALANDRO:  Yeah.  I don't think we included the --


>> Chenai:  So, one thing we are aware of with our survey is that there are certain communities that we actually do not get to, and that is something -- in particular the LGBTI community.  We worked on gender research, and one thing that we do point out is that surveys can only go as far as they can go.  There are certain communities for which you actually need to have specifically designed research that goes into those communities and asks the targeted questions.  So, one point of the surveys is just to get that nationally representative data.  But, when it comes to the LGBTI community and specific gender issues, moving beyond what we like to call disaggregated data, which is male and female, we haven't got into the real juicy gender issues.  What we do then is take the research a step further by having qualitative interviews, which is what I really work on myself.  And, one area where we actually want to take this research further is to look at women within refugee communities or refugee camps, because what we're trying to argue is that the data that we have at this one level represents just a particular subset of people that we can reach.  But, in certain African communities, it is difficult to go in and ask, so, what do you identify as, where that community does not want to talk about people having different sexualities at all.  Having that one question, do you want to talk about sexual orientation, we didn't probe further to ask whether they would then identify as not being the accepted norm in their community.

And, then, when it comes to specific agricultural communities, we do have a subsection of questions that tries to ask people what it is that they're doing.  So, we capture your daily activity, and then, as we take the research further, we can actually do a cross-analysis to say: people who conduct this particular activity, are they then concerned with this issue of privacy?  And, we also have a youth study as well, which I didn't mention, but that's another thing we're trying to understand: for young people who are driving Internet use on the continent, what are the issues that they face online?
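The cross-analysis Chenai describes is, in essence, a cross-tabulation of survey responses: for each daily activity, compute the share of respondents who report a privacy concern.  A minimal sketch follows; the field names and example records are purely illustrative assumptions, not the actual After Access survey schema.

```python
from collections import defaultdict

# Hypothetical survey records; the real survey data and field
# names would differ (these are illustrative assumptions).
records = [
    {"activity": "agriculture", "privacy_concern": True},
    {"activity": "agriculture", "privacy_concern": False},
    {"activity": "agriculture", "privacy_concern": True},
    {"activity": "trading", "privacy_concern": False},
    {"activity": "trading", "privacy_concern": False},
]

def privacy_concern_by_activity(rows):
    """Cross-tabulate: for each daily activity, the share of
    respondents who report a privacy concern."""
    totals = defaultdict(int)      # respondents per activity
    concerned = defaultdict(int)   # of those, how many report a concern
    for r in rows:
        totals[r["activity"]] += 1
        if r["privacy_concern"]:
            concerned[r["activity"]] += 1
    return {a: concerned[a] / totals[a] for a in totals}

shares = privacy_concern_by_activity(records)
print(shares)
```

The same pattern extends to any demographic cut mentioned in the discussion (gender, education, income): group by the attribute, then compute the proportion of interest within each group.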

>> ENRICO CALANDRO:  Okay.  Is there anybody who would like to answer the question on advocacy? 

>> Maybe on your question about best practices.  There is a project called the Ranking Digital Rights project.  It's based in the U.S., and it ranks 22 of the world's largest ICT companies, and if you look at their methodology, it is a very comprehensive compilation of best practices in terms of what companies' practices and policies should look like to enable users' digital rights.  So, it goes into, you know, essentially what a company needs to disclose to enable a user, in privacy policies and in terms of service.  So, we've been contributing to that, and I think it is a really good place to start when you're looking at best practices.


>> Wakabi Wairagala:  On the issue of privacy and data protection, last month the organization I work with helped draft a report on best practices, looking at the model law on data protection in Southern Africa; there is another one in the West African economic bloc, and then also in the East African region there are some guidelines on data protection.  So, what we helped do is draft a paper that compares the three, but also looks at international best practice where there is nothing like that, and then recommends what we feel is not right with some of these regional instruments, but also with some of the national data privacy laws.  That paper was presented at the African Union Ministers of ICT meeting last month.  Eventually the idea is that we'll have that finalized; we are the second actor.  There is a principal actor, and then the bridesmaid (Laughter).  So, to finalize it, have the African Union accept and adopt it, and then we can use that as an advocacy tool within the different African countries.

>> ENRICO CALANDRO:  Thank you.  Thank you, Wakabi.

    Do we have any other question?  We've actually got only one minute. 

Would you like final one? 

>> Audience:  There is a lot of work done in (?) and I was just wondering (Audio pause)

(Captioning will conclude in a couple minutes)

>> ENRICO CALANDRO:  Thank you.

Our sister organizations in Asia conduct research in certain Asian countries.  I do not have the list with me, but I can easily let you know whether the countries that you're interested in are included in this kind of research.

And, is there any final remark from our panelists? 

>> Maybe on the question around the Pacific and the Caribbean.  We work through a network of partners, and we don't have partners there at the moment, but we have been monitoring what has been happening in Trinidad and Jamaica around data protection laws that are being drafted.  So, if you come across a text, you know, please get in touch.  We're going to publish early next year a framework that allows non-practitioners, non-lawyers, to assess data protection laws and see what to look out for.  So, we are going to make this tool available, but we're always happy to provide input, so yeah. 

>> Ursula Wynhoven:  I want to mention that we are just about to embark on a project with a partner where we're going to look at broadband policies and digital strategies, and look for all the different vulnerable groups that are mentioned in the SDGs, and see what practices and policies Governments have around the world, to be able to pull them together so that Governments can also be inspired by the different practices that they see already in place around the world.  And, then, finally, for those who are interested in the gender digital divide, which is hopefully everybody (Laughter), I wanted to flag that there is an online consultation going on right now by the ITU's Council Working Group.  They're accepting online submissions until 23 December 2017, and you can even submit PDFs or links to your existing work.  I want to flag it because I think what is really important is that even just the sheer number of submissions that are made helps convey to Governments the importance of these kinds of issues.  So, I just invite people to consider using this channel as well, as an opportunity to flag the different issues that they're interested in working on and that they're already doing work on.

Thank you.

>> ENRICO CALANDRO:  Okay.  Thank you.

(Captioning concluded)