IGF 2022 Day 0 Event #75 Future of a Female Web

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> I work with the Organization for Security and Co-operation in Europe.  We also work a lot on enabling safe online freedom of expression for all.  I will be part of the Best Practices Forum on Friday, and I look forward to hearing things here to take to the Best Practices Forum. 

>> Greetings, I'm a researcher and media practitioner, and I'm interested in gender issues.  I'm here to attend this conference and hope to benefit from it.  Thank you very much. 

>> Hey, everyone.  My name is Mariana, I'm from Brazil, and I'm here through the Brazilian youth delegation program.  Thank you. 

>> Hi, my name is Giovanna, also from Brazil and attending IGF with the Brazil youth program.  I'm happy to be here.  Thank you. 

>> Hi, everyone.  My name is Anna Carolina.  I'm also from Brazil, also a member of the youth Delegation from the CGI, I'm very excited to be here and listen to all the considerations that you have about the theme.  Thanks so much. 

>> Hi, I'm from Rwanda, and I'm also part of the IGF Committee and the Rwanda IGF Committee. 

>> (Off mic)

>> Hello, everyone.  I'm here because I think the title Future of a Female Web caught my attention.

I'm here to hear what you say.  Thank you. 

>> Hi, good morning. (feedback)

Thank you. 

>> Good morning.  My name is Sandra Aceng from Uganda, I work with WOUGNET, and the title of this session speaks to my work.

>> Good morning, I'm Tina Power, I am a human rights attorney in South Africa.  I work on litigation on questions of online harm and online gender‑based violence.  It is lovely to be here. 

>> Good morning, everyone.  My name is Patiba, I am happy to be here.  My organization looks at spirituality and values, especially on the web front, and also women's development and leadership.  Thank you. 

>> Good morning, everyone.  My name is Moira, I'm from Morocco. 

(Feedback)

     >> (Feedback)

For the young people.

To accelerate impact.  Thank you. 

>> Good morning everyone, I'm Moira Whelan.  I lead the democracy and technology work at NDI, and we have a presence in more than 50 countries.  A significant part of our work is addressing online violence in politics.  And we serve on the Advisory Committee for the global partnership to prevent online violence against women. 

>> Hi, everyone.  My name is Malavika Rajkumar.  I work with IT For Change in India, and I am a lawyer.  I work specifically in the areas of platform governance and gender‑based violence.  We have been working with the wonderful partners here for a very long time.  It is safe to say that the problems haven't really changed, and we're still trying to find unique solutions to them.  So really happy to be here and discuss.  Thank you. 

>> Hi, good morning to everyone.  My name is Lucía León and I'm from Peru.  I work for a Civil Society organization named Hiperderecho.  We have a special interest in the intersection between gender and technology.  Thank you. 

>> Hello, my name is Eduardo Carillo, I'm from Paraguay with an organization called TEDIC.  We work on the traditional digital rights issues with a gender lens, particularly online gender‑based violence, and we're interested in violence in political contexts, both for women politicians and women journalists.  Yeah, that would be it. 

>> I am with a pan‑African digital rights organization.  Originally from Zambia.  Happy to be here and meet you all.

>> Hi, I'm Sandra Aceng, I work with WOUGNET, our work is around women's rights online and strongly around the intersection of gender and ICT.  I'm happy to be here.  Thank you. 

(Feedback)

>> MODERATOR: We are live streaming to YouTube, but we don't yet have our virtual participants able to join us.  I think we can begin if that sounds all right to everyone.  Slides ‑‑ do you want me to say "next slide" or is there a way I can advance it?  Great.  When a few more people are here and we're in our breakout rooms, everyone can join each other.

Tim Berners-Lee decided to create the web because the Internet was only for the military and a few elite academics to connect with each other.  He thought: what would it look like if the whole world could actually speak with each other?  How could we convene, how could we share creativity, and how could we make this technology available to everyone?  Next slide.

So at the Web Foundation, our mission for the past 13 years has been working towards that.  What does the web look like where it is for everyone?  As almost everyone knows who is here today, that's a lot of work.  (Chuckling).

It is a lot of working toward digital equality where everyone has the same rights online.  It is a wonderful vision but requires a lot of stewardship and a lot of delivery.  Next slide.

Anyone that has come in, please feel free to fill in around the room.  Grab candy on the way in.  We're an early‑morning session and want to make sure you are well taken care of.  Thank you. 

One of the big initiatives is to develop a global governance vision for what the web could be and should be.  Everything the Web Foundation does works in a multistakeholder way.  We brought together

(Feedback)

Visions are great, they really give us a framework of what it is we're trying to get to.  But the question is, how do you actually build it?  How do you drive it and get there?  Next slide.

We developed something called a technology policy design lab.  It similarly takes a multistakeholder approach: we take a specific policy problem, try to build it out, and work in a consensus way, so that whatever comes out in the end is actionable, measurable, and something that we can hold organizations accountable for. 

So the first one that we chose was online gender‑based violence.  And why?  Many of you who work on this issue know this quite well, but for many people the experience of being on the web right now is of a very hostile place.  The Economist Intelligence Unit did a global survey in 2019 in which 85% of respondents who identified as women said they had either directly experienced, or witnessed someone they know experience, violence online.  That is a very different experience from being a man online. 

So what we're working towards changing is this space, our global virtual presence, where some people expect to be harassed and expect it to be hostile.  If you are a female politician, it is rampant.  In Chile, 75% of Twitter messages sent to female politicians are insults or violent hate speech. 

Why would you go online to share your creativity if you know it is a place where you will be attacked and harassed?  As we know, the web is increasingly a place where you have to conduct business.  You have civic life, you have healthcare, you have to run your campaigns; all of this happens online.  Next slide. 

Why does this matter?  Some of you know Hera.  They're doing fabulous work.  We're losing women and queer people from the web because we didn't create an environment to keep them safe.  We don't want a web that is comfortable only for those who deal in hostility and violence; we want one that is for everyone.  We had people from around the world join our lab. 

Next slide.  Thank you.  A lot of the workshops looked like this, if that seems familiar: many virtual participants.  Next slide. 

Sometimes they looked like this, which is much more fun.  You will recognize some of the faces of the people in the room whom you will hear from shortly.  Next slide.

After working for about a year, we got four of the large tech organizations to make specific commitments to change.  This is Twitter, Google, TikTok and Meta.  They said they would change how an individual could curate their experience online and how to report violence and harassment.  One year on, we worked with ‑‑ next slide.

We worked with a group to say: all right, let's see what actually happened.  You made the commitments.  Show us what has changed.  And we worked with the organizations over a year.  Each of these is an image from the different organizations showing how they changed the way to curate and the way to report experiences of violence and harassment.  So they asked us: all right, Web Foundation and partners, tell the world all the great work we have done.  Look how much we have changed. 

We said, well, this looks great.  But we don't actually know if anything has changed, because you won't tell us any of your data.  So we don't know whether these changes have had a measurable impact on the number of instances of violence online or on their severity. 

So what can we do about that?  Next slide. 

So we set a commitment to try to figure out how to get that data.  Then we send it to them and say: if it is wrong, let us know.  So we're trying to get better data about violence online, about what the companies are doing and how they're changing, and about how to support Civil Society globally so they can be their own power base to push back against the tech companies and Governments that are responsible for carrying this work forward.  Next slide.

So similarly, we are working with the women's rights online network and some affiliates, some of whom are in the room, on stories, case studies, data reports, tech company initiatives, and Government and Civil Society initiatives.  If you go to the map, you will see an initial landscaping of policies, stories and initiatives that are happening around the world.  Next slide. 

This is just so that you understand we have actually done the work: we have collated 150 other reports and analyzed the data.  Next slide. 

What we have are a few key early insights; this is a work in progress, and we hope this session today will help improve on it and give a bit of feedback.  So many are trying to lobby for change to inform policies, but how much has that actually been able to reach the result they're looking for?  Everybody said we need better data, and we need data that speaks to our specific Governments and our specific locations. 

There is no standardized, comparable data on OGBV around the world.  There are organizations that do global assessments, but how do you compare them to others that are doing global assessments?  A lot of this is fragmented work, and that serves to continue and perpetuate the difficulties in the space.  It is vital to take this work forward.  What is not fragmented is the experience: how people experience this is similar around the world. 

If you are in the public eye you get attacked more.  If you are a woman of Color in the public eye, you are more likely than anybody else to get attacked.  That is universal across the globe.  It means there aren't completely unique ways of tackling this.  We're all in this together.  Next slide.

Again, a current map, feel free to play around.  Next slide. 

I'm going to skim through ‑‑ yeah.  Let's see.  All right.  If you look at the legislation going through Governments, a lot of it focuses on freedom of speech and criminalization of OGBV.  The problem is that we have a lot of legislation that focuses on gender but doesn't address gender online, or that talks about the online space without talking about gender.  So we have to figure out how, if not to write new laws, at least to educate on why these are interrelated and have impacts more broadly.  Tech companies will say, as they have with us, that they're doing a lot of work.  But it is fairly Ad Hoc and very fragmented.  Civil Society just does a ton of this work.  They really are at the forefront.  They have a lot more information about this space than tech companies or Governments do.  When there are instances of violence or harassment, this is where people go.  They don't go to the platform and say this is a problem, because they know they won't get a response.

Civil Society is carrying the work of: how do I protect myself online, how do I prevent it, what do I do when it happens?  How do I respond?  This is a difficult and traumatic experience, help me navigate it.  They're carrying a lot of work that they are not supported for and did not expect to lead.  Next slide.

Okay.  So ... you have heard from me.  Let's hear from the people doing this work on the front lines.  So I will introduce a great group.  They're lined up right here.  Let's see who has the first slide.  I believe it is Lucía.  Lucía León from Hiperderecho. 

>> LUCÍA LEÓN: Thank you.  Lucía León from Peru, from Hiperderecho, as I said before.  Could you please?  Thank you very much.

In the last few years at Hiperderecho we have started to do specific research on online gender‑based violence.  In Peru, we did the first quantitative and qualitative research on this topic and also decided we wanted to go farther.  So we accompanied some people, five people specifically, who had gone through this type of violence.  Doing this activity ‑‑ desk research and legal research, and going with people through their specific cases ‑‑ we found that most of the time, perpetrators are those who know the victim intimately. 

And we also found out that most of the time, these cases happen to young women, and often in spaces like the Universities.  And the Universities are not ready yet to face this issue and to take the right measures to protect women and make sure women are safe in their environment.  So we have constructed this story, probably a very optimistic one, about what should happen when authorities give importance to protecting women from their abusers.

We have the story of Maria, a 25‑year‑old University student who grew up in a low‑income family in Peru.  She loves art and has ambitions to be a well-known and respected reporter.  She decided to go to University to pursue this dream.  In the first year of University, Maria met and started a relationship with Luis, a fellow student.  After Maria ended the relationship, Luis started to threaten and harass Maria, hiding behind fake social media accounts.  He started to blackmail her, threatening to share intimate material from the relationship if she refused to go back and rekindle the relationship.  Although this abuse took place in online spaces, the fear that Maria felt started to impact her daily activities and behaviors, including, of course, her activities at the University.  She became scared of him appearing in the same classes.  She began altering her movements to avoid bumping into him.

Eventually Maria decided to confide in a friend.  In this story, we also tried to present a united front in which, fortunately, Maria was not alone.  Within her faculty at University, there were many other women who were already calling on University authorities to address all forms of gender‑based violence which occur on campus.  The name of this front is the women's assembly, a space which brings together feminist activists and develops proposals for solutions for a safe space on campus, where women can enjoy study and University life.

Her story is evidence of the terrible impacts that gender‑based violence, both offline and online, can have on women, especially in academic environments.  At Maria's University, this cause was championed by a Professor who could influence decision‑makers, in this case authorities from the faculty.  With the help of the Professor and the efforts of fellow students, Maria's concerns were heard and taken seriously by University authorities.  The University authorities heard her complaint of gender‑based violence, and they also did not allow him to enroll in the same classes as Maria.  The assembly will call on the University to codify this into policy and offer support, including legal and psychological services.  This will help students who experience abuse, and those like Maria who have to make special arrangements to be protected.

The future looks bright.  The assembly continues to grow.  It shows that simple actions, like sanctioning perpetrators of online gender‑based violence, can have real positive impacts for their victims.  As I said, this is a very positive story, but what we think is important to highlight here is that it matters to form alliances, to have a united front, and to use every space we can to mobilize around these issues and set them on the agenda of that space.  It could be a public space or a very specific one, like, in this case, a University.  Thank you very much. 

>> MODERATOR: Beautiful.  Okay.  Next up, we get to hear from Sandra.  Take it away. 

>> SANDRA ACENG: Okay.  For those that have not been in the room, hi, Sandra Aceng is my name, I'm with WOUGNET.  I will talk about Safe Sister.  I know some of you might already be aware of what Safe Sister is.  It is a joint fellowship initiative by DefendDefenders and Internews.  The major goal was to address the growing spread of online gender‑based violence against women and girls.  This is done by empowering women to learn digital safety skills to protect themselves online and, of course, the people that they work with.  So this fellowship focuses on digital safety skills building, an ongoing mentorship program and, of course, hands‑on training.  So you might be wondering who the fellowship is for.  It focuses mainly on women human rights defenders, journalists, media workers, activists, and feminists doing this work.  And in general, I would say anyone that is doing work around human rights issues.  Of course, the major focus is in Africa.  They have done a lot of work in East Africa, Uganda being one of the countries.

So what is the fellowship about?  Mainly, the focus is on training these women and girls to be able to understand and respond to some of the issues, such as abuse, harassment and threats, that women face online, through digital security training.  So they are introduced to the challenges that women, girls, and people of other identities face as they do their work in daily life. 

These women have gone on to do several pieces of work on digital security, professionally.  I'm going to highlight some of the amazing women who have done work in the fellowship.  One is IDA ‑‑ sorry for not pronouncing the name well.  She founded the IDA gender and technology initiative, which is based in Uganda.  After the fellowship, she founded the organization and has been doing amazing work on digital security in Uganda.  And because of that, she has now gone on for further studies on digital security under a scholarship.  We also have one woman who is in the room, called Sandra Carr.  If you could wave to the room.  She was a fellow of Safe Sister.  She's doing amazing work in Uganda with the communities there on digital security. 

Of course, we also have one member of the team who, as a result of the Safe Sister fellowship, founded an initiative doing work around online violence against journalists, and who has done a couple of trainings on online safety for women in Uganda.  And we also have our neighbors, for example from Kenya: we have Cecilia, who has done a lot of training for female journalists in Kenya on digital security.  She also hosts the Digital Dada podcast, which is, of course, a platform for different feminists and activists to share the work they're doing on promoting online safety for women around the world and, of course, in Uganda, Kenya and different East African countries. 

Of course, we also have one from Tanzania.  She founded a foundation and has been doing a lot of work empowering women in Tanzania with information about protecting themselves online.  This is some of the amazing work that came from the Safe Sister fellowship to protect women's rights online and ensure that women can safely, openly and freely access and use the Internet.  Thank you. 

>> MODERATOR: Thank you, Sandra.  Next, over to Bulanda.

>> BULANDA NKHOWANI: Good morning, I'm Bulanda Nkhowani, I work for Paradigm Initiative, a pan‑African digital rights and advocacy organization.  Within that, we also engage in women's rights online work, where we have done research and capacity building on online gender‑based violence.  One of the things we also do is teach practical ICT skills to underserved youth; we call that program LIFE.  This is the focus of our case study this morning.

So again, bringing it to our case study.  It illustrates the importance of teaching women how to respond to different forms of online violence while at the same time working toward legislative change.  We have a 23‑year‑old lady who is a trainer at one of our LIFE centers, where we teach practical ICT skills, life skills and entrepreneurship skills.

During one of the trainings, she noticed one of her vibrant students had been missing classes.  Her name is Sophia, an 18‑year‑old entrepreneur and makeup artist.  A little bit about Sophia: she started her makeup business at the roadside and joined the LIFE legacy program to learn how to market her services online, enrolling in one of the digital marketing skills programs.  Also, just to mention, this was during her gap year, as she was trying to save University fees. 

So Sophia would take pictures of herself and post them online as a way to market her services.  Sophia is someone you would describe as having a slender physique, and all her life she experienced one form of bullying after another.  When she posted pictures online, she was bullied and trolled, sometimes described as skin and bones, being told: you need to put on weight, you need to eat more.  The more she posted her makeup services online, the more she got trolled.  The hateful comments stuck with her.

Interestingly, some of her fellow students from the LIFE program were part of the trolls.  In person, Sophia of course became extremely self‑conscious and insecure and began to wear baggy clothes to look a little more plump.  She considered leaving social media, but she relied on it to market her services beyond her community.  She was forced to endure, but at the same time she was not as vibrant online.  In the long run she began to use models to advertise the business, and having to pay for the services of models meant less money in her pocket.

The ICT trainer emerged as a mentor.  They tried to report to the police, but at the time there was no ICT law covering cyberbullying or trolling, so the complaint was considered frivolous.  What stands out is the lack of capacity on her part to know the protective measures and tools available to her within online platforms to stop all the trolling.

Lastly, what we really see from this scenario is the need to revise the LIFE training to include online violence prevention: how trainees can recognize it, protect themselves, or help victims. 

Secondly, she joined hands with other Civil Society organizations to raise awareness of online gender‑based violence and lobbied policymakers for legal frameworks for online gender‑based violence prevention, specifically the inclusion of online aspects in the existing anti‑gender‑based violence act.  The importance is teaching women how to respond to the different forms of online violence as we lobby for greater policy and legislative change, which we all know takes forever.  Thank you. 

Kat, over to you. 

>> MODERATOR: Thank you, Bulanda and thank you for all of the work and for sharing.  We will turn it over to Malavika who will speak. 

>> MALAVIKA RAJKUMAR: Hi, everyone, I'm Malavika.  I will speak about two important resources that we have been working on for a while.  One is that we partnered with Bumble to come up with digital safety resources for the country.  If you have followed the news in India, a lot of the dating applications are under fire for not doing enough verification checks and for actually perpetuating a lot of violence on the platform, which then later translates to online violence or violence offline.  So this continuum is what we were trying to address when we were creating those kinds of resources. 

So that is one of the initiatives that we have.  But the main study that I would like to share here with all of you is the study done by IT For Change known as Profitable Provocations, a study of abuse and misogynistic trolling on Twitter in India.  We studied the misogynistic speech on Twitter directed at 20 women, six in politics, and several others.

Even if you are not on the platform, you are still getting hate and being targeted.  The reason we did the study is that we wanted to understand the nature of online hate and the recurring patterns of abuse experienced by those in public political life, especially in India.  Twitter is a toxic space for a lot of women politicians.

Secondly, we wanted to propose a framework to contend with the hate that is amplified by the virality of the platform.  While we selected this group, we had actually gone through some 30,000 mentions.  And we had come up with annotation guidelines, which were quite difficult to develop.  For instance, we had 19 codes that we made, which were further condensed into seven categories.  Taking one example: if we saw a threat, we categorized it as intimidation or threats of direct violence; if we saw hate speech, we categorized it as a caste slur or hate speech.  The broad findings we got were on the nature of misogyny.  First is the pervasiveness of misogynistic speech and the targeting women face.  That is all women: regardless of who we took in our sample, regardless of how famous they were, what kind of political organization they're affiliated to, or whether they are even on the platform, they are being targeted.  Secondly, herd aggression.  We studied the profile of the trolls targeting women.  If there is a particular incident or a particular woman to be trolled, we see a group of trolls coming together, and this sort of psychological behavior on the platform is what actually causes a lot of trauma for a lot of women politicians.

Thirdly, lighthearted trolling in the form of memes, jokes, morphed pictures and deep fakes: taking previous, maybe sexual, political, or any sort of life history and making a joke out of it, until again it turns into herd aggression.  And fourthly, intersectional violence.  In a country like India there are around 22 official languages in the Constitution, but we have at least 400 in use simultaneously.  So that becomes a big problem, and causes a lot of problems on the platform.  We also have a lot of intersectionality, including gender identity, caste, and if you follow

In the third person.  So we have not

This is also prevalent on the platform.  There is an overarching subtext of the patriarchy, as we call it, of one of the most dominant castes, that is actually seen on the platform.  We see that women politicians are targeted for many reasons, including to be criminalized.

And it is seen on the platform.  And lastly, objectification of the politicians: as I mentioned, we have a prominent Congress leader who is actually objectified through a nickname, and her mother, even more prominent, is objectified too.  We noticed that the machine learning on the platform can't keep up.  If you ban certain words, you realize that because of so many different languages, you can't take out the hate: you can change one word by one letter and the machine won't detect it, and you can put it up.  Then secondly, the abuses ‑‑ there is abuse with no way of detection.  All of the trolls and populists come and attack women and don't give them freedom online.  Our responses include one of our core areas of research, which is platform accountability.  That includes engaging with the companies, as the Web Foundation has done quite a bit; we have also been doing that with our national teams.  Secondly, arguing for independent authorities to look at the issues, though we don't want to cross that fine line.  India is coming out with a lot of legislation on this, and over the past six months there has been a lot of progress.  But the complaint process doesn't help anyone; it is also anonymous.  On Twitter, you have a lot of issues, you file a complaint, and you don't know how it is being resolved.  What kind of complaints can you file?  These issues are linked to transparency, where we ask for better data in the reports, because right now gendered hate speech is just not measured enough; that is something that we are arguing for.  And I just want to end by saying that we actually shared our annotation guidelines, and they have gone into an open source machine learning tool for hate speech detection that is being developed, with open datasets to train the algorithm for gendered hate speech detection.  This is something we should collaborate towards.  It is also a question of keeping up with technology and with the sorts of issues that arise.  I would love to have an open dialogue with anyone who wants to speak afterwards.  Thank you.

>> MODERATOR: Thank you, Malavika.  You can have an open dialogue afterwards and also in your breakout session, which I am very excited about.  All right.  Two more great speakers before we get to hear from you all.  Eduardo Carillo from TEDIC.

>> EDUARDO CARILLO: I'm from an organization in Paraguay called TEDIC.  The case to share is an ongoing process we're leading in Paraguay that speaks to the story of Belén.  She's a person who has suffered systematic sexual harassment through digital means, particularly through WhatsApp, and this harassment was made through text and also images.  The caveat of it all is that she was harassed by a Professor of her University who is also a high‑ranking official in the Government, related to the justice system.  It is a complicated situation in Paraguay, where I am from, where there are intersections of power dynamics and also a real lack of regulation on online gender‑based violence.  So basically, even though Belén went to the justice system to seek redress and reparation for the situation, the justice system failed to give her this and a fair process. 

Up to the point that the case wasn't really investigated, because it was dismissed by the justice system, which said that what the Professor did was courtship. 

So there are different complicated situations reflected in this case that speak to the urgency of online gender‑based violence and of understanding gender‑based violence in all of its complexities.  The more complicated situation comes afterwards: after her case is dismissed for a third time, she's then the subject of two cases, one of them led by the Professor, actually asking for reparations because he claimed that his image was damaged in public ‑‑ because although there wasn't a trial, the case was made public through media outlets.  This was a situation we took notice of, and in cooperation with another organization in Argentina, we decided to present the case to the Inter‑American Commission on Human Rights.  We sent a letter under an Article mechanism that the inter‑American system has, to ask for responses from the state on what the Paraguayan state has been doing to address the situation of the victim, in this case, Belén.

Sadly, Belén is not in the country and has, let's say, requested refugee asylum outside Paraguay.  It is not that she is trying to avoid the processes open to her in the country, but she wants guarantees that the process will be fair and that no abuse of power will take place through the process.  The materials are available both in Spanish and English.  We have not only made public the situation of Belén and the whole process that is ongoing; it is also used as an opportunity to educate about online gender‑based violence as it is experienced by a dissident voice, a person in a vulnerable community, and a woman.  Everything is there to download, and we want to share the information with you, because it gives a real overview of a problem that needs to be addressed and hopefully eradicated at some point.  Thank you so much. 

>> MODERATOR: Thank you, Eduardo.  If anybody is curious about more of this and really wants to see some of the fantastic campaign work ‑‑

>> Okay, Moira. 

>> MOIRA WHELAN: At NDI, this issue came to us, rather than us coming to the issue.  We had drastic, really shocking incidents happening around the world: for instance, in Iraq, we spent months training 26 women and then saw that in the last election none of those women decided to run for office, citing online violence against them as the primary reason they wouldn't participate and become a candidate.  We come at this issue differently, from the standpoint that we look at it as a threat to democracy, rather than simply a threat to women online.  It actually undermines the system of democracy.  We view it as a real intersectional point, or a real point from which to focus on broader intersectional issues, where the solutions we're looking at lie not just in addressing this issue for women but for marginalized groups, ethnic minorities and LGBTQIA+ groups, because we talk about bringing together Government, tech companies and Civil Society to address this.  I think it is important to note where we come at this issue from, because we welcome other organizations, from the environmental sector or the economic sector, to take this issue on.  We're finding from this research that those of us at the table who approach this from a feminist and digital perspective don't see it reflected in the broader Civil Society network. 

We have been coming at this for a long time, developing hate speech lexicons and working with others to address the gendered violence coming at women that isn't exactly explicit language or sexual imagery, but rather gender‑based attacks on their credibility as candidates, and on political parties. 

So our former Chair, Madeleine Albright, identified this as an opportunity for change.  We identified it as game changing for democracy in the world and also a solvable problem.  And from there, we decided to stop admiring the problem, cataloging it and looking at it, and went back to the many in this room that we had worked with before to identify interventions that would work.  What we spent a year producing was a list of 24 interventions focused, much like today, on the areas of Civil Society, technology, and Government and governance.  We also looked at our sweet spot, political parties, structures and elections, for possible solutions.  You can find those at NDI.org.  They're the interventions to end online violence against women in politics.  We'll spend the next year operationalizing those.  Several of my colleagues here at IGF and around the table joined us last week to bring these with us to Silicon Valley and talk to tech companies about the structures in place.  We will go forward identifying countries to work in, to look at things like political codes of conduct for elections and other issues. 

So I urge you to visit the website and find that information.  But what I really wanted to leave everyone with is this understanding that the issue of online violence against women in politics specifically really does undermine democracy.  The importance of addressing this broadly, as our colleagues at the Web Foundation have done, is wrapping it in a broader agenda.  Because by the time a woman gets to the point of running for office, she has experienced violence online and it is causing her to change her behavior.  (Audio skipping)

Intersectional points come when women are younger.  They come with legal frameworks that protect spaces for full participation and allow people to flourish and lead their communities.  So I will leave it there.  I do urge you all to visit NDI and catch us in the breakout session.

>> MODERATOR: Thank you so much.  So now we are getting to the more fun part, although that was a fascinating round of presentations.  Thank you to everyone for all the work you have done with the Web Foundation and the time you have shared with us; it has been truly a pleasure to work with you.  So let's dive in more deeply.  This is a day zero event, which means it will be more collaborative.

We have time to do so.  You will see there are different sections of the room.

We'll focus on those involved with Civil Society, here is Government and here is the tech sector.

There are speakers who will join or facilitate each discussion.  If you are here to learn, they can share some of their work in more depth.  We have a few components of the breakout group that we really wanted to share with you all, to give a better sense of the complexities of the space and to learn from you about how to be more effective. 

So you will see there are a few Post‑its; it will get very colorful in this room fairly soon.  We want to hear: what enables change?  What are the big blockers in this area?  What are specific regional contexts to be aware of?  Who else should be involved?  Oftentimes we work in these spaces and we don't always have everybody in the room who is needed to make the change.  We would like to know from you all who you think needs to be part of this work. 

So I think we have got our facilitators dividing themselves up among the different sections.  We will have about 40 minutes for this breakout. 

You will also see a sign‑up sheet being passed around the room.  You will get it in the breakout too; if you are interested in this work, it is helpful to share your contact information there and what topics interest you, so we can connect with you and connect you to other individuals and organizations with shared interests.  Okay.  Does that all sound all right? 

Great!  (Chuckling).  I will take that as consensus.  A show of hands: anybody particularly interested in the tech sector?  We have a handful.  Government?  Great.  Civil Society?  Okay. 

I know we have a packed room; make your way over, and if you need to, break into a few smaller breakout sessions.  Group on Zoom, are you able to unmute yourselves? 

(Double audio feeds)

(Silence)

(Breakout sessions)

(Incorrect audio being received)

>> MODERATOR: It was linked to their audio, which is why you couldn't hear what was happening in the room.  There will be a recording, so you will be able to eventually hear what people shared.  I apologize that we didn't have the technical staff up to speed.  Part of this is that we are day zero, one of the first sessions of the conference.  Part of this is that we're talking about feminist issues and people are trying to actively shut that conversation down.  So there is a really good group in the room.  It is active and lively.  I'm grateful to everybody that has been able to join virtually.  Thank you. 

(Breakout chatter)

>> Offices of the big tech companies ‑‑ we understand there are differences in the way you approach those offices, and of course, you have a lot of advantages when going to headquarters, for instance, versus other regional offices.  Those are some of the reflections that we had.  So ... if there is anyone else that wants to jump in.  And then we go straight to Malavika. 

>> I think one of the big questions, like he posed, is who has power when it comes to effectuating change in big tech ‑‑ it goes back to the U.S. most of the time.  That is the observation we were able to put forth.  When we look at the kind of stakeholders that we must engage with when it comes to getting to big tech ‑‑ it is one thing to get to big tech and another thing to follow up and engage with big tech ‑‑ there are many groups that we felt would play a big role in that.  First being Bar Associations, which include groups of lawyers or legal professionals who come together to assist and work with Civil Society and big tech

(Overlapping audio from other sessions)

It is difficult, we all acknowledge, to engage with the judiciary on certain concepts that are beyond the law.  So this includes changes from our end, including resources or trainings, actually engaging with the judiciary, and effectuating change from that end to hold social media accountable in litigation, et cetera.

Then we have the role of media, both traditional and digital.  It is important to highlight the work that CSOs are trying to achieve, or the inequalities or lack of engagement that big tech is actually putting out for the public, and to hold them accountable through that.  So media campaigns are very important, and the media plays a big role in the same. 

And we also acknowledge that Government entities are very important.  They're named differently in various jurisdictions ‑‑ it could be a Ministry of I.T., broadcasting, or a telecom ministry ‑‑ and they could be important for actually getting to big tech and having conversations.  It is important to have the sort of multistakeholder approaches and collaborations that exist where big tech may be part of it, and also to speak to advertisers and those that play a role in software development and AI regulation in those bodies, and to discuss how a standard‑setting body is useful when it comes to big tech and engaging with the sorts of approaches big tech should take.  An example would be something like the workings of GNI. 

I think I covered everything but happy to let someone jump in. 

(Overlapping discussions).

>> There is one thing we mentioned that affects the whole conversation.  That is the contradiction between those who are very committed to freedom of expression. 

(Overlapping audio sources)

>> So when we are talking about violence against women and we're talking about making a safer Internet, that is that fundamental objection that some people have. 

(Silence)

>> Sorry about that, there is too much audio, and I accidentally muted the room that we were all listening to.  Yes, that was my fault.  I have asked them to unmute again. 

Do we know who Shannia Lewis is?  It looks like it is connecting to audio.  I was hearing two streams. 

>> We are having difficulty: the person doing the captions is hearing audio from this room and another room, and we are trying to take notes from what you are saying.  My hope is that there is a recording of the session.  So ...

No. 

(Overlapping audio from multiple sessions)

>> Two more points were solidarity and engaging the formal sector.  Related to the second question, on the primary blockers ‑‑ they're largely about lack of funding.  Most of the time the funding for Civil Society is tied to human rights ... it is a circular problem.  And I would also say the lack of access to justice, because women don't have the information or don't file claims.  Those would be the primary blockers that we discussed.  And regional context. 

It is also related to digital literacy, right?  There is a digital gender gap.  And related to what has been mentioned, we have issues in the U.S., where there is more emphasis on free speech than human rights.  About who needs to be included: there are some organizations that would be great to include.  We have, for example, the case of the Philippines, where there is a party ‑‑ I understood the only party with women's representation, named Gabriela.  And also, keep in mind that we need a conversation with feminist movements and groups, and to work in a very sustainable way with them.  Thank you. 

(Overlapping audio)

>> MODERATOR: Thank you to everyone that joined and those that started early and those that stayed, really appreciate you.

I would kick it over to the team online who joined and wanted to run their own breakout session.  As you heard, there was difficulty with the Zoom starting on time, and difficulty with the audio feed.  Another major difficulty online was that there was no regulation of who was able to get into the room, and no moderation of the people that did join.  The people that joined online were subjected to violent images, videos and pornographic images and had no way to stop the audio.  Unfortunately, we lost a lot of that participation, understandably.  It is an upsetting experience for those trying to join to learn about a feminist Internet.  It is worth knowing and sharing, as you all have experienced this: this is what people are subjected to.  Anytime you try to discuss and move forward in these spaces, this is the kind of targeted harassment we get.  And so, you know, if someone is attacking you, you know that you are really bothering them and you are moving in the right direction.

I'm really grateful to everybody that was able to join online.  I'm grateful to them for leaving so they can protect themselves.  And I'm hoping to follow up with them from the conversations that you all have had to continue this work. 

So ... we do have one more session.  One more breakout.  We do want to hear from you.  We have a platform to build on for more specific work.  As I shared, when we ran our tech policy design lab, it was quite broad: we looked at online gender‑based violence overall.  I would say this is a good reflection of the scope of the Web Foundation, which is everything that happens online. 

But also just trying to understand what can the lab really tackle?  (Overlapping audio)

As we take the lab forward into 2023 and work on the overlapping and large barriers and difficulties, we're working with the U.N. and UN Women on the Commission on the Status of Women coming up in March, and with great organizations like NDI and those involved in the women's rights online network, to create a better system, structures and support for this work, and working with the U.N. tech envoy's office to bring in a feminist and gender lens.  All of that is continuing.  We welcome your participation in it.

We will also try to pick up very targeted interventions on some of the issues that you all have raised.  So if possible to bring back up the slide deck. 

(Overlapping audio)

The tech is improving minute by minute.  We hope that our experience has now helped others who have sessions following us, and sessions throughout the week, to have a better experience.  Okay.  So, the tech policy design lab, as we shared: multistakeholder, design‑centered workshops to tackle a key tech policy problem with measurable results.

There really is a design component to this.  I have been learning a lot about the design community and sector, and why it is so vital to think through and plan the interfaces and experiences that people have online, in addition to the laws and legislation and how you think about enforcing them. 

(Overlapping audio)

These are the organizations we bring together: Civil Society, activists; media is in there as well.  Some things we have taken forward, though this is not the only list we would think of: products, principles for design, policy changes, new operations, new processes in Government, in large multinational organizations, in Civil Society and in the tech sector. 

So what I would like to ask you all, if you are up for it: when you think of partner organizations, stakeholders, and commitments, what would you like to see from a tech policy design lab?  What would you like us to be focused on?  Who do you know who would like to work with us in this space? 

So we tried to frame a focus with a very specific structure.  Who are the organizations interested?  What does good look like?  And we have this structure of organizations, actors, actions.  Which organizations need to be involved to move a company like Instagram?  For example, we have several groups working on something called malicious tagging.

That is where you get your profile pulled offline.  How can these organizations work to get Instagram to have better reporting processes and better protections, so that people who are targets of malicious tagging get their accounts restored quickly, or not have them blocked in the first place? 

(Overlapping audio)

The proposal is that we have our same breakouts and ask you all to put together a very targeted, narrow construction: what would you like to see changed in the work that you are doing, that you would recommend the lab work on for a three‑month, six‑month, or nine‑month iteration?  That is how long we tend to run these to have an outcome at the end.  So that is my proposal. 

I realize I'm just running through.  But I didn't give anybody the opportunity to sort of respond or reflect on the readouts from the breakouts or the information of what's happened on the Zoom today.  I admit that I am a bit shaken up by it.  If anybody else would like to say anything before going into a breakout, please feel free.  (Overlapping audio).

>> MODERATOR: We have alerted the head of IT and tech.  So now I think there is training for the people in the room who are somewhat new to the interface and technology of how you moderate a Zoom.  There is automatic mute, and people joining virtually cannot open up their video.  So hopefully now the people who are in charge of the Zooms in each of the different rooms know how to make the person who is joining online a co‑host, to have more freedom.  But it seems like it was sort of an open Zoom link for everybody to join. 

>> AME ELLIOT: These are our participants: Ame, Luisa, the Bangladesh team, and Women in Digital.  We had quite a few others that had joined previously. 

(Overlapping audio)

>> To participate in any IGF you have to be registered with IGF.  There should be clear traceability for the people that are engaged in that.  There is no open Zoom at IGF.  Everybody has to be registered as a participant.  So I can't understand how this happened. 

>> MODERATOR: Yeah.  It is a mystery to me as well.  I think sometimes that happens when you are one of the earlier sessions and they are still working out the difficulties.  The only thing it takes to register for IGF is an email address.  There is no independent verification, nor do I think there should be.  But there are technical protections to be put in place to avoid Zoom bombing, which we have all known about for many years. 

(Overlapping audio)

>> Anyway ... moving forward? 

>> MODERATOR: That can be perhaps a specific topic that we can tackle.  Although I think the tech is really in place for that. 

(Overlapping audio)

That there is a design or policy change that can come with that.  I am open if others have an idea ‑‑ (overlapping audio)

And putting it in the context of

(indiscernible)

(Overlapping audio)

>> MODERATOR: We're just going to take about 10 minutes for people to brainstorm, 20 minutes for the discussion.  We'll have a brief readout and then send you all to the rest of IGF.  Thank you very much. 

(Breakout discussions)

>> AME ELLIOT: This is directed to Women in Digital and the Bangladesh remote hub. 

(Overlapping audio)

>> AME ELLIOT: I invite you to make some notes for a few minutes.  Try to come up with some ideas following the similar template.  And then we can share and discuss in some of our remaining time. 

(Breakout) 

(Overlapping audio)

>> MODERATOR: Okay, I will take notes, and people online will take notes here.  If you want to follow along from this session, up top are the notes we will take and clean up from the Post‑its there.  The link is case sensitive.  So if you write that down, you can stay in touch.  Okay.  So I think team selfie, you want to go first? 

>> (Off mic).

>> MODERATOR: (Chuckling)

(Overlapping audio)

>> Okay.  So maybe one minute to share kind of a sentence or structure.  Yeah. 

>> We are looking at malicious tagging and fake Instagram accounts; secondly, online gender‑based violence, specifically addressing violence in the political sphere.  The third category is language exclusions in content moderation.  In terms of who we can engage with to actually look at these issues: Internet Lab, obviously all of us here.  And then ‑‑ for instance, one of our members mentioned Facebook ethical groups that can actually effectuate change.  Secondly, women's safety teams in national jurisdictions, if they operate really well.  It is a bit of a risk, but we must try.  And in terms of Coalitions or collaborations: women's rights online, APC, and one of the most important categories, which is linkages with global media organizations ‑‑ basically looking to international media coverage to put pressure on big tech to address these issues. 

And the solutions that we have ... for malicious tagging and fake Instagram accounts, again, this is a policy approach, but addressing verification policies.  Secondly, when looking at political violence and online gender‑based violence, look at how to change policies, but not just stop there.  Is the change effective?  So that includes monitoring and data evidence afterwards, straight from big tech.  What is the other? 

(Overlapping audio)

>> This could address all of the issues given here: interoperability of data across big tech.  Google has its own form of reporting, Facebook has its own form; categories are different, data protections are different.  Finally, the solution for language exclusion: we want to look at content moderation policies, because a lot of regional and traditional media that have their digital publications online are usually taken down for XYZ reasons, and it is very difficult to put that content back online.  There is also a language gap in the big tech teams.  So that is a solution for that.  So I think we have multiple problems and multiple solutions.  It should work.  (Chuckling)

>> MODERATOR: Totally going to work.  Okay, thank you for that.  It is helpful.  I'm looking forward to the structures and hope to follow up with you all to see what we can take forward.  We have just a few minutes left in this session.  So Bulanda Nkhowani, if you have something to share. 

>> BULANDA NKHOWANI: For security or law enforcement: include training on online violence, to increase access to justice.

We have a case study in South Africa where Civil Society has trained law enforcement on online violence.  We are looking at it in terms of reviewing the curriculum so the training is more structured and sustained. 

I don't have anything. 

(Overlapping audio)

>> MODERATOR: I love that.  That is very focused: do we have a training or curriculum specifically for law enforcement to take this forward?  This is something the Generation Equality Forum, and the Action Coalitions, are focused on: how do we include the judiciary, or train up those that are better able to identify problems, enforce the laws and move through the Court system?  Very glad to hear Paradigm Initiative is working on that, and I look forward to hearing more about it. 

(Overlapping audio).

>> MODERATOR: So we just need for Zoom to ‑‑ sorry.  Thank you.

>> Okay.  So from Civil Society, we discussed (overlapping audio) and actors, of course, actions to be taken, and the people and outcomes.  We will discuss more of the actions to be taken, a few of the specific stakeholders, the people, and all of that.  So one of the things that we discussed is the ability to demand increased transparency on data received by the authorities.  And of course, this involves (overlapping audio)

All right.  The systems that exist in the Regions or countries on online gender‑based violence: we discussed being able to understand what some of the existing laws on online gender‑based violence are, or finding a way to implement some of those that exist in different Regions.  Highlighting some of the challenges, we looked at the lack of feedback from social media platforms when someone raises a concern about online gender‑based violence, the limited data that exists on online gender‑based violence, and the backlash against the work that has already been done by different actors or Civil Society organizations on online gender‑based violence.  Thank you. 

>> MODERATOR: Super. 

(off mic)

Okay.  I really appreciate everybody that has joined and who has stayed.  I learned a lot.  This was a good discussion.  The consistent themes are the need for better understanding of the laws: how do we share this out?  How do we get them enforced?  And the tension between freedom of expression and how to have protection is definitely something that we want to dive into more and explore.  If we go to the next slide.

So here's where we're at this week.  Some of us are here from the women's rights online network and from our partners; if you are interested in continuing these kinds of conversations or you would like to share out more of your work, please join some of the sessions going forward.  There is a town hall Friday morning that I'm particularly excited about, but many of these, the lightning talks and video briefs, are important.  So please come and join us.  And as I said, it is a big year: 2023 has quite a few events ‑‑ this is not exhaustive ‑‑ there is the U.N. data Forum and other events happening.

Really, what we are looking at is how we get this work into the Commission on the Status of Women, working closely with the organizers; how we get this to have impact and be taken forward, and how we get the online gender‑based violence lens in there.  We believe that will be a structure in place for a long time.

I would say that clearly, this work is needed.  We're not aware yet whether any other sessions have been targeted, but the fact that ours was really shows that this work is something that is bothering a lot of people.  So we're here to fight it.  We will continue to do so.  I'm really grateful to the people that are staying strong through it, and who are driving this work forward.  So thank you to you all, and thank you for the time that you shared.  Please stay in touch.  And I think we go to the cafeteria now to reward ourselves with some lunch.  All right.