PART 3 Appendices
LIST OF APPENDICES
Appendix 1: List of contributors (included in Draft JP)
Appendix 2: Synthesis document (included in Draft JP)
Appendix 3: Call for participation (included in Draft JP)
Appendix 4: Survey (this section)
Appendix 5: Case studies (included in Draft JP)
Appendix 6: Social media campaign (included in Draft JP)
Appendix 7: Meeting summaries (included in Draft JP)
Appendix 8: Progress updates (included in Draft JP)
APPENDIX 4: SURVEY
PART 1
A. DESIGN AND METHODOLOGY
Survey questions were derived from the skeleton document to address specific sections of the BPF’s scope of work. The questions were drafted and refined by members of the BPF community (henceforth ‘the survey designers’) following consultation on the BPF mailing list and during a virtual meeting dedicated to survey planning.
The survey (see Part 2 for the survey questions) focused primarily on two aspects of the BPF’s work: defining the problem of online violence and/or abuse, and measuring its impact on both communities and individuals. Because the target audience of the survey was not defined and invitations to complete the survey would be sent to both experts in the field and general Internet users, the survey provided relevant background, context and descriptions where these were deemed necessary. To encourage broader stakeholder participation, the survey was also kept relatively short, with a combination of closed-ended categorical and open-ended questions, the latter providing the opportunity for lengthy, substantive responses.
Responses were elicited over a period of one month by calls on the mailing list, social media (including tweets from the IGF’s Twitter account), and emailed invitations (see Appendix 3 for the call for participation) to various mailing lists (including mailing lists within the Internet governance, academic and broader communities).
Diversity of respondents
In total, 56 responses were collected. The largest proportion of responses was submitted by respondents who identified themselves as part of the civil society stakeholder group (41%), and the smallest by the technical community (4%). It should be noted, however, that the identified stakeholder groups were not necessarily mutually exclusive. Of these respondents, 31 also identified their organizations, which ranged from civil society organizations to police and government departments, universities and intergovernmental organizations.
The survey attracted responses from a rich diversity of regions, particularly from developing countries. Of the respondents who identified their countries (52 out of 56), 25% were from Africa, 23% from Europe, 17% from Asia, 13% from Central and South America, 12% from the Middle East and 10% from North America. Within these regions a range of countries was represented. From the Africa region, for instance, survey responses were received from South Africa, Zambia, Nigeria, Ghana, Tunisia, Kenya, Cameroon and Uganda. Fewer countries were represented in the Europe region, however, with responses received only from the UK, Estonia, Switzerland and Germany.
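For readers who wish to reproduce this kind of breakdown, the following is a minimal sketch (in Python with pandas) of how the stakeholder and regional percentages could be tabulated from an exported response sheet. It is an illustration only: the file name responses.csv and the column names stakeholder_group and region are hypothetical placeholders, and the BPF’s actual export format and analysis workflow are not documented here.

```python
import pandas as pd

# Hypothetical export of the survey responses: one row per respondent.
# The file and column names are illustrative placeholders, not the actual
# Google Forms export schema used by the BPF.
responses = pd.read_csv("responses.csv")

# Share of respondents per stakeholder group (e.g. ~41% civil society,
# ~4% technical community in the survey described above).
stakeholder_pct = (
    responses["stakeholder_group"].value_counts(normalize=True).mul(100).round(1)
)
print(stakeholder_pct)

# Regional breakdown, restricted to respondents who identified a country
# (52 of the 56 responses in this survey).
with_region = responses.dropna(subset=["region"])
region_pct = with_region["region"].value_counts(normalize=True).mul(100).round(1)
print(region_pct)
```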
B. ANALYSIS
The survey analysis was conducted with the goal of gathering stakeholder perceptions of, and comments on, the BPF’s topic. The analysis was done to consolidate and identify common concerns, issues and definitions for further study and for incorporation into the main outcome document where relevant.
Given the number of substantive responses to open-ended questions, many interesting comments and/or quotations were also highlighted for inclusion in the main outcome document.
a) Definition of online VAW
The first task in defining the BPF’s scope of work was outlining what constitutes online abuse and online violence against women. The survey asked respondents to list, based on their knowledge and/or experience, examples of the types of behaviour they consider to fall within this ambit. This was an open-ended question that received a total of 43 responses.
In the survey responses, proffered definitions of online violence against women and girls generally contained three common elements:
- the range of action/ behaviour that constitutes online VAWG;
- the impact on rights and the harm experienced; and
- the role of technology in enacting/ enabling online VAWG.
Many respondents stressed that online VAWG is echoed in offline spaces, whilst some personal definitions also specifically recognised online violence/ abuse as a violation of women’s rights.
Types of action/ behaviour
The list compiled below consolidates responses submitted during online virtual BPF meetings, through the mailing list and on the first draft outline document, published on a shared Google doc. Content marked in grey font was derived from survey responses:
Infringement of privacy:
- accessing, using, manipulating and/or disseminating private data without consent (by hacking into your account, stealing your password, using your identity, using your computer to access your accounts while it is logged in, etc.)
- taking, accessing, using, manipulating, and/or disseminating photographs and/or videos without consent (including revenge pornography)
- sharing and/or disseminating private information and/or content, including (sexualised) images, audio clips and/or video clips, without knowledge or consent
- doxxing (researching and broadcasting personally identifiable information about an individual without consent, sometimes with the intention of providing access to the woman in the ‘real’ world for harassment and/or other purposes)
- contacting and/or harassing a user’s children to gain access to her
Surveillance and monitoring:
- monitoring, tracking and/or surveillance of online and offline activities
- using spyware without a user’s consent
- using GPS or other geolocator software to track a woman’s movements without consent
- stalking
Damaging reputation and/or credibility:
- deleting, sending and/or manipulating emails and/or content without consent
- creating and sharing false personal data (like online accounts, advertisements, or social media accounts) with the intention of damaging a user’s reputation
- manipulating and/or creating fake photographs and/or videos
- identity theft (e.g. pretending to be the person who created an image and posting or sharing it publicly)
- disseminating private (and/or culturally sensitive/ controversial) information for the purpose of damaging someone’s reputation
- making offensive, disparaging and/or false online comments and/or postings that are intended to tarnish a person’s reputation (including libel/ defamation)
Harassment (which may be accompanied by offline harassment):
- “cyber bullying” and/or repeated harassment through unwanted messages, attention and/or contact
- direct threats of violence, including threats of sexual and/or physical violence (e.g. threats like ‘I am going to rape you’)
- abusive comments
- inappropriate jokes that serve to demean women
- verbal online abuse
- unsolicited sending and/or receiving of sexually explicit materials
- incitement to physical violence
- hate speech, social media posts and/or mail, often targeted at gender and/or sexuality
- online content that portrays women as sexual objects
- use of sexist and/or gendered comments or name-calling (e.g. use of terms like “bitch”/”slut”)
- use of indecent or violent images to demean women
- exposing women to unwanted imagery that may impact them negatively
- abusing and/or shaming a woman for expressing views that are not normative, for disagreeing with people (often men) and also for refusing sexual advances
- mobbing, including the selection of a target for bullying/ mobbing by a group of people rather than an individual and as a practice specifically facilitated by technology
Direct threats and/or violence:
- trafficking of women through the use of technology, including use of technology for victim selection and preparation (planned sexual assault and/or feminicide)
- sexualised blackmail and/or extortion
- theft of identity, money and/or property
- impersonation resulting in physical attack
- grooming
Targeted attacks on communities:
- hacking websites, social media and/or email accounts of organisations and communities
- surveillance and monitoring of the activities of members of the community
- direct threats of violence to community members
- mobbing, including the selection of a target for bullying/ mobbing by a group of people rather than an individual and as a practice specifically facilitated by technology
- disclosure of anonymised information, such as the addresses of shelters
Limiting women’s access to and/or use of technology:
- limiting women’s access to the Internet and/or online services that men are allowed to use
Intellectual property:
- stealing, manipulating and/or abusing a woman’s intellectual property online, including ideas and/or content
Legislative/ research definitions submitted by survey respondents
‘Violence against women’ is defined in article 1 of the Declaration on the Elimination of Violence against Women (United Nations General Assembly, 1993) to mean:
“any act of gender-based violence that results in, or is likely to result in, physical, sexual or psychological harm or suffering to women, including threats of such acts, coercion or arbitrary deprivation of liberty, whether occurring in public or in private life”
The Women’s Aid report ‘Virtual World: Real Fear’[1] looked into online harassment, stalking and abuse and defined online abuse as:
“the use of the internet or other electronic means to direct abusive, unwanted and offensive behaviour at an individual or group of individuals.”
Research by APC[2] on online VAWG defines technology-related violence as encompassing:
“acts of gender-based violence that are committed, abetted or aggravated, in part or fully, by the use of information and communication technologies (ICTs), such as phones, the internet, social media platforms, and email.”
The UN Broadband Commission for Digital Development Working Group on Broadband and Gender report on “Cyber violence against women and girls” defines cyber violence against women and girls to include:
“hate speech (publishing a blasphemous libel), hacking (intercepting private communications), identity theft, online stalking (criminal harassment) and uttering threats. It can entail convincing a target to end their lives (counselling suicide or advocating genocide). The Internet also facilitates other forms of violence against girls and women including trafficking and sex trade.”
Online violence and its relationship to offline violence
Various respondents stressed that online violence not only permeates the offline sphere, but also often extends from offline environments (and patterns of abuse, like ongoing domestic abuse) into the online sphere, and vice versa. Online VAW thus needs to be studied with offline environments in mind.
Some relevant research shared by survey respondents in this regard:
- Women’s Aid research with nearly 700 survivors of domestic abuse who experienced online abuse from a partner or ex-partner found:
  - For 85% of respondents, the abuse they received online from a partner or ex-partner was part of a pattern of abuse they also experienced offline.
  - Nearly a third of respondents who had received direct threats online from a partner or ex-partner stated that those threats were carried out.
  - Women’s Aid believes that progress has been made over the past two years, but there is still far to go to ensure that women are safe online.
- APC’s End Violence research was conducted in seven countries and included mapping of online violence both globally and in those seven countries.
b) Impact and consequences of online VAW
This section was designed using a combination of closed- and open-ended questions. Options were drawn from existing research and work in this area by the survey designers, with an open-ended option to ensure that new knowledge could be captured.
Respondents were asked to tick which consequences they believed online VAW to have on individuals and communities respectively. Respondents could tick more than one option, and an ‘other’ category was also provided.
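Because respondents could tick several options, the percentages reported in Tables 1 to 3 below are shares of respondents rather than shares of ticks, and so can sum to more than 100%. The sketch below illustrates that calculation; the DataFrame and its column names are hypothetical stand-ins for however the multi-select answers were exported, not the BPF’s actual data.

```python
import pandas as pd

# Hypothetical multi-select export: one column per answer option, with True
# where a respondent ticked that option (names are illustrative only).
answers = pd.DataFrame(
    {
        "fear_anxiety_depression": [True, True, False, True],
        "withdraw_from_online_spaces": [True, False, True, True],
        "other": [False, True, False, False],
    }
)

# Each option's share of respondents (not of total ticks): the mean of a
# boolean column is the fraction of respondents who ticked it, so the
# resulting percentages can add up to more than 100%.
option_pct = answers.mean().mul(100).round(1)
print(option_pct)
```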
Impact on individuals
The most common consequences of online VAW for individuals, according to survey respondents (see Table 1 below), are that women suffer fear, anxiety and depression (89% of respondents) and that they withdraw from online spaces and reduce the extent of their engagement with the Internet (83% of respondents). Other common consequences, according to survey respondents, include that women may consider or attempt suicide (66%); have their work and income affected (66%); and experience their mobility being limited and/or curtailed (64%).
It is notable that the response rates for all options are relatively high (above 60%), which points to the significant and multi-dimensional impact that online VAW can have on the women who experience it. Although some respondents selected the ‘Other’ option, none listed additional consequences.
| Table 1: Potential impact on individuals | % of respondents |
| suffer fear, anxiety and depression | 88.7% |
| consider or attempt suicide | 66% |
| withdraw from online spaces and engagement with the Internet | 83% |
| lose their social networks and/or support | 62.3% |
| have their work and income being affected | 66% |
| experience their mobility being limited and/or curtailed | 64.2% |
| Other | 39.6% |
Impact on communities
The most common effect of online VAW on communities, according to survey respondents (see Table 2 below), is the creation of a society where women do not feel safe online and/or offline (83% of respondents). Online VAWG also contributes to a culture of sexism and misogyny online (77%) and, in offline spaces, to existing gender inequality (74%). It is also seen to limit women’s ability to benefit from the same opportunities online that men frequently benefit from (e.g. employment, self-promotion and/or self-expression) (69.8%). As a result, online VAW also contributes to the creation of a society where men and women cannot participate equally online (62.3%).
| Table 2: Potential consequences for communities | % of respondents |
| create a society where women don’t feel safe online and/or offline | 83% |
| create a society where men and women do not participate equally online | 62.3% |
| contribute to a culture of sexism and misogyny online | 77.4% |
| disadvantage women, as they do not have the same opportunities for benefiting from the Internet as result (e.g. employment, self-promotion, self-expression) | 69.8% |
| contribute to existing gender inequality in offline spaces | 73.6% |
| Other | |
In some of the other, open-ended survey questions, respondents sometimes also highlighted the impact and consequences of online VAW (e.g. in the question about defining online VAWG). These include (in no specific order):
- limiting and/or violating women’s rights;
- physical or psychological damage, including public humiliation;
- making women feel unsafe;
- silencing individuals; and
- forcing women out of online spaces.
c) Enabling environments
The survey also aimed to identify underlying factors that can contribute to online VAW. These potential factors were identified and compiled by the survey designers from existing experience and research. A non-exhaustive list was provided in the survey and respondents were asked to tick all the relevant factors, and to cite any additional factors not identified (see Table 3 below). No respondents specified other factors, although 12% of the respondents did think there were ‘other’ factors involved.
The two most significant factors recognised as contributing to and/or enabling online VAW are a lack of awareness and recognition of online VAW (86% of respondents), and inequality and sexism offline that is reflected and amplified in online spaces (80%). This seems to point to an existing culture that accepts gender disparity and online VAW as part and parcel of interactions online, which renders it invisible and normalised.
The next most selected group of factors related to gaps in legal remedies and barriers to accessing justice: namely, a lack of trained moderators, police officers, etc. available to respond to cases of online VAW (74%), and a lack of legal remedies available to respond to such cases (70%). This seems to indicate that greater regulatory guidelines are needed to, at a minimum, provide recognition of online VAW as a violation of rights. It also points to the important role that first-level responders play in creating a safer online environment that rejects VAW.
Women’s unequal participation as decision makers in the development of technology platforms and policies was also recognised as an important factor (68%), which may be linked to the lack of mechanisms available in online platforms to enable effective responses to cases of online VAW (66%).
Gender disparities in both access to the Internet and the skills of Internet users were seen as relatively less significant factors by respondents (40% and 54%, respectively).
| Table 3: Factors that might contribute to/ help enable online VAW | % of respondents |
| gender disparity in terms of access to the Internet | 40% |
| gender disparity in terms of skills in using the Internet | 54% |
| inequality and sexism offline that is reflected and amplified in online spaces | 80% |
| lack of awareness and recognition of online VAW as a serious issue | 86% |
| women’s unequal participation as decision makers in the development of technology platforms and policies | 68% |
| inadequate mechanisms available in online platforms that enable effective response to cases of online VAW | 66% |
| lack of legal remedies to respond to cases of online VAW | 70% |
| lack of trained moderators, police officers, etc. to respond to cases of online VAW | 74% |
| Other | 12% |
As with impact, in the other, open-ended survey questions, respondents sometimes also referred to factors that might enable online violence and abuse. These specifically included the ability to remain anonymous online and the sense of impunity that exists in the online sphere.
The responses provide insight into measures that can be taken to address this issue. Although capacity building on skills is seen to play a role in creating a safer environment online, the responses point strongly to the need to address the underlying structures of gender disparity and the culture of sexism that facilitate the perpetuation of online VAW. This includes the need for greater regulatory guidelines and measures that both provide recognition and dedicate resources and priority to training first-level responders on this issue, as well as the need for more equal participation of women in technology development and decision-making.
d) Specific examples
Respondents also shared specific examples of particular cases of online VAW that they faced, which help to outline the interrelated and complex dimensions of this issue. These can be read together with the country case studies, which provide a more detailed illustration of how online VAW is experienced and responded to in different contexts (see Appendix 5). Note that these responses have not been edited.
Respondent from Zimbabwe, residing in UK: I want my case and of about 109 women in Zimbabwe community in UK to serve as an example. All of us 109 women have suffered from one cyberbully. We are willing to be interviewed to help in the study because everyone is yearning to speak out about it. The results might help the world to understand violence against women. Our question is will the Human Rights Act be used to protect the online victims by Vio Mak in UK. Zimbabweans in UK are suffering… Online abuse is defined with my own personal experience. This is where one takes presents your information wrongly on their website with intention to put your reputation in disrepute. I live in UK where many other laws could have protected me but simply because this happens online police have said they can not do much yet the perpetrators are known. In her many fake websites a woman called Vio Mak who parades as a human rights defender has distorted the work I do for charity by misinforming the public. Daily she posts defamatory statements about me or other women from Zimbabwe. She labels us fraudsters, prostitutes, witches and many cultural unacceptable names. This has caused me and other victims to be hated by the public. She has cyberbullied us almost daily. Two women almost committed suicide. Since she has assumed name Human Rights defender police in UK believes her and so she is untouchable. She threatens women with deportations as she claims she is linked to UK government. Employers who google victims have dismissed from work. She can pick on anyone from Zimbabwe and defame them. To me this is online abuse and violence against women done under guise of goof name.
Respondent from Uganda: Of recent in Uganda, there have been case of leaking nude photos of women and girls having sex (sextapes) by their enstranged lovers.
Respondent from Germany: Since a while I have seen a few threats by terrorists which have messages against all women or especially girls (The most brutal I saw: picture of a grown up 30 years old terrorist holding a 7-9 year old girl on a market place in public, posing for the picture, – I guess it was a yezidish girl- next to him saying that all those girls where being married as a loan for that kind of terrorists, that all those (girls) are their loan and slaves; I was shocked and twitter after my complaint deleted it;)Marion Böker, Germany: Since I use online media and options since long I remember having worked 15 years ago for a political party; I had to email a lot, had to be and wanted to be online; and felt harrassed by a group of men (they only write very long e-mails, or chat messages- in the night at 3 am or so; and they were calling me part of a feminist (communist, jewish…) conspiracy against men, they described them as victims of women like me and threatened me,- it was bad since the internet keeps that somewhat forever… I received threatening fascist emails which forsaw torture for me, and finalized with: we know you private address, we kill you- and the police to whom I reported categorized it only as a ‘insult’, may be because the e-mail started weith ‘You cunt (bad word for vulva), but the police ignored all parts where they wrote about torture and my death. The report at the police ended in nothing: but impunity.
e) Other comments
The survey included a ‘catch-all’ question that asked respondents whether they had any other comments regarding the definition, scope and/or issue of online abuse and VAW. The survey designers included this question with the aim of gathering comments that other questions did not directly address and of providing respondents with a space to share additional thoughts and advice for the work of the BPF. Where relevant, responses to this question have been incorporated in the analysis of other survey questions. For example, where a comment related to the definition of the issue, it was included with the survey question asking respondents to define the issue and analysed in that section.
Lack of awareness/ need for literacy programmes:
The awareness and visibility around this issue for younger people is still lacking. (Anonymous)
I strongly think there should be a heightened awareness program about online safety and how to safeguard yourself. (Respondent from India)
Youth:
Online Abuse/ VAW is best curbed in early stages through Child Online Protection. If children are taught on child online safety, they tend to grow up knowing the do’s and don’ts in the online environment. (Respondent from Zambia)
Proactive versus reactive responses?
Online Abuse / VAW is a gradual behaviour that does not happen overnight but keeps growing if not stopped or controlled. Any form of intervention at any stage can help reduce the vices. (Respondent from Zambia)
I am a Law Enforcement officer and I have seen how weak our laws are when it comes to combating Online Abuse / VAW. We have Re-active laws and not Pro-active ones. Until such a time we have Pro-active laws, Online Abuse / VAW will continue to disadvantage victims of the vices. (Respondent from Zambia)
Importance of context
Online VAW are inherited from offline social problems that we might have in our societies and these problems could vary from one community to another. In my mind, these studies could be customized at regional or local context, which will enable us to gather more accurate data about a community and what constitutes towards VAW so that appropriate and effective recommendations could be provided. (Respondent from Afghanistan)
Importance of the technical community
There should be a special team at National CERTs looking into Online VAW at a grass-root level so as to ensure elimination of VAW at the very basic level. Also, at the intergovernmental and global level (UN and other international organizations) the issue needs to be debated and a comprehensive framework built to fix this menace of VAW. (Respondent from Pakistan).
Building awareness of female experts in the field
I also think that men still, especially in infotechnology consider women as not equal – and awareness how good women really are in that field should be emphasized much more.
Intermediary responsibility
Internet intermediaries (ISPs, telephone companies, website hosts) also hide from the cloaks of their terms and conditions, they do not claim responsibility and have no accountability when online VAW take place using their platforms. (Respondent from the Philippines)
Following work on the topic of online abuse and VAW social media companies such as Twitter and Facebook have improved their safety processes and organisations such as google highlight their policies on issues such as revenge porn more prominently. (Respondent from the UK)
PART 2
SURVEY CONTENT
The survey was conducted using Google Forms, which allows an unlimited number of questions and responses and provides user-friendly design mechanisms to aid the layout of the survey.[3] The survey contents are reproduced below:
SURVEY: Countering the Abuse of Women Online
This brief survey is the first in a series of two surveys designed with the aim to gather broader stakeholder input on topics that are of vital importance to the work of the Internet Governance Forum (IGF) best practice forum (BPF) on Countering the Abuse of Women Online.
All contributions will be used to guide our work, which is aimed at creating a compendium of practices that help to counter the abuse of women online.
Read more about this initiative here: http://www.intgovforum.org/cms/best-practice-forums/4-practices-to-countering-abuse-against-women-online
For questions, please contact the BPF rapporteur, Anri van der Spuy (avanderspuy@unog.ch).
* Required
Tell us about yourself
This BPF is an open and inclusive platform that aims to collect experiences from a variety of stakeholders. To get an idea of how diverse contributions are, we appreciate your responses to these two basic questions.
What stakeholder group do you belong to? *
Select closest option.
- Government
- Technical community
- Civil society
- Private sector
- Intergovernmental organization
- Individual user
- Academia
- Youth
Where are you from? *
Please write only the country name where you are ordinarily resident.
What is your name?
You can remain anonymous if you choose to. If you don’t mind telling us who you are, please write your name.
What organization do you work for?
You can remain anonymous if you choose to. If you don’t mind telling us who you are affiliated to, please write your organization’s name.
About online violence against women (VAW)
There is still a significant lack of awareness regarding what kinds of online conduct constitute abusive and violent behaviour. To address the increasing prevalence of online VAW in an effective manner, we need to understand how you perceive online VAW, the factors that enable and/or contribute to such conduct, and the impact that online VAW has on not only individuals, but also communities.
How would you define online abuse and VAW?
Please add specific references from research or other policy documents as you see relevant.
In your knowledge or experience, what are the types of behaviour of conduct that you think constitute online abuse or VAW?
What impact do you think online violence against women can have on individuals? Individuals suffering from online abuse and VAW may:
Choose most appropriate option(s). Please add any comments or thoughts as you see fit, or other effects that are not included in the list.
- suffer fear, anxiety and depression
- consider or attempt suicide
- withdraw from online spaces and engagement with the Internet
- lose their social networks and/or support
- have their work and income being affected
- experience their mobility being limited and/or curtailed
- Other:
What effect(s) do you think online VAW can have on communities? It can:
Choose most appropriate option(s). Please add any comments or thoughts as you see fit, or other effects that are not included in the list.
- create a society where women don’t feel safe online and/or offline
- create a society where men and women do not participate equally online
- contribute to a culture of sexism and misogyny online
- disadvantage women, as they do not have the same opportunities for benefiting from the Internet as result (e.g. employment, self-promotion, self-expression)
- contribute to existing gender inequality in offline spaces
- Other:
What do you think are some of the factors that contribute to online VAW?
Please add to the list or elaborate on your thoughts in the ‘other’ box below.
- gender disparity in terms of access to the Internet
- gender disparity in terms of skills in using the Internet
- inequality and sexism offline that is reflected and amplified in online spaces
- lack of awareness and recognition of online VAW as a serious issue
- women’s unequal participation as decision makers in the development of technology platforms and policies
- inadequate mechanisms available in online platforms that enable effective response to cases of online VAW
- lack of legal remedies to respond to cases of online VAW
- lack of trained moderators, police officers, etc. to respond to cases of online VAW
- Other:
Other advice & help
Do you know of any resources that could help this BPF’s work?
Resources include research, reports, documents, etc. Please include a link to the relevant source, or otherwise cite the title of the publication, name of author(s), publication date and/or source.
Do you have any other comments or thoughts about the definition, scope and issue of online abuse and VAW?
Join us & make a difference
Are you interested in helping us address the challenge of online violence against women? We welcome all participants:
- Join our mailing list for updates on meetings and other developments: http://mail.intgovforum.org/mailman/listinfo/bp_counteringabuse_intgovforum.org
- Visit the BPF’s platform on the IGF’s website: http://www.intgovforum.org/cms/best-practice-forums/4-practices-to-countering-abuse-against-women-online#about
- For more information, contact Anri van der Spuy (avanderspuy@unog.ch).
Thank you
We appreciate the time you spent in completing this survey, look forward to learning from your valued responses, and hopefully to welcoming you to our BPF in the future.
[1] Available online: http://www.womensaid.org.uk/page.asp?section=00010001001400130007&sectionTitle=Virtual+World+Real+Fear.
[2] Available online: http://www.genderit.org/articles/impunity-justice-exploring-corporate-and-legal-remedies-technology-related-violence-against
[3] The survey as on Google Forms can be viewed here: https://docs.google.com/forms/d/1Az3fSQRX5nVlkMpReLz4Vtk8QWygHqqJRrSrbvK5ZS0/viewform?fbzx=3083091881606085133.