PART 2 Results
A. PROBLEM DEFINITION

a) Context

Due to a lack of awareness of the types of behaviour and/or conduct that constitute violence or abuse against women in an online environment, it is important to clearly define such conduct, with enough room to allow for its changing expression as technological developments create new ways of interacting and new potential violations.
In the survey responses, definitions of online violence against women and girls (VAWG) contained three common elements, namely: i) the range of actions/behaviours that constitute online abuse and violence; ii) the impact on rights and the harm experienced; and iii) the role of technology in enacting and/or enabling online violence and/or abuse. In addition to these common elements, the considerations in the following paragraphs should be kept in mind.
Online violence forms part of the broader spectrum of violence and abuse against women, and frequently permeates the offline sphere, just as offline violence extends into online environments. For many women who experience online abuse from a partner or ex-partner, for instance, the online abuse forms part of a pattern of abuse that is also experienced offline, such as ongoing domestic abuse. Online VAWG thus needs to be studied with its location in, and potential repercussions for, offline environments in mind.
There is a serious lack of awareness about women’s rights and the impact of online VAWG on women’s rights, as is also indicated by the survey results.[1] This lack of awareness not only reinforces the importance of sensitisation, awareness raising and literacy programmes on the topic, but it also makes it difficult for victims to make claims for the fulfilment and enforcement of such rights.[2]
Women do not have to be Internet users to suffer online violence and/or abuse (e.g. the distribution of rape videos online where victims are unaware of the distribution). On the other hand, for many women who are active online, online spaces are intricately linked to offline spaces, making it difficult for them to differentiate between events that take place online and events that take place offline.[3]
b) Types of behaviour and/or conduct that constitute online VAWG

One of the first tasks involved in defining the BPF’s scope of work was outlining what constitutes online abuse and online violence against women. Participants saw this as a critical task because of the perceived lack of awareness regarding the issue. A survey was designed and broadly distributed, asking a range of stakeholders to list examples of the types of behaviour that, in their knowledge and/or experience, fall within this ambit. There were 43 responses to this question in total, from different regions and stakeholder groups (see Part 3, Appendix 4).
Many respondents stressed the fact that online violence also echoed in offline spaces, whilst some definitions also specifically recognised online violence/abuse as a violation of women’s rights.
Many of the examples of online abuse and violence (discussed in more detail below) cited by survey respondents were similar or overlapping (especially when synonyms are considered). The examples most frequently identified related to infringements of privacy, harassment, surveillance and monitoring, and damage to reputation and/or credibility. Direct threats of violence, blackmail and attacks against communities were less frequently listed as examples of abuse. Some respondents also felt that excluding women from accessing the Internet and/or certain online services because they are female amounted to online abuse and/or violence.
The following non-exhaustive list identifies online conduct and/or behaviour that is seen as constituting forms of online VAWG. Note that while conduct has been divided into categories for ease of reference, these categories are by no means mutually exclusive:
i) Infringement of privacy

- accessing, using, manipulating and/or disseminating private data without consent (e.g. by hacking into a user’s account, stealing her password, using her identity, or using her computer to access her accounts while it is logged in)
- taking, accessing, using, manipulating, and/or disseminating photographs and/or videos without consent (including revenge pornography)
- sharing and/or disseminating private information and/or content, including (sexualised) images, audio clips and/or video clips, without knowledge or consent
- doxing (researching and broadcasting personally identifiable information about an individual without consent, sometimes with the intention of providing access to the woman in the ‘real’ world for harassment and/or other purposes)
- contacting and/or harassing a user’s children to gain access to her
ii) Surveillance and monitoring
- monitoring, tracking and/or surveillance of online and offline activities
- using spyware without a user’s consent
- using GPS or other geolocator software to track a woman’s movements without consent
- stalking
iii) Damaging reputation and/or credibility
- deleting, sending and/or manipulating emails and/or content without consent
- creating and sharing false personal data (like online accounts, advertisements, or social media accounts) with the intention of damaging a user’s reputation
- manipulating and/or creating fake photographs and/or videos
- identity theft (e.g. pretending to be the person who created an image and posting or sharing it publicly)
- disseminating private (and/or culturally sensitive/ controversial) information for the purpose of damaging someone’s reputation
- making offensive, disparaging and/or false online comments and/or postings that are intended to tarnish a person’s reputation (including libel/ defamation)
iv) Harassment (which may be accompanied by offline harassment)
- “cyber bullying” and/or repeated harassment through unwanted messages, attention and/or contact
- direct threats of violence, including threats of sexual and/or physical violence (e.g. threats like ‘I am going to rape you’)
- abusive comments
- inappropriate jokes that serve to demean women
- verbal online abuse
- unsolicited sending and/or receiving of sexually explicit materials
- incitement to physical violence
- hate speech in social media posts and/or mail, often targeted at gender and/or sexuality
- online content that portrays women as sexual objects
- use of sexist and/or gendered comments or name-calling (e.g. use of terms like “bitch”/”slut”)
- use of indecent or violent images to demean women
- exposing women to unwanted imagery that may impact them negatively
- abusing and/or shaming a woman for expressing views that are not normative, for disagreeing with people (often men) and also for refusing sexual advances
- counselling suicide or advocating femicide
- mobbing, including the selection of a target for bullying/ mobbing by a group of people rather than an individual and as a practice specifically facilitated by technology
v) Direct threats and/or violence
- trafficking of women through the use of technology, including use of technology for victim selection and preparation (planned sexual assault and/or femicide)
- sexualised blackmail and/or extortion
- theft of identity, money and/or property
- impersonation resulting in physical attack
- grooming
vi) Targeted attacks on communities
- hacking websites, social media and/or email accounts of organisations and communities
- surveillance and monitoring of activities by members in the community
- direct threats of violence to community members
- mobbing, including the selection of a target for bullying/ mobbing by a group of people rather than an individual and as a practice specifically facilitated by technology
- disclosure of confidential information, like the addresses of shelters, etc.
vii) Limiting women’s access to and/or use of technology
- limiting women’s access to the Internet and/or online services that men are allowed to use
viii) Theft or abuse of intellectual property

- stealing, manipulating and/or abusing a woman’s intellectual property online, including ideas and/or content.
c) Policy and research definitions

There have been some efforts in policy documents and research initiatives to define what constitutes online violence against women. These provide some guidance in delineating the dimensions of the issue.
Violence against women is defined in article 1 of the Declaration on the Elimination of Violence against Women (United Nations General Assembly, 1993) to mean:

‘any act of gender-based violence that results in, or is likely to result in, physical, sexual or psychological harm or suffering to women, including threats of such acts, coercion or arbitrary deprivation of liberty, whether occurring in public or in private life.’

The phrase ‘any act’ covers online violence against women.

Women’s Aid’s report[4] ‘Virtual World: Real Fear’ looked into online harassment, stalking and abuse, and defined online abuse as:

‘the use of the internet or other electronic means to direct abusive, unwanted and offensive behaviour at an individual or group of individuals.’

Research by APC on online VAWG[5] defines technology-related violence as encompassing:

‘acts of gender-based violence that are committed, abetted or aggravated, in part or fully, by the use of information and communication technologies (ICTs), such as phones, the internet, social media platforms, and email.’

The recently published report of the UN Broadband Commission for Digital Development’s Working Group on Broadband and Gender[6] defines ‘cyber VAWG’ to include:
‘hate speech (publishing a blasphemous libel), hacking (intercepting private communications), identity theft, online stalking (criminal harassment) and uttering threats. It can entail convincing a target to end their lives (counselling suicide or advocating genocide). The Internet also facilitates other forms of violence against girls and women including trafficking and sex trade.’

As early as 2006, the UN Secretary General’s In-depth study on all forms of violence against women[7] recognised that:

‘More inquiry is needed about the use of technology, such as computers and cell phones, in developing and expanding forms of violence. Evolving and emerging forms of violence need to be named so that they can be recognized and better addressed.’

It is only in recent years that UN experts and intergovernmental bodies have begun to pay attention to this issue. In March 2013 the Commission on the Status of Women’s Agreed conclusions on the elimination and prevention of all forms of violence against women and girls,[8] adopted at its 57th session, urged governments and relevant stakeholders to:

‘… develop mechanisms to combat the use of ICT and social media to perpetrate violence against women and girls, including the criminal misuse of ICT for sexual harassment, sexual exploitation, child pornography and trafficking in women and girls, and emerging forms of violence such as cyber stalking, cyber bullying and privacy violations that compromise women’s and girls’ safety.’

In mid-2013, the UN Working Group on Discrimination against women in law and practice included a specific reference to the Internet as ‘a site of diverse forms of violence against women’. The Working Group expressed concern that for ‘women who engage in public debate through the Internet, the risk of harassment is experienced online, for example, an anonymous negative campaign calling for the gang rape of a woman human rights defender, with racist abuse posted in her Wikipedia profile’. It also recommended that states support women’s equal participation in political and public life through ICTs, including by ensuring gender-responsiveness in the promotion and protection of human rights on the Internet, and improving women’s access to the global governance of ICTs.[9]

At the end of 2013, the UNGA adopted a consensus resolution[10] on protecting women human rights defenders with language on tech-related human rights violations:

‘… information-technology-related violations, abuses and violence against women, including women human rights defenders, such as online harassment, cyberstalking, violation of privacy, censorship and hacking of e-mail accounts, mobile phones and other electronic devices, with a view to discrediting them and/or inciting other violations and abuses against them, are a growing concern and a manifestation of systemic gender-based discrimination, requiring effective responses compliant with human rights.’

Most recently, the UN Special Rapporteur on VAW’s report to the 29th session of the Human Rights Council on her mission to the UK expressed concern about “women aged between 18 and 29 being at greatest risk of threatening and offensive advances on the Internet”.[11]
B. IMPACT AND CONSEQUENCES

[Note: this section will be augmented by a social media campaign in October, and will be updated accordingly in the Draft JP]

The social and economic impact of online VAWG, on individuals as well as wider communities, is influenced by multiple factors and can be difficult to measure. Some of the most common impacts were identified from BPF participants’ input and through the broad stakeholder survey, and are outlined below.[12]
a) Impact on individuals

Women commonly suffer fear, anxiety and depression as a result of online VAWG, reducing their involvement with the Internet and leading to withdrawal from online spaces (sometimes to the extent that it may lead to suicide or attempted suicide). Victims’ work, ambitions and income are frequently affected, and they may experience their mobility being limited and/or curtailed. Online VAWG furthermore translates into offline environments: online abuse can be transformed into offline, physical abuse because of information shared online, and it can lead to the identification and/or preparation of victims for trafficking and/or other forms of offline abuse/violence.

It is notable that the response rates for all options in the survey were relatively high (above 60%), which points to the significant and multi-dimensional impact that online VAWG can have on the women who experience it.
b) Impact on communities

One of the most common consequences of online VAWG for communities, according to survey respondents, is the creation of a society where women no longer feel safe online and/or offline. Online VAWG also contributes to a culture of sexism and misogyny online and, in offline spaces, to existing gender inequality. In respect of the latter, online VAWG disadvantages women by limiting their ability to benefit from the same opportunities online that men frequently benefit from (e.g. employment, self-promotion and/or self-expression).
In some of the other, open-ended survey questions, respondents also highlighted the impact and consequences of online VAWG. These include (in no specific order):
- limiting and/or violating women’s rights;
- physical or psychological damage, including public humiliation;
- making women feel unsafe;
- silencing individuals; and
- forcing women out of online spaces.
c) Impact in specific contexts

Online VAWG can impact women in different ways depending on their context or identity. This can be attributed to the multiple and intersecting forms of discrimination that women and girls face based on these factors. For example, women can be at greater risk of diverse types of abusive or violent behaviour because of their profession, age, identity or geographical location. Some of these specific contexts or ‘classifications’ of women are outlined in the paragraphs below.[13]

i) Girls and young women

There is growing recognition in many countries around the world of the particular risks that young people and children face online, reflected in, among other things, an IGF BPF on the topic in 2014.[14] Some initiatives, like Disney’s education programmes aimed at young children (under the age of 10 years), are also designed to teach children to respect everyone online and to prevent cyberbullying.[15] However, although girls and young women are often more likely to experience certain forms of online VAWG, particularly with respect to their bodily development and sexuality, most literacy programmes and research into child online protection are not gender-specific.
Example: Ranking girls for alleged sexual promiscuity in Sao Paulo, Brazil[16]

In Brazil, a practice called ‘Top 10’, which ranks teenage girls between the ages of 12 and 15, has led to school dropouts and suicides in at least two peripheral neighbourhoods of Sao Paulo. Profile pictures of the girls are mixed with phrases about the girls’ alleged sexual behaviour, and the girls are then ranked according to ‘how whore they are’. InternetLab believes the practice to be quite widespread, at least in Brazil.

Example: Filming and blackmailing a girl for sex and money in Pakistan:[17]

A 16-year-old girl in Pakistan was filmed having sex with an older man and was repeatedly blackmailed for sex thereafter. Her family were also subsequently blackmailed for money, and her father said the incidents had happened, bringing ‘shame and dishonour’ to the family, because the girl had been granted the ‘freedom’ to attend school.

Example: Sharing videos of a girl without her consent in the Philippines:[18]

Videos of what purports to be a well-known 12-year-old female actor allegedly masturbating in her room were shared online in June 2015. While the identity of the person(s) who uploaded the videos remains unknown, the videos were shared repeatedly on social media platforms. No action appears to have been taken against the perpetrator(s).
ii) Women in rural contexts

Women in rural contexts face multiple challenges in accessing the Internet. These include the availability and affordability of infrastructure and, importantly, gendered norms regarding who is prioritised for accessing and using technology, as well as existing gender disparities in income and literacy.

As a result, digital divides tend to affect women more than men, and women in rural areas even more so and in different ways (also see the section on challenges related to increasing connectivity and access for women). Further, women in rural contexts may also be subjected to greater social and cultural surveillance, which can result in far greater impact and harm in incidences of online VAWG. When compounded with the existing gap in access to and control over technology, this also significantly impacts their capacity to take action and access redress (see section iii) below).
Example from Pakistan:[19]

In a remote Pakistani village, women who had been filmed with a mobile phone dancing and singing together with men at a wedding ceremony were reportedly sentenced to death by a tribal assembly. In this area, strict gender segregation beliefs do not permit women and men to be seen socialising together. The video was disseminated without their knowledge or consent, and had far-reaching consequences by transmitting a private moment into a more public space.

Example from Mexico:[20]

In a small village in Mexico, an active parishioner and teacher was accused, on a Facebook page dedicated to gossip in the community, of cheating on her husband and of their children being fathered by other men. The accusations damaged her reputation in the community, made some parents unwilling to trust her as a teacher, and also led to her being abused by her husband.
iii) Religion, culture and morality

Women often disproportionately bear the burden of upholding the religious, cultural and moral values of a particular society. As such, they can face additional risk of attack for being perceived to violate a particular religious, cultural or moral norm. This is especially true in relation to issues of bodily autonomy and sexuality. For example, organisations working on the right to abortion face frequent attacks, which extend to the digital sphere, as noted by the UN General Assembly Resolution on Protecting Women Human Rights Defenders.[21]

Religious, cultural or moral norms can also be used as methods to attack and threaten women online. In some contexts, this can put women at particular risk of physical violence, where the line between online threats and the likelihood of offline occurrence is thin. Access to justice can also be challenging when the state or law enforcement prioritises the prosecution of offences against religion, culture and morality over online VAWG.
Example from Pakistan:[22]

Bayhaya developed a campaign as part of her work as a human rights activist in Pakistan. Following the launch of the campaign she, along with her female colleagues, received serious online threats and abuse. Although she closed her social media accounts, her personal data (including pictures) were stolen and used for posters that accused her of ‘blasphemy’ and of insulting the Quran and the Prophet Muhammed.

Example from India:[23]

Sonali Ranade is a trader who tweets[24] about a range of issues, from market trends to gender. She faced vicious attacks on Twitter after one of her posts called for the Chief Minister of Gujarat (a state in western India) to make amends for the way in which he handled riots in the state. Analysis of the online attacks traced them back to an organised effort by the religious Hindu right wing.
Example from Latin America:[25]

The Latin America and Caribbean Women’s Health Network faced systematic hacking of their website immediately following the launch of several campaign activities in September 2013 to decriminalise abortion in the region. This was seen as a serious extension of the harassment and intimidation of women’s human rights defenders who worked on the high-risk issue of promoting women’s sexual and reproductive health and rights.

Example from Malaysia:[26]

A radio journalist received numerous threats of violence, rape and murder on social media after presenting a satirical video that questioned the opposition state government’s intention to push for Islamic criminal law, or hudud, in Malaysia. The video was removed, and the journalist was probed for ‘blasphemy’.
Example from the US:[27]

The USA’s largest reproductive health care provider, Planned Parenthood, faced systematic digital attacks in July 2015 by self-professed anti-abortion hackers. The attacks included a breach of their encrypted employee database, with the stated intention of releasing personally identifiable information about abortion service providers, and the disabling of their websites through distributed denial of service (DDoS) attacks. Set within a context where abortion is already stigmatised and morally politicised, the physical threat to safety was underlined by the risk of personal information being publicised.
iv) Women of diverse sexualities and gender identities

For lesbian, bisexual, transgender and queer women who face existing discrimination, stigma and in some contexts, criminalisation and serious threats to their personal safety, the Internet can be an important space for the exercise of their rights. They are able to gain access to critical information that is otherwise restricted or censored, to form communities in relative safety, and to organise for the advancement of their interests and human rights.

Despite this positive potential, studies show that LGBT individuals and advocates face more threats and intimidation online.[28] Stonewall, a UK-based organisation, has reported that 23% of LGBT students reported experiences of cyberbullying, with 5% of LGBT adults reporting that they had been the target of homophobic insults in 2014.[29] A global monitoring survey conducted by APC in 2013 found that 51% of sexual rights advocates had received violent messages, threats or comments while working online, while 34% mentioned that they faced intimidation online. In the same study, 45% of respondents indicated serious concerns that their private information online can be accessed without their knowledge or consent.[30]

Online anonymity is particularly important in this context. As noted in a May 2015 report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression:[31]

‘Encryption and anonymity […] provide individuals with a means to protect their privacy […], and enabling […] those persecuted because of their sexual orientation or gender identity […] to exercise the rights to freedom of opinion and expression.’

Threats to privacy and disclosure of personal information can subject women of diverse sexualities and gender identities to significant threats and attacks, both online and offline. At the same time, perpetrators often anonymise themselves online in their attacks. This presents a challenging context for addressing the issue.

For example, Facebook’s real name policy, which requires all users to use their real name when using the platform in an attempt to ‘keep our community safe’,[32] has reportedly indirectly led to a new form of abuse for transgender people online. As reported by a transgender user in a media report,[33] the policy has led to transgender users’ profiles being reported in what is viewed as a form of bullying or harassment, often motivated by sexual aggression or homophobia.
Example from Cameroon:[34]

Activists working on LGBT issues often face threats of violence in Cameroon. A primary method is intimidation via SMS or Facebook messages, often sent anonymously. Threats also extend beyond the advocates themselves to lawyers and family members. While advocates of all gender and sexual identities are at risk, women face the additional threat of sexualised attacks, including sexual assault. The online threats often escalate to physical violence.

v) Women with disabilities

While this section was noted as being of particular importance, the BPF has yet to receive information that can help to illustrate the specific risks that women with disabilities face online. The area is sufficiently critical to be flagged as a clear gap where more research and analysis are needed.
vi) “Public” women and women in technology fields

Women who are prominent in online or offline environments tend to be subjected to more abuse when they interact or express opinions online. Such cases of abuse often also attract more media attention (and awareness) than the cases of ‘ordinary’ women. Examples of such prominent women include human rights defenders, women journalists (including citizen journalists and bloggers), and women who are active in technology industries.

Example from Pakistan:[35]

Bayhaya developed a campaign as part of her work as a human rights activist in Pakistan. Following the launch of the campaign she, along with her female colleagues, received serious online threats and abuse. Although she closed her social media accounts, her personal data (including pictures) were stolen and used for posters that accused her of ‘blasphemy’ and of insulting the Quran and the Prophet Muhammed.

Example from Britain:[36]

Caroline Criado-Perez took part in a successful campaign to retain a female face (that of Jane Austen) on one of the Bank of England’s banknotes. As a result she suffered severe online violence, including rape threats and other abuse. Her supporters – including a prominent politician and female journalists – faced similar abuse online.
Example from India:[37]

Sagarika Ghose and her husband, Rajdeep Sardesai, are both well-known journalists and active Twitter users in India. While both receive frequent attacks on Twitter for their views, the form these attacks take is notably gendered. The vitriol faced by Sagarika is often sexually violent in nature. One example: “Bitch, you deserve to be stripped and raped publicly.”

Example from the US:[38]

In August 2014, a series of coordinated and escalating incidents of harassment, which included doxing, threats of violence, and rape and death threats, were primarily targeted at prominent and vocal feminists in the field of gaming, including Zoë Quinn, Brianna Wu, and Anita Sarkeesian. The harassment campaign later became synonymous with the name Gamergate (or #Gamergate), among other things raising awareness of sexism in the gaming industry.
¶ 108 C. UNDERLYING FACTORS AND ENABLING ENVIRONMENTS
¶ 109 As also explored in the preceding sections, various factors – including cultural norms, socioeconomic status, the ordinary level of violence in the community concerned, and the rate of Internet adoption and accessibility – play a role in creating environments that enable online VAW to persist. They can also have a compounding effect on the impact of online VAW, as well as on the allocation and effectiveness of resources to ensure that women gain access to justice and redress when they face violence online. Some of the key enabling factors discussed during the BPF are outlined below.
¶ 110 a) (A lack of) awareness, education and digital literacy
¶ 111 A lack of awareness and recognition of online VAWG as abuse and/or violence was the factor that most survey respondents thought contributed to the incidence of online VAWG. Other related factors include a similar lack of awareness regarding available remedies, and a lack of digital literacy and of knowledge about how to protect oneself from harm and be safe online.
¶ 112 In addition to a lack of awareness and low levels of digital literacy, there is also a tendency to trivialise or normalise online VAWG, particularly on social media platforms. This apparent trend, which is arguably related to a lack of awareness of the effects of online VAWG, is particularly harmful as it contributes to gender inequalities and a culture that may be increasingly hostile to female Internet users. An example includes the sharing of pictures of injured women with captions like ‘next time don’t get pregnant’ on Facebook.[39]
¶ 113 b) Gender inequalities
¶ 114 The way inequality and sexism in offline environments – including gender norms, expectations and/or stereotypes – are reflected and amplified in online spaces is also an important enabling factor for online abuse (for example, stereotypes like ‘women are not good in tech’ and ‘girls cannot be gamers’).
¶ 115 Example from Afghanistan:[40]
¶ 116 A BBC study shows how Afghan women face widespread online abuse including the unlawful distribution of naked pictures and the creation of fake accounts, particularly on Facebook. As a BPF contributor explains:
¶ 117 ‘These stories are examples of what actually happens with women online in Afghanistan. Men usually send friend-requests or inappropriate text messages to women they don’t know. They are harassed when they post comments or publish their pictures. Their pictures are stolen and fake accounts are created to defame them or destroy their reputation in the society. Naked pictures or other forms of sexual material are transmitted to them which forces women to use aliases and take down their pictures or they shut down their social media accounts.’
¶ 118 Discrimination against women in education in general, and gender gaps specifically in the academic disciplines of science, technology, engineering and mathematics (STEM), mean that fewer women and girls are able to participate in fields relevant to the Internet and Internet governance.
¶ 119 Women’s unequal participation as decision makers in the development of technology platforms and policies is also an important factor, leading to the frequent neglect of issues that are of particular relevance and importance to women. Technology is sometimes viewed as ‘masculine’, and the technology industry employs significantly more men than women. Policy discussions around Internet governance also tend to be dominated by male participants, with frequent examples of technology-related harassment occurring during and after women’s participation in Internet governance events.
¶ 120 Furthermore, multiple and overlapping forms of discrimination against women based on their race, ethnicity, caste, class, sexuality, disability, migrant and/or refugee status also permeate online environments, as can be seen from the sections above discussing the impact of online VAW in specific contexts.
¶ 121 c) Social norms/ cultures of violence and patriarchy
¶ 122 Closely related to the aforementioned factor of gender inequality, the existence of certain social norms and expectations in many societies leads to a reluctance to report abuse. Not only are victims often blamed for the abuse that they experience online, but they also feel that perpetrators will not be held accountable and that online actions are seemingly immune from the rule of law.
¶ 123 Not only do victims often face social stigma and reputational risks in reporting online abuse, but technology-related abuse is still trivialised in many societies. There appears to be a related lack of recognition for the psychological and other harms of online VAWG, with victims having limited or no recourse to relevant support systems.
¶ 124 d) Legal and political context
¶ 125 For victims, a lack of support frequently extends to the legal and political environments they find themselves in. Authorities, including police officers, are sometimes unsympathetic, tend to lack relevant training in how to address online VAWG, and often do not have the necessary equipment for finding evidence of online VAWG. When victims do manage to have an incident of online VAWG reported and investigated by law enforcement officials, they face further difficulties in terms of the abilities and technological knowledge of moderators and/or the judiciary (including court systems, magistrates, judges, and other officers of law).
¶ 126 For example, some (particularly older) judges faced with deciding a case about defamatory and abusive posts on a Facebook wall might struggle to understand the potential impact of such a form of abuse, and similarly, in making rulings and issuing judgments, tend to neglect the realities of how online posts spread on the Internet. The pace at which many cases can be investigated and heard, and the costs of judicial proceedings, are furthermore prohibitive for many victims.
¶ 127 In addition to potentially reluctant and apathetic law enforcement and judicial systems, legal and political environments in many countries make it even more difficult for victims to institute complaints and cases. There is also a reluctance in some countries to extend existing definitions of violence and abuse (and the availability of related legal remedies) to cover online VAWG; and where new forms of abuse develop, there is a further lack of political will in some countries to enact laws to deal with them. In many countries there is an urgent need to review existing legislation and policies to determine how relevant they are for current realities in the context of online VAWG, while keeping in mind the need for flexibility to account for the pace of technological change.
¶ 128 In comparison to other forms of abuse, online VAWG is also often trivialised. Offline VAWG, such as domestic abuse, is difficult enough for authorities to investigate and prosecute. Because the consequences of online VAWG are not as easy to detect as those of offline violence, authorities tend to prioritise forms of offline VAWG with visible ‘effects’ on women and girls.
¶ 129 When cases are effectively brought before and adjudicated by courts and tribunals, existing legal remedies for criminal abuse/violence are often unsuitable for and ineffective in an online context, as they fail to account for and adequately deal with the pace of technological change and the ways in which, for example, content is shared and distributed online. The inadequacy of the mechanisms available on online platforms for responding effectively to cases of online VAWG makes it even more difficult for victims. For these reasons, women often feel that there are few or no consequences for online crime and that perpetrators enjoy impunity. There is therefore a need to review existing remedies and, importantly, to talk to the women and victims involved to determine how suitable and useful existing remedies were and are to them.
¶ 130 Lastly, the cross-jurisdictional nature of the Internet means that authorities, including law enforcement agencies and even Internet intermediaries like telecommunication companies, can find it difficult to investigate and pursue cases of online VAW.
¶ 131 Example from Britain:[41]
When a Muslim woman left her forced marriage, her ex-husband (who is not based in the UK) started setting up fake profiles for her on social media. He not only alleged that she was a prostitute, but also offered her ‘services’ and shared her personal contact details. She was disowned by her family and received requests for her ‘services’. While the police are involved, little support has been offered because the woman’s ex-husband (the perpetrator) is outside the UK.
¶ 132 e) Access to and ability to use ICTs
¶ 133 Digital divides often impact women in particular, with gender disparities in both access to the Internet and the skills of Internet users. This may be an additional factor that contributes to the creation of enabling environments for abuse. Existing gender inequalities in the field of ICT (discussed in paragraphs 112 to 119 above), as well as in economic, social and political dimensions, are related factors that affect whether women can participate online and what behaviour women experience online.
¶ 134 f) Histories and environments of abuse
¶ 135 Offline violence penetrates and reverberates in online environments (as mentioned in paragraph 5 above). Conflict within an intimate partner relationship or a woman’s immediate circle of family members, friends and/or colleagues can contribute to the incidence of online violence; as well as emotional trauma like separation, divorce, and/or a history of rape, sexual or domestic abuse.
¶ 136 g) Prominence
¶ 137 Women who are prominent and find themselves in public spaces tend to face more abuse (see paragraphs 98 to 106 above). Women also often face threats and violence after political articulation or participation, and, similarly, women who are famous (e.g. actors) also tend to experience more online violence or abuse than women who are not as well-known.
¶ 138 D. SPECIFIC CHALLENGES RELATING TO ACCESS AND ONLINE VAWG[42]
¶ 139 Technological advancement in connectivity has expanded broadband access and mobile penetration in recent years – also for women. But a gender digital gap still persists and is expressed in multiple dimensions. These include unequal access to basic Internet infrastructure; the affordability of connectivity costs and devices; gender disparity in education opportunities, including digital literacy; uneven capacity to use the Internet for women’s own needs and priorities; specific gender-based challenges and barriers, including the availability of relevant content and the censorship of online content related to gender and sexuality; and gender-based harassment and violence, both in physical spaces for accessing the Internet (such as public access points like cybercafes) and in online environments (including online harassment and cyberstalking).
¶ 140 While there are various current processes aimed at improving connectivity – like the IGF’s intersessional activity on the theme,[43] the ongoing review of the World Summit on the Information Society (WSIS+10), and the discussion of the post-2015 Sustainable Development Goals (SDGs) – it is crucial that these digital divides are addressed. Addressing gender and access to the Internet requires an approach that is located within economic, social, political and cultural contexts. It is both short-sighted and inadequate to respond to this issue by looking only at infrastructure or economic issues without examining the interplay of the various other factors that act as pre-conditions for, and influences on, the extent to which women and girls are able to access and use the Internet freely, safely and equally in the full exercise of their rights. This includes taking into consideration the impact of online VAWG as a barrier to access, as well as the creation of enabling environments for the protection of women’s rights online in tandem with efforts to connect women to the Internet.
¶ 141 As with other broad public policy issues in Internet governance,[44] efforts to combat and address online VAWG often emanate from the developed world and also tend to reflect conditions, cultural perceptions and expectations in developed countries. On the other hand, addressing online VAWG is generally less of a priority in developing countries with lower levels of Internet penetration and/or access; where there may also be a lack of infrastructure, will and/or capacity to monitor, address and prevent online and offline VAWG. A way forward is to extend current definitions of VAWG to include online violence and abuse, which can then translate into the inclusion of online VAWG into the application and reform of existing anti-VAWG laws, the allocation of resources, and the development of policies and programmes.
¶ 142 While much can be learnt from the experiences of developed countries with high levels of Internet access and use, preconceptions of what constitutes ‘best’ practices to counter online VAWG cannot simply be extrapolated from developed countries for implementation in developing countries. As mentioned and investigated in some of the preceding sections, different contexts have a significant impact on the nature of online abuse and mechanisms used to address online abuse. There is thus a need for further research to study differences in context and to tailor programmes and mechanisms to specific local contexts.
¶ 143 Many social media platforms used globally are located in developed countries, specifically in the US, but have diverse regional, national and local impact. As such, greater attention needs to be paid by Internet intermediaries to thinking through responses that are also responsive to experiences in developing contexts (see paragraphs 180 to 192 in respect of intermediaries and private sector actors below).
¶ 144 Online VAWG also constantly evolves and may become increasingly sophisticated. It was interesting to note, for example, that many survey respondents saw the denial of women’s access to the Internet or to certain online services as a worrying tendency – one particularly relevant as and when governments work to connect the unconnected. Women may, for instance, be denied access to information and crucial support (including when governments in both developed and developing countries install filters to ostensibly address online pornography, thereby also filtering women’s access to information about female health, reproduction and other issues).
¶ 145 Similarly, controversial initiatives like Facebook’s Internet.org may deny women the developmental benefits that the Internet can offer. For example, research[45] into gendered differences in the use of social media tools like Facebook and WhatsApp in Pakistan indicated that women in Pakistan are much more likely to adopt WhatsApp, while men favour Facebook. The research seems to suggest that technologies can also reinforce and maintain existing social norms, with women preferring WhatsApp as a tool to reinforce and strengthen relations with family and friends while retaining their privacy, while men can be more ‘public’ on Facebook.
¶ 146 E. SOLUTIONS, RESPONSES AND/OR STRATEGIES TO COUNTER ONLINE VAWG
¶ 147 Violence against women, whether perpetrated online or offline, is difficult to address because of the attitudes, stereotypes and beliefs that underpin violence. In an online context, deciding upon appropriate measures to protect women is complicated because such measures need to be taken within the global context of the Internet with the cooperation of a multitude of stakeholders.
¶ 148 Efforts to develop, encourage and implement practices to counter online VAWG vary significantly around the world. The factors that contribute to creating environments that enable online VAWG – including existing gender inequalities, education systems, digital literacy levels, the importance attached to encouraging gender equality, social and cultural norms in the country concerned, legal and political environments, and Internet adoption rates – also determine whether stakeholders will allocate resources to protect women online.
¶ 149 Tensions around competing rights have often been raised in discussions on addressing online VAWG, particularly regarding measures that involve the takedown of content, which raises issues of freedom of expression. Women’s rights advocates have responded by pointing out that online VAWG in effect curtails women’s right to freedom of expression by creating a hostile and unsafe online environment that can result in women withdrawing from online spaces. Similarly, while anonymity and the protection of privacy are often described as vital for the exercise of freedom of expression online, these rights also enable online VAWG by hiding the identities of perpetrators (as explained in paragraphs 90-93 above). Measures that protect women online therefore need to consider, include and balance multiple rights – including the rights to safety, mobility and participation in public life, freedom of expression, and privacy – and to take into account existing inequalities and discrimination, which may affect how rights are protected and recognised. Such balancing exercises need to consider the importance, nature and extent of any proposed limitation and should opt for the least restrictive means of achieving the purpose.
¶ 150 Due to the relatively recent recognition of this issue, the list of existing measures provided below is not exhaustive but is intended to form part of an increasing effort to document and collect emerging approaches. The aim is not to list all approaches but rather to provide snapshots of approaches as submitted by BPF participants, with the aim of potentially highlighting trends. For this purpose, stakeholder responses were divided into government and public sector responses; multistakeholder and intergovernmental approaches; private sector responses; and community/user-led approaches.
¶ 151 a) Public sector initiatives
¶ 152 While the scope of this BPF did not allow a detailed analysis of public sector approaches to the issue, various participants did submit examples useful to this work. Noticeable general trends include that some countries have amended existing legislation to ensure its applicability to online environments, whilst others have enacted new legislation not only to achieve the aforementioned, but also to criminalise specific acts online and to ensure that intermediaries cooperate with authorities. Some examples of legislative initiatives in the public sector include:[46]
¶ 153 Examples:
¶ 154 In New Zealand, the Harmful Digital Communications Act[47] was passed in July 2015. Initially introduced with the aim of addressing cyberbullying alone, the scope of the Act has since broadened significantly (and controversially) to target all online content that might be harmful. The Act not only amends existing legislation to ensure its applicability to online spheres, but also creates an agency to which victims can turn when they face online abuse; a set of court orders that can be served against Internet hosts and authors upon referral by the aforementioned agency; new civil and criminal offences; and a 48-hour content takedown process whereby individuals can demand that online hosting providers remove content they believe is harmful.
¶ 155 In the Philippines, while there are various legislative instruments that indirectly and directly protect women, policies and laws relating to ICTs have only recently been put in place. These include the controversial Anti Child Pornography Act, which has been criticised for potentially eroding Internet rights; the Anti-Photo and Video Voyeurism Act; and the Cybercrime Prevention Act (RA 10175). The latter includes a definition of ‘cybersex’ that was particularly criticised by women’s rights groups and advocates for (among other reasons) its vagueness and broad scope, and for neglecting the underlying causes of VAWG.[48]
¶ 156 In Estonia, a Strategy for Preventing Violence was approved in February 2015 and is currently being implemented by the Ministry of Justice.[49] Although online VAWG is not a separate topic in this strategy, measures to prevent cyberbullying, online sexual offences against children, and other forms of online abuse are reportedly being planned. Estonian courts enable victims of online abuse to apply for restraining orders in both civil and criminal proceedings; and the country is in the process of amending and adopting provisions relating to hate speech and criminalising stalking.
¶ 157 While Nepal does not have any legislative provisions dealing directly with online VAWG, the Electronic Transaction Act[50] does deal with cybercrime in general. The Act provides that publishing illegal material online – specifically material prohibited by other legislation, material ‘contrary to the public morality or decent behavior’, or material that will ‘spread hate or jealousy against anyone or which may jeopardize the harmonious relations subsisting among the peoples of various castes, tribes and communities’ – may result in a fine or other punishment.[51]
¶ 158 In the United Kingdom, the government helped launch a website, Stop Online Abuse, in June 2015 to provide information and legal and practical advice to victims of online harassment, revenge porn, hate speech, sexual harassment and blackmail. The site is aimed at women and lesbian, gay, bisexual and transgender people, after research found that these groups were most affected by extreme cases of online abuse.
¶ 159 In South Africa, the Protection from Harassment Act[52] came into force on 27 April 2013; enabling individuals subject to online or offline harassment to apply to a competent court for a protection order lasting up to five years. The Act also contains provisions requiring electronic communications service providers to assist courts in identifying perpetrators responsible for harassment; and creates the offences of contravention of protection orders and failure of an electronic communications service provider to furnish required information.
¶ 160 The Cyber-safety Act[53] of Nova Scotia (Canada) came into force in August 2013; enabling individuals subjected to cyberbullying (or, in the case of minors, their parents) to apply to a judicial officer for a protection order against an individual. The Act also contains provisions requiring electronic communications service providers to assist courts in identifying individuals responsible for cyberbullying, and creates the tort[54] of cyberbullying, which enables individuals to sue others for damages arising out of cyberbullying.
¶ 161 In the state of California (USA), the controversial SB 255 Electronic Communication Devices: Prohibited Distribution of Personal Information Act[55] came into effect in October 2013 and creates a new misdemeanour of disorderly conduct by way of distribution of intimate photographs with the intent to cause serious emotional distress. The Act is narrowly worded and focuses on instances in which the person who takes or makes the intimate image, distributes it with the intent to cause, and the effect of causing, serious emotional distress to a victim.
¶ 162 Challenges and lessons learnt:
¶ 163 Public sector initiatives need to acknowledge and recognise that although online VAWG might not cause actual physical harm in all instances, it can still cause significant emotional and psychological harm, as well as impact issues such as mobility, employment and public participation (see Section B above for a thorough exploration of consequences). These harms are equally important to address and prevent.
¶ 164 As a key priority, public sector initiatives need to address the underlying causes that contribute to and enable online VAWG (as discussed in Section C above) – specifically existing gender inequalities. Without addressing the root problems, public sector initiatives tend to adopt merely reactive stances to incidents of online VAWG. Citizen support is also easier to facilitate when the public sector invests in public education to address these underlying causes. The UK government’s anti-trolling website (see paragraph 158 above) can arguably be seen as one such way forward. There is also a related need to invest in research and statistics (and proper reporting guidelines) to be able to study the incidence of online VAWG.
¶ 165 In terms of responses, governments and the public sector tend to favour legislative instruments, which often take a long time to be developed and adopted. While these due processes generally allow for beneficial public consultation (when legislative instruments are not rushed), the pace at which Internet platforms develop – including the ways in which online abuse of women also evolves and changes – often reduces the efficacy of such legislative responses by the time they are actually adopted.
¶ 166 Some countries also tend to utilise and amend existing legal frameworks rather than creating new laws specifically for new technologies (e.g. South Africa, paragraph 159 above). Not only does the adequacy of this approach for providing redress need to be investigated, but the public sector also needs to consider flexible and potentially informal measures for responding to online VAWG, although such measures need to be transparent at all times. An example is the UK government’s adoption of filters to address the distribution and viewing of online child abuse images. While probably well-intentioned, these filters were designed in a way that over-filters content (including sex education websites), and users struggle to find information about why certain sites are inaccessible.[56]
¶ 167 Governments should also ensure that they facilitate and simplify access to justice for survivors[57] whilst prioritising redress and relief over criminalisation. Where possible, governments should consider options besides traditional courts and tribunals. The creation of specialised fast-track courts or specialised agencies to investigate complaints can, for instance, help to provide simple, quicker and more cost-effective forms of recourse to victims (especially in comparison to ordinary courts). Where possible, such agencies should be able to accept third party complaints and should be able to act both reactively, in response to specific complaints, and proactively, in response to potential trends and/or cases of online VAWG. In Estonia, for instance, specialised police officers[58] give advice regarding online crimes and also refer cases to police stations when necessary, while in Canada a cyberbullying investigative unit[59] provides individuals with an easy complaints process.
¶ 168 While the effectiveness of protection orders in the context of online VAWG remains to be seen, these orders are already used in many countries to address domestic violence by providing a practical means of halting violence without requiring victims to become embroiled in lengthy and demanding criminal processes.
¶ 169 Before introducing legislative instruments or amendments, the public sector needs to consult citizens and victims to first determine their needs. When legislative instruments are required, proper consultation processes (including opportunities for public comment on legislative proposals) need to be followed to ensure citizen support. The public sector should not merely respond reactively to high-profile cases of abuse by rushing through legislative instruments with limited consultation (as appears to have happened in some jurisdictions).
¶ 170 Lastly, the public sector needs to explore its legal relationship with Internet intermediaries and the level of obligations it can realistically impose on them. Some countries have already passed or amended legislation to compel electronic service providers to provide information to courts in certain instances of abuse (e.g. Nova Scotia, paragraph 160, and South Africa, paragraph 159 above). While some governments might consider using their licensing prerogative to impose stricter content regulation obligations on intermediaries in order to protect women online (e.g. the Broadband Commission recommendation, paragraph 177 below),[60] a ‘one-size-fits-all’ approach is arguably not workable. While intermediaries may have certain obligations to help prevent and rectify online VAWG, any duties imposed upon them arguably need to be both flexible, to account for technological change, and workable, to account for the nature and speed of content distribution. And while the responsibility for educating users and improving digital literacy levels arguably lies primarily with the public sector, governments should consider cooperating with intermediaries to ensure that education also continues on the platforms. When new users join a social media platform, for instance, they could be required to first complete an online training programme that interactively teaches them about user rights and duties, how to respect other users, what kind of behaviour will not be tolerated, and how to easily report abuse.
¶ 171 Leave a comment on paragraph 171 0 Tensions that arise on questions of competing rights and interests, as described above (see paragraphs 90-93 and 149 above), also need to be kept in mind. The fact that online VAWG itself impedes women’s right to freedom of expression by creating environments in which they do not feel safe to express themselves (see Section B above for consequences) is unfortunately often neglected in debates about this conflict between freedom of expression and protecting women online. Taking steps to protect women online, although such steps might impact freedom of expression, also indirectly protects women’s ability to benefit from the freedom to express themselves online.
¶ 172 Leave a comment on paragraph 172 0 New legislation has sometimes differentiated between online and offline communication and expression with the good intention of protecting users online, but in ways that introduce the potential for damaging freedom of expression (e.g. New Zealand, paragraph 154 above). It might be useful for legislative instruments to recognise the unique nature of digital communication and the harm attributed to online speech in light of the speed of proliferation, the inability to permanently erase content, and the anonymity of some online content. If legislation grants courts too much discretion in interpretation, however, it might be applied in ways that limit free expression and undermine the free flow of information.
¶ 173 Leave a comment on paragraph 173 0 In cases where private information or content like photographs and videos is distributed without consent, laws related to online defamation, voyeurism and the wilful exposure of private and/or intimate material are sometimes used. However, in most jurisdictions, an accused can raise the defence that the content was true and in the public interest. This defence can expose the victim to further emotional trauma in the process of establishing ‘truth’.
¶ 174 Leave a comment on paragraph 174 0 b) Multistakeholder and intergovernmental initiatives
¶ 175 Leave a comment on paragraph 175 0 Due to increasing recognition of the importance of developing practices to counter online VAWG, various intergovernmental agencies and multistakeholder initiatives have begun to address the issue.
¶ 176 Leave a comment on paragraph 176 0 The Council of Europe’s Convention on Preventing and Combating Violence against Women and Domestic Violence,[61] which entered into force in August 2014, places particular emphasis on the role of the ICT sector and the media in preventing violence targeted at women. Media organizations are encouraged to introduce self-regulatory mechanisms, internal codes of conduct/ethics and internal supervision measures to promote gender equality, to combat gender stereotypes, to avoid sexist advertising, language and content, and to refrain from using degrading images of women that associate violence with sex. State parties are furthermore encouraged to cooperate with the private sector to equip children, parents and educators with the necessary skills for dealing with ICT environments that provide access to degrading content of a sexual or violent nature.[62] Monitoring mechanisms are currently being put into place and the first evaluations are expected in 2016.[63]
¶ 177 Leave a comment on paragraph 177 1 The Broadband Commission for Sustainable Development, an initiative steered by UNESCO and the ITU, was established in May 2010 with the aim of boosting the importance of broadband on the international policy agenda. In September 2015 a report by its Working Group on Broadband and Gender (co-chaired by UNDP and UN Women) was released.[64] The report laments the volume of online VAWG and its social and economic implications for women online, and also highlights the failure of law enforcement agencies and courts to take ‘appropriate’ action to counter online VAWG. It argues that whilst legal and social approaches at the national level are challenging because of the global nature of the Internet, ‘rigorous oversight and enforcement of rules banning cyber VAWG’ is ‘an essential foundation stone’ for a safe Internet. It proposes a three-pronged approach of sensitization, safeguards and sanctions to address online VAWG, along with a somewhat controversial recommendation for political and governmental bodies to ‘use their licensing prerogative’ to force ‘Telecoms and search engines’ to ‘supervise content and its dissemination’.
¶ 178 Leave a comment on paragraph 178 0 Challenges and lessons learnt:
¶ 179 Leave a comment on paragraph 179 0 The novelty of these initiatives signals an increasing recognition of the importance of engaging and identifying the roles that different stakeholders can play in understanding and responding to this issue at regional and global levels. This BPF is one such measure, demonstrating the commitment of the IGF community, as a multistakeholder platform, to facilitating such policy discussions. Further dialogue, monitoring and assessment of these initiatives will provide important lessons for initiatives that emerge in future.
¶ 180 Leave a comment on paragraph 180 0 c) Private sector approaches
¶ 181 Leave a comment on paragraph 181 0 Some examples of private sector initiatives aimed at addressing online VAWG include:
¶ 182 Leave a comment on paragraph 182 2 Twitter’s abusive behavior policy is aimed at evaluating and addressing potentially abusive behaviour on the platform when it violates the Twitter Rules and Terms of Service. The policy covers (direct or indirect) violent threats; abuse and harassment; the promotion of self-harm; the publication of another user’s private information; and the impersonation of others with the aim of misleading, confusing or deceiving. Twitter takes a lighter-touch approach to merely offensive content, which is tolerated as long as it does not violate the Twitter Rules and Terms of Service. It explains:
¶ 183 Leave a comment on paragraph 183 0 “Twitter provides a global communication platform which encompasses a variety of users with different voices, ideas and perspectives. Because of this diversity, you may encounter content you consider to be inflammatory or inappropriate that is not considered a violation of our rules.”
¶ 184 Leave a comment on paragraph 184 1 Twitter enables users to report violations and accepts complaints or reports both from individuals who experience abuse and from third parties. Twitter has also changed its policy from making tweets searchable for only 30 days to making all tweets since Twitter’s creation available and searchable on its website, which is helpful for collecting evidence in past cases of online harassment.
¶ 185 Leave a comment on paragraph 185 0
Facebook’s Community Standards were developed with the aim of helping users feel safer online. In terms of these standards, Facebook’s content reviewers can remove content, disable accounts and work with law enforcement agencies when Facebook perceives a ‘genuine risk of physical harm or direct threats to public safety’. Facebook also allows personal and third-party reports/complaints, and specifically mentions that when governments request the removal of content that violates local laws but not Facebook’s Community Standards, it may make such content unavailable in the relevant country or territory. Similar to Twitter, Facebook also stresses that ‘because of the diversity of our global community, please bear in mind that something that may be disagreeable or disturbing to you may not violate our Community Standards’.
¶ 186 Leave a comment on paragraph 186 1 Challenges and lessons learnt:
¶ 187 Leave a comment on paragraph 187 0 Company policies on anonymity and real names may contribute to the manifestation of online VAWG. For example, in one case a survivor of domestic violence managed to avoid her ex-husband for 20 years until Facebook’s “real-name policy” allowed her abuser to track her down. While Facebook’s real-name policy might protect women from online abuse (as anonymity might mask perpetrators), it may also expose some women to new forms of abuse (e.g. transgender women, paragraph 93 above). In addition, proxy accounts can circumvent these requirements by sending automated harassment messages without it being possible to determine who the ‘real’ account holder is.
¶ 188 Leave a comment on paragraph 188 0 In formulating content-regulation and privacy policies, intermediaries often fail to consider relevant social and other contexts, particularly in regard to VAWG. Many policies reflect limited engagement with the perspectives of women outside North America or Europe, and reporting mechanisms and policies tend to be in English, which makes it difficult for non-English speakers to access and benefit from them. Urdu, for instance, is often written online in a ‘Roman’ format – i.e. Urdu speakers use the English alphabet to spell out Urdu words phonetically – making it difficult for translation software to decipher the real meaning. If a user wants to complain about something written in ‘Roman’ Urdu on Facebook, for example, Facebook would need an actual person to translate the post in order to determine its real meaning. In effect, this situation leads to harassment not being recognised as harassment unless it is written in the actual Urdu alphabet, which is easier for Facebook to translate.[65]
¶ 189 Leave a comment on paragraph 189 0 While many intermediaries do have terms of service and other user guides of conduct, some of these display a reluctance to make public commitments to human rights standards – something that might be understandable considering the diversity of ‘beliefs’ in the many countries in which they hope to operate. For this reason, however, many terms of service tend to become mere reflections of minimum legal obligations (e.g. regarding copyright infringement and child exploitation). Terms of service also often lack definitions of what conduct actually amounts to unlawful and abusive behaviour, particularly forms of technology-related VAWG, and tend to define poorly the remedies available to victims.
¶ 190 Leave a comment on paragraph 190 0 Some intermediaries do not have formal record-keeping systems and clear communication guidelines, and also lack the ability or will to remove content across the system at its source (e.g. when rape photos and videos are uploaded and spread virally, metadata should enable companies to track and locate the content across their entire platforms). Intermediaries also sometimes display a lack of transparency around reporting and redress processes. In many of the case studies submitted to the BPF, for example, women reported that when they submitted complaints or concerns to an intermediary, they received either an automated response or no response at all.
¶ 191 Leave a comment on paragraph 191 0 There are also challenges regarding the adequacy of the remedies offered by intermediaries. Multiple-strike policies are generally limited to copyright concerns and do not extend to other wrongs like online VAWG (e.g. reserving the right to terminate accounts specifically on the basis of repeated gender-based harassment, hate and abuse), and the effectiveness of actual take-down procedures at certain sites remains unknown despite the likelihood of such platforms being used to distribute videos and photos taken without the consent of the people featured.
¶ 192 Leave a comment on paragraph 192 0 Lastly, intermediaries do not seem to invest enough in the promotion of user and staff literacy on the issue of online VAWG. Users are often unaware of their rights and responsibilities in online contexts, and intermediary staff similarly do not receive appropriate training to adequately address and deal with online VAWG.
¶ 193 Leave a comment on paragraph 193 0 d) Community-led initiatives
¶ 194 Leave a comment on paragraph 194 0 A range of commercial and non-commercial civil society organizations, users and communities have made significant contributions to the protection of women online, particularly as many of these have a global scope.
¶ 195 Leave a comment on paragraph 195 1 These initiatives have diverse objectives, including raising awareness about how to support victims; promoting digital safety education; enabling crowd-sourced blocking; identifying (‘naming and shaming’) perpetrators; encouraging public debates to promote norm-setting on online behaviour; and direct interventions in response to active (or real) cases of online VAWG. While it is impossible to highlight and describe each of these here, a few examples of these initiatives include:
¶ 196 Leave a comment on paragraph 196 0 Digital safety education and resources
¶ 197 Leave a comment on paragraph 197 0
- Security-in-a-box, which was created in 2009 by Tactical Tech and Front Line, aims to assist human rights defenders with their digital security and privacy needs by providing them with a collection of hands-on guides.
- The Boston Safety Hub Collective’s A DIY Guide to Feminist Cybersecurity also provides an introduction to available cybersecurity tools, and manages a hashtag on Twitter (#SafeHubTech) to which users can also tweet cybersecurity questions and concerns.
¶ 198 Leave a comment on paragraph 198 0 Campaigns/ raising awareness
¶ 199 Leave a comment on paragraph 199 0
- The APC’s Take Back the Tech! campaign was launched in 2006 with the aim of encouraging the use of any ICT platform to promote activism against gender-based violence. It plans various campaigns throughout the year, with the biggest being 16 Days of Activism Against Gender-Based Violence (November 25 to December 10 each year).
- Peng!, a collective that specialises in so-called ‘subversive direct action, culture jamming, civil disobedience and guerrilla communications’, launched its Zero Trollerance campaign in March 2015. The campaign used Twitter profiles controlled by computer programs (bots) to target suspected trolls and to troll them back, with the aim of educating them. Some 5,000 suspected trolls were identified through ‘simple language analysis’ of Twitter data, flagging accounts tweeting ‘the type of dangerous language often used to harass and incite violence against women and trans people’. While the campaign arguably borders on digital vigilantism, it has received significant media coverage.
- End Misogyny Online has created accounts on various social media platforms (including Twitter, Facebook, Pinterest and Tumblr) with the aim of highlighting and eradicating online misogyny and abuse by sharing real examples of misogynistic abuse from different users.
¶ 200 Leave a comment on paragraph 200 0 Helplines
¶ 201 Leave a comment on paragraph 201 0 Various helplines aim to assist women who face online abuse. In the UK, some of these helplines are supported by the UK Government Equalities Office in what appears to be a very useful public-private partnership.
¶ 202 Leave a comment on paragraph 202 0
- The Revenge Pornography Helpline, for example, was created as a pilot project in February 2015 with the objectives of supporting victims, assisting in the removal of harmful content, and collecting numeric data and evidence on online VAWG. The Helpline works both reactively (in response to complaints from victims) and proactively by reporting and requesting the removal of abusive content.[66]
- The Professionals Online Safety Helpline, in turn, was co-funded by the European Commission with the aim of supporting professionals working with children and young people in the UK with any online safety issues they may face themselves or with children in their care. The helpline provides support with all aspects of digital and online issues on social networking sites, including cyberbullying, sexting, online gaming and child protection online.[67]
- Both these helplines maintain strong relationships with companies like Google, Yahoo, Microsoft, Twitter, Facebook, Snapchat and Tumblr. These relationships tend to be reciprocal, with the helplines providing the platforms advisory support on beta testing of new products or services, while platforms keep the helplines abreast of safety and reporting updates.[68]
¶ 203 Leave a comment on paragraph 203 0 Technical solutions
¶ 204 Leave a comment on paragraph 204 0
- [Awaiting feedback from CSIRTs BPF]
- The development of certain applications aimed at helping protect women in a variety of ways is also noteworthy, including certain tracking, monitoring and reporting mechanisms. HarassMap, for instance, is used in Egypt with the aim of preventing sexual harassment both through online and mobile reporting and mapping and through media campaigns. Crowdsourced maps are used to illustrate the scale of the problem and to raise awareness about sexual harassment.
¶ 205 Leave a comment on paragraph 205 0 F. CONCLUSION(S)
¶ 206 Leave a comment on paragraph 206 0 [To be populated in Draft JP]
¶ 207 Leave a comment on paragraph 207 2 FOOTNOTES – PART 2:
¶ 208 Leave a comment on paragraph 208 0 [1] Few respondents explicitly recognized that online VAW impacts women’s rights. While the survey results are by no means representative of a larger population, the lack of importance that the respondents attached to online VAW as a limitation of women’s rights was noteworthy.
¶ 209 Leave a comment on paragraph 209 0 [2] See: CEDAW’s General recommendation on women’s access to justice, 23 July 2015 (C/CG/33): http://tbinternet.ohchr.org/Treaties/CEDAW/Shared%20Documents/1_Global/CEDAW_C_GC_33_7767_E.pdf
¶ 210 Leave a comment on paragraph 210 0 [3] New Technology, Same Old Problems (EVAW, 2013). Available online: http://www.endviolenceagainstwomen.org.uk/data/files/Report_New_Technology_Same_Old_Problems.pdf.
¶ 211 Leave a comment on paragraph 211 0 [4] Available online: http://www.womensaid.org.uk/page.asp?section=00010001001400130007§ionTitle=Virtual+World+Real+Fear.
¶ 212 Leave a comment on paragraph 212 0 [5] APC (2015). From Impunity to Justice: Exploring Corporate and Legal Remedies for Technology-Related Violence Against Women. Available online: http://www.genderit.org/articles/impunity-justice-exploring-corporate-and-legal-remedies-technology-related-violence-against.
¶ 213 Leave a comment on paragraph 213 2 [6] Broadband Commission, Cyber Violence against Women and Girls: A world-wide wake-up call (2015). Available online: http://www.broadbandcommission.org/Documents/reports/bb-wg-gender-report2015.pdf.
¶ 214 Leave a comment on paragraph 214 0 [7] Available online: http://www.un.org/womenwatch/daw/vaw/SGstudyvaw.htm.
¶ 215 Leave a comment on paragraph 215 0 [8] Available online: http://www.un.org/womenwatch/daw/csw/csw57/CSW57_Agreed_Conclusions_(CSW_report_excerpt).pdf.
¶ 216 Leave a comment on paragraph 216 0 [9] A/HRC/23/50. Available online: http://www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A.HRC.23.50_EN.pdf.
¶ 217 Leave a comment on paragraph 217 0 [10] A/RES/68/181. Available online: http://www.gender.cawater-info.net/publications/pdf/n1345031.pdf.
¶ 218 Leave a comment on paragraph 218 0 [11] A/HRC/29/27/Add.2. Available online: http://www.ohchr.org/EN/HRBodies/HRC/RegularSessions/Session29/Documents/A_HRC_29_27_Add_2_en.doc.
¶ 219 Leave a comment on paragraph 219 1 [12] This is, again, by no means an exhaustive list, but was compiled from BPF participants.
¶ 220 Leave a comment on paragraph 220 0 [13] This list was identified by BPF participants as people who might be particularly vulnerable to online VAWG and is not an exhaustive list.
¶ 221 Leave a comment on paragraph 221 0 [14] Available online: http://www.intgovforum.org/cms/documents/best-practice-forums/best-practices-for-online-child-protection/413-bpf-2014-outcome-document-online-child-protection/file.
¶ 222 Leave a comment on paragraph 222 1 [15] Example submitted by Ellen Blackler, The Walt Disney Company, USA.
¶ 223 Leave a comment on paragraph 223 0 [16] Summarised from example and case study, Mariana Valenti, InternetLab, Brazil.
¶ 224 Leave a comment on paragraph 224 0 [17] Summarised from case study, APC. Available online: http://www.genderit.org/sites/default/upload/case_studies_pak3_0.pdf.
¶ 225 Leave a comment on paragraph 225 0 [18] Summarized from case study, Lisa Garcia, Foundation for Media Alternatives, Philippines. Article on story available online: http://www.manilalink.com/2015/07/star-andrea-brillantes-scandal-video.html.
¶ 226 Leave a comment on paragraph 226 0 [19] Summarised from example, Arzak Khan, Internet Policy Observatory, Pakistan. His summary of story available at: http://ipop.org.pk/death-for-dancing-and-mobile-phones-in-pakistan/).
¶ 227 Leave a comment on paragraph 227 0 [20] Summarised from case study, APC. Available online: http://www.genderit.org/sites/default/upload/case_studies_mex4_0.pdf
¶ 228 Leave a comment on paragraph 228 2 [21] See: http://www.un.org/en/ga/search/view_doc.asp?symbol=A/RES/68/181
¶ 229 Leave a comment on paragraph 229 1 [22] Summarised from case study, APC. Available online at: http://www.genderit.org/sites/default/upload/case_studies_pak1_1.pdf.
¶ 230 Leave a comment on paragraph 230 0 [23] Summarised from a paper by the Internet Democracy Project (2013), Don’t let it Stand! An Exploratory Study of Women and Verbal Online Abuse in India. Available online: http://internetdemocracy.in/wp-content/uploads/2013/12/Internet-Democracy-Project-Women-and-Online-Abuse.pdf.
¶ 231 Leave a comment on paragraph 231 3 [24] See @sonaliranade.
¶ 232 Leave a comment on paragraph 232 0 [25] See 2013 article on incident, available online: http://protectionline.org/2013/10/21/whrd-ic-condemns-the-aggressive-and-systematic-digital-harassment-of-the-latin-america-and-caribbean-womens-health-network-lacwhn-2/
¶ 233 Leave a comment on paragraph 233 0 [26] See article on incident, available online: http://www.themalaysianinsider.com/malaysia/article/baffled-by-investigation-father-of-bfm-host-denies-blasphemy-claim. Also see: http://www.independent.co.uk/news/world/asia/rape-threats-death-threats-and-a-police-investigation-after-video-poking-fun-at-an-islamic-party-in-10149554.html
¶ 234 Leave a comment on paragraph 234 0 [27] See article on incident, available online: http://www.rollingstone.com/politics/news/planned-parenthood-under-attack-by-anti-abortion-hackers-politicians-20150731
¶ 235 Leave a comment on paragraph 235 1 [28] A Dutch study, for instance, showed that lesbians were 6.4% more likely to experience online bullying than heterosexual women. See European Union Agency for Fundamental Rights (FRA) (September 2014). Violence against women: European Union survey results in the Dutch context. Available online: goo.gl/L66swK.
¶ 236 Leave a comment on paragraph 236 0 [29] See media report citing statistics: http://www.hotforsecurity.com/blog/uk-government-launches-stop-online-abuse-to-fight-cyberbullying-12170.html.
¶ 237 Leave a comment on paragraph 237 0 [30] APC (2013), Survey on Sexual Activism, Morality and the Internet. Available online: http://www.genderit.org/articles/survey-sexual-activism-morality-and-internet
¶ 238 Leave a comment on paragraph 238 0 [31] Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, David Kaye, A/HRC/29/32 (May 2015). Available online: http://daccess-ods.un.org/TMP/668197.572231293.html.
¶ 239 Leave a comment on paragraph 239 0 [32] Read more about the policy here: https://www.facebook.com/help/112146705538576.
¶ 240 Leave a comment on paragraph 240 0 [33] Lil Miss Hot Mess, The Guardian, 3 June 2015. Available online: http://www.theguardian.com/commentisfree/2015/jun/03/facebook-real-name-policy-hurts-people-creates-new-digital-divide.
¶ 241 Leave a comment on paragraph 241 0 [34] See media report, available online: http://www.france24.com/en/20150224-cameroon-lgbt-activists-face-threats-violence.
¶ 242 Leave a comment on paragraph 242 0 [35] Summarised from case study, APC. Available online: http://www.genderit.org/sites/default/upload/case_studies_pak1_1.pdf.
¶ 243 Leave a comment on paragraph 243 0 [36] Summarized from case study and report, New Technology, Same Old Problems (EVAW, 2013). Submitted as part of case study by Vera Gray, UK. Report available online: http://www.endviolenceagainstwomen.org.uk/data/files/Report_New_Technology_Same_Old_Problems.pdf.
¶ 244 Leave a comment on paragraph 244 0 [37] Summarised from a paper by the Internet Democracy Project (2013), Don’t let it Stand! An Exploratory Study of Women and Verbal Online Abuse in India. Available online: http://internetdemocracy.in/wp-content/uploads/2013/12/Internet-Democracy-Project-Women-and-Online-Abuse.pdf.
¶ 245 Leave a comment on paragraph 245 0 [38] Read media report on incident, available online: http://gawker.com/what-is-gamergate-and-why-an-explainer-for-non-geeks-1642909080.
¶ 246 Leave a comment on paragraph 246 0 [39] Summarized from case study and report, New Technology, Same Old Problems (EVAW, 2013). Submitted as part of case study by Vera Gray, UK. Report available online: http://www.endviolenceagainstwomen.org.uk/data/files/Report_New_Technology_Same_Old_Problems.pdf.
¶ 247 Leave a comment on paragraph 247 0 [40] Summarized from case study submitted by Said Zazai, Afghanistan. BBC report available online in Persian: http://www.bbc.com/persian/afghanistan/2015/08/150823_k04_afg_women_problem_in_facebook?ocid=socialflow_facebook#share-tools).
¶ 248 Leave a comment on paragraph 248 0 [41] Summarised from case study submitted by Laura Higgins, SWGfL.
¶ 249 Leave a comment on paragraph 249 0 [42] Policy Options for Connecting the Next Billion is one of the themes that define the IGF’s intersessional work in 2015. For this reason, the BPF decided to also consider issues related to access in relation to women and online VAWG.
¶ 250 Leave a comment on paragraph 250 0 [43] See: http://www.intgovforum.org/cms/policy-options-for-connection-the-next-billion#about.
¶ 251 Leave a comment on paragraph 251 0 [44] See, for example, the BPF outcome document on Online Child Protection (2014): http://www.intgovforum.org/cms/documents/best-practice-forums/best-practices-for-online-child-protection/413-bpf-2014-outcome-document-online-child-protection/file.
¶ 252 Leave a comment on paragraph 252 0 [45] Example submitted by Sadaf Baig, Media Matters, Pakistan. Available online: http://www.tanqeed.org/2015/07/facebook-domestication/.
¶ 253 Leave a comment on paragraph 253 0 [46] Some of these examples are derived from the APC report End Violence: Women’s Rights and Safety Online. Technology-related violence against women: recent legislative trends. Available online: http://www.genderit.org/sites/default/upload/flowresearch_cnyst_legtrend_ln.pdf.
¶ 254 Leave a comment on paragraph 254 0 [47] Available online: http://www.legislation.govt.nz/bill/government/2013/0168/latest/whole.html
¶ 255 Leave a comment on paragraph 255 0 [48] The position in the Philippines was summarized from a case study submitted by Lisa Garcia, Foundation for Media Alternatives, Philippines.
¶ 256 Leave a comment on paragraph 256 0 [49] The Estonian position was summarized from a case study submitted by Piret Urb, Ministry of Foreign Affairs, Estonia.
¶ 257 Leave a comment on paragraph 257 0 [50] Available online: goo.gl/M89tKv.
¶ 258 Leave a comment on paragraph 258 0 [51] Submitted as part of a case study submitted on the Nepali position by Shreedeep Rayamajhi, Nepal.
¶ 259 Leave a comment on paragraph 259 0 [52] Act No. 17 of 2011, available online: http://www.justice.gov.za/legislation/acts/2011-017.pdf.
¶ 260 Leave a comment on paragraph 260 0 [53] Bill No. 61. Available online: http://nslegislature.ca/legc/bills/61st_5th/1st_read/b061.htm.
¶ 261 Leave a comment on paragraph 261 1 [54] Note that the term ‘tort’ generally refers to a civil wrong committed by one individual against another, for which the victim may claim a remedy such as damages, as distinct from a crime prosecuted by the state.
¶ 262 Leave a comment on paragraph 262 1 [55] Available online: https://legiscan.com/CA/text/SB255/id/863412/California-2013-SB255-Amended.html.
¶ 263 Leave a comment on paragraph 263 0 [56] See Freedom House’s report on the state of online freedom in the UK (2014). Available online: https://freedomhouse.org/report/freedom-net/2014/united-kingdom
¶ 264 Leave a comment on paragraph 264 0 [57] See: CEDAW’s General recommendation on women’s access to justice, 23 July 2015 (C/CG/33): http://tbinternet.ohchr.org/Treaties/CEDAW/Shared%20Documents/1_Global/CEDAW_C_GC_33_7767_E.pdf
¶ 265 Leave a comment on paragraph 265 0 [58] More information available online: https://www.politsei.ee/en/nouanded/veebikonstaablid/.
¶ 266 Leave a comment on paragraph 266 0 [59] More information available online: http://cyberscan.novascotia.ca/#second_white.
¶ 267 Leave a comment on paragraph 267 1 [60] E.g. Broadband Commission, Cyber Violence against Women and Girls: A world-wide wake-up call. Available online: http://www.broadbandcommission.org/Documents/reports/bb-wg-gender-report2015.pdf.
¶ 268 Leave a comment on paragraph 268 0 [61] Hereafter ‘the Istanbul Convention’, which has been signed by 20 member states and ratified by 18 member states. Available online: http://conventions.coe.int/Treaty/EN/Treaties/Html/210.htm.
¶ 269 Leave a comment on paragraph 269 0 [62] See Article 17 (ibid).
¶ 270 Leave a comment on paragraph 270 0 [63] The Council of Europe’s position was summarized from a case study submitted by Bridget O’Loughlin, Council of Europe.
¶ 271 Leave a comment on paragraph 271 0 [64] Cyber Violence against Women and Girls: A world-wide wake-up call. (2015). Available online: http://www.broadbandcommission.org/Documents/reports/bb-wg-gender-report2015.pdf.
¶ 272 Leave a comment on paragraph 272 0 [65] Example submitted by Sadaf Baig, Media Matters, Pakistan.
¶ 273 Leave a comment on paragraph 273 0 [66] Summarized from case study submitted by Laura Higgins, SWGfL, UK.
¶ 274 Leave a comment on paragraph 274 0 [67] Ibid.
¶ 275 Leave a comment on paragraph 275 0 [68] Ibid.