Is it Actually Violence? Framing Technology-Facilitated Abuse as Violence

The Emerald International Handbook of Technology-Facilitated Violence and Abuse

ISBN: 978-1-83982-849-2, eISBN: 978-1-83982-848-5

Publication date: 4 June 2021

Abstract

When discussing the term “technology-facilitated violence” (TFV), it is often asked: “Is it actually violence?” While international human rights standards, such as the United Nations' Convention on the Elimination of All Forms of Discrimination against Women (United Nations General Assembly, 1979), have long recognized emotional and psychological abuse as forms of violence, including many forms of technology-facilitated abuse (United Nations, 2018), lawmakers and the general public continue to grapple with the question of whether certain harmful technology-facilitated behaviors are actually forms of violence. This chapter explores this question in two parts. First, it reviews three theoretical concepts of violence and examines how these concepts apply to technology-facilitated behaviors. In doing so, this chapter aims to demonstrate how some harmful technology-facilitated behaviors fit under the greater conceptual umbrella of violence. Second, it examines two recent cases, one from the British Columbia Court of Appeal (BCCA) in Canada and a Romanian case from the European Court of Human Rights (ECtHR), that received attention for their legal determinations on whether or not to define harmful technology-facilitated behaviors as forms of violence. This chapter concludes with observations on why we should conceptualize certain technology-facilitated behaviors as forms of violence.

Citation

Dunn, S. (2021), "Is it Actually Violence? Framing Technology-Facilitated Abuse as Violence", Bailey, J., Flynn, A. and Henry, N. (Ed.) The Emerald International Handbook of Technology-Facilitated Violence and Abuse (Emerald Studies In Digital Crime, Technology and Social Harms), Emerald Publishing Limited, Leeds, pp. 25-45. https://doi.org/10.1108/978-1-83982-848-520211002

Publisher

Emerald Publishing Limited

Copyright © 2021 Suzie Dunn. Published by Emerald Publishing Limited. This chapter is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of these chapters (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode.


Introduction: Words Matter

The words used to describe a phenomenon shape the legal and social understanding of that experience. A change in terminology around a particular behavior can, in turn, change a society's norms around that behavior. This was demonstrated in the 1970s when feminist advocates in North America began calling for the recognition of, and an end to, sexual harassment (Backhouse, 2012a; Backhouse & Cohen, 1978; MacKinnon & Siegel, 2004). Prior to that time, much of what was redefined by these advocates as “sexual harassment” had previously been considered harmless flirtation or “boys being boys.” In many situations, women were expected to take responsibility for provoking or encouraging men's sexual aggression toward them (Weiss, 2009). Unwanted sexual attention from men was something that might not have been viewed as entirely polite but was expected to be tolerated by women, if not taken as a compliment. At any rate, such behaviors were normalized and minimized.

Reframing unwanted sexual attention as sexual harassment altered the social acceptability of that behavior. This change in understanding did not come easily; in fact, many entrenched norms around gendered sexual expectations remain in place today, with some individuals continuing to brush off the harms of sexual harassment (Quinn, 2000). However, following the work of feminists who developed new terminology and reframed the issue, new laws that helped protect women from this harmful and discriminatory behavior were introduced. For example, in Canada, sexual harassment was recognized under human rights legislation as discrimination based on sex (Campbell, 2013). Due to the cultural changes that came along with the advocacy work undertaken by feminists and the shift in the legality of the behavior, people began to see sexual harassment as unacceptable and harmful. This altered the social and legal boundaries of what was considered sexually appropriate behavior (Balos, 2004). As this demonstrates, identifying and naming harms can have significant societal impacts.

At the present time, TFV is a relatively new phenomenon that is not well understood by general society. It faces the same challenges of tolerance and minimization that sexual harassment faced – and continues to face (Fairbairn, 2015). Scholars, legislators, advocates, and the general public are still grappling with what new behaviors such as online stalking, image-based sexual abuse, and harmful digital misrepresentations are and how, if at all, they should be condemned or regulated (Henry et al., 2020; Powell & Henry, 2017). Even with some emerging laws that address certain forms of TFV, such as those that prohibit the nonconsensual distribution of intimate images, people continue to be blamed for the technology-facilitated abuse they experience (Henry et al., 2020; Waldman, 2019). For example, some abusers feel entitled to hack into women's online accounts in order to steal their nude photos to share on the internet (Massanari, 2017) or to use Photoshop, deepfakes, and computer-generated images to create simulated images of women engaging in sexual acts they never performed (Dunn, 2020). Women have been blamed for this type of abuse because they took sexualized images of themselves to begin with or for being celebrities with sexual allure (Marwick, 2017). Furthermore, feminists who have faced multiyear online campaigns against them, where groups of abusers have published their home addresses, created images of them being raped and beaten, threatened them with death and rape, and bombarded them with sexist and racist commentary, have been told they are overreacting when they complain and have had their experiences minimized by the public and the police (Jane, 2016; Kidd & Turner, 2016). They have been told it was just words and pictures on the internet, and their fears have been dismissed (Filipovic, 2007). In some cases, they have had to flee their homes, close their digital platforms, and cancel their events due to technology-facilitated attacks (Kidd & Turner, 2016). These instances, which were often rooted in sexism, caused them to feel fearful, and some targets of these attacks described their experiences as violence (Jane, 2016; Kidd & Turner, 2016). Yet, there remains uncertainty about which aspects of these behaviors should be considered unacceptable or even violent.

These types of harmful behaviors are collectively known by a variety of names: TFV (Bailey & Mathen, 2017), cyberbullying (Bailey, 2016), trolling (Citron, 2014), online abuse (McGlynn & Rackley, 2017), cyberviolence (Peterson & Densley, 2017), digital harassment (Powell & Henry, 2017), technology-facilitated coercive control (Dragiewicz et al., 2018), symbolic violence (Barratt, 2018), and representational violence (Hall & Hearn, 2019), among others (Backe, Lilleston, & McCleary-Sills, 2018). It can be difficult to determine what terms to use when a new phenomenon like this emerges or when advocates call for a new cultural understanding of an issue. This is particularly true if the new definition pushes back against entrenched stereotypes and systemic power structures. Even among those who support a new understanding, finding consensus on terminology is not an easy task for complex and nuanced concepts like TFV. More precise terminology is useful when examining specific behaviors such as image-based sexual abuse, doxing, and digital stalking (Backe et al., 2018), but as an overarching term for behaviors that result in the harms described below, I argue that we should use “TFV.”

Although “TFV” is a lengthier and less evocative term than “cyberviolence,” it is preferred here because it is broader and captures a wider range of technologies. The term “cyber” has its roots in control and networked systems and was popularized in science fiction in the 1980s and 1990s by authors who described cyberspace as virtual or digitally connected spaces (Azmi & Kautsarina, 2019), such as in William Gibson's (1984) Neuromancer. Many forms of TFV occur in digital and networked spaces like the internet, but other technologies that are not connected to a network, such as some video or audio recording devices, are also used, making “technology” a more encompassing term. The term “cyber” also suggests a separate world in cyberspace, one that is distinct from the physical world (Azmi & Kautsarina, 2019). In our current society, our digital experiences are as relevant and integrated a part of our everyday lives as our physical ones. Furthermore, for reasons described below, I argue that many of these abusive technology-facilitated behaviors are a form of violence that should be situated on the continuum of violence and, as such, should be broadly defined as “violence.” To do otherwise risks minimizing the severity of these actions and fails to recognize their interconnectedness with other, more familiar forms of violence. Not recognizing these behaviors as violence may limit a person's access to legal protections and can reinforce social norms that view this harmful behavior as acceptable or as causing minimal impact.

Old Meets New: Concepts of Violence in a Technological Era

To begin to understand the concept of TFV, it is worthwhile to first look at existing concepts of violence to consider some common characteristics of violent behavior, the systemic factors that contribute to violence, and the resulting harms caused by violence. As many forms of TFV are highly gendered and racialized (Amnesty International, 2018; McGlynn & Rackley, 2017), this chapter will rely on three concepts of violence drawn from gender-based violence and critical race theorists: the continuum of violence (Kelly, 1988), coercive control (Stark, 2007), and critical race theory's account of systemic violence (Matsuda, Lawrence, Delgado, & Crenshaw, 1993). These theories will be used to illuminate why certain technology-facilitated behaviors can (and should) be understood as violence.

In conceptualizing what is understood as TFV, this chapter adopts a broad understanding of violence that is not limited to criminal or other legal definitions of violence, which themselves vary widely. There may be some instances where the legal understanding of violence should be limited to harmful physical contact and others where a broader definition that includes psychological abuse and TFV will be appropriate. As such, not all forms of violence will reach a legal threshold (Bonnet, 2015), but all forms of this broader understanding of violence should be considered socially unacceptable. It is important to note that while the law plays a valuable role in shaping societal norms around which types of behavior are acceptable or not, the legal system has not always been a substantially effective or accessible tool to address violence (Backhouse, 2012b; Burgin & Flynn, 2019; Dunn, Lalonde, & Bailey, 2017). This is particularly true for communities that have been historically over-policed or under-protected due to systemic factors such as classism, racism, sexism, homophobia, and transphobia (Ashley, 2018a, 2018b; Matsuda, Lawrence, Delgado, & Crenshaw, 1993; Walcott, Foster, Campbell, & Seally, 2016). For some communities, the legal system itself has been a site of violence, and criminal laws that are framed as protecting vulnerable members of society have resulted in the disproportionate incarceration of their members (Delgado, 2014; Human Rights Watch, 2013). Due to this, we should not rely on legal institutions as the sole authorities to dictate the definition of violence and the solutions to it. Instead, a social and contextual understanding of what constitutes violence should be adopted to examine and critique the law's definition of violence and its enforcement of related laws. This chapter seeks to achieve this in its examination of two case studies.

As new concepts of violence like TFV emerge, the categorization of what is considered violent within a community can be contested (Fairbairn, 2015). It often requires advocacy and education within the community to shift cultural concepts of violence. This can be seen in the cases of family violence, once understood as legitimate family discipline, and marital rape, once understood as an acceptable form of sexual interaction within a marriage, both of which are now understood as forms of violence (Balos, 2004; Koshan, 2017). One of the greatest challenges in conceptualizing harmful technology-facilitated behaviors as violence is the lack of physical contact between the target and the perpetrator in many of the cases. This is due to limited societal views of violence as only encompassing physical attacks; however, as will be examined below, violence can include a breadth of harmful behaviors that cannot be disentangled from physical abuse and may on their own be equally, if not more, debilitating for the survivor (Follingstad & Rogers, 2013). However, not every unwanted touch or unwelcome contact through technology will be considered violent, nor should it be. We must avoid developing an understanding of violence that is so broad that it waters down the significance of violent behaviors or results in overly punitive measures for less consequential behaviors or acts of self-defense. In some cases, a single incident will be severe enough to be considered violence, such as a threat of murder or rape, whereas in other cases, actions may only become violent due to their cumulative or coercive nature, such as unwanted text messages or cruel comments on a person's social media page (Bonnet, 2015; Dragiewicz et al., 2018). Social and legal understandings of violence may require a contextual analysis that takes into consideration situational and systemic factors, as well as the impacts of the behaviors and their connection to other acts of violence. Taking this into consideration, the following section will discuss how certain technology-facilitated behaviors can be captured in the folds of existing theoretical understandings of violence.

The Continuum of Violence: Interconnected Behaviors

Writing decades prior to the advent of social media and smartphones, feminist scholar Liz Kelly (1987) adopted a broad view of sexual violence against women, recognizing the interconnectivity between harmful physical, sexual, emotional, and psychological acts. After interviewing women with experiences of sexual violence, she found that discrete physical and psychological attacks on women's sexual autonomy could not be understood as independent of each other, as they each played an overlapping role in controlling women's behavior. In her understanding of violence, Kelly (1987) focused on the overall impacts experienced by the survivor, describing how women reacted differently to various experiences of sexual coercion, both physical and psychological. Their reactions were influenced by the nature of the incident, the relationship between the parties, the amount of time over which the abuse occurred, and the feelings of fear induced by the incident. After listening to women's reactions to various behaviors, Kelly (1987) concluded that a hierarchy that places physical attacks at the top, as most serious, and emotional abuse at the bottom, as least serious, did not reflect the reality of women's lived experiences of sexual violence. Such a hierarchy minimized the nonphysical harms caused by these behaviors. In her book, Surviving Sexual Violence, Kelly (1988) outlined her theory of the continuum of sexual violence. She described the ways in which actions like street harassment, coercive sex, and threats systemically limited women's sexual autonomy and contributed to their sexual domination. These actions caused women to feel a great deal of fear, and Kelly (1988) argued that they should not be overlooked because of the lack of physical damage to a woman's body. By focusing on the ways these intersecting harmful behaviors were part of a larger system that supported sexual violence against women, she included actions, both physical and psychological, used to degrade, control, and cause fear in women within the conceptual parameters of sexual violence.

Kelly's (1987, 1988) work addressed the continuum of sexual violence, but the model of a continuum of violence has been adopted more generally by scholars such as Rashida Manjoo (2012), Emma Renold and Christine Barter (2005), and Kathy Sanders-Phillips (2009), who applied the continuum to all violence against women, to violence experienced by youth in residential children's homes, and to violence experienced by racialized youth, respectively. Other national and international bodies, including the United Nations, have adopted a similarly broad definition of violence that includes emotional and psychological violence and forms of TFV (Fairbairn, 2015).

The theory of a continuum of violence has been applied to technology-facilitated behaviors that share the underlying characteristics of coercing and controlling women, thus fitting within Kelly's (1987, 1988) concept of violence. Clare McGlynn, Erika Rackley, and Ruth Houghton (2017) drew on Kelly's continuum of violence when examining what they labeled the “continuum of image-based sexual abuse” (p. 1; see also, Henry et al., 2020). These authors situate image-based sexual abuse on a continuum with other forms of sexual violence. Image-based sexual abuse includes the creation or distribution of sexual images without consent, which captures things like the nonconsensual distribution of intimate images, upskirting, voyeurism, sexual deepfakes, digitally manipulated sexual images, images obtained via hacking, sextortion, images of sexual assault, and unsolicited dick pics (see e.g., Henry, Flynn, & Powell, 2018, 2019). These technology-facilitated behaviors are used to degrade women and control their sexual autonomy, resulting in significant harms. Women whose sexual images have been shared without consent have reported psychological impacts similar to those of survivors of sexual assault, including post-traumatic stress disorder, anxiety, depression, and suicidal thoughts (Bates, 2016), and some have drastically altered their behaviors in response to the misuse of their images (Powell, Henry, & Flynn, 2018). In the case of shared images of sexual assault, Alexa Dodge (2016) found that the demeaning commentary that has accompanied these images normalizes and minimizes sexual violence against women and girls. Beyond sexual violence, many other technology-facilitated behaviors that have the effect of controlling or causing fear in women, such as the use of stalkerware and death threats on digital platforms, fit this pattern (Citron, 2014; Parsons et al., 2019). These technology-facilitated behaviors are interconnected with other physical, sexual, and psychological forms of violence, as they share the underlying commonality of what Kelly (1987, 1988) conceptualized as violence. These behaviors are used to dominate, intimidate, and control the women who are targeted, negatively influencing their autonomy. Additionally, they are linked to a larger system of behaviors that influence societal norms that minimize and normalize violence.

Coercive Control: Beyond the Physical

Evan Stark's (2007) theory of coercive control demonstrates the ways that the physical and psychological behaviors of one partner are used to break down the personality, self-worth, and agency of the other in an intimate partner violence relationship. This form of control relies on systemic inequalities such as gender inequality and employs ongoing behaviors that wear a person down over time. Its collective nature is the core of its severity, not any particular discrete physical act that causes injury. This can render the harmful actions somewhat invisible, as it is difficult to see the forest for the trees. It might seem unusual for a woman to experience extreme fear for forgetting to text her partner at a specific time, but if failing to do so results in her partner relentlessly threatening her and limiting her freedoms in their home for weeks afterward, it is a legitimate reason to be afraid. Women in abusive relationships can be controlled using physical force in combination with psychological abuse, but as noted by Anne Jones (2000), prolonged psychological abuse that leaves the woman in a state of fear and without control can be as effective a form of abuse as physical attacks. As long as the perpetrator can destroy the will of his target, the behavior achieves its aims. The cumulative controlling effects of ongoing and repetitive attacks are part and parcel of the violence in those relationships, whether they are physical or not.

Heather Douglas, Bridget Harris, and Molly Dragiewicz (2019) have applied the theory of coercive control to TFV. In their study on the use of technology in domestic and family violence, they argue that some forms of technology-facilitated behaviors are “inextricably tied to” (p. 553) other forms of domestic and family violence and the systemic factors that contribute to their persistence. Douglas, Harris, and Dragiewicz (2019) identified isolation (through controlling digital communication), monitoring, stalking, image-based sexual abuse, social media–facilitated abuse, and online harassment as some of the more commonly reported forms of technology-facilitated behaviors they would categorize as coercive in violent relationships. Technology added to the feelings – and reality – of the perpetrator being ever-present and ever-threatening. In the case of stalkerware, Citizen Lab described this technology as putting a “predator in your pocket” because it allows an abusive partner to constantly monitor their spouse's movements and communications (Parsons et al., 2019). Ongoing tracking and excessive monitoring through technology can induce feelings of fear and a loss of autonomy in the same ways that physically controlling behaviors can. A simple buzz of a text may cause a person whose daily behaviors are being monitored by their abusive partner to feel legitimately fearful.

The use of this technology to dominate and control another person is another tool in the web of tactics employed by a violent perpetrator that can have a cumulative effect on the person targeted (Dragiewicz et al., 2018). The overarching effect of breaking down the person's agency remains the same whatever tool the abuser uses. These harmful technology-facilitated behaviors can have a physical and psychological impact on the person targeted. Nicola Henry and Anastasia Powell (2015) have described reactions to technology-facilitated harms like these as embodied harms. They found it is a myth that, because certain actions are not directed at the physical body or occur in digital spaces, they have no effect on the physical body or social self. There are real psychological, physical, and social effects of TFV that impact a person's mental health, personal relationships, social position, and personal safety. Creating bright lines between physical, psychological, and technology-facilitated behaviors when examining violent relationships would require a parsing of the cumulative behaviors that collectively lead to the loss of agency and feelings of domination experienced by the person targeted. Creating these types of silos can dilute the severity of the overall experience and fail to recognize the physiological and psychological harms caused by all forms of violence.

Critical Race Theory: Systemic Violence

Critical race theory is foundational to understanding the ways words and images reinforce systemic factors, such as racism, sexism, and colonialism, that dehumanize particular groups and can themselves be understood as a form of violence (Jackman, 2002). Mari Matsuda, Charles Lawrence, Richard Delgado, and Kimberlé Crenshaw's (1993) book, Words that Wound, describes the inseparability of racism, hateful speech, and violence. They argue that racist speech and images can be a precursor to physical violence against people of color, such as the incitement of lynch mobs, and can contribute to normalizing physical and sexual violence against women of color. Racist speech and images can incite physical attacks, including violence by the state, as was raised as a concern in 2020 when a white woman in Central Park called the police falsely alleging that a Black man had threatened her physically (Bellafante, 2020), but they also perpetuate ideas of racial inferiority that make racial domination permissible and Black, Indigenous, and people of color (BIPOC) legitimate targets of physical attacks and social control (Walcott et al., 2016).

Understanding the ways in which words and images can be viewed as forms of systemic violence against BIPOC is valuable in conceptualizing many forms of harmful technology-facilitated behaviors as violence. For example, degrading images and speech are used in digital spaces to dehumanize, instill fear, and reinforce the subordination of BIPOC (Awan & Zempi, 2015; Nakamura, 2013), members of the LGBTQI community (Lenhart, Ybarra, Zickuhr, & Price-Feeney, 2016; Quodling, 2016; Stanko, 2005), women and girls (Duggan, 2014; Vera-Gray, 2016), and other equality-seeking groups (Bailey, 2007).

Amnesty International's (2018) #ToxicTwitter report found that being a woman of color in a leadership role online or speaking about race or gender online can trigger digital attacks that use threatening and discriminatory language and images. Some abusive Twitter users will seek out hashtags associated with equality issues and will encourage others to attack individuals promoting equality-focused campaigns. They use onslaughts of threatening, degrading, and violent comments, including death and rape threats, against individuals advocating for equality online. Racist, sexist, and homophobic tropes are also used during these online attacks to reinforce systemic power structures and can drive users who resist them off digital platforms out of fear. Words and images like these cross over the line into violence when they create reasonable feelings of fear, reinforce the subordination of specific groups in ways that normalize violence against them, and dehumanize particular individuals or groups.

Manipulated images, decontextualized images, and images of violence play an important part in reinforcing the degradation of certain groups and normalizing violence against them. For example, after Black actor Leslie Jones starred in a new version of Ghostbusters, individuals opposed to a Black woman starring in the film began a trolling campaign against Jones, sending violent commentary and images with racist and sexist undertones (Citron, 2019). The campaign was encouraged by conservative media commentator Milo Yiannopoulos, who was known for instigating online mob attacks against people he criticized (Silman, 2016). 1 Jones reported that she was sent doctored images that made it appear as though she had been ejaculated on, her website was hacked, her nude photos were stolen and posted online, photos of her personal identification documents were shared, and memes of a dead gorilla were posted on her website (Citron, 2019). These images violated her privacy and used racist and sexist tropes, along with other information, to make her feel fearful and to dehumanize her, thus fitting within the concept of violence described by the above critical race theorists.

Individuals like Leslie Jones may find themselves singled out for attacks, but many of these violent attacks are interconnected and led by discriminatory groups that organize in digital spaces. These abusers often find their homes in extremist alt-right, white supremacist, homophobic, and misogynistic groups whose online message boards purposefully share discriminatory messages that dehumanize and encourage physical violence against equality-seeking groups (Baele, Brace, & Coan, 2019). Abusers' messages are often veiled as arguments for freedom of expression or conservative values, or described as jokes not to be taken seriously. However, much of the language and imagery on these boards actively reinforces the systemic dehumanization of certain groups of people (Conway, Scrivens, & Macnair, 2019).

In more extreme cases, multiple mass shootings have been linked to these online hate groups, one of which was live streamed on the internet. Several of the shooters have published racist and sexist manifestos on 8chan, an online forum that has been called a “cesspool of hate,” where users have praised a mass killer's “high score” for the number of people killed (Conway et al., 2019, pp. 8 & 13). These spaces are used to recruit and radicalize users, relying on memes, GIFs, and specialized jargon to reinforce their messages (Conway et al., 2019). Similar groups, such as Incels, have advocated for and celebrated physical violence against women and employ a similar pattern of shared images and language to reinforce their messages of violence. Incels have been connected to at least two mass killings in which the killers have been held up as heroes of the movement for murdering people in revenge for their lack of sexual access to women (Baele et al., 2019). What makes the words and images used by these groups violent is their ability to reinforce systemic discrimination that dehumanizes marginalized groups, normalizing or inciting real-world attacks against them.

Technology-Facilitated Violence and the Law

This section will review two recent cases that addressed whether certain technology-facilitated behaviors could be legally recognized as violence. As noted in the previous section, the law should not be viewed as the primary source of how we understand violence. However, it plays an important role in shaping societal understandings of violence, and some courts have recently begun interpreting whether certain forms of harmful technology-facilitated behaviors should be understood as violence. The following two cases were chosen because they both address laws where emotional and psychological abuse is included in the definition of violence, making them broad enough to capture many forms of TFV. The first case, AB v CD (2020 BCCA 11), 2 involved a father sharing private information about his transgender son with fringe-conservative media websites that promote transphobic messaging. It was selected because of the substantial media attention the case received on the question of what could be considered family violence. The second case, Buturugă v Romania (2020), involved an intimate partner violence relationship in which an allegedly abusive ex-husband accessed his ex-wife's computer documents and social media accounts without her consent. It was selected because it was one of the first cases from the ECtHR to recognize technology-facilitated privacy invasions as a form of violence.

AB v CD

In a 2020 decision, AB v CD, the BCCA set aside a finding of family violence from a lower court in a case where a father, CD, had used social media platforms and provided interviews to online newspapers in which he misgendered his transgender son, AB, and shared private information about him. His online interviews exposed his son to substantial violent commentary and unwanted transphobic attention from the greater community. One of the questions before the court was whether CD's actions met the definition of family violence. Family violence under the British Columbia Family Law Act (2011) includes “psychological or emotional abuse of a family member” such as “intimidation, harassment, coercion or threats” and “unreasonable restrictions on, or prevention of, a family member's financial or personal autonomy” (s. 1).

AB was a teenage boy who was assigned female at birth. When he sought gender-affirming medical care, his doctors and his mother supported his decision to begin taking masculinizing hormones so his body would better match his gender identity. AB had been struggling with challenges related to his gender, including the stresses he felt about his body's “femaleness” and the bullying and harassment that accompanied it. He had attempted suicide in March 2018, in part, due to the stresses related to his gender identity. His mother was fearful that if AB did not get access to gender-affirming medical treatments he might attempt suicide again. With gender-affirming treatment, AB's doctor believed some of AB's stresses would be relieved, and that he would be less likely to be harassed by others, reducing his risk of suicide. However, AB's father, CD, did not engage with his son's medical team and did not support the decision. Instead, CD filed a claim in family court disputing AB's ability to make this medical decision without CD's consent and later instigated a public campaign that included speaking to online conservative media sites about his disapproval of his son's medical decision specifically and gender-affirming medical treatments for transgender youth more generally. The story was widely covered by both mainstream and fringe-conservative media sources.

During interviews with online fringe-conservative media sites and in his court filings, CD misgendered his son, publicly rejecting his child's gender identity and name change. The two sites of concern, Culture Guard and The Federalist, are fringe online conservative media outlets that publish articles opposing gender transitions, particularly those involving minors. 3 In those interviews, CD shared detailed personal and medical information about AB and trivialized AB's suicide attempt. The authors of the articles also misgendered AB and published unredacted copies of AB's medical documents that identified him. Court documents showed that comments on these stories included derogatory statements about AB, claiming he was mentally ill, encouraging his suicide, suggesting he be kicked out of his family home, suggesting he be disowned, and advocating for a “post birth abortion” of AB (AB v CD and EF, 2019 BCSC 604, para 33). The father, CD, expressed excitement about the media attention the story was getting and stated that he hoped that Fox News and Breitbart would pick up the story. Breitbart is an alt-right website known for spreading conspiracy theories and hiring provocative racist, sexist, and transphobic writers. Milo Yiannopoulos, the man credited with instigating the attacks on actor Leslie Jones, once worked there as a senior editor (Hunt, 2016). CD personally engaged in the comment sections on Culture Guard's website, offered to speak as a keynote speaker at a Culture Guard event (which he later declined due to a court-ordered publication ban), and posted commentary on social media sites. Culture Guard also raised funds for CD's legal costs related to the case (AB v CD and EF, 2019 BCSC 604).

In response to these acts, AB told the courts he was “terrified” that his father would go public with information that would identify AB, which could lead to “terrible bullying and violence” (2019 BCSC 604, para 32). AB was concerned about his physical and emotional safety around CD and felt scared to watch the interviews his father gave to Culture Guard. CD continued to engage online about his son's gender and medical decisions even after being cautioned by the courts that his behaviors were damaging to his son's well-being.

The lower level courts (2019 BCSC 254 & 2019 BCSC 604) determined that AB was a mature minor who was able to make decisions about his health without the consent of his father. A declaration was made that CD's behavior toward his son, including misgendering his son to third parties, 4 was a form of family violence, and a protection order was granted that restrained CD from sharing personal and medical information about AB with media outlets or on social media and required CD to refer to AB by his proper name and gender. The Court noted that CD's ongoing behavior put AB at risk of additional emotional and physical violence, including exposing AB to “degrading and violent public commentary” on social media (AB v CD and EF, 2019 BCSC 604, para 68) and putting AB at “a high risk of public exposure and acts of emotional or physical violence, in the form of bullying, harassment, threats and physical harm, including self-harm” (AB v CD and EF, 2019 BCSC 604, para 70). CD appealed the decision.

At the Court of Appeal, AB's ability to consent to his medical treatment was upheld, but the declaration that CD's actions were family violence and the protection order were set aside, largely due to procedural issues regarding which section of the Family Law Act (2011) was applied and what evidence was before the court (AB v CD, 2020 BCCA 11). The protection order was replaced with a conduct order (an order to manage behaviors that might frustrate the resolution of a family law dispute) that prohibited CD from providing information about AB's gender identity or medical information to third parties and required CD to refer to AB as male and by his chosen name. On the question of violence, the majority of the Court of Appeal found that the father's refusal to acknowledge AB's gender was “clearly hurtful” to AB, but stated that there was insufficient evidence that CD intended to hurt his son (AB v CD, 2020 BCCA 11, para 171).

When considering the father's behavior and the concept of family violence, the Court's (AB v CD, 2020 BCCA 11) focus on the father's intention was problematic, as the Family Law Act's (2011) definition of family violence does not specify that the psychological or emotional abuse needs to be intentional. Moreover, there was evidence that CD's behaviors had emotionally and psychologically harmed AB, and his father had neglected to acknowledge that harm. The Court noted that CD did not engage with AB's medical team to better understand his son's medical decision and had not considered how AB would be negatively affected by his engagement with the public online forums that were advocating against young transgender people's rights to make medical decisions related to their gender. The Court also noted that CD ignored the effect derogatory public comments would have on his son. Nonetheless, the Court of Appeal found that CD's actions were not emotional abuse amounting to family violence, but simply “misguided” and “irresponsible” behavior on the part of the father (AB v CD, 2020 BCCA 11, para 179).

Focusing on the intention of the abuser rather than the impact on the person abused in this way has the potential to allow a great many abusers to avoid responsibility for their behavior (Jackman, 2002). Racist, sexist, and transphobic online commentators often claim to be protecting traditional values or to only be making jokes, but the impact on those targeted is consequential (Conway et al., 2019). CD should not have been able to rely on his own reckless disregard of the impact his online campaign would have on his son to claim ignorance or a lack of intent to harm, particularly because the harms were well documented by the courts. While CD cannot be held responsible for the individual actions of others on the internet, his ongoing engagement with digital groups that were encouraging physical and social harm against his vulnerable son should have played a larger role in the analysis of the overall impact of CD's behavior. The Court's focus should have been on the actual impact of his ongoing actions.

The Family Law Act (2011) specifically includes psychological and emotional abuse as a form of family violence, and TFV has been recognized as a form of family violence in other cases. However, when commenting on the legal understanding of family violence, the Court of Appeal in this case warned that “some caution should be exercised in identifying ‘psychological or emotional abuse’ as constituting ‘family violence’” (AB v CD, 2020 BCCA 11, para 175). This is a troubling direction for the court to take, as it minimizes the significant harms AB experienced from his father's technology-facilitated actions. AB v CD was not a case where the father's technology-facilitated actions – misgendering his child in public digital spaces known for dehumanizing transgender people – had a minimal emotional or psychological impact on the person targeted. AB provided evidence to the court that CD's ongoing behavior was causing AB to fear for his physical and emotional safety and was impacting his autonomy. CD's child was at risk of self-harm, and medical professionals had recognized the harms he experienced due to transphobia from his father and the general public. The courts recognized that CD had made his son an “unwilling poster child” in a divisive public debate about transgender youth rights that included dehumanizing comments about his son and transgender people (AB v CD and EF, 2019 BCSC 604, para 69). Moreover, lower level courts in British Columbia have recognized forms of TFV as family violence in the past. In previous cases, the courts found that excessive amounts of text messages and emails caused psychological or emotional harms amounting to family violence (X v Y, 2015; M(MW) v K(JD), 2015). The judgments in these cases did not focus on whether the abuser intended to harm the other person by his actions. In both cases, the abuser did not appreciate the full consequences of his behavior; however, the courts focused on the impact on the person being abused, rather than the abuser's intention, to determine whether the technology-facilitated behavior was a form of family violence (X v Y, 2015; M(MW) v K(JD), 2015).

Considering this, CD's behavior can be understood through critical race theories of violence (Matsuda et al., 1993; Jackman, 2002), in which words and images are used to reinforce the systemic dehumanization of marginalized groups and risk inciting physical violence against them. Research has shown that transgender people are disproportionately at risk of discrimination, violence, and suicide (Ashley, 2018b). Due to the high levels of discrimination transgender people experience, Canada has recently recognized gender identity and expression under federal and provincial human rights legislation (e.g., Canadian Human Rights Act, 1985). As a transgender youth, AB was a member of a group that experiences systemic discrimination. His father's actions were directly linked to AB's social location in this group, and CD was aware that AB had been harassed due to his gender identity and had attempted suicide in part due to the challenges he faced in having his proper gender recognized and affirmed. Yet, CD recklessly continued to make public statements rejecting his son's gender identity and engaging with transphobic media outlets, prioritizing his own agenda over the safety and well-being of his child.

In sum, I would argue that the lower level courts' understanding of violence was more aligned with the concept of TFV outlined in this chapter than that of the Court of Appeal. The lower court acknowledged that the “degrading and violent public commentary” contributed to AB's fears and recognized that CD's behavior could incite emotional and physical violence against his son (AB v CD and EF, 2019 BCSC 604, para 68). In the future, where the definition of violence includes psychological harms and limitations on autonomy, it is important that courts look to the impact of technology-facilitated behaviors in controlling the targeted individual, as well as to whether systemic factors, like transphobia, are present.

Buturugă v Romania

In early 2020, the ECtHR held that “cyberviolence” is a form of intimate partner violence and that Romanian authorities had a positive obligation to investigate an abusive partner's unauthorized access to his partner's computer and social media accounts as it related to her case of domestic violence (Buturugă v Romania). In Buturugă v Romania (2020), Gina-Aurelia Buturugă brought a complaint against the Romanian government for failing to adequately investigate complaints she made against her husband, who she alleged had physically assaulted her, threatened to kill her, and accessed her electronic documents and social media without consent, copying her private correspondence, photos, and other material. According to Buturugă, her husband, who is only named in the decision by his initials (“MV”), had physically abused her during their relationship. She stated that when they began discussing divorce in late 2013, the violence escalated to the point that her husband threatened to kill her by throwing her off the balcony to make it look as though she died by suicide. Several days later, MV allegedly hit Buturugă on the head and threatened to kill her with an axe. Buturugă visited the hospital where she required significant care and obtained a medical document that detailed her injuries.

According to Buturugă, the authorities failed to adequately investigate her claims and tried to convince her to withdraw her complaints of physical violence. When she later reported that MV had accessed her computer and social media without her consent, she asked the police to search their family computer to collect evidence that her husband had violated her privacy as a part of his pattern of abuse. However, she was told by the police that the privacy invasion was unrelated to the violence she reported. In 2014, MV violated a protection order that was put in place to protect Buturugă from violence by sending threats through a family member, chasing Buturugă down the street, and trying to convince her to drop the criminal charges. The police closed her case in early 2015, stating the threats against her were not serious enough to warrant investigation, there was no evidence that MV had been the one to cause her injuries (although the police did not investigate who else could have caused them), and the alleged violation of her privacy was irrelevant to the subject matter of the case. Buturugă unsuccessfully contested the decision in the Romanian courts.

Dissatisfied with the treatment of her case by the Romanian police and the courts, Buturugă filed a complaint with the ECtHR, arguing that the Romanian government had not properly investigated her case and had violated her right to be free from inhuman or degrading treatment and her right to privacy under the European Convention on Human Rights. After hearing the case, the ECtHR found Romania had a positive obligation to take reasonable measures to investigate Buturugă's claims as a form of domestic violence, including her ex-husband's unauthorized access to her computer and social media accounts. In its decision, the ECtHR found that the police had not adequately investigated who had caused Buturugă's injuries or the alleged privacy violations. Importantly, the Court noted that unauthorized access to someone's computer, digital surveillance, and the taking, storing, and manipulating of data and images are recognized forms of violence under both domestic and international law. Under Romanian law, family violence includes emotional harms, such as hindering a woman from exercising her fundamental rights and liberties, including her right to private life, as well as physical and sexual violence. The ECtHR found that technology-facilitated privacy invasions could be considered a form of violence.

This decision by the ECtHR recognizes the role of TFV in violent intimate partner relationships in ways that fit within some of the theories of violence discussed above. As noted by Evan Stark (2007) and Anne Jones (2000), psychological harms that make a person feel as though they are being surveilled and controlled can be as effective in dominating a person in a violent intimate partner relationship as physical violence. The Romanian police and courts failed to understand how MV breaking into his ex-wife's computer to access her digital content was part of a larger pattern of violence in their relationship that significantly impacted Buturugă's agency and autonomy. Entrenched norms of what “real” (i.e., physical) violence looks like likely influenced their decision. According to Buturugă, the police encouraged her to drop her claims of physical abuse against her husband, and they neglected to include the privacy invasions in their investigation of his abuse, which suggests there may have been systemic factors at play that minimize gender-based violence generally and TFV specifically. By separating the privacy invasion from the physical harms in their investigation, the police failed to see what McGlynn, Rackley, and Houghton (2017) described as the “commonalities between seemingly disparate phenomenon” that lie along the continuum of violence (p. 4).

In a violent intimate partner relationship, when an ex-partner who has been exhibiting controlling behaviors gains access to their ex's social media accounts and other personal digital files, it contributes to feelings of the partner being ever-present and controlling. The ECtHR made the decision in this case to ensure the harms caused by unauthorized access by an abusive partner to a person's digital content were not minimized. In the context of this relationship, which had a history of physical violence, threats, and harassment, it made sense that MV's nonconsensual access to Buturugă’s digital information was a part of his pattern of control in this abusive relationship. This places TFV on the continuum of violence and acknowledges its role in violent coercive relationships, something that the ECtHR affirmed.

Conclusion

There is power in recognizing and naming violence, including psychological violence and TFV. As society and the courts make determinations on what violence is, it is critical that they avoid minimizing the effects of TFV. As in the early days of anti-sexual harassment advocacy, there is a need for a cultural shift around TFV. Naming something gives it clarity and helps people situate it within their cultural understanding of which behaviors are socially acceptable and which are not. In our current society, TFV needs to be recognized and condemned. We cannot rely solely on legal decisions, such as the two discussed above, to guide this understanding; however, legal definitions do help shape societal views on violence and can provide useful legal remedies. Additional efforts in education, advocacy, and changing societal attitudes, so that targets are not blamed for their victimization, are also critical components of changing social views on TFV.

Building on the theories developed by critical race, feminist, and antiviolence scholars and advocates, this chapter argued that TFV can be recognized by looking for behaviors that control, dominate, and instill fear in the person targeted, particularly those behaviors that rely on existing systemic power structures to further dehumanize and limit the autonomy of a particular person or group. Understanding harmful technology-facilitated behaviors can be complex and may require examining the situational or systemic factors at play to assess whether a behavior meets the threshold of violence. In AB v CD (2019 BCSC 1057, 2019 BCSC 254, 2019 BCSC 604, 2019 BCCA 256, 2019 BCCA 297, 2020 BCCA 11) and Buturugă v Romania (2020), systemic issues of transphobia and sexism, respectively, played a role in the TFV experienced. The publishing of private information online, the misgendering of a transgender youth on a digital platform, and the accessing of an ex-spouse's computer and social media without consent must be situated within larger patterns of systemic violence intended to control, demean, and significantly limit the autonomy of the person targeted. This chapter argued that the Court of Appeal in AB v CD (2019 BCCA 256, 2019 BCCA 297, 2020 BCCA 11) was too limited in its assessment of violence, problematically relying on the intention of CD rather than the impact of his actions on AB, whereas in Buturugă v Romania, the ECtHR correctly situated MV's actions within the context of domestic violence. These are just two examples of the ways courts can interpret TFV, but they illustrate two potential paths courts may take in the future: one that minimizes TFV and one that illuminates it. In the future, we should look to the theories mentioned in this chapter to help us understand TFV and its place in our social and legal understanding of violence.

Notes

1

Milo Yiannopoulos is an alt-right public figure who was well known for his racist, homophobic, and sexist commentary online. His commentary was known to instigate harassment campaigns against the people he criticized. His account was permanently suspended from Twitter in 2016 due to his abusive and hateful commentary (Hunt, 2016).

3

Each of these contain “transgender” sections, for example: https://www.cultureguard.com/category/transgender/; https://thefederalist.com/tag/transgender/.

4

“Attempting to persuade AB to abandon treatment for gender dysphoria; addressing AB by his birth name; referring to AB as a girl or with female pronouns whether to him directly or to third parties; shall be considered to be family violence under s. 38 of the Family Law Act” (2019 BCSC 254, para 70).

References

AB v CD. (2019a). BCCA 256.

AB v CD. (2019b). BCCA 297.

AB v CD. (2020). BCCA 11.

AB v CD and EF. (2019a). BCSC 1057.

AB v CD and EF. (2019b). BCSC 254.

AB v CD and EF. (2019c). BCSC 604.

Amnesty International. (2018). #ToxicTwitter. London: Amnesty International.

Ashley, F. (2018a). Genderfucking non-disclosure: Sexual fraud, transgender bodies, and messy identities. Dalhousie Law Journal, 41(2), 339–377.

Ashley, F. (2018b). Don't be so hateful: The insufficiency of anti-discrimination and hate crime laws in improving trans well-being. University of Toronto Law Journal, 68, 136.

Awan, I., & Zempi, I. (2015). ‘I will blow your face off’: Virtual and physical world anti-Muslim hate crime. British Journal of Criminology, 110.

Azmi, R., & Kautsarina, K. (2019). Revisiting cyber definition. In 18th European Conference on Cyber Warfare and Security.

Backhouse, C. (2012a). Sexual harassment: A feminist phrase that transformed the workplace. Canadian Journal of Women and the Law, 24(2), 275–300.

Backhouse, C. (2012b). A feminist remedy for sexual assault: A quest for answers. In E. Sheehy (Ed.), Sexual assault law in Canada (pp. 725–739). Ottawa, ON: University of Ottawa Press.

Backhouse, C., & Cohen, L. (1978). The secret oppression: Sexual harassment of working women. Toronto, ON: Macmillan of Canada.

Baele, S. J., Brace, L., & Coan, T. G. (2019). From ‘Incel’ to ‘saint’: Analyzing the violent worldview behind the 2018 Toronto attack. Terrorism and Political Violence, 125.

Bailey, J. (2007). Confronting collective harm: Technology's transformative impact on child pornography. University of New Brunswick Law Journal, 56, 65–102.

Bailey, J. (2016). Canadian legal approaches to ‘cyberbullying’ and cyberviolence: An overview. Ottawa Faculty of Law Working Paper No. 2016-37.

Bailey, J., & Mathen, C. (2017). Technologically-facilitated violence against women and girls: If criminal law can respond, should it? Ottawa Faculty of Law Working Paper No. 2017-44.

Balos, B. (2004). A man's home is his castle: How the law shelters domestic violence and sexual harassment. Saint Louis University Public Law Review, 23(1), 77.

Barratt, S. A. (2018). Reinforcing sexism and misogyny: Social media, symbolic violence and the construction of femininity-as-fail. Journal of International Women's Studies, 19(3), 16–31.

Bates, S. (2016). Revenge porn and mental health: A qualitative analysis of the mental health effects of revenge porn on female survivors. Feminist Criminology, 12(1), 22.

Bellafante, G. (2020, May 29). Why Amy Cooper's use of ‘African-American’ stung. New York Times. Retrieved from https://www.nytimes.com/2020/05/29/nyregion/Amy-Cooper-Central-Park-racism.html

Bonnet, F. (2015). Intimate partner violence, gender, and criminalization: An overview of American debates. Presses de Sciences, 2(56), 357–383.

Burgin, R., & Flynn, A. (2019). Women's behaviour as implied consent: Male “reasonableness” in Australian rape law. Criminology and Criminal Justice. doi:10.1177/1748895819880953

Buturugă v. Romania. (2020). European Court of Human Rights, application no. 56867/15.

Campbell, D. A. (2013). The evolution of sexual harassment case law in Canada. Kingston, ON: Queen's University Industrial Relations Centre.

Canadian Human Rights Act. (1985). RSC, c H-6.

Citron, D. (2014). Hate crimes in cyberspace. Cambridge: Harvard University Press.

Citron, D. (2019). Sexual privacy. The Yale Law Journal, 128(7), 1870–1960.

Conway, M., Scrivens, R., & Macnair, L. (2019). Right-wing extremists' persistent online presence: History and contemporary trends. The Hague: International Centre for Counter-Terrorism.

Delgado, R. (2014). Law's violence: Derrick Bell's next article. University of Pittsburgh Law Review, 75, 435–455.

Dodge, A. (2016). Digitizing rape culture: Online sexual violence and the power of the digital photograph. Crime, Media, Culture: An International Journal, 12(1), 65–82.

Douglas, H., Harris, B. A., & Dragiewicz, M. (2019). Technology-facilitated domestic and family violence: Women's experiences. British Journal of Criminology, 59(3), 551–570.

Dragiewicz, M., Burgess, J., Matamoros-Fernández, A., Salter, M., Suzor, N., Woodlock, D., & Harris, B. (2018). Technology facilitated coercive control: Domestic violence and the competing roles of digital media platforms. Feminist Media Studies, 117.

Duggan, M. (2014). Online harassment. Washington, DC: Pew Research Center.

Dunn, S. (2020). Identity manipulation: Responding to advances in artificial intelligence and robotics. WeRobot 2020 conference paper (pp. 117–140).

Dunn, S., Lalonde, J., & Bailey, J. (2017). Terms of silence: Weaknesses in corporate and law enforcement responses to cyberviolence against girls. Girlhood Studies, 10(2), 80–96.

Fairbairn, J. (2015). Rape threats and revenge porn: Defining sexual violence in the digital age. In J. Bailey & V. Steeves (Eds.), eGirls, eCitizens: Putting technology, theory and policy into dialogue with girls' and young women's voices (pp. 229–252). Ottawa, ON: University of Ottawa Press.

Family Law Act. (2011). SBC, c 25, Part 1 s 1, Part 9 s 183.

Filipovic, J. (2007). Blogging while female: How internet misogyny parallels real-world harassment. Yale Journal of Law and Feminism, 19(1), 295–304.

Follingstad, D., & Rogers, M. J. (2013). Validity concerns in the measurement of women's and men's report of intimate partner violence. Sex Roles, 69(3), 149–167.

Gibson, W. (1984). Neuromancer. New York, NY: Ace Books.

Hall, M., & Hearn, J. (2019). Revenge pornography and manhood acts: A discourse analysis of perpetrators' accounts. Journal of Gender Studies, 28(2), 158–170.

Henry, N., Flynn, A., & Powell, A. (2018). Policing image-based sexual abuse: Stakeholder perspectives. Police Practice and Research: An International Journal, 19(6), 565–581.

Henry, N., Flynn, A., & Powell, A. (2019). Responding to revenge pornography: The scope, nature and impact of Australian criminal laws: A report to the Criminology Research Council. Canberra: Australian Institute of Criminology.

Henry, N., McGlynn, C., Flynn, A., Johnson, K., Powell, A., & Scott, A. J. (2020). Image-based sexual abuse: A study on the causes and consequences of non-consensual nude or sexual imagery. London; New York, NY: Routledge.

Henry, N., & Powell, A. (2015). Embodied harms: Gender, shame, and technology-facilitated sexual violence. Violence Against Women, 21(6), 758–779.

Human Rights Watch. (2013). Those who take us away. New York, NY: Human Rights Watch.

Hunt, E. (2016, July 20). Milo Yiannopoulos, rightwing writer, permanently banned from Twitter. The Guardian.

Jackman, M. R. (2002). Violence in social life. Annual Review of Sociology, 28, 387–415.

Jane, E. A. (2016). Online misogyny and feminist digilantism. Continuum, 30(3), 284–297.

Jones, A. (2000). Next time, she'll be dead: Battering & how to stop it. Boston, MA: Beacon Press.

Kelly, L. (1987). The continuum of sexual violence. In J. Hanmer & M. Maynard (Eds.), Women, violence, and social control (pp. 46–60). London: Macmillan Press.

Kelly, L. (1988). Surviving sexual violence. Cambridge: Polity Press.

Kidd, D., & Turner, A. J. (2016). The #GamerGate files: Misogyny in the media. In A. Novak & I. El-Burki (Eds.), Defining identity and the changing scope of culture in the digital age (pp. 117–139). Hershey, PA: IGI Global.

Koshan, J. (2017). The judicial treatment of marital rape in Canada: A post-criminalisation case study. In M. Randall, J. Koshan, & P. Nyaundi (Eds.), The right to say no: Marital rape and law reform in Canada, Ghana, Kenya and Malawi (pp. 257–298). Portland, OR: Hart Publishing.

Lenhart, A., Ybarra, M., Zickuhr, K., & Price-Feeney, M. (2016). Online harassment, digital abuse, and cyberstalking in America. New York, NY: Data & Society.

Loise Backe, E., Lilleston, P., & McCleary-Sills, J. (2018). Networked individuals, gendered violence: A literature review of cyberviolence. Violence and Gender, 5(3), 135–146.

MacKinnon, C., & Siegel, R. (2004). Directions in sexual harassment law. New Haven, CT: Yale University Press.

Manjoo, R. (2012). The continuum of violence against women and the challenges of effective redress. International Human Rights Law Review, 1(1), 129.

Marwick, A. (2017). Scandal or sex crime? Gendered privacy and the celebrity nude photo leaks. Ethics and Information Technology, 19(3), 177–191.

Massanari, A. (2017). #Gamergate and The Fappening: How Reddit's algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329–346.

Matsuda, M., Lawrence, C., Delgado, R., & Crenshaw, K. (1993). Words that wound: Critical race theory, assaultive speech, and the First Amendment. London: Routledge.

McGlynn, C., & Rackley, E. (2017). Image-based sexual abuse. Oxford Journal of Legal Studies, 37(3), 534–561.

McGlynn, C., Rackley, E., & Houghton, R. (2017). Beyond 'revenge porn': The continuum of image-based sexual abuse. Feminist Legal Studies, 15, 122.

M(MW) v K(JD). (2015). BCPC 315.

Nakamura, L. (2013). “It's a nigger in here! Kill the nigger!” User-generated media campaigns against racism, sexism, and homophobia in digital games. In K. Gates (Ed.), The international encyclopedia of media studies VI: Media studies future (pp. 1–15). Hoboken, NJ: Blackwell Publishing.

Parsons, P., Molnar, A., Dalek, J., Knockel, J., Kenyon, M., Haselton, B., & Deibert, R. (2019). The predator in your pocket: A multidisciplinary assessment of the stalkerware application industry. Toronto, ON: Citizen Lab.

Peterson, J., & Densley, J. (2017). Cyber violence: What do we know and where do we go from here? Aggression and Violent Behavior, 34, 193–200.

Powell, A., & Henry, N. (2017). Sexual violence in a digital age. London: Palgrave Macmillan.

Powell, A., Henry, N., & Flynn, A. (2018). Image-based sexual abuse. In W. S. DeKeseredy & M. Dragiewicz (Eds.), Routledge handbook of critical criminology (2nd ed., pp. 305–315). Abingdon; New York, NY: Routledge.

Quinn, B. (2000). The paradox of complaining: Law, humor, and harassment in the everyday work world. Law & Social Inquiry, 25(4), 1151–1185.

Quodling, A. (2016). Platforms are eating society: Conflict and governance in digital spaces. In A. McCosker, S. Vivienne, & A. Johns (Eds.), Negotiating digital citizenship: Control, contest and culture (pp. 131–146). Lanham, MD: Rowman & Littlefield Publishers.

Renold, E., & Barter, C. (2005). Challenging the normalisation of violence in children's homes from young people's perspectives. In E. Stanko (Ed.), The meanings of violence (pp. 90–111). London: Routledge.

Sanders-Phillips, K. (2009). Racial discrimination: A continuum of violence exposure for children of color. Clinical Child and Family Psychology Review, 12, 174–195.

Silman, A. (2016, August 24). A timeline of Leslie Jones's horrific online abuse. The Cut. Retrieved from https://www.thecut.com/2016/08/a-timeline-of-leslie-joness-horrific-online-abuse.html

Stanko, E. (Ed.). (2005). The meanings of violence. London: Routledge.

Stark, E. (2007). Coercive control: The entrapment of women in personal life. Oxford: Oxford University Press.

United Nations. (2018). Report of the Special Rapporteur on violence against women, its causes and consequences on online violence against women and girls from a human rights perspective. A/HRC/38/47, Human Rights Council, 38th session, 18 June 2018.

United Nations General Assembly. (1979). Convention on the elimination of all forms of discrimination against women. 18 December 1979, A/RES/34/180.

Vera-Gray, F. (2016). Men's intrusion, women's embodiment: A critical analysis of street harassment. New York, NY: Routledge.

Walcott, R., Foster, C., Campbell, M., & Seally, D. (2016). Racial minority perspectives on violence: A report prepared for the review of the roots of youth violence. Ontario: Ministry of Children, Community and Social Services.

Waldman, A. E. (2019). Law, privacy, and online dating: ‘Revenge porn’ in gay online communities. Law & Social Inquiry, 44(4), 987–1018.

Weiss, K. (2009). ‘Boys will be boys’ and other gendered accounts: An exploration of victims' excuses and justifications for unwanted sexual contact and coercion. Violence Against Women, 15(7), 810–834.

X v Y. (2015). BCSC 336.
