Technology-Facilitated Abuse Gets Smarter: Evolving Power Dynamics and Consequences for Sexual Violence Policies
Patel T
Published on: 2021-08-24
Abstract
The Internet of Things (IoT) has been described as the driver behind the fourth industrial revolution, characterized by an increasing number of mundane objects being connected to the internet so that users can control them remotely. This paper focuses on the uncertainties created by the software developers of these products, and the impact these uncertainties have on survivors of technology-facilitated domestic abuse. Drawing on a series of focus groups with front-line workers in women’s support services, many of whom are themselves previous victims of abuse, I argue that these uncertainties in IoT device capabilities allow abuse to shift away from forms aligned with punishment towards forms more closely aligned with discipline. I demonstrate that when care for vulnerable people needs to be delivered on an individual scale, legislation is often experienced as compartmentalized and fails to reflect the social forces exerted on the individual, leading to a sense of injustice. Policies created to regulate technology companies allow them to avoid liability for users’ actions, while coercive control legislation holds only perpetrators to account without taking the wider abuse dynamic into consideration. Drawing on survivors’ perspectives, this paper calls for the policy-making system to be changed.
Keywords
Technology-Facilitated Abuse; Domestic Violence; Domestic Abuse Legislation
Introduction
Technology-facilitated abuse is now widely prevalent within abusive relationships. Common methods include GPS tracking and stalking on social media, and more recently smart home devices have become more prominent. Technology-facilitated abuse is often present alongside other forms of psychological, physical or sexual abuse [1]. The increasingly widespread use of internet-connected devices provides a means for easier and repeated control, coercion, harassment and stalking from afar. Many survivors, however, do not perceive it as abuse or controlling behavior [2]. Technology-facilitated abuse often falls under coercion, defined as “the use of force or threats to compel or dispel a particular response”, and control, defined as “structural forms of deprivation, exploitation, and command that compel obedience indirectly”; when they occur together, the result is a “condition of unfreedom” [3]. Controlling or coercive behavior became a criminal offence in England and Wales in 2015 [4], followed by Scotland and the Republic of Ireland in 2018 [5], and Northern Ireland in 2019 [6]. Prior to this there existed a gap between survivors’ experiences and the legal response, the latter of which had focused on discrete, injurious assaults, a framing considered too narrow to capture a growing evidence base of controlling and coercive behavior. In 2012 the British Home Office conducted a public consultation to identify the best framework for coercive control, which led to the new legislation with the support of evidence-based scientific research [7].

Internet of Things (IoT) devices are often sold with a vision of a life in which small, daily tasks are made simpler and easier with the assistance of these devices. What is envisioned and sold is a utopia in which each individual’s busy life is alleviated through the help of a smart assistant which connects to, and controls, other objects within the home, intrinsically leading to benefits for the user. Alongside their increasing availability, imaginaries of these devices’ capabilities are becoming more prevalent within society, as marketing is based on visions of the future. Media accounts of what these devices are capable of, however, are less positive: stories of devices being used to collect data about people, and uncertainty as to what this data is being used for, are leading to mistrust of the devices and to imaginaries of how the collected data might be used [8].

IoT devices are built around the concept of ‘getting to know’ the user(s) in order to learn voices, preferences, routines and behaviors. Machine learning methods are used to create devices which are sold on the assumption that they will become more compatible with the user as they ‘get to know’ their habits and routines [9]. The nature of devices which change, both as they ‘get to know’ the user(s) and through software updates, can however lead to further uncertainties around device capabilities. An increasing number of questions about personal data collection and use, together with observations that these devices are not without software faults, has led to reports of devices acting in unexpected ways [10]. These narratives and media accounts further reinforce the uncertainties around device functionalities. In the case of domestic abuse and coercion, such technological uncertainties can lead to the survivor self-regulating their behavior.
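The ‘getting to know the user’ mechanism described above can be made concrete with a small sketch. The following toy example is purely illustrative, an assumption for exposition rather than code from any real product or from the study itself: it shows how logged usage data can turn into autonomous device behavior, which is exactly the kind of capability shift that remains opaque to anyone sharing the home.

```python
# Purely illustrative toy sketch (not drawn from any real product): a
# "smart" thermostat that logs when its user turns the heat on, then
# begins pre-empting them. Once behavior is derived from accumulated
# usage data, the device's actions are no longer directly traceable to
# any single command - the source of the uncertainty discussed above.
from collections import Counter

class ToyThermostat:
    def __init__(self):
        # hour of day -> number of times the user manually turned heat on
        self.heat_on_hours = Counter()

    def record_manual_heat_on(self, hour: int) -> None:
        """Log a manual 'heat on' action; this is the data being collected."""
        self.heat_on_hours[hour] += 1

    def should_preheat(self, hour: int, threshold: int = 3) -> bool:
        """After enough observations, the device acts without being asked."""
        return self.heat_on_hours[hour] >= threshold

# After a few logged evenings, the device starts acting 'on its own':
t = ToyThermostat()
for _ in range(3):
    t.record_manual_heat_on(18)   # user turns the heat on at 6 pm
print(t.should_preheat(18))       # True: behavior learned from usage data
print(t.should_preheat(8))        # False: no pattern observed at this hour
```

Even in this deliberately simple sketch, a second occupant of the home has no way to tell from the device’s behavior whether it is acting on learned data, on a software update, or on a remote command, which is the ambiguity the focus group accounts below turn on.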
A common method of abuse is gaslighting, a term originating from Patrick Hamilton’s 1938 play Gaslight, in which a husband seeks to convince his wife and others that she is mentally ill by manipulating elements of the environment in their home and insisting she is mistaken, incorrectly remembering things or delusional [11]. In this form of gaslighting, however, the survivor is correct about the change in environment; the perpetrator must convince the survivor and others of the survivor’s ill mental health. In the case of smart home devices, the imaginary created around their capabilities not only causes uncertainties but also leads survivors to make false claims. It is such false claims which may cause them to appear delusional. The uncertainty removes the perpetrator’s need to convince the survivor and others of the survivor’s ill mental health; instead, the imaginary itself achieves this.

This paper focuses on the findings of a project examining the prevalence of technology-facilitated abuse in coercive relationships. The project ran workshops with focus groups comprised of women working in support services for women and girls experiencing domestic and sexual abuse, as well as coercive control. Their perspectives were sought so that the university could aid in creating better policies designed with their input, aimed at alleviating the situation for these vulnerable groups whilst ensuring their perspectives were taken into account. From the data presented here, I argue that the uncertainties in the technological capabilities of these new devices are allowing a transition away from mechanisms of punishment towards discipline and self-regulation, and I explore the effect this has on abusive relationships mediated through these devices. As it is technology companies who collect the data, as well as perpetrators, the focus groups reveal that the abuser is becoming a composite of the technology companies, the perpetrators themselves, and institutions often framed as supportive but seen here as authoritative bodies which lack understanding at the level of the individual. The latter are authoritative institutions which base much of their work on policies informed by scientific research and the production of knowledge, itself based on the collection of data through evidence-based social science research [12]. The findings highlight a need for legislation on sexual violence to take a whole-system approach.
Methods
A series of four focus group discussions was conducted with front-line women’s support workers, the first point of contact for women experiencing domestic abuse, working within organizations which are part of the London Violence against Women and Girls Consortium, in order to understand their views on these emerging Internet of Things technologies as tools of abuse. Each focus group consisted of five participants, coordinated by one facilitator. The consortium is made up of twenty-nine organizations, of which fourteen represent ethnic minority women, and all represent women, or women and girls. Ethnic minority groups were strongly represented during the workshops, with twenty of the thirty participants coming from charities supporting ethnic minority communities. Audio recordings were transcribed by a transcription company, then anonymized and analyzed by the author. Conversation primarily focused on survivors’ experiences of technology-facilitated abuse, potential benefits of the increasing use of internet-connected devices within the home, policies created to protect victims of abuse, and finally the relationship with other support organizations such as the police.
Results & Discussion
Within this section the panopticon is used as a metaphor to explain the situation many survivors of abuse find themselves in when dealing with the social dynamics at play in an abusive relationship. Jeremy Bentham devised the panopticon as a system to control prisoners [13]. It allows for the observation of all prisoners by a single security guard, without any single inmate being able to tell whether they are being observed. Although the guard cannot see all the prisoners at once, the prisoners do not know who is being observed at any one time, and so act to self-regulate their behavior. The scheme itself consisted of a circular building with an array of cells around the edge and a guard in a watchtower at the center. Each cell was partitioned with a divider wall so prisoners would be unable to observe those next to them. On the inner wall facing the watchtower, as well as the outer wall leading to the outside world, clear windows were used so that the inmates could be observed. Light is shone into the panopticon from outside so that the prisoner may be seen by the watchman, but not vice versa. Bentham envisioned this arrangement being used primarily in prisons but also in schools and hospitals. The intention was to create a prison cheaper than those of the time, requiring fewer staff; furthermore, the prisoners would provide menial labor, reducing the running costs of the prison itself. The design never came to fruition but has been borrowed, most notably by Michel Foucault, as a metaphor for the disciplinary mechanisms of societies and their inclination towards observation and normalization of individuals [14]. He claims all hierarchical structures, such as the army, schools, hospitals and factories, are designed to resemble the panopticon. His argument was that authority, represented by laws and institutions, is internalized by citizens, and that the power it holds is derived from the observation of citizens and the collection of their data.

The ambiguity of these devices adds to the uncertainties a victim of domestic abuse and violence is already experiencing. Accounts of doubt in a person’s own beliefs about what they are experiencing are prevalent in the focus groups. The initial uncertainty around device capabilities and functionalities adds to the doubt the survivor experiences and, as the device ‘learns’, its functionalities may change, adding a further layer of uncertainty which varies over time. As noted by a front-line worker from a women’s support organization: “If somebody's in a situation where they're kind of doubting themselves anyway this is just gonna exacerbate things, isn't it? Sometimes I feel a lot of paranoia plays into it and it's hard for me to separate the paranoia to what can really be happening.” Further, as this support worker shows, there is a need for the support services themselves to separate paranoia from what may really be happening. A development in gaslighting is being achieved for exactly this reason: the false claims which may arise from uncertainties in device capabilities remove the need for the perpetrator to convince others of the survivor’s delusion. People are forced to share their data and preferences with manufacturers until the device has ‘learnt’ the user and optimized itself for them.
As a women’s support front-line worker noted, “Yeah, they own your information, yes, and they have the right to own the information because we use it.” The point is to make users feel that they must share their data in order to create a perfect device and thus a utopian life. Ambiguity is used as a tool to keep alive a vision of utopia gained through technological devices; interminable ambiguity has an immediate value in ensuring users continue to share their data. Within this, questions of power arise. Within the narratives around domestic abuse there is often a sense of ‘us vs. them’, usually referring to victims vs. perpetrators; in this case, however, the ‘other side’ becomes both the software development company and the perpetrator, so the distinction becomes ambiguous from the perspective of the survivor. The regulation of behavior is being enforced by both the perpetrator and the software development company. Technology-facilitated abuse is allowing a transition in abuse mechanisms, away from physical punishment towards discipline and self-regulation. The technology allows power to be exerted passively from perpetrator to victim; power exists even when it is not being realized in action. The panoptic metaphor may be used to describe this transition: Foucault describes the self-regulation which arose alongside the Enlightenment, where with perceived freedom came increased surveillance. From the focus groups, there is also evidence of positive aspects to technology, which increases survivors’ ability to collect evidence and, in one case, allowed a survivor to become a perpetrator. The participants acknowledged power to be at the core of many of the problems survivors experience: “with us, it's always about stripping it back to what the basis of all those forms of abuse are about, which is often about imbalance of power.” “I feel like this [IoT] is just gonna give them [perpetrators] even more of an opportunity to manipulate in a completely different way and like kind of mind-fuck in a way which can be worse than physical at times.”

The panopticon relies on its metaphorical divider walls to stop voices rising in collective action. It was found that institutions are exerting structural power on survivors, leading them not to come forward to disclose their experiences. A general sense of injustice was prevalent within minority communities. Mistrust was highlighted towards child social services: survivors feared their children would be taken away, and in particular a lack of understanding of how children are raised in different cultures was noted. There was a preference to ‘keep it [cases of abuse] within the community’, both so as not to have their children raised in a different culture to their own and to protect the image of the community within wider society. “They say because they're going to report it to social services. Knowing the fear the family have when it comes to interference with social services or with the authority, that's, I think, what encourages them not to talk about it and, and sort of they support each other and they sort of put that blockage for that individual who wants to go forward, to say don't do it.” Mistrust towards charities was also exhibited, as fear of their links to the Home Office had the potential to make migrants or refugees feel further vulnerability; finally, mistrust extended to the university itself, seen as the initial creator of these technologies.
“So, there's this organization [X; organization], they worked with the Home Office...so that they [victims] could be deported which is really unethical.” “So who are these people that are doing IoT like who's behind this? [Laughs]. The University.” The police are also seen as exercising power over victims. Mistrust in the police was observed, driven by a perceived lack of understanding of coercion as a form of abuse. With the increasing possibilities of imaginaries of device capabilities, there would be an increase in ‘false’ reports by victims: reports which are proven to be false because the devices are unable to perform the actions victims claim they can. The survivor appears delusional, and the device-capability imaginary allows already prevalent gaslighting methods to be further exploited. With the victim’s account falsified, and police forces unwilling to act without physical evidence of coercion, the survivor begins to self-regulate. As a front-line worker notes: “turning the heating up really high, turning it down low. Making that person feel as though they're going crazy because who's to prove that it is that person remotely controlling those lights and that fridge, and the police aren't going to believe them, they're going to think they're crazy.” The imagined possibility of data being collected and obtained by the perpetrator, without this ability being reciprocal, allows for self-regulation through the fear of being observed: “a part of it is that it's almost like the perpetrator has the privacy but the victim doesn’t”. In cases where technology is being used as a method to record evidence, the technology itself is framed as empowering; it is beginning to be seen as a double-edged sword. On the one hand it makes user groups more vulnerable and exposed, but on the other this exact feature allows survivors to use technology to gather evidence in order to prove their claims and seek justice. “But just last week, one of our clients used her smart phone to - while the perpetrator, the husband, put the - latex gloves on, er, on his hand. Got her to the, er, bathroom, tried to kill her. She used the gadget and make the audio record and also take the picture and they - before that, she went a couple of times to the police, no one believed her, and finally they believed her!” And finally, technology is being used and framed as empowering to the point that a survivor eventually becomes a perpetrator: “someone who was, er, openly detailing how she stalked her ex-boyfriend's new partner online and this woman still no idea that this woman is repeatedly checking up on her...She's monitoring her completely and she recognizes that's really unhealthy but she finds it really difficult to stop and you're sort of saying well how - where do we draw a line there because actually recognize that's really unhealthy behavior but a lot of young people would be like, well she's not harming her. She's not doing anything to her, so why is it bad?”
Conclusions
Due to the rise of internet-connected devices in the home, technology-facilitated abuse is becoming more prevalent, and health care policy and public health programs need to become more aware of this issue. Findings from this study provide insight into the heightened risk of coercive control, as well as into new institutions taking the role of objects to be fought against: perpetrators, policy-enforcing institutions and technology companies. The ambiguities surrounding these emerging technologies are shaping new power relations within society, and the assumption of trust is being threatened. Technology companies sell these items as objects of convenience, but convenience for what? This study examined the new narratives being created around the identity of the abuser in coercive relationships, and in particular how new power dynamics come into play with emerging technologies: the side to be fought against now comprises the perpetrator, the authoritative institutions and the technology companies, all seen as figures collecting data on the individual in order to coerce and control. The concept of convenience is sold to consumers, but questions are now raised about whether that convenience serves the consumer’s life, or the collection of an individual’s data for use by an unknown establishment. The transition to self-regulation is achieved through technological uncertainties. Technology is also framed as ‘empowerment’, but with several power holders and varying dynamics at play, empowerment is being achieved by all on the ‘other side’ whilst the survivor resorts to turning the tools of oppression to their own advantage, in effect creating a system in which users are increasingly reliant upon these emerging technologies. As this takes place, support services and networks need to be more aware of these increasing threats. Policies concerning coercive control may need to turn their attention towards the new power dynamics in order to ensure survivors’ experiences are taken seriously. In addition, producers of knowledge for scientific governance also need to be aware of the power dynamics at play, as survivors feel there is a greater need for care given on an individual basis in place of generalized policies.
Challenges of Technology Company Governance
The governance of technology companies in this new, complex power dynamic concerning IoT devices depends upon what is constituted as private or public communication and data, and on the possibilities and affordances offered by the software. These include the ability to coerce users, to spread and share data, and to report and/or block other users. Telecommunication providers are often the intermediaries through which abuse takes place, and there has thus been increasing pressure on these companies to improve their responses to the abuse and harassment they facilitate [15]. In terms of policy, digital intermediaries such as software developers present themselves as passive facilitators of communications, using this framing to limit their responsibility for users’ actions [16]. Those based in the United States are shielded from liability for users’ behavior under the Communications Decency Act; outside the US, however, legal immunity is not as strong [17]. A liberal view of freedom of behavior allows platforms the freedom to enforce their own rules whilst simultaneously delegitimizing external regulation [18]. The lack of legal mechanisms through which technology companies are held to account poses a greater problem for those offering and seeking support. Social pressure is growing, but with survivors often still not perceiving technology-facilitated abuse as a dynamic within the abuse network, and thus failing to disclose it, issues persist, and regulatory frameworks still need to be regularly amended within this dynamic network.
Challenges for Producers of Knowledge for Scientific Governance
The accounts from front-line support services show survivors feel discontent towards those framed as offering support from wider society, such as charities and government institutions including the British Home Office. This stems from a misinterpretation of survivors’ life experiences, as well as of their cultures. These misinterpretations lead to disjointed regulations and policies which feel disconnected from the individual. They result from research conducted on minority groups by universities, which in turn calls into question the associated ethics and human values. The framework of Responsible Research and Innovation encompasses dimensions through which more ‘responsible’ research may be conducted. It frames the concept of institutional reflexivity as holding a mirror up to one’s own activities [19]. Ethically, this would lead researchers to be more mindful that their framing of an issue may not be universally shared by all participants, which is especially important for already vulnerable populations. Furthermore, the general sense that minority groups preferred not to be victimized or seen as culturally alien has led to great mistrust towards authoritative institutions, and has its roots in the production of knowledge for policy making. In health research and policy, race has historically been treated as a biological concept which categorizes people according to their phenotypical features, on the basis that biological attributes such as skin color or facial features are inextricably linked to behavioral differences between racial groups. The scientific basis of this work has been traced back to eugenics, long rejected by the scientific community [20]. As the concept of ‘race’ was discredited by the scientific community, ethnicity became increasingly used in the literature [21]. In contrast to ‘race’, a concept which indicates differences between groups based solely on the way they look and is imposed by an outside observer, ethnicity indicates self-definition and group identification defined from within, usually centered on traditions and culture. In much of the health research on minority groups, however, ethnicity is treated as fixed and static, a characteristic which can be objectively measured [22]. In addition, the literature often classifies minority ethnic communities, rather than those racialized as ‘white’, as the ‘problem’, their behavior being more heavily monitored than that of their ‘white’ counterparts. The ethnic minority individual is often seen as deviant in their behavior and a problem to be fixed, rather than the system being seen as something which needs to adapt to the needs of different individuals [23]. This is because an easily observed ethnicity is taken as the underlying factor explaining inequalities. As this research has highlighted, a whole-system approach is required to understand the power dynamics at play for survivors of abuse, which calls into question how the production of scientific policies may need to change.
References
- De Cremer D, Nguyen B and Simkin L. The integrity challenge of the Internet-of-Things (IoT): On understanding its dark side. Journal of Marketing Management. 2016; 33: 1-2.
- Marganski A and Melander L. Intimate Partner Violence Victimization in the Cyber and Real World: Examining the Extent of Cyber Aggression Experiences and Its Association With In-Person Dating Violence. Journal of Interpersonal Violence. 2018; 33: 1071-1095.
- Woodlock D. The Abuse of Technology in Domestic Violence and Stalking. Violence Against Women. 2016; 23: 584-602.
- Stark E. Coercive Control: How Men Entrap Women in Personal Life. 1st edition. Oxford: Oxford University Press. 2007.
- Stark E. From domestic violence to coercive control in the United Kingdom. Domestic Violence Report. 2016; 21: 23-26.
- Burman M and Brooks-Hay O. Aligning policy and law? The creation of a domestic abuse offence incorporating coercive control. Criminology & Criminal Justice. 2018; 18: 67-84.
- Douglas H. Do we need a specific domestic violence offence? Melbourne University Law Review. 2015; 39: 434-471.
- Toussaint W and Ding A. Machine Learning Systems in the IoT: Trustworthiness Trade-offs for Edge Intelligence. Proceedings of the Second International Conference on Cognitive Machine Intelligence. 2020.
- Gaslight (1944 film). 2020.
- Bentham J. The Panopticon Writings. 1st edition. New York: Verso. 2010.
- Foucault M. Discipline and Punish. 1st edition. New York: Pantheon Books. 1977.
- United Nations Development Programme and UN Women. Combatting Online Violence against Women & Girls: A Worldwide Wake-Up Call: Highlights. 2015.
- Gillespie T. The Politics of Platforms. New Media & Society. 2010; 12: 347-364.
- Suzor N. Digital Constitutionalism: Using the Rule of Law to Evaluate the Legitimacy of Governance by Platforms. Social Media + Society. 2017; 4: 1-11.
- Stilgoe J, Owen R and Macnaghten P. Developing a framework for responsible innovation. Research Policy. 2013; 42: 1568-1580.
- Templeton A. Biological Races in Humans. Studies in History and Philosophy of Biological and Biomedical Sciences. 2013; 44: 262-271.
- Jenkins R. Social anthropological models of inter-ethnic relations. In: Theories of Race and Ethnic Relations. 1st edition. Cambridge: Cambridge University Press. 1986; 170-186.
- Van den Berghe P. Ethnicity and the sociobiology debate. In: Theories of Race and Ethnic Relations. 1st edition. Cambridge: Cambridge University Press. 1986; 246-263.
- Wilkinson D and King G. Conceptual and Methodological Issues in the Use of Race as a Variable: Policy Implications. The Milbank Quarterly. 1987; 65: 56-71.
- Anderson B. Imagined Communities. New York: Verso. 1983.
- Baker J. Race. 1st edition. Oxford: Oxford University Press. 1974.
- Stark E and Hester M. Coercive Control: Update and Review. Violence Against Women. 2019; 25: 81-104.
- Van der Molen F. How knowledge enables governance: The coproduction of environmental governance capacity. Environmental Science and Policy. 2018; 87: 18-25.