Law, Sex and Technologies

Please check out our Outputs page for the results and findings of this study.

About

Over the past two decades, there has been a rapid expansion of technologies that enhance or facilitate sexual experience or intimacy. These include digital technologies, such as smartphones and dating apps; mechanical technologies, such as digitally connected vibrators; and medical technologies, such as surgical processes and pharmaceutical products.

The sexualised use of technologies has highlighted potential legal complexities and safety concerns. Existing legal frameworks may be inadequate to respond to issues of consent, confidentiality, privacy, surveillance, harassment, ownership and control that have emerged due to new sex technologies.

In this project, we are analysing the law and regulation of sex technologies in Australia and the UK. We aim to identify and map existing policies, laws and other regulatory frameworks that may be relevant to sex technologies, with a view to understanding potential gaps or barriers in the law.

As part of this project, we will also identify where reform may be needed to allow legal systems to better accommodate and respond to sex technologies.

Case Study 1: Law and deepfake pornography

The sexualised use of deepfakes – AI technology that creates hyper-realistic images and videos of individuals saying and doing things that they do not say or do – has potential moral, social and legal consequences for adults residing in Australia and the UK.

In recent years, the use of deepfake technology has grown across a number of public and political domains. In particular, it has been used for the creation of pornography, often without the consent or knowledge of the individuals depicted in the images or videos.

As things stand, the law’s capacity to restrict or prohibit the online distribution of such non-consensual images may be limited, given that distribution occurs rapidly, often across jurisdictional boundaries, and via diverse social media platforms.

A key research question in this study is the extent to which the law should intervene to restrict, prohibit or, conversely, protect the dissemination of deepfake pornography.

However, there may be other circumstances in which the creation of deepfake pornography is consensual, offering the potential to create new sexual experiences for certain groups, such as people with disabilities. In such circumstances, we may need to consider to what extent the dissemination of such images, and/or the groups who create and use them, should be offered some degree of protection under the law.

This research strand investigates potential legal concerns and risks associated with deepfake pornography in Australia and the UK. As part of this research, we are analysing current legal and regulatory frameworks concerning the creation and dissemination of such images, as well as considering options for policy and law reform.  

Case Study 2: Law and sex technologies in the time of Covid-19

As part of the Sex and Intimacy During Covid-19 study, we will be analysing how people perceive legal risks while using sex technologies to establish or maintain sexual connections, relationships and experiences during the coronavirus pandemic. This research will be informed by a national survey of adults residing in Australia.

Further details to follow.

Case Study 3: AI and the use of sexbots

The sex technology industry is a multi-billion-dollar industry. As the use of AI becomes more common and more advanced, there are concerns that these technologies will present a number of legal and ethical risks. This case study will explore one of those potential risks: sexbots. Sexbots, otherwise termed ‘robotic sex dolls’, have mostly been designed for the male market and include ‘Harmony’ and ‘Samantha’, amongst others. These AI sex dolls can be customised to suit the customer’s ‘ideal’ partner, including eye colour, hair colour and skin colour, as well as possessing a specific personality.

Like the sexualised use of deepfakes, sexbots raise significant ethical, social and legal questions. Despite these concerns, the limitations of the law in regulating the adult sexbot industry have been noted: what ‘harm’ or ‘wrong-doing’ is actually being caused or committed? Scholars have argued that the existence of adult sexbots promotes the sexual objectification of women and can encourage unhealthy and abusive sexual relations; for example, sexbots can be designed to allow an individual to act out rape fantasies.

There may, however, be circumstances in which the use of sexbots is viewed as ‘beneficial’ or less ‘harmful’, for example for medicinal purposes (such as the treatment of sexual dysfunction) or for companionship for the elderly. In such cases, is there room for the law to protect the manufacture and sale of adult sexbots within certain contexts? This case study will test existing theories against an ethical and legal framework.

In sum, Case Study 3 presents a regulatory research question similar to that raised by sexualised deepfakes: when should the law intervene to restrict or, conversely, protect the design, manufacture and use of adult sexbots?

Case Study 4: The regulation of online safety in Australia and the UK

The law has traditionally been slow and reactive in recognising harmful behaviour online, such as sexual abuse, violence, stalking and threats, as well as the misuse of sex technologies. However, the online safety of those considered to be at particular risk – women, girls, women of colour and sexual minorities – has recently become a political priority. For example, the Online Safety Act 2021 (Cth) in Australia passed both houses of the Federal Parliament in June 2021 and will come into effect in January 2022. The Act extends the powers of the Office of the eSafety Commissioner and establishes or expands regulatory powers for removing cyber-abuse, image-based abuse and harmful online content. In the UK, the draft Online Safety Bill 2021 is currently before the UK Parliament and is likely to come into force in 2022.

Drawing on an examination of recent online safety laws in Australia and the UK, this case study examines how we should position the regulation of digital sex technologies in the context of such developments. In doing so, we critically interrogate the use of risk as the dominant legal paradigm and explore how notions of sexual autonomy and consensuality in sexual relations should inform a principled approach to the regulation of such technologies.

Further Information

Where can I read the findings?

We will publish results from this research in reputable Australian and international law journals, and a summary of our research on this website. Links to publications resulting from this research will be made available on the Research Outputs page. You are welcome to join the ARCSHS mailing list to be kept up to date with the latest ARCSHS research.

How can I find out more about the law strand of the Tech-Sex project?

If you have any further questions about the law strand of the Tech-Sex project, you can contact Ms Nicole Shackleton, La Trobe Law School (n.shackleton@latrobe.edu.au) [lead, Australian law research] or Professor Anne-Maree Farrell, Chair of Medical Jurisprudence, Edinburgh Law School (A.Farrell@ed.ac.uk) [lead, UK law research; overall project lead for the law strand]. You can also visit the law strand website. If you have any questions about the project as a whole, please Contact us.