Gender as Emotive AI and the Case of Nadia: Regulatory and Ethical Implications



Adams, R., Ni Loideain, N. & Clifford, D. 2021, ‘Gender as Emotive AI and the Case of Nadia: Regulatory and Ethical Implications’, accepted for the Privacy Law Scholars Conference 2021.

One of the most advanced forms of Artificial Intelligence (AI) is emotive or affective technology: AI-driven systems that seek to read and process people’s emotive expressions – whether verbal or behavioral – or that seek to display humanlike emotions. Virtual personal assistants (VPAs) are one advanced AI interface in which emotive technologies are increasingly prevalent as a major design feature. In Australia in 2017, a VPA with emotive capabilities was developed for use by the country’s National Disability Insurance Agency (NDIA). The VPA was named “Nadia” and – following all the major VPAs on the market (Siri, Alexa, Cortana) – was designed with a female voice and a female face.
While the psychological effects of emotive AI are yet to be fully explored and understood, there are – we argue here – clear human rights implications that go beyond a simplistic notion of the right to privacy in terms of personal data protection. The right to privacy is central here, but, as we show, it is compounded by violations of the fundamental rights to equality and human dignity, as set out in international human rights law. This is exacerbated where emotive AI is used with vulnerable groups – in Nadia’s case, persons with disabilities – and where the AI is designed as female, drawing on harmful gender stereotypes.

Accordingly, this article examines the ways in which gender is mobilized as an emotive technology within AI and data processing systems, taking the design and development of Nadia as a case study. In doing so, we offer the first examination of the human rights implications of gendering technology (and gender as a technology) in emotion-inducing AI (‘emotive AI’) and data processing systems and, in particular, of how this works to reproduce social dynamics of power within the welfare state.