The Ghost in the Machine – Emotionally Intelligent Conversational Agents and the Failure to Regulate ‘Deception by Design’

Authors

  • Pauline Kuss*
  • Ronald Leenes**

* LL.M./Analyst, hy GmbH, Berlin, Germany, paulinekuss@gmx.net. This paper is based on Pauline Kuss, Deception by Design for the Goal of Social Gracefulness: Ethical and Legal Concerns of Humanlike Conversational Agents, Tilburg, 2019.
** Professor in Regulation by Technology, Tilburg Institute for Law, Technology, and Society, Tilburg, the Netherlands, r.e.leenes@tilburguniversity.edu

DOI:

https://doi.org/10.2966/scrip.170220.320

Abstract

Google’s Duplex illustrates the great strides made in AI to provide synthetic agents with capabilities for intuitive and seemingly natural human-machine interaction, fostering a growing acceptance of AI systems as social actors. Following BJ Fogg’s captology framework, we analyse the persuasive and potentially manipulative power of emotionally intelligent conversational agents (EICAs). By definition, human-sounding conversational agents are ‘designed to deceive’. They do so on the basis of vast amounts of information about the individual they are interacting with. We argue that although the current data protection and privacy framework in the EU offers some protection against manipulative conversational agents, the real upcoming issues are not yet acknowledged in regulation.

Published

06-Aug-2020

Section

Research Article