The morality of abusing a robot

It is not uncommon for humans to exhibit abusive behaviour towards robots. This study compares how abusive behaviour towards a human is perceived relative to identical behaviour towards a robot. We showed participants 16 video clips that depicted different levels of violence and abuse. For each video, we asked participants to rate the moral acceptability of the action, the violence depicted, the intention to harm, and how abusive the action was. The results indicate no significant difference in the perceived morality of the actions shown in the videos across the two victim agents. When the agents started to fight back, however, their reactive aggressive behaviour was rated differently: humans fighting back were seen as less immoral than robots fighting back. A mediation analysis showed that this was predominantly because participants perceived the robot's response as more abusive than the human's response.

Bibliographic Details
Main Authors: Christoph Bartneck, Merel Keijsers (HIT Lab NZ, University of Canterbury, Christchurch, New Zealand)
Format: Article
Language: English
Published: De Gruyter, 2020-06-01
Series: Paladyn: Journal of Behavioral Robotics
ISSN: 2081-4836
Subjects: abuse; robots; human; morality; perception
Online Access: https://doi.org/10.1515/pjbr-2020-0017