
I sistemi robotici ad autonomia crescente tra etica e diritto: quale ruolo per il controllo umano? [Increasingly autonomous robotic systems between ethics and law: what role for human control?]

Amoroso, Daniele
2019-01-01

Abstract

To count as operationally autonomous with respect to a given task, a robotic system must be capable of performing that task without any human intervention after its activation. Recent progress in robotics and AI has paved the way for robots to autonomously perform tasks that may significantly affect individual and collective interests deemed worthy of protection from both ethical and legal perspectives. The present contribution provides an overview of the ensuing normative problems and identifies some ethically and legally grounded solutions to them. To this end, three case studies are scrutinized more closely: increasingly autonomous weapons systems, vehicles, and surgical robots. These case studies illustrate, respectively, the preliminary problem of whether we want to grant certain forms of autonomy to robotic systems, the problem of selecting appropriate ethical policies to control the behavior of autonomous robotic systems, and the problem of how to retain responsibility for the misdoings of autonomous robotic systems. The analysis of these case studies brings out the key role played by human control in ethical and legal problem-solving strategies concerning the operational autonomy of robotic and AI systems.
Year: 2019
Keywords: Autonomous Weapons Systems; Self-driving Cars; Surgical Robots; Deontological Ethics and Consequentialism; International Law
Files in this item:

File: 350-702-1-SM.pdf (open access)
Type: publisher's version
Size: 1.39 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11584/262741
Citations
  • PMC: not available
  • Scopus: 2
  • Web of Science: not available