May-Chahal, Corinne and Deville, Joe and Moffat, Luke and Guo, Weisi and Tang, Yun and Tsourdos, Antonios (2024) Encoding Social & Ethical Values in Autonomous Navigation: Philosophies Behind an Interactive Online Demonstration. In: Second International Symposium on Trustworthy Autonomous Systems (TAS '24). Communications of the ACM. ACM, USA, pp. 1-9. (In Press)
Abstract
The interaction of Autonomous Systems (ASs) with human societies raises complex social and ethical challenges. This paper argues that one way of scaffolding human trust in ASs is to encode ethical, legal and social impact (ELSI) considerations in the ASs' decision-making processes. Existing ELSI-encoding efforts often focus on the implementation of rule-based and risk-based approaches, leaving key questions unanswered: what are the relationships between ELSI-encoding software logic in ASs and human ethical practices; which ethical approaches cannot be easily translated into software rules and numeric risks; and what are the implications of this for ethical ASs? To answer these questions, we review and discuss different ELSI-encoding approaches in ASs from a new perspective, namely their relationships with classic philosophies of human ethics. We also explore the feasibility of large language model (LLM)-based ELSI-encoding practices in overcoming the limitations of rule-based and risk-based approaches, and the associated challenges. To foster understanding, facilitate knowledge exchange and inspire discussion among cross-disciplinary research communities, we build and publish the first online interactive playground demonstrating different ELSI-encoding approaches on the same AS decision-making process. We welcome feedback and contributions in making this platform truly beneficial to trustworthy autonomous systems research communities.
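For readers unfamiliar with the distinction between the two encoding styles the abstract mentions, the following minimal Python sketch (not taken from the paper or its online playground; all route names, risk figures, thresholds and weights are invented for illustration) shows how a rule-based encoding and a risk-based encoding of the same social consideration can lead to different choices on a toy navigation decision:

```python
# Hypothetical illustration only: a rule-based vs. a risk-based way of
# encoding an ELSI consideration (avoiding a school zone) in route choice.
from dataclasses import dataclass


@dataclass
class Route:
    name: str
    crosses_school_zone: bool   # the social/ethical consideration
    collision_risk: float       # assumed probability of harm, 0..1
    travel_time_min: float


def rule_based_choice(routes):
    """Deontological-style encoding: a hard rule filters out impermissible options."""
    permitted = [r for r in routes if not r.crosses_school_zone]
    # Among permitted routes (or all routes if none are permitted), pick the fastest.
    return min(permitted or routes, key=lambda r: r.travel_time_min)


def risk_based_choice(routes, harm_weight=100.0):
    """Consequentialist-style encoding: trade numeric risk off against travel cost."""
    def expected_cost(r):
        return r.travel_time_min + harm_weight * r.collision_risk
    return min(routes, key=expected_cost)


if __name__ == "__main__":
    routes = [
        Route("via school zone", True, 0.002, 8.0),
        Route("ring road", False, 0.004, 12.0),
    ]
    print("rule-based picks:", rule_based_choice(routes).name)   # ring road
    print("risk-based picks:", risk_based_choice(routes).name)   # via school zone
```

With these invented numbers the two encodings disagree: the hard rule forbids the school-zone route outright, while the risk-weighted cost still favours it, which is the kind of divergence the paper's interactive demonstration is designed to make visible.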