Artificial Agents in Natural Moral Communities: A Brief Clarification

Bibliographic Details
Published in: Cambridge Quarterly of Healthcare Ethics
Main Author: Tigard, Daniel W. (Author)
Format: Electronic Article
Language: English
Published: Cambridge Univ. Press, 2021
In: Cambridge Quarterly of Healthcare Ethics
Further subjects: human–robot interaction; Artificial Intelligence; Moral Responsibility; machine ethics; Blame; Moral Agency
Online Access: Full text (license required)
Description
Summary: What exactly is it that makes one morally responsible? Is it a set of facts which can be objectively discerned, or is it something more subjective, a reaction to the agent or context-sensitive interaction? This debate gets raised anew when we encounter newfound examples of potentially marginal agency. Accordingly, the emergence of artificial intelligence (AI) and the idea of “novel beings” represent exciting opportunities to revisit inquiries into the nature of moral responsibility. This paper expands upon my article “Artificial Moral Responsibility: How We Can and Cannot Hold Machines Responsible” and clarifies my reliance upon two competing views of responsibility. Although AI and novel beings are not close enough to us in kind to be considered candidates for the same sorts of responsibility we ascribe to our fellow human beings, contemporary theories show us the priority and adaptability of our moral attitudes and practices. This allows us to take seriously the social ontology of relationships that tie us together. In other words, moral responsibility is to be found primarily in the natural moral community, even if we admit that those communities now contain artificial agents.
ISSN: 1469-2147
Contained in: Cambridge Quarterly of Healthcare Ethics
Persistent identifiers: DOI: 10.1017/S0963180120001000