Moral Agency, Moral Responsibility, and Artifacts: What Existing Artifacts Fail to Achieve (and Why), and Why They, Nevertheless, Can (and Do!) Make Moral Claims Upon Us
(2014) In International Journal of Machine Consciousness 6(2).
- Abstract
- This paper follows directly from an earlier paper where we discussed the requirements for an artifact to be a moral agent and concluded that the artifactual question is ultimately a red herring. As before, we take moral agency to be that condition in which an agent can appropriately be held responsible for her actions and their consequences. We set a number of stringent conditions on moral agency. A moral agent must be embedded in a cultural and specifically moral context and embodied in a suitable physical form. It must be, in some substantive sense, alive. It must exhibit self-conscious awareness. It must exhibit sophisticated conceptual abilities, going well beyond what the likely majority of conceptual agents possess: not least that it must possess a well-developed moral space of reasons. Finally, it must be able to communicate its moral agency through some system of signs: a “private” moral world is not enough. After reviewing these conditions and pouring cold water on recent claims for having achieved “minimal” machine consciousness, we turn our attention to a number of existing and, in some cases, commonplace artifacts that lack moral agency yet nevertheless require one to take a moral stance toward them, as if they were moral agents. Finally, we address another class of agents raising a related set of issues: autonomous military robots.
Please use this URL to cite or link to this publication:
https://lup.lub.lu.se/record/4496359
- author
- Parthemore, Joel LU and Whitby, Blay
- organization
- publishing date
- 2014
- type
- Contribution to journal
- publication status
- published
- subject
- keywords
- moral agency, moral stance, responsibility, concepts, consciousness, autopoiesis
- in
- International Journal of Machine Consciousness
- volume
- 6
- issue
- 2
- publisher
- World Scientific Publishing
- external identifiers
- scopus:84906890413
- ISSN
- 1793-8430
- DOI
- 10.1142/S1793843014400162
- language
- English
- LU publication?
- yes
- additional info
- http://www.worldscientific.com/loi/ijmc
- id
- c9637d84-6e94-4c3f-8ec9-8bfbce7690ec (old id 4496359)
- date added to LUP
- 2016-04-01 11:03:35
- date last changed
- 2023-11-25 00:19:56
@article{c9637d84-6e94-4c3f-8ec9-8bfbce7690ec,
  abstract  = {{This paper follows directly from an earlier paper where we discussed the requirements for an artifact to be a moral agent and concluded that the artifactual question is ultimately a red herring. As before, we take moral agency to be that condition in which an agent can appropriately be held responsible for her actions and their consequences. We set a number of stringent conditions on moral agency. A moral agent must be embedded in a cultural and specifically moral context and embodied in a suitable physical form. It must be, in some substantive sense, alive. It must exhibit self-conscious awareness. It must exhibit sophisticated conceptual abilities, going well beyond what the likely majority of conceptual agents possess: not least that it must possess a well-developed moral space of reasons. Finally, it must be able to communicate its moral agency through some system of signs: a “private” moral world is not enough. After reviewing these conditions and pouring cold water on recent claims for having achieved “minimal” machine consciousness, we turn our attention to a number of existing and, in some cases, commonplace artifacts that lack moral agency yet nevertheless require one to take a moral stance toward them, as if they were moral agents. Finally, we address another class of agents raising a related set of issues: autonomous military robots.}},
  author    = {{Parthemore, Joel and Whitby, Blay}},
  issn      = {{1793-8430}},
  keywords  = {{moral agency; moral stance; responsibility; concepts; consciousness; autopoiesis}},
  language  = {{eng}},
  number    = {{2}},
  publisher = {{World Scientific Publishing}},
  series    = {{International Journal of Machine Consciousness}},
  title     = {{Moral Agency, Moral Responsibility, and Artifacts: What Existing Artifacts Fail to Achieve (and Why), and Why They, Nevertheless, Can (and Do!) Make Moral Claims Upon Us}},
  url       = {{http://dx.doi.org/10.1142/S1793843014400162}},
  doi       = {{10.1142/S1793843014400162}},
  volume    = {{6}},
  year      = {{2014}},
}