LUP Student Papers

LUND UNIVERSITY LIBRARIES

CRIMINAL LIABILITY OF ARTIFICIAL INTELLIGENT MACHINES: EYEING INTO AI'S MIND

Calixto Bonfim, Tany LU (2022) JAMM07 20221
Department of Law
Faculty of Law
Abstract
In modern legal systems, criminal responsibility is based on the concept of agency, which includes autonomy, intentionality, and individual accountability. In parallel, as technology advances and artificial intelligence becomes more prevalent in daily life, important tasks are entrusted to AI-driven systems, and AI will increasingly perform independently of humans. The smarter AI and sophisticated robots become, the more likely they are to react to impulses. We are therefore increasingly dealing with agents rather than tools, and advanced artificial intelligence systems deployed irresponsibly will be a source of future concerns, as AI systems are already autonomously engaging in activities that would be considered illegal for a human. Consequently, there may be no one to blame for the negative consequences of their actions. Having identified this accountability gap, this paper investigates the feasibility of attributing a criminal mind to artificial intelligence entities such as robots. By assuming that AI entities have their own degree of consciousness, the likelihood of seeing them as holders of a guilty mind moves from the inconceivable to the prospective.
author
Calixto Bonfim, Tany LU
supervisor
organization
course
JAMM07 20221
year
type
H2 - Master's Degree (Two Years)
subject
keywords
Mens rea – AI – Artificial Intelligence – Criminal Liability – Consciousness – E-personhood – Robots – Autonomous systems
language
English
id
9095199
date added to LUP
2022-08-23 09:46:12
date last changed
2022-08-23 09:46:12
@misc{9095199,
  abstract     = {{In modern legal systems, criminal responsibility is based on the concept of agency, which includes autonomy, intentionality, and individual accountability. In parallel, as technology advances and artificial intelligence becomes more prevalent in daily life, important tasks are entrusted to AI-driven systems, and AI will increasingly perform independently of humans. The smarter AI and sophisticated robots become, the more likely they are to react to impulses. We are therefore increasingly dealing with agents rather than tools, and advanced artificial intelligence systems deployed irresponsibly will be a source of future concerns, as AI systems are already autonomously engaging in activities that would be considered illegal for a human. Consequently, there may be no one to blame for the negative consequences of their actions. Having identified this accountability gap, this paper investigates the feasibility of attributing a criminal mind to artificial intelligence entities such as robots. By assuming that AI entities have their own degree of consciousness, the likelihood of seeing them as holders of a guilty mind moves from the inconceivable to the prospective.}},
  author       = {{Calixto Bonfim, Tany}},
  language     = {{eng}},
  note         = {{Student Paper}},
  title        = {{CRIMINAL LIABILITY OF ARTIFICIAL INTELLIGENT MACHINES: EYEING INTO AI'S MIND}},
  year         = {{2022}},
}