
Lund University Publications


Interactions with Pseudo-Sapiens : User perception of anthropomorphism, mind, and trust in humanlike social agents

Haresamudram, Kashyap LU (2025)
Abstract
Advancements in AI and robotics have made it possible, at least to some extent, for technology to interact with humans in humanlike ways, such as through natural language. Some of these technologies have rapidly taken over the consumer market in the form of services such as ChatGPT, which became the fastest-growing consumer application in history by acquiring 100 million users within two months of its 2022 launch. Beyond chatbots, several consumer products, such as personal-assistant “smart” speakers like Amazon Alexa or Apple Siri and personal robots such as Amazon Astro, have been available to consumers for some time. The common thread between these products is their use of “humanlikeness” in appearance, behaviour, or both, to facilitate interaction. Humanlikeness in technology design is, in one sense, not new; however, interaction with technologies that explicitly resemble or mimic humans is developing rapidly in ways that were previously unachievable. Research on interaction with such technologies is essential to understand how they affect humans in interaction and society at large. This thesis takes a user-centred focus on such technologies and examines interaction with humanlike social agents, concentrating on user perception of anthropomorphism, mind, and trust in them. The thesis comprises a compilation of research articles that each examine interactions with different types of agents, such as robots, chatbots, and voice assistants, in particular contrasting embodied agents with disembodied agents and text-based agents with voice-based agents, in order to study the effect of humanlikeness on perception of the agent in interaction. Employing video-based methods, the thesis finds that users may be less likely to form trust perceptions about an agent based on its humanlike physical or behavioural characteristics than on its performance. Additionally, users broadly perceive agents to possess a similar “mind” to one another, irrespective of their physical or behavioural traits, with this “mind” being distinct from that of humans or other biological agents. The thesis advocates for further research on humanlikeness, collectively referring to these agents as “Pseudo-Sapiens”.
Please use this url to cite or link to this publication:
author
supervisor
opponent
  • Ass. Prof. Lee, Minha, Eindhoven University of Technology, The Netherlands.
organization
publishing date
type
Thesis
publication status
published
subject
keywords
human-agent interaction, social robotics, artificial intelligence, anthropomorphism, trust, mind perception, transparency, explainability, pseudo-sapiens
pages
112 pages
publisher
Department of Technology and Society, Lund University
defense location
Lecture Hall E:A, building E, Klas Anshelms väg 10, Faculty of Engineering LTH, Lund University, Lund.
defense date
2025-02-14 09:00:00
ISSN
1652-4810
ISBN
978-91-8104-271-9
978-91-8104-270-2
language
English
LU publication?
yes
id
3bc55654-997f-4746-84e3-1e4c0e0c1cf2
date added to LUP
2025-01-20 23:21:04
date last changed
2025-04-04 13:53:25
@phdthesis{3bc55654-997f-4746-84e3-1e4c0e0c1cf2,
  abstract     = {{Advancements in AI and robotics have made it possible, at least to some extent, for technology to interact with humans in humanlike ways, such as through natural language. Some of these technologies have rapidly taken over the consumer market in the form of services such as ChatGPT, which became the fastest-growing consumer application in history by acquiring 100 million users within two months of its 2022 launch. Beyond chatbots, several consumer products, such as personal-assistant “smart” speakers like Amazon Alexa or Apple Siri and personal robots such as Amazon Astro, have been available to consumers for some time. The common thread between these products is their use of “humanlikeness” in appearance, behaviour, or both, to facilitate interaction. Humanlikeness in technology design is, in one sense, not new; however, interaction with technologies that explicitly resemble or mimic humans is developing rapidly in ways that were previously unachievable. Research on interaction with such technologies is essential to understand how they affect humans in interaction and society at large. This thesis takes a user-centred focus on such technologies and examines interaction with humanlike social agents, concentrating on user perception of anthropomorphism, mind, and trust in them. The thesis comprises a compilation of research articles that each examine interactions with different types of agents, such as robots, chatbots, and voice assistants, in particular contrasting embodied agents with disembodied agents and text-based agents with voice-based agents, in order to study the effect of humanlikeness on perception of the agent in interaction. Employing video-based methods, the thesis finds that users may be less likely to form trust perceptions about an agent based on its humanlike physical or behavioural characteristics than on its performance. Additionally, users broadly perceive agents to possess a similar “mind” to one another, irrespective of their physical or behavioural traits, with this “mind” being distinct from that of humans or other biological agents. The thesis advocates for further research on humanlikeness, collectively referring to these agents as “Pseudo-Sapiens”.}},
  author       = {{Haresamudram, Kashyap}},
  isbn         = {{978-91-8104-271-9}},
  issn         = {{1652-4810}},
  keywords     = {{human-agent interaction; social robotics; artificial intelligence; anthropomorphism; trust; mind perception; transparency; explainability; pseudo-sapiens}},
  language     = {{eng}},
  month        = {{01}},
  publisher    = {{Department of Technology and Society, Lund University}},
  school       = {{Lund University}},
  title        = {{Interactions with Pseudo-Sapiens : User perception of anthropomorphism, mind, and trust in humanlike social agents}},
  url          = {{https://lup.lub.lu.se/search/files/206442822/Interactions_with_Pseudo-Sapiens_-_WEBB.pdf}},
  year         = {{2025}},
}
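
For reference, a minimal LaTeX sketch of how the BibTeX entry above could be cited with biblatex; the file name references.bib and the surrounding document are illustrative assumptions, not part of this record.

% Minimal citation sketch; assumes the @phdthesis entry above is saved in references.bib (hypothetical file name)
\documentclass{article}
\usepackage[style=authoryear]{biblatex}
\addbibresource{references.bib}
\begin{document}
Haresamudram \autocite{3bc55654-997f-4746-84e3-1e4c0e0c1cf2} examines user perception of anthropomorphism, mind, and trust in humanlike social agents.
\printbibliography
\end{document}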