AI som emotionellt stöd
(2024) ABMM54 20241, Division of ALM and Digital Cultures
- Abstract
- Chat GPT was launched in November 2022 and has by some been described as just as big and important as the launch of the internet or the personal computer. Describing Chat GPT's main use is not easy: it can help you with your programming, write music or plan a dinner. Its main purpose is not to be a friend, a partner or a therapist, yet it is used as one. Using artificial intelligence as a kind of human being is not new. It can, however, be done in different ways, and since Chat GPT is new, little research has been done on it. The purpose of this study is to explore, from an emotional perspective, how Chat GPT is being used. To conduct this study, I have analyzed Reddit posts written during one month, December 2023. I have searched for posts about Chat GPT using the search terms loneliness, lonely, girlfriend, boyfriend, friend, friendship and therapist. Analyzing the results, I found three themes in how Reddit users use Chat GPT to deal with emotions. The first theme is entanglement, where Chat GPT is a big part of the users' everyday lives. The second theme is fear, where the users are both fascinated by and scared of the development of AI. The third theme is politeness, where the users are as polite to Chat GPT as they are to humans, sometimes more. The theories I have used to conduct this study are sociomaterial theory and computers are social actors (CASA). Through them I have found signs of both entanglement and overlearned politeness: Chat GPT is part of people's work lives and private lives, and we are mindlessly polite to artificial intelligence, applying human emotions to technology.
Please use this url to cite or link to this publication:
http://lup.lub.lu.se/student-papers/record/9157055
- author
- Kappelin, Sara
- supervisor
- organization
- alternative title
- AI as emotional support
- course
- ABMM54 20241
- year
- 2024
- type
- H2 - Master's Degree (Two Years)
- subject
- keywords
- Library, Information, Sociomaterial theory, CASA, Chat GPT, Anthropomorphism
- language
- Swedish
- id
- 9157055
- date added to LUP
- 2024-06-18 15:47:19
- date last changed
- 2024-06-18 15:47:19
@misc{9157055,
  abstract = {{Chat GPT was launched in November 2022 and has by some been described as just as big and important as the launch of the internet or the personal computer. Describing Chat GPT's main use is not easy: it can help you with your programming, write music or plan a dinner. Its main purpose is not to be a friend, a partner or a therapist, yet it is used as one. Using artificial intelligence as a kind of human being is not new. It can, however, be done in different ways, and since Chat GPT is new, little research has been done on it. The purpose of this study is to explore, from an emotional perspective, how Chat GPT is being used. To conduct this study, I have analyzed Reddit posts written during one month, December 2023. I have searched for posts about Chat GPT using the search terms loneliness, lonely, girlfriend, boyfriend, friend, friendship and therapist. Analyzing the results, I found three themes in how Reddit users use Chat GPT to deal with emotions. The first theme is entanglement, where Chat GPT is a big part of the users' everyday lives. The second theme is fear, where the users are both fascinated by and scared of the development of AI. The third theme is politeness, where the users are as polite to Chat GPT as they are to humans, sometimes more. The theories I have used to conduct this study are sociomaterial theory and computers are social actors (CASA). Through them I have found signs of both entanglement and overlearned politeness: Chat GPT is part of people's work lives and private lives, and we are mindlessly polite to artificial intelligence, applying human emotions to technology.}},
  author = {{Kappelin, Sara}},
  language = {{swe}},
  note = {{Student Paper}},
  title = {{AI som emotionellt stöd}},
  year = {{2024}},
}