Deepfake Violence and the Future of Human Rights: A South Korean Case Study
(2025) MRSM15 20251, Human Rights Studies
- Abstract
- The rise of deepfake pornography in South Korea reflects a serious breakdown in the protection of human rights in digital spaces. In a society where technological development significantly shapes national identity, the rights to bodily autonomy, education, and freedom from inhumane treatment are increasingly compromised by systems that normalize gender-based violence. Deepfake abuse in South Korea is part of a larger structural crisis in tech-facilitated sexual violence, following earlier waves of hidden-camera crimes and blackmail schemes. Drawing on the concepts of symbolic violence, sociotechnical imaginaries, and rhizomatic harm, this study examines how digital platforms and socio-cultural norms sustain new forms of abuse. A non-participatory digital ethnography of YouTube and Reddit is used to investigate how harm crosses conventional boundaries of agency and accountability. In these spaces, violence spreads in ways that are difficult to identify, and the concept of rhizomatic harm is developed to help explain how these violations persist without a clear beginning or end. These patterns of violence, and the near impossibility of containing them, challenge basic assumptions of human rights frameworks and highlight the need for new ways of understanding harm, responsibility, and justice in a digitally connected world.
- Popular Abstract
- Deepfake pornography is a growing form of online abuse in South Korea, where manipulated videos are used to depict real women in sexually explicit scenarios without their consent. These violations leave lasting emotional and social harm and often go unpunished. I explore how this abuse is connected to deeper systemic issues, including gender norms, technology platforms, and legal gaps. By analyzing online conversations on Reddit and YouTube, I show how harm spreads across networks in ways that are hard to trace but deeply damaging. I argue that traditional human rights frameworks are not equipped to deal with these emerging forms of violence. To address this, we need new ways of thinking about justice, accountability, and safety in the digital age.
Please use this url to cite or link to this publication:
http://lup.lub.lu.se/student-papers/record/9201985
- author
- Johnson, Valerie Maria LU
- supervisor
- organization
- course
- MRSM15 20251
- year
- 2025
- type
- H2 - Master's Degree (Two Years)
- subject
- keywords
- human rights, deepfake pornography, tech-facilitated sexual violence, gender-based violence, symbolic violence, sociotechnical imaginaries, rhizomatic harm, surveillance capitalism, digital ethnography
- language
- English
- id
- 9201985
- date added to LUP
- 2025-06-24 11:24:04
- date last changed
- 2025-06-24 11:24:04
@misc{9201985,
  abstract = {{The rise of deepfake pornography in South Korea reflects a serious breakdown in the protection of human rights in digital spaces. In a society where technological development significantly shapes national identity, the rights to bodily autonomy, education, and freedom from inhumane treatment are increasingly compromised by systems that normalize gender-based violence. Deepfake abuse in South Korea is part of a larger structural crisis in tech-facilitated sexual violence, following earlier waves of hidden-camera crimes and blackmail schemes. Drawing on the concepts of symbolic violence, sociotechnical imaginaries, and rhizomatic harm, this study examines how digital platforms and socio-cultural norms sustain new forms of abuse. A non-participatory digital ethnography of YouTube and Reddit is used to investigate how harm crosses conventional boundaries of agency and accountability. In these spaces, violence spreads in ways that are difficult to identify, and the concept of rhizomatic harm is developed to help explain how these violations persist without a clear beginning or end. These patterns of violence, and the near impossibility of containing them, challenge basic assumptions of human rights frameworks and highlight the need for new ways of understanding harm, responsibility, and justice in a digitally connected world.}},
  author = {{Johnson, Valerie Maria}},
  language = {{eng}},
  note = {{Student Paper}},
  title = {{Deepfake Violence and the Future of Human Rights: A South Korean Case Study}},
  year = {{2025}},
}