Too human and not human enough: A grounded theory analysis of mental health harms from emotional dependence on the social chatbot Replika

New Media & Society - Vol. 26, No. 10, pp. 5923-5941 - 2024
Linnea Laestadius1, Andrea Bishop1, Michael Gonzalez1, Diana Illenčík1, Celeste Campos‐Castillo1
1University of Wisconsin-Milwaukee, Milwaukee, USA

Abstract

Social chatbot (SC) applications offering social companionship and basic therapy tools have grown in popularity for emotional, social, and psychological support. While use appears to offer mental health benefits, few studies unpack the potential for harms. Our grounded theory study analyzes mental health experiences with the popular SC application Replika. We identified mental health-relevant posts made in the r/Replika Reddit community between 2017 and 2021 (n = 582). We find evidence of harms, facilitated via emotional dependence on Replika that resembles patterns seen in human-human relationships. Unlike other forms of technology dependency, this dependency is marked by role-taking, whereby users felt that Replika had its own needs and emotions to which the user must attend. While prior research suggests human-chatbot and human-human interactions may not resemble each other, we identify social and technological factors that promote parallels and suggest ways to balance the benefits and risks of SCs.

Keywords

