AI-based chatbots in customer service and their effects on user compliance

Martin Adam¹, Michael Wessel², Alexander Benlian¹
¹ Institute of Information Systems & Electronic Services, Technical University of Darmstadt, Hochschulstraße 1, 64289 Darmstadt, Germany
² Department of Digitalization, Copenhagen Business School, Howitzvej 60, 2000 Frederiksberg, Denmark

Abstract

Communicating with customers through live chat interfaces has become an increasingly popular means of providing real-time customer service in many e-commerce settings. Today, human chat service agents are frequently replaced by conversational software agents, or chatbots: systems designed to communicate with human users by means of natural language, often based on artificial intelligence (AI). Although cost- and time-saving opportunities have triggered widespread implementation of AI-based chatbots, these agents still frequently fail to meet customer expectations, potentially leaving users less inclined to comply with requests made by the chatbot. Drawing on social response and commitment-consistency theory, we empirically examine through a randomized online experiment how verbal anthropomorphic design cues and the foot-in-the-door technique affect user request compliance. Our results demonstrate that both anthropomorphism and the need to stay consistent significantly increase the likelihood that users comply with a chatbot’s request for service feedback. Moreover, the results show that social presence mediates the effect of anthropomorphic design cues on user compliance.
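The mediation claim in the abstract (anthropomorphic design cues → social presence → compliance) can be illustrated with a minimal product-of-coefficients sketch on simulated data. This is not the paper's actual analysis or data; the variable names, effect sizes, and the plain-OLS estimation below are all illustrative assumptions.

```python
import random

def ols(Xcols, y):
    """Least-squares coefficients via the normal equations (Gaussian elimination)."""
    k, n = len(Xcols), len(y)
    A = [[sum(Xcols[i][t] * Xcols[j][t] for t in range(n)) for j in range(k)] for i in range(k)]
    b = [sum(Xcols[i][t] * y[t] for t in range(n)) for i in range(k)]
    for col in range(k):  # forward elimination with partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * k
    for r in reversed(range(k)):  # back substitution
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, k))) / A[r][r]
    return coef

# Simulated experiment (hypothetical effect sizes, NOT the paper's estimates):
# x = treatment (1 = anthropomorphic design cues), m = perceived social presence,
# y = continuous proxy for compliance.
random.seed(42)
n = 500
x = [random.randint(0, 1) for _ in range(n)]
m = [0.8 * xi + random.gauss(0, 1) for xi in x]
y = [0.5 * mi + 0.1 * xi + random.gauss(0, 1) for xi, mi in zip(x, m)]

ones = [1.0] * n
_, a = ols([ones, x], m)                  # path a: treatment -> mediator
_, c_prime, bpath = ols([ones, x, m], y)  # paths c' (direct) and b (mediator -> outcome)
indirect = a * bpath                      # mediated (indirect) effect
print(f"a = {a:.2f}, b = {bpath:.2f}, indirect a*b = {indirect:.2f}, direct c' = {c_prime:.2f}")
```

With these simulated effects, the indirect path a·b dominates the direct path c′, which is the pattern a mediation result describes; in practice one would add a bootstrap confidence interval around a·b rather than read the point estimate alone.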
