Conversational Commerce: The rise of artificial sales agents

by Marisa Tschopp
on August 03, 2023
time to read: 16 minutes

Keypoints

The unpredictable future of online shopping

  • Conversational commerce is an emerging commercial trading system
  • Consumers can search for, purchase, and track products via a voice or written user interface, e.g., Amazon's Alexa
  • While some call voice shopping the future of e-commerce, others remain skeptical
  • Voice shopping is not yet widely adopted, for various, largely unexplored reasons
  • Our research investigates whether human-AI relationship perception can better explain and predict behavior in the emerging field of voice shopping

Step into the world of voice commerce, an emerging trading system where consumers can effortlessly search for, purchase, and track products using voice user interfaces like Amazon’s Alexa. Some hail it as the future of e-commerce, while others approach it with skepticism. Despite its potential, voice shopping has not yet achieved widespread adoption, presenting various challenges to its growth. Our research delves into consumer-AI relationship perception to shed light on behavior in this emerging field of conversational shopping.

Fast forward three decades from the first online purchases, and the shopping experience has reached a new level of convenience and interactivity. With conversational AI assistants like Alexa, you do not even need to lay a finger on your computer: simply talk to your assistant, and you are good to go. You can instruct it to make a purchase on your behalf. Voice shopping features streamline the process, eliminating the need for extensive searching or tedious typing. As Wally Brill, a legend in conversation design, puts it: you are in the kitchen, preparing a delicious dinner with raw chicken, while at the same time effortlessly adding items to your cart using voice commands. It is a whole new era of multitasking and seamless shopping experiences, at least as depicted in the commercials.

As fascinating as these advancements are, it is essential to maintain a critical eye on the implications of such technological progress. While voice shopping seems to offer unparalleled convenience, it also raises concerns about data privacy and security. As we rely more on AI assistants to make purchases, they gather significant amounts of personal information about our preferences and habits. This data can be valuable for targeted advertising, but it also leaves consumers vulnerable to potential data breaches or misuse. Striking a balance between innovation and safeguarding user privacy remains a crucial challenge for the future of voice shopping and conversational AI technologies.

While voice shopping falls under the umbrella of e-commerce, it diverges significantly from traditional online shopping methods. As we delve into this area, we recognize the importance of selecting appropriate research methods to understand its unique dynamics. In our recent study, we wanted to get a better picture of how people actually voice shop, so in 2022 we surveyed over 300 experienced voice shoppers in the UK to obtain descriptive data firsthand. The key figures are summarized below.

Which device do you use for voice shopping?
  • 75% use only a smart speaker at home
  • 11% use only a smartphone
  • 14% use both or other devices

Do you use a screen while voice shopping?
  • 54% do not look at a screen
  • 37% look at a screen and see the products
  • 9% do both

How often do you engage in voice shopping?
  • 4% daily
  • 32% several times a month
  • 29% monthly
  • 19% every 2-3 months
  • 15% about 1-2 times per year
  • 1% less than yearly

How long have you been voice shopping?
  • 16% for less than 12 months
  • 34% for 1-2 years
  • 28% for 2-3 years
  • 22% for more than 3 years

What is your average spending on voice shopping per year in £ (GBP)?
  • Average = 415.82 GBP

Voice assistants as quasi-sales agents?

Interestingly, voice shopping seems to share conceptual similarities with decision-making in brick-and-mortar stores, where customers engage in face-to-face interactions with salespeople. We propose that customers may perceive their conversational AI as quasi-sales agents, akin to human salespeople guiding them towards making informed purchase decisions.

However, the perception of conversational AIs as shopping assistants has not received sufficient attention in research so far, despite the plethora of studies on recommendation systems. Moreover, existing findings present a paradox, showing that conversational AIs can evoke feelings of both empowerment and friendship, albeit for specific products.

In a nutshell, voice shopping’s slow adoption rate and the contrasting results in existing research prompt us to explore the possible link between purchase preferences and consumers’ perceptions of their AI assistants. By understanding these dynamics, we may be able to unlock the factors that foster or hinder voice shopping’s widespread acceptance.

What are human-AI relationships?

Have you ever wondered why we seem to treat machines as if they were almost human? It is a fascinating pattern found in many studies exploring human-AI relationships: we cannot help but attribute emotions and intentions to our robotic counterparts, as if our brains were wired to make machines feel like one of us. This tendency to anthropomorphize machines has led experts to draw on psychological theories and apply them to our interactions with AI, seeking to understand our behavior and predict how we engage with these technological companions.

But what exactly do we talk about when we talk about human-AI relationships? Pentina, Xie, Hancock & Bailey (2023) provide an overview of consumer-machine relationships, reviewing 37 peer-reviewed empirical studies. The theories used in these studies stemmed from social psychology (e.g., Bowlby’s attachment theory), communication studies (e.g., the uses & gratifications paradigm), human-computer interaction (e.g., the CASA paradigm), or other fields, such as parasocial interaction theory. These are just a few examples of the plethora of theories applied in this field. Each theory brings its own strengths and weaknesses to the table, enriching our understanding of human-AI relationships in distinct ways. Yet open questions remain unresolved.

Our approach

Recently we published our initial study on how humans perceive their relationship with conversational AI. Through the lens of Fiske’s relational models theory, our research revealed intriguing insights into how users establish connections with AI systems.

How humans perceive their relationship with conversational AI

We identified three distinct relationship models that users adopt. First, there is the traditional master-servant relationship, where users perceive themselves as the authoritative figure in control, while the AI system serves their commands. Second, some users develop a friendship-like relationship with the AI, imbuing the interaction with a sense of camaraderie and companionship. In this scenario, the AI becomes more than just a tool; it evolves into a trusted ally, capable of offering support and understanding. Lastly, we observed a rational relationship model, wherein users treat the AI system as a somewhat equal partner. This dynamic reflects a more balanced interaction, where both parties engage in a collaborative exchange of information and decision-making.

How do human-AI relationships impact voice shopping decisions?

Unraveling these diverse relationship models sheds light on the multifaceted nature of human-AI interactions, enhancing our comprehension of the evolving dynamics between users and AI systems. In a follow-up study, we investigated the role of human-AI relationship perception in voice shopping decisions. Specifically, we asked whether the kind of relationship people have with their AI influences what kind of products they buy. In short, we found that perceiving the conversational AI as a friend had the strongest predictive power for both high- and low-involvement products, while perceiving the AI as a servant also predicted low-involvement shopping.
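To make the notion of predictive power more concrete, here is a minimal, purely illustrative sketch of how purchase intentions could be regressed on relationship perceptions. The variable names, data, and model specification are hypothetical and do not reproduce our study's actual measures or analysis.

```python
# Hypothetical sketch: assessing the predictive power of relationship
# perceptions with two simple regressions (illustrative data only).
import pandas as pd
import statsmodels.formula.api as smf

# One row per respondent: Likert-scale means for each relationship
# perception and purchase-intention scores (invented values).
df = pd.DataFrame({
    "friend":  [5.2, 3.1, 4.4, 6.0, 2.5],
    "servant": [6.1, 5.8, 4.9, 3.2, 6.5],
    "partner": [4.0, 2.9, 5.1, 4.7, 3.3],
    "low_involvement_intention":  [5.5, 4.0, 4.8, 5.9, 3.1],
    "high_involvement_intention": [4.9, 2.7, 4.1, 5.6, 2.2],
})

# Separate models for low- and high-involvement purchase intentions;
# the relative size (and significance) of the coefficients would indicate
# which perception best predicts each type of purchase.
low_model  = smf.ols("low_involvement_intention ~ friend + servant + partner", data=df).fit()
high_model = smf.ols("high_involvement_intention ~ friend + servant + partner", data=df).fit()
print(low_model.summary())
print(high_model.summary())
```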

Results in short: Friends with AI? It’s complicated

Indeed, to establish causal relationships and make robust claims about the nature of human-AI interaction, an experimental approach is essential. While our explorations into the various relationship models provide valuable insights, experimental studies enable us to manipulate variables and assess their impact on the interaction. By designing controlled experiments, we can systematically test different conditions and observe how they influence user behavior and their perceptions of AI systems. This allows us to identify cause-and-effect relationships and gain a deeper understanding of the underlying mechanisms driving the observed patterns.

How do conversational design and human-AI relationship perception relate in voice shopping?

We are currently evaluating our experiments, in which we tested the effect of a more emotional design on voice shopping decisions. We cannot share results yet; however, we can share the experimental procedure in which we manipulated Alexa’s output. Overall, we created four videos: two standard shopping scenarios in which a person purchases something via Alexa, and two videos in which Alexa was more emotional from a conversational design perspective.

Example standard video: In the standard video, a person is cooking in the kitchen. The person “wakes” Alexa, tells it that the coffee machine has broken down, and initiates the purchase of a new coffee machine via Amazon while still cooking dinner. The conversation stays very close to an original Alexa dialogue; however, to manipulate the output we needed clear time cuts, so the whole video was scripted. Both the human utterances and the Alexa outputs were recorded; the Alexa outputs were generated using a text-to-voice skill and then recorded. The videos and recordings were cut and edited using MS Clipchamp.
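For illustration only: in the study the scripted lines were produced with a text-to-voice skill on the device, but a scripted assistant line can also be synthesized programmatically. The following minimal sketch uses Amazon Polly (AWS's text-to-speech service) via boto3; the voice, region, wording, and file name are assumptions and not the stimuli used in our videos.

```python
# Hypothetical sketch: synthesizing a scripted assistant line as an audio
# file with Amazon Polly via boto3. This is NOT the text-to-voice skill
# used in the study; it only illustrates how scripted output for a video
# stimulus could be produced. Requires configured AWS credentials.
import boto3

polly = boto3.client("polly", region_name="eu-west-1")  # region is an assumption

# SSML allows controlled pacing, mirroring a natural assistant reply.
ssml = (
    "<speak>"
    "I found a highly rated coffee machine for 89 pounds. "
    '<break time="300ms"/> Should I add it to your basket?'
    "</speak>"
)

response = polly.synthesize_speech(
    Text=ssml,
    TextType="ssml",
    VoiceId="Amy",        # a British English Polly voice
    OutputFormat="mp3",
)

# Save the audio stream so it can be cut into the stimulus video.
with open("assistant_standard_line.mp3", "wb") as f:
    f.write(response["AudioStream"].read())
```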

Example emotional Alexa video: Since we wanted to explore whether a more emotional design had an effect on people, depending on how they relate to the system, we manipulated Alexa’s output. The videos were identical, but we changed the wording to make Alexa friendlier and more welcoming. In a pretest, participants rated the manipulated version as significantly friendlier. We relied on prior research suggesting that identity can be signaled through first-person pronouns (e.g., Alexa refers to itself as I, or to both parties as we) and through expressions of empathy (e.g., “Oh no, I am sorry to hear that!”). Furthermore, we created screenshots of the Alexa output; in the manipulated version, we also inserted emojis to humanize the design.
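As a hypothetical illustration of this wording manipulation, the snippet below pairs a neutral line with an "emotional" variant using first-person pronouns, an empathy marker, and an emoji. These utterances are invented examples, not the exact stimuli from the experiment.

```python
# Hypothetical illustration of the wording manipulation: paired standard
# vs. "emotional" assistant lines (invented examples, not study stimuli).
MANIPULATION_EXAMPLES = [
    {
        "standard":  "The coffee machine has been added to the basket.",
        "emotional": "Oh no, I am sorry to hear your machine broke! "
                     "I have added a new one to our basket.",  # empathy + first-person pronouns
    },
    {
        "standard":  "The order will arrive on Tuesday.",
        "emotional": "Great news! Our order will arrive on Tuesday. 😊",  # inclusive "our" + emoji
    },
]

for pair in MANIPULATION_EXAMPLES:
    print(f'Standard:  {pair["standard"]}')
    print(f'Emotional: {pair["emotional"]}\n')
```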

Outlook

The future of voice commerce looks promising, as conversational AI users can make purchases with simple vocal commands, and projections indicate that generative AI will further transform e-commerce. However, voice shopping adoption is still limited, and the reasons for this are not fully understood. One potential factor is the perception that digital assistants lack the warmth of human sales representatives. Our current research examines the impact of emotional design on voice shopping intentions, and we will share our findings in the next 6 to 12 months. Stay tuned for more insights and discoveries.

About the Author

Marisa Tschopp

Marisa Tschopp (Dr. rer. nat., University of Tübingen) is actively engaged in research on Artificial Intelligence from a human perspective, focusing on psychological and ethical aspects. She has shared her expertise on TEDx stages, among others, and represents Switzerland on gender issues within the Women in AI Initiative. (ORCID 0000-0001-5221-5327)
