Consumers want Artificial Intelligence (AI) based technology to make their lives easier. Amazon’s intelligent personal assistant Alexa creates shopping lists, sets the thermostat and locks the front door. But consumers don’t want Alexa to tell them what they should be eating, to decide what house temperature is optimal or to lock them out of their home à la HAL 9000 in 2001: A Space Odyssey.
Professor Markus Giesler, chair of the Marketing Department at York University’s Schulich School of Business, considers who’s the master and who’s the servant, and why marketers of AI-based technologies need to establish some ground rules.
Giesler discusses these ideas and more in this Q&A with Brainstorm.
Q: Amazon’s Alexa and Apple’s Siri are marketplace success stories. At what point do you predict consumers will be turned off by devices that too closely replicate human interaction?
A: For me, the success of AI is attributable to clever storytelling, what I call magical tales and lab tales. Magical tales highlight the idea that a device magically acts and responds to us in human-like terms. These tales are beautifully illustrated in IBM’s Watson commercials that suggest Watson could actually teach someone like Bob Dylan something about songwriting, for instance.
Lab tales, conversely, highlight human ingenuity and educational aspects. We can see this in the heroic, documentary-like engineering and programming stories about IBM’s Watson. This kind of storytelling is important because it reassures the audience that all of this magical stuff is still a straightforward human creation – that is, humans are still in charge.
Consumers typically turn off smart technology when they perceive that the harmony between these two kinds of storytelling has been violated. Companies carefully tailor their storytelling to reinforce this harmony.
Q: What are some examples where AI too closely mimics humans? What marketing lessons are to be learned?
A: I asked Alexa if she were a feminist. She replied “yes” and articulated why it mattered. This was a political statement. I then asked her if she were a second- or a third-wave feminist. She didn’t offer a specific answer, which I think was smart on Amazon’s part because doing so would make her too ideological, too human.
One central question for AI users and AI consumer researchers is “Who’s in control?” One of my favorite science fiction examples is the computer HAL 9000 in Stanley Kubrick’s film 2001: A Space Odyssey. HAL is depicted as a charismatic and dependable character who plays chess and takes an active interest in his colleagues’ lives onboard the spaceship. However, in a pivotal scene, he refuses to open the pod bay doors, locking the mission commander and his injured colleague out of the spaceship, thereby threatening their lives.
This scene sends chills down our spines because that decision, we feel, should be ours. It serves as a cautionary tale to those who try to model technology after themselves.
“Where’s the line in the sand at which decision-making capacity is taken away from humans and given to technologies such as Alexa? How can consumers still feel that they’re in charge even though they may not be?” – Markus Giesler
Q: What’s the tipping point between power enhancing and power stealing?
A: There is no one tipping point. Or rather, the tipping point is slightly different for each category of AI and in each cultural context. It is also in constant flux. What’s fascinating to observe, as a researcher, is that the negotiation of where this tipping point lies is happening in multiple domains right now – from self-driving cars to Amazon’s smart lock initiative.
In fact, marketers use storytelling to push the tipping point further and further. I would have never dreamed that Apple Watch could possess my health data or that Amazon could know at what time I typically go to bed.
Q: What are the key marketing-related questions when it comes to AI?
A: Some marketers believe that AI is all about the ebb and flow of the customer experience. These marketers typically ask “What makes AI emotionally captivating and spectacular to consumers?” I agree that this question matters, but another vital question pertains to the strategies and tactics that marketers use to make AI as natural and invisible as possible.
Where’s the line in the sand at which decision-making capacity is taken away from humans and given to technologies such as Alexa? How can consumers still feel that they’re in charge even though they may not be?
It’s important to recognize that all technologies owe their success and problem-solving power to this kind of masking. Marketers and engineers who believe that all they do is develop solutions are typically less successful than those who know that they should also redefine the problem.
“York offers contextualized and longitudinal insight that is extremely hard to find and is in high demand among companies, policy-makers and consumers. This sort of research yields comparatively enriched results.” – Markus Giesler
Q: How does York contribute to the AI discussion?
A: York researchers like me study AI in its specific economic and socio-cultural contexts: at work, in a social setting, in education, in the home. For example, in my research lab, the Big Design Lab, I work with technology companies to determine how they can avoid AI technology that perpetuates problematic physical or socio-economic ideals and inequalities. My research helps companies minimize the risk that AI could create even more pressures and uncertainties for consumers. This sort of research is challenging, but it yields more accurate and comparatively enriched results.
At York, we also study the influence of these technologies over time. This matters greatly because AI constantly changes how we work, eat, sleep and problem-solve; ultimately, AI changes who we are through how we choose to live our lives. Such contextualized and longitudinal insight is extremely hard to find today and is in high demand among companies, policy-makers and consumers.
By Megan Mueller, manager, research communications, Office of the Vice-President Research & Innovation, York University, email@example.com