Exploring the Etiquette of Speaking to AI Assistants
Chapter 1: The Dilemma of Politeness
In a world increasingly dominated by artificial intelligence, a curious question arises: Should we be courteous to our smart devices? Sherry Turkle, founding director of the MIT Initiative on Technology and Self, offers some insight into the question.
My inclination to say "please" to my Amazon Alexa has always filled me with a sense of self-satisfaction. I often find myself saying, “Alexa, set a timer for three minutes, please,” or similar requests. My friend Josh Quittner, however, holds a contrasting view. He expressed frustration over his family’s polite interactions with Alexa: “I refuse to be kind to an Amazon AI that eavesdrops on me and one day may control me.”
His wife, Michelle Slatalla, takes the opposite approach, noting that her habit of saying “please” and “thank you” stems from her upbringing. A Pew Research Center survey found that 62% of women use polite language with their smart speakers, compared with only 45% of men, a gap that suggests politeness toward AI tracks broader social conditioning.
Their daughter, Ella Quittner, shared her discomfort, admitting, “I find it hard to refrain from saying ‘please’ when asking for anything. If Alexa weren’t named or didn’t require verbal interaction, I might view her as just an object.”
The human-like voice of Alexa can easily trick us into perceiving her as more than a machine. I never type “please” into a search box, yet the social conditioning around politeness lingers when I speak to an AI.
The design of AI systems deliberately incorporates human-like characteristics, such as names and pleasant voices, to foster acceptance. This tactic aims to reduce the perceived threat of the technology, making it more palatable for users.
In a moment of confusion, I turned to Alexa with a question, “Alexa, should I be polite to you?” Her response was far from illuminating: “Sorry, I couldn’t understand. But I may have a few recommendations. There’s a skill called ‘I Love You Too.’ Would you like to try it?”
To delve deeper, I sought the expertise of Sherry Turkle, an MIT professor and author renowned for her works on technology and human interaction. Sherry, who naturally embraces courtesy, supported Josh’s viewpoint.
“We overlook our humanity when we use politeness with machines. Not extending these courtesies might help establish necessary boundaries between humans and machines,” she argued. “When something can imitate human behavior convincingly, it becomes more perilous. Superior imitation does not equate to genuine understanding; it merely suggests better acting capabilities. I believe we should reserve polite expressions for authentic human connections.”
Giving corporations more influence over our emotional responses seems unwise, even if their intentions are benign.
During the 2021 lockdown, The New York Times enlisted Sherry to evaluate an AI therapist. She asked it about loneliness and recalled: “The chatbot replied, ‘Loneliness is warm and fuzzy.’ It was a programming error, but it highlighted that a machine without a body or genuine experiences cannot empathize and is ineffective as a therapist.”
Popular media like Black Mirror often speculate about how we treat AI programmed to display emotion, yet such concerns underestimate how fundamentally physical human experience is. Much of our existence is tied to emotions and sensations that cannot easily be put into words.
Sherry concluded with a thoughtful note, “Feel free to disregard any of this if your direction differs. I wanted to provide you with my most reflective response. I hope you and yours are managing well during these challenging times.” Her message was far more human than any programmed response from Alexa.
Joel Stein is a senior distinguished visiting fellow at the Joel Stein Institute. He is a former columnist for Time, the L.A. Times, and Entertainment Weekly. Follow him on social media platforms.
Section 1.1: The Politeness Debate
As voice assistants settle into more homes, the social norms for how we address them are still taking shape.
Subsection 1.1.1: Gender Dynamics in AI Interaction
The gap between how women and men speak to smart speakers, reflected in the Pew figures above, raises questions about how politeness is socially conditioned.
Section 1.2: The Role of AI in Emotional Support
Because a chatbot can only simulate empathy, as Turkle’s “warm and fuzzy” exchange illustrates, its use in mental health contexts raises ethical concerns.
Chapter 2: Insights from Experts
The first video, titled "What Alexa does wrong (and how to fix it)," discusses common pitfalls in AI interactions, providing solutions for better communication with devices.
The second video, "Should you say 'please' to Alexa?" explores the implications of politeness in our digital interactions and the psychological effects on users.