Forky Asks A Question: What is Empathy?
Human and A.I. customer service are regressing to the mean
I’m particularly interested in how brands build deep engagement with customers to create long-term relationships. I believe customer service interactions are pivotal in this regard, and I often seek out these experiences as a way to learn how smart brands drive loyalty.
A few decades ago I had a customer service encounter with Southwest Airlines that I would describe as delightful and memorable, and it helped me codify what I believe effective service looks like. I was dealing with a delayed flight, and the gate agent (1) exercised human judgment, (2) communicated in a way that made me feel respected as a customer, and (3) offered a small financial gesture to make things right. As a result, I became a pretty devoted Southwest customer, until the company repeatedly proved it no longer operated this way.
This type of interaction seems to be the exception these days, because the typical service representative is disempowered from taking any of these three key steps. Companies like Bombas, which train and empower their service reps to take all three, stand out from the crowd.
Unfortunately, many human customer service representatives have been reduced to offering the worst of what machines are able to do. We’ve all interacted with reps who are robotic and unable to deviate from their scripts. They provide emotionless, repeated apologies that lack any true empathy, and in many cases, the customer winds up feeling just as much pity for the service rep as frustration over whatever prompted the customer service request.
I experienced this first-hand at Disneyland last week, where my parents requested we gather for a family trip to celebrate their 75th birthdays and 50th wedding anniversary. While the trip was wonderful and generated memories that will last for many years, we experienced numerous customer service head-scratchers.
For example, it’s now impossible to make a dining reservation except through the Disneyland app. The resort’s restaurants also book up about three months in advance, so unless you planned your trip more than three months ahead of time, you probably can’t get a reservation. The app itself uses counterintuitive terminology, like a “tip board” that serves as the starting point for taking actions.
What amazed me even more was that there was no concierge at our hotel, and no human agents were empowered to help address the problem. We experienced a lot of practiced apologies and were told to (what else?) wait in a standby line if we wanted to eat.
The entire experience generated ideas to extend the brilliant “Forky Asks a Question” series, with new entries including:
- What is a 90-minute line?
- What is a small fortune?
- What is sugar?
Disney refers to its theme park employees as Cast Members, and it struck me how narrowly the resort and theme park employees were forced to act within their roles: repeat the lines you’ve been given, with no deviation from the code. Computers become the interface for most interactions (use the app), and humans themselves are reduced to simple computers (they say “sorry, you have to use the app”).
While much has been written about the promise of artificial intelligence for customer service applications, the technology still needs work. So-called A.I. does reduce costs for brands by eliminating some human service rep expenses, but the frustration for consumers remains the same. A.I. service bots attempt to emulate human empathy, but they don’t have the basic reading comprehension or listening skills to provide most of the assistance customers actually need.
Here’s an example where I tested Amazon and the U.S. Postal Service’s Twitter service accounts when an item wasn’t delivered:
The delivery mishap itself was irrelevant; I wasn’t expecting the item to show up on a Sunday. I was just curious how the tagged accounts would respond. When faced with non-standard prompts, they simply stopped responding.
In this example, is “Tenisha” a human or a customer service bot? A human would probably reply by writing something like, “Scott, I am a human customer service representative and I do want to help you with your problem.” Most likely this interaction is with a software script, but it doesn’t really matter, because either would tend to lead to the same result.
The interaction displays a hallmark of bad customer service: the rep didn’t even try to understand my issue, using irrelevant, canned responses to delay and deflect. Every one of these time-consuming interactions drives down a consumer’s loyalty to the brand in question.
What was the outcome? The USPS eventually did reply on Twitter and ask for the tracking number, but then took three days to respond after I provided the requested information. Their reply simply indicated that the item would be delivered soon, with no explanation of what had happened. Amazon has since updated the tracking information, and it’s pretty clear why the carrier “ran into an issue,” as they phrased it:
Since Costa Mesa is over 40 miles from my house in Los Angeles, it makes sense that neither my front door nor my driveway was “accessible” from that distance. So… they just brought the package back to the sorting facility. It’s fairly impressive that, with nearly a full week to look into the issue after being called out on social media, neither Amazon nor the U.S. Postal Service had enough capability in its customer service department to explain that no delivery was actually attempted, and that this was probably just a computer error that generated an inaccurate “delivery was attempted” message. As this was admittedly an unimportant (low dollar value) delivery, it makes sense that neither company would expend significant resources on the interaction, but that’s also a missed opportunity to create loyalty.
Nobody wants to experience the innards of a Franz Kafka novel. Consumers want genuine empathy, not bureaucracy, and not incompetence, when interacting with customer service representatives, whether they be human or machine. Startups working on A.I. solutions for customer service would do well to remember this when developing next-generation innovations, and brands should remember to allow their human representatives to offer the best of what people do well. It benefits neither approach to emulate the other’s worst limitations, and there’s a lot of opportunity for the companies that figure out how to get this right.
Scott Lenet is President of Touchdown Ventures, a Registered Investment Adviser that provides “Venture Capital as a Service” to help corporations launch and manage their investment programs.
Unless otherwise indicated, commentary on this site reflects the personal opinions, viewpoints and analyses of the author and should not be regarded as a description of services provided by Touchdown or its affiliates. The opinions expressed here are for general informational purposes only and are not intended to provide specific advice or recommendations for any individual on any security or advisory service. It is only intended to provide education about the financial industry. The views reflected in the commentary are subject to change at any time without notice. While all information presented, including from independent sources, is believed to be accurate, we make no representation or warranty as to accuracy or completeness. We reserve the right to change any part of these materials without notice and assume no obligation to provide updates. Nothing on this site constitutes investment advice, performance data or a recommendation that any particular security, portfolio of securities, transaction or investment strategy is suitable for any specific person. Investing involves the risk of loss of some or all of an investment. Past performance is no guarantee of future results.