When Siri launched with the iPhone 4S back in 2011, many of us thought it was a genuinely exciting new feature, a glimpse of sci-fi made real. But the reality fell short: even though having artificial intelligence built into our Apple smartphones was a notable technological step, it wasn't exactly the sci-fi utopia we'd hoped for.
That's because, like other artificial intelligence chatbots, Siri relies on machine-learning technology to function and to hold meaningful dialogue with its human users, and that technology still has clear limits.
So, here are 3 reasons why chatbots won’t replace humans:
1. Chatbots can’t (yet) understand the complexities and nuances of language
One of the major gripes many people had with Siri was her inability to understand complex user commands. While she was praised for her voice recognition capabilities, users had to provide Siri with very rigid commands for her to effectively respond, and there was no flexibility around what a user could ask.
For example, you could ask Siri to let you know today’s weather forecast, but she’d struggle to understand you if you asked her whether you should wear a coat and take an umbrella with you before you head out.
Language inflexibility and rigidity are also issues with other artificial intelligence chatbots, especially those many organisations now use in place of humans to provide 24/7 live customer support.
These chatbots typically require users to make very basic requests and commands, which isn't much help when your question or issue needs more complex language to be communicated effectively.
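To see why this feels so limiting, here's a minimal sketch in Python of the keyword-matching approach many simple support bots use. The intents, phrasings and responses are all invented for illustration, not taken from any real product:

```python
# A toy rule-based chatbot: it maps fixed keywords to canned answers.
# Everything here (keywords, responses) is invented for illustration.
RESPONSES = {
    "weather": "Today's forecast: 14°C with light rain.",
    "opening hours": "We're open 9am to 5pm, Monday to Friday.",
}

def reply(message: str) -> str:
    """Return the canned answer for the first matching keyword."""
    text = message.lower()
    for keyword, answer in RESPONSES.items():
        if keyword in text:
            return answer
    # Anything outside the fixed phrasings falls through to a dead end.
    return "Sorry, I didn't understand that."

# A rigid command works...
print(reply("What's the weather today?"))
# ...but a nuanced version of the same question falls through,
# because "umbrella" isn't in the keyword table.
print(reply("Should I take an umbrella before I head out?"))
```

A user asking about umbrellas is really asking about the weather, but a keyword matcher has no way to make that leap; that inference is exactly the hard part.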
2. Chatbots aren’t that great with accents
As well as the issues around language inflexibility, Siri was also incapable of recognising and understanding a range of accents, which led to a lot of frustrating ‘lost in translation’ scenarios.
Apple claimed the application had been designed to work with UK, US and Australian accents, saying:
“Siri can be enabled in any country, and you can choose to speak to it in English, French, or German. However, Siri is designed to recognise the specific accents and dialects of the supported countries listed above. Since every language has its own accents and dialects, the accuracy rate will be higher for native speakers.”
This meant, for example, that if you weren't a native English speaker but spoke to Siri in English, or if you spoke one of the supported languages with a strong regional accent, you wouldn't have much luck getting anywhere with Siri.
3. Chatbots don’t (yet) have emotional intelligence
Imagine you're a customer who's just started an online conversation with a chatbot because you need some support: say you've placed a winning bet but it hasn't yet been paid out.
After some back and forth with the bot, you find out that the company is experiencing a technical issue with their payment system, which should be resolved “shortly”. But you’re a new customer and you’re somewhat sceptical, so you want to know how “shortly” that will be.
You ask the same question over and over, but all you get from the chatbot each time is the generic "shortly" response it's been programmed to provide. By now you're frustrated, annoyed, and probably ready to throw your computer out the window after giving the keyboard a good bashing.
But the chatbot can’t recognise or understand your expletives, or even how the situation could be a cause of frustration to a customer.
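That failure mode can be sketched in a few lines of Python: a bot that returns its scripted answer no matter how many times you ask or how angry the message sounds. The scenario and wording are invented for illustration:

```python
# A toy support bot with no emotional awareness.
# It gives the same scripted line on every turn: no sentiment check,
# no escalation after repeated asks. All wording here is invented.
SCRIPTED_ANSWER = "Our payment system issue should be resolved shortly."

class SupportBot:
    def __init__(self) -> None:
        self.turns = 0  # how many times the customer has asked

    def reply(self, message: str) -> str:
        self.turns += 1
        # Turn 1 and turn 10 get identical treatment, even if the
        # customer is clearly furious by now.
        return SCRIPTED_ANSWER

bot = SupportBot()
print(bot.reply("When exactly is 'shortly'?"))
print(bot.reply("I've asked this five times. WHEN?!"))  # same answer
```

A human agent would notice the repetition and the tone and change tack, perhaps by apologising or escalating; this bot has no signal for either.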
It's this inability of chatbots to understand the emotions behind customer experiences that can leave organisations not only with legions of unsatisfied customers, but also with tarnished customer-service reputations.
So, while chatbots can be a great help to both organisations and customers, they lack the human element needed to engage in meaningful, empathetic and effective conversations.