As well as chatting to you, can AI chatbots now tell how you’re feeling too?

The call centre of the future might simply be an AI chatbot that's become indistinguishable from a human. Copyright Canva
By Reuters and Aisling Ní Chúláin

The automated answering service, trained using a human voice, is capable of breaking down sentences into sounds and tones so it can talk back and read a caller’s emotional state.

The next generation of chatbots powered by artificial intelligence could be so adept and sympathetic in answering your queries that you might never be able to tell them apart from their flesh-and-blood counterparts.

Five9, the company behind the new automated answering service, claims its technology - which uses AI to break sentences down into sounds and tones - will bring companies big savings on personnel costs.

The key to making the caller experience satisfactory is using a human voice to train the AI.

"At the end of the day people are going to understand that they are talking to a machine, they are going to understand that they are talking to software," Callan Scheballa, a project manager with Five9, explained.

"Yet, the voice that it uses, there's no reason for it to not sound great. Now, the more. lifelike that it can sound, at least in our experience, the better the reception of the customers that are going to be talking to it," he added.

How does it work?

To find their latest voice, Five9 auditioned actors in London and decided on Joseph Vaughn to record a series of scripts for the company.

That audio was then broken down into sounds and tones rather than words, enabling the AI program to recreate not only sentences but also distinct moods.
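
The article doesn't detail how that decomposition works, but a minimal Python sketch of the general idea might look like this. The phoneme dictionary, the mood tag, and the to_sound_units function are illustrative assumptions, not Five9's or WellSaid Labs' actual code:

```python
# A minimal sketch, assuming a toy phoneme dictionary and mood tags -
# not Five9's or WellSaid Labs' actual pipeline. It shows the general
# idea: text is decomposed into sound units plus a tone marker, and a
# synthesiser is trained on pairs of such sequences and recorded audio.

# Hypothetical pronunciation dictionary; real systems derive this from
# a grapheme-to-phoneme model trained on hours of recorded speech.
PHONEMES = {
    "hello": ["HH", "AH", "L", "OW"],
    "there": ["DH", "EH", "R"],
}

def to_sound_units(sentence: str, mood: str) -> list:
    """Decompose a sentence into phoneme tokens tagged with a mood marker."""
    units = ["<" + mood + ">"]                        # tone tag for the utterance
    for word in sentence.lower().split():
        units.extend(PHONEMES.get(word, list(word)))  # fall back to letters
        units.append("|")                             # word boundary
    return units

print(to_sound_units("Hello there", mood="friendly"))
# ['<friendly>', 'HH', 'AH', 'L', 'OW', '|', 'DH', 'EH', 'R', '|']
```

Training would then pair sequences like these with the matching slices of the actor's recordings, which is broadly how a single model can later voice new sentences in different moods.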

The software is trained to detect word combinations and tones in the caller’s voice so that it can respond to their emotional state as well.
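
The article doesn't describe Five9's emotion model, but reading a caller's state typically starts from acoustic cues such as loudness and pitch. The NumPy-only sketch below invents simple thresholds and "calm"/"agitated" labels to show the shape of that idea; none of it is drawn from Five9's software:

```python
# Illustrative sketch only - not Five9's method. Infers a rough emotional
# state from two acoustic cues: loudness (RMS energy) and voice pitch.

import numpy as np

SR = 16_000  # sample rate in Hz

def rms_energy(samples: np.ndarray) -> float:
    """Root-mean-square loudness of the signal."""
    return float(np.sqrt(np.mean(samples ** 2)))

def estimate_pitch(samples: np.ndarray) -> float:
    """Crude pitch estimate from the autocorrelation peak (50-400 Hz)."""
    corr = np.correlate(samples, samples, mode="full")[len(samples) - 1:]
    lo, hi = SR // 400, SR // 50       # lag range for plausible speech pitch
    lag = lo + int(np.argmax(corr[lo:hi]))
    return SR / lag

def guess_state(samples: np.ndarray) -> str:
    """Toy heuristic: loud, high-pitched speech is flagged as agitated."""
    if rms_energy(samples) > 0.3 and estimate_pitch(samples) > 220:
        return "agitated"
    return "calm"

# Stand-in 'caller': a loud 300 Hz tone in place of real recorded speech.
t = np.linspace(0.0, 1.0, SR, endpoint=False)
caller = 0.8 * np.sin(2 * np.pi * 300.0 * t)
print(guess_state(caller))  # -> agitated
```

A production system would combine acoustic features like these with the transcribed words themselves - the "word combinations" the article mentions.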

"We are capturing all of the audio data and all of the combination of frequencies and vibrations that are inherent to a voice, that as a human we would recognise is a voice, but the machine just is guessing sounds," Rhyan Johnson, an engineer from Wellsaid labs who is involved in the project, said.

"Eventually those sounds and those patterns turn into something that again we recognise as a human voice. We could aim for perfection but the human voice is not perfect so we aim for human naturalness instead," he added.

Five9 says its AI agents have already answered more than 82 million calls for healthcare providers like Covid Clinic and large retailers like Pizza Hut, as well as insurance companies, banks, local businesses, and state and local governments.

Their new Virtual Voiceover tech will be available next year.

For more on this story, watch the video in the media player above.

Video editor • Mathilde Godon
