Amazon recently revealed a new feature that could let Alexa, its virtual assistant, mimic anyone’s voice, including the voices of the dead.
The company unveiled this feature at its tech summit this week in a video in which a child asks the program to read a bedtime story in the voice of his dead grandmother.
“As you saw in this experience, instead of Alexa’s voice reading the book, it’s the kid’s grandma’s voice,” said Rohit Prasad, Amazon’s head scientist for Alexa AI.
According to The Verge, Prasad said that adding human attributes to AI systems is especially important at a time when so many people have lost relatives to the ongoing pandemic. Not everyone shares this sentiment, however, as the same technology could open the door to cyberattacks, much as deepfake technology already has.
According to The Washington Post, the feature, however handy it may prove for users, also raises a myriad of security and ethical concerns.
“I don’t feel our world is ready for user-friendly voice-cloning technology,” said Rachel Tobac, CEO of SocialProof Security.
Tobac added that such technology could be used by bad actors to manipulate the public with fake video clips and audio.
“If a cybercriminal can easily and credibly replicate another person’s voice with a small voice sample, they can use that voice sample to impersonate other individuals,” Tobac said.
“That bad actor can then trick others into believing they are the person they are impersonating, which can lead to fraud, data loss, account takeover, and more,” she said.
Tama Leaver, a professor of internet studies at Curtin University in Australia, added that the feature also raises questions of consent – specifically that of the deceased, who never imagined their voice would be used by such technology after their death.
“There’s a real slippery slope there of using deceased people’s data in a way that is both just creepy on one hand, but deeply unethical on another because they’ve never considered those traces being used in that way,” Leaver said.
By Zintle Nkohla