Amazon's digital assistant Alexa has a new feature on the way: the ability to mimic specific human voices.

The company demoed the feature at its annual re:MARS conference, showing a video in which a child asks Alexa to read a bedtime story in the voice of his dead grandmother.

“As you saw in this experience, instead of Alexa’s voice reading the book, it’s the kid’s grandma’s voice,” said Rohit Prasad, Amazon’s senior vice president and head scientist for Alexa AI.

Amazon has given no indication whether the feature will ever be made public, but says its systems can learn to imitate someone’s voice from less than a minute of recorded audio.

In an age of abundant videos and voice notes, that means cloning the voice of a loved one is well within the average consumer’s reach.

Although this specific application is already proving controversial, with users on social media calling the feature “creepy,” this sort of voice imitation is increasingly common.

Many audio recording suites, for example, offer users the option to clone individual voices from their recordings.

That way, if a podcast host flubs a line, a sound engineer can edit what they’ve said simply by typing in a new script.