According to an Associated Press report on June 23, Amazon has announced a new capability for its AI assistant Alexa: quickly mimicking human voices. Rohit Prasad, an Amazon senior vice president, said the feature is in development and will allow the Alexa voice assistant to synthesize a "high-quality" human voice from less than a minute of recorded audio.
Amazon's brightly colored models of its Echo Dot speaker are designed for children. [Photo/IC]
Amazon's Alexa might soon replicate the voice of family members, even if they're dead.
The capability, unveiled June 22 at Amazon's Re:Mars conference in Las Vegas, is in development and would allow the virtual assistant to mimic the voice of a specific person based on less than a minute of provided recording.
Rohit Prasad, senior vice president and head scientist for Alexa, said at the event Wednesday that the goal behind the feature was to build greater trust in users' interactions with Alexa by adding more "human attributes of empathy and affect."
“These attributes have become even more important during the ongoing pandemic when so many of us have lost ones that we love,” Prasad said. “While AI can’t eliminate that pain of loss, it can definitely make their memories last.”
In a video played by Amazon at the event, a young child asks “Alexa, can Grandma finish reading me the Wizard of Oz?” Alexa then acknowledges the request, and switches to another voice mimicking the child’s grandmother. The voice assistant then continues to read the book in that same voice.
To create the feature, Prasad said, the company had to learn how to produce a "high-quality voice" from a much shorter recording, as opposed to hours of recording in a studio. Amazon did not provide further details about the feature, which is bound to spark more privacy concerns and ethical questions about consent.