Facebook and Instagram users will now be able to talk to voices that sound a lot like John Cena and Judi Dench. The voices won’t belong to the real actors, though, but to an artificial intelligence chatbot.
Parent company Meta (META) announced Wednesday that it is adding voice conversations and celebrity voices to its artificial intelligence chatbot, Meta AI. Now, instead of simply messaging with the chatbot, users can have real-time conversations and can choose from a selection of computer-generated or celebrity voices.
The company partnered with Cena and Dench, as well as actors Kristen Bell, Awkwafina and Keegan-Michael Key, to train the chatbot to replicate their voices.
The update comes as Meta seeks to help its AI chatbot — which users can chat with on Facebook, Instagram, WhatsApp and Threads — keep pace with competitors’ products, including ChatGPT, which is rolling out its own advanced voice mode. Meta CEO Mark Zuckerberg said Meta AI is on track to be “the most used AI assistant in the world” by the end of this year, likely helped by the more than 3 billion people who use the company’s apps each day. It’s not clear, however, how Meta is measuring use of the chatbot or how frequently people engage with the tool.
Rival OpenAI came under fire earlier this year when it showed off its own real-time voice mode feature for ChatGPT because of a demo voice that sounded remarkably like actor Scarlett Johansson, who said she had been asked to participate in the project but declined. OpenAI denied that the voice, dubbed Sky, was based on Johansson, but paused its use anyway. In contrast to that debacle, Meta appears to have formed formal partnerships with the actors whose voices were used to train its tool.
Zuckerberg announced the new voice mode during his keynote speech at the company’s annual Meta Connect conference, where he also shared other AI advancements, a new, cheaper version of Meta’s Quest headsets and updates to the company’s line of augmented reality Ray-Ban glasses.
Among the other notable announcements: Meta will now let social media influencers make AI versions of themselves. Previously, influencers could train AI to have text conversations with their followers; now, followers will be able to have full, quasi-video calls with the AI versions of influencers who use the tool.
Meta’s AI technology will also auto-translate and dub foreign language Reels (Meta’s short-form videos) for viewers. So, if you speak English but a Reel comes across your feed that was originally created in, say, Spanish, it will appear in your feed as if it were made in English, complete with edits to the speaker’s mouth to make the dubbing look natural.
And you might start seeing more AI-generated content in your Facebook and Instagram feeds. Meta says it will begin generating AI images and adding them to users’ feeds based on their “interests or current trends,” a feature it’s calling “imagined for you.” (It’s not clear whether users will be able to opt out of this if they’d prefer to see only content from their real, human friends.)
Meta’s AR glasses are getting live, AI-enabled translation, too. A user can have a conversation with someone speaking a foreign language and, within seconds, hear a translation into their own language in their ear, Zuckerberg said.
Zuckerberg also previewed “Orion,” a prototype for a more advanced pair of techy glasses that would essentially put the power of an AR headset — like Meta Quest or Apple’s Vision Pro — into a pair of mostly normal-looking (if a bit bulky) glasses.
But there’s a big difference between Orion and headsets like the Quest or Vision Pro. With existing AR headsets, the wearer looks at a screen that uses cameras to display emails or photos superimposed on their surroundings, a technology known as “passthrough.” But the Orion lenses are actually see-through and use holograms to make it look as though your email inbox, text messages or even a live, 3D rendering of a friend are floating in space next to you.
Zuckerberg called them “the most advanced glasses the world has ever seen,” but they aren’t available for consumers to purchase just yet. The chief executive said Meta will continue to experiment with the glasses internally and make them available to select third-party developers to build software for them ahead of an eventual consumer version.