Meta is partnering with Snoop Dogg, Kendall Jenner, and other celebrities to create virtual assistants available to users of its messaging apps, part of a major push by the company to embed AI across its products and match the buzzy generative AI features offered by companies like Google and Microsoft.
Meta CEO Mark Zuckerberg showcased the new AI products, including an AI-enabled version of the company's Ray-Ban smart glasses, at its Connect developer conference on Wednesday.
"This isn't just going to be about answering queries, this is about entertainment," Zuckerberg said of the celebrity avatars, which he said will become available to users of Instagram, WhatsApp, and Messenger, in the next couple of days, with more to come.
Zuckerberg described the AI features as one of the three pillars in the company's plan to build a digital world, or "metaverse," along with mixed reality and smart glasses. And he stressed that Meta, which has more than 3 billion daily users of its various products, was focused on bringing the technology to the masses.
"We focus a lot of our innovation not just on the groundbreaking innovation, but on making sure these things are going to be accessible to everyone," Zuckerberg said.
During a roughly one-hour solo presentation on stage, Zuckerberg showed off a new image-generation tool called Emu that functions similarly to OpenAI's DALL-E, as well as Meta AI, the name of the company's new AI-based chatbot offerings. Zuckerberg also unveiled the new version of the company's mixed-reality headset, dubbed Quest 3, which will be available for $500 on October 10. While Zuckerberg never mentioned Apple by name, the new Quest's pricing and his comments about accessibility were a clear effort to distinguish Meta from the iPhone maker, whose Vision Pro mixed-reality headset will cost $3,500 when it goes on sale next year.
The new version of Meta's Ray-Ban smart glasses, which will cost $299 in the U.S., will allow wearers to live-stream video for the first time. The glasses will also connect to Meta AI, so that a wearer can ask a question out loud—for example, how long does chicken need to be cooked for?—while engaged in an activity, and hear the response through the glasses' built-in speakers.
Meta AI is based on the company's Llama 2 large language model, its competitor to AI models such as OpenAI's GPT-4. Zuckerberg noted that Meta AI also draws on Microsoft's Bing search engine to pull in real-time information.
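Meta has not published the production stack behind Meta AI, but the Llama 2 chat models it is built on are openly available to developers. As a rough illustration only, and not a description of Meta's own system, here is a minimal sketch of how a developer might query the smallest openly released Llama 2 chat variant through the Hugging Face transformers library (the model name and prompt below are just examples):

```python
# Illustrative sketch only: this is NOT Meta AI's actual implementation,
# just an example of querying the openly released Llama 2 chat model
# via Hugging Face transformers. Downloading the weights requires
# accepting Meta's Llama 2 license on huggingface.co.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "meta-llama/Llama-2-7b-chat-hf"  # smallest chat-tuned variant

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,  # halve memory use when running on a GPU
    device_map="auto",          # place model layers on available hardware
)

# Llama 2 chat models expect the [INST] ... [/INST] prompt format.
prompt = "[INST] How long does chicken need to be cooked for? [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```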
During his onstage presentation Wednesday, Zuckerberg played a Dungeons and Dragons–like role-playing game with the virtual Snoop Dogg character, and showed off more than a dozen other lifelike avatars based on celebrities including Tom Brady, Paris Hilton, and YouTube personality MrBeast.
Users will interact with the virtual assistants inside Meta's messaging apps, summoning an avatar and typing in questions. The avatars respond with text messages accompanied by facial expressions, such as rolling their eyes or laughing, but Zuckerberg said audio responses would come within the next year.
Meta will also introduce new tools that let businesses create their own animated AI assistants, Zuckerberg said.