
What you need to know
- Meta announced that it's opening the doors for other developers to begin creating mobile and web apps for the Ray-Ban Display.
- The company will roll out availability for its Developer Preview over the next few weeks; however, devs can get a head start while they wait.
- The Ray-Ban Display is expected to receive Meta's AI, Muse Spark, this summer, and Connect 2026 is on the way this September.
Meta announced late this week that its in-lens display glasses are opening their doors to developers looking to build apps for them.
In a press release, Meta states that while developers have been experimenting with builds for its AI glasses, the Ray-Ban Display now provides two ways for them to create future apps. Meta is rolling out access for developers (via the Developer Preview) for them to begin creating mobile and web apps. The company adds, "You can create display experiences using familiar tools, whether you're extending an existing iOS or Android mobile app or building something entirely new."
It highlights that developers won't have to worry about building tooling from scratch, as the platform is already there. Devs can get an early start building for the Ray-Ban Display as availability continues to roll out over the next few weeks.
The Meta Wearables Device Access Toolkit is where developers will find what they need for mobile apps on the Ray-Ban Display. This is a native SDK for Android and iOS, allowing devs to extend their apps onto the device's display. There will be tools to add UI features, such as "text, images, lists, buttons, and video playback." On the other hand, the new web apps path lets developers build using HTML, CSS, and JavaScript.
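Meta hasn't published the toolkit's actual API surface in this announcement, so as a purely illustrative sketch, here's the kind of small, text-first UI the web-apps path (plain HTML, CSS, and JavaScript) might render for a glanceable display. The function name, the CSS class names, and the score-card idea are all hypothetical; nothing below comes from Meta's SDK, and it uses only standard JavaScript.

```javascript
// Hypothetical sketch: build a glanceable score card as an HTML string.
// "scoreCard" and its class names are illustrative, not Meta toolkit APIs.
// In-lens displays favor short, high-contrast text, so the markup stays minimal.
function scoreCard({ home, away, homeScore, awayScore }) {
  return [
    '<div class="score-card">',
    `  <span class="team">${home}</span> ${homeScore}`,
    '  &ndash;',
    `  ${awayScore} <span class="team">${away}</span>`,
    '</div>',
  ].join('\n');
}

console.log(scoreCard({ home: 'LAL', away: 'BOS', homeScore: 102, awayScore: 99 }));
```

A real app would presumably hand markup like this to whatever rendering hook the Developer Preview exposes; that hook isn't described in the announcement.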
One big part of the Ray-Ban Display is its Neural Band, which reads your hand's movements to complete actions on the in-lens screen. Developers can now build experiences around it, adding informative overlays, real-time data (think sports scores), media streaming, and more.
More coming

This is the second bit of good news for the Ray-Ban Display this week, with the first centering on Muse Spark, though that AI announcement was only half good for the Display. Earlier this week, Meta revealed that its new LLM, Muse Spark, is headed to its AI glasses. This gives users access to a smarter, more accurate AI model, capable of handling tasks and requests with speed and precision thanks to its use of multiple AI agents.
This is rolling out for Meta's Gen 1 and Gen 2 AI glasses; however, the Display will have to wait until this summer to get it. That's not all, as Meta Connect 2026 was confirmed to take place from September 23-24. There weren't many details, but the teases pointed to AI, VR, wearables, and more for its main keynote.
Android Central's Take
This can only be a good thing for Meta, though it does feel like the company is trying to keep up with the changing tide. More companies are stepping forward with their own AI glasses, and the apps on those devices draw interest because of the developers behind them. If Meta wants to remain relevant and in contention, opening the doors to developers beyond its own is a good idea.