As Snap creators begin experimenting with the company's augmented reality glasses hardware, Snap is deepening the capabilities of Lens Studio, its tool for building augmented reality filters, to make those filters more connected, more realistic, and more futuristic.
At the company's annual Lens Fest event, Snap debuted a number of changes to its lens creation suite. The changes range from efforts to integrate external media and data to more AR-centric features designed with the future of glasses in mind.
On the media side, Snap is launching a Sounds Library that will let creators add audio clips, along with millions of songs from Snapchat's licensed music library, directly to their Lenses.
Snap is also working to bring real-time data to Lenses through an API library that surfaces changing information, such as weather from AccuWeather or cryptocurrency prices from FTX.
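To make the idea concrete, here is a minimal TypeScript sketch of how a lens script might poll such a feed. The `LiveDataService` class, its endpoint, and the `Quote` shape are illustrative assumptions, not Snap's actual API.

```typescript
// Hypothetical sketch: how a lens script might poll a real-time data partner.
// The service class, endpoint, and types are assumptions for illustration,
// not Snap's Lens Studio API.

interface Quote {
  symbol: string;
  price: number;
  asOf: string; // ISO-8601 timestamp
}

class LiveDataService {
  constructor(private baseUrl: string) {}

  async fetchQuote(symbol: string): Promise<Quote> {
    const res = await fetch(`${this.baseUrl}/quotes/${symbol}`);
    if (!res.ok) {
      throw new Error(`Quote request failed: ${res.status}`);
    }
    return (await res.json()) as Quote;
  }
}

// A lens would refresh its overlay text whenever fresh data arrives.
async function updatePriceOverlay(setText: (t: string) => void) {
  const service = new LiveDataService("https://example-data-partner.test/v1");
  const quote = await service.fetchQuote("BTC-USD");
  setText(`${quote.symbol}: $${quote.price.toFixed(2)}`);
}
```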
One of the biggest feature updates will let creators embed links inside a Lens, sending users to external web pages.
Snap's selfie filters continue to be a huge growth opportunity for the company, which has long had augmented reality in its sights.
Snap detailed that there are now more than 2.5 million lenses that have been built by more than a quarter of a million creators. These lenses have been viewed by users a total of 3.5 trillion times, the company says.
The company is also building an in-house “AR innovation lab” called Ghost, which will fund Lens designers looking to push the boundaries of what’s possible, distributing grants of up to US$150,000 for individual projects.
As the company looks to make lenses smarter, it's also looking to make them more technically capable.
In addition to integrating new types of data, Snap is also improving the underlying AR technology to bring richer lenses to users with lower-end phones. Its World Mesh feature, previously limited to high-end phones, lets a lens use real-world geometry data so that digital objects can interact with the physical environment. Now, Snap is enabling this feature on lower-end phones as well.
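For a sense of what geometry-aware lenses involve, here is a hedged TypeScript sketch of anchoring a digital object to reconstructed real-world surfaces. The `WorldMesh` and `MeshHit` types are hypothetical stand-ins, not Lens Studio's actual interfaces.

```typescript
// Sketch of what World Mesh-style geometry enables: raycast from the camera
// against reconstructed surfaces and anchor a digital object at the hit point.
// All types here are illustrative assumptions, not the real API.

type Vec3 = { x: number; y: number; z: number };

interface MeshHit {
  point: Vec3;  // where the ray met real-world geometry
  normal: Vec3; // surface orientation at that point
}

// Stand-in for the device's scene reconstruction.
interface WorldMesh {
  raycast(origin: Vec3, direction: Vec3): MeshHit | null;
}

// Anchor an AR object to the first real surface in front of the camera.
function placeOnSurface(
  mesh: WorldMesh,
  cameraPos: Vec3,
  cameraForward: Vec3,
  setTransform: (pos: Vec3, up: Vec3) => void
): boolean {
  const hit = mesh.raycast(cameraPos, cameraForward);
  if (!hit) return false; // no geometry found; leave the object floating
  setTransform(hit.point, hit.normal);
  return true;
}
```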
Similarly, Snap is releasing tools to make digital objects behave more realistically in relation to each other, launching an in-lens physics engine that enables more dynamic lenses, ones that can interact more deeply with the real world while also accommodating simultaneous user input.
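As a rough illustration of what an in-lens physics loop does, the TypeScript sketch below integrates gravity each frame and lets a tap apply an impulse while the simulation keeps running. All names are invented for illustration and do not reflect Snap's engine.

```typescript
// Minimal sketch of the kind of update loop an in-lens physics engine runs:
// integrate gravity each frame, bounce off the detected floor, and let user
// input nudge objects mid-simulation. Not Snap's actual physics API.

type Vec2 = { x: number; y: number };

interface Body {
  pos: Vec2;
  vel: Vec2;
}

const GRAVITY = -9.8; // m/s^2, pulling along -y
const FLOOR_Y = 0;    // digital objects rest on the detected ground plane

function step(body: Body, dt: number): void {
  body.vel.y += GRAVITY * dt;       // accumulate gravity
  body.pos.x += body.vel.x * dt;    // integrate position
  body.pos.y += body.vel.y * dt;
  if (body.pos.y < FLOOR_Y) {       // collide with the real-world floor
    body.pos.y = FLOOR_Y;
    body.vel.y = -body.vel.y * 0.5; // lose energy on impact
  }
}

// Simultaneous user input: a tap applies an impulse while physics continues.
function onTap(body: Body, impulse: Vec2): void {
  body.vel.x += impulse.x;
  body.vel.y += impulse.y;
}
```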
Snap's efforts to build more sophisticated lens creation tools for mobile devices come as the company also builds more advanced support for the tools developers may need to design hands-free experiences for its new AR glasses.
Creators have been experimenting with new hardware for months, and Snap is building new lens capabilities to address their concerns and generate new opportunities.
Ultimately, Snap's glasses are still in developer mode, and the company hasn't offered any timeline for when it could ship a final product with built-in AR capabilities, so it theoretically has plenty of time to develop the product.
Some of the tools that Snap has been quietly creating include Connected Lenses, which enable shared experiences within Lenses so that multiple users can interact with the same content using AR glasses.
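Conceptually, a shared lens session has each participant broadcasting local changes and applying remote ones so everyone sees the same AR content. The hedged TypeScript sketch below models that pattern; the message shape and transport are assumptions for illustration, not Snap's Connected Lenses API.

```typescript
// Hedged sketch of shared-lens state: every participant applies remote
// updates so all users see the same AR content. The types and transport
// are invented for illustration.

interface StateUpdate {
  objectId: string;
  position: { x: number; y: number; z: number };
  senderId: string;
}

class SharedSession {
  private state = new Map<string, StateUpdate>();
  private listeners: Array<(u: StateUpdate) => void> = [];

  // Called when the local user moves an object; broadcast to peers.
  publish(update: StateUpdate, send: (u: StateUpdate) => void): void {
    this.state.set(update.objectId, update);
    send(update);
  }

  // Called when a peer's update arrives; apply it locally.
  receive(update: StateUpdate): void {
    this.state.set(update.objectId, update);
    this.listeners.forEach((fn) => fn(update));
  }

  onChange(fn: (u: StateUpdate) => void): void {
    this.listeners.push(fn);
  }
}
```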
In its developer iteration, the AR glasses don't have long battery life, which means Snap has had to get creative to make sure Lenses are there when users need them without running persistently.
The company's Endurance mode lets a lens keep running off-screen in the background while waiting for a specific trigger, such as reaching a certain GPS location.
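A background trigger of this kind can be reasoned about as a cheap distance check against a target coordinate. The sketch below uses the haversine formula to decide when to wake the full lens; the function names are invented for illustration and do not reflect Snap's implementation.

```typescript
// Illustrative sketch of a geofence-style background trigger: the lens
// sleeps until the wearer enters a target GPS radius. Names are assumptions.

interface LatLng { lat: number; lng: number }

// Great-circle distance between two coordinates via the haversine formula.
function distanceMeters(a: LatLng, b: LatLng): number {
  const R = 6371000; // Earth radius in meters
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(b.lat - a.lat);
  const dLng = toRad(b.lng - a.lng);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLng / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

// Poll location cheaply in the background; wake the full lens only on arrival.
function checkGeoTrigger(
  current: LatLng,
  target: LatLng,
  radiusM: number,
  wakeLens: () => void
): void {
  if (distanceMeters(current, target) <= radiusM) {
    wakeLens(); // trigger fired: resume on-screen rendering
  }
}
```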