Snapchat has been steadily building out the resources, programs, and hardware it offers its partners, developers, and creators, such as the Snap Lens Network, GHOST, and Spectacles, to enrich the user experience. Capabilities like Connected Lenses, VoiceML, and hand tracking are letting developers create new ways of interacting with AR.
Now, Snapchat has launched the latest version of Lens Studio to help developers create more realistic Lenses. It is also working with AstrologyAPI and Sportradar to broaden its API Library, and it is enhancing the Lens Analytics feature with Event Insights so that creators can debug issues more efficiently.
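To give a sense of how a Lens might consume an endpoint from the API Library, here is a minimal sketch in the style of a Lens Studio script using its Remote Service Module. The endpoint name, parameter, and payload handling are illustrative assumptions, not the documented AstrologyAPI or Sportradar contracts.

```ts
// Minimal sketch of calling an API Library endpoint from a Lens script.
// The declarations below stand in for the Lens Studio runtime environment;
// "daily_horoscope" and its parameter are hypothetical names.
declare const script: { remoteServiceModule: any };
declare const global: { RemoteApiRequest: { create(): any } };
declare function print(msg: string): void;

// Build a request against an assumed API Library endpoint.
const request = global.RemoteApiRequest.create();
request.endpoint = "daily_horoscope";
request.parameters = { sign: "leo" };

// performApiRequest invokes the endpoint and hands back a response object;
// in Lens Studio's Remote API responses, statusCode 1 indicates success.
script.remoteServiceModule.performApiRequest(request, (response: any) => {
    if (response.statusCode === 1) {
        print("Horoscope payload: " + response.body);
    } else {
        print("Request failed, status " + response.statusCode);
    }
});
```

One attraction of this library model is that partner calls are routed through Snap's backend rather than made directly from the Lens, so API credentials never have to ship inside the Lens itself.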
It is also working on a Ray Tracing feature that will "let reflections shine from AR objects in a lifelike way." For this feature, Snapchat has partnered with Tiffany & Co. as well as Disney and Pixar, which will bring a signature piece and Buzz Lightyear's spacesuit into AR. Furthermore, it is introducing Lens Cloud, which developers can use alongside Lens Studio to build a new generation of AR experiences.
Lens Cloud brings three major services:
Multi-User Services lets groups of friends interact with one another in real time within the same Lens.
Location-Based Services anchors Lenses to places using Snap's city templates, or to any custom location around the world. Central London is the first City Landmarker available now, with more launching over the next year.
Storage Services makes it possible to build complex, interactive Lenses by storing assets in Snap's cloud and calling on them on demand. Snapchatters will also be able to pick up where they left off through persistent data support (sketched below). Storage Services will launch in the coming months.
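The persistent-data support mentioned above is the kind of capability a Lens script would use to let a Snapchatter resume a session. The following is a minimal sketch, assuming Lens Studio's persistent storage system and a hypothetical "levelReached" key; it is not taken from the announcement.

```ts
// Minimal sketch of per-Lens persistence in a Lens Studio-style script.
// The declarations stand in for the Lens runtime; "levelReached" is a
// hypothetical example key.
declare const global: {
    persistentStorageSystem: {
        store: {
            has(key: string): boolean;
            getInt(key: string): number;
            putInt(key: string, value: number): void;
        };
    };
};
declare function print(msg: string): void;

const store = global.persistentStorageSystem.store;

// Restore saved progress so the Snapchatter picks up where they left off;
// default to level 1 on first launch.
const savedLevel = store.has("levelReached") ? store.getInt("levelReached") : 1;
print("Resuming at level " + savedLevel);

// Persist new progress whenever the player advances, so it survives the
// Lens being closed and reopened.
function onLevelComplete(newLevel: number): void {
    store.putInt("levelReached", newLevel);
}
```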
Together with its Snap AR community, Snapchat is determined to demonstrate the value and impact of AR by continuing to develop new Lens capabilities, tools, and infrastructure.