LiDAR for AR filters
Yesterday evening's presentation of the new iPhone 12 Pro and 12 Pro Max marked the debut of LiDAR technology on Apple's smartphones, after it had already arrived on the iPad Pro introduced at the beginning of the year.
Just a few hours after the announcement, developers are already moving to support the sensor in their applications, and one of them is none other than Snapchat. This is no surprise, since the famous social network was used during the event to show off LiDAR's potential; still, it was not a given that support would arrive so soon.
The company has begun rolling out version 3.2 of Lens Studio, the tool used to create new filters for Snapchat. This release lets creators build filters that take advantage of the LiDAR sensor, using depth data about the environment to apply images and effects realistically.
As a reminder, LiDAR technology scans an environment and detects all the elements inside it, building a very precise 3D map. Apple has also put the sensor to work in other areas of the new iPhones' camera system, such as autofocus in low light, the ability to record video with depth data (to remove the background or other elements), and more.