To improve the accuracy of AR scenes in Apple Maps, Apple begins collecting relevant data
IT Home reported on December 14 that Apple has published a support document stating that, since the release of iOS 17.2, it has been collecting map-related data to improve its augmented reality positioning features.
Apple said it collects the data to improve the speed and accuracy of the augmented reality features in the Apple Maps app.
According to Apple's announcement, when users use augmented reality features in the Apple Maps app (such as immersive walking directions or refined location options), information about "map points" (feature points) is collected, including the shape and appearance of stationary objects such as buildings.
Apple stated that the collected information does not include photos or videos; instead, on-device machine learning algorithms compare the feature points against reference data that Apple Maps sends to the iPhone.
Apple said that the camera pipeline filters out moving objects such as people and vehicles, and that only "map points" belonging to stationary objects are collected.
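Apple has not published how this filtering is implemented, but the idea can be sketched in a few lines of Swift. The MapPoint and ObjectClass types below, and the specific labels, are assumptions made purely for illustration; this is a conceptual sketch, not Apple's code.

```swift
// Conceptual sketch only: the types and class labels are assumptions,
// not Apple's actual data model.

enum ObjectClass {
    case building, terrain, person, vehicle
}

struct MapPoint {
    let position: SIMD3<Float>   // 3D location of the feature point
    let objectClass: ObjectClass // label assigned by an on-device model
}

/// Keep only feature points that belong to stationary objects,
/// discarding anything classified as a person or vehicle.
func stationaryPoints(from points: [MapPoint]) -> [MapPoint] {
    points.filter { point in
        switch point.objectClass {
        case .building, .terrain: return true
        case .person, .vehicle:   return false
        }
    }
}
```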
The data collected by Apple is encrypted and is not associated with an individual user or Apple ID. Apple also uses on-device machine learning to add "noise" to the "map point" data, introducing irregular variations so that images of a scene cannot be reconstructed from the data.
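For illustration, such a perturbation could look roughly like the Swift sketch below, which adds small Gaussian offsets to each feature-point coordinate before upload. Apple has not disclosed its actual algorithm; the function name, the noise distribution, and the noise scale here are all assumptions made for the example.

```swift
import Foundation

// Conceptual sketch only: Apple has not published the exact perturbation it uses.
// The Gaussian distribution and the default scale are assumptions for illustration.

/// Add small random offsets to each map-point coordinate so the original
/// geometry cannot be reconstructed exactly from the uploaded data.
func addNoise(to points: [SIMD3<Float>], scale: Float = 0.05) -> [SIMD3<Float>] {
    // Box-Muller transform: turn two uniform samples into one Gaussian sample.
    func gaussian() -> Float {
        let u1 = Double.random(in: Double.ulpOfOne..<1)
        let u2 = Double.random(in: 0..<1)
        return Float(sqrt(-2 * log(u1)) * cos(2 * .pi * u2))
    }
    return points.map { point in
        point + SIMD3<Float>(gaussian(), gaussian(), gaussian()) * scale
    }
}
```

The key point in Apple's description is that the perturbation happens on the device, before anything is transmitted, so the data that leaves the iPhone never contains the exact geometry of the captured scene.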