With the launch of iOS 17.2, Apple has outlined the Maps-related data it collects to improve its augmented reality location features. In a new support document, Apple says it aims to bolster the speed and accuracy of augmented reality features in the Maps app.
When a user engages augmented reality features in Maps, such as immersive walking directions or the Refine Location option, Apple collects information on "feature points" that represent the shape and appearance of stationary objects like buildings. The data does not include photos or images, and the collected feature points are not readable by a person.
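Apple has not published how Maps derives its feature points, but ARKit's public API exposes a similar concept: a sparse point cloud describing scene geometry, with no image data attached. The sketch below is purely illustrative of what such data looks like.

```swift
import ARKit

// Illustrative only: Apple has not documented the Maps feature-point pipeline.
// ARKit's rawFeaturePoints is a public API that exposes a comparable idea:
// a sparse cloud of 3D points with no photo or image content.
class FeaturePointLogger: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let pointCloud = frame.rawFeaturePoints else { return }
        // Each point is just a coordinate in world space; a person cannot
        // reconstruct a readable photo from these values alone.
        print("Captured \(pointCloud.points.count) feature points")
    }
}
```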
According to Apple, Maps uses on-device machine learning to compare these feature points to Apple Maps reference data sent to the iPhone. Moving objects like people and vehicles are filtered out of the camera view, so Apple collects only the feature points of stationary objects.
The comparison between the feature points and the Apple Maps reference data allows Maps to pinpoint a user's location and provide detailed walking directions with AR context. Using either the AR walking directions or Refine Location also refreshes Apple's reference data to improve augmented reality accuracy.
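As a rough, hypothetical sketch of the matching idea described above, the function below scores how well locally observed feature points line up with downloaded reference points. The names, tolerance value, and nearest-point approach are assumptions for illustration; Apple's actual localization algorithm is not public.

```swift
import simd

// Hypothetical sketch: score how closely observed feature points match
// reference points, as a stand-in for the on-device comparison Apple describes.
func matchScore(observed: [SIMD3<Float>],
                reference: [SIMD3<Float>],
                tolerance: Float = 0.25) -> Float {
    guard !observed.isEmpty, !reference.isEmpty else { return 0 }
    // Count observed points that fall within `tolerance` meters of any reference point.
    let matched = observed.filter { point in
        reference.contains { simd_distance($0, point) < tolerance }
    }
    return Float(matched.count) / Float(observed.count)
}
```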
Data that Apple collects is encrypted and not associated with an individual user or Apple ID. Apple also uses on-device machine learning to add "noise" to the feature point data, introducing irregular variations that prevent any attempt to use the feature points to reconstruct an image.
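The snippet below is a minimal, hypothetical illustration of the "noise" idea: each point is perturbed by a small random offset so the original geometry cannot be recovered exactly. Apple's actual obfuscation method, and the magnitude it uses, are not public.

```swift
import simd

// Hypothetical illustration of adding noise to feature points.
// The 5 cm magnitude is an assumed value, not Apple's.
func addNoise(to points: [SIMD3<Float>], magnitude: Float = 0.05) -> [SIMD3<Float>] {
    points.map { point in
        let jitter = SIMD3<Float>(Float.random(in: -magnitude...magnitude),
                                  Float.random(in: -magnitude...magnitude),
                                  Float.random(in: -magnitude...magnitude))
        return point + jitter
    }
}
```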
According to Apple, only an "extremely sophisticated attacker" with access to the company's encoding system would be able to recreate an image from feature points, and because the data is encrypted and limited to Apple, "an attack and recreation are extremely unlikely."
Users can stop Apple from collecting this AR data by turning off the "Improve AR Location Accuracy" toggle, which can be found in the Settings app by going to Privacy & Security and then tapping Analytics & Improvements.