The Photos app in iOS 10 has been updated with what Apple calls "Siri intelligence," which in practice means new deep learning techniques and more advanced facial and object recognition algorithms.
Using these tools, Photos is able to scan a user's entire photo library, detecting people, animals, places, and objects and grouping photos together logically based on those categories. As can be seen in the video below, this enables powerful search capabilities, allowing users to search for "cats" to bring up all of their cat photos, or "mountains" to find every image of mountains.
New to Photos in iOS 10 is a "People" album, housing all of a user's images featuring people, grouped based on facial recognition, and there's a world map that shows the physical location where each of a user's photos was taken.
Perhaps the best new feature in Photos is a "Memories" tab that uses all of the image recognition, date, and location information to aggregate photos around certain days, vacation trips, family events, and more, so photos can be revisited on a regular basis. Memories also offers quick video montages of photos, set to music.
Also new in the iOS 10 Photos app are Live Filters that work with Live Photos and new Markup tools for annotating photos.
The new features in Photos are powered by a device's GPU, with all learning done on a device-by-device basis to ensure full privacy. Apple has made it clear that it does not see images or image metadata. When using the new Photos features, each device with a photo library will need to scan images independently -- the results are not yet synced through iCloud.
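Apple does not expose the Photos scanning pipeline to developers, but for a rough idea of what on-device image classification looks like, here is a minimal Swift sketch using the Vision framework's VNClassifyImageRequest. That API arrived in later iOS releases and is used here purely as an illustrative assumption, not as the mechanism the iOS 10 Photos app actually uses.

```swift
import UIKit
import Vision

// Illustrative sketch only: classify a single image entirely on-device.
// VNClassifyImageRequest is a stand-in from a later Vision release, not
// the private pipeline behind the iOS 10 Photos app.
func classifyOnDevice(_ image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }
    let request = VNClassifyImageRequest { request, _ in
        // Keep only high-confidence labels such as "cat" or "mountain".
        let labels = (request.results as? [VNClassificationObservation])?
            .filter { $0.confidence > 0.8 }
            .map { $0.identifier } ?? []
        completion(labels)
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    // Run off the main thread; no image data leaves the device.
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```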
In case you missed them, make sure to check out our seven-minute WWDC 2016 video, which features a quick rundown of all the new iOS, macOS Sierra, tvOS, and watchOS features Apple introduced this week, and our video highlighting iOS 10's overhauled Lock screen. Stay tuned to MacRumors for more in-depth software videos.