Apple Previews New Door Detection, Apple Watch Mirroring, and Live Captions Accessibility Features

Apple today previewed a range of new accessibility features, including Door Detection, Apple Watch Mirroring, Live Captions, and more.

Door Detection will allow individuals who are blind or have low vision to use their iPhone or iPad to locate a door upon arriving at a new destination, understand how far they are from it, and learn the door's attributes, including how it can be opened and any nearby signs or symbols. The feature will be part of a new "Detection Mode" in Magnifier, alongside People Detection and Image Descriptions, and will only be available on iPhone and iPad models with a LiDAR scanner.

Users with physical disabilities who rely on assistive features like Voice Control and Switch Control will be able to fully control their Apple Watch Series 6 and Apple Watch Series 7 from their ‌iPhone‌ with Apple Watch Mirroring, which works via AirPlay, using inputs such as voice commands, sound actions, head tracking, and more.

New Quick Actions on the Apple Watch will allow users to use a double-pinch gesture to answer or end a phone call, dismiss a notification, take a photo, play or pause media in the Now Playing app, and start, pause, or resume a workout.

Deaf users and those who are hard of hearing will be able to follow Live Captions across the ‌iPhone‌, ‌iPad‌, and Mac, providing a way for users to follow any audio content more easily, such as during a phone call or when watching video content. Users can adjust the font size, see Live Captions for all participants in a group FaceTime call, and type responses that are spoken aloud. English Live Captions will be available in beta on the ‌iPhone‌ 11 and later, ‌iPad‌ models with the A12 Bionic and later, and Macs with Apple silicon later this year.

Apple will expand support for VoiceOver, its screen reader for blind and low vision users, with 20 new languages and locales, including Bengali, Bulgarian, Catalan, Ukrainian, and Vietnamese. In addition, users will be able to select from dozens of new optimized voices across languages, and a new Text Checker tool will find formatting issues in text.

There will also be Sound Recognition that can be customized to recognize a home's unique doorbell and appliance sounds, adjustable response times for Siri, new themes and customization options in Apple Books, and sound and haptic feedback in Apple Maps to help VoiceOver users find the starting point for walking directions.

The new accessibility features will be released later this year via software updates. For more information, see Apple's full press release.

To celebrate Global Accessibility Awareness Day, Apple also announced plans to launch SignTime in Canada on May 19 to support customers with American Sign Language (ASL) interpreters, launch live sessions in Apple Stores and social media posts to help users discover accessibility features, expand the Accessibility Assistant shortcut to the Mac and Apple Watch, highlight accessibility features in Apple Fitness+ such as Audio Hints, release a Park Access for All guide in ‌Apple Maps‌, and flag accessibility-focused content in the App Store, Apple Books, the TV app, and Apple Music.

Top Rated Comments

MikhailT
28 months ago

Apple again leads in accessibility. Love the Live captions and door detection.
To be fair, Android already has this Live Captions feature, as does Google Chrome. I have had to rely on it on all platforms.

Microsoft announced Live Captions and has been testing it in Windows 11 Insider builds for a few months now.

Apple is late as usual, but I’m sure theirs will be the best-implemented version, as that is just them.

Regardless, everyone wins here. We need more accessibility support across the industry.
Score: 8 Votes
NoGood@Usernames
28 months ago

I think the difference is that Google does all processing on their servers, Apple's implementation is on-device only and works offline. (not to mention your conversation stays private)
Actually, Google’s live caption is all done on-device and does not require an internet connection to function. They have been moving more and more voice request processing to on-device the past few years.
Score: 6 Votes
iStorm
28 months ago

Actually, Google’s live caption is all done on-device and does not require an internet connection to function. They have been moving more and more voice request processing to on-device the past few years.
This is correct. Taken from Android Accessibility Help ('https://support.google.com/accessibility/android/answer/9350862?hl=en') page: "All captions are processed locally, never stored, and never leave your device."

When it comes to accessibility, users need anything that can help them now. They can't sit around and wait for something else, so I would say Apple is late to the game here. I know a co-worker who switched to Android several years ago so he could use the live caption feature for meetings. Previously, he was using a captioning service over the phone, but was not a fan of having another live person listening in on the meetings.
Score: 5 Votes
surfzen21
28 months ago

Apple again leads in accessibility. Love the Live captions and door detection.
Agreed. A lot of their accessibility features seem to get overlooked, but they are actually life-changing for folks in need.
Score: 4 Votes
Apple$
28 months ago
Better late than never, Apple. As a CI Android user, I love the live captions feature so much! It's just so handy when you are watching a YouTube video that doesn't have captions. Instead of skipping it as I did in the past, I just turn on the live captions.
Score: 3 Votes
eilavid
28 months ago

To be fair, Android already has this Live Captions feature, as does Google Chrome. I have had to rely on it on all platforms.

Microsoft announced Live Captions and has been testing it in Windows 11 Insider builds for a few months now.

Apple is late as usual, but I’m sure theirs will be the best-implemented version, as that is just them.

Regardless, everyone wins here. We need more accessibility support across the industry.
I think the difference is that Google does all processing on their servers, Apple's implementation is on-device only and works offline. (not to mention your conversation stays private)
Score: 3 Votes