iOS 15's Live Text Feature Lets You Digitize Written Notes, Call a Number on a Sign, Translate a Menu, and Much More

In iOS 15, Apple is introducing a new feature called Live Text that can recognize text when it appears in your camera's viewfinder or in a photo you've taken and let you perform several actions with it.

For example, Live Text allows you to capture a phone number from a storefront with the option to place a call, or look up a location name in Maps to get directions. It also incorporates optical character recognition, so you can search for a picture of a handwritten note in your photos and save it as text.

Live Text's content awareness extends to everything from QR codes to emails that appear in pictures, and this on-device intelligence feeds into Siri suggestions, too.
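This kind of entity detection — spotting phone numbers, emails, and links inside recognized text — can be approximated by developers with Foundation's public NSDataDetector API. The sketch below is an illustration of the idea only, not Apple's private Live Text implementation:

```swift
import Foundation

// Sketch: finding actionable entities (phone numbers and links) in a
// string of recognized text, using Foundation's NSDataDetector.
let recognized = "Call (555) 123-4567 or visit https://example.com for hours"

let types: NSTextCheckingResult.CheckingType = [.phoneNumber, .link]
let detector = try! NSDataDetector(types: types.rawValue)

let range = NSRange(recognized.startIndex..., in: recognized)
for match in detector.matches(in: recognized, options: [], range: range) {
    switch match.resultType {
    case .phoneNumber:
        print("Phone:", match.phoneNumber ?? "")  // could feed a "call" action
    case .link:
        print("Link:", match.url?.absoluteString ?? "")  // could feed an "open" action
    default:
        break
    }
}
```

In a Live Text-style flow, each detected match would be surfaced as a tappable action (place a call, open a URL) rather than printed.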

For instance, if you take a picture that shows an email address and then open the Mail app and start composing a message, ‌Siri‌'s keyboard suggestions will offer up the option to add "Email from Camera" to the To field of your message.

Other Live Text options include the ability to copy text from the camera viewfinder or photos for pasting elsewhere, share it, look it up in the dictionary, or have it translated into English, Chinese (Simplified or Traditional), French, Italian, German, Spanish, or Portuguese.

By recognizing the text in pictures, it can even help sort and search your photos alongside existing categories like location, people, scenes, and objects. For example, searching for a word or phrase in Spotlight will bring up pictures from your Camera Roll in which that text appears.

Live Text works in Photos, Screenshot, Quick Look, and Safari, and in live previews with Camera. In the Camera app, it's available whenever you point your iPhone's camera at anything that displays text: a small icon appears in the bottom-right corner of the viewfinder whenever textual content is recognized. Tapping the icon highlights the recognized text and lets you select it to perform an action. A similar icon appears in the ‌Photos‌ app when you're viewing a captured image.
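Live Text itself is a system feature with no public switch, but the same kind of on-device text recognition has been available to developers through Apple's Vision framework (VNRecognizeTextRequest, iOS 13 and later). A minimal sketch of recognizing the text in an image:

```swift
import UIKit
import Vision

// Sketch: on-device OCR similar in spirit to Live Text, using the
// Vision framework's VNRecognizeTextRequest.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else { completion([]); return }

    let request = VNRecognizeTextRequest { request, error in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the top candidate string for each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate   // favor accuracy over speed
    request.usesLanguageCorrection = true

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

The recognized strings could then be passed to something like NSDataDetector to surface actionable items, mirroring what the system does automatically.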

In another neural engine feature, Apple is introducing something called Visual Look Up that lets you take photos of objects and scenes to get more information about them. Point your ‌iPhone‌'s camera at a piece of art, flora, fauna, a landmark, or a book, and the Camera app will indicate with an icon that it recognizes the content and has relevant ‌Siri‌ Knowledge that can add context.

Since Live Text relies heavily on Apple's neural engine, the feature is only available on iPhones and iPads with an A12 Bionic chip or later. If you have an ‌iPhone‌ X or earlier, or an iPad older than the iPad mini (5th generation), iPad Air (3rd generation, 2019), or iPad (8th generation, 2020), then unfortunately you won't have access to it.

The iOS 15 beta is currently in the hands of developers, with a public beta set to be released next month. The official launch of iOS 15 is scheduled for the fall.



Top Rated Comments

ruka.snow
46 months ago
Let's see them digitise doctors' notes and prescriptions.
Score: 19 Votes
Unggoy Murderer
46 months ago

They don’t want to. Hence why they aren’t enabling it for the intel macs. Just a cash grab as they hope everybody will sell their newish intel Mac, which they paid handsomely for, at a huge loss and then go and pay handsomely again for a new Apple silicon Mac.
I'm sure Apple could get it running on the A11; however, people apparently forget there would be trade-offs to enable that to happen. What if enabling it on the A11 halved the battery life? Would you want that? Would that be a trade-off you'd be willing to make? Or what if the phone's performance dropped by 25%?

Apple can't (and shouldn't be expected to) support older hardware with every single new feature that's released. Apple does a far better job of supporting older hardware than its competitors (the iPhone 6s, a 2015 phone, gets iOS 15), and older devices still get the benefit of at least some new features and improvements.

I'm sure some people will want to upgrade their hardware to support the latest software additions, but I (and, I'm sure, a lot of other folks with newish machines) won't be bothered enough. Nice features, but I won't lose sleep over not having them.


Apple isn’t the company from heaven as most seem to think.
Of course they're not - Apple is a money making enterprise.
Score: 14 Votes
EmotionalSnow
46 months ago

Forgive me if I'm wrong, but wasn't the A11 the first chip with a Neural Engine? So surely they could make this work on the X (and even the 8 and 8 Plus) if they REALLY wanted to.
The later phones have an improved Neural Engine. I'm guessing that these are required in order to ensure good performance.
Score: 8 Votes
Unggoy Murderer
46 months ago

The old Word Lens app (and its subsequent iterations after Google acquired it) was able to both capture and translate text in real time using substantially slower hardware.

Live Text would just be this, without the translation (whether on-device or not). I can't see why the neural engine would be an absolute requirement, even if you try to make a 'performance' argument.
Try using Word Lens for any sustained period - it absolutely hammered your battery. And just to be clear, that isn't a criticism of Word Lens - it was an exceptional app for its time; it just couldn't take advantage of optimised silicon.

The performance argument is *the* argument. Apple more than likely could implement it on all hardware, but what if it halved your battery life, or made Safari and the Camera app considerably slower while running CPU-bound neural networks on every downloaded image or every camera frame?

Also worth noting, text recognition isn't a trivial task, especially with handwriting.
Score: 8 Votes
farewelwilliams
46 months ago
implemented better than android
Score: 5 Votes
LukeHarrison
46 months ago

Since Live Text relies heavily on Apple's neural engine, the feature is only available on iPhones and iPads with at least an A12 Bionic or better chip, which means if you have an iPhone X or earlier model or anything less than an iPad mini (5th generation), iPad Air (2019, 3rd generation), or iPad (2020, 8th generation), then unfortunately you won't have access to it.
Forgive me if I'm wrong, but wasn't the A11 the first chip with a Neural Engine? So surely they could make this work on the X (and even the 8 and 8 Plus) if they REALLY wanted to.
Score: 5 Votes