Third-Party Devs Will Be Able to Access iPadOS Apple Pencil Latency Improvements for Art Apps

With iPadOS, Apple introduced performance improvements to how the iPad Pro works with the Apple Pencil, cutting latency from 20ms to 9ms with the new software.

Third-party developers who make apps that use the Apple Pencil will also be able to take advantage of some of these latency improvements, Apple software engineering chief Craig Federighi confirmed last week.

Federighi shared the information in response to an email from Artstudio Pro developer Cladio Juliano, who tweeted Federighi's reply last week. The exchange was highlighted today in a tweet by developer Steve Troughton-Smith.

In the email, Federighi explains that third-party developers have had access to predicted touches via UIKit since iOS 9, and that with iOS 13, developers will get the "latest and greatest" touch prediction advancements that help minimize PencilKit drawing latency.
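In practice, predicted touches are delivered through UIEvent alongside the real touch data. Here's a rough sketch of how a drawing app might use them; the CanvasView class and its point arrays are illustrative, not Apple sample code:

```swift
import UIKit

class CanvasView: UIView {
    // Points actually committed to the stroke
    private var committedPoints: [CGPoint] = []
    // Speculative points drawn ahead of real input, replaced on every event
    private var predictedPoints: [CGPoint] = []

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }

        // Coalesced touches recover the full-rate Pencil samples delivered between frames
        for coalesced in event?.coalescedTouches(for: touch) ?? [touch] {
            committedPoints.append(coalesced.location(in: self))
        }

        // Predicted touches let the stroke reach slightly ahead of the Pencil tip,
        // hiding a frame or two of latency. They are never committed, because the
        // prediction is revised on the next event.
        predictedPoints = (event?.predictedTouches(for: touch) ?? []).map { $0.location(in: self) }

        setNeedsDisplay()
    }
}
```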

Federighi explains how Apple achieved the latency improvements, and he points out that there's a small 4ms gap that developers won't have access to for now, because Apple hasn't yet found a way to safely expose the capability to third parties. From Federighi's email:

Note that we achieve low latency through a combination of several techniques: Metal rendering optimizations, touch prediction, and mid-frame event processing. Third-party developers can achieve similar low-latency drawing experiences by taking advantage of Metal rendering and touch prediction best practices covered in the WWDC Sessions I've referenced below.

With these you can achieve nearly all of the improvements you've seen in PencilKit drawing with your own renderer. (There does remain a small gap: 4 ms of our improvement comes from a technique called mid-frame event processing; we are looking for ways to expose this capability to third party engines in the future, but for this year this one was only safely achievable through tight integration within our frameworks).
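On the Metal side, those best practices generally come down to keeping the swapchain shallow and requesting drawables as late as possible. The sketch below shows one possible CAMetalLayer setup along those lines; the MetalCanvasView class, the drawable count, and the renderStroke() method are illustrative rather than Apple's implementation:

```swift
import UIKit
import Metal
import QuartzCore

class MetalCanvasView: UIView {
    override class var layerClass: AnyClass { CAMetalLayer.self }

    private var metalLayer: CAMetalLayer { layer as! CAMetalLayer }
    private let device = MTLCreateSystemDefaultDevice()!
    private lazy var commandQueue = device.makeCommandQueue()!

    override init(frame: CGRect) {
        super.init(frame: frame)
        metalLayer.device = device
        metalLayer.pixelFormat = .bgra8Unorm
        // A two-deep swapchain keeps at most one frame in flight,
        // trading a little throughput for lower input-to-display latency.
        metalLayer.maximumDrawableCount = 2
        metalLayer.framebufferOnly = true
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    func renderStroke() {
        // Acquire the drawable as late as possible so the CPU never sits
        // blocked on the display while new Pencil input is still arriving.
        guard let drawable = metalLayer.nextDrawable(),
              let commandBuffer = commandQueue.makeCommandBuffer() else { return }

        let pass = MTLRenderPassDescriptor()
        pass.colorAttachments[0].texture = drawable.texture
        pass.colorAttachments[0].loadAction = .clear
        pass.colorAttachments[0].storeAction = .store
        pass.colorAttachments[0].clearColor = MTLClearColor(red: 1, green: 1, blue: 1, alpha: 1)

        if let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: pass) {
            // Encode the stroke geometry here (omitted in this sketch).
            encoder.endEncoding()
        }

        commandBuffer.present(drawable)
        commandBuffer.commit()
    }
}
```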

For developers, the WWDC sessions Federighi suggests include PencilKit, Adopting Predicted Touches, and Metal Performance Optimization.
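For apps that don't need a fully custom renderer, adopting PencilKit directly is the most straightforward way to pick up the new pipeline, since PKCanvasView uses Apple's own stroke renderer, which is where the mid-frame work lives. A minimal sketch, with a placeholder view controller name:

```swift
import PencilKit
import UIKit

class DrawingViewController: UIViewController {
    // PKCanvasView draws with Apple's renderer, so it inherits the system's
    // latency optimizations without any custom Metal code.
    private let canvasView = PKCanvasView()

    override func viewDidLoad() {
        super.viewDidLoad()
        canvasView.frame = view.bounds
        canvasView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        // The system handles prediction and low-latency rendering for the selected tool.
        canvasView.tool = PKInkingTool(.pen, color: .black, width: 4)
        view.addSubview(canvasView)
    }
}
```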

In a nutshell, the information shared by Federighi confirms that third-party apps that take advantage of the Apple Pencil will be getting some of the same latency improvements that we'll see when using the Apple Pencil with native features like Markup.

The ‌Apple Pencil‌ latency improvements are built into iPadOS, the version of iOS 13 that is designed to run on the iPad. All of Apple's current iPads support the ‌Apple Pencil‌. ‌iPad Pro‌ models work with the ‌Apple Pencil‌ 2, while the 6th-generation ‌iPad‌, iPad mini, and iPad Air work with the original ‌Apple Pencil‌.

Top Rated Comments

thisisnotmyname Avatar
74 months ago
I would have expected that was automatically exposed from the OS. I'm a bit surprised that they had to explicitly make some of those capabilities available to third party developers.
Score: 12 Votes
cocky jeremy Avatar
74 months ago
I mean... duh?
Score: 6 Votes
cmaier Avatar
74 months ago
I would have expected that was automatically exposed from the OS. I'm a bit surprised that they had to explicitly make some of those capabilities available to third party developers.
It is, if you use the appropriate control. But developers may want to integrate it into their own canvas or controls, in which case it is harder to expose, since things you do in your own code can interfere with the ability of the pen code to get the cycles it needs from the GPU and CPU.
Score: 4 Votes
nexusrule Avatar
74 months ago
How nice of Apple. You would think they would limit functionality improvements to their own apps.
I think you don’t know how development works. When you start creating code, you can’t always abstract it in a way that’s usable by third-party devs through an API. What Federighi meant is that right now the code that allows for that part of the delay reduction is split across different Apple software technologies. To be made safely accessible to other devs it needs to be abstracted and made independent, because private frameworks can’t be exposed for security reasons. You build these frameworks after you have the working feature; it’s simply impossible to abstract a solution that doesn’t exist. And this sort of work can require a massive rewrite of some parts of the relevant underlying technologies, and it takes time.
Score: 4 Votes
NickName99 Avatar
74 months ago
I love that he gets into such detail. That’s interesting about the 4ms improvement they got using something they apparently can’t expose as a public method without some risk.

Now I’m curious about “mid-frame event processing”, but googling it hasn’t immediately got me anything.
Score: 4 Votes
Cayden Avatar
74 months ago
I love that he gets into such detail. That’s interesting about the 4ms improvement they got using something they apparently can’t expose as a public method without some risk.

Now I’m curious about “mid-frame event processing”, but googling it hasn’t immediately got me anything.
Now I’m not sure, so take this with a grain of salt, but as an engineer I’m inclined to believe “mid-frame event processing” means they are updating some pixel information (likely just the pixels associated with the pencil) in between the frame updates in which all pixel information is updated and displayed. In other words, in between hardware detections of the pencil location, software would update where it predicts the pencil to be on the next update, and it can start looking for the pencil there instead of looking arbitrarily, meaning the location can (usually) be found quicker. What I’m not sure about is whether these pixels are actually being updated mid-frame or if the processing is simply keeping this information stored until the next frame is ready to update. I can’t see how the pixels could be updated mid-frame unless they had an individual refresh rate, so I’m inclined to believe the second case. If it’s the second case, it would make sense why Apple doesn’t want to give developers access to this, as it could quickly lead to timing errors between the software and hardware interrupts, such that it would only work within Apple’s framework and not an arbitrary code framework.
Score: 3 Votes