Apple Intelligence Not Trained on YouTube Content, Says Apple

Apple on Thursday addressed concerns about its use of AI training data, following an investigation that revealed Apple, along with other major tech companies, had used YouTube subtitles to train artificial intelligence models.

The investigation, reported by Wired earlier this week, found that subtitles from more than 170,000 videos by popular content creators were part of a dataset used to train AI models. Apple specifically used this dataset in the development of its open-source OpenELM models, which were made public in April.

However, Apple has now confirmed to 9to5Mac that OpenELM does not power any of its AI or machine learning features, including the company's Apple Intelligence system. Apple clarified that OpenELM was created solely for research purposes, with the aim of advancing open-source large language model development.

When it released OpenELM on the Hugging Face Hub, a community for sharing AI code and models, Apple researchers described it as a "state-of-the-art open language model" designed to "empower and enrich the open research community." The model is also available through Apple's Machine Learning Research website. Apple has stated that it has no plans to develop new versions of the OpenELM model.
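For researchers who do want to experiment with it, the OpenELM checkpoints can be loaded from the Hugging Face Hub with the standard transformers API. The sketch below is illustrative only, not an Apple-documented workflow: the checkpoint name ("apple/OpenELM-270M") and the choice of a Llama-2 tokenizer are assumptions, and OpenELM's custom modeling code means remote code execution has to be enabled.

```python
# Minimal sketch: loading an OpenELM checkpoint from the Hugging Face Hub
# for research purposes. The checkpoint and tokenizer names are assumptions;
# the tokenizer repo may require Hugging Face authentication.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "apple/OpenELM-270M"            # assumed checkpoint name (other sizes exist)
tokenizer_id = "meta-llama/Llama-2-7b-hf"  # assumed compatible tokenizer

tokenizer = AutoTokenizer.from_pretrained(tokenizer_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Generate a short continuation to confirm the model loads and runs.
inputs = tokenizer("Open language models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```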

The company emphasized that since OpenELM is not integrated into ‌Apple Intelligence‌, the "YouTube Subtitles" dataset is not being used to power any of its commercial AI features. Apple reiterated its previous statement that ‌Apple Intelligence‌ models are trained on "licensed data, including data selected to enhance specific features, as well as publicly available data collected by our web-crawler."

The Wired report detailed how companies including Apple, Anthropic, and NVIDIA had used the "YouTube Subtitles" dataset for AI model training. This dataset is part of a larger collection known as "The Pile," which is compiled by the non-profit organization EleutherAI.
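For context, The Pile is distributed as JSON-lines shards in which each record carries a metadata tag identifying the subset it came from, which is how a "YouTube Subtitles" slice can be pulled out of the larger collection. The snippet below is a rough sketch under those assumptions; the file path is a placeholder and the exact subset label ("YoutubeSubtitles") is an assumption about the metadata.

```python
# Rough sketch: filtering a Pile-formatted JSONL shard for YouTube-derived
# records. Assumes each line is a JSON object with "text" and a "meta" dict
# containing "pile_set_name"; the path and subset label are assumptions.
import json

def iter_youtube_subtitles(path):
    """Yield the text of records tagged as YouTube subtitles in a Pile JSONL shard."""
    with open(path, "r", encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            if record.get("meta", {}).get("pile_set_name") == "YoutubeSubtitles":
                yield record["text"]

# Example usage with a placeholder shard name:
# for text in iter_youtube_subtitles("pile_shard_00.jsonl"):
#     print(text[:80])
```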

Top Rated Comments

sniffies
15 months ago
Thank god for that. Training on YouTube videos from popular content creators would render Apple Intelligence pretty unintelligent.
Score: 25 Votes
Havalo
15 months ago
Never believe anything until it’s been officially denied - Sir Humphrey (Yes, Minister)
Score: 13 Votes
foobarbaz
15 months ago

Quoting Fuzzball84: "Like a person, it could have been exposed to anything out in the wild and we don’t walk around with a list of references. But we treat this software differently to people… you wouldn’t let anyone off the street on your iPhone or laptop… similar goes for AI."
I think you're humanizing the AI too much. It's not a person searching knowledge "in the wild". It is a large file created by a training algorithm that was given a lot of crawled data as input. It doesn't learn anything outside of what its creators are passing along. And crucially, once training is complete, it's no longer acquiring knowledge. (Every interaction you have with it starts with a blank slate, or with explicit "context" drawn from your previous sessions/personal data.)

So the model's creators know absolutely what has been used to train it. They're generally just cagey about it, because they don't want to be sued once they admit whose copyrighted content they've used.
Score: 7 Votes
peneaux
15 months ago

Quoting sniffies: "Thank god for that. Training on YouTube videos from popular content creators would render Apple Intelligence very unintelligent."
Unintelligent is a very polite way of saying garbage.
Score: 6 Votes
Fuzzball84
15 months ago
How do we truly know what they have been trained on?

Like a person, it could have been exposed to anything out in the wild and we don’t walk around with a list of references. But we treat this software differently to people… you wouldn’t let anyone off the street on your iPhone or laptop… similar goes for AI.
Score: 6 Votes
antiprotest
15 months ago
I believe Apple on this, because from all that we have heard this thing is going to be so delayed that at this point it hasn't been trained on ANY content.
Score: 5 Votes