Code for Apple's Communication Safety Feature for Kids Found in iOS 15.2 Beta [Updated]

Update: We've learned from Apple that the Communication Safety code found in the first iOS 15.2 beta does not mean the feature is shipping in that update, and Apple does not plan to release the feature as it is described in this article.


Apple this summer announced new Child Safety Features that are designed to keep children safer online. One of those features, Communication Safety, appears to be included in the iOS 15.2 beta that was released today. This feature is distinct from the controversial CSAM initiative, which has been delayed.

Based on code found in the iOS 15.2 beta by MacRumors contributor Steve Moser, Communication Safety is being introduced in the update. The code is there, but we have not been able to confirm that the feature is active because it requires sensitive photos to be sent to or from a device set up for a child.

As Apple explained earlier this year, Communication Safety is built into the Messages app on iPhone, iPad, and Mac. It will warn children and their parents when sexually explicit photos are received by or sent from a child's device, with Apple using on-device machine learning to analyze image attachments.
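
Apple has not published how that on-device analysis works, but in broad strokes, an on-device image check in Swift could look something like the sketch below. The SensitiveImageClassifier model name and the confidence threshold are placeholders for illustration, not Apple's actual implementation.

```swift
import Vision
import CoreML
import UIKit

// Minimal sketch of on-device image screening, assuming a hypothetical bundled
// Core ML classifier named SensitiveImageClassifier. Apple has not published
// the model or threshold it uses for Communication Safety.
func isLikelySensitive(_ image: UIImage, completion: @escaping (Bool) -> Void) {
    guard let cgImage = image.cgImage,
          let coreMLModel = try? SensitiveImageClassifier(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else {
        completion(false)
        return
    }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Take the top classification and compare it against an arbitrary confidence cutoff.
        let top = (request.results as? [VNClassificationObservation])?.first
        completion(top?.identifier == "sensitive" && (top?.confidence ?? 0) > 0.8)
    }

    // Vision runs the model entirely on device; nothing is sent to a server.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```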

If a sexually explicit photo is flagged, it is automatically blurred and the child is warned against viewing it. If a child under 13 taps the photo and views it anyway, their parents will be alerted.

Code in the iOS 15.2 beta includes some of the wording that children will see:

  • You are not alone and can always get help from a grownup you trust or with trained professionals. You can also block this person.
  • You are not alone and can always get help from a grownup you trust or with trained professionals. You can also leave this conversation or block contacts.
  • Talk to someone you trust if you feel uncomfortable or need help.
  • This photo will not be shared with Apple, and your feedback is helpful if it was incorrectly marked as sensitive.
  • Message a Grownup You Trust.
  • Hey, I would like to talk with you about a conversation that is bothering me.
  • Sensitive photos and videos show the private body parts that you cover with bathing suits.
  • It's not your fault, but sensitive photos can be used to hurt you.
  • The person in this may not have given consent to share it. How would they feel knowing other people saw it?
  • The person in this might not want it seen-it could have been shared without them knowing. It can also be against the law to share.
  • Sharing nudes to anyone under 18 years old can lead to legal consequences.
  • If you decide to view this, your parents will get a notification to make sure you're OK.
  • Don't share anything you don't want to. Talk to someone you trust if you feel pressured.
  • Do you feel OK? You're not alone and can always talk to someone who's trained to help here.

There are specific phrases for children under 13 and for children 13 and older, as the feature behaves differently for each age group. As mentioned above, if a child 13 or older views a nude photo, their parents will not be notified, but if a child under 13 does so, parents will be alerted. All of these Communication Safety features must be enabled by parents and are available for Family Sharing groups.

  • Nude photos and videos can be used to hurt people. Once something's shared, it can't be taken back.
  • It's not your fault, but sensitive photos and videos can be used to hurt you.
  • Even if you trust who you send this to now, they can share it forever without your consent.
  • Whoever gets this can share it with anyone-it may never go away. It can also be against the law to share.
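
The age-gated behavior described above is straightforward to express in code. The sketch below is purely illustrative, assuming hypothetical ChildAccount and SafetyAction types; Apple's actual Messages implementation is not public.

```swift
// Illustrative sketch of the age-gated flow: sensitive photos are blurred and
// the child is warned, and only children under 13 trigger a parent notification
// if they choose to view the photo anyway. These types are assumptions, not
// Apple's real API.
struct ChildAccount {
    let age: Int
    let communicationSafetyEnabled: Bool  // turned on by a parent via Family Sharing
}

enum SafetyAction {
    case deliverNormally
    case blurAndWarn(notifyParentsOnView: Bool)
}

func action(for account: ChildAccount, photoFlaggedSensitive: Bool) -> SafetyAction {
    guard account.communicationSafetyEnabled, photoFlaggedSensitive else {
        return .deliverNormally
    }
    return .blurAndWarn(notifyParentsOnView: account.age < 13)
}
```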

Apple said in August that these Communication Safety features would be added in updates to iOS 15, iPadOS 15, and macOS Monterey later this year. iMessage conversations remain end-to-end encrypted and are not readable by Apple.

Communication Safety was announced alongside a new CSAM detection initiative that would see Apple scanning photos uploaded to iCloud Photos for Child Sexual Abuse Material. That plan has been highly controversial and heavily criticized, leading Apple to "take additional time over the coming months" to make improvements before introducing the new functionality.

There is currently no sign of CSAM detection wording in the iOS 15.2 beta, so Apple may introduce Communication Safety before implementing the full suite of Child Safety Features.

Related Forum: iOS 15


Top Rated Comments

tzm41
45 months ago
Hopefully no CSAM ever… The system is going to be exploited by some states one way or the other.
Score: 42 Votes

Marbles1
45 months ago
“Sensitive photos and videos show the private body parts that you cover with bathing suits.”

Weird prudish Apple clearly have no idea about cultures outside the USA.
Score: 27 Votes

HappyDude20
45 months ago
Terrible move by Apple
Score: 25 Votes

840quadra
45 months ago
I am hoping that there are far more details and explanations of what Apple is doing on device, and in the cloud for this feature before it is activated or officially offered to consumers. I get what they are trying to do, but for some there is a huge creep factor attached to this type of service / feature.
Score: 22 Votes

Wildkraut
45 months ago
Did Ned Flanders get a Job at Apple?
Score: 22 Votes

frumpy16
45 months ago
Quoting another comment: People need to stop using "CSAM" to mean "CSAM detection". Let's expand the acronym in your sentence: "Apple must have worked on child sexual abuse material for a long time" - so, what you're basically saying is Apple is dealing in child porn illegally.

Pedantic. It's pretty obvious what people mean.
Score: 19 Votes