MacRumors

For this week's giveaway, we've teamed up with Saddleback Leather Co. to offer MacRumors readers a chance to win a Front Pocket Leather Briefcase and a set of four different AirTag holders.

Bags from Saddleback Leather Co., such as the Front Pocket Leather Briefcase that we're giving away, are designed to last a lifetime and come with a 100-year warranty. They're made from a thick full-grain leather that's durable and able to stand up to wear and tear over the years.

The Front Pocket Leather Briefcase, which comes in tobacco, chestnut, black, and dark coffee brown, is sized to hold a MacBook and all of the accessories that you might need to carry with it. There's a dedicated laptop pocket that can accommodate up to a 17-inch laptop and a second pocket inside for holding other items.

Two front pockets are able to hold an iPhone, cables, wallet, and other accessories, plus there are two additional pockets behind them and another two pockets at the sides. For papers and other small items, there's a quick access rear pocket. To hide your most sensitive items like spare cash or a passport, there's a secret false bottom.

There are no zippers, magnets, snaps, or buttons on the Front Pocket Leather Briefcase, so there are no breakable components to deal with. The bag is built with buckles, and all of the stress points are reinforced with rivets and hidden polyester strapping.

The briefcase is made from a tough, water-resistant full-grain leather, and it is stitched with marine-grade polymer thread that's designed to hold up to the sun and the elements. The clasps are made from stainless steel that Saddleback Leather Co. says can hold more than 700 pounds.

There's a leather shoulder strap for carrying the briefcase, but the bag can also convert into a laptop backpack, as demoed in the walkthrough video below.


The Front Pocket Leather Briefcase is one of Saddleback Leather Co.'s most premium and spacious offerings, which is why it's priced at $719, but Saddleback Leather Co. has several smaller, lighter, and more affordable bags available too. The company also makes a series of AirTag holders, which are its newest product and came out earlier this year.

Saddleback Leather Co.'s AirTag holders are priced at $19 to $24, and they're made from the same protective full-grain leather as Saddleback's bags. There's a huge range of designs, like the eight-sided Rivet and the square-shaped Sleeve, both of which attach to keys, backpacks, bags, and more, plus other fun shapes ranging from bulls to koalas.

Other designs include the Strap and the Double Loop. The Strap can be used as a keyring or a loop thanks to a longer strip of leather, while the Double Loop has a two-loop design that's ideal for attaching to dog collars, backpack shoulder straps, and more.

All of the ‌AirTag‌ holders have a little pocket where the ‌AirTag‌ is housed, and they all have a discreet design that makes them look more like a simple keychain than an ‌AirTag‌ accessory.


We have a Front Pocket Leather Briefcase in the tobacco color and a set of four matching ‌AirTag‌ holders (Rivet, Strap, Double Loop, and Sleeve) to give away to one lucky MacRumors reader. To enter to win our giveaway, use the Gleam.io widget below and enter an email address. Email addresses will be used solely for contact purposes to reach the winners and send the prizes. You can earn additional entries by subscribing to our weekly newsletter, subscribing to our YouTube channel, following us on Twitter, following us on Instagram, or visiting the MacRumors Facebook page.

Due to the complexities of international laws regarding giveaways, only U.S. residents who are 18 years or older and Canadian residents (excluding Quebec) who have reached the age of majority in their province or territory are eligible to enter. To offer feedback or get more information on the giveaway restrictions, please refer to our Site Feedback section, as that is where discussion of the rules will be redirected.

The contest will run from today (August 6) at 11:00 a.m. Pacific Time through 11:00 a.m. Pacific Time on August 13. The winner will be chosen randomly on August 13 and will be contacted by email. The winner will have 48 hours to respond and provide a shipping address before a new winner is chosen.

Apple this week announced that, starting later this year with iOS 15 and iPadOS 15, the company will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children, a non-profit organization that works in collaboration with law enforcement agencies across the United States.

The plans have sparked concerns among some security researchers and other parties that Apple could eventually be forced by governments to add non-CSAM images to the hash list for nefarious purposes, such as to suppress political activism.

"No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this," said prominent whistleblower Edward Snowden, adding that "if they can scan for kiddie porn today, they can scan for anything tomorrow." The non-profit Electronic Frontier Foundation also criticized Apple's plans, stating that "even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor."

To address these concerns, Apple provided additional commentary about its plans today.

Apple's known CSAM detection system will be limited to the United States at launch. To address the potential for some governments to try to abuse the system, Apple confirmed to MacRumors that it will consider any global expansion of the system on a country-by-country basis after conducting a legal evaluation. Apple did not provide a timeframe for a global expansion, if such a move ever happens.

Apple also addressed the hypothetical possibility of a particular region in the world deciding to corrupt a safety organization in an attempt to abuse the system, noting that the system's first layer of protection is an undisclosed threshold that must be met before a user is flagged for having inappropriate imagery. Even if the threshold is exceeded, Apple said its manual review process would serve as an additional barrier: if reviewers confirm the absence of known CSAM imagery, Apple would not report the flagged user to NCMEC or law enforcement agencies, and the system would be working exactly as designed.

Apple also highlighted some proponents of the system, with some parties praising the company for its efforts to fight child abuse.

"We support the continued evolution of Apple's approach to child online safety," said Stephen Balkam, CEO of the Family Online Safety Institute. "Given the challenges parents face in protecting their kids online, it is imperative that tech companies continuously iterate and improve their safety tools to respond to new risks and actual harms."

Apple did admit that there is no silver-bullet answer when it comes to the potential for the system to be abused, but the company said it is committed to using the system solely for known CSAM imagery detection.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

See update at bottom of article



Spotify this week confirmed that its plans to add AirPlay 2 support to its iOS app have been placed on indefinite hiatus.

In an online discussion forum post, a Spotify representative said the streaming music service had been working on supporting AirPlay 2, but the company has paused the efforts "for now" due to "audio driver compatibility issues." The representative described AirPlay 2 support as a "bigger project that we won't be able to complete in the foreseeable future."

AirPlay 2 launched as part of iOS 11.4 in May 2018 with several enhancements to the original AirPlay wireless audio protocol, including multi-room audio, Siri voice control, and improved buffering. An app can support AirPlay 2 by implementing AVFoundation framework APIs, with a four-step process outlined on Apple's developer website.

Spotify has also yet to add native support for the HomePod (or the newer HomePod mini), despite Apple opening up the speaker to third-party music services last year. That's notable given that Spotify filed a complaint against Apple with the European Commission in March 2019, accusing the iPhone maker of "locking Spotify and other competitors out of Apple services such as Siri, HomePod, and Apple Watch."

We've reached out to Spotify for comment and we'll provide an update if we hear back.

Update (Aug 7, 2021): Spotify has reached out to indicate that the original post was not accurate and that AirPlay 2 support is still in the works:

A post on one of Spotify’s Community pages contained incomplete information regarding our plans for AirPlay2. Spotify will support AirPlay2 and we’re working to make that a reality.

Spotify has updated the original post with the revised information.

Amazon today has solid deals on Apple's AirPods, including a new all-time low price on the AirPods with Wireless Charging Case. You can get these for $129.98, down from $199.00. The AirPods are shipped and sold from Amazon, and no coupon is required.

Note: MacRumors is an affiliate partner with Amazon. When you click a link and make a purchase, we may receive a small payment, which helps us keep the site running.

This price beats the typical discount seen on these AirPods by about $20, and it's only available on Amazon as of writing. This is the model of the AirPods that includes the Wireless Charging Case, for charging the headphones on any Qi wireless mat.

For the lower-cost model, Amazon has the AirPods with Wired Charging Case for $119.00, down from $159.00. We've seen this model drop to $109 in the past, but that was a rare sale that hasn't reemerged since earlier in 2021.

We track sales for every model of the AirPods in our Best AirPods Deals guide, so be sure to bookmark that page while you shop around for the wireless headphones.

Related Roundup: Apple Deals

Apple suppliers manufacturing the iPhone 13 models are struggling to hire enough workers ahead of the expected launch of new devices in September, according to the South China Morning Post.

In an attempt to attract workers to meet demand for the ‌iPhone‌ 13 lineup, Apple suppliers in China are significantly raising their starting bonuses. Foxconn's factory in Zhengzhou, which is estimated to manufacture around 80 percent of the world's iPhones, has raised its new-hire bonus to a record high of 10,200 yuan ($1,578).

Likewise, Lens Technology has doubled its bonus from 5,000 yuan in February to 10,000 yuan in May, while Luxshare Precision's factory in Guangdong has doubled its internal referral bonus from 2,500 yuan in April to 5,000 yuan in May, with a top-up bonus of 3,800 yuan for returning workers who previously left the company.

Over the past three years, Apple has added more new suppliers from mainland China to its vendor list than from any other country. At the same time, the growth of China's labor force has peaked as factory jobs lose their appeal and more workers move out of industrial labor, forcing companies to lure workers with more attractive pay packages.

The current aggressive hiring spree is meant to support ramped-up production of the ‌iPhone‌ 13 models, which are believed to be on track for launch late next month.

Related Forum: iPhone

Last month we tracked a pair of discounts on a few Apple accessories offered by Verizon, including the Apple Pencil 2 and 2021 11-inch iPad Pro Magic Keyboard. Both of these sales are still happening on Verizon in August, in addition to a returning discount on the new Siri Remote.

Note: MacRumors is an affiliate partner with Verizon. When you click a link and make a purchase, we may receive a small payment, which helps us keep the site running.

Starting with the Siri Remote, Verizon has the new version of the Apple TV accessory for $49.97, down from $59.00. This sale first appeared back in May, but it hadn't reemerged until now. If you've been looking to purchase the new Siri Remote as an add-on to an older Apple TV 4K model, this is likely the best price we'll track on this accessory in 2021.

There are a lot of changes in the new Siri Remote, which features a one-piece aluminum body, a tactile clickpad with five-way navigation, and rearranged buttons. The clickpad also supports touch gestures, with the outer ring supporting a circular "jog" gesture that Apple says will help you find the exact spot you're looking for in a video.


Moving to the Apple Pencil 2, you can get this accessory for $103.99, down from $129.00. This sale doesn't require any coupon codes, and Verizon offers free two-day shipping for most orders placed within the United States.

This is a second-best price for the Apple Pencil 2, matching the previous best price seen on the accessory earlier in 2021. Although the Apple Pencil 2 has dropped to around $99.00 in the past, that offer is extremely rare, and Verizon's current sale is a solid choice if you're on the hunt for the Apple Pencil 2.


Lastly, the 2021 11-inch iPad Pro Magic Keyboard is priced at $239.19, down from $299.00. This is the first cash discount we've tracked on the White version of the 11-inch Magic Keyboard, but we have seen the Black model down to around $199 before.

The 11-inch Magic Keyboard first launched in Black in early 2020, and Apple followed up that release with a White finish earlier this spring. Both colors are on sale today on Verizon, and just like the Apple Pencil 2 deal, you won't need any coupon code to see these savings.

The Magic Keyboard features a built-in trackpad and floating cantilever design for multiple viewing angle options. Keep up with all of this week's best discounts on Apple products and related accessories in our dedicated Apple Deals roundup.

Related Roundup: Apple Deals

Apple's plans to scan users' iCloud Photos libraries against a database of child sexual abuse material (CSAM) to look for matches, and to scan children's messages for explicit content, have come under fire from privacy whistleblower Edward Snowden and the Electronic Frontier Foundation (EFF).

In a series of tweets, the prominent privacy campaigner and whistleblower Edward Snowden highlighted concerns that Apple is rolling out a form of "mass surveillance to the entire world" and setting a precedent that could allow the company to scan for any other arbitrary content in the future.

Snowden also noted that Apple has historically been an industry-leader in terms of digital privacy, and even refused to unlock an iPhone owned by Syed Farook, one of the shooters in the December 2015 attacks in San Bernardino, California, despite being ordered to do so by the FBI and a federal judge. Apple opposed the order, noting that it would set a "dangerous precedent."

The EFF, an eminent international non-profit digital rights group, has issued an extensive condemnation of Apple's move to scan users' iCloud libraries and messages, saying that it is extremely "disappointed" that a "champion of end-to-end encryption" is undertaking a "shocking about-face for users who have relied on the company's leadership in privacy and security."

Child exploitation is a serious problem, and Apple isn't the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor...

It's impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger's encryption itself and open the door to broader abuses.

All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children's, but anyone's accounts. That's not a slippery slope; that's a fully built system just waiting for external pressure to make the slightest change.

The EFF highlighted how various governments around the world have passed laws that demand surveillance and censorship of content on various platforms, including messaging apps, and that Apple's move to scan messages and ‌iCloud Photos‌ could be legally required to encompass additional materials or easily be widened. "Make no mistake: this is a decrease in privacy for all ‌iCloud Photos‌ users, not an improvement," the EFF cautioned. See the EFF's full article for more information.

The condemnations join the large number of concerns from security researchers and users on social media since Apple's announcement of the changes yesterday, triggering petitions to urge Apple to roll back its plans and affirm its commitment to privacy.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

The first physical Apple retail store in India, promised by Apple CEO Tim Cook last year to open in 2021, has been delayed due to the worldwide health crisis, the company has confirmed to The Indian Express.

Last year, during an earnings call, Apple CEO ‌Tim Cook‌ stated that his company planned to open its first official Apple Store in India in 2021. Due to the unforeseen and unpredictable challenges presented by the global health crisis, however, Apple has delayed the opening of its Mumbai store.

In September, Apple catered to its strong Indian customer base by opening an online store, giving customers a direct way to purchase products from the company itself rather than from authorized retailers.

In the past, Cook and other Apple executives have remarked on India's importance, including in the latest quarterly earnings call, noting that Apple had seen double-digit growth in India, along with a few other countries.

Popular cryptocurrency exchange Coinbase has announced that it is now allowing traders to use bank cards linked to Apple Pay to purchase crypto assets on the platform.


"Today, we're introducing new and seamless ways to enable crypto buys with linked debit cards to Apple Pay and Google Pay, and instant cashouts up to $100,000 per transaction available 24/7," said a Coinbase blog post on Thursday.

"If you already have a Visa or Mastercard debit card linked in your Apple Wallet, Apple Pay will automatically appear as a payment method when you're buying crypto with Coinbase on an Apple Pay-supported iOS device or Safari web browser."

In addition, Coinbase said it is also making it easier and faster for users to access their money by offering instant cashouts via Real Time Payments (RTP), allowing customers in the U.S. with linked bank accounts to instantly and securely cash out up to $100,000 per transaction.

In June, Coinbase debit cards gained ‌Apple Pay‌ support, allowing the Coinbase Card to be added to the Wallet app on iPhone. The card automatically converts the cryptocurrency that a user wishes to spend to U.S. dollars and transfers the funds to the Coinbase Card for ‌Apple Pay‌ purchases and ATM withdrawals.

Apple TV+ show "Physical," starring Rose Byrne, has been renewed for a second season, reports Deadline. The dark comedy, whose season one finale airs Friday, sees Byrne take the role of Sheila, a distressed housewife in the 1980s who transforms into an aerobics video star while her husband runs for political office.


"We couldn't be more proud to showcase Annie Weisman's singular take on this darkly funny, heartbreaking and bold story," said Michelle Lee, Director of Domestic Programming at Apple TV+. "And then we got to watch Rose Byrne inhabit this incredible, multi-layered character, giving us an unforgettable tour de force performance. We have been thrilled to see audiences around the world fall in love and feel seen by this show and we can't wait for everyone to experience the next chapter in Sheila's journey towards personal empowerment."

"Physical" was created by Annie Weisman, who has worked on shows like "About a Boy," "Suburgatory," "I Feel Bad," and "Desperate Housewives." The show is directed by "I, Tonya" director Craig Gillespie, along with Liza Johnson and Stephanie Laing.

Apple Cash, Apple's peer-to-peer payments service that works with Apple Pay and iMessage, received a couple of small updates on Thursday.

It's now possible to use Instant Transfer with both Mastercard and Visa debit cards. Previously, only Visa cards were supported, so the addition of Mastercard makes Instant Transfer more accessible to users who want to quickly move money from an Apple Cash balance to a bank account without waiting for the transaction to be processed.

Apple says that beginning August 26, 2021, the cost of making an Instant Transfer will change to 1.5% (previously 1%) of the transfer amount, with a minimum fee of $0.25 and a maximum fee of $15.
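For concreteness, the new fee schedule works out to a simple clamp: 1.5 percent of the amount, never less than $0.25 and never more than $15. The function below is our own illustrative sketch, not an Apple API:

```python
def instant_transfer_fee(amount: float) -> float:
    """Apple Cash Instant Transfer fee effective August 26, 2021:
    1.5% of the transfer amount, clamped to a $0.25 minimum and a
    $15 maximum, rounded to the nearest cent."""
    return round(min(max(amount * 0.015, 0.25), 15.00), 2)

print(instant_transfer_fee(10.00))    # small transfers hit the $0.25 minimum
print(instant_transfer_fee(100.00))   # mid-range transfers pay the flat 1.5%
print(instant_transfer_fee(2000.00))  # large transfers are capped at $15
```

Under the old 1 percent fee, a $100 transfer cost $1.00; under the new schedule it costs $1.50.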

If users don't want to use Instant Transfer, they can also transfer money to their bank account using ACH and receive it within one to three business days with no fee.

To make an Instant Transfer, open the Wallet app and select your Apple Cash card, then tap the three-dotted icon. Tap Transfer to Bank, enter an amount, and select Instant Transfer.

Currently only available in the United States, Apple Cash can be used to make and receive payments in Messages, or you can get Siri to send money to a friend or family member.

When someone sends you money, it goes onto your virtual Apple Cash card, which is stored securely in the Wallet app on your iPhone. You can use that money to send to someone else or to make purchases using ‌‌Apple Pay‌‌ in stores, within apps, and on the web.

Apple in May released the new M1 11 and 12.9-inch iPad Pro models, the first iPads to use Apple's M-series chips designed for Macs instead of A-series iOS chips. The 12.9-inch ‌iPad Pro‌ also has an all-new mini-LED display.


We did a hands-on video back when the ‌M1‌ ‌iPad Pro‌ first came out, but MacRumors videographer Dan has been using it daily since launch, and thought he'd revisit it to give an updated review on how it has fit into his workflow and with his other devices.

If you're still on the fence about one of the ‌M1‌ iPad Pros, Dan's video is worth watching to see how it performs in day-to-day usage over a period of time and whether it's worth picking up.

Apple today sent out emails highlighting its latest Apple Pay promo, which has a back-to-school focus. Discounts are available at stores like Bed Bath & Beyond, Billabong, J. Crew, and more.


  • Bed Bath & Beyond - 15 percent in My Funds rewards when you shop in the app to use toward a future purchase.
  • Billabong - 30 percent off a single item with promo code APPLEPAY.
  • J. Crew - An extra $25 off when spending $150 or more with promo code APPLEPAY.
  • Lands' End - 45 percent off full-priced styles with promo code APPLEPAY.
  • Quiksilver - 30 percent off any single item with promo code APPLEPAY.

The deals that are mentioned in the email will be available through August 11, 2021. ‌Apple Pay‌ is required when making a purchase to get the discounts.

Apple today announced a series of new child safety initiatives that are coming alongside the latest iOS 15, iPadOS 15, and macOS Monterey updates and that are aimed at keeping children safer online.

One of the new features, Communication Safety, has raised privacy concerns because it allows Apple to scan images sent and received by the Messages app for sexually explicit content. Apple has confirmed, however, that this is an opt-in feature limited to the accounts of children and that it must be enabled by parents through the Family Sharing feature.

If a parent turns on Communication Safety for the Apple ID account of a child, Apple will scan images that are sent and received in the Messages app for nudity. If nudity is detected, the photo will be automatically blurred and the child will be warned that the photo might contain private body parts.

"Sensitive photos and videos show the private body parts that you cover with bathing suits," reads Apple's warning. "It's not your fault, but sensitive photos and videos can be used to hurt you."

The child can choose to view the photo anyway, and for children that are under the age of 13, parents can opt to get a notification if their child clicks through to view a blurred photo. "If you decide to view this, your parents will get a notification to make sure you're OK," reads the warning screen.

These parental notifications are optional and are only available when the child viewing the photo is under the age of 13. Parents cannot be notified when a child between the ages of 13 and 17 views a blurred photo, though children that are between those ages will still see the warning about sensitive content if Communication Safety is turned on.

Communication Safety cannot be enabled on adult accounts and is only available for users that are under the age of 18, so adults do not need to worry about their content being scanned for nudity.

Parents need to expressly opt in to Communication Safety when setting up a child's device with Family Sharing, and it can be disabled if a family chooses not to use it. The feature uses on-device machine learning to analyze image attachments and because it's on-device, the content of an iMessage is not readable by Apple and remains protected with end-to-end encryption.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Apple today announced that iOS 15 and iPadOS 15 will see the introduction of a new method for detecting child sexual abuse material (CSAM) on iPhones and iPads in the United States.

User devices will download an unreadable database of known CSAM image hashes and will do an on-device comparison against the user's own photos, flagging any known CSAM material before it's uploaded to iCloud Photos. Apple says that this is a highly accurate method for detecting CSAM and protecting children.

CSAM image scanning is not an optional feature and it happens automatically, but Apple has confirmed to MacRumors that it cannot detect known CSAM images if the ‌iCloud Photos‌ feature is turned off.

Apple's method works by identifying a known CSAM photo on device and then flagging it with an attached voucher when it's uploaded to ‌iCloud Photos‌. After a certain number of vouchers (aka flagged photos) have been uploaded to ‌iCloud Photos‌, Apple can interpret the vouchers and conducts a manual review. If CSAM content is found, the user account is disabled and the National Center for Missing and Exploited Children is notified.
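In broad strokes, the voucher-and-threshold flow can be sketched as below. This is a deliberately simplified toy, not Apple's implementation: Apple's real safety vouchers are cryptographically constructed so that none can be read before the threshold is met, the hash database is unreadable on device, and the actual threshold is undisclosed. The hash values and threshold here are placeholders.

```python
# Toy sketch of a threshold-based flagging flow (illustrative only).
KNOWN_CSAM_HASHES = {"hash_a", "hash_b", "hash_c"}  # stand-in hash database
REVIEW_THRESHOLD = 3  # the real threshold is undisclosed

def upload_with_vouchers(photo_hashes: list[str]) -> bool:
    """Attach a 'match' voucher to each photo whose hash appears in the
    known database; a manual review is triggered only once the number
    of matching vouchers reaches the threshold."""
    vouchers = [h for h in photo_hashes if h in KNOWN_CSAM_HASHES]
    return len(vouchers) >= REVIEW_THRESHOLD  # True -> review triggered
```

With a threshold of 3, a library containing two matches (say, `["hash_a", "benign", "hash_b"]`) returns False and triggers nothing; only a third match would cross the line into review.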

Because Apple is scanning ‌iCloud Photos‌ for the CSAM flags, it makes sense that the feature does not work with ‌iCloud Photos‌ disabled. Apple has also confirmed that it cannot detect known CSAM images in iCloud Backups if ‌iCloud Photos‌ is disabled on a user's device.

It's worth noting that Apple is scanning specifically for hashes of known child sexual abuse materials and it is not broadly inspecting a user's photo library or scanning personal images that are not already circulating among those who abuse children. Still, users who have privacy concerns about Apple's efforts to scan user photo libraries can disable ‌iCloud Photos‌.

Security researchers have expressed concerns over Apple's CSAM initiative and worry that it could in the future be used to detect other kinds of content with political and safety implications, but for now, Apple's efforts are limited to seeking out child abusers.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Apple today announced that with the launch of iOS 15 and iPadOS 15, it will begin scanning iCloud Photos in the U.S. to look for known Child Sexual Abuse Material (CSAM), with plans to report the findings to the National Center for Missing and Exploited Children (NCMEC).

News of the CSAM initiative leaked before Apple detailed its plans, and security researchers have already begun expressing concerns about how Apple's new image scanning protocol could be used in the future, as noted by the Financial Times.

Apple is using a "NeuralHash" system to compare known CSAM images to photos on a user's iPhone before they're uploaded to iCloud. If there is a match, that photograph is uploaded with a cryptographic safety voucher, and at a certain threshold, a review is triggered to check if the person has CSAM on their devices.
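Perceptual hashing systems like NeuralHash are designed so that visually similar images yield identical or near-identical hashes. A common way to compare such hashes, sketched below, is Hamming distance (the number of differing bits). This is an illustrative stand-in, not Apple's actual matching protocol, which relies on cryptographic comparison of blinded hash values:

```python
def hamming_distance(h1: int, h2: int) -> int:
    """Number of differing bits between two equal-length hash values."""
    return bin(h1 ^ h2).count("1")

def is_match(h1: int, h2: int, tolerance: int = 0) -> bool:
    """Treat hashes within `tolerance` differing bits as a match.
    A tolerance of 0 is an exact comparison."""
    return hamming_distance(h1, h2) <= tolerance

print(is_match(0b10110010, 0b10110010))     # identical hashes match
print(is_match(0b10110010, 0b10110011, 1))  # one flipped bit, within tolerance
print(is_match(0b10110010, 0b01001101))     # very different hashes don't match
```

The looser the tolerance, the more robust matching is to crops and re-encodes, but also the larger the surface for accidental matches, which is exactly the trade-off researchers are scrutinizing.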

At the current time, Apple is using its image scanning and matching technology to look for child abuse material, but researchers worry that in the future it could be adapted to scan for other kinds of imagery, such as anti-government signs at protests.

In a series of tweets, Johns Hopkins cryptography researcher Matthew Green said that CSAM scanning is a "really bad idea" because in the future, it could expand to scanning end-to-end encrypted photos rather than just content that's uploaded to ‌iCloud‌. For children, Apple is implementing a separate scanning feature that looks for sexually explicit content directly in iMessages, which are end-to-end encrypted.

Green also raised concerns over the hashes that Apple plans to use because there could potentially be "collisions," where someone sends a harmless file that shares a hash with CSAM and could result in a false flag.
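Green's collision worry is easiest to see with a deliberately weak toy hash. The point is not that NeuralHash is this weak, only that any function mapping arbitrary inputs to fixed-size outputs necessarily admits distinct inputs with identical hashes:

```python
def toy_hash(data: bytes) -> int:
    """A deliberately weak 8-bit hash: the sum of all bytes modulo 256.
    Real perceptual hashes are far larger, but the pigeonhole principle
    still guarantees collisions exist for any fixed-size hash."""
    return sum(data) % 256

# Two different inputs, one hash value: reordering bytes doesn't change a sum.
a = b"holiday photo"
b = b"photo holiday"
print(a != b and toy_hash(a) == toy_hash(b))  # prints True
```

In a matching system, a collision like this is the mechanism behind a false flag: a harmless file that happens to share a hash with a database entry would be treated as a match until a human reviews it.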

Apple for its part says that its scanning technology has an "extremely high level of accuracy" to make sure accounts are not incorrectly flagged, and reports are manually reviewed before a person's ‌iCloud‌ account is disabled and a report is sent to NCMEC.

Green believes that Apple's implementation will push other tech companies to adopt similar techniques. "This will break the dam," he wrote. "Governments will demand it from everyone." He compared the technology to "tools that repressive regimes have deployed."


Security researcher Alec Muffett, who formerly worked at Facebook, said that Apple's decision to implement this kind of image scanning was a "huge and regressive step for individual privacy." "Apple are walking back privacy to enable 1984," he said.

Ross Anderson, professor of security engineering at the University of Cambridge, called it an "absolutely appalling idea" that could lead to "distributed bulk surveillance" of devices.

As many have pointed out on Twitter, multiple tech companies already do image scanning for CSAM. Google, Twitter, Microsoft, Facebook, and others use image hashing methods to look for and report known images of child abuse.


It's also worth noting that Apple was already scanning some content for child abuse images prior to the rollout of the new CSAM initiative. In 2020, Apple chief privacy officer Jane Horvath said that Apple used screening technology to look for illegal images and would disable accounts if evidence of CSAM was detected.

Apple in 2019 updated its privacy policies to note that it would scan uploaded content for "potentially illegal content, including child sexual exploitation material," so today's announcements are not entirely new.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Apple today previewed new child safety features that will be coming to its platforms with software updates later this year. The company said the features will be available in the U.S. only at launch and will be expanded to other regions over time.

[Image: iPhone Communication Safety feature]

Communication Safety

First, the Messages app on the iPhone, iPad, and Mac will be getting a new Communication Safety feature to warn children and their parents when receiving or sending sexually explicit photos. Apple said the Messages app will use on-device machine learning to analyze image attachments, and if a photo is determined to be sexually explicit, the photo will be automatically blurred and the child will be warned.

When a child attempts to view a photo flagged as sensitive in the Messages app, they will be alerted that the photo may contain private body parts, and that the photo may be hurtful. Depending on the age of the child, there will also be an option for parents to receive a notification if their child proceeds to view the sensitive photo or if they choose to send a sexually explicit photo to another contact after being warned.
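
The decision flow described above can be sketched as a simple function. Everything here is illustrative: the function name, the age cutoff, and the opt-in flag are assumptions for demonstration, not Apple's actual API or policy details.

```python
# Illustrative sketch of the Communication Safety decision flow.
# The age cutoff (13) and all names are assumptions, not Apple's
# actual implementation or published policy.

def handle_explicit_photo(child_age: int, child_proceeds: bool,
                          parent_opted_in: bool) -> dict:
    """Return UI actions once on-device ML has flagged a photo as explicit."""
    actions = {"blur": True, "warn": True, "notify_parent": False}
    # Parental notification depends on the child's age, requires the
    # parent to have enabled it, and fires only if the child chooses
    # to view (or send) the photo after being warned.
    if child_proceeds and child_age < 13 and parent_opted_in:
        actions["notify_parent"] = True
    return actions
```

The key design point the sketch captures is that the blur and warning are unconditional for flagged photos, while the parental notification is gated on the child's own choice to proceed.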

Apple said the new Communication Safety feature will be coming in updates to iOS 15, iPadOS 15, and macOS Monterey later this year for accounts set up as families in iCloud. Apple emphasized that iMessage conversations will remain protected with end-to-end encryption, keeping private communications unreadable by Apple.

Scanning Photos for Child Sexual Abuse Material (CSAM)

Second, starting this year with iOS 15 and iPadOS 15, Apple will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies.

Apple said its method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, Apple said the system will perform on-device matching against a database of known CSAM image hashes provided by the NCMEC and other child safety organizations. Apple said it will further transform this database into an unreadable set of hashes that is securely stored on users' devices.

The hashing technology, called NeuralHash, analyzes an image and converts it to a unique number specific to that image, according to Apple.

"The main purpose of the hash is to ensure that identical and visually similar images result in the same hash, while images that are different from one another result in different hashes," said Apple in a new "Expanded Protections for Children" white paper. "For example, an image that has been slightly cropped, resized or converted from color to black and white is treated identical to its original, and has the same hash."
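
The behavior Apple describes — visually similar images producing the same hash — is characteristic of perceptual hashing. As a rough sketch of the idea only (a toy "average hash" on a grayscale grid, not NeuralHash, which uses a neural network), the following shows a resized copy of an image hashing identically to the original:

```python
# Toy perceptual "average hash" illustrating the property Apple describes:
# visually similar images map to the same hash. This is NOT NeuralHash;
# it is a minimal stand-in for the general idea.

def average_hash(pixels: list[list[int]]) -> str:
    """Downscale a 2D grayscale image to 4x4 by block-averaging, then emit
    one bit per cell: 1 if the cell is brighter than the overall mean."""
    h, w = len(pixels), len(pixels[0])
    bh, bw = h // 4, w // 4
    cells = [
        sum(pixels[y][x]
            for y in range(r * bh, (r + 1) * bh)
            for x in range(c * bw, (c + 1) * bw)) / (bh * bw)
        for r in range(4) for c in range(4)
    ]
    mean = sum(cells) / len(cells)
    return "".join("1" if v > mean else "0" for v in cells)

# An 8x8 gradient image and a 2x upscaled copy hash identically.
img = [[x * 32 for x in range(8)] for _ in range(8)]
img_big = [[img[y // 2][x // 2] for x in range(16)] for y in range(16)]
print(average_hash(img) == average_hash(img_big))  # prints True
```

A cryptographic hash like SHA-256 would behave in the opposite way: resizing the image even slightly would produce a completely different hash, which is why perceptual hashing is used for this kind of matching.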

[Image: Apple CSAM detection flow chart]
Before an image is stored in iCloud Photos, Apple said an on-device matching process is performed for that image against the unreadable set of known CSAM hashes. If there is a match, the device creates a cryptographic safety voucher. This voucher is uploaded to iCloud Photos along with the image, and once an undisclosed threshold of matches is exceeded, Apple is able to interpret the contents of the vouchers for CSAM matches. Apple then manually reviews each report to confirm there is a match, disables the user's iCloud account, and sends a report to NCMEC. Apple is not sharing what its exact threshold is, but says the system ensures an "extremely high level of accuracy" so that accounts are not incorrectly flagged.
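
The match-and-threshold flow can be summarized in a heavily simplified sketch. All names and values below are illustrative assumptions: in Apple's actual design every photo receives a voucher, the matching uses private set intersection so the device never learns the result, and the real threshold is undisclosed.

```python
# Heavily simplified sketch of the match-and-threshold flow. In the real
# system every image gets a voucher, matching uses private set intersection
# (the device never learns whether a hash matched), and the threshold is
# secret. All names and values here are illustrative assumptions.

KNOWN_HASHES = {"h_abc", "h_def"}  # stand-in for the blinded NCMEC hash set
REVIEW_THRESHOLD = 3               # Apple has not published the real value

def upload_photo(image_hash: str, matched_vouchers: list[str]) -> bool:
    """Record a voucher when an image's hash matches; return True once the
    match count crosses the threshold, which would trigger manual review."""
    if image_hash in KNOWN_HASHES:
        matched_vouchers.append(image_hash)
    return len(matched_vouchers) >= REVIEW_THRESHOLD
```

The point of the threshold design is that below it, individual matches reveal nothing; only once the threshold is crossed can the voucher contents be interpreted and a review begin.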

Apple said its method of detecting known CSAM provides "significant privacy benefits" over existing techniques:

• This system is an effective way to identify known CSAM stored in iCloud Photos accounts while protecting user privacy.
• As part of the process, users also can't learn anything about the set of known CSAM images that is used for matching. This protects the contents of the database from malicious use.
• The system is very accurate, with an extremely low error rate of less than one in one trillion accounts per year.
• The system is significantly more privacy-preserving than cloud-based scanning, as it only reports users who have a collection of known CSAM stored in iCloud Photos.

The underlying technology behind Apple's system is quite complex, and the company has published a technical summary with more details.

"Apple's expanded protection for children is a game changer. With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material," said John Clark, the President and CEO of the National Center for Missing & Exploited Children. "At the National Center for Missing & Exploited Children we know this crime can only be combated if we are steadfast in our dedication to protecting children. We can only do this because technology partners, like Apple, step up and make their dedication known. The reality is that privacy and child protection can co-exist. We applaud Apple and look forward to working together to make this world a safer place for children."

Expanded CSAM Guidance in Siri and Search

[Image: CSAM guidance in Siri on iPhone]
Third, Apple said it will be expanding guidance in Siri and Spotlight Search across devices by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.

The updates to Siri and Search are coming later this year in an update to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey, according to Apple.


Apple today announced a new Apple Music for Artists feature called Shareable Milestones, which is designed to allow ‌Apple Music‌ artists to share key milestones and successes with their fans on social media.

[Image: Apple Music for Artists Shareable Milestones]
The ‌Apple Music‌ for Artists feature generates automatic milestones for artists of all sizes, which artists can highlight on social media networks. Milestones include new highs and all-time bests across Plays and Shazams, and inclusion in ‌Apple Music‌'s curated playlists.

Artists will see images celebrating their milestones on their iOS overview page for ‌Apple Music‌ for Artists, and can tap the share icon to open up the share sheet. Users will also see relevant milestones on the song and country detail pages. Milestones can be shared to Facebook, Twitter, Instagram, and Facebook and Instagram stories.

Sharing milestones is a feature that's limited to artists at the current time and the images can only be accessed through the ‌Apple Music‌ for Artists iOS app.

Though currently limited to ‌Apple Music‌ for Artists, this is perhaps a feature that Apple could roll out to all ‌Apple Music‌ users in the future, allowing them to share metrics that are normally only available through the year-end Apple Music Replay and Recap features.

Apple Music for Artists is available to all artists who use ‌Apple Music‌. It provides artists and their teams with sales and streaming data for songs, albums, playlists, and more.