Apple Launches New Blog to Share Details on Machine Learning Research

Apple today debuted a new blog called the "Apple Machine Learning Journal," with a welcome message for readers and an in-depth look at the blog's first topic: "Improving the Realism of Synthetic Images." Apple describes the Machine Learning Journal as a place where readers can find posts written by the company's engineers about the work and progress they've made on machine learning technologies in Apple's products.

In the welcome message, Apple encourages those interested in machine learning to contact the company at an email address for its new blog, machine-learning@apple.com.


Welcome to the Apple Machine Learning Journal. Here, you can read posts written by Apple engineers about their work using machine learning technologies to help build innovative products for millions of people around the world. If you’re a machine learning researcher or student, an engineer or developer, we’d love to hear your questions and feedback. Write us at machine-learning@apple.com

In the first post -- described as Vol. 1, Issue 1 -- Apple's engineers delve into using neural networks to intelligently refine synthetic images in order to make them more realistic. Using synthetic images reduces cost, Apple's engineers point out, but they "may not be realistic enough" and can result in "poor generalization" on real test images. Because of this, Apple set out to find a way to enhance synthetic images using machine learning.

Most successful examples of neural nets today are trained with supervision. However, to achieve high accuracy, the training sets need to be large, diverse, and accurately annotated, which is costly. An alternative to labelling huge amounts of data is to use synthetic images from a simulator. This is cheap as there is no labeling cost, but the synthetic images may not be realistic enough, resulting in poor generalization on real test images. To help close this performance gap, we’ve developed a method for refining synthetic images to make them look more realistic. We show that training models on these refined images leads to significant improvements in accuracy on various machine learning tasks.
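The approach described in the post trains a refiner network with two competing pressures: an adversarial "realism" term that rewards fooling a discriminator, and a self-regularization term that keeps the refined image close to the synthetic input so its original annotation still applies. A minimal toy sketch of that combined objective is below; the function name, toy pixel values, and the `lam` weighting are illustrative assumptions, not Apple's actual code.

```python
import math

def refiner_loss(disc_prob_real, refined, synthetic, lam=0.5):
    """Toy per-image refiner objective in the SimGAN style:
    - realism term: -log D(refined), small when the discriminator
      judges the refined image to be real;
    - self-regularization: mean L1 distance between refined and
      synthetic pixels, preserving the input's annotation.
    """
    realism = -math.log(disc_prob_real)
    self_reg = sum(abs(r - s) for r, s in zip(refined, synthetic)) / len(refined)
    return realism + lam * self_reg

# Toy 4-"pixel" images: the refiner nudged pixel values slightly.
synthetic = [0.2, 0.4, 0.6, 0.8]
refined = [0.25, 0.38, 0.62, 0.79]

# Loss is small when the discriminator is fooled (D(refined) near 1)
# and large when it spots the fake (D near 0).
fooled = refiner_loss(0.9, refined, synthetic)
spotted = refiner_loss(0.1, refined, synthetic)
print(fooled < spotted)  # True
```

In training, the refiner's weights are updated to lower this loss while the discriminator is simultaneously trained to tell refined images from real ones; the self-regularization weight controls how far the refiner may drift from the simulator's output.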

In December 2016, Apple's artificial intelligence team released its first research paper, which had the same focus on advanced image recognition as the first volume of the Apple Machine Learning Journal does today.

The new blog is Apple's latest step in opening up about its AI and machine learning work. During an AI conference in Barcelona last year, the company's head of machine learning, Russ Salakhutdinov, provided a peek behind the scenes at some of Apple's initiatives in these fields: health and vital signs, volumetric detection with LiDAR, prediction with structured outputs, image processing and colorization, intelligent assistants and language modeling, and activity recognition. All of these could be potential subjects for research papers and blog posts in the future.

Check out the full first post in the Apple Machine Learning Journal right here.


Top Rated Comments

MikhailT
97 months ago
> Except no, it isn’t. It isn’t in this area or other areas.
>
> Apple is sharing their knowledge, and others? They aren’t! Except infomercials.

I think he meant that Apple is having a hard time recruiting AI researchers and scientists who need to be able to publish their work (they're not the engineer type). For Apple to benefit from their minds, it has to start opening up to the public. This isn't your traditional CS work; this is purely scientific research with a long history of journal-based reviews and public access.

There were many rumors that AI researchers turned down jobs at Apple simply because they would not be able to publish their work. For these scientists, it is not about the money or the company; it is all about having their work published with their name on it.

In addition, this is one of the areas where sharing research benefits everyone at the same time.

Google, Facebook, Microsoft, and others are in fact publishing their work through various mediums (magazines, research papers, etc.).

In fact, they all started a partnership to share research with each other, the Partnership on AI: https://www.partnershiponai.org (Apple is a founding member along with Microsoft, IBM, Google, Facebook, Amazon, and others).
Score: 11 Votes
AngerDanger
97 months ago
In the interest of being all scientific and sharing stuff, I read about half of the blogpost and realized some of the implications of its content. The blog specifically uses the example of human eye recognition in its explanation of machine learning and refined synthetic machine-based learning. Hmmmm, I wonder what thing Apple could be using all of this ocular information for? ;)

Assessing Gaze
Part of the blog places emphasis on knowing which direction the sampled eyes are looking. In fact, if the refinement process moves the iris too much, that output is (I think) weighted as less accurate. In the rumors leading up to the iP8 release, many commenters have voiced concern over the device's ability to understand whether or not you actually want it to unlock; it seems Apple might be attempting to address that concern.



Use of Monochrome Samples
Folks have also discussed the potential inability of iris/eye scanning technology to work in the dark, but perhaps they're not considering that your iPhone (or Android) can already see you in the dark. When held to your face during a call in a dark environment, it will shut the screen off. Next to the earpiece, there's a little IR LED that illuminates objects held close to it, and when the phone sees that IR light reflected back, it shuts the screen off.



If that light were brighter, it could illuminate the user's entire face. However, because it's only IR light, it wouldn't see the full visible spectrum of light (RGB); it would only see monochrome faces in the dark. It just so happens that the sample images Apple is using are already monochrome.

Anyway, I gotta go buy more tinfoil for my hat!

Score: 6 Votes
Crzyrio
97 months ago
> Wait?!
>
> Apple launches a blog with employees talking about how they are doing their job???????
>
> The hell froze over, Steve Jobs DEFINITIVELY wouldn’t allow THAT!

It is a must in the AI field.
Score: 5 Votes
alwaysbeincontact
97 months ago
Neat, interesting stuff. Nice to see Apple getting into blogging now and posting about this future tech.
Score: 4 Votes
dabirdwell
97 months ago
> Interesting! I didn't know about this partnership. I wonder how Elon Musk feels, and why Tesla hasn't joined.

He has OpenAI.

https://www.wired.com/2016/04/openai-elon-musk-sam-altman-plan-to-set-artificial-intelligence-free/
Score: 2 Votes
Zirel
97 months ago
Wait?!

Apple launches a blog with employees talking about how they are doing their job???????

The hell froze over, Steve Jobs DEFINITIVELY wouldn’t allow THAT!
Score: 1 Votes