The iOS 17 Health app introduces a new feature that allows you to keep track of your emotions and moods over time, so you can get an idea of your overall mental wellbeing.
With the iOS 17 Health app, you can quickly make a note of your state of mind each day. There's an option to choose a feeling ranging from Very Unpleasant to Very Pleasant, and then a list of words that you can use to home in on your general mood.
From there, you can select a reason for the way that you're feeling, such as work, partner, hobbies, money, health, family, and more. Logging a mood takes just a few seconds, and the Health app will aggregate the data over time.
Mood logging can also be done on an Apple Watch running watchOS 10 through the Mindfulness app. On the watch, if you open up the Mindfulness app, there's a new "Log State of Mind" option. Tapping it lets you log how you're feeling right now or how you've felt throughout the day.
Apple says that it also plans to link your mood reports to your activities like Workouts and sleep so you can see what might have an impact, plus the company plans to provide standardized mental health assessments so you can see your risk for anxiety or depression.
With the iOS 17 Photos app, Apple is making it easier to crop your photos to your specifications. When you zoom into an image, there's a new "Crop" button that appears in the upper right.
Tapping the Crop button brings up the crop interface with the zoom level that you've selected, so you can crop into the part of the image that you prefer with just a couple of taps. If you like your crop, tap on Done, and if you want to change anything, the full editing interface is available.
In iOS 16, cropping involves tapping into the Edit interface, choosing the crop tool, and adjusting the crop from there with either pinch zoom gestures or by dragging the corners of the cropping tool. You can still edit images in this way, but it's quicker to use the new zoom crop feature.
Apple today announced the Apple Vision Pro, a wearable augmented and virtual reality headset. Because of the way the headset fits against the face, it does not accommodate glasses, but Apple has a solution for those who need prescription lenses.
Apple is partnering with Zeiss to offer Zeiss Optical Inserts that can be customized to each person's vision prescription. The inserts will attach to the Vision Pro lenses magnetically, allowing for precise viewing and eye tracking.
There is no word yet on what the Zeiss prescription inserts will cost, but Apple says that vision correction accessories will be sold separately.
Glasses wearers will need a valid prescription to get the inserts, and not all prescriptions will be supported, so there will be some limitations that might prevent some people from using the headset.
The Vision Pro headset will be available in early 2024 and it will be priced starting at $3,499.
Apple's Vision Pro headset is the company's first new product category since the Apple Watch, and it is unlike any other Apple device. It runs an operating system called visionOS, and developers will need to create augmented and virtual reality apps specifically for the headset.
To ensure that there is a wide selection of experiences available at launch, Apple plans to provide Apple Vision Pro developer kits to developers at some point in the future.
Apple says that developer kits will be offered to help developers bring their creations to life on Vision Pro, and that they will offer the ability to quickly build, iterate, and test on the headset. Developers will be able to apply to get a kit, but Apple hasn't offered details on when the kits will be made available.
The last developer kit that Apple offered was a Mac mini with an Apple silicon chip inside in 2020, and it was provided to developers to help them transition from Intel to Apple silicon. Apple sold the Mac mini machines for $500, but developers had to return them at the end of the testing program.
If the Apple Vision Pro developer kit is similar to the Apple silicon development kit, Apple will likely require developers to purchase an Apple Vision Pro headset to create apps for it, with the purchase price including access to beta software, developer labs, discussion forums, technical support, and other resources.
Along with Apple Vision Pro developer kits, Apple says that it will offer Apple Vision Pro compatibility evaluations for existing apps and opportunities for developers to visit a Vision Pro developer lab that provides demonstrations of visionOS, iPadOS, and iOS apps running on the headset. Labs will be available in Cupertino, London, Munich, Shanghai, Singapore, and Tokyo.
More information on these tools for developing content for the Vision Pro headset will be available in July. The visionOS SDK will be available later this month.
With the first iOS 17 beta, Apple has introduced a new accessibility feature called Personal Voice. First highlighted earlier this year, Personal Voice is designed to allow you to use artificial intelligence to create a replica of your voice.
The feature is aimed at those who are at risk of losing their ability to speak, with Personal Voice offering these individuals the chance to "create a voice that sounds like them" for communication purposes.
Personal Voice is available in the initial iOS 17 beta, so developers can begin testing it right away. It can be found under Settings > Accessibility > Personal Voice. Creating a Personal Voice is a process that takes around an hour. Recording requires a quiet place with little to no background noise, with Apple instructing users to speak naturally at a consistent volume while holding the iPhone approximately six inches from the face.
If there is too much background noise in your location, Apple will warn you that you need to find a quieter place to record.
Personal Voice requires you to read a series of sentences aloud, after which your iPhone will generate and store your Personal Voice. The Personal Voice can then be used with the Live Speech feature, which allows users to type-to-speak in FaceTime, the Phone app, and other communication apps.
Personal Voice will be available to the general public when Apple releases the first public beta of iOS 17. Apple has said that iOS 17 will be available to public beta testers next month.
If your Messages app often becomes cluttered with one-time verification codes that you need to manually delete, it's going to get a lot easier to clean them up in iOS 17.
The iOS 17 update includes a new option to delete verification codes in Messages (and Mail) after they've been inserted into an app or website through the Autofill feature.
"Clean Up Automatically" can be toggled on for verification codes in the Passwords app under Password Options.
Note that autofill verification codes are new to the Mail app in iOS 17, and work the same way as the verification code autofill feature for the Messages app. When you get a one-time code emailed to you, the Mail app can detect it and use it in Safari automatically without you having to swap over to the Mail app.
Apple says that one-time codes sent to both Mail and Messages can be deleted automatically if the delete feature is turned on.
Apple is enhancing the security of Safari in iOS 17, and private browsing now requires Face ID authentication or a passcode to access. If you open up a private browsing window in Safari, you will need to authenticate with Face ID.
That means someone who has access to your unlocked iPhone and opens Safari won't be able to get to your private browsing history without secondary authentication via Face ID or a passcode.
Private browsing also completely blocks known trackers from loading on pages and removes tracking added to URLs as you browse, improving privacy. Websites are prevented from tracking or identifying your device with these new additions, plus Apple also offers improved extension control.
In private mode, extensions with website access are turned off, and you will need to manually re-enable them. iCloud Private Relay also uses IP address locations based on country and time zone rather than a general location.
Safari in iOS 17 is also gaining a Profiles feature so you can keep your personal browsing and work browsing separate, with different histories, Tab Groups, cookies, and favorites.
Apple today previewed iOS 17 for the iPhone, and one of the key new features coming with the update is improved autocorrect functionality.
Apple says iOS 17 includes a state-of-the-art language model for word prediction that will greatly improve autocorrection on the iPhone. Any time you type, on-device machine learning will intelligently correct mistakes with greater accuracy than ever before. In addition, you will now receive predictive text recommendations inline as you type, allowing for words or complete sentences to be added by tapping the space bar.
Autocorrection has an updated design on iOS 17 that briefly underlines an autocorrected word. Tapping on an underlined word reveals the original word that you typed, making it easy to quickly revert the change. The system will also learn your typing habits over time and avoid some corrections, which Apple's software engineering chief Craig Federighi said is designed for "those moments where you just want to type a ducking word."
iOS 17 is available in beta starting today for members of Apple's Developer Program, and will be publicly released later this year. Autocorrection can be enabled or disabled in the Settings app under General → Keyboard → Auto-Correction.
Apple in iOS 17 is introducing StandBy mode, which is a new display experience designed for a charging iPhone that's placed in a horizontal orientation. An iPhone in this position is able to display an array of full-screen widgets, turning it into a useful home hub.
StandBy mode activates automatically on an iPhone running iOS 17 that's placed horizontally on a charger. You can see information such as the time, weather, a calendar, music controls, your photos, and more.
You can swipe left or right through the available StandBy options, and long press or swipe up/down to customize. With the time, for example, you can choose from an analog view, a digital view, a bubbly font, and a solar view where the background color shifts based on the time.
There are options to add Home Screen widgets to the main StandBy view, which is the first option you see when activating StandBy. In this view, you can select two widgets to display side-by-side, so you can control your HomeKit products, see your calendar events, keep an eye on the stock market, check on device battery life, see the weather, and more.
If a Live Activity is active, it will be displayed full screen, as will results from Siri requests.
StandBy mode will get darker if the room you're in is dark, so that it is not distracting at night while you're sleeping. StandBy mode is akin to Nightstand Mode on the Apple Watch, and it functions in much the same way.
Note that having the iPhone continually show information requires an iPhone 14 Pro or Pro Max with always-on display technology. On other iPhones, a tap is required to see what's on the screen.
iOS 17 introduces a much-requested AirTag feature, the option to share an AirTag with another person. Since launch, AirTags have only been able to be owned and used by a single person, but that's changing in the iOS 17 update.
In the Find My app, you can select an AirTag and choose the "Share This AirTag" option to invite a contact of yours. The invited person will be able to see the location of the AirTag just as you can, which is useful if you're lending an AirTag-tracked item to a friend or family member because it eliminates those annoying tracking alerts.
A shared car with an AirTag, for example, will no longer trigger warnings about an unknown AirTag when the person who doesn't own the AirTag is using the car. You can invite anyone to see an AirTag, and remove the person at any time as well, so temporary sharing is possible.
Sharing also works with Find My-enabled items, so it is not limited to AirTags. A person who has access to an item or an AirTag can track it and play a sound.
Apple today introduced an Apple silicon version of the Mac Pro that uses the new M2 Ultra chip, and with that update, Apple's transition to Apple silicon is now complete. The first Apple silicon Mac came out in 2020, and three years later, every Mac is using Apple-designed chips.
The Mac Pro was the last Mac that was still using older Intel chip technology, and with the launch of the new M2 Ultra model, the Intel versions have been discontinued.
Apple may still be selling refurbished Intel Macs through its online store for refurbished devices, but none of its current product lineup is using Intel's chip technology.
The M2 Ultra chip is available in both the Mac Pro and the Mac Studio, both of which can be preordered today and will launch next week. The M2 Ultra Mac Pro is priced starting at $7,000, while the M2 Ultra Mac Studio is priced starting at $4,000.
Prior to the launch of the latest Apple TV 4K, there were rumors that Apple planned to build Find My functionality into the Siri Remote to make it easier to find. That never happened, but Apple is introducing a similar location function for the Siri Remote in tvOS 17.
If the Siri Remote becomes lost, tvOS 17 users can open the Apple TV Remote in Control Center on a linked iPhone to locate their Siri Remote. There is a Find My-like interface that will guide them toward the remote, with the size of the onscreen circle growing to guide movement.
The feature for locating a Siri Remote works with the second-generation Siri Remote or later, and tvOS 17 is required as well.
The Remote location feature is likely possible due to deeper integration between the iPhone and iPad and the Apple TV. Apple has also introduced a FaceTime app for the Apple TV that uses a connected iPhone or iPad as the camera source.
Other new tvOS 17 features include updated screen savers that use your photos, an option to immediately switch to your profile when you use the Apple TV with your iPhone remote, and a revamped Control Center that makes it easier for you to access key settings and information.
Following the keynote event, Apple began allowing members of the press to get a quick look at the Apple Vision Pro headset in person. Apple has several demonstration areas set up, but as of right now, media attendees can only see the device and aren't able to try it out.
The headset has a futuristic, sleek look, with Apple mounting the devices on stands to give the media a closer look. The external battery pack can be clearly seen connected to the headset through a cable at the side of the device.
Design-wise, the headset is not unlike a pair of ski goggles, featuring a wrap-around display that's held against the face by a soft mesh and a seal that keeps out the light. The headband is made from a soft, braided material that's meant to be comfortable to wear for longer periods of time.
There is no word yet on whether media attendees will be given a chance to test out the headset today or later this week, but there is a good chance that we'll soon be seeing some first impressions.
Apple today seeded the first beta of the newly announced macOS 14 Sonoma update to developers for testing purposes. While the beta is limited to developers at this time, Apple plans to provide a public beta later this summer.
Registered developers can download the beta through the Apple Developer Center, and after the appropriate profile is installed, the betas are available through the Software Update mechanism in System Settings.
macOS Sonoma introduces new Apple TV-like screen savers that also serve as wallpapers after you log in, plus it moves widgets to the desktop. You can use the new widget gallery to choose from a range of widgets, and then drag them to your Mac's desktop.
Widgets can be arranged in any way that's useful, and when you're using an app, they are designed to fade into the background so they're less distracting. Widgets are more interactive than before, so you can use them to do things like play music, turn off the lights in your home, and more. Through Continuity, your iPhone's widgets can also show up on your Mac's desktop.
Video conferencing has improved with a new Presenter Overlay view that displays you on top of the content you're sharing, plus Safari now supports web apps in the Dock and the option to create Profiles so you can separate personal browsing from work browsing.
Other new features include improved search that's faster and more responsive, password and passkey sharing, a revamped stickers interface for the Messages app, PDF integration in notes that makes it easier than ever to manage PDFs, and more.
macOS Sonoma will be in beta testing for several months, with a public release set to come in September or October.
Following today's keynote event, Apple has released the first betas of iOS 17 and iPadOS 17 to developers for testing purposes. The betas are only available to those with a developer account, as Apple has eliminated the previous profile-based system and now restricts downloads to developer Apple IDs.
Registered developers are able to opt into the betas by opening up the Settings app, going to the Software Update section, tapping on the "Beta Updates" option, and toggling on the iOS 17 Developer Beta. Note that an Apple ID associated with a developer account is required to download and install the beta.
iOS 17 is a major update that introduces a customizable look for incoming calls, with the person placing the call able to choose how they appear on the recipient's screen. Live Voicemail lets you see a real-time transcript of a message someone is leaving so you can choose to pick up the phone if you want, and voice messages people send in iMessage are now transcribed into text. You can also record a video or audio message when someone misses your FaceTime call, and FaceTime works on the Apple TV through Continuity functionality.
In the Messages app, apps have been moved to a new tucked-away interface for a cleaner look, and there is a new Check In feature that is designed to let your friends and family keep an eye on you when you're traveling. Check In automatically notifies friends or family members when you arrive at a destination, such as home. Locations can also now be shared directly from the Messages app.
In a group chat, there's a catch-up arrow so you can see the first message you haven't seen in a conversation, and with search filters, you can more easily find what you're looking for. Stickers have been overhauled, and all emoji are now stickers, living alongside sticker packs and Memoji. Using the remove from background feature in iOS 17, you can turn the subject from any image into a sticker.
With StandBy, an iPhone placed horizontally turns into a little home hub that displays information like the calendar, time, home controls, and more, and Live Activities can be displayed in full screen too.
Widgets on the Home Screen are interactive, so you can do things like check off an item on a to-do list or turn off the lights without having to open an app. AirDrop has been improved and there's a NameDrop function for sharing contacts quickly, plus you can hold two iPhones together to start a SharePlay session. SharePlay also now works with CarPlay so passengers can play their music in the car too.
Other new features include a journaling app coming later this year, AirPlay in select hotel rooms, improvements to AirPods Pro 2 thanks to a new Adaptive Audio feature, offline Maps, Siri activation without the "Hey" prefix, and improvements to Spotlight search.
While today's beta is limited to developers, Apple will be providing a public beta later this summer.
Following this morning's keynote event, Apple has seeded the first beta of an upcoming watchOS 10 update to developers for testing purposes.
To install the watchOS 10 update, developers will need to open the Apple Watch app, go to the Software Update section under "General" in Settings, and toggle on the watchOS 10 Developer Beta. Note that an Apple ID linked to a developer account is required.
Once beta updates have been activated, watchOS 10 can be downloaded under the same Software Update section. To install the software, an Apple Watch needs at least 50 percent battery and it must be placed on an Apple Watch charger.
watchOS 10 adds a whole new widget-focused interface. You can access a widget stack from any watch face using the Digital Crown, swiping through widgets to get to relevant information. Control Center can be activated from any app by pressing the side button, and these new quick access controls are meant to let you use watch faces that show less information while still putting everything you need at your fingertips.
There are new Palette and Snoopy watch faces, updates to Cycling and Hiking workouts, and mental health integrations. Users can log their state of mind and mood using the Apple Watch, with the device providing insights into mental health over time.
watchOS 10 is limited to developers at the current time, but Apple will offer a public beta later this summer, with an official release to follow this fall.
Apple today seeded the first beta of an upcoming tvOS 17 beta to developers for testing purposes.
Registered developers are able to download the tvOS 17 update by opting in to the beta through the Settings app on the Apple TV. A registered developer account is required.
tvOS updates don't receive as much attention as updates to iOS and macOS, and are never as feature rich, but tvOS 17 brings FaceTime to the TV for the first time. The Apple TV 4K can connect to an iPhone or iPad that serves as the camera, with the FaceTime interface showing up on the TV's screen.
All of the FaceTime features are available, including Center Stage to keep you front and center, plus there are new gesture-based reactions that let you use your hands to generate on-screen effects. For example, if you make a heart with your hands, the screen will display hearts.
Split View for Apple TV lets users watch television with friends and family using SharePlay, and there are controls for transferring calls between the TV and an iPhone or iPad as needed. Third-party apps like Zoom will also be able to take advantage of this functionality, so those apps will also work on the TV screen.
Control Center on Apple TV has been revamped and it is easier for users to access key settings and information, plus there is a new feature that allows the iPhone to locate a Siri Remote that's been misplaced.
Apple today introduced the long-awaited AR/VR headset that it has had in development for the last several years, and as rumors suggested, it is expensive. The Vision Pro is priced starting at $3,499.
Given the high price point, there appears to be a single model, with no color options or accessories for the device, despite rumors. There could, however, be different storage options as Apple did mention a starting price.
The headset will not be launching in 2023, with Apple instead planning to debut it in early 2024. When it launches, it will be available solely in the United States, though it will expand to other countries later on in 2024.
Demonstrations will be available at Apple retail locations closer to the device's launch date.