Apple has been awarded a patent by the U.S. Patent and Trademark Office (via AppleInsider) for a digital camera including a refocusable imaging mode adapter, with the document also discussing the potential use of a similar camera system in a device like the iPhone.
The patent details a camera that can be configured to operate either in a lower-resolution mode with refocusing capability or in a high-resolution non-refocusable mode, with the camera body containing an imaging mode adaptor to switch between the two.
The patent cites the plenoptic imaging system used in the Lytro light-field camera as prior art, with Apple drawing inspiration from it while noting that its own microlens array can produce higher-quality images thanks to higher spatial resolution.
Microlens (440) inserted into light path for lower-resolution refocusable images
A digital camera system configurable to operate in a low-resolution refocusable mode and a high-resolution non-refocusable mode comprising: a camera body; an image sensor mounted in the camera body having a plurality of sensor pixels for capturing a digital image; an imaging lens for forming an image of a scene onto an image plane, the imaging lens having an aperture; and an adaptor that can be inserted between the imaging lens and the image sensor to provide the low-resolution refocusable mode and can be removed to provide the high-resolution non-refocusable mode, the adaptor including a microlens array with a plurality of microlenses; wherein when the adaptor is inserted to provide the low-resolution refocusable mode, the microlens array is positioned between the imaging lens and the image sensor.
Microlens (440) removed from light path for higher-resolution standard images
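The tradeoff the claim describes is straightforward to model: each microlens groups a block of sensor pixels into a single spatial sample, so inserting the adaptor divides the output resolution by the number of pixels behind each microlens. A minimal Swift sketch of that relationship, using hypothetical type names and an illustrative 10-pixel microlens pitch (not figures from the patent):

```swift
import Foundation

/// The two capture modes described in the patent claim.
enum CaptureMode {
    case highResolution              // adaptor removed: every sensor pixel is an image pixel
    case refocusable(lensPitch: Int) // adaptor inserted: lensPitch x lensPitch pixels per microlens
}

struct CameraConfig {
    let sensorWidth: Int   // sensor pixels across
    let sensorHeight: Int  // sensor pixels down

    /// Effective output resolution (width, height) for a given mode.
    func outputResolution(for mode: CaptureMode) -> (Int, Int) {
        switch mode {
        case .highResolution:
            return (sensorWidth, sensorHeight)
        case .refocusable(let pitch):
            // Each microlens covers pitch x pitch sensor pixels and yields one spatial sample.
            return (sensorWidth / pitch, sensorHeight / pitch)
        }
    }
}

// Example: an 8 MP sensor with a hypothetical 10-pixel microlens pitch.
let camera = CameraConfig(sensorWidth: 3264, sensorHeight: 2448)
print(camera.outputResolution(for: .highResolution))             // (3264, 2448), about 8 MP
print(camera.outputResolution(for: .refocusable(lensPitch: 10))) // (326, 244), about 0.08 MP
```

The numbers make clear why the patent bothers with a removable adaptor at all: the refocusable mode costs roughly two orders of magnitude in spatial resolution, so the camera falls back to a conventional capture path whenever refocusing isn't wanted.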
Apple's patent outlines how such a lens system could be integrated with a more complete camera solution incorporating image correction and other features, either in a standalone product or within a mobile device.
The Lytro-like technology naturally leads to speculation that it could be used in Apple's rumored standalone point-and-shoot digital camera, which was first rumored in 2012 after Steve Jobs was quoted in his Walter Isaacson biography as saying he wanted to reinvent three industries, one of them being photography. Isaacson's biography also noted that Jobs had met with the CEO of Lytro, although it has been unclear how much direct interest Apple had in Lytro's technology.
This is stupid. Nobody has ever had a need to refocus after the shot, because you can focus when you TAKE the shot in the first place. Also, smartphones' small sensors have a huge depth of field anyway. You only get shallow/unfocused images with large sensors....
This is what I was getting at. A human user would not likely want to re-focus a shot, but a computer might. The computer would do the re-focus in order to gain depth information. With that info it could create a wireframe and a texture map.
Combine this wireframe 3D image with the 3D sensor they reported yesterday and you can drop a real person into a video game.
Today, if you tried that with a still image you'd have a "cardboard cutout" dropped into the game. It would look bad. But a real 3D character? People would buy that.
You could turn it around backwards too. Take a refocusable image of a room. Now you can drop a virtual camera into the scene and move the camera around. In a game you could place the characters in your environment, but for something like real-estate sales you can make better presentations because you have the 3D data to allow perspective changes with viewpoint changes.
So YES, I agree, who would want to refocus an image? Answer: software would.
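One standard way software could turn a refocusable capture into the depth data this commenter describes is depth-from-focus: render the light field at a stack of focal planes, then for each pixel keep the depth of the plane where it appears sharpest. A minimal Swift sketch of that idea, with hypothetical types, grayscale data, and slices assumed to share the same dimensions (this is an illustration of the general technique, not anything specified in Apple's patent):

```swift
import Foundation

/// A refocused image rendered from the light-field data at one focal plane,
/// stored here as a flat grayscale array for simplicity (hypothetical format).
struct FocalSlice {
    let depth: Double    // focal-plane distance this slice was refocused to
    let pixels: [Double] // grayscale intensities, row-major
    let width: Int
    let height: Int
}

/// Local sharpness at (x, y): absolute Laplacian response, a common focus measure.
func sharpness(_ s: FocalSlice, _ x: Int, _ y: Int) -> Double {
    let i = y * s.width + x
    let center = s.pixels[i]
    let neighbors = s.pixels[i - 1] + s.pixels[i + 1]
                  + s.pixels[i - s.width] + s.pixels[i + s.width]
    return abs(4 * center - neighbors)
}

/// For every interior pixel, pick the focal slice where it is sharpest;
/// that slice's depth becomes the depth estimate for the pixel.
func depthMap(from stack: [FocalSlice]) -> [Double] {
    guard let first = stack.first else { return [] }
    var depths = [Double](repeating: 0, count: first.pixels.count)
    for y in 1..<(first.height - 1) {
        for x in 1..<(first.width - 1) {
            let best = stack.max { sharpness($0, x, y) < sharpness($1, x, y) }!
            depths[y * first.width + x] = best.depth
        }
    }
    return depths
}
```

The resulting depth map is exactly the kind of per-pixel geometry that could seed a wireframe mesh, with the original image supplying the texture.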
This is stupid. Nobody has ever had a need to refocus after the shot, because you can focus when you TAKE the shot in the first place. Also, smartphones' small sensors have a huge depth of field anyway. You only get shallow/unfocused images with large sensors.
It's a dead end technology.
The most important and useful photography technology that Apple could implement would be to add optical image stabilization. The next would be larger sensors.
Other options would be to allow for interchangeable lenses, and to provide Aperture capability on a mobile device.
A professional photographer has a need to edit and publish photos as quickly as possible. The genius thing about smartphones is that they allow the editing/publishing part to happen on mobile devices in the field. The next step would be to implement a higher-quality imaging system (35mm full-frame sensors, various lenses, flash/strobe mounts, other SLR features, etc.).
No need to get silly with light field tech. Just look at what's needed (high-end imaging and rapid publishing) and implement a solution for that.
Although this is very cool, I would much appreciate a megapixel update on the next iPhone if possible, Apple... even just a little to keep up with the Nokia Lumia!!
It's not the number of pixels, but the size that counts.
Although this is very cool, I would much appreciate a megapixel update on the next iPhone if possible, Apple... even just a little to keep up with the Nokia Lumia!!
Good that you're not in charge of Apple. Megapixels are not everything.
Light field technology is the only way smart phone cameras can continue to shrink in size and increase in quality. Good news.
Quality? The light-field image has much lower resolution. That is why Apple's patent allows you to switch from normal to light field. One mode gets you good images with a single plane in focus, and the other mode gives you a refocusable image, but at much lower resolution.
If the sensor has only so many pixels you can use those pixels in two ways. A light field camera might use 100 sensor pixels per image pixel.
How could Apple use this? The technology makes for a good 3D camera too. I doubt many people will want to re-focus their images but they might want 3D and stereo images with one click. Light field can do that.
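The "stereo in one click" point maps onto a standard plenoptic technique: because each microlens records which direction light arrived from, pulling the same pixel offset out from under every microlens yields a sub-aperture view, i.e. the scene as seen through one small part of the main lens. Two offsets on opposite sides of the aperture give a stereo pair from a single exposure. A minimal Swift sketch under simplifying assumptions (hypothetical types, grayscale data, square microlens grid, no calibration):

```swift
import Foundation

/// Raw plenoptic capture: each microlens covers `pitch` x `pitch` sensor pixels
/// (hypothetical layout; a real sensor adds color filters, packing, and calibration).
struct PlenopticFrame {
    let pixels: [Double] // grayscale intensities, row-major
    let width: Int       // sensor pixels across (assumed divisible by pitch)
    let height: Int      // sensor pixels down (assumed divisible by pitch)
    let pitch: Int       // pixels per microlens in each direction
}

/// Extract the sub-aperture view for aperture offset (u, v), 0 <= u, v < pitch:
/// take the same pixel position under every microlens. Each offset corresponds
/// to a slightly different viewpoint through the main lens.
func subApertureView(_ f: PlenopticFrame, u: Int, v: Int) -> [Double] {
    let outW = f.width / f.pitch, outH = f.height / f.pitch
    var view = [Double](repeating: 0, count: outW * outH)
    for my in 0..<outH {
        for mx in 0..<outW {
            let sx = mx * f.pitch + u
            let sy = my * f.pitch + v
            view[my * outW + mx] = f.pixels[sy * f.width + sx]
        }
    }
    return view
}

// A stereo pair from one exposure: left and right viewpoints across the aperture.
// let left  = subApertureView(frame, u: 1,               v: frame.pitch / 2)
// let right = subApertureView(frame, u: frame.pitch - 2, v: frame.pitch / 2)
```

Each sub-aperture view again lands at the reduced spatial resolution noted above, which is consistent with the patent's decision to keep a separate high-resolution, non-refocusable mode.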