At its annual investor relations event earlier this week, Intel outlined a fundamental shift in its future processor designs that will likely impact Apple's future notebooks.
Until now, the bulk of Intel's notebook chips have been designed to draw around 35 watts of power. Many of its notebook parts draw less and some draw more, but 35 watts has been the center point for Intel's portable lines. Going forward, however, the new center point will be in the 10 to 15 watt range.
Intel's future roadmap for notebook processors now targets a much lower power draw than present chips. That means ultra-low-voltage processors like those found in the MacBook Air will become the norm rather than a specialty product.
Intel is clearly feeling the pressure of the growing smartphone and tablet market. According to the Financial Times, Intel CEO Paul Otellini describes a future of PCs evolving into a "higher performance mainstream-priced, touch-enabled device that would not compromise on features such as thinness, instant-on capabilities, permanent internet connectivity and all-day battery life." Apple's notebook line will certainly benefit from these advances.
Intel and Apple have had a close relationship since Apple switched over to Intel's processors several years ago, and Apple has frequently been the first computer manufacturer to ship the latest Intel technologies. In a Reuters report yesterday, Intel said it works very closely with Apple and that Apple even influences its roadmap:
"We work very closely with them and we're constantly looking down the road at what we can be doing relative to future products. I'd go as far as to say Apple helps shape our roadmap," Kilroy said.
Wattage ratings for CPUs are not power-draw ratings but TDP (thermal design power) ratings, which tell OEMs how much cooling a given chip needs.
They are all we've got and pretty much all we need. Idle power usage has gone down every year, but the TDP often correlates with idle usage too. The TDP determines the suitability of a certain chip: while a MacBook Air could run a 130W chip when it's idling, the CPU would shut itself down when actually doing something, since the cooling isn't adequate.
As the mainstream CPUs are now 35W, you can't build a small, thin laptop and put one of those in it without heat issues. Clearly, Intel wants to reduce the footprint of laptops, and the only way it can do that is to produce more efficient CPUs with lower TDPs.
Intel had its XScale ARM division before selling the whole thing to Marvell a few years ago; Intel doesn't make ARM CPUs anymore.
My point was, Intel's target power is 10-15 watts while ARM's is less than 1 watt.
I'm sure ARM will not take over from Intel in the desktop space anytime soon, but the opposite is just as true. I still wonder who will win the next CPU war: slimming down a fat architecture or beefing up a slim design.
I remember reading an article about ARM vs. Intel that stated that the possible issue with ARM is that power consumption and performance don't scale up evenly. ARM seems to work great in the ~1W range, but its performance might be horrible when you start increasing the frequency and core count, and thus the TDP (i.e. it does not scale up: you double the clock speed but your TDP becomes ten times as big), especially if the architecture was designed for the 1W range.
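As a rough sketch of why that happens: dynamic CPU power scales roughly as P = C·V²·f, and pushing the frequency up usually requires raising the voltage too, so power grows much faster than clock speed. The numbers below are invented purely for illustration, not real chip data:

```python
# Illustrative sketch of dynamic power scaling: P = C * V^2 * f.
# All constants here are made up for the example, not real chip figures.

def dynamic_power(c, voltage, freq_ghz):
    """Approximate dynamic power: P = C * V^2 * f."""
    return c * voltage ** 2 * freq_ghz

C = 1.0  # effective switched capacitance (arbitrary units)

base = dynamic_power(C, voltage=1.0, freq_ghz=1.0)        # ~1 W class design
faster = dynamic_power(C, voltage=1.0, freq_ghz=2.0)      # double the clock alone: 2x power
faster_hot = dynamic_power(C, voltage=1.4, freq_ghz=2.0)  # but 2 GHz may need ~1.4 V: ~3.9x

print(base, faster, faster_hot)  # 1.0 2.0 3.92
```

Voltage enters as a square, so even a modest voltage bump multiplies the cost of a frequency increase; add leakage and the bigger structures needed to keep a faster core fed, and "double the clock, ten times the TDP" stops sounding far-fetched.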
There will come a time (sooner than you think) when all that intensive work will *not* demand a so-called "higher-end" processor
Sorry, but this seems like a huge contradiction.
If there exists "intensive work" that is considered more processing-intensive than other applications, wouldn't it then require a higher echelon of processors as opposed to less powerful solutions?
Are you saying that bottom-end processors of the future will be total overkill for the ever-evolving, complex, and intensive applications of the time? Seems like rubbish to me.
The only way your comment would be close to accurate is if software development stands still. :rolleyes:
There will come a time (sooner than you think) when all that intensive work will *not* demand a so-called "higher-end" processor, or (and more likely) those high-end processors will require a fraction of the power they require today. Looking at what the iPad 2 is capable of today, it's pretty astounding.
Power consumption has actually gone up. With the Pentium 4s, for instance, the maximum TDP was 115W, and the CPUs we have now have a maximum TDP of 130W. The iPad is nothing but a brick when it comes to intensive tasks such as true video editing (i.e. more than the cut-and-paste you can do in iMovie) and 3D rendering.
You won't see high-performance CPUs that require only a fraction of the power anytime soon. There is, and will always be, a market for the fastest CPUs, even if that means more heat and higher power draw.
Could this "shift in design" partially reflect the "3D" re-architecture of transistors? Not to be naive in assuming that they will have their cake and eat it too...but I dont think this lower consumption will always result in performance below current LV/ULV chips.
I'm hoping this will be a general evolution in efficiency (current performance at lower TDPs).
The Tri-Gate transistors will definitely help. I didn't mean that lowering the TDP would cause the CPUs to be slower than their predecessors ;) What it can cause, however, is a smaller performance upgrade than we would have gotten if the TDPs had stayed the same.
Most likely, Intel's approach will take some time, so this doesn't mean that Ivy Bridge mobile CPUs will all be 10-15W. Like I said, Intel will probably offer a wider lineup for different usages: high-performance laptops with higher TDPs, and mainstream laptops with less CPU power but longer battery life, etc. Quite similar to what we have now, but the TDPs of all CPUs might come down (e.g. 15W for mainstream, 25W for performance, etc.).
Remember that the current 65-watt quad-core CPUs (similar to the ones in the iMac) used to carry a price premium over the 95-watt parts last year; compare that to now.
LV CPUs still cost a nice premium over the SV chips.
We will still see 35W and 45W mobile CPUs, though. Reducing power consumption means slower performance, and not everyone is ready to sacrifice performance for better battery life. For an average user, even a 10W Atom is sufficient, so widening the lineup of low-voltage CPUs sounds reasonable.
To be honest, I wouldn't mind a low-power MBA with 10+ hours of battery.
I've read about ARM since its first use in the Newton, and in my understanding ARM is a pure RISC design: a very small core built with efficiency in mind. ARM chips don't have the branch prediction and deep execution pipelines of x86 processors, limiting their effective power in a desktop environment. It's like comparing a regular 3L V6 engine with a 1.6L turbo four-cylinder running at 11,000 RPM: both could achieve about the same horsepower, but the V6 can be pushed further by burning more fuel, while the four-cylinder will have better fuel efficiency at low speed. While ARM is already pushed to its limit, core multiplication and extending the base ARM design could obliterate those limits in the near future.
The interesting part comes from Intel, which says ARM mobile CPUs are currently improving twice as fast as Moore's Law.
I could see why ARM would be improving twice as fast as Moore's Law for a little while. My guess is that it has only recently been seriously developed and pushed, so it is more or less playing catch-up, borrowing tricks and technology learned from the other CPU lines over the years. I am willing to bet it will slow down and drop to Moore's Law speed after a while.
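Compounding makes "twice as fast as Moore's Law" a bigger claim than it sounds. Taking the common 18-month doubling period as the baseline (an assumption for illustration), doubling every 9 months instead pulls far ahead within a few years:

```python
# Illustrative arithmetic: compare doubling every 18 months (a common
# statement of Moore's Law) against doubling every 9 months ("twice as fast").

def growth_factor(years, doubling_months):
    return 2 ** (years * 12 / doubling_months)

for years in (3, 6):
    moore = growth_factor(years, 18)
    twice = growth_factor(years, 9)
    print(f"{years} years: {moore:.0f}x (Moore) vs {twice:.0f}x (twice as fast)")

# 3 years: 4x (Moore) vs 16x (twice as fast)
# 6 years: 16x (Moore) vs 256x (twice as fast)
```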
I really don't understand what you mean. Are you saying work will become less intensive, or that processors will become faster and more energy-efficient? Or are you saying software will become multi-threaded, allowing it to leverage multiple energy-efficient cores, making it both fast and less power-hungry?
He is just repeating Apple catchphrases and his Church of Apple worship.
I will tell you, multithreaded/multicore coding is hell to do and a huge pain to get working correctly, because so many more things can go wrong, plus you have to make sure the threads aren't trying to write or change the same data at the same time. Single-threading is so much easier to code and design for than multithreading.
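A tiny Python sketch of the kind of bug being described (a hypothetical example, not code from the thread): two threads doing an unsynchronized read-modify-write on shared data can silently lose updates, and the usual fix is to guard the shared state with a lock.

```python
import threading

counter = 0
lock = threading.Lock()

def unsafe_increment(n):
    # counter += 1 is a read-modify-write, not atomic: two threads can read
    # the same old value, and one of the increments gets lost.
    global counter
    for _ in range(n):
        counter += 1

def safe_increment(n):
    # Holding the lock serializes the read-modify-write, so no update is lost.
    global counter
    for _ in range(n):
        with lock:
            counter += 1

threads = [threading.Thread(target=safe_increment, args=(100_000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # always 400000 with the lock; swap in unsafe_increment
                # and the total can come up short
```

And this is the easy case: real code has many shared structures, lock ordering to worry about, and failures that only show up under specific timing, which is exactly why single-threaded code is so much simpler to reason about.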