Mac of the Future: The CPU

For the past few years, predicting the CPUs that Apple would put inside its Macs has been relatively easy. Ever since Apple made the move to Intel’s x86 processors, the Mac road map mirrored Intel’s road map: Intel would release a new CPU, and a few months later Apple would release a new Mac. It was like clockwork, and it removed some of the surprise from Apple’s otherwise difficult-to-predict product-release cadence.
But over the past year, Apple effectively smashed that clock. It all started with the MacBook Pros released in April 2010.
In that round of updates, only two members of the MacBook Pro family—the 15-inch and 17-inch models—got Intel’s then-new “Arrandale” microprocessors (more popularly known as the Core i5 and Core i7 chips). Those CPUs took advantage of some of Intel’s most up-to-date technologies—including a 32-nanometer manufacturing process, Hyper-Threading, and Turbo Boost. The 13-inch MacBook Pro, however, stuck with the older Core 2 Duo CPU.
The simple decision to stick with the Core 2 Duo indicated two things: first, that the Apple–Intel relationship might not be as cozy as it once was; and, second, that Apple really likes graphics processing units (GPUs). Those two points will drive much of Apple’s hardware decision-making over the next two years.

Intel Inside?

Apple is the best kind of manufacturer a CPU vendor could partner with: Its products virtually market themselves. And being associated with the Apple brand is still a very good thing. The company is known as an early adopter of new technologies (at least those it believes in). The retail prices for its products are high enough to allow the company to use the best available hardware. Apple owns the hardware and software stack, so it can implement new features on a whim without waiting for slow software partners to catch up to market trends.
Though working with Apple can certainly be a pain, those benefits are apparently lucrative enough that Intel relaxed almost all of its usual marketing standards. Apple chooses where to put Intel’s logos on its products. You won’t always see a mention of specific Intel brands in Apple marketing. (Apple does include Intel model numbers in its tech specs.) For example, nVidia gets a mention on the box the Mac mini ships in, but Intel doesn’t.
As far as I can tell, Apple’s customers didn’t mind when it used the Core 2 Duo in the 13-inch MacBook Pro (and, more recently, in the 11- and 13-inch MacBook Airs). If those products sell just fine, Apple will probably no longer see the need to use Intel’s latest and greatest chips.
At the same time, Apple has seen the need to use powerful GPUs in its computers. You can’t buy a Mac today that doesn’t have a robust GPU of some sort. Even the 15- and 17-inch MacBook Pros pair their integrated graphics with an nVidia GPU, just in case you need it. Thanks to the OpenCL spec, such GPUs can be used for more than just real-time graphics rendering, taking on general computing tasks as well.
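To make that OpenCL point concrete, here is a minimal sketch (in C, with error checking omitted for brevity) of the kind of general computing task a GPU can take on: adding two arrays of numbers. The kernel name vecadd and the other specifics are my own illustration, not anything Apple ships; on a Mac it would build against the OpenCL framework that arrived with Snow Leopard.

    /* vecadd.c: a minimal, illustrative OpenCL sketch, not Apple code.
       Error checking is omitted for brevity.
       Build on a Mac with: cc vecadd.c -framework OpenCL */
    #include <stdio.h>
    #include <OpenCL/opencl.h>

    /* Kernel source, compiled at run time for whatever device we find. */
    static const char *kSource =
        "__kernel void vecadd(__global const float *a,\n"
        "                     __global const float *b,\n"
        "                     __global float *c) {\n"
        "    size_t i = get_global_id(0);\n"
        "    c[i] = a[i] + b[i];\n"
        "}\n";

    int main(void) {
        enum { N = 1024 };
        float a[N], b[N], c[N];
        for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

        /* Find the first GPU-class OpenCL device on the default platform. */
        cl_platform_id plat;
        cl_device_id dev;
        clGetPlatformIDs(1, &plat, NULL);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

        /* Build the kernel and hand it three buffers. */
        cl_program prog = clCreateProgramWithSource(ctx, 1, &kSource, NULL, NULL);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel k = clCreateKernel(prog, "vecadd", NULL);

        cl_mem bufA = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                     sizeof(a), a, NULL);
        cl_mem bufB = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                     sizeof(b), b, NULL);
        cl_mem bufC = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof(c), NULL, NULL);
        clSetKernelArg(k, 0, sizeof(bufA), &bufA);
        clSetKernelArg(k, 1, sizeof(bufB), &bufB);
        clSetKernelArg(k, 2, sizeof(bufC), &bufC);

        /* Run one work-item per element, then copy the result back. */
        size_t global = N;
        clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
        clEnqueueReadBuffer(q, bufC, CL_TRUE, 0, sizeof(c), c, 0, NULL, NULL);

        printf("c[10] = %.1f (expected 30.0)\n", c[10]);
        return 0;
    }

The appeal for Apple is that the same kernel source gets compiled at run time for whichever OpenCL device is present, which is exactly why the company cares whether a given Mac’s GPU can accept this kind of work at all.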
So you have a company that no longer seems to care as much as it once did about Intel’s CPUs, but that cares more and more about GPUs. While I can’t imagine Apple dropping Intel altogether, these two factors make me wonder whether Apple will at least consider using CPUs from AMD in the next two years.

The AMD Option

AMD’s CPU–GPU strategy is a little different from Intel’s. AMD has started introducing its first Fusion class of processors, which it calls APUs (Accelerated Processing Units). These APUs combine an AMD x86 CPU with an AMD GPU on a single die. The GPUs that AMD is integrating are not only very powerful compared with Intel’s, but are also capable of running general-purpose apps via OpenCL, should a developer choose to write for them.
The first AMD processor that should be of interest to Apple-watchers is known as the E-350. Its CPU performance falls between that of an Intel Atom and a Core 2 Duo, but the chip offers much better graphics performance. (For the microprocessor architects among you, it’s effectively an out-of-order Atom paired with an 80-stream-processor GPU.) I don’t really see a spot for the E-350 in Apple’s current lineup, unless Apple wants to push out a MacBook Air that’s even smaller (or less expensive) than the 11-inch model.
Next up is the Llano. This APU will pair a CPU that’s faster than the E-350’s with a GPU that’s much faster as well. Llano could be an interesting option for Apple’s smaller notebooks, but I don’t see Apple giving up CPU performance in the larger MacBook Pros for one of these integrated AMD solutions.
Sometime in 2012, however, AMD will likely release a new, more powerful CPU core and pair it with one of its GPUs. If Apple is going to consider moving any of its products to AMD, that would be the time. Apple and AMD have been discussing Fusion for the past couple of years; whether those talks are simply a way for Apple to keep its options open is up for debate at this point. I guess we’ll find out in 2012.

Building a Sandy Bridge

Intel won’t be standing still all that time. The second generation of its Core i-series CPUs (code-named “Sandy Bridge”) will come out in January 2011. Apple would get much better overall performance and hardware-accelerated video transcoding from these chips; their adoption in Macs is pretty much guaranteed.
Sandy Bridge will have on-die graphics, but that hardware won’t support OpenCL. While I believe Sandy Bridge’s graphics will be fast enough for the majority of OS X users, I don’t think Apple will want to stop shipping OpenCL-capable GPUs in its systems. For that reason, we may continue to see discrete GPUs in most Macs sold in 2011.
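If you want to check whether a particular Mac exposes a compute-capable GPU, a quick sketch along these lines (again plain C against the OpenCL framework, with file and variable names of my own choosing) will list whatever GPU devices the runtime reports:

    /* listgpus.c: a small illustrative sketch, not Apple code.
       Build on a Mac with: cc listgpus.c -framework OpenCL */
    #include <stdio.h>
    #include <OpenCL/opencl.h>

    int main(void) {
        cl_platform_id plat;
        clGetPlatformIDs(1, &plat, NULL);

        /* Ask for up to eight GPU-class devices on the default platform. */
        cl_device_id devs[8];
        cl_uint count = 0;
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 8, devs, &count);

        for (cl_uint i = 0; i < count; i++) {
            char name[256] = "";
            clGetDeviceInfo(devs[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
            printf("OpenCL GPU %u: %s\n", i, name);
        }
        if (count == 0)
            printf("No OpenCL-capable GPU reported.\n");
        return 0;
    }

On a hypothetical Mac with only Sandy Bridge’s on-die graphics, a list like this would presumably come back empty, which is exactly the outcome I don’t think Apple will accept.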
I’d expect to see Sandy Bridge chips appear in MacBooks, MacBook Pros, and iMacs sometime in the first or second quarter of 2011; I’d expect the Mac Pro to get Sandy Bridge sometime in 2011’s third or fourth quarter. Given the recent release of the new MacBook Air models, I wouldn’t expect the Air to get a serious update until late 2011 at the earliest.
In late 2011 or early 2012, Intel is expected to release Sandy Bridge’s successor: Ivy Bridge. That should provide a more capable GPU core than Sandy Bridge; whether or not it will meet Apple’s requirements for a compute-ready GPU is still unknown.