AMD vs Intel: Gaming with No Graphics Card - Just say NO!

Just because something is cheap doesn't mean it represents value for money. Choosing the right components for your next gaming PC can mean the difference between a string of headshots and a shameful tea-bagging, but when the budget is tight you have to make some compromises.

Buying an inexpensive pre-built PC off the shelf at your local big box retailer isn't such a bad idea anymore; gone are the days when you had to spend specialist cash on a kick-ass gaming rig just to keep ahead of the curve. In the past, cheap desktops stayed cheap because manufacturers left out a component that is critical for gamers: a graphics card. For those totally in the dark about PC hardware, the graphics card is the slab of technology inside the PC that renders your games' 3D graphics to your monitor at high speed. Simply put, the better (and more expensive) the graphics card, the flashier your games will look. To deliver 3D graphics without a dedicated video card, cheap desktops used "integrated graphics", which was industry code for "a complete waste of time". This was a heavily cut-down graphics chip built into the motherboard (the big board that everything else plugs into) of your PC, and it was pathetically bad at running games.

Where integrated graphics were once the dirty words of PC building, chip makers AMD and Intel are both out to kill the graphics card with a variation of this old solution. They're doing this by bringing integrated graphics into the future. Rather than just beef up the solutions they already had, the companies have gone for the throat and built these graphics processors right into the main system processor, the CPU (Central Processing Unit), which is the brains of the PC. Thus the APU (Accelerated Processing Unit) was born, combining the CPU and graphics card into a single (and hopefully capable) product. A few short months ago it was a given that you needed a graphics card in your system to get any kind of gaming action going. Today it's supposedly a different story and, depending on what kind of games you play, you can save serious cash simply by not buying a graphics card.

Will these new chips let you have the same kind of gaming experience that we're used to, while still saving that money? We did the testing to find out exactly what you get when you don't spend a single penny. The answer probably won't surprise you, despite what the advertisements say.

Which chips include graphics?
Monster chip giant Intel now build graphics capability into the vast majority of their processors. The key words to look for when choosing an Intel processor for its graphics capabilities are "Intel HD Graphics". Intel HD Graphics comes in two flavours: the HD Graphics 2000 sits at the lower end, while the HD Graphics 3000 is moderately faster. Intel describe the HD Graphics 3000 as providing casual mainstream gaming capabilities along the lines of an "entry level" graphics card. What that actually means is anyone's guess.

Plucky underdog AMD are betting the farm on their CPU/graphics combo, launching a huge series of new chips under a new brand called Fusion. AMD are the ones who came up with the APU term we've used elsewhere in this article. AMD have been playing a long game with this technology, buying old graphics stalwart ATI Technologies in 2006 to begin the process of melding chalk and cheese. Any AMD processor sold under the Fusion brand name includes graphics on the chip, but the speed of the graphics is tied to the model of the processor. Pay more, get a faster CPU and faster graphics as well.
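Not sure what's inside the box you already own? The quick Python sketch below (our own illustration, not an official tool from either company) asks the operating system which display adapters it can see, using the stock wmic utility on Windows or lspci on Linux. If the only adapter listed is an Intel HD Graphics or AMD Radeon entry and there's no separate card in the machine, you're running on the chip's built-in graphics.

    import platform
    import subprocess

    def graphics_adapter_names():
        """Return the display adapter names the OS reports."""
        if platform.system() == "Windows":
            # The stock wmic tool lists every video controller by name.
            out = subprocess.check_output(
                ["wmic", "path", "win32_VideoController", "get", "name"],
                text=True,
            )
            return [line.strip() for line in out.splitlines()[1:] if line.strip()]
        # On Linux, lspci lists PCI devices; graphics adapters show up
        # as "VGA compatible controller" or "3D controller" entries.
        out = subprocess.check_output(["lspci"], text=True)
        return [line for line in out.splitlines() if "VGA" in line or "3D" in line]

    if __name__ == "__main__":
        for name in graphics_adapter_names():
            print(name)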

Both companies spend a lot of time and effort telling people just how awesome their products' graphics capabilities are – and for normal people it's a step up from what they're used to. The real story is different for us gamers, as today's blockbuster PC games are very demanding in terms of the graphics hardware needed to provide the best visual experience. While any PC can run your mother's favourite game of Solitaire, it takes some serious grunt to run the latest Battlefield title.

In the past, if you wanted to be a hardcore PC gamer, you'd build a gaming PC by buying a middle-of-the-road CPU and pairing it with the best graphics card you could afford. Whether it was a super-jumbo of the graphics card world like the $750 NVIDIA GTX 590, or a more modest $180 Radeon HD 6870, the graphics card was a core component of any gaming rig. So AMD and Intel face a very tough job – replicate the same kind of performance delivered by a big graphics card, then shrink it down so that it fits into one small chip. Did they manage it?

The answer is no, not even close.

The Truth Is Out There
The performance level of these new APUs competes only with the cheapest graphics cards available today. We grabbed a couple of these new products, AMD's $140 Fusion A8-3850 APU and Intel's $220 Core i5-2500K, and ran them through some of our favourite games to see how they stacked up. Without going into endless detail, or even worse, death by poorly made graphs, there is a pretty clear difference between what these chips are capable of and what a moderately priced graphics card can do.

First off the rank is StarCraft II. It's not the newest game around, but like the original StarCraft it's going to be around for a long, long, long time. (Well, at least until Diablo III comes out, 'cause all bets are off after that. You won't see me, that's for sure.) Realistically, both the AMD and Intel chips – without the help of any dedicated graphics card – are capable of playing StarCraft II at an HD resolution with all the eye candy set to medium. Yes, that's rather impressive for a graphics card-free system. We'd class the AMD chip's performance as good, playing nicely with a minimum of jerkiness. The Intel chip wasn't quite as compelling, being noticeably jerky compared to the AMD setup beside it.
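A quick aside on what "jerky" means in numbers: an average frame rate can look healthy while the slowest frames – the ones you actually feel as stutter – are terrible. The toy Python sketch below uses made-up frame times (it's an illustration, not our benchmarking harness) to show how two runs with the same average FPS can feel completely different once you check the worst 1% of frames.

    def fps_stats(frame_times_ms):
        """Turn per-frame render times (ms) into average FPS and
        '1% low' FPS - the average speed of the slowest 1% of frames,
        which is where stutter lives."""
        n = len(frame_times_ms)
        avg_fps = 1000.0 * n / sum(frame_times_ms)
        worst = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
        return avg_fps, 1000.0 * len(worst) / sum(worst)

    smooth = [33.0] * 100               # a steady ~30 FPS, every frame
    jerky = [23.0] * 90 + [123.0] * 10  # same average, with big hitches

    for label, run in (("smooth", smooth), ("jerky", jerky)):
        avg, low = fps_stats(run)
        print(f"{label}: avg {avg:.0f} FPS, 1% low {low:.0f} FPS")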

We repeated the same tests with the graphically demanding Metro 2033, just for a bit of run and gun action, and got pretty poor performance from both. It was only when we dropped the resolution to 1024 x 768 that the game became playable, with the AMD offering wiping the floor with Intel.

There is one game, though, that both chips smashed through: Blizzard's World of Warcraft: Cataclysm. The AMD Fusion A8-3850 was noticeably smoother, but the Intel wasn't a slouch either. It's no surprise really, as WoW is an older game which doesn't present much of a challenge in the graphics department. If all you want to play is WoW – and that's actually a big percentage of PC gamers – either chip will do the job, with the AMD being the preferred solution.

So that's two out of three that are quite playable, and as we tried more games we noticed a few trends. First and foremost, the AMD chip is far better than the Intel offering – to be blunt, we wouldn't recommend the Intel chip for any gaming purposes at all. The second conclusion, and it's a biggie, is that the AMD chip is fine for gaming provided you run at a low resolution (1024 x 768) with minimum graphical detail settings. Yes, it's a big sacrifice to make – most PC games look worse than the console versions when run at this detail level.
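To put that resolution sacrifice in perspective, here's the back-of-the-envelope arithmetic (assuming "HD" means full HD at 1920 x 1080): dropping to 1024 x 768 cuts the number of pixels the chip has to draw every frame by more than 60 per cent.

    # Pixels per frame at each resolution - fewer pixels means far
    # less work for the graphics hardware on every single frame.
    hd = 1920 * 1080   # full HD: 2,073,600 pixels
    low = 1024 * 768   # our fallback resolution: 786,432 pixels

    print(f"{hd:,} vs {low:,} pixels per frame")
    print(f"Roughly {hd / low:.1f}x fewer pixels at 1024 x 768")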

If you want to play the newest games the way the developers intended, with the eye candy cranked up at HD resolution, you will still need to buy a dedicated graphics card. The integrated graphics and APUs turning up in systems at the bottom end of the budget range just don't make the cut. The pure grunt required to churn through all the fancy effects and fast frame rates of today's blockbuster games simply isn't there yet.

The good news is that these chips are much better than the integrated graphics of ye olden days, and the next generation is only going to get better. Whether or not they'll ever catch up to the supercomputers-on-a-card that are today's graphics cards is debatable, but they'll at least offer solid gaming performance for those on the tightest of budgets.