The Wii U — What is an R700 GPU and why does it matter?
So we’ve all heard the rumors about Nintendo’s upcoming Wii U console, and how it’s going to be some strange tablet-y thing with a touchscreen and analog sticks and a microwave and a cat. Or something. It’s all very interesting and strange and I don’t really care because it’s going to be obsolete the minute you finish the next Zelda title. Here’s why.
The rumor mill started saying early on that the Wii U would use an AMD/ATI Radeon R700 GPU, and it’s been all but confirmed by Nintendo. But what does that mean? What is an R700 GPU?
Complicated and relevant nonsense
Let’s start with the basics. A GPU, or Graphics Processing Unit, is the chip in a computer or console that processes geometry, textures, lighting, and post-processing effects and composites them into the image you see on your screen when you play a video game. Modern GPUs use dozens, hundreds, or even thousands of tiny, specialized processing cores called “shaders” to execute the complex mathematical operations that turn a collection of binary digits and code into an actual image. A system’s GPU is the primary determinant of the quality of the graphics it can display.
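To make “shaders” a bit less abstract, here’s a toy sketch of the sort of per-pixel math one of those cores grinds through, written as ordinary Python rather than real shader code. The function names and numbers are made up purely for illustration; a real GPU runs this kind of function on hundreds of cores in parallel, one invocation per pixel.

```python
# A rough illustration (plain Python, not real GPU code) of the kind of
# per-pixel math a shader core performs: simple diffuse (Lambert) lighting.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def shade_pixel(surface_normal, light_dir, base_color):
    """Return an (r, g, b) value for one pixel using clamped N.L diffuse lighting."""
    n = normalize(surface_normal)
    l = normalize(light_dir)
    intensity = max(0.0, sum(a * b for a, b in zip(n, l)))  # dot product, clamped at 0
    return tuple(int(c * intensity) for c in base_color)

# "Render" a tiny 2x2 image; every pixel gets the same toy inputs here.
framebuffer = [[shade_pixel((0.0, 0.0, 1.0), (0.3, 0.4, 0.87), (255, 128, 64))
                for _ in range(2)] for _ in range(2)]
print(framebuffer)
```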
There are a number of factors that affect graphics performance: the number of shader cores, the clock speed they run at, the amount and speed of the memory, the bandwidth of the bus linking the chip to the CPU, and so on. It’s all very complicated, but unless you’re a PC gamer you don’t have to worry about it too much, and if you are a PC gamer, just ask an older PC gamer and they can do the hard thinking stuff for you.
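If you want a feel for how a couple of those factors combine, the usual back-of-the-envelope estimate of raw shader throughput is just shader count × clock speed × operations per clock. Here’s a tiny sketch of that math; the HD 4870 figures below are its commonly quoted specs, and the result is a theoretical paper peak, not real-world performance, which also hinges on memory bandwidth, drivers, and the game itself.

```python
# Theoretical peak shader throughput: shaders x clock x ops per clock.
# Each R700 stream processor can issue a fused multiply-add, i.e. 2 ops per clock.
def peak_gflops(shader_count, clock_mhz, ops_per_clock=2):
    return shader_count * clock_mhz * ops_per_clock / 1000.0

# Radeon HD 4870: 800 shaders at 750 MHz -> 1200.0 GFLOPS on paper.
print(peak_gflops(800, 750))
```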
The GPUs in current generation consoles are old. Very old. The PS3’s GPU is based on Nvidia’s GeForce 7800/7900 series, which is so old it predates the modern unified shader paradigm, instead using fixed-function hardware for specific rendering tasks. The Xbox 360’s Xenos GPU is a predecessor to the ATI Radeon HD 2600 graphics card from 2007; it does at least have unified shaders, but it was literally one of the first GPUs to use them, and the implementation is primitive.
The Wii is even worse off. While its exact graphics architecture is unknown due to Nintendo’s silence and harsh non-disclosure agreements, it’s no secret that it is massively inferior in graphics processing power to the Xbox 360 and PS3.
The bad news
So with the new generation of consoles fast approaching, surely we can expect some better graphics, right? Nintendo may not grab the top performance crown, but they can at least try to surpass the current generation, right? Right?
And so we thought. When the rumor mill informed us that the Wii U would use an R700 GPU there was much rejoicing. R700 is the graphics architecture behind AMD/ATI’s Radeon HD 4000 series graphics cards. This was a highly successful series for AMD: their HD 2000 series had been stomped flat by Nvidia’s GeForce 8, and their Radeon HD 3000 series had been largely ignored. The flagship 4870 and the 4850 came in and dominated the market against Nvidia’s GeForce 200 series on sheer bang for the buck, although they didn’t quite nab the top performance crown.
So what’s the problem? Well, for starters, apparently Nintendo thinks this is still 2008. Radeon 4000? Come on, Mr. Iwata, we’re on Radeon 7000 and GeForce 600, with Radeon 8000 and GeForce 700 fast approaching. The 4870 is a lower-midrange card by today’s standards and will soon be totally obsolete.
More bad news
But wait, why are we assuming Nintendo will use a 4870 GPU? If that were the case, why not say specifically “Radeon RV770 XT GPU” (the designation of the HD 4870’s chip)? There were tons of graphics cards in the Radeon HD 4000 series, from the behemoth dual-GPU Radeon 4870 X2 down to the low-end 4350 and the 4200 integrated solution. Not to mention the 4850 X2, 4890, 4860, 4850, 4830, 4770, 4750, 4730, 4670, 4650, and 4550, plus the mobile line as well.
So which is Nintendo using? Well, they aren’t just reusing the exact chips that shipped four years (FOUR YEARS) ago. At the very least it seems likely they’ll do a die shrink to 32nm (basically moving to a more modern manufacturing process that reduces power consumption). As for how powerful that chip will be… well, more powerful chips cost more money, sources say the Wii U is being built on the cheap, and developers have said that the Wii U has fewer shader cores than current gen consoles.
Wait, fewer? The Xbox 360 has only 48 shaders. Considering that the R700’s shaders come in blocks of five (a configuration called “VLIW5”), that means the Wii U has at most 40 shaders. Umm. That’s half as many shaders as the 4350, a card never even intended for gaming, an HTPC card that could barely decode Blu-ray. That sounds like the Radeon HD 4200 IGP, a chip built into motherboards and dirt-cheap laptops with the bare minimum specs to drive ~720p displays. The most high-fidelity game you could hope to play on it is Doom 3, on the lowest settings.
Well, maybe we’re interpreting it wrong. Fewer shaders… well, R700’s shaders are grouped into clusters of five. Maybe they meant shader clusters, not total shaders. That would be 40 × 5 = 200 shaders. That’s more than the 4350, but still fewer than the 4650. For comparison, the 4650 is about as powerful as Nvidia’s current lowest-end graphics card, the GT 520.
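For anyone who wants the arithmetic behind those two readings laid out, here’s a quick sketch using the numbers from the last two paragraphs. To be clear, the Wii U figures are speculation stacked on rumor, not confirmed specs.

```python
# The two readings of "fewer shaders than current gen", using the figures above.
# None of the Wii U numbers here are confirmed specs.

XENOS_SHADERS = 48    # Xbox 360's unified shader count
VLIW_WIDTH = 5        # R700 groups its ALUs into VLIW5 blocks of five
HD4350_SHADERS = 80   # Radeon HD 4350, for scale
HD4650_SHADERS = 320  # Radeon HD 4650, for scale

# Reading 1: "fewer" means fewer individual shader ALUs than the 360 has.
low_estimate = 40                             # the "at most 40" figure above
print(low_estimate < XENOS_SHADERS)           # True
print(low_estimate / HD4350_SHADERS)          # 0.5 -> half of a 4350

# Reading 2: "fewer" means fewer VLIW5 clusters, so roughly 40 clusters of 5 ALUs.
high_estimate = 40 * VLIW_WIDTH               # 200 shaders
print(HD4350_SHADERS < high_estimate < HD4650_SHADERS)  # True: between a 4350 and a 4650
```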