Graphics Cards Guide
Mainstream DirectX 10
When NVIDIA introduced the first DirectX 10 graphics card for the PC last year, the GeForce 8800 GTX, most of us could only gawk at its powerful performance and equally impressive price tag from the sidelines. It was undoubtedly the fastest card then and it still is, but it also remains out of the reach of most users. Those with limited budgets could only look forward to the inevitable lower and mid-range variants of NVIDIA's new architecture.
Fast forward six months and the highly anticipated mid-range GeForce 8 cards are finally ready to be unveiled, with two separate series (the GeForce 8600 and 8500) catering to a range of budgets. Leading the charge is the GeForce 8600 GTS, equipped with a new 80nm core (G84) designed for NVIDIA's GeForce 8 architecture and promising even more PureVideo HD enhancements. This series of graphics cards looks set to be the intended mid-range successor to NVIDIA's highly successful GeForce 7600 series, especially given its similar naming convention. Presently, NVIDIA has three different GeForce 8600/8500 cards shipping before May, namely the GeForce 8600 GTS, the 8600 GT and the 8500 GT (in decreasing order of performance and price). For our article today, we shall be focusing mostly on the fastest of the three, the GeForce 8600 GTS. First, let's take a look at how this newcomer stacks up on paper against some of its likely competitors in the market now.
| Model | NVIDIA GeForce 8600 GTS 256MB | NVIDIA GeForce 8800 GTS 320MB | NVIDIA GeForce 7950 GT 512MB | ATI Radeon X1950 PRO 256MB | ATI Radeon X1950 GT 256MB |
| --- | --- | --- | --- | --- | --- |
| Transistor Count | 289 million | 681 million | 278 million | 330 million | 330 million |
| Manufacturing Process (microns) | 0.08 | 0.09 | 0.09 | 0.08 | 0.08 |
| Vertex Shaders / Stream Processors | 32 Stream Processors (unified, operating at 1450MHz) | 96 Stream Processors (unified, operating at 1200MHz) | 8 | 8 | 8 |
| Rendering (Pixel) Pipelines | NIL (unified) | NIL (unified) | 24 | 12 | 12 |
| Pixel Shader Processors | NIL (unified) | NIL (unified) | 24 | 36 | 36 |
| Texture Mapping Units (TMU) or Texture Filtering (TF) units | 16 | 48 | 24 | 12 | 12 |
| Raster Operator units (ROP) | 8 | 20 | 16 | 12 | 12 |
| Memory Clock | 2000MHz DDR3 | 1600MHz DDR3 | 1400MHz DDR3 | 1380MHz DDR3 | 1200MHz DDR |
| Memory Bus | 128-bit | 320-bit | 256-bit | 256-bit | 256-bit |
| Ring Bus Memory Controller | NIL | NIL | NIL | 512-bit (for memory reads only) | 512-bit (for memory reads only) |
| PCI Express Interface | x16 | x16 | x16 | x16 | x16 |
| Power Connectors | Yes | Yes (dual) | Yes | Yes | Yes |
| Multi GPU Technology | Yes (SLI) | Yes (SLI) | Yes (SLI) | Yes (Native CrossFire ready) | Yes (Native CrossFire ready) |
| DVI Output Support | 2 x Dual-Link | 2 x Dual-Link | 2 x Dual-Link | 2 x Dual-Link | 2 x Dual-Link |
| HDCP Output Support | Yes | Yes | Yes | Yes | Yes |
| Street Price | US$199 - 229 (SRP) | ~US$299 - 309 | ~US$219 - 249 | ~US$159 - 199 | ~US$149 |
The G84 Core
The NVIDIA GeForce 8600 GTS uses a brand-new variant of the GeForce 8 core, the G84, manufactured on an 80nm process compared to the 90nm used for the original G80. As might be expected from a mid-range card, the G84 core is a watered down version of the G80, with only 32 unified shaders (or as NVIDIA calls them, stream processors), a third of the 96 found on the GeForce 8800 GTS 320MB. NVIDIA did compensate somewhat by clocking these stream processors at 1450MHz instead of the 1200MHz on the GeForce 8800 GTS. At 675MHz, the core clock is also the highest we have seen for a GeForce 8 card so far, a full 100MHz faster than the GeForce 8800 GTX.
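To put those clocks and unit counts in perspective, the theoretical fill rates follow directly from the figures in the specification table. A quick sketch of the arithmetic (illustrative only; real-world throughput depends on drivers, workload and memory bandwidth):

```python
# Theoretical fill rates for the GeForce 8600 GTS, derived from
# the specification table above (illustrative arithmetic only).
core_clock_mhz = 675   # G84 core clock
tmus = 16              # texture mapping units
rops = 8               # raster operators

texture_fill_mtexels = core_clock_mhz * tmus  # MTexels/s
pixel_fill_mpixels = core_clock_mhz * rops    # MPixels/s

print(f"Texture fill rate: {texture_fill_mtexels / 1000:.1f} GTexels/s")  # 10.8
print(f"Pixel fill rate:   {pixel_fill_mpixels / 1000:.1f} GPixels/s")    # 5.4
```

With only 8 ROPs, the pixel fill rate is where the G84 gives up the most ground to its bigger siblings.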
All these high clocks, however, are constrained by the narrow 128-bit memory bus on the new cards. Despite hopes that we would see an increase in memory bandwidth for this new generation of graphics cards, NVIDIA has kept to a 128-bit bus for the GeForce 8600 and 8500. This is the same as the GeForce 7600 series and gives the GeForce 8600 GTS a total bandwidth of 32.0GB/s even with its vastly superior 2000MHz DDR memory clock, which compares unfavorably against existing cards equipped with a 256-bit memory bus, like the GeForce 7900 GS. Appropriately, the memory size is set at 256MB; given these bandwidth constraints, we doubt a larger frame buffer would significantly boost performance. The crux of the matter is keeping costs low, and the specifications of this new series clearly look to be doing just that.
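The 32.0GB/s figure quoted above falls straight out of the bus width and effective memory clock. A minimal sketch of the derivation:

```python
# How the GeForce 8600 GTS's 32.0GB/s memory bandwidth figure
# is derived from the quoted specifications.
effective_mem_clock_mhz = 2000  # 1000MHz GDDR3, double data rate
bus_width_bits = 128

bytes_per_transfer = bus_width_bits / 8                         # 16 bytes per transfer
bandwidth_mb_s = effective_mem_clock_mhz * bytes_per_transfer   # 32000 MB/s
print(f"Memory bandwidth: {bandwidth_mb_s / 1000:.1f} GB/s")    # 32.0
```

By the same formula, a 256-bit card like the GeForce 7900 GS reaches far higher bandwidth at a lower memory clock, which is exactly why the 128-bit bus is the bottleneck here.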
Power and Other Notes
As you may have noticed too, the transistor count for the G84 core has been significantly reduced, from the record-breaking 681 million on the GeForce 8800 series (G80 core) to a more reasonable 289 million on the new core for the mid-range parts. Power consumption will definitely be lower than its high-end counterparts (NVIDIA rates the maximum TDP of the GeForce 8600 GTS at 71W), and only the GeForce 8600 GTS requires a 6-pin power connector. The slower GeForce 8600 GT and 8500 GT will not need one. NVIDIA's recommended minimum PSU is a 350W unit for a single GeForce 8600 GTS and a 450W unit if you intend to set up a pair of these cards in SLI.
On a related note, NVIDIA has also said that in a break from the past, these new cards will have identical clocks for both 2D and 3D modes. In other words, the clock speeds are no longer lowered in 2D and then raised when running 3D applications. NVIDIA feels that with the modest TDP of the GeForce 8600/8500 series, there are few power savings to be had by doing so. While we would beg to differ on this matter, the fact is that Windows Vista does continuously utilize the GPU to keep its fancy Aero user interface running slick and pretty. Vista (fortunately or unfortunately) will very likely become the de facto operating system in time, and in that light NVIDIA's decision sounds valid. Windows XP users, however, could still have benefited from further power savings had NVIDIA retained dual clock speeds. Looking at the future usage model, and from the perspective of simplifying GPU design, we can understand the decision.
So far, judging by its modest specifications, the NVIDIA GeForce 8600 GTS doesn't look like it will be setting any speed records in the gaming department, but NVIDIA has something else up its sleeve: a major boost to its PureVideo HD technology. We'll discuss the enhancements next.