


GeCube Radeon X1300 256MB DDR2 (AGP)
Graphics Cards | First Looks
Thu 23 Feb 2006

Delaying the Inevitable


This is the Rialto bridge chip responsible for the AGP version of the Radeon X1300.

You'll need to connect the GeCube directly to your power supply. Too bad GeCube didn't see fit to include the Molex cable.

The GeCube Radeon X1300 has the standard output ports.

It is hardly news that the AGP interface is slowly being phased out in favour of the faster PCI Express; all new motherboards are now configured for the newcomer. While the graphics chipmakers have acquiesced to market demands with bridge chips that adapt native PCI Express GPUs to the AGP interface, the introduction of the latest generation of graphics cards from both ATI and NVIDIA seemed to spell the end. NVIDIA initially had no native AGP cards, while ATI left that to the discretion of its partners. Recently, this situation has improved somewhat. For those still clinging to their ageing but still decently fast systems, NVIDIA has a new GPU, the GeForce 7800 GS, while for ATI fans, GeCube tries to revive the AGP platform singlehandedly with its Radeon X1300 256MB DDR2.

Rialto Powered

The Rialto bridge chip introduced by ATI enabled GeCube to effortlessly adapt the Radeon X1300 from its native PCI Express form to the AGP interface. This chip is found at the back of the card and can reach rather high temperatures during operation. Also, the AGP slot is unable to supply adequate power to the GeCube Radeon X1300, so you will need to connect the card directly to your power supply through the external connector. More importantly, the GeCube Radeon X1300 brings the latest ATI graphics architecture to the AGP platform, including support for DirectX 9.0 Shader Model 3.0, advanced High Dynamic Range rendering and ATI's Avivo technology for video encoding and playback.

For the Low-End Segment

The GeCube Radeon X1300 is clocked slightly higher than the reference design. The core clock is unchanged at 450MHz, but the memory has been given a small bump of 30MHz DDR, from 500MHz DDR to 530MHz DDR. The margin is rather small, so don't expect any major performance increase. While we don't have quantitative results for this mini overclock, in our experience the effect is practically negligible in the real world.
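To put that memory bump in perspective, here is a rough back-of-the-envelope calculation. It assumes the quoted "MHz DDR" figures are effective data rates on the card's 128-bit bus (per the spec sheet); real-world gains will be smaller than the peak figures suggest.

```python
# Peak memory bandwidth before and after the GeCube's small overclock.
# Assumption: 500MHz DDR stock vs 530MHz DDR, both effective rates,
# on a 128-bit (16-byte) memory bus.

def bandwidth_gbps(effective_mhz: float, bus_bits: int = 128) -> float:
    """Peak memory bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

reference = bandwidth_gbps(500)  # stock 500MHz DDR
gecube = bandwidth_gbps(530)     # GeCube's 530MHz DDR

print(f"reference: {reference:.2f} GB/s")  # 8.00 GB/s
print(f"gecube:    {gecube:.2f} GB/s")     # 8.48 GB/s
print(f"gain:      {100 * (gecube / reference - 1):.0f}%")  # 6%
```

A 6% peak bandwidth gain rarely translates into a visible frame rate difference, which matches our experience with overclocks of this size.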

There are few equivalents to the GeCube Radeon X1300 256MB DDR2 since the AGP market is devoid of any comparable cards from the new generation of graphics processors. Hence, the competition faced by the GeCube should come from older cards, like the NVIDIA GeForce 6600. While it may seem unfair to pit the formerly mid-range GeForce 6600 against the entry-level GeCube Radeon X1300, the reality is that both cards are rivals now as their prices and performance are in the same ballpark.

Based on its technical specifications and our preliminary results, the GeCube Radeon X1300 has a slight edge over the GeForce 6600 in newer games and benchmarks, while older games like Unreal Tournament 2004 and of course OpenGL-based games such as Quake 4 favour the NVIDIA card by a healthy margin.

Minimal Bundle

Never known to be generous, GeCube has provided a pretty barren bundle. It consists of only the graphics drivers, a thin user manual and an S-Video to Composite cable, so GeCube probably saved a fair bit here, savings we hope will be reflected in its retail price. Standard accessories that are missing include a DVI-to-VGA adaptor. Crucially, the required Molex power cable was not included, which could be a problem for some end-users. So if you intend to make full use of the card and its features, be prepared to spend a little more (and its price isn't all that appealing to begin with).

Our Thoughts

Frankly, there are few compelling reasons for upgrading to a low-end AGP card like the GeCube. If you have a decent AGP system, upgrading to a mid-range or better graphics card would prolong its lifespan. But if your system is lagging in almost every department, we would suggest a complete makeover instead, migrating to PCI Express. Despite the recent revival in the fortunes of AGP for both ATI and NVIDIA graphics cards, the end is nigh for the platform, particularly so for the low-end.

However, there is one niche segment of users who might be looking for exactly the capabilities of the Radeon X1300 on the AGP interface: those eyeing very large LCD displays such as the 30-inch behemoths from Dell and Apple. The low-end Radeon X1300 supports the dual-link DVI output that is mandatory to drive these ultra high-resolution displays, which previous generation graphics cards lack (including the mainstream retail GeForce 6 series). Hence, if you are currently settled on an AGP platform and require the most cost-effective solution to drive these huge displays, the GeCube Radeon X1300 AGP card comes in very handy.
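The dual-link requirement comes down to simple arithmetic: a single DVI link tops out at a 165MHz pixel clock, while a 2560 x 1600 panel at 60Hz needs well over that. The sketch below illustrates this; the blanking totals are approximate reduced-blanking figures used for illustration, not exact panel timings.

```python
# Why dual-link DVI is mandatory for 30-inch 2560x1600 displays.
# Assumption: approximate reduced-blanking totals of 2720x1646 for a
# 2560x1600 @ 60Hz mode; exact timings vary by panel.

SINGLE_LINK_LIMIT_MHZ = 165  # single-link ceiling per the DVI 1.0 spec

def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock in MHz for a given total (active + blanking) timing."""
    return h_total * v_total * refresh_hz / 1e6

clock = pixel_clock_mhz(2720, 1646, 60)
print(f"required pixel clock: {clock:.1f} MHz")  # ~268.6 MHz
print(f"links needed: {1 if clock <= SINGLE_LINK_LIMIT_MHZ else 2}")  # 2
```

At roughly 268MHz, the mode is far beyond what one link can carry, so the second link is not optional, which is why single-link cards like the mainstream GeForce 6 series cannot drive these panels at native resolution.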

For all other folks, even if you do require a low-end AGP card, there seems little value in getting a Radeon X1300, as the price for the GeCube is estimated at US$135, making it more expensive than the comparable GeForce 6600, which hovers around US$100. This unfortunate situation applies not just to the GeCube, but to the Radeon X1300 AGP in general. And the case for the GeCube Radeon X1300 256MB DDR2 looks even less palatable when you factor in its almost non-existent bundle.

Product Specifications
  • ATI Radeon X1300 GPU
  • 128-bit, 4-channel GDDR2 memory interface
  • Core clock speed: 450MHz
  • Memory clock speed: 530MHz DDR2
  • 4 pixel shaders and 2 vertex shaders
  • Ultra-Threaded shader engine supports DirectX 9.0 Shader Model 3.0
  • Advanced image quality features supports 64-bit floating point HDR rendering
  • AGP 8x interface through external bridge chip
  • 300W PSU recommended