Today’s Review: Radeon R9 Fury X

Behold the beast: Full AMD Radeon R9 Fury X tech specs and design details revealed

AMD’s formal unveiling of the beastly new Radeon R9 Fury X at E3 earlier this week revealed a lot about the graphics card, but several technical details were conspicuously missing. Today, AMD’s taking the wraps off the rest of the information, giving us a full profile of its impressive new $650 flagship—a flagship where just as much care was spent on aesthetics as on raw technological firepower.

The rest of AMD’s new Radeon R300 series cards are hitting the streets today, which we covered in a separate post.

We’ll go through it all in detail, but let’s kick things off with the premier feature: The Fury X’s revolutionary high-bandwidth memory.

AMD’s HBM design.

Traditional GDDR5 DRAM dies need to be arrayed on the board around the graphics processor, which sucks up a ton of space on the card. HBM is a new technology that stacks DRAM vertically instead, connecting the dies and the GPU via an interposer. You can read all about HBM here, but in a nutshell, it requires far less room on the graphics card and also delivers a ton of memory bandwidth by pairing low clock speeds with a ridiculously wide memory interface.

Specifically, the HBM in the Radeon R9 Fury X is clocked at a mere 1Gbps. That may seem paltry compared to the 7Gbps speeds standard to the traditional GDDR5 memory in Nvidia’s flagship graphics cards. But Nvidia’s GDDR5 memory travels over a 384-bit-wide interface, while the Fury X’s 4GB of HBM utilizes a 4,096-bit bus. Yes, you read that correctly. That combination gives the Fury X 512GBps of total memory bandwidth, compared to the ferocious GTX 980 Ti’s 336.5GBps.
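If you want to sanity-check those numbers yourself, the arithmetic is simple: bus width times per-pin data rate, divided by eight bits per byte. Here’s a quick back-of-the-envelope sketch using the approximate figures quoted above:

```python
def memory_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin rate (Gbps) / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

# Radeon R9 Fury X: 4,096-bit HBM interface at 1Gbps per pin
print(memory_bandwidth_gbs(4096, 1.0))  # 512.0 GB/s

# GeForce GTX 980 Ti: 384-bit GDDR5 interface at roughly 7Gbps per pin
print(memory_bandwidth_gbs(384, 7.0))   # 336.0 GB/s (336.5 at the card's exact data rate)
```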

Craziness.

The AMD Fiji GPU at the heart of the Fury X.

AMD Radeon R9 Fury X tech specs. (Click to enlarge.)

HBM’s drastically reduced footprint also lets AMD pack a ton of tech into its new Fiji GPU—literally. Fiji rocks 4,096 stream processors and 8.9 billion transistors, compared to the older R9 290X’s 2,816 stream processors and 6.3 billion transistors. (Nvidia’s Titan X packs 8 billion.) Clocked at up to 1,050MHz, it’s able to pump out up to 8.6 teraflops of compute performance. You can see the full tech specs for the Fury X’s HBM and Fiji processor in the chart at right.

All that power needs a pair of 8-pin connectors and 275 watts from the wall under heavy gaming scenarios, which is similar to the 980 Ti’s needs. (Nvidia’s card asks for 250W).

So now for the elephant in the room: How does the Fury X compare against Nvidia’s similarly priced GeForce GTX 980 Ti? It’s impossible to tell until we’ve put the Radeon through its review paces, given that AMD’s stream processors and Nvidia’s CUDA core technology aren’t directly comparable, and HBM adds an unknown factor. But these AMD-supplied benchmarks—which were obviously chosen to place the Radeon in the best possible light—show the two cards performing fairly neck-and-neck in most games. You’ll find the graphics settings AMD used in each game here.

AMD-supplied benchmark results pitting the Radeon Fury X against Nvidia’s GeForce GTX 980 Ti in several games at 4K resolution.

But the liquid-cooled Fury X was made to be overclocked. “You’ll be able to overclock this thing like no tomorrow,” AMD CTO Joe Macri said at the card’s unveiling. “This is an overclocker’s dream.” So here are more AMD-supplied benchmarks showing performance gains in various games after a 100MHz overclock is applied to the Fury X.

Remember: That’s all with the Radeon R9 Fury X being water-cooled—Nvidia’s 980 Ti relies on air. You have to wonder how the benchmarks will shake out when the air-cooled Radeon R9 Fury launches July 14. Hey! That’s a nice segue to…

AMD’s new flagship draws a lot of design cues from the Radeon R9 295X2, AMD’s immensely powerful dual-GPU graphics card from the R200 series generation.

As mentioned, the Radeon R9 Fury X sports a fully integrated water cooling solution. It cools all elements of the graphics card, eliminating the need for a fan on the card’s board, which allowed AMD to eliminate the grill on the rear port bracket and extend the shroud to the sides of the graphics card—an area left open in many graphics card designs. Locking down the card so tightly prevents heat from your other PC components from interfering with the Fury X’s cooling, AMD representatives said.

AMD’s Radeon Fury X graphics card alongside the fan and radiator for its integrated liquid cooling.

The closed-loop liquid cooling solution itself is a custom design dreamed up by AMD and Cooler Master, paired with a 120mm Nidec Gentle Typhoon fan on the radiator. That fan can spin up to 3000 rpm, though representatives say it mostly spins at a much quieter 1500 rpm. AMD claims the liquid cooling keeps temperatures at a chilly 50 degrees Celsius—similar performance to the Radeon R9 295X2’s integrated liquid cooling—with noise levels around 35 decibels. Hey overclockers: AMD says this cooler supports up to 500 watts of thermal capacity.

In case it isn’t obvious yet, the Fury X uses a truly unique design. So unique, in fact, that AMD’s add-in board partners (like Asus, MSI, and Sapphire) won’t be able to customize the card with their own cooling solutions. The Fury X will be reference design-only, though AIBs will be able to tinker with the air-cooled Radeon R9 Fury launching in July.

That means all Fury X cards will be physically similar no matter which manufacturer you buy from.

The Fury X measures a mere 7.5 inches long, or 30 percent shorter than the older R9 290X. It’s constructed of multiple pieces of die-cast black nickel aluminum, finished with a mirror gloss on the exoskeleton and black soft-touch on the side plates. Removing four hex screws will let you take off the shroud; the Fury X also features a full backplate. (Yes!)

Port-wise, you’ll find three full-size DisplayPorts as well as an HDMI 1.4a connection. AMD learned the folly of the Radeon R9 295X2’s heavy reliance on Mini-DisplayPort connections, it seems, though the lack of HDMI 2.0 means you’ll be limited to 30Hz when pushing 4K video through that port. The Fury X is capable of driving up to six displays simultaneously, though doing so would obviously require a DisplayPort hub.

The AMD Radeon R9 Fury X’s backplate, side shroud, and power pins (with “GPU tach” lights).

You’ll find an LED-illuminated Radeon logo on the face and outer edge of the card, as well as a new feature: 8 small lights located above the 8-pin power connectors. Dubbed “GPU tach” (as in “tachometer”) by AMD, more of these lights will flare to life the harder you push your graphics card—a nifty gimmick, though I’m not sure that cranking it to 8 has quite the same allure as cranking it to 11. A ninth green LED will illuminate when the GPU is put to sleep by AMD’s ZeroCore technology.

Speaking of cranking it to 11—er, 8—AMD’s PR keeps stressing that the Fury X will be a kick-ass overclocker. The card’s design speaks to that, featuring a dual BIOS switch, 6-Phase power design with up to 400 amps of power delivery, and AMD’s standard SVI2 interface to the voltage regulator, which sports full telemetry readback and lets you tinker with power settings via AMD’s PowerTune. (If you didn’t understand any of that, don’t sweat it—they’re hardcore overclocking features.) And while the Fury X typically draws just 275W of power while gaming, the dual 8-pin connectors support up to 375W. Read: OVERCLOCK ME.
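That 375W ceiling follows straight from the PCIe power-delivery spec, as a quick sanity check shows:

```python
# Where the 375W ceiling comes from (standard PCIe power ratings):
eight_pin_w = 150   # each 8-pin PCIe power connector is rated for 150W
slot_w = 75         # the PCIe x16 slot itself provides up to 75W
print(2 * eight_pin_w + slot_w)  # 375 W total available to the card
```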

Finally, the Fury X supports all the software features you’d expect: the next-gen DirectX 12 and Vulkan APIs, FreeSync, Virtual Super Resolution, the aforementioned PowerTune, and AMD’s new frame rate targeting control, which allows you to set a maximum frame rate output to reduce power draw and, by association, noise output. Here are more AMD-supplied benchmarks showing FRTC in action:
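In principle, a frame-rate target like FRTC is just frame pacing: render a frame, then idle until the next frame is due, so the GPU spends less time (and power) producing frames your monitor never shows. Here’s a minimal, hypothetical sketch of that idea—not AMD’s actual driver logic:

```python
import time

def run_with_frame_cap(render_frame, target_fps: float = 60.0) -> None:
    """Minimal frame-rate cap: sleep away leftover frame time so the
    GPU idles instead of rendering frames beyond the target rate."""
    frame_budget = 1.0 / target_fps
    while True:
        start = time.perf_counter()
        render_frame()  # draw one frame (placeholder for real rendering work)
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            # Idle time translates directly into lower power draw and less fan noise.
            time.sleep(frame_budget - elapsed)
```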

How does it feel?

There you have it: Every single tech spec you need to know about AMD’s new flagship, the water-cooled Radeon R9 Fury X. Of course, all the tech specs in the world don’t mean a thing next to the numbers that really matter: Benchmark results.

The Fury X walks the spec walk, but can it talk the performance talk against Nvidia’s similarly priced GeForce GTX 980 Ti? That remains to be seen. But we’ll no doubt have the answer sooner rather than later, considering that the Radeon R9 Fury X hits the streets next week, on June 24.

Author: Brad Chacos, Executive editor

Brad Chacos spends his days digging through desktop PCs and tweeting too much. He specializes in graphics cards and gaming, but covers everything from security to Windows tips and all manner of PC hardware.

AMD Radeon R9 Fury X vs. Nvidia GeForce GTX 980 Ti

by Nikita Fedorov
September 15, 2015

When AMD announced the Radeon R9 Fury X as a potential “killer” of Nvidia’s GeForce GTX 980 Ti, the whole gaming community was intrigued: would it be for real? Since then, we’ve all been waiting impatiently for the release to lay our hands on the “Red Fury” and find out. So, let us see what AMD has accomplished this time and whether it has created a real competitor to the Nvidia GTX 980 Ti. Which is the best card to put in your custom gaming PC? Which one will let you play games at high frame rates and 4K?

Looking into the future

This is not the first time AMD has released a product with innovative features. Many of us remember the famous Radeon HD 4870, the world’s first graphics card to incorporate high-speed GDDR5 memory. This time, apart from the newest 28nm GPU built on the Fiji architecture, AMD again comes first with High Bandwidth Memory (HBM).

HBM is a so-called stacked or “3D” memory technology developed by AMD in cooperation with SK Hynix. Unlike GDDR5, the memory chips are stacked on top of each other rather than placed side by side, and communication with the GPU is handled by a so-called interposer. Stacking the memory increases memory bandwidth while requiring less physical space on the circuit board; AMD says HBM can occupy up to 94% less space than traditional GDDR5 modules. HBM runs at a relatively low voltage (1.3V vs. 1.5V for GDDR5) and lower clock speeds (500MHz vs. 1750MHz for GDDR5), and its per-pin transfer rate is also slower: 1Gbps vs. 7Gbps. However, an exceptionally wide interface makes up for these attributes. Each DRAM die in the stack can transfer data over two 128-bit-wide channels, so each stack has an aggregate interface width of 1,024 bits (versus 32 bits for a GDDR5 chip). At 1Gbps, that works out to 128GB/s of bandwidth per memory stack, sidestepping the limitations inherent to GDDR5. It can boldly be said that HBM is an evolutionary step for AMD’s products and for the Fury X in particular: although the current implementation is limited to 4GB of capacity, it delivers unbelievably high throughput over an unprecedented 4,096-bit-wide memory interface.
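The per-stack arithmetic is easy to verify; this short sketch restates the figures quoted above (the four-dies-per-stack and four-stack counts follow from the quoted channel and bus widths):

```python
# Per-stack HBM bandwidth, using the figures quoted above.
dies_per_stack = 4            # four DRAM dies per HBM stack
channels_per_die = 2          # two 128-bit channels per die
channel_width_bits = 128

stack_width_bits = dies_per_stack * channels_per_die * channel_width_bits  # 1,024 bits
data_rate_gbps = 1.0          # 1Gbps per pin

per_stack_gbs = stack_width_bits * data_rate_gbps / 8  # 128.0 GB/s per stack
total_gbs = 4 * per_stack_gbs                          # four stacks: 512.0 GB/s on a 4,096-bit bus
print(per_stack_gbs, total_gbs)
```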

Thermals, Noise, Power, and Performance

No doubt, the R9 Fury X runs cooler in every way than the reference air-cooled GTX 980 Ti, thanks to its pre-installed water-cooling system. But when it comes to non-reference Nvidia designs, like the EVGA 980 Ti Hybrid, for example, the results are surprisingly the opposite. The R9 Fury X takes a bit longer to warm up, which just means its cooler is less aggressive at the start, but the temperature continues to climb throughout the tests. That’s not necessarily a bad thing; it simply means the cooler doesn’t attack thermals as forcefully as the Nvidia solution.

Sad as it may be, the loud hum of the R9 Fury X’s pump is always audible. Even compared to the fan-equipped reference GTX 980 Ti, the noise is quite noticeable. The same goes for GTX 980 Ti water-cooling solutions: yes, they hum as well, but their hum is deeper and not as annoying as AMD’s higher-frequency whine.

As for power consumption, the picture is nearly even between the R9 Fury X (a bit less) and Nvidia’s non-reference water-cooled solutions (a bit more). The reference-design GTX 980 Ti, by contrast, consumed as much as twice the energy, which would require the customer to consider a more powerful power supply unit.

Gaming performance is generally a step behind the reference GTX 980 Ti at 4K. As the resolution decreases, the Fury X’s weak point, its polygon and triangle throughput, reveals itself as a bottleneck. The disparity widens measurably, nearly 2x in some instances, as AMD competes at 1440p and 1080p resolutions.

Conclusion

One advantage of AMD’s graphics cards has often been their lower price compared to Nvidia’s. Unfortunately, this time the Fury X can’t undercut the GTX 980 Ti: 700 euros vs. 670, at a time when AMD’s solution was positioned as a flagship video card. Given the 980 Ti’s higher performance, lower noise, and lower power consumption, price is almost a secondary consideration. We do, however, give AMD credit for the pioneering work behind High Bandwidth Memory. On the Fury X it brings no significant added value over GDDR5 video memory, even though bandwidth increased dramatically, but within the next few years HBM is likely to prevail over older memory systems as a new standard, free of GDDR5’s limitations. Until the Radeon R9 Fury X falls in price, you should seriously consider the GTX 980 Ti as the better performer.

GIGABYTE releases AMD Radeon™ R9 Fury X and new G1 GAMING graphics cards powered by AMD 300 series GPUs

Taipei, Taiwan, July 1, 2015 — GIGABYTE, the world’s leading gaming hardware brand, is expanding its AMD product portfolio with a total of seven new models, which include the high-end Radeon™ R9 Fury X graphics card and a complete line of custom-designed R9/R7 300 series graphics cards, targeting various segments with revolutionary technology and unique gaming features.

Powered by the most advanced Fiji GPU, the Radeon R9 Fury X is a compactly designed graphics card equipped with 4GB of innovative High Bandwidth Memory (HBM), delivering incredible performance for future 4K gaming and virtual reality applications. The Fury X is water-cooled with a closed-loop liquid cooling system, providing extremely cool, quiet, and reliable performance for gaming enthusiasts.
In addition to the Fury X, GIGABYTE released three G1 GAMING models based on the AMD R9 390X, R9 390, and R9 380, namely the GV-R939XG1 GAMING-8GD, GV-R939G1 GAMING-8GD, and GV-R938G1 GAMING-4GD. Forged with only top-notch GPU cores selected through GIGABYTE’s own GPU Gauntlet sorting technology, the G1 GAMING series graphics cards provide excellent power switching and thermal efficiency. Built on the renowned WINDFORCE 2X cooling system, G1 GAMING cards provide both cool and quiet gaming performance right off the bat. Fan blades with a special 3D stripe curve design effectively enhance airflow by 23%, reducing air turbulence for more efficient heat dissipation. The cards also feature quiet semi-passive cooling: the fans stay off when the GPU is at low temperatures, offering fantastic acoustic performance during light gaming. A pair of LED fan indicators on the side of the card provides an instant fan status display.
GIGABYTE introduces three more strong performers to complete the product line: the GV-R938WF2OC-2GD and GV-R737WF2OC-2GD, equipped with dual PWM fans and copper heat pipes for efficient cooling and solid gaming performance, and the compact GV-R736OC-2GD, whose single 90mm unique-blade fan makes it ideal for small form-factor builds. The card is an excellent solution for playing some of the most popular online games.
OC Guru II lifts overclocking limitations

GIGABYTE OC Guru II allows gamers to adjust the maximum core voltage and monitor GPU temperature and power consumption. These key features deliver extreme performance so gamers can enjoy an amazing gaming experience.

AMD Eyefinity Technology

AMD Eyefinity Technology is here to deliver the unfair advantage you deserve. With new modes and features like gaming across five displays, AMD HD3D technology and universal bezel compensation, the world’s best multi-display gaming technology just got even better.

HBM ─ High Bandwidth Memory

HBM delivers more than three times the bandwidth per watt of GDDR5 memory with smaller, faster, and more power-efficient on-chip memory, boosting gaming performance for high frame rates and a smooth gaming experience.

For more information about GIGABYTE R9 Fury X and R9/R7 300 series, please visit the official website.
Facebook: https://www.facebook.com/GIGABYTE.VGA


ATI Rage

ATI Rage is a series of graphics chipsets offering 2D graphics acceleration, video acceleration and 3D acceleration. It is the successor to the Mach series.

Contents

  • 1 3D RAGE (I)
  • 2 3D RAGE II (II+, II+ DVD, IIc)
  • 3 3D Rage Pro
  • 4 Rage LT and Rage LT Pro
  • 5 RAGE XL
  • 6 RAGE 128

    • 6.1 Rage 128 Pro
    • 6.2 Alternate frame rendering
  • 7 Rage 6
  • 8 Mobility

3D RAGE (I)

The first generation of the Rage architecture, the original 3D RAGE chip (aka Mach64 GT), was based on the 2D Mach64 core with new 3D features and MPEG-1 acceleration.

3D RAGE was used in the ATI 3D Xpression video card.

3D RAGE II (II+, II+ DVD, IIc)

The second-generation Rage (also known as the Mach64 GT-B) offered twice the 3D performance.

Its GPU once again relied on a redesigned Mach64 graphics engine that delivered maximum 2D performance with high-speed EDO or SGRAM memory.

The 3D Rage II chip was an improved, compatible version of the 3D Rage accelerator. This second-generation, PCI-compliant chip improved 2D performance by 20% and added support for MPEG-2 (DVD) playback.

The chip was also compatible with Microsoft Direct3D and Reality Lab, QuickDraw 3D Rave, Criterion RenderWare and Argonaut BRender.

OpenGL drivers were available for the 3D and CAD professional community, and Heidi drivers were available for AutoCAD users.

Drivers were also included with operating systems including Windows 95, Windows NT, Mac OS, OS/2, and Linux.

ATI also supplied a companion TV encoder chip for the RAGE II, the ImpacTV.

The RAGE II was integrated into several Macintosh models, including the first (Beige) version of the Power Macintosh G3 and the Power Macintosh 6500. In IBM-compatible PCs, several motherboards and graphics cards used the chipset, including the 3D Xpression+, the 3D Pro Turbo, and the original All-in-Wonder.

3D Rage IIc was the last version of the Rage II core and offered AGP support. It was integrated into one Macintosh model, the original iMac G3/233.

  • Rage II+ DVD specifications:
    • 60 MHz core frequency
    • Up to 83 MHz SGRAM
    • Memory bandwidth 480 MB/s
    • DirectX 5.0

3D Rage Pro

The third generation of the Rage architecture appeared in the summer of 1997.

ATI made a number of changes to the 3D RAGE II: a new triangle setup engine, improved perspective correction, transparency and fog support, specular lighting support, and improved DVD support. The 3D Rage Pro chip was designed for the Intel Accelerated Graphics Port (AGP), taking advantage of execute-mode texturing, command pipelining, sideband addressing, and dual AGP protocols. Initial versions were based on standard graphics memory configurations: up to 8 MB of SGRAM or 16 MB of WRAM, depending on the model.

The RAGE Pro offered performance close to 3dfx’s Voodoo and Nvidia’s RIVA 128 lines of accelerators, but generally failed to match or outperform its competitors.

This, in addition to the (early) lack of OpenGL support, had a negative impact on sales. Later, ATI introduced a 2x AGP version of the Rage Pro to the OEM market and attempted to reinvent the Rage Pro for the retail market, renaming the chip Rage Pro Turbo and releasing a new Rage Pro Turbo driver (4.10.2312) that supposedly improved performance by 40%. In fact, early versions of the new driver only provided improved performance in test benches such as Ziff-Davis’ 3D Winbench 98 and Final Reality.

Actual game performance was barely improved. Despite the poor implementation, the Rage Pro Turbo name stuck, and eventually ATI was able to release updated drivers that showed noticeable performance improvements in games, but they weren’t enough to keep gaming enthusiasts interested.

The 3D Rage Pro was primarily sold in the retail market as the Xpert@Work or Xpert@Play, the only difference being that the Xpert@Play version had a TV-out port. It was also the graphics chipset built into Sun’s Ultra 5/10 workstations, the company’s first computers built around basic PC hardware, and the integrated graphics chipset of the second version of the (Beige) Power Macintosh G3.

3D Rage Pro General Features:

  • 75 MHz core
  • 100 MHz SGRAM/WRAM in 4, 8, and 16 MB configurations
  • Memory bandwidth 800 MB/s
  • DirectX 6.0

Rage LT and Rage LT Pro

Rage LT (also known as Mach64 LT) was often used on motherboards and in mobile applications such as laptops.

This late-1996 chip was very similar to the Rage II and supported the same application coding. It added an LVDS (Low-Voltage Differential Signaling) transmitter for laptop LCD screens and advanced power management (block power control). The RAGE LT PRO, based on the 3D RAGE PRO, was the very first mobile GPU to use AGP. It offered filtered ratiometric expansion, which automatically scaled images to full screen size. ATI’s ImpacTV2+ was integrated with the RAGE LT PRO chip to support multi-screen viewing, i.e. simultaneous TV, CRT, and LCD output. In addition, the RAGE LT PRO could drive two displays with different images and/or refresh rates through the use of dual independent CRT controllers. The Rage LT Pro was often used in desktop graphics cards that featured a VESA Digital Flat Panel port for digital connection to some desktop LCD monitors.

RAGE XL

Rage XL was an inexpensive version of the RAGE Pro. As a low-power solution with 2D acceleration capability, this chip was used in many low-end graphics cards. It was also seen on pre-2004 Intel motherboards and was still used in 2006 on server motherboards. The Rage XL has since been replaced by the ATI ES1000 for server use.

The chip was a die-shrunk Rage Pro, optimized for very low-cost implementations that required only basic graphics output.

RAGE 128

In the ongoing struggle to create the fastest and most advanced 3D accelerator, ATI introduced the RAGE 128. The chip was announced in two versions: the RAGE 128 GL and the RAGE 128 VR.

Aside from the lower price of the VR chip, the main difference was that the GL was fully 128-bit, while the VR, though still 128-bit internally, used a 64-bit external memory interface.

  • Magnum — an OEM workstation card with 32 MB SDRAM.
  • Rage Fury — 32 MB SDRAM and the same performance as the Magnum; this add-in card was designed for PC gamers.
  • Xpert 128 — 16 MB SDRAM and, like the others, a RAGE 128 GL chip.
  • Rage Orion — a RAGE 128 GL design made specially for Mac OS with 16 MB SDRAM and OpenGL and QuickDraw 3D/RAVE support, essentially an Xpert 128 for the Mac market. This card supported more video resolutions than later RAGE 128 designs and was aimed at Macintosh gamers.
  • Nexus 128 — also a Mac-specific design of the RAGE 128 GL, but with 32 MB RAM, just like the Rage Fury. This card was intended for graphic design professionals.
  • Xclaim VR 128 — also a Mac-specific RAGE 128 GL design with 16 MB SDRAM, but including video capture, video output, TV tuner support, and QuickTime video acceleration.
  • Xpert 2000 — a RAGE 128 VR design using a 64-bit memory interface.

The Rage 128 was compatible with Direct3D 6 and OpenGL 1.2. It supported many features of previous RAGE chips, such as triangle setup, DVD acceleration, and a VGA/GUI accelerator core, and added Inverse Discrete Cosine Transform (IDCT) acceleration to the DVD feature set. It was ATI’s first dual-texturing renderer, capable of outputting two pixels per clock (two pixel pipelines). The processor was known for its high performance in 32-bit color mode and its comparatively weak 16-bit mode; oddly enough, the RAGE 128 was not much faster in 16-bit color despite the lower bandwidth requirement. In 32-bit mode, the RAGE 128 more than matched the RIVA TNT, and the Voodoo 3 didn’t support 32-bit color at all. The chip was positioned to compete with the Nvidia RIVA TNT, Matrox G200 and G400, and 3dfx Voodoo 3.

ATI implemented a caching technique called Twin Cache Architecture with the Rage 128, using an 8 KB buffer to store the texels used by the 3D engine. To further improve performance, ATI’s engineers also included an 8 KB pixel cache used when writing pixels back to the framebuffer.

  • 8 million transistors, 0.25 micron manufacturing
  • 3D feature set
    • Hardware support for vertex fog and table fog
    • Alpha blending, vertex and Z-based fog, video textures, texture lighting
    • Bilinear and trilinear texture filtering and texture composition in a single clock
    • Perspective-correct mipmapped texturing with color key support
    • Z-based vertices and reflections, shadows, spotlights
    • Hidden-surface removal using a 16-, 24-, or 32-bit Z-buffer
    • Gouraud-shaded and specular-lit filled polygons
    • Line and edge anti-aliasing, bump mapping, 8-bit stencil buffer
  • 250 MHz RAMDAC, AGP 2×

Rage 128 Pro

ATI later developed a successor to the original Rage 128 called the Rage 128 Pro. This chip brought several improvements, including an improved triangle setup engine that doubled geometry throughput to eight million triangles/s, improved texture filtering, DirectX 6.0 texture compression, 4x AGP, DVI support, and a Rage Theater chip for composite and S-Video TV input. The chip was used on the gamer-oriented Rage Fury Pro and business-oriented Xpert 2000 Pro boards. The Rage 128 Pro was broadly equal to the Voodoo3 2000, RIVA TNT2, and Matrox G400, but its relatively low clock speed (often 125 MHz) left it unable to compete with the high-end Voodoo3 3500, TNT2 Ultra, and G400 MAX.

Alternate Frame Rendering

The Rage Fury MAXX featured two Rage 128 Pro chips in an Alternate Frame Rendering (AFR) configuration that nearly doubled performance. As the name suggests, AFR renders each frame on an independent GPU. The card was designed to compete with the Nvidia GeForce 256 and later the 3dfx Voodoo 5. Although it managed to roughly match 32 MB SDR GeForce 256 cards, the GeForce 256 with DDR memory still came out comfortably ahead. At the time, few games supported hardware transform, clipping, and lighting (T&L), but the MAXX’s lack of T&L would put it at a disadvantage once those games became more mainstream.
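The frame-dispatch idea behind AFR is simple enough to sketch; the snippet below is a hypothetical illustration of the round-robin assignment, not ATI’s driver code:

```python
def assign_gpu(frame_index: int, gpu_count: int = 2) -> int:
    """Alternate Frame Rendering: each whole frame goes to one GPU, round-robin,
    so consecutive frames are rendered in parallel on different chips."""
    return frame_index % gpu_count

# Frames 0, 2, 4, ... land on GPU 0; frames 1, 3, 5, ... on GPU 1.
for frame in range(6):
    print(f"frame {frame} -> GPU {assign_gpu(frame)}")
```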

ATI later discovered that Windows NT 5.x (Windows 2000, XP) did not support dual AGP GPUs as implemented by ATI. NT placed both chips on the AGP bus and switched between them, so the card could only run as a single Rage 128 Pro, with the performance of a Rage Fury card. The optimal operating systems for the Rage Fury MAXX were Windows 98/ME; Windows 95 and Mac OS were not supported.

Rage 6

The Rage 128 Pro graphics accelerator was the last iteration of the Rage architecture and the last use of the Rage brand. While the next chip was originally called Rage 6, ATI decided to rename it Radeon.