
Nvidia’s GeForce 9800 GX2 graphics card

I said just last week that GPUs are breeding like rabbits, and here we have another example of multi-GPU multiplication. The brand-spanking-new GeForce 9800 GX2 combines a pair of G92 graphics processors onto one card for twice the goodness, like an incredibly geeky version of an old Doublemint gum commercial.

Cramming a pair of GPUs into a single graphics card has a long and familiar pedigree, but the most recent example of such madness is AMD’s Radeon HD 3870 X2, which stole away Nvidia’s single-card performance crown by harnessing a duo of mid-range Radeon GPUs. The folks on the green team tend to take the heavyweight performance crown rather seriously, and the GeForce 9800 GX2 looks primed to recapture the title. We already know the G92 GPU is faster than any single graphics processor AMD has to offer. What happens when you double up on them via SLI-on-a-card? Let’s have a look.

Please welcome the deuce

Dressed all in black, the GeForce 9800 GX2 looks like it means business. That’s probably because it does. This beast packs two full-on G92 graphics processors, each running at 600MHz with a 1500MHz shader clock governing its 128 stream processors. Each GPU has its own 512MB pool of GDDR3 memory running at 1GHz (with a 2GHz effective data rate) on a 256-bit bus. For those of you in Rio Linda, that adds up to 1GB of total graphics memory and a whole lotta bandwidth. However, as in any SLI setup, memory isn’t shared between the two GPUs, so the effective memory size of the graphics subsystem is 512MB.
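The arithmetic behind those numbers is easy to check. Here's a minimal sketch, assuming Nvidia's usual convention of counting three flops per stream processor per clock (a dual-issue MAD plus MUL):

```python
# Peak figures for the 9800 GX2 derived from the clocks quoted above.
# Assumes Nvidia's counting of 3 flops per stream processor per clock
# (MAD + MUL), as used in its own peak-GFLOPS numbers.

def gddr3_bandwidth_gbps(effective_clock_ghz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s for one GPU's memory pool."""
    return effective_clock_ghz * bus_width_bits / 8

def shader_gflops(num_sps: int, shader_clock_ghz: float, flops_per_clock: int = 3) -> float:
    """Peak shader arithmetic in GFLOPS for one GPU."""
    return num_sps * shader_clock_ghz * flops_per_clock

per_gpu_bw = gddr3_bandwidth_gbps(2.0, 256)  # 2GHz effective rate, 256-bit bus
per_gpu_flops = shader_gflops(128, 1.5)      # 128 SPs at 1500MHz

print(per_gpu_bw * 2)     # 128.0 GB/s across both GPUs
print(per_gpu_flops * 2)  # 1152.0 GFLOPS across both GPUs
```

Remember, though, that since memory isn't shared between the GPUs, these totals apply only when SLI scaling cooperates.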

The G92 GPU may be familiar to you as the engine behind the incredibly popular GeForce 8800 GT, and you may therefore be tempted to size up the GX2 as the equivalent of two 8800 GT cards in SLI. But that would be selling the GX2 short, since one of the G92 chip's stream processor (SP) clusters is disabled on the 8800 GT, reducing its shader and texture filtering power. Instead, the GX2 is closer to a pair of GeForce 8800 GTS 512 cards with their GPUs clocked slightly slower.

Translation: this thing should be bitchin’ fast.

XFX’s rendition of the GeForce 9800 GX2

The Johnny-Cash-meets-Darth-Vader color scheme certainly works well for it. (And before you even ask, let’s settle this right away: head to head, Cash would defeat Vader, ten times out of ten. Thanks for playing.) Such color schemes tend to go well with quirky personalities, and the GX2 isn’t without its own quirks.

Among them, as you can see, is the fact that its two DVI ports have opposite orientations, which may lead to some confusion as you fumble around behind a PC trying to connect your monitor(s). Not only that, but only port number 1 is capable of displaying pre-boot output like BIOS menus, DOS utilities, or the like. Nvidia calls this port “bootable.” The second port will drive a display only once you have video drivers installed and are booted into a proper OS.

To the left of the DVI and HDMI ports in the picture above are a pair of LED indicators to further confuse and astound you. The lower blinkenlight turns green to indicate that all of the necessary power leads are connected to the GX2, while the upper one lights up blue to indicate which of the two GX2 cards in (ahem) a quad SLI setup owns the primary display port.

As you can see in the picture above, a black plastic shroud envelops the entire GX2, as if it were a Steve Jobs-style black turtleneck. The GX2’s full-coverage shroud furthers its image as a self-contained graphics powerhouse—and conceals its true, dual nature, as we’ll soon find out.

A few holes in the shroud do expose key connectors, though. This card requires both a six-pin aux PCIe power connector and an eight-pin one. Take note: plugging a six-pin connector into that eight-pin port isn’t sufficient, as it is for some Radeon cards. The GX2 requires a true eight-pin power lead. Unfortunately, space around this eight-pin plug is tight. Getting our PSU’s connector into the port took a little extra effort, and extracting it again took lots of extra effort. Nvidia claims the problem is that some PSUs don’t comply with the PCIe spec, but that’s little comfort. Cutting a slightly larger hole in the shroud would have prevented quite a few headaches.

Extra exposure below the shroud, though, doesn’t seem to be part of the program. For instance, just to the left of the six-pin power plug is an audio S/PDIF input, needed to feed audio to the GX2’s HDMI output port. This port was concealed by a rubber stopper on this XFX card out of the box.

The GX2’s SLI connector lurks under a plastic cover, as well, semi-ominously suggesting the potential for quad SLI. Those who remember the disappointing performance scaling of Nvidia’s previous quad SLI solution, based on the GeForce 7950 GX2, will be relieved to hear that the 9800 GX2 should be free from the three-buffer limit that the combination of Windows XP and DirectX 9 imposed back then. Nvidia says it intends to deliver four-way alternate frame rendering (AFR), a la AMD’s CrossFire X. That should allow for superior performance scaling with four GPUs, provided nothing else gets in the way.
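For those unfamiliar with AFR, the scheme parallelizes by handing out whole frames to GPUs round-robin rather than splitting up each frame. A toy illustration of the frame-to-GPU mapping (not Nvidia's actual scheduler, just the concept):

```python
# Toy illustration of alternate frame rendering (AFR): successive frames
# are assigned to successive GPUs round-robin, so with four GPUs each one
# gets four frame intervals to finish its frame.

def afr_assignments(num_frames: int, num_gpus: int) -> list[int]:
    """Return the GPU index that renders each frame under AFR."""
    return [frame % num_gpus for frame in range(num_frames)]

print(afr_assignments(8, 4))  # [0, 1, 2, 3, 0, 1, 2, 3]
```

The old three-buffer limit under Windows XP and DirectX 9 effectively capped how many frames could be in flight at once, which is why four-way AFR scaled so poorly on the 7950 GX2.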

Price, certainly, should be no object for the typical quad SLI buyer. XFX has priced its 9800 GX2 at $599 on Newegg, complete with a copy of Company of Heroes. That puts the GX2 nearly a hundred bucks above the price of a couple of GeForce 8800 GTS 512MB cards. In this case, you’re paying a premium for the GX2’s obvious virtues, including its single PCIe x16 connection, dual-slot profile, quad SLI potential, and the happy possibility of getting SLI-class performance on a non-Nvidia chipset.

Looking beneath Vader’s helmet

After several minutes of unscrewing, prying, and coaxing, I was able to remove the 9800 GX2’s metal shroud. Beneath it lies this contraption:

Like the 7950 GX2 before it, the 9800 GX2 is based on a dual-PCB design in which each board plays host to a GPU and its associated memory. Unlike the 7950 GX2, this new model has a single, beefy cooler sandwiched between the PCBs. This cooler directs hot air away from the card in two directions: out of the lower part of the expansion slot backplate and upwards, out of the top of the shroud.

Beneath the card, a ribbon cable provides a communications interconnect between the two PCBs.

And at the other end of the card, you can see the partially exposed blades of the blower.

Sadly, that’s where my disassembly of the GX2 stopped—at least for now. Too darned many screws, and that ribbon cable looks fragile, so I chickened out. Presumably, on the PCB that houses the “primary” GPU, you’ll also find a PCI Express switch chip similar to the nForce 200 chip that Nvidia used as glue in the nForce 780i chipset.

Before we move on to see how this puppy performs, I should mention one other trick it has up its sleeve: when paired with the right Nvidia chipset with integrated graphics, the GX2 is capable of participating in a HybridPower setup. The basic idea here is to save power when you’re not gaming by handing things over to the motherboard’s integrated GPU and powering down the GX2 entirely. Then, when it’s game time, you light the fire under the big dawg again for max performance. Unfortunately, we’ve not yet been able to test HybridPower, but we’ll keep an eye on it and try to give it a spin shortly.

Another thing we’ve not yet tested is quad SLI with dual GX2s. We can show you how a single GX2 performs today; we’ll have to follow up with the scary-fast quad stuff later.

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and the results were averaged.

Our test systems were configured like so:

Processor: Core 2 Extreme X6800 2.93GHz (both systems)
System bus: 1066MHz (266MHz quad-pumped)
Motherboard: Gigabyte GA-X38-DQ6 (BIOS F7) / XFX nForce 680i SLI (BIOS P31)
North bridge: X38 MCH / nForce 680i SLI SPP
South bridge: ICH9R / nForce 680i SLI MCP
Chipset drivers: INF update 8.3.1.1009 with Matrix Storage Manager 7.8 (X38) / ForceWare 15.08 (680i)
Memory size: 4GB (4 DIMMs)
Memory type: 2 x Corsair TWIN2X20488500C5D DDR2 SDRAM at 800MHz
Memory timings: CAS latency (CL) 4, RAS to CAS delay (tRCD) 4, RAS precharge (tRP) 4, cycle time (tRAS) 18, 2T command rate
Audio: integrated ICH9R/ALC889A (X38) / nForce 680i SLI/ALC850 (680i), both with RealTek 6.0.1.5497 drivers
Hard drive: WD Caviar SE16 320GB SATA
OS: Windows Vista Ultimate x86 Edition
OS updates: KB936710, KB938194, KB938979, KB940105, KB945149, DirectX November 2007 Update

Graphics cards tested on the X38 system:

  • Diamond Radeon HD 3850 512MB PCIe with Catalyst 8.2 drivers
  • Dual Radeon HD 3850 512MB PCIe with Catalyst 8.2 drivers
  • Radeon HD 3870 512MB PCIe with Catalyst 8.2 drivers
  • Dual Radeon HD 3870 512MB PCIe with Catalyst 8.2 drivers
  • Radeon HD 3870 X2 1GB PCIe with Catalyst 8.2 drivers
  • Dual Radeon HD 3870 X2 1GB PCIe with Catalyst 8.3 drivers
  • Radeon HD 3870 X2 1GB PCIe + Radeon HD 3870 512MB PCIe with Catalyst 8.3 drivers
  • Palit GeForce 9600 GT 512MB PCIe with ForceWare 174.12 drivers
  • GeForce 8800 GT 512MB PCIe with ForceWare 169.28 drivers
  • EVGA GeForce 8800 GTS 512MB PCIe with ForceWare 169.28 drivers
  • GeForce 8800 Ultra 768MB PCIe with ForceWare 169.28 drivers
  • GeForce 9800 GX2 1GB PCIe with ForceWare 174.53 drivers

Graphics cards tested on the nForce 680i SLI system:

  • Dual GeForce 8800 GT 512MB PCIe with ForceWare 169.28 drivers
  • Dual Palit GeForce 9600 GT 512MB PCIe with ForceWare 174.12 drivers

Please note that we tested the single and dual-GPU Radeon configs with the Catalyst 8.2 drivers, simply because we didn’t have enough time to re-test everything with Cat 8.3. The one exception is Crysis, where we tested single- and dual-GPU Radeons with AMD’s 8.451-2-080123a drivers, which include many of the same application-specific tweaks that the final Catalyst 8.3 drivers do.

Thanks to Corsair for providing us with memory for our testing. Their quality, service, and support easily outclass what you’d get with no-name DIMMs.

Our test systems were powered by PC Power & Cooling Silencer 750W power supply units. The Silencer 750W was a runaway Editor’s Choice winner in our epic 11-way power supply roundup, so it seemed like a fitting choice for our test rigs. Thanks to OCZ for providing these units for our use in testing.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

  • Call of Duty 4: Modern Warfare 1.4
  • Crysis 1.1
  • Half-Life 2 Episode Two
  • Unreal Tournament 3 1.1
  • 3DMark06 1.1.0
  • FRAPS 2.9.4

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

G x 2 = Yow!

To give you a better sense of the kind of wallop this one graphics card packs, have a look at the theoretical numbers below. On paper, at least, the GX2 is staggering.

                        Peak pixel    Peak bilinear      Peak bilinear     Peak memory   Peak shader
                        fill rate     texel filtering    FP16 texel        bandwidth     arithmetic
                        (Gpixels/s)   rate (Gtexels/s)   filtering rate    (GB/s)        (GFLOPS)
                                                         (Gtexels/s)
GeForce 9600 GT         10.4          20.8               10.4              57.6          312
GeForce 8800 GT         9.6           33.6               16.8              57.6          504
GeForce 8800 GTS 512    10.4          41.6               20.8              62.1          624
GeForce 8800 GTX        13.8          18.4               18.4              86.4          518
GeForce 8800 Ultra      14.7          19.6               19.6              103.7         576
GeForce 9800 GX2        19.2          76.8               38.4              128.0         1152
Radeon HD 2900 XT       11.9          11.9               11.9              105.6         475
Radeon HD 3850          10.7          10.7               10.7              53.1          429
Radeon HD 3870          12.4          12.4               12.4              72.0          496
Radeon HD 3870 X2       26.4          26.4               26.4              115.2         1056

The GX2 outclasses Nvidia’s previous top card, the GeForce 8800 Ultra, in every category. More importantly, perhaps, it matches up well against the Radeon HD 3870 X2—ostensibly its closest competitor, although the 3870 X2 is now selling for as low as $419. The two cards are fairly evenly matched in terms of pixel fill rate, memory bandwidth, and peak shader power, but look closely at the 9800 GX2’s advantage over the 3870 X2 in terms of texture filtering capacity—it leads 76.8 to 26.4 Gtexels/s. The gap closes with FP16 texture formats, where the GX2’s filtering capacity is chopped in half, but it’s still considerable.
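Where that filtering advantage comes from can be sketched from the unit counts. The figures below assume 64 bilinear filtering units per G92 (halved for FP16) and 16 per RV670, with the 3870 X2 running its 825MHz core clock:

```python
# Peak bilinear filtering rates, sketched from assumed per-GPU unit counts:
# 64 filter units per G92 at 600MHz, 16 per RV670 at 825MHz (3870 X2).

def texel_rate_gtexels(gpus: int, filter_units: int, core_clock_ghz: float) -> float:
    """Peak bilinear texel filtering rate in Gtexels/s."""
    return gpus * filter_units * core_clock_ghz

gx2_int8 = texel_rate_gtexels(2, 64, 0.600)   # ~76.8
gx2_fp16 = gx2_int8 / 2                       # ~38.4 (FP16 filters at half rate on G92)
x2_rate = texel_rate_gtexels(2, 16, 0.825)    # ~26.4 (RV670 loses nothing for FP16)

print(gx2_int8, gx2_fp16, x2_rate)
```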

We can, of course, measure some of these things with some simple synthetic benchmarks. Here’s how the cards compare.

The 9800 GX2 trails the 3870 X2 in terms of pixel fill rate, but it makes up for it with a vengeance by more than doubling the X2’s multitextured fill rate, more or less as expected. In fact, in this test, the GX2 shows more texture filtering capacity than three GeForce 8800 Ultras or four Radeon HD 3870s.

Somewhat surprisingly, the Radeon HD 3870 X2 takes three of the four 3DMark shader tests from the GX2. But will that matter in real games?

Call of Duty 4: Modern Warfare

We tested Call of Duty 4 by recording a custom demo of a multiplayer gaming session and playing it back using the game’s timedemo capability. Since these are high-end graphics configs we’re testing, we enabled 4X antialiasing and 16X anisotropic filtering and turned up the game’s texture and image quality settings to their limits.

We’ve chosen to test at 1680×1050, 1920×1200, and 2560×1600 (roughly 1.8, 2.3, and 4.1 megapixels) to see how performance scales. I’ve also tested at 1280×1024 with the lower-end graphics cards, since some of them struggled to deliver completely fluid frame rates at 1680×1050.

Here’s our first look at the GX2’s true performance, and it’s a revelation. This “single” graphics card utterly outclasses the GeForce 8800 Ultra and Radeon HD 3870 X2, outrunning three Radeon HD 3870 GPUs in a CrossFire X team and nearly matching a pair of 3870 X2 cards with four GPUs.

The only fly in the ointment is a consistent problem we’ve seen in this game with SLI configs using 512MB cards; their performance drops quite a bit at 2560×1600. The GX2 looks to be affected, although not as badly as the GeForce 8800 GT SLI and 9600 GT SLI setups we tested. And, heck, it’s still pumping out 60 frames per second at that resolution.

Enemy Territory: Quake Wars

We tested this game with 4X antialiasing and 16X anisotropic filtering enabled, along with “high” settings for all of the game’s quality options except “Shader level” which was set to “Ultra.” We left the diffuse, bump, and specular texture quality settings at their default levels, though. Shadows, soft particles, and smooth foliage were enabled. Again, we used a custom timedemo recorded for use in this review.

I’ve excluded the three- and four-way CrossFire X configs here since they don’t support OpenGL-based games like this one.

The GX2 slices through Quake Wars with ease, again easily outperforming the Radeon HD 3870 X2 and the GeForce 8800 Ultra. The only place where that might really matter in this game is at 2560×1600 resolution. At lower resolutions, the X2 and Ultra are plenty fast, as well.

Half-Life 2: Episode Two

We used a custom-recorded timedemo for this game, as well. We tested Episode Two with the in-game image quality options cranked, with 4X AA and 16X anisotropic filtering. HDR lighting and motion blur were both enabled.

Here, the GX2 just trails not one but two Radeon HD 3870 X2s paired (or really quadded) up via CrossFire X. No other “single” card comes close. Since the GX2 averages 67 FPS at 2560×1600, even the two and three-way SLI configs with a GeForce 8800 Ultra don’t look to be much faster in any meaningful sense.

Crysis

I was a little dubious about the GPU benchmark Crytek supplies with Crysis after our experiences with it when testing three-way SLI. The scripted benchmark does a flyover that covers a lot of ground quickly and appears to stream in lots of data in a short period, possibly making it I/O bound—so I decided to see what I could learn by testing with FRAPS instead. I chose to test in the “Recovery” level, early in the game, using our standard FRAPS testing procedure (five sessions of 60 seconds each). The area where I tested included some forest, a village, a roadside, and some water—a good mix of the game’s usual environments.

Because FRAPS testing is a time-intensive endeavor, I’ve tested the lower-end graphics cards at 1680×1050 and the higher-end cards at 1920×1200, with the GX2 and our CrossFire X configs included in both groups.

The GX2’s performance at 1920×1200 is matched only by two or three GeForce 8800 Ultras. That is, in technical terms, totally sweet. Personally, despite the seemingly low numbers, I’d consider Crysis playable at 1920×1200 and high quality on the GX2. The card is producing a 35 FPS average, but it rarely dips below that average, so it feels smooth enough. And I’m testing in a very intensive section of the game with lots of dense jungle. When you move elsewhere on the same map, frame rates can climb into the mid 40s or better.

In order to better tease out the differences between the high-end solutions, I cranked up Crysis to its “very high” quality settings and turned on 4X antialiasing.

The GX2 stumbles badly, just as the Radeon HD 3870 X2 did. Although the GX2 is a very fast single card, it can’t match two GeForce 8800 Ultras in a number of key respects, including pixel fill rate and memory bandwidth. Perhaps more notably, the GX2 has, effectively, 512MB of memory, while even a single GeForce 8800 Ultra has 768MB. That may (and probably does) explain the Ultra’s higher performance here.

Unreal Tournament 3

We tested UT3 by playing a deathmatch against some bots and recording frame rates during 60-second gameplay sessions using FRAPS. This method has the advantage of duplicating real gameplay, but it comes at the expense of precise repeatability. We believe five sample sessions are sufficient to get reasonably consistent and trustworthy results. In addition to average frame rates, we’ve included the low frame rates, because those tend to reflect the user experience in performance-critical situations. In order to diminish the effect of outliers, we’ve reported the median of the five low frame rates we encountered.
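The number-crunching described above boils down to a few lines. The session figures here are hypothetical stand-ins, not measured results:

```python
# FRAPS post-processing as described: average the five per-session average
# frame rates, and take the median of the five per-session lows so that a
# single outlier can't skew the reported low. Data below is made up.

from statistics import mean, median

# (average FPS, lowest FPS) for each of five 60-second sessions
sessions = [(84.2, 51), (81.7, 48), (86.0, 55), (83.1, 36), (85.4, 52)]

avg_fps = mean(s[0] for s in sessions)
low_fps = median(s[1] for s in sessions)  # the 36-FPS outlier doesn't drag this down

print(round(avg_fps, 1), low_fps)  # 84.1 51
```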

Because UT3 doesn’t natively support multisampled antialiasing, we tested without AA. Instead, we just cranked up the resolution to 2560×1600 and turned up the game’s quality sliders to the max. I also disabled the game’s frame rate cap before testing.

Once more, the GX2 distances itself from the GeForce 8800 Ultra and Radeon HD 3870 X2. One almost forgets this is a “single” graphics card and starts comparing it to two Ultras or two X2s. I’d say that wouldn’t be fair, but heck, the GX2 stands up pretty well even on that basis.

Power consumption

We measured total system power consumption at the wall socket using an Extech power analyzer model 380803. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.

The idle measurements were taken at the Windows Vista desktop with the Aero theme enabled. The cards were tested under load running UT3 at 2560×1600 resolution, using the same settings we did for performance testing.

Note that the SLI configs were, by necessity, tested on a different motherboard, as noted in our testing methods section.

Given its insane performance, the GX2’s power consumption is really quite reasonable. It can’t come close to matching the admirably low idle power consumption of the Radeon HD 3870 X2; even the three-way CrossFire X system draws fewer watts at idle. When running a game, however, the GX2 draws less power than the X2. That adds up to a very nice performance-per-watt profile.

Noise levels

We measured noise levels on our test systems, sitting on an open test bench, using an Extech model 407727 digital sound level meter. The meter was mounted on a tripod approximately 12″ from the test system at a height even with the top of the video card. We used the OSHA-standard weighting and speed for these measurements.

You can think of these noise level measurements much like our system power consumption tests, because the entire systems’ noise levels were measured, including the stock Intel cooler we used to cool the CPU. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

Unfortunately—or, rather, quite fortunately—I wasn’t able to reliably measure noise levels for most of these systems at idle. Our test systems keep getting quieter with the addition of new power supply units and new motherboards with passive cooling and the like, as do the video cards themselves. I decided this time around that our test rigs at idle are too close to the sensitivity floor for our sound level meter, so I only measured noise levels under load.

The GX2’s acoustic profile is a little disappointing in the wake of the miraculously quiet dual-slot coolers on recent Nvidia reference designs. It’s not horribly loud or distracting, but it does make its presence known with the blower’s constant hiss. The card is much quieter at idle, but still audible and probably louder than most.

GPU temperatures

Per your requests, I’ve added GPU temperature readings to our results. I captured these using AMD’s Catalyst Control Center and Nvidia’s nTune Monitor, so we’re basically relying on the cards to report their temperatures properly. In the case of multi-GPU configs, I only got one number out of CCC. I used the highest of the numbers from the Nvidia monitoring app. These temperatures were recorded while running UT3 in a window.

The GX2 runs hot, but not much hotter than most high-end cards.

Conclusions

The GeForce 9800 GX2 is an absolute powerhouse, the fastest graphics card you can buy today. Even Crysis is playable on the GX2 at 1920×1200 with high-quality settings. Although it is part of a clear trend toward high-end multi-GPU solutions, it is also a testament to what having a capable high-end GPU will get you. The GX2 is oftentimes faster than three Radeon HD 3870 GPUs and sometimes faster than four. Nvidia’s superior building block in the G92 GPU makes such feats possible.

At the same time, the SLI-on-a-stick dynamic works in the 9800 GX2’s favor. The GX2 absolutely whups Nvidia’s incumbent product in this price range, the GeForce 8800 Ultra. Ye olde G80 is still a formidable GPU, but it can’t keep up with two G92s. The G92 is based on a smaller fab process, so even with dual GPUs onboard, the GX2’s power consumption is similar to the Ultra’s. On top of that, the G92 GPU offers new features like built-in HD video decode acceleration that the Ultra lacks.

The only disappointment here is the GX2’s relatively noisy cooler, which isn’t terrible, but can’t match the muted acoustics of Nvidia’s other high-end graphics cards. Let’s hope this card is an exception to the rule because of its multi-GPU nature, not part of a trend toward louder coolers.

All of that said, the GX2 is still an SLI-based product, so it has its weaknesses. Multi-GPU schemes require driver updates in order to work optimally with newer games. This is an almost inescapable situation given today’s hardware and software realities. Worse yet, the GX2’s multi-monitor support is as poor as any SLI config’s. The card can accelerate 3D games on only one display, and one must switch manually between multi-GPU mode and multi-display mode via the control panel. This arrangement is light-years behind the seamless multi-monitor support in AMD’s drivers. Nvidia really needs to address this shortcoming if it expects products like the GX2 to become a staple of its lineup.

At $599, the GX2 is pricey, but Nvidia and its partners can ask such a premium for a product with no direct rivals. Obviously, this is a card for folks with very high resolution displays and, most likely, motherboards based on non-Nvidia chipsets. If you are willing to live with an nForce chipset, you could save some cash and get fairly similar performance by grabbing a couple of GeForce 8800 GT cards and running them in SLI. Two of those will run most games at 2560×1600 resolution reasonably well. I’d even suggest looking into a couple of GeForce 9600 GTs as an alternative, based on our test results, if I weren’t concerned about how their relatively weak shader power might play out in future games.

The big remaining question is how well the GX2 realizes its potential in quad-SLI configurations. We’ll find out soon and let you know.

NVIDIA’s GeForce 9800 GX2 — Early Test!


The 9800 GX2 has shown up on our doorstep and Shane thought it was time to take it for a ride through XP and Vista.

Published Mar 14, 2008 11:00 PM CDT   |   Updated Tue, Nov 3 2020 7:04 PM CST



Introduction

The latest in our series of stalker letters is here, and once again it came attached to a new NVIDIA graphics card. If you recall, last time this happened we received the G92-based 8800 GTS anonymously with a letter attached. This time we’ve got another new card and accompanying letter.

Again, it’s an NVIDIA offering, but the message was a little different this time, a little scarier. Someone wants us to have the new NVIDIA cards, but at the same time that person or company doesn’t want us to know who they are!

Really, we’re not complaining and as long as nothing like a horse’s head shows up in my bed then I’m not too concerned.

We’ve all heard about the GX2, so there isn’t much to say here; there’s no package so we’ll get stuck straight into the card and talk about its specifications before we get into the benchmarking.

The Card

With the shroud removed, the card looks much like the back of any normal graphics card. Both sides look almost identical, with only a few things making them slightly different from each other.

The main difference between the two sides is that one carries with it an SLI connector and PCI Express connector while the other has neither.

Towards the back of the card you can see the hole on each side where the fan pulls air in.

If we hover above the card and look down, we can see the design of the heatsink and the fins where the air comes out. The whole design actually looks great, but we worry that the shroud over the top is just going to exacerbate any heat issues.

Across the top we can also see the SLI connector located on one of the cards along with two power connectors. One card has a single 6-pin PCI Express connector while the second card carries with it an 8-pin PCI Express Connector.

The I/O department carries with it two Dual Link DVI connectors, with one being on each PCB along with a single HDMI connector. This one doesn’t carry an optical port like we’ve seen in some of the reference pictures. We figure this is an optional extra that partners can choose to add.

Specifications

As you would expect, the details on GPU-Z are a bit scarce; it was unable to get details on most specifics of the card. What it does tell us though is that we have a 600MHz core clock, 1500MHz shader clock and 2000MHz DDR clock on the memory.

Other specs include 1GB of GDDR3 memory split across the two PCBs, 65nm cores, a 256-bit memory interface per core, 128 stream processors per core, PCI-E 2.0 support, DirectX 10, SM 4.0 and quad SLI support coming at the end of March.

Specification-wise, each GPU sits somewhere between the G92-based 8800 GT and the 8800 GTS.

Test System Setup and 3DMark06

Test System Setup

Processor(s): Intel Core 2 Quad Q6600 @ 3GHz (333MHz x 9)
Cooling: Corsair Nautilus500 (Supplied by Corsair) with Arctic Cooling MX-2 Thermal Compound (Supplied by Arctic Cooling)
Motherboard(s): GIGABYTE X48-DQ6 (Supplied by GIGABYTE)
Memory: 2 X 1GB Kingston PC6400 DDR-2 3-3-3-10 (KHX6400D2ULK2/2G) (Supplied by Kingston)
Hard Disk(s): Seagate 250GB 7200RPM SATA-2 7200.10 (Supplied by Seagate)
Operating System: Windows XP Professional SP2, Windows Vista
Drivers: Catalyst 8.3, Forceware 169.25, Forceware 171.16, GX2 Forceware Beta 173.67

The main thing to compare here today is the performance of the 9800 GX2 against the HD 3870 X2 from AMD. We’ve also thrown in the 8800 GT; since the specifications of a single core of the 9800 GX2 are only slightly above those of an 8800 GT, if a game doesn’t take advantage of SLI, performance should land only just above the single 8800 GT’s.

We’ve tested our standard resolutions purely for consistency, but once we’ve completed it we will sit back and evaluate the scores to see if it’s worth going to the next level. If we get barely playable scores with decent settings at 1920 x 1200, then what’s the point of testing it at 2560 x 1600 when we know it’s not going to be playable at all? — The fact that the GX2 can get an average of 20FPS Vs. a HD 3870 X2 getting an average of 18FPS doesn’t really prove anything.

3DMark06

Version and / or Patch Used: Build 110
Developer Homepage: http://www.futuremark.com
Product Homepage: http://www.futuremark.com/products/3dmark06/
Buy It Here

3DMark06 is the very latest version of the “Gamers Benchmark” from FutureMark. The newest version of 3DMark expands on the tests in 3DMark05 by adding graphical effects using Shader Model 3.0 and HDR (High Dynamic Range lighting) which will push even the best DX9 graphics cards to the extremes.

3DMark06 also focuses not just on the GPU but also the CPU, using the AGEIA PhysX software physics library to effectively test single- and dual-core processors.

In our first test we can see performance from the GX2 is very close to that of the X2. At the higher resolutions you can see the large gains over the G92 based 8800GT.

Benchmarks — PT Boats: Knights of the Sea

PT Boats: Knights of the Sea

Version and / or Patch Used: Benchmark Demo
Developer Homepage: http://en.akella.com/
Product Homepage: http://www.pt-boats.net/

PT Boats: Knights of the Sea is a naval action simulator that places gamers in charge of a mosquito fleet of the Allied Forces, Russia or Germany during the height of World War II.

Using the latest DirectX 10 technology, PT Boats: Knights of the Sea manages to apply a lot of stress to the components of today, which in turn gives us quite an intensive benchmark.

Under PT Boats, while we see the GX2 performing similar to the HD 3870 X2 in the average department, when you look at the minimum FPS score you can see the GX2 has a good lead over the other cards at the higher resolutions.

Benchmarks — CINEBENCH R10

CINEBENCH R10

Version and / or Patch Used: Release 10
Developer Homepage: http://www.maxon.net/
Product Homepage: http://www.maxon.net

CINEBENCH is a real-world test suite that assesses your computer’s performance capabilities. MAXON CINEBENCH is based on MAXON’s award-winning animation software, CINEMA 4D, which is used extensively by studios and production houses worldwide for 3D content creation. MAXON software has been used in blockbuster movies such as Spider-Man, Star Wars, The Chronicles of Narnia and many more.

MAXON CINEBENCH runs several tests on your computer to measure the performance of the main processor and the graphics card under real world circumstances. The benchmark application makes use of up to 16 CPUs or CPU cores and is available for Windows (32-bit and 64-Bit) and Macintosh (PPC and Intel-based).

We can see CINEBENCH shows us nothing spectacular with performance actually being the worst out of all the cards tested.

Benchmarks — Half Life 2 (Episode Two HDR)

Half Life 2 (Episode Two HDR)

Version and / or Patch Used: Latest from Steam
Timedemo or Level Used: Custom Timedemo
Developer Homepage: http://www.valvesoftware.com
Product Homepage: http://www.half-life2.com
Buy It Here

By taking the suspense, challenge and visceral charge of the original, and adding startling new realism, responsiveness and new HDR technology, Half-Life 2: Episode Two opens the door to a world where the player’s presence affects everything around him, from the physical environment to the behaviors and even the emotions of both friends and enemies.

We benchmark Half Life 2: Episode Two with our own custom timedemos to avoid possible driver optimizations, recording with the “record demo_name” command and loading the timedemo with the “timedemo demo_name” command. For a full list of the commands, click here.

While we can see at the higher resolution the GX2 shows big gains over the 8800GT, it still lags behind the HD 3870 X2 across all resolutions.

Benchmarks — World in Conflict

World in Conflict

Version and / or Patch Used: 1.0.0.5
Timedemo or Level Used: Built-in Test
Developer Homepage: http://www.massive.se
Product Homepage: http://www.worldinconflict.com

World in Conflict is a real-time strategy video game by Massive Entertainment, published by Sierra Entertainment for Windows (DX9 and DX10) and the Xbox 360.

The game is set in 1989, when economic troubles cripple the Soviet Union and threaten to dissolve it. The title pursues a “what if” scenario in which the Soviet Union does not collapse and instead pursues a course of war to remain in power. This intensive new game is sure to put plenty of stress on even the latest graphics cards, and we use the built-in benchmark for our testing.

Here we can see the GX2 has some decent gains over the HD 3870 X2. The best gains are seen at the lower resolution, but as we climb up the two cards sit quite close together. We can see that there are some good gains over the 8800GT.

Benchmarks — Enemy Territory: Quake Wars

Enemy Territory: Quake Wars

Version and / or Patch Used: Latest Steam Version
Timedemo or Level Used: Custom time demo
Developer Homepage: http://www.splashdamage.com/
Product Homepage: http://www.enemyterritory.com/

Enemy Territory: Quake Wars is the latest Quake incarnation to make it out of the id labs and carries with it a fast-paced experience that manages to place a good amount of strain on your graphics card.

We use a custom made time demo which shows a bit of everything and manages to give us a good solid benchmark for the graphics cards that we test.

While our OpenGL CINEBENCH benchmark showed nothing in favor of the GX2, Enemy Territory (also based on OpenGL) sees some good gains at the higher resolution.

Benchmarks — Crysis

Crysis

Version and / or Patch Used: 1.1
Timedemo or Level Used: Custom time demo
Developer Homepage: http://www.crytek.com/
Product Homepage: http://www.ea.com/crysis/
Buy It Here

From the makers of Far Cry, Crysis offers FPS fans the best-looking, most highly-evolving gameplay, requiring the player to use adaptive tactics and total customization of weapons and armor to survive in dynamic, hostile environments including Zero-G.

Real time editing, bump mapping, dynamic lights, network system, integrated physics system, shaders, shadows and a dynamic music system are just some of the state of-the-art features the CryENGINE 2 offers. The CryENGINE 2 comes complete with all of its internal tools and also includes the CryENGINE 2 Sandbox world editing system.

NB: Due to the way that Crysis offers which resolutions you can select, some cards may not be tested at 1920 x 1200 and beyond.

Here we can see at the lower resolution we get some decent gains over the HD 3870 X2, but only very small gains over the 8800GT. As we climb up to the higher resolutions the GX2 shows considerable gains over the 8800GT at 1920 x 1200.

Benchmarks — Unreal Tournament 3

Unreal Tournament 3

Version and / or Patch Used: 1.1
Timedemo or Level Used:
Developer Homepage: http://www.epicgames.com/
Product Homepage: http://www.unrealtournament3.com/
Buy It Here

Following the formula that made Unreal Tournament so great, the third installment in the series has hit us recently with better-than-ever graphics. The game uses the latest Unreal Engine which, like most modern games when maxed out, puts the pressure on our lineup of graphics cards.

At the lowest resolution we can see that the GX2 comes out ahead of the X2, but the numbers are so high that it doesn’t really matter. As we start going up it lags behind the X2, but still shows good gains over the single 8800GT.

High Quality AA and AF

Our high quality tests let us separate the men from the boys and the ladies from the girls. If the cards weren’t struggling before they will start to now.

3DMark06

When you turn on AA it’s no surprise that the GX2 sneaks ahead of the X2. We see good gains compared to the 8800GT as well, which isn’t any real surprise as 3DMark06 makes good use of multiple GPUs.

Half Life 2 (Episode Two HDR)

We can see under Episode Two that the AMD offering comes out ahead.

World In Conflict

WIC with AA and AF shows very little difference between the cards, unlike our non-AA/AF tests.

Benchmarks — 3DMark06 — XP

3DMark06

Version and / or Patch Used: Build 110
Developer Homepage: http://www.futuremark.com
Product Homepage: http://www.futuremark.com/products/3dmark06/
Buy It Here

3DMark06 is the very latest version of the “Gamer’s Benchmark” from FutureMark. The newest version of 3DMark expands on the tests in 3DMark05 by adding graphical effects using Shader Model 3.0 and HDR (High Dynamic Range lighting) which will push even the best DX9 graphics cards to the extremes.

3DMark06 also focuses not just on the GPU but also the CPU, using the AGEIA PhysX software physics library to effectively test single and dual-core processors.

When we move to Windows XP we can actually see the GX2 coming out ahead of the X2 at all resolutions.

Benchmarks — CINEBENCH R10 — XP

CINEBENCH R10

Version and / or Patch Used: Release 10
Developer Homepage: http://www.maxon.net/
Product Homepage: http://www.maxon.net

CINEBENCH is a real-world test suite that assesses your computer’s performance capabilities. MAXON CINEBENCH is based on MAXON’s award-winning animation software, CINEMA 4D, which is used extensively by studios and production houses worldwide for 3D content creation. MAXON software has been used in blockbuster movies such as Spider-Man, Star Wars, The Chronicles of Narnia and many more.

MAXON CINEBENCH runs several tests on your computer to measure the performance of the main processor and the graphics card under real world circumstances. The benchmark application makes use of up to 16 CPUs or CPU cores and is available for Windows (32-bit and 64-Bit) and Macintosh (PPC and Intel-based).

We continue to see no gains under CINEBENCH with the GX2 actually performing at the back of the pack.

Benchmarks — World in Conflict — XP

World in Conflict

Version and / or Patch Used: 1.0.0.5
Timedemo or Level Used: Built-in Test
Developer Homepage: http://www.massive.se
Product Homepage: http://www.worldinconflict.com

World in Conflict is a real-time strategy video game by Massive Entertainment, published by Sierra Entertainment for Windows (DX9 and DX10) and the Xbox 360.

The game is set in 1989, when economic troubles cripple the Soviet Union and threaten to dissolve it. The title pursues a “what if” scenario in which the Soviet Union does not collapse and instead pursues a course of war to remain in power. This intensive new game is sure to put plenty of stress on even the latest graphics cards, and we use the built-in benchmark for our testing.

Here we can see performance between the X2 and the GX2 is almost identical. The GX2 gains over the 8800GT can only be seen at the highest resolution.

Benchmarks — Unreal Tournament 3 — XP

Unreal Tournament 3

Version and / or Patch Used: 1.1
Timedemo or Level Used:
Developer Homepage: http://www.epicgames.com/
Product Homepage: http://www.unrealtournament3.com/
Buy It Here

Following the formula that made Unreal Tournament so great, the third installment in the series has hit us recently with better-than-ever graphics. The game uses the latest Unreal Engine which, like most modern games when maxed out, puts the pressure on our lineup of graphics cards.

Unlike under Windows Vista, we see considerable gains over the X2 here. Huge gains over the 8800GT are also seen at the highest resolution. This is the first showing from the GX2 that gives us any real faith in the product.

Benchmarks — Half Life 2 (Episode Two HDR) — XP

Half Life 2 (Episode Two HDR)

Version and / or Patch Used: Latest from Steam
Timedemo or Level Used: Custom Timedemo
Developer Homepage: http://www.valvesoftware.com
Product Homepage: http://www.half-life2.com
Buy It Here

By taking the suspense, challenge and visceral charge of the original, and adding startling new realism, responsiveness and new HDR technology, Half-Life 2 Episode Two opens the door to a world where the player’s presence affects everything around him, from the physical environment to the behaviors even the emotions of both friends and enemies.

We benchmark Half Life 2: Episode Two with our own custom timedemos to avoid possible driver optimizations, recording with the “record demo_name” command and loading the timedemo with the “timedemo demo_name” command. For a full list of the commands, click here.

Half Life 2 under XP also sees much better results. Big gains over the 8800GT at the higher resolutions can be witnessed here, and there’s a decent gap between the X2 and GX2.

Temperature and Sound Tests

Temperature Tests

With the TES 1326 Infrared Thermometer literally in hand we found ourselves getting real-world temperatures from the products we test at load (3D clock speeds).

There are two places we pull temperature from: the back of the card directly behind the core, and, if the card is dual-slot and has an exhaust point, we also pull a temperature from there, as seen in the picture.

The temperature on the card isn’t bad without the shroud. With it on, however, we think it will be a different story.

Sound Tests

Pulling out the TES 1350A Sound Level Meter we find ourselves quickly yelling into the top of it to see how loud we can be.

After five minutes of that we get a bit more serious and place the device 2cm away from the fan on the card to find the maximum noise level of the card when idle (2D mode) and under load (3D mode).

Noise levels are also pretty standard for such a high-end card. The shroud should actually make it a little quieter.

Power Consumption Tests

Using our new PROVA Power Analyzer WM-01, or “Power Thingy” as it has quickly become known to our readers, we are now able to find out what kind of power is being used by our test system and the graphics card installed. Keep in mind, it tests the complete system (minus the LCD monitor, which is plugged directly into the AC wall socket).

There are a few important notes to remember, though; while our maximum power reading is taken in 3DMark06 at the exact same point, we have seen the power draw spike as much as 10% higher in particular tests. Because we test at the exact same stage every time, our results should be very consistent and comparable.

The other thing to remember is that our test system is bare minimum; only a single 7,200RPM SATA-II hard drive is used, without a CD-ROM drive or many cooling fans.

So while the system might draw 400 watts in our test configuration, placed into your own PC with a number of other components the draw is going to be higher.

Power usage at load comes in under the X2, but as you would expect, it’s considerably higher than the single 8800GT. It manages to carry with it the highest idle wattage as well.

Final Thoughts

I have to say that until we got to our Windows XP tests, the 9800 GX2 was a complete letdown. Under UT3 it actually looks like we have a CPU bottleneck, with performance pretty much the same across all resolutions.

It out-performs the HD 3870 X2 in 3DMark06 under XP, and it also out-performs the AMD counterpart in Half Life 2: Episode Two, a game that has always favored the AMD offerings.

Come launch day, I’m not too sure if this is going to be a card worth picking up. While we see gains under XP, not everything sees the same success as UT3 and HL2:E2. There has to be a new driver coming soon; websites have posted a newer driver than the one we received on the driver CD, but it doesn’t actually work. It refuses to install even though the driver is stated to be for the GX2.

The card has potential, and we can see it’s got the power in some applications. It just needs to be harnessed better. If you had been reading the reviews online and thought “Bugger this, I’m off to get a HD 3870 X2”, we would hold off, at least for a week or two.

As soon as word of a new driver comes out, we will be placing the card back on the test bench to hopefully get the gains that we expected.

At the moment the card doesn’t deserve to carry a new-generation naming scheme, but with the little bit of hope we saw under Windows XP, we could be alright. Now we just have to wait and see if we get a new driver in the next day or two.

Last minute add: I just performed a run of Unreal Tournament 3 at 2560 x 1600 and received an average of 90.66FPS which is just below what the HD 3870 X2 scores at 1920 x 1200.


Shawn Baker

Shawn takes care of all of our video card reviews. From 2009, Shawn is also taking care of our memory reviews, and from May 2011, Shawn also takes care of our CPU, chipset and motherboard reviews. As of December 2011, Shawn is based out of Taipei, Taiwan.

GeForce 9800 GX2 [in 1 benchmark]


NVIDIA
GeForce 9800 GX2


  • Interface PCIe 2.0 x16
  • Core clock speed 600MHz
  • Max video memory 512 MB (per GPU)
  • Memory type GDDR3
  • Memory clock speed 1000MHz
  • Maximum resolution

Summary

NVIDIA started GeForce 9800 GX2 sales on 18 March 2008 at a recommended price of $599. This is a Tesla-architecture desktop card built on a 65 nm manufacturing process and aimed primarily at gaming use. Each GPU is supplied with 512 MB of GDDR3 memory clocked at 1 GHz, and together with each GPU’s 256-bit memory interface this creates a bandwidth of 128 GB/s in total (64 GB/s per GPU).

Compatibility-wise, this is a dual-slot card attached via the PCIe 2.0 x16 interface. Its manufacturer default version has a length of 10.5″ (26.7 cm). 6-pin and 8-pin power connectors are required, and power consumption is 197 Watt.

It provides poor gaming and benchmark performance, at 2.71% of the leader’s (the NVIDIA GeForce RTX 3090 Ti).

GeForce 9800 GX2 vs GeForce RTX 3090 Ti

General info


Of GeForce 9800 GX2’s architecture, market segment and release date.

Place in performance rating 753
Value for money 3.38
Architecture Tesla (2006−2010)
GPU code name G92
Market segment Desktop
Release date 18 March 2008 (14 years ago)
Launch price (MSRP) $599
Current price $5.95 (0x MSRP) of $49,999 (A100 SXM4)

Value for money

To get the index we compare the characteristics of video cards and their relative prices.


Technical specs


GeForce 9800 GX2’s general performance parameters such as number of shaders, GPU base clock, manufacturing process, texturing and calculation speed. These parameters indirectly speak of GeForce 9800 GX2’s performance, but for precise assessment you have to consider its benchmark and gaming test results.

Pipelines / CUDA cores 128 per GPU of 18432 (AD102)
CUDA cores 256 total (128 per GPU)
Core clock speed 600 MHz of 2610 (Radeon RX 6500 XT)
Number of transistors 754 million of 14400 (GeForce GTX 1080 SLI Mobile)
Manufacturing process technology 65 nm of 4 (GeForce RTX 4080 Ti)
Thermal design power (TDP) 197 Watt of 900 (Tesla S2050)
Maximum GPU temperature 105 °C
Texture fill rate 76.8 billion/sec of 939.8 (H200 SXM5)
Floating-point performance 2x 384.0 GFLOPS of 16384 (Radeon Pro Duo)

Compatibility, dimensions and requirements


Information on GeForce 9800 GX2’s compatibility with other computer components. Useful when choosing a future computer configuration or upgrading an existing one. For desktop video cards it’s interface and bus (motherboard compatibility), additional power connectors (power supply compatibility).

Interface PCIe 2.0 x16
Length 10.5″ (26.7 cm)
Height 2-slot
Width 2-slot
Supplementary power connectors 6-pin & 8-pin
SLI options +

Memory


Parameters of memory installed on GeForce 9800 GX2: its type, size, bus, clock and resulting bandwidth. Note that GPUs integrated into processors don’t have dedicated memory and use a shared part of system RAM.

Memory type GDDR3
Maximum RAM amount 512 MB of 128 GB (Radeon Instinct MI250X)
Memory bus width 512 Bit (2x 256 Bit) of 8192 (Radeon Instinct MI250X)
Memory clock speed 1000 MHz of 21000 (GeForce RTX 3090 Ti)
Memory bandwidth 128 GB/s (64 GB/s per GPU)

Video outputs and ports


Types and number of video connectors present on GeForce 9800 GX2. As a rule, this section is relevant only for desktop reference video cards, since for notebook ones the availability of certain video outputs depends on the laptop model.

Display Connectors HDMI, Dual Link DVI
Multi monitor support +
HDMI +
Maximum VGA resolution 2048×1536
Audio input for HDMI S/PDIF

API support


APIs supported by GeForce 9800 GX2, sometimes including their particular versions.

DirectX 11.1 (10_0)
Shader Model 4.0
OpenGL 2.1 of 4.6 (GeForce GTX 1080 Mobile)
OpenCL 1.1
Vulkan N/A
CUDA +

Benchmark performance


Non-gaming benchmark performance of GeForce 9800 GX2. Note that overall benchmark performance is measured in points in 0-100 range.


Overall score

This is our combined benchmark performance rating. We are regularly improving our combining algorithms, but if you find some perceived inconsistencies, feel free to speak up in comments section, we usually fix problems quickly.


9800 GX2
2.71

  • Passmark
Passmark

This is probably the most ubiquitous benchmark, part of the Passmark PerformanceTest suite. It gives the graphics card a thorough evaluation under various loads, providing four separate benchmarks for Direct3D versions 9, 10, 11 and 12 (the last being done in 4K resolution if possible), and a few more tests engaging DirectCompute capabilities.

Benchmark coverage: 26%


9800 GX2
797


Game benchmarks


Let’s see how good GeForce 9800 GX2 is for gaming. Particular gaming benchmark results are measured in frames per second. Comparisons with game system requirements are included, but remember that sometimes official requirements may reflect reality inaccurately.

Average FPS
Popular games

Relative performance


Overall GeForce 9800 GX2 performance compared to nearest competitors among desktop video cards.



ATI Radeon HD 3850 X2
102.95


NVIDIA GeForce GT 730
102.95


AMD Radeon R7 M365X
102.58


NVIDIA GeForce 9800 GX2
100


ATI Radeon HD 5670
98.15


AMD Radeon R5 M435
97.79


NVIDIA GeForce GT 440
97.05

AMD equivalent


We believe that the nearest equivalent to GeForce 9800 GX2 from AMD is Radeon HD 5670, which is slower by 2% and lower by 4 positions in our rating.


Radeon HD
5670


Compare


Here are some closest AMD rivals to GeForce 9800 GX2:


AMD Radeon R7 M260DX
102.95


ATI Radeon HD 3850 X2
102.95


AMD Radeon R7 M365X
102.58


NVIDIA GeForce 9800 GX2
100


ATI Radeon HD 5670
98.15


AMD Radeon R5 M435
97.79


ATI Radeon HD 4810
96.31

Similar GPUs

Here is our recommendation of several graphics cards that are more or less close in performance to the one reviewed.


GeForce GT
440


Compare


Radeon HD
3850 X2


Compare


GeForce
9800 GTX


Compare


Radeon HD
4810


Compare


Radeon HD
4830


Compare


Radeon HD
3870 X2


Compare

Recommended processors

These processors are most commonly used with GeForce 9800 GX2 according to our statistics.


Core 2
Quad Q6600

8.3%


Core 2
Duo E8400

5.6%


Core i3
10100F

5.6%


Ryzen 9
3950X

2.8%


Pentium
E5700

2.8%


Core i7
920

2.8%


Core i5
7300HQ

2.8%


Ryzen 7
3700U

2.8%


Core i3
3220

2.8%


Ryzen 7
PRO 2700

2.8%


PX98GX21024D3-NHM | Sparkle Geforce 9800 GX2 1024MB GDDR3

Sparkle GeForce 9800 GX2 1024MB 2x 256-bit GDDR3 PCI Express 2.0 x16 SLI Ready, (Dual Link) Dual DVI, HDMI Video Card. With two on-board GPUs, a GeForce 9800 GX2-based graphics solution is one of the fastest graphics cards available, and when paired with a 7 Series NVIDIA nForce® motherboard, it creates the latest in a line of powerful NVIDIA gaming platforms. Be blown away by scorching frame rates, true-to-life extreme HD gaming, and picture-perfect Blu-ray and HD DVD movies.

Specifications:

Product Type: GeForce 9800GX2 1GB Recertified Pull — Video Card
Model number:
SF-PX98GX21024D3-NHM
Graphics Processing:
NVIDIA GeForce 9800GX2
Processor Cores:
256(128×2)
Core Clock:
600MHz
Memory Clock:
2000MHz
Memory Type:
1024MB(2x512MB) GDDR3
Memory Interface:
2x 256-bit
Processor Clock:
1500 MHz
Bus Type:
PCI-Express 2.0
RAMDAC:
400 MHz
Product Packaging:
OEM (Video Card Only)
Product Condition: Recertified Pull (These Cards Were Pulled Out Of Computer Systems)   

Key Features:

NVIDIA® unified architecture
Fully unified shader core dynamically allocates processing power to geometry, vertex, physics, or pixel shading operations, delivering up to 2x the gaming performance of prior generation GPUs

Full Microsoft® DirectX® 10 support

DirectX 10 GPU with full Shader Model 4.0 support delivers unparalleled levels of graphics realism and film-quality effects

Quad NVIDIA® SLI™ technology
Industry-leading Quad NVIDIA SLI technology offers amazing performance scaling by implementing 4-way AFR (Alternate Frame Rendering), for the world’s fastest gaming solution under Windows Vista with solid, state-of-the-art drivers.
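As a rough illustration of how AFR divides the work (our own sketch, not NVIDIA driver code), each successive frame is simply handed to the next GPU in round-robin order, which is why the theoretical ceiling with four GPUs is 4x the frame rate of one:

```python
# Conceptual sketch of 4-way Alternate Frame Rendering (AFR):
# frames are distributed round-robin across the available GPUs.
NUM_GPUS = 4  # two 9800 GX2 cards in Quad SLI = 4 GPUs

def afr_gpu_for_frame(frame_index: int) -> int:
    """Return which GPU renders a given frame under AFR."""
    return frame_index % NUM_GPUS

# Frames 0..7 land on GPUs 0,1,2,3,0,1,2,3.
print([afr_gpu_for_frame(f) for f in range(8)])
```

In practice, scaling falls short of the ceiling because frames must still be presented in order and inter-frame dependencies force synchronization.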

NVIDIA HybridPower™ Technology

HybridPower technology automatically switches from the GeForce 9800 GX2 graphics card to the motherboard GeForce GPU when running non-graphically-intensive applications, for a silent, low-power PC experience.

PCI Express 2.0 support

Designed for the new PCI Express 2.0 bus architecture, offering the highest data transfer speeds for the most bandwidth-hungry games and 3D applications, while maintaining backwards compatibility with existing PCI Express motherboards for the broadest support.

GigaThread™ Technology
Massively multi-threaded architecture supports thousands of independent, simultaneous threads, providing extreme processing efficiency in advanced, next generation shader programs

NVIDIA® Lumenex™ Engine

Delivers stunning image quality and floating point accuracy at ultra-fast frame rates:
16x Anti-aliasing Technology: Lightning fast, high-quality anti-aliasing at up to 16x sample rates obliterates jagged edges
128-bit floating point High Dynamic-Range (HDR) Lighting: Twice the precision of prior generations for incredibly realistic lighting effects, now with support for anti-aliasing

NVIDIA® Quantum Effects™ Technology
Advanced shader processors architected for physics computation enable a new level of physics effects to be simulated and rendered on the GPU, all while freeing the CPU to run the game engine and AI

NVIDIA® ForceWare® Unified Driver Architecture (UDA)

Delivers a proven record of compatibility, reliability and stability with the widest range of games and applications
ForceWare provides the best out-of-box experience for every user and delivers continuous performance and feature updates over the life of NVIDIA GeForce® GPUs

OpenGL® 2.1 optimizations and support

Ensures top-notch compatibility and performance for OpenGL applications
Dual 400MHz RAMDACs
Blazing-fast RAMDACs support dual QXGA displays with ultra-high, ergonomic refresh rates, up to 2048x1536 @ 85Hz

Dual Dual-Link DVI Support
Able to drive the industry’s largest and highest resolution flat-panel displays up to 2560x1600, with support for High-bandwidth Digital Content Protection (HDCP).

HDMI Output
Integrated HDMI connector enables sending both high-definition video and audio signals to an HDTV via a single cable.

NVIDIA PureVideo HD technology
The combination of high-definition video decode acceleration and post-processing that delivers unprecedented picture clarity, smooth video, accurate color, and precise image scaling for movies and video.

Discrete, Programmable Video Processor

NVIDIA PureVideo is a discrete programmable processing core in NVIDIA GPUs that provides superb picture quality and ultra-smooth movies with 100% offload of H.264 video decoding from the CPU and significantly reduced power consumption.

Hardware Decode Acceleration

Provides ultra-smooth playback of H.264, VC-1, WMV and MPEG-2 HD and SD movies.

Dual Stream Decode Acceleration
Hardware acceleration for HD picture-in-picture enables a complete HD movie playback experience.

Dynamic Contrast Enhancement

Provides post-processing and optimization of high definition movies on a scene-by-scene basis for spectacular picture clarity

HDCP Capable

Designed to meet the output protection management (HDCP) and security specifications of the Blu-ray Disc and HD DVD formats, allowing the playback of encrypted movie content on PCs when connected to HDCP-compliant displays.

Advanced Spatial-Temporal De-Interlacing
Sharpens HD and standard definition interlaced content on progressive displays, delivering a crisp, clear picture that rivals high-end home-theater systems.

High-Quality Scaling
Enlarges lower resolution movies and videos to HDTV resolutions, up to 1080i, while maintaining a clear, clean image. Also provides downscaling of videos, including high-definition, while preserving image detail.

Inverse Telecine (3:2 & 2:2 Pulldown Correction)
Recovers original film images from films-converted-to-video (DVDs, 1080i HD content), providing more accurate movie playback and superior picture quality.
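To illustrate what inverse telecine is undoing (a sketch of ours, not PureVideo’s actual pipeline): 3:2 pulldown stretches 4 film frames into 10 interlaced fields, and the decoder has to detect that cadence to recover the original 24fps frames:

```python
# 3:2 pulldown: 24 fps film -> 60 fields/s interlaced video.
# Alternate film frames are held for 3 fields, then 2 (hence "3:2").
def telecine_32(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        repeat = 3 if i % 2 == 0 else 2
        fields.extend([frame] * repeat)
    return fields

film = ["A", "B", "C", "D"]   # 4 film frames = 1/6 of a second at 24 fps
print(telecine_32(film))      # ['A','A','A','B','B','C','C','C','D','D']
# 4 frames -> 10 fields, i.e. 24 * 10/4 = 60 fields per second.
# Inverse telecine spots this repeating cadence and drops the duplicate
# fields to get back the original progressive film frames.
```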

Bad Edit Correction

When videos are edited after they have been converted from 24 to 25 or 30 frames per second, the edits can disrupt the normal 3:2 or 2:2 pulldown cadence. PureVideo uses advanced processing techniques to detect poor edits, recover the original content, and display perfect picture detail frame after frame for smooth, natural-looking video.

Noise Reduction
Improves movie image quality by removing unwanted artifacts.

Edge Enhancement
Sharpens movie images by providing higher contrast around lines and objects

ASUS GeForce 9800 GX2 review

By now we can safely say that, 400 million dollars later, the G80 architecture was good to Nvidia. First released in November 2006 in the form of the still quite capable GeForce 8800 GTX, this then-new graphics architecture set an industry benchmark that was not met by ATI until very recently.

The biggest problem Nvidia had with the GeForce 8800 initially was its grossly expensive manufacturing cost, which was ultimately passed on to the consumer, or at least those that could afford the cards. Eventually, however, less speedy but cheaper cards went into the market and practically dominated the scene throughout 2007. It wasn’t until very late in the year that the Radeon HD 3800 series appeared and finally challenged the GeForce, albeit only in the mainstream $200-300 segment.


As we entered 2008, Nvidia remained tight-lipped about a true next-generation product that could push the performance envelope further. At the same time, we were expecting AMD to keep its promise of a dual-GPU solution based on the Radeon HD 3800 that, unlike conventional multi-GPU technology, would not require a Crossfire-compatible motherboard and would become a real contender in the high-end graphics market.

We admitted to being somewhat skeptical about the Radeon HD 3870 X2 when we were first presented with the idea on paper. The single-card implementation of Crossfire could have easily translated into big product delays and an overall less appealing product down the line. But AMD proved itself this time, successfully launching the Radeon HD 3870 X2 on schedule and, perhaps even more importantly, having actual products on retail shelves immediately.

The performance of the Radeon HD 3870 X2 was more solid than we initially expected, as it delivered very similar performance to a Crossfire Radeon HD 3870 setup. The advantage being that the 3870 X2 costs slightly less at ~$450, and it will work on virtually any motherboard with a PCI Express x16 slot. Having snatched the performance crown away from Nvidia, we quickly declared the Radeon HD 3870 X2 a success!

As you can imagine, it wasn’t in Nvidia’s plans to sit back and watch ATI reign supreme with its new dual-GPU graphics card, not when it had so many impressive 65nm GPUs that could share the same PCB.

If the Radeon HD 3870 X2 is essentially two Radeon HD 3870 GPUs put together on the same PCB, then it is safe to say that the new GeForce 9800 GX2 is no different, with two GeForce 8800 GTS 512 GPUs slapped together on a single PCB.

What ATI should find scary is that the GeForce 8800 GTS 512 is significantly faster than the Radeon HD 3870, which gives us a good starting point for analyzing the GeForce 9800 GX2…

The 9800 GX2 Card

ASUS went out on a limb with their Radeon HD 3870 X2, as they were the only manufacturer to develop their own custom cooling design while also including all four DVI outputs.

This made that product a little more special and allowed it to stand out in the sea of Radeon HD 3870 X2 graphics cards. With the new GeForce 9800 GX2, however, they have not been as bold, releasing a product that closely follows the Nvidia reference design this time.

In terms of physical dimensions the GeForce 9800 GX2 is a monster, measuring 27cm long, longer than a full ATX motherboard is wide. The card’s dimensions are, however, very similar to those of a GeForce 8800 GTX/Ultra and the recently released Radeon HD 3870 X2, so we are not entering uncharted territory.

While the GeForce 9800 GX2 is a dual-GPU solution like the Radeon HD 3870 X2, their design is actually very different.

Whereas the Radeon HD 3870 X2 features both GPUs and the 1GB of memory on the same PCB, things are done a little differently on the GeForce. Similar to the older GeForce 7950 GX2, the GeForce 9800 GX2 features two separate PCBs, each with its own G92 GPU and 512MB of GDDR3 memory. However, unlike the GeForce 7950 GX2, which used two separate single-slot coolers, the GeForce 9800 GX2 uses a sandwich-plate design.

Basically, what this means is that they have taken a typical graphics card heatsink and sandwiched it between the two cards, allowing a single chunk of aluminum to cool both GPUs and all 1024MB of memory.

Nvidia claims that this method will keep the GeForce 9800 GX2 cooler at higher clock speeds. This rather complex design will be more difficult for third parties to support, so it is unlikely that we will see many, if any, aftermarket coolers for the GeForce 9800 GX2.

As we mentioned before, the Nvidia GeForce 9800 GX2 comes with 1GB of memory (512MB assigned to each GPU). The default operating specification for the memory stands at 2000MHz effective, while each GPU has been designed to work at 600MHz.

This means that the cores are each clocked 50MHz lower than the GeForce 8800 GTS 512 graphics card, while the memory runs 60MHz faster, for a theoretical memory bandwidth of 128GB/s, the highest for any Nvidia card ever produced.
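That 128GB/s figure is easy to verify from the numbers above; a quick back-of-the-envelope check (our own arithmetic, not a vendor tool):

```python
# Theoretical memory bandwidth of the GeForce 9800 GX2 from its specs:
# 1000MHz GDDR3 (double data rate -> 2000 MT/s) on a 256-bit bus per GPU.
mem_clock_mhz = 1000
transfers_per_us = mem_clock_mhz * 2       # DDR: two transfers per clock
bus_width_bits = 256                       # per GPU
bytes_per_transfer = bus_width_bits // 8   # 32 bytes per transfer

per_gpu_gbps = transfers_per_us * bytes_per_transfer / 1000  # MB/s -> GB/s
total_gbps = 2 * per_gpu_gbps              # two independent memory pools

print(per_gpu_gbps, total_gbps)            # 64.0 128.0
```

Note the total is only meaningful in aggregate; as with any SLI setup, each GPU can only touch its own 512MB pool at 64GB/s.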

Benchmarks: System Specs & Crysis

Test System Specs: Hardware
— Intel Core 2 Duo E8400 (3.00GHz) LGA775

— x2 OCZ DDR3 PC3-12800 FlexXLC Edition Module(s)
— x2 OCZ DDR2 PC2-6400 CL4 FlexXLC Edition Module(s)

— ASUS P5N-T Deluxe (NVIDIA nForce 780i SLI)
— ASUS P5E3 Deluxe (Intel X38)

— OCZ GameXStream (700 watt)

— Seagate 500GB 7200RPM (Serial ATA II)

— ASUS GeForce 9800 GX2 (1GB) — Forceware 174.53
— ASUS Radeon HD 3870 X2 (1GB) — Catalyst 8.4
— ASUS GeForce 8800 GTX (768MB) — Forceware 169.28
— ASUS GeForce 8800 GT (512MB) SLI — Forceware 169.28
— ASUS GeForce 8800 GT (512MB) — Forceware 169.28
— ASUS Radeon HD 3870 (512MB) — Catalyst 8.1
— ASUS Radeon HD 3870 (512MB) — Catalyst 8.1

Software
— Microsoft Windows Vista Ultimate (64-bit)
— nForce 9.46
— Intel System Driver 8.4.0.1016
— Nvidia Forceware 174.53
— Nvidia Forceware 169.28
— ATI Catalyst 8.4
— ATI Catalyst 8.1


It didn’t take long for the ASUS GeForce 9800 GX2 to impress: it was 57% faster than the Radeon HD 3870 X2 at 1920×1200 in Crysis. Furthermore, the GeForce 9800 GX2 was 26% faster than the GeForce 8800 GT SLI configuration, which was also very impressive.

Rendering 47.7fps at 1920×1200 while using high quality settings in Crysis is something we had not seen before using a single graphics card.
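
The "X% faster" figures throughout this review follow from a simple ratio, sketched below. The helper is our own, and the 30.4fps figure for the Radeon is inferred from the quoted lead rather than stated in the results.

```python
# A small sketch (our own helper) of how the "X% faster" figures are derived.

def lead_percent(faster_fps: float, slower_fps: float) -> float:
    """Relative performance lead of one card over another, in percent."""
    return (faster_fps / slower_fps - 1) * 100

# Working backwards from the GX2's 47.7fps and its 57% lead, the Radeon
# HD 3870 X2 would have averaged roughly 47.7 / 1.57 = 30.4fps here.
print(round(lead_percent(47.7, 30.4)))  # 57
```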

Benchmarks: Company of Heroes


The ASUS GeForce 9800 GX2 continued to impress, producing a massive 54% performance lead over the Radeon HD 3870 X2 at 1920×1200 in Company of Heroes.

Rendering 161.9fps is incredible and will certainly allow gamers to max out the quality settings. It is amazing to think that the GeForce 9800 GX2 was roughly three times faster than a single Radeon HD 3870 graphics card in this game.

Benchmarks: F.E.A.R


While not as impressive in F.E.A.R, the ASUS GeForce 9800 GX2 was still the fastest graphics card, managing 106fps at 1920×1200, whereas the Radeon HD 3870 X2 was limited to a marginally lower 98fps.

The GeForce 8800 GT SLI configuration rendered 97fps, while a single GeForce 8800 GT graphics card still managed 59fps on average at 1920×1200.

Benchmarks: Prey


The ASUS GeForce 9800 GX2 was roughly as fast as the GeForce 8800 GT SLI configuration in Prey, which made it significantly faster than the Radeon HD 3870 X2 and the GeForce 8800 GTX.

Benchmarks: Supreme Commander


Supreme Commander is a highly demanding real-time strategy game in which we have historically found Radeon-based graphics cards to be notably better performers.

That said, the margins between the GeForce 9800 GX2 and the Radeon HD 3870 X2 were very close, though the slim differences consistently fell in favor of the GeForce 9800 GX2.

Benchmarks: Unreal Tournament 3


The Radeon HD 3870 X2 was not as impressive in Unreal Tournament 3 as we thought it could have been. Though its average frame rates were incredibly high, it hardly pulled away from the GeForce 8800 GTX.

The ASUS GeForce 9800 GX2 was 21% faster than the Radeon HD 3870 X2 at 1920×1200, while also beating the 8800 GTX by a small margin at 1440×900 and by a wider one at higher resolutions.

Benchmarks: World in Conflict


The ASUS GeForce 9800 GX2 finished quite strongly in World in Conflict, where it was found to be 26% faster than the Radeon HD 3870 X2 at 1920×1200. To give you the full picture, the Radeon HD 3870 X2 was the second fastest performer in this game, defeating the GeForce 8800 GT SLI configuration.

Power Consumption & Temperatures


The ASUS GeForce 9800 GX2 consumes 24% more power at idle than the Radeon HD 3870 X2, but it was also slightly more conservative than the GeForce 8800 GT SLI configuration. When testing under load with Crysis, the GeForce 9800 GX2 used 3% more power than the Radeon HD 3870 X2, which rates as OK in our book.


The operating temperature of the GeForce 9800 GX2 was quite high and although the stress temp of 74 degrees does seem reasonable when compared to the GeForce 8800 GT and Radeon HD 3870 graphics cards, keep in mind that it runs almost constantly at this temperature. The GeForce 9800 GX2 is certainly going to bake a few cases!

Final Thoughts

Just last month we made this statement when reviewing the Radeon HD 3870 X2…

“Better late than never, AMD/ATI have done it. There is no question that the Radeon HD 3870 X2 is the fastest single card solution on the market today.”

Well, already we are going to have to retract that statement, as Nvidia has undoubtedly reclaimed the performance crown with the GeForce 9800 GX2. 
While AMD has nothing in its arsenal that can take on the GX2, the war is far from over. Yes, the GeForce 9800 GX2 is the single fastest graphics card on the planet, but all this speed comes at a cost, a really high cost!

The Radeon HD 3870 X2 was released and made available at $450, and today prices range between $420 and $450. On the other hand, the GeForce 9800 GX2 has been released at $600, and the cheapest we have found so far will set you back $570. So, while the GeForce 9800 GX2 is unquestionably faster than the Radeon HD 3870 X2, is it worth the 35% price premium?
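
For clarity, that premium works out as follows. This is a trivial sketch using the street prices quoted above; the helper name is ours.

```python
# Sketch: price premium of the cheapest GX2 over the cheapest 3870 X2,
# using the street prices quoted above.

def premium_percent(price: float, baseline: float) -> float:
    """Relative premium of `price` over `baseline`, in percent."""
    return (price / baseline - 1) * 100

print(round(premium_percent(570, 420), 1))  # 35.7, i.e. the ~35% premium
```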

Those looking for a product that goes beyond the mainstream range, which tops out at around $300, will think twice before going all the way up to $600, even when the card is this fast. Meanwhile, gamers who can spend more for the ultimate in performance will love what the new GeForce 9800 GX2 has to offer.

In terms of features, the GeForce 9800 GX2 has one glaring weakness: its lack of DirectX 10.1 support, which is present on most recently released Radeon products, although it remains largely unused for now. On a more positive note, the GeForce 9800 GX2 was a dream test subject, with no problems whatsoever during testing. The drivers have clearly matured, even though we ran all these tests under Vista 64-bit, a platform that gave Nvidia hardware trouble only a few months back.

Just like the Radeon HD 3870 X2, the new GeForce 9800 GX2 was easy to install and use, with no manual configuration required; you can simply forget it’s a dual-GPU card. The GeForce 9800 GX2 is a well-designed product and a solid performer whose only real weakness is its extremely high price tag.

XFX Nvidia GeForce 9800 GX2 600M 1GB

Heat

We don’t normally comment much on heat with graphics cards, but the GeForce 9800 GX2 is an exception because the heat this card produces caused us some headaches along the way. We encountered a number of heat-related crashes, hard locks and instabilities with our Asus Striker II Formula motherboard.

These problems occurred only when the GeForce 9800 GX2 was installed in the board. With two GeForce 8800 GTS 512MB cards in SLI there were no problems, and the story was the same with the GeForce 8800 Ultra as well – we’re 100 percent certain that the problems we encountered were entirely related to the installation of the GeForce 9800 GX2.

We witnessed temperatures in excess of 90 degrees Celsius on both the nForce 200 chip and the south bridge on the Asus motherboard – this triggered the company’s overheat protection feature that automatically shuts down the motherboard to prevent long term damage.

Throughout all of our testing we closely monitored GPU temperatures after we first encountered stability problems, and they never exceeded 85 degrees Celsius, which is pretty good for a graphics card, especially one with two GPUs packed so closely together. For reference, Nvidia says the GPU starts to throttle itself down when the core reaches 105°C, so there’s still quite a bit of headroom.
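
The headroom argument can be made concrete with a purely illustrative threshold check. Only the 105°C figure comes from Nvidia; the functions are ours.

```python
# Purely illustrative: the throttling behaviour described above modelled as a
# threshold check. Only the 105°C figure comes from Nvidia; the code is ours.

THROTTLE_POINT_C = 105

def headroom_c(core_temp_c: float) -> float:
    """Degrees Celsius of margin before the core clocks itself down."""
    return THROTTLE_POINT_C - core_temp_c

def should_throttle(core_temp_c: float) -> bool:
    return core_temp_c >= THROTTLE_POINT_C

print(headroom_c(85))        # 20
print(should_throttle(85))   # False
```

At our observed 85°C peak there are still 20 degrees of margin, which is why the GPUs themselves stayed stable even while the surrounding motherboard components overheated.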


A fan placed directly above the south bridge prevented these heat-related problems from cropping up again. After a lot of time spent stress testing GeForce 9800 GX2 cards in other motherboards (such as XFX’s nForce 780i SLI and nForce 790i SLI boards, as well as the Gigabyte GA-X38-DS5), we have come to the conclusion that system instability may occur on nForce 700-series motherboards where the nForce 200 chip sits between the top two PCI-Express x16 slots.

In that location, the chip is right underneath the GeForce 9800 GX2 and, with the card being covered in what is essentially a big heatspreader, a heat pocket is created underneath the card.

The problem, as I see it, is that when Nvidia set the design guidelines partners had to adhere to for nForce 780i SLI motherboards, it gave engineers the freedom to place the nForce 200 chip wherever they pleased; no limitations were imposed. During my conversations with Nvidia’s nForce 780i SLI engineers last October, I asked on several occasions whether the nForce 200 chip had a significant TDP, because several motherboard engineers had complained to me about the chipset’s increased TDP requirements.

Every time I asked the question, I was told that the TDP increase was insignificant. That may be the case with the Nvidia-designed board used by some partners, but it’s definitely not the case here with a GeForce 9800 GX2 installed in the Asus Striker II Formula. The problem for motherboard engineers was that they probably didn’t know a great deal (if anything) about the GeForce 9800 GX2 and the amount of heat it produces. And because Nvidia imposed no strict limitations on the placement of nForce 200, the problem has come out into the open.

It’s sad that we have to put a downer on this card’s performance, but we have to make you aware of what you may encounter if you’re using one of the more popular nForce 780i SLI motherboards on the market. This isn’t XFX’s fault, or even really Asus’s fault either – it’s just a bit of a mess that wouldn’t have come about if Nvidia had imposed design limitations on the placement of the nF200 chip.

Conclusions

The XFX GeForce 9800 GX2 is a good implementation of Nvidia’s GeForce 9800 GX2 design and delivers some unprecedented levels of performance. In many respects, the GeForce 9800 GX2 has left AMD in a strange position, because it was only a matter of weeks ago that the company challenged for the performance crown once more. The Radeon HD 3870 X2, while cheaper, has been beaten pretty comprehensively across the board here – there are a few glimmers of hope, but on the whole it doesn’t look very good.

I think it’s time for AMD to head back to the drawing board, because it really has fallen a long way behind Nvidia in recent months. The R600 generation of products just hasn’t been brilliant for AMD, and it’s been clear for a while that Nvidia has been sandbagging releases, doing just enough to keep AMD at bay while it prepares for an impending battle for supremacy with Intel.

At around £435 including VAT (and delivery to regular contributors to the bit-tech community), the XFX GeForce 9800 GX2 represents reasonable value for money for a high end card, but it’s one of the more expensive GeForce 9800 GX2s on the market. If you’re just after a white box, this probably isn’t the card for you.

There’s also the fact that it’s over £170 more expensive than the cheapest Radeon HD 3870 X2s, although they’re all on pre-order. The cheapest you can pick up a 3870 X2 for today is around £265 (inc. VAT). I guess you have to pay for the privilege of the best performance on the market… and that’s exactly what’s happening here – Nvidia can charge what it wants for this card, just like it did with the GeForce 8800 Ultra.

Where it gets interesting is when you look at the price of a pair of GeForce 8800 GTS 512MB cards. BFG Tech’s GeForce 8800 GTS OC 512MB, for example, is available for just over £170 (inc. VAT), one of the cheapest on the market at the moment. A pair of those will cost you less than a GeForce 9800 GX2 and will, by and large, deliver a very similar gaming experience in most scenarios, but they won’t offer HybridPower when motherboards and chipsets supporting the technology become available.

Final Thoughts…

Overall then, if you’re after the fastest single-card solution on the market, the GeForce 9800 GX2 is it. If you don’t have an nForce motherboard, it’s also currently the only way to get SLI technology on other motherboards, and given some of the performance differences between the GeForce 9800 GX2 and the Radeon HD 3870 X2, it could quite possibly be the fastest graphics solution in that situation as well.

If you do have an nForce motherboard and aren’t likely to suffer from the documented heat problems, there are options available for less money that deliver similar performance in most scenarios, but the GeForce 9800 GX2 isn’t really about value for money. It is about delivering the fastest graphics card on the market and, in that respect, it delivers. What’s more, Nvidia will soon release a driver to enable Quad SLI on a pair of GeForce 9800 GX2s, potentially delivering more performance than three GeForce 8800 GTX/Ultra cards; we just hope Quad SLI is better than it was last time.

  • Features: 8/10
  • Performance: 9/10
  • Value: 5/10
  • Overall: 7/10


Key reference specs: 2×128 shader processors, 2×64 TMUs, 2×16 ROPs, GDDR3 memory on a 2×256-bit bus, 1024MB total, 2×64 GB/s memory bandwidth, Shader Model 4.0.



NVIDIA GeForce 9800 GX2 video card review from GECID.com: GIGABYTE GV-NX98X1GHI-B (04-06-2008)

Today the NVIDIA GeForce 9800 GTX and the ATI Radeon HD 3870 X2, the top solutions from two eternal rivals, compete with each other in the upper price segment of the video card market. To consolidate its leadership and retain the title of “owner of the fastest video accelerator”, NVIDIA decided to release the GeForce 9800 GX2, which combines two boards with G92 chips. Keeping in mind the shortcomings of cards based on the GeForce 7950 GX2, the Californian manufacturer tried to avoid them when developing its new brainchild. At the moment, all video adapters based on the GeForce 9800 GX2 are reference products that differ only in stickers and bundle, so in talking about the GIGABYTE card we are effectively talking about all GeForce 9800 GX2 cards.

G92 chips are the heart of the video adapter; architecturally they differ little from their predecessor, the G80. The G92 graphics chips are produced on a 65nm process, which reduces heat output and power consumption as well as production cost. The width of the data bus has been reduced to 256 bits, while the number of universal processors remains unchanged.


|  | GeForce 9800 GX2 2×512 MB | GeForce 9800 GTX 512 MB | GeForce 8800 Ultra 768 MB | GeForce 8800 GTS 512 MB |
| --- | --- | --- | --- | --- |
| Graphics chip | 2× G92-450-A2 | G92-420-A2 | G80-450-A3 | G92-400-A2 |
| Core frequency, MHz | 600 | 675 | 612 | 650 |
| Frequency of unified processors, MHz | 1512 | 1688 | 1500 | 1625 |
| Number of universal processors | 2×128 | 128 | 128 | 128 |
| Number of texture/blend units | 2×64 / 2×16 | 64 / 16 | 32 / 24 | 64 / 16 |
| Memory size, MB | 2×512 | 512 | 768 | 512 |
| Effective memory frequency, MHz | 2000 (2×1000) | 2200 (2×1100) | 2160 (2×1080) | 1940 (2×970) |
| Memory type | GDDR3 | GDDR3 | GDDR3 | GDDR3 |
| Memory bus width, bits | 2×256 | 256 | 384 | 256 |
| Power consumption, W | up to 197 | up to 156 | up to 171 | up to 150 |

Model: GIGABYTE GeForce 9800 GX2 (GV-NX98X1GHI-B)
Graphics core: NVIDIA GeForce 9800 GX2 (2× G92-450-A2)
Pipelines: 2×128 unified
Supported APIs: DirectX 10.0, OpenGL 2.0
Core / shader frequency, MHz: 600 / 1512
Memory size (type), MB: 2×512 (GDDR3)
Real (effective) memory frequency, MHz: 1000 (2000 DDR)
Memory bus: 2×256-bit
Bus standard: PCI Express 2.0 x16
Maximum resolution: up to 2560×1600 via dual-link DVI; up to 2048×1536 @ 85 Hz via analog VGA (via adapter); up to 1080i via HDMI
Outputs: 2× DVI-I (VGA via adapter only), HDMI
HDCP support: Yes
HD video decoding: H.264, VC-1, MPEG2 and WMV9
Drivers: fresh drivers can be downloaded from the video card manufacturer’s website or the GPU manufacturer’s website
Manufacturer website: http://gigabyte.ru/


Now let’s proceed directly to the review and testing.

The video card comes in a rather impressive box (designed in the same style as the rest of GIGABYTE’s video adapters) depicting a girl gazing into the distance, the implication being that the contents of the package let us look into the future. To give us reason to believe so, attention is drawn to the distinctive features of the new video accelerator:

  • PCI-E 2.0;
  • Dual-Link DVI;
  • HDCP;
  • HDMI;
  • 1GB GDDR3.

However, apart from the amount of video memory there is nothing unusual here; we have already seen all these "features" in previous reviews of NVIDIA-based video cards. In fact, the figure given is the total of the memory installed on each "half" of the video card, and because of how this adapter is implemented, only 512 MB is available to games. GIGABYTE's marketers decided to follow NVIDIA's lead and deliberately quoted the larger amount, believing it would attract extra attention to their product.

The sticker on the end indicates the model (GV-NX98X1GHI-B) and partially repeats the characteristics of the content:

  • NVIDIA 9800 GX2 chip;
  • PCI-E 2.0;
  • 1GB GDDR3;
  • 2x256bit;
  • DVI-I;
  • HDMI;
  • DRIVER CD (it is not entirely clear why it was necessary to indicate the presence of a disk in this list).

The back side of the box is not very informative and merely supplements the parameters listed above.

In addition to the video card, the delivery set consists of:

  • Driver CD;
  • adapter 2x Molex-> 6-pin PCI-Express power supply;
  • adapter 2x Molex-> 8-pin PCI-Express power supply;
  • user manual;
  • quick installation guide;
  • two DVI->VGA adapters;
  • S/PDIF cable.

Thus, the buyer will receive the entire set of cables and adapters necessary for operation.

The user manual lists the minimum system requirements for normal operation of the GeForce 9800 GX2, including a suitable 12V power supply. For operation in SLI mode, power supplies rated 1000 W or higher are recommended; the NVIDIA website maintains a list of power supplies certified for the 9800 GX2. There is also a warning about the mechanical and electrical incompatibility of 8-pin auxiliary power connectors intended for different purposes: plugging in the wrong one (such as the 8-pin CPU power connector) can lead to the failure of the video adapter. By design the connectors are mechanically incompatible, as their contact holes are shaped differently. However, according to service centers, "craftsmen" still manage to combine the incompatible, and the result is deplorable: a non-working video card and money wasted. So be extremely careful when connecting power. The power cable should slide into the connector without much effort, a sign that you are on the right track.

Next to the power connectors is a socket for an S/PDIF cable, which is required for audio transmission over HDMI.

To ensure the normal operation of the accelerator, it is necessary to simultaneously connect both 6-pin and 8-pin auxiliary power connectors.

The manual explains the auxiliary LEDs located near the power connectors: they light up red to indicate a connector that still needs to be connected to power. It also recommends feeding each Molex plug of the power adapters from a different 12V line. Each line can carry a load of up to 15 A, which almost any powerful power supply can provide, but a single line may not withstand a 30 A load. The manufacturer stresses this for good reason: most problems with the card stem from improper power connections.
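The arithmetic behind that advice is simple Ohm's-law budgeting; a rough illustration (the 15 A per-line figure comes from the manual):

```python
# Current drawn from a 12 V line for a given load, in amperes.
def rail_current(load_watts, volts=12.0):
    return load_watts / volts

print(round(rail_current(197), 1))  # whole card at peak: -> 16.4 (A)
print(15 * 12.0)                    # one 15 A line supplies at most 180.0 W
# Hence the advice to put each Molex adapter plug on its own 12 V line.
```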

Next to the GIGABYTE GeForce 8800 GT, the dimensions of the new "monster" seem unreal: it is roughly 30% larger, partly because of the cooling system's frame. Potential buyers will most likely have to consider a more spacious PC case, otherwise the video card may run into the hard drive rack (or cage).

Thanks to the new exclusive cooling system, the video card is a solid rectangular slab, which lends it an austere, imposing look.

Upon closer examination, we notice the first drawbacks — the cooling is not implemented as competently as, for example, in the GeForce 9800 GTX. The fan is located between two boards, in the back of each of them there is a figured hole for air intake. The cooling system is designed in such a way that cool air is driven through the card heatsink and partially expelled outside the case, but most of the heated air exits through the slots in the casing back into the system unit, thereby significantly increasing the temperature in the PC case.

All video card connectors are carefully closed with plastic plugs to prevent damage during transportation.

NVIDIA GeForce 9800 GX2 graphics cards have two DVI-I ports and one HDMI port. It is technically possible to install a DisplayPort connector instead of HDMI. In addition to the connectors, there are two LEDs on the bar with ports — two-color and blue. The first one displays the status of the supplied auxiliary power: glows green when both cables are connected, and red when:

  • one of the cables is not connected;
  • Both power cables are not connected;
  • both cables use six-pin connectors.
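The status logic of that two-color LED, as described above, can be sketched as a small truth function (names are ours; the real check happens in the card's hardware):

```python
# Power LED: green only when a 6-pin and a true 8-pin cable are both present.
def power_led(six_pin_connected, eight_pin_connected, eight_pin_is_six_pin=False):
    ok = six_pin_connected and eight_pin_connected and not eight_pin_is_six_pin
    return "green" if ok else "red"

print(power_led(True, True))         # -> green
print(power_led(True, False))        # -> red: one cable missing
print(power_led(True, True, True))   # -> red: a six-pin plugged into the 8-pin socket
```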

The second LED indicates the DVI port through which the image is output (called the master monitor). Unlike the ATI Radeon HD 3870 X2, the NVIDIA GeForce 9800 GX2 does not support more than one monitor in SLI or Quad SLI mode.

The cooling system shroud is not designed to be disassembled by the user.

Video adapter boards are connected by two flexible ribbon cables and fastened with metal racks to provide rigidity. One of the boards has an SLI connector for a pair of video adapters to work in Quad SLI mode.

The main difference from its predecessor, the NVIDIA GeForce 7950 GX2, is the arrangement of two boards facing each other, which makes it possible to use a single heatsink to cool the chips.

The cooling system consists of two thick copper plates, flat heat pipes and a common aluminum plate radiator.

The graphics chip is marked G92-450-A2.

The GPU-Z utility detects the video card as two adapters in SLI mode and displays chip, memory and temperature data for each of the card's two boards.

For the board to which the cooler is connected, its rotation speed is also displayed.

The G92-450-A2 GPU core operates at 600 MHz, while the shader and memory clocks are 1500 MHz and 1000 MHz, respectively.

The BR04-300-A2 chip is used as a PCI Express bus switch. It, like the SLI connector, is located on the «lower» card.

The NVIDIA GeForce 9800 GX2 uses Samsung’s GDDR3 memory chips. The marking of microcircuits K4J52324QE-BJ1A indicates a nominal frequency of 1000 MHz, at which they operate.

just add one more G92 — Ferra.ru

The appearance of "two-headed monsters" among GPU developers usually stems from problems releasing a competitive single-chip product with existing capabilities. Recall the ATI Rage Fury MAXX, 3dfx Voodoo 5 5500 and XGI Volari V8 Duo Ultra, of which probably only the card from the long-gone 3dfx managed to stay on the market for any length of time. After the industry's transition to the PCI Express bus, SLI and CrossFire technology made it possible to produce multi-chip solutions, which in turn could be combined into tandems of several video cards. NVIDIA's GeForce 7900 GX2 became such a card; it never appeared on store shelves, though, and remained a ghost. But the Californians did not stop there, and to counter ATI's Radeon X1950 XTX they introduced the GeForce 7950 GX2, which finally became a "mass" product, although for some time it lacked support for Quad-SLI (the technology for combining two GX2 video cards in one bundle).

With the release of the G80, NVIDIA remained the leader of the graphics market for a long time, until ATI, by then part of AMD, introduced its first dual-chip solution in years, the Radeon HD 3870 X2, and took back the palm. The holiday was short-lived, though: a few months later NVIDIA responded in kind by releasing one of the fastest video cards around, the GeForce 9800 GX2, based on a pair of G92 GPUs. As practice has shown, this chip turned out to be quite versatile, suitable for both mid-range cards and high-end monsters: a jack of all trades, as it were.

Compared to previously produced tandems, the novelty has some differences, primarily due to higher heat dissipation. In fact, the GeForce 9800 GX2 is a bundle of two GeForce 8800 GTS 512MB with slightly lower operating frequencies — 600/1500/2000 MHz versus 650/1625/1940 (chip, shader unit and memory, respectively). But, despite this, the design of the video card was completely redesigned, which made it possible to increase the cooling efficiency of the components of the new two-chip solution. NVIDIA did not use one PCB for two video chips, as, for example, a competitor, but placed each GPU and 512 MB of video memory on separate boards, which are like a mirror image of each other, between which there is a common cooling system. Two cables are used to connect the halves of the cards, and a switch chip is installed on the main board with a PCI-E connector to coordinate the signals between the graphics processors and the system, while one MIO interface remains to combine two GeForce 9800 GX2 in Quad-SLI mode. Taking into account the increased power consumption, about 200 W, each card from the tandem is equipped with a power connector: 6-pin on one and 8-pin on the other. The developer recommends using a power supply unit with a power of at least 580 W with one such accelerator, which is an overestimated requirement.
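The clock deficit relative to a GeForce 8800 GTS 512 can be put in numbers, using the frequencies quoted above:

```python
# Ratio of 9800 GX2 per-GPU clocks to 8800 GTS 512 clocks (MHz).
gx2 = {"core": 600, "shader": 1500, "memory": 2000}
gts = {"core": 650, "shader": 1625, "memory": 1940}

for domain, mhz in gx2.items():
    print(f"{domain}: {mhz / gts[domain]:.1%}")
# core and shader run at ~92.3% of the GTS clocks; memory is ~3.1% faster
```

So each half of the GX2 gives up under 8% of a GTS 512's core and shader throughput, in exchange for doubling the GPU count.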

Another significant change for the Quad-SLI implementation (compared with previously released "monsters") was the use of Alternate Frame Rendering (AFR) alone, instead of running Split Frame Rendering (SFR) and AFR side by side, which suits modern games with complex shaders and multi-pass rendering very well.
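Pure AFR simply hands whole frames to GPUs in round-robin order, which is easy to sketch (a toy model; the real scheduling lives in the driver):

```python
# Alternate Frame Rendering: frame i is rendered by GPU (i mod n).
def afr_schedule(num_frames, num_gpus=2):
    return [frame % num_gpus for frame in range(num_frames)]

print(afr_schedule(6))               # GX2:      -> [0, 1, 0, 1, 0, 1]
print(afr_schedule(8, num_gpus=4))   # Quad-SLI: -> [0, 1, 2, 3, 0, 1, 2, 3]
```

Because each GPU renders complete frames, AFR avoids SFR's need to split multi-pass work mid-frame, which is why it scales better in shader-heavy games.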

Like the entire 9th series, the GeForce 9800 GX2 supports HybridPower, thanks to which, when using an appropriate motherboard with an integrated GPU, it becomes possible to turn off a discrete card at low load on the video subsystem. Due to such a simple “manipulation”, the noise level of the system is reduced while watching movies or surfing the Internet, and also saves electricity.

With the release of new ForceWare drivers, PureVideo HD technology gained several enhancements, such as decoding two HD streams simultaneously, dynamically adjusting contrast and color saturation, and playing HD video in windowed mode with the Aero interface in Windows Vista. You will, however, only be able to enjoy movies on modern equipment that supports the HDMI interface, since the card lacks the usual HDTV output.

That’s all the innovations of the GeForce 9800 GX2, of which, perhaps, we can single out only the design of the video adapter and the use of the pure AFR rendering mode.

As a representative of the high-end 9800 series, we received for testing the EN9800GX2/G/2DI/1G video card from ASUS, with fully reference characteristics. As a worthy rival to NVIDIA's dual-chip product, we chose MSI's R3870X2-T2D1G-OC, based on the ATI Radeon HD 3870 X2.

Inspecting video cards

ASUS EN9800GX2/G/2DI/1G

The ASUS video card comes in a box with a hinged lid and the now-familiar artwork of "heroes" from the game Company of Heroes; still, some variety in this regard would not hurt.

The delivery set contains no exclusive items; the days when "20 discs" were put in a box are long gone. These days the same kit may accompany a video card costing $600-700 or $100-150. Then again, there is practically nothing left to surprise us with, and an extra trinket would simply raise the accelerator's price. In the end, the ASUS EN9800GX2/G/2DI/1G comes with everything you need to install the card into the system and connect external equipment.

But there are also nice bonuses in the form of the Company of Heroes game and a soft case for CDs. In addition, there is an audio cable for connecting a video card and a digital output of a board or sound card, as well as a power adapter from two PCI-E 6-pin to one PCI-E 8-pin in addition to the usual Molex->PCI-E.

The tested ASUS EN9800GX2 fully matches the reference design, which is not surprising given that the cards are manufactured by a contractor for NVIDIA, which in turn distributes the adapters among its partners. The only thing distinguishing the card from its competitors is a branded sticker on the cooling system.

The length of the card matches other high-end accelerators (GeForce 9800 GTX, GeForce 8800 GTX/Ultra and ATI Radeon HD 3870 X2), but because of its design the cooling system makes it look far bulkier than its high-end counterparts.

The fact is that each GPU with its own set of memory faces inward and contacts the cooler, while the outside of the card is completely covered with a decorative rectangular lid-box with multiple slots in the back and top and consisting of two halves.

Removing the cover is not so easy; it requires some skill with a screwdriver, as does disconnecting the additional power cables, since the connectors face each other and you simply cannot reach in with your fingers.

Once the box is opened, a feat of video card engineering appears before your eyes. The GeForce 9800 GX2 is a "sandwich" of two boards with a huge cooler between them. A single low-speed turbine cools the system, drawing air through special slots in the printed circuit boards.

The main tandem card contains a G92 graphics processor, eight memory chips with a total capacity of 512 MB, PCI-E and MIO interfaces, an audio connector near the 6-pin auxiliary power connector, and one DVI. Between the core and the output for the monitor there is a chip switch responsible for matching the signals of the cards and the «outside world».

The second half also has a GPU, eight memory chips, DVI and HDMI connectors, and an 8-pin PCI-E auxiliary power connector. Both cards are connected by two flexible cables, which, when assembling the accelerator, do not particularly tend to fall into place.

The cooling system is a massive construction of two aluminum plates with thin aluminum fins between them; heat is transferred to the fins via heat pipes from a copper base on each side of the cooler.

Thermal paste is used at the point of contact between the GPU and the copper bases, thermal pads are used between the memory and power elements.

Unlike modern high-end video cards, hot air is exhausted only partly outside the system unit; most of it stays inside the case, as with the GeForce 8800 GT/9600 GT series and the ATI Radeon HD 3850. This is due to the GeForce 9800 GX2's design: since peripheral connectors occupy the brackets of both tandem boards, air can escape outside only through the lower part of the bracket, and in small quantities. The rest leaves the turbine toward the fins at the rear wall of the case and remains in the system unit, heating the neighboring components, and the card itself along with them.

As on all G92-based accelerators, the GPUs of the "monster" in question have a protective frame. The core is a different revision from the GeForce 9800 GTX's (G92-450-A2 versus G92-420-A2) and runs at lower frequencies: only 600/1500 MHz for the core and stream processors, respectively.

The card carries 16 Samsung GDDR3 memory chips with 1.0 ns access time (marked K4J53324QE-BJ1A) and a total capacity of 2×512 MB, operating at their nominal effective frequency of 2000 MHz.

Naturally, we could not ignore such a topic as overclocking, and decided to find out what our copy of ASUS EN9800GX2 is capable of. But, alas, the miracle did not happen, and the card was able to function stably only at a frequency of 670/1675/2200 MHz, which fell a little short of the nominal GeForce 9800 GTX.

MSI R3870X2-T2D1G-OC

Our next participant is also a reference product, but with a slightly increased GPU frequency relative to the nominal. The MSI R3870X2-T2D1G-OC card is delivered in a branded box with the image of a male cyborg and with a hinged lid, on which numerous awards from various publishers are painted.

The delivery set is devoid of any nice bonuses — there is only everything you need to connect the card.

Since we already considered a card of the same design at the very beginning of our acquaintance with AMD’s two-chip solution, we can only recall some features of the ATI Radeon HD 3870 X2.

The linear dimensions of the video card correspond to other high-level representatives of the competing camp, and a fairly spacious case is required to install the accelerator. As a distinction, the accelerator cooling system has a branded sticker with the image of the same cyborg.

However much video card manufacturers tried to develop and release tandems on a single PCB on their own, the result each time was an ungainly monster of gigantic size. Only AMD managed to fit sixteen memory chips, two RV670 GPUs and a similarly sized switch chip on a single 27 cm board. Some companies released the ATI Radeon HD 3870 X2 on their own PCBs, sometimes even shorter than the reference one, though such cards became slightly taller than the original.

The cooling system matches the card itself: a copper-and-aluminum structure weighing about 1 kg covers the entire card. The cooler consists of a massive aluminum base and two radiators, one copper and one, closer to the turbine, aluminum. The whole assembly is covered by a plastic shroud, and hot air is exhausted outside the system unit. Thermal paste is used at the contact points between the chips and heatsinks, and thermal pads between the memory and power circuitry.

An aluminum plate with a stiffening rib is installed on the back of the card, but it is very thin, and some memory chips lack good pressure due to the bending of the makeshift heatsink.

Two Dual Link DVI and TV-Out are provided as interface connectors on the board. It is also possible to transmit high-definition video and audio signals through a DVI / HDMI adapter, since the RV670 chip has an integrated HDA class multi-channel audio solution. Due to the features of combining GPUs, the card has only one connector for building a tandem in CrossFireX mode, and PCI Express of the first generation is used as an interface for connecting the card to the motherboard.

The two RV670 GPUs run at 850 MHz instead of the reference 825 MHz, in keeping with the product's overclocking slant, although a mere 25 MHz somehow feels insufficient.

The card carries Samsung GDDR3 memory chips with 1.0 ns access time (marked K4J53324QE-BJ1) and a total capacity of 1024 MB, although, as a rule, in CrossFire or SLI modes the effective memory capacity equals that of a single card. The memory operates at 1800 MHz, which matches the ATI Radeon HD 3870 X2 specifications.

Core overclocking turned out to be less impressive than with the GeForce 9800 GX2: only 860 MHz, a mere 10 MHz above the factory setting. The memory, however, overclocked very well, reaching 2160 MHz.

Testing

Testing was carried out on the following stand.

  • Processor:
    • Intel Core 2 Quad Q6600 (2.4GHz)
  • Motherboard:
    • ASUS P5K-E
  • RAM:

    DirectX 9 benchmarks

    In «Company of Heroes» the new NVIDIA in normal mode is 1.5 times faster than the opponent, but with the activation of anti-aliasing and anisotropic filtering at a screen resolution of 1280×1024 and higher, it starts to lose ground, dropping to the level of ATI Radeon HD 3870 X2 by MSI.

    The MSI R3870X2-T2D1G-OC's performance in "Call of Juarez" does not stand up to scrutiny: as resolution and graphics quality increase, the GeForce 9800 GX2-based solution is almost twice as fast.

    For unknown reasons, both cards in question in the game «Crysis» could not pass the test at a resolution of 1600×1200. But, judging by the results, at a resolution of 1280×1024 playing at maximum settings is contraindicated even on the GeForce 9800 GX2. We think that more than one generation of video cards will stumble on the creation of Crytek.

    Conclusion

    If video cards based on the ATI Radeon HD 3870 X2 became among the fastest, the GeForce 9800 GX2 is simply the fastest. This powerful, massive and mind-blowingly priced tandem of two G92s is unlikely to leave competitors a chance. But, as with the Radeon HD 3870 X2, the card's performance will depend heavily on optimization, both in gaming applications and in drivers, because the card is two graphics adapters operating in SLI mode. If for some reason those conditions are not met, the owner of this "monster" may be left with nothing, or rather, with a card slower than a GeForce 8800 GTS. That prospect is unlikely to please anyone.

    As for the cards reviewed here, despite their cost neither can boast anything special in the bundle. ASUS now packs the Company of Heroes game and a disc case into the box, which we found included with the EN9800GX2/G/2DI/1G, although a disc with 3DMark Vantage would be more relevant for cards of this kind. The MSI R3870X2-T2D1G-OC's package is even more meager, but its GPUs are slightly overclocked at the factory. If only the memory were too…

    GeForce 9800 GX2 video card [in 1 benchmark]

    NVIDIA
    GeForce 9800 GX2

    • PCIe 2.0 x16 interface
    • Core clock 600 MHz
    • Video memory size 512 MB
    • Memory type GDDR3
    • Memory frequency 1000 MHz

    Description

    NVIDIA started GeForce 9800 GX2 sales on March 18, 2008 at a recommended price of $599. This is a desktop video card based on the Tesla architecture and a 65 nm manufacturing process. Each GPU has 512 MB of GDDR3 memory at 1 GHz on a 256-bit bus, for a combined bandwidth of 128 GB/s (64 GB/s per GPU).

    In terms of compatibility, this is a dual-slot PCIe 2.0 x16 card. The reference version is 26.7 cm long. Additional 6-pin and 8-pin power cables are required for connection, and power consumption is 197 W.

    It provides poor performance in tests and games, at 2.71% of the leader, the NVIDIA GeForce RTX 3090 Ti.




    Features

    GeForce 9800 GX2's general performance parameters, such as number of shaders, GPU core clock, manufacturing process, texturing and calculation speed. These indirectly indicate the GeForce 9800 GX2's performance.

    SLI support +

    RAM

    Parameters of memory installed on GeForce 9800 GX2 — type, size, bus, frequency and bandwidth. For video cards built into the processor that do not have their own memory, a shared part of the RAM is used.

    Memory type GDDR3

    Video outputs

    Types and number of video connectors present on GeForce 9800 GX2. As a rule, this section is relevant only for desktop reference video cards, since for laptop ones the availability of certain video outputs depends on the laptop model.

    Video connectors: HDMI, Dual Link DVI
    Multi-monitor support

    CUDA +

    Benchmark tests

    These are the results of GeForce 9800 GX2 rendering performance tests in non-gaming benchmarks. The overall score is set from 0 to 100, where 100 corresponds to the fastest video card at the moment.


    Overall benchmark performance

    This is the site's overall performance rating.

    9800 GX2
    2.71

    • Passmark
    Passmark

    This is a very common benchmark, included in the Passmark PerformanceTest suite. It evaluates the card thoroughly, running four separate tests for Direct3D versions 9, 10, 11, and 12 (the latter at 4K resolution whenever possible), plus a few more tests using DirectCompute.

    Benchmark coverage: 26%

    9800 GX2
    797


    Game tests

    FPS in popular games on GeForce 9800 GX2, as well as compliance with system requirements. Remember that the official requirements of the developers do not always match the data of real tests.


    Relative performance

    Overall GeForce 9800 GX2 performance compared to its nearest desktop counterparts.


    ATI Radeon HD 3850 X2
    102.95

    NVIDIA GeForce GT 730
    102.95

    AMD Radeon R7 M365X
    102.58

    NVIDIA GeForce 9800 GX2
    100

    ATI Radeon HD 5670
    98.15

    AMD Radeon R5 M435
    97.79

    NVIDIA GeForce GT 440
    97.05

    AMD competitor

    We believe that the nearest equivalent to GeForce 9800 GX2 from AMD is Radeon HD 5670, which is slower by 2% on average and lower by 4 positions in our rating.



    Here are some of the GeForce 9800 GX2's closest competitors from AMD:

    AMD Radeon R7 M260DX
    102.95

    ATI Radeon HD 3850 X2
    102.95

    AMD Radeon R7 M365X
    102.58

    NVIDIA GeForce 9800 GX2
    100

    ATI Radeon HD 5670
    98.15

    AMD Radeon R5 M435
    97.79

    ATI Radeon HD 4810
    96.31

    Other video cards

    Here we recommend several video cards that are more or less similar in performance to the reviewed one.


    • GeForce GT 440
    • Radeon HD 3850 X2
    • Radeon HD 4810
    • GeForce 9800 GTX
    • Radeon HD 4830
    • Radeon HD 3870 X2

    Recommended processors

    According to our statistics, these processors are most often used with the GeForce 9800 GX2.


    • Core 2 Quad Q6600: 8.3%
    • Core 2 Duo E8400: 5.6%
    • Core i3 10100F: 5.6%
    • Ryzen 9 3950X: 2.8%
    • Pentium E5700: 2.8%
    • Core i7 920: 2.8%
    • Core i5 7300HQ: 2.8%
    • Ryzen 7 3700U: 2.8%
    • Core i3 3220: 2.8%
    • Ryzen 7 PRO 2700: 2.8%


    Video card NVIDIA GeForce 9800 GX2


    Name: NVIDIA GeForce 9800 GX2
    Series: GeForce 9
    GPU model: 2× G92-450 (G92)
    CUDA cores: 256
    Base clock: 600 MHz
    Memory speed: 2 Gbps
    Memory: 1 GB GDDR3 (512-bit)

    The NVIDIA GeForce 9800 GX2 graphics card is built on a 65 nm process and based on 2× G92-450 (G92) GPUs.
    The card supports DirectX 10. NVIDIA has fitted 1024 megabytes of GDDR3 RAM, connected via a combined 512-bit interface (2× 256-bit).
    The graphics processor runs at 600 MHz. The number of CUDA cores is 256, with a memory speed of 2000 Mbps.

    The power consumption of the video card is 197W, and the recommended power supply is 580W.

    NVIDIA GeForce 9800 GX2 supports Microsoft DirectX 10 and OpenGL 2.1.

    NVIDIA GeForce 9800 GX2 graphics card specifications

    GPU specifications:
    Model: NVIDIA GeForce 9800 GX2
    Series: GeForce 9, Desktop
    GPU model: 2× G92-450 (G92)
    Process: 65 nm
    CUDA cores: 256
    Streaming Multiprocessors (SMs): 32
    Texture units (TMUs): 128
    Base clock: 600 MHz
    Number of transistors: 2× 754 million
    Memory specifications:
    Memory capacity: 1 GB
    Memory type: GDDR3
    Memory bus: 512-bit
    Memory speed: 2000 Mbps (2 Gbps)
    Memory clock: 1000 MHz
    Texture fill rate: 76.8 GTexel/s
    Display support:
    Maximum digital resolution: 2560×1600
    Maximum VGA resolution: 2048×1536
    Standard connectors: two DVI-I Dual Link, HDMI
    Multi-monitor support: Yes
    HDMI: Yes
    Audio input for HDMI: S/PDIF
    Thermal and power data:
    Maximum GPU temperature: 105 ℃
    Power consumption (TDP): 197 W
    Recommended power supply: 580 W
    Additional power connectors: 6-pin and 8-pin
    Video card dimensions:
    Height: 11.1 cm
    Length: 26.7 cm
    Width: 2 slots
    Technologies and capabilities:
    CUDA: Yes
    SLI: Yes, Quad SLI
    PhysX: Yes
    3D Vision: Yes
    3D games: Yes
    DirectX: 10
    OpenGL: 2.1
    Bus: PCI Express 2.0 x16
    OS support: Microsoft Windows 7-10, Linux, FreeBSD x86

    Please note: The table shows the reference characteristics of the video card, they may vary by different manufacturers.
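The fill-rate entry in the table is consistent with the clock and texture unit count; a quick check (the function name is ours):

```python
# Texture fill rate in GTexel/s: core clock (MHz) x texture units.
def fill_rate_gtexel(core_mhz, tmus):
    return core_mhz * 1e6 * tmus / 1e9

print(fill_rate_gtexel(600, 128))  # -> 76.8, matching the table
```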


    New drivers for the NVIDIA GeForce 9800 GX2 graphics card

    Version 342.01 WHQL, released December 14, 2016 (status: obsolete), with builds for Windows 8/8.1/7/Vista 32-bit and 64-bit (downloads of 238.97 MB and 292.47 MB respectively) and for Windows XP 32-bit. Release notes are available as a PDF.

    The driver for the NVIDIA GeForce 9800 GX2 video card should be downloaded from the official website!

    Or use the GeForce Experience program: it will automatically select the right driver for your video card.


    NVIDIA GeForce 9800 GX2 Frequently Asked Questions and Answers

    What series is this video card?

    Video card series: GeForce 9

    What is the power consumption and power requirements?

    Maximum power consumption is: 197 W.

    Recommended power supply: 580W.

    Auxiliary power connectors: 6-pin and 8-pin.

    Where can I download the GeForce 9800 GX2 driver?



    review of characteristics and performance tests in games

    The GeForce 9800 GX2 video card was released by NVIDIA, release date: 18 March 2008. At the time of release, the video card cost $599. The video card is designed for desktop computers and is built on the Tesla architecture codenamed G92.

    Shader clock: 1500 MHz. Texturing speed: 76.8 billion texels/sec. Number of shader processors: 2× 128. Floating-point performance: 2× 384.0 GFLOPS. Process: 65 nm. Number of transistors: 754 million per GPU. Power consumption (TDP): 197 W.

    Memory type: GDDR3. Maximum memory size: 2× 512 MB. Combined memory bus width: 512 bits (2× 256). Memory frequency: 1000 MHz. Memory bandwidth: 128 GB/s (64 GB/s per GPU).
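Those throughput figures follow directly from the unit counts and clocks; a short check, assuming 2 FLOPs per stream processor per shader clock (one MAD), as was standard for G92 ratings:

```python
# Per-GPU shader throughput in GFLOPS: SPs x shader clock x 2 FLOPs (MAD).
def gflops(stream_processors, shader_mhz, flops_per_clock=2):
    return stream_processors * shader_mhz * 1e6 * flops_per_clock / 1e9

per_gpu = gflops(128, 1500)
print(per_gpu, 2 * per_gpu)  # -> 384.0 768.0, i.e. the "2x 384.0 GFLOPS" above
```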

    Benchmarks

    PassMark G3D Mark: 798
    PassMark G2D Mark: 518
    GFXBench 4.0 T-Rex: 3238 frames (3238.000 FPS); top GPU: 69225 frames

    Features

    Architecture: Tesla
    Codename: G92
    Production date: March 18, 2008
    Price at first issue date: $599
    Place in the ranking: 382
    Type: Desktop
    Shader clock: 1500 MHz
    Number of CUDA pipelines: 256 (128 per GPU)
    Floating-point performance: 2× 384.0 GFLOPS
    Process: 65 nm
    Maximum temperature: 105 °C
    Number of shaders: 2× 128
    Texturing speed: 76.8 billion texels/sec
    Power consumption (TDP): 197 W
    Number of transistors: 754 million
    Audio input for HDMI: S/PDIF
    Video connectors: 2× Dual Link DVI, 1× HDMI
    Maximum VGA resolution: 2048×1536
    Multi-monitor support: Yes
    Interface: PCIe 2.0 x16
