GeForce 8800 Ultra: A New Ultra-High-End Price Point

GeForce 8800 Ultra [in 1 benchmark]


NVIDIA
GeForce 8800 Ultra


  • Interface PCIe 1.0 x16
  • Core clock speed 612 MHz
  • Max video memory 512MB
  • Memory type GDDR3
  • Memory clock speed 1080MHz
  • Maximum resolution

Summary

NVIDIA started GeForce 8800 Ultra sales on 2 May 2007 at a recommended price of $829. This is a Tesla-architecture desktop card built on a 90 nm manufacturing process and aimed primarily at gamers. 512 MB of GDDR3 memory clocked at 1.08 GHz is supplied, and together with the 384-bit memory interface this creates a bandwidth of 103.7 GB/s.

Compatibility-wise, this is a dual-slot card attached via the PCIe 1.0 x16 interface. The manufacturer's default version is 270 mm long. Two 6-pin power connectors are required, and power consumption is 171 W.
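The 103.7 GB/s figure above follows directly from the bus width and the effective (double data rate) memory clock; a quick sanity check of the arithmetic:

```python
# Memory bandwidth = (bus width in bits / 8 bytes) * effective transfer rate.
# GDDR3 is double data rate, so the effective rate is twice the memory clock.
def memory_bandwidth_gbs(bus_width_bits: int, memory_clock_mhz: float) -> float:
    effective_gtps = 2 * memory_clock_mhz / 1000       # giga-transfers per second
    return round(bus_width_bits / 8 * effective_gtps, 2)  # gigabytes per second

print(memory_bandwidth_gbs(384, 1080))  # 103.68, the quoted 103.7 GB/s
```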

It provides poor gaming and benchmark performance, at 2.19% of the leader's, which is the NVIDIA GeForce RTX 3090 Ti.


GeForce 8800 Ultra vs GeForce RTX 3090 Ti

General info


GeForce 8800 Ultra's architecture, market segment and release date.

Place in performance rating 824
Value for money 0.07
Architecture Tesla (2006−2010)
GPU code name G80
Market segment Desktop
Release date 2 May 2007 (15 years ago)
Launch price (MSRP) $829
Current price $235 (0.3x MSRP) of $49,999 (A100 SXM4)

Value for money

To calculate the index, we weigh each card's characteristics against its relative price.

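The site does not publish its exact formula, but a performance-per-dollar index of this kind can be sketched as follows (the scaling against the best ratio in the rating is an assumption for illustration):

```python
# Hypothetical value-for-money index: performance per dollar, scaled so the
# card with the best perf-per-dollar ratio in the rating would score 100.
def value_for_money(perf_score: float, price_usd: float,
                    best_perf_per_dollar: float) -> float:
    return round(100 * (perf_score / price_usd) / best_perf_per_dollar, 2)

# A card scoring 50 points at $100, where the best ratio is 1 point per dollar:
print(value_for_money(50, 100, best_perf_per_dollar=1.0))  # 50.0
```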

Technical specs


GeForce 8800 Ultra's general performance parameters, such as number of shaders, GPU base clock, manufacturing process, and texturing and calculation speed. These parameters indirectly speak to GeForce 8800 Ultra's performance, but for a precise assessment you have to consider its benchmark and gaming test results.

Pipelines / CUDA cores 128 of 18432 (AD102)
Core clock speed 612 MHz of 2610 (Radeon RX 6500 XT)
Number of transistors 681 million of 14400 (GeForce GTX 1080 SLI Mobile)
Manufacturing process technology 90 nm of 4 (H200 PCIe)
Thermal design power (TDP) 171 Watt of 900 (Tesla S2050)
Texture fill rate 39.2 billion/sec of 939.8 (H200 SXM5)
Floating-point performance 387.1 GFLOPS of 16384 (Radeon Pro Duo)
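The headline rates in this list follow from unit counts and clocks. For the 8800 Ultra that means 64 texture units at the 612 MHz core clock, and 128 stream processors at a 1512 MHz shader clock (the texture-unit count and shader clock are well-known G80 specs, not listed above):

```python
# Texture fill rate and shader arithmetic derived from the 8800 Ultra's specs.
TEXTURE_UNITS = 64           # texture filtering units on the G80
CORE_CLOCK_MHZ = 612
STREAM_PROCESSORS = 128
SHADER_CLOCK_MHZ = 1512
FLOPS_PER_SP_PER_CLOCK = 2   # one MADD (multiply-add) per cycle

texel_rate_g = TEXTURE_UNITS * CORE_CLOCK_MHZ / 1000
gflops = STREAM_PROCESSORS * SHADER_CLOCK_MHZ * FLOPS_PER_SP_PER_CLOCK / 1000

print(texel_rate_g)  # 39.168 -> the quoted 39.2 billion/sec
print(gflops)        # 387.072 -> the quoted 387.1 GFLOPS
```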

Compatibility, dimensions and requirements


Information on GeForce 8800 Ultra's compatibility with other computer components. Useful when choosing a future computer configuration or upgrading an existing one. For desktop video cards these are the interface and bus (motherboard compatibility) and additional power connectors (power supply compatibility).

Interface PCIe 1.0 x16
Length 270 mm
Width 2-slot
Supplementary power connectors 2x 6-pin
SLI options +

Memory


Parameters of memory installed on GeForce 8800 Ultra: its type, size, bus, clock and resulting bandwidth. Note that GPUs integrated into processors don’t have dedicated memory and use a shared part of system RAM.

Memory type GDDR3
Maximum RAM amount 512 MB of 128 GB (Radeon Instinct MI250X)
Memory bus width 384 Bit of 8192 (Radeon Instinct MI250X)
Memory clock speed 1080 MHz of 21000 (GeForce RTX 3090 Ti)
Memory bandwidth 103.7 GB/s of 14400 (Radeon R7 M260)

Video outputs and ports


Types and number of video connectors present on GeForce 8800 Ultra. As a rule, this section is relevant only for desktop reference video cards, since for notebook ones the availability of certain video outputs depends on the laptop model.

Display Connectors 2x DVI, 1x S-Video

API support


APIs supported by GeForce 8800 Ultra, sometimes including their particular versions.

DirectX 11.1 (10_0)
Shader Model 4.0
OpenGL 3.3 of 4.6 (GeForce GTX 1080 Mobile)
OpenCL 1.1
Vulkan N/A
CUDA +

Benchmark performance


Non-gaming benchmark performance of GeForce 8800 Ultra. Note that overall benchmark performance is measured in points on a 0-100 scale.


Overall score

This is our combined benchmark performance rating. We are regularly improving our combining algorithms, but if you notice any inconsistencies, feel free to speak up in the comments section; we usually fix problems quickly.


8800 Ultra
2.19

Passmark

This is probably the most ubiquitous benchmark, part of the Passmark PerformanceTest suite. It gives the graphics card a thorough evaluation under various loads, providing four separate benchmarks for Direct3D versions 9, 10, 11 and 12 (the last run at 4K resolution where possible), and a few more tests engaging DirectCompute capabilities.

Benchmark coverage: 26%


8800 Ultra
642


Game benchmarks


Let's see how good GeForce 8800 Ultra is for gaming. Individual gaming benchmark results are measured in frames per second. Comparisons with game system requirements are included, but remember that official requirements may not always reflect reality accurately.

Average FPS
Popular games

Relative performance


Overall GeForce 8800 Ultra performance compared to nearest competitors among desktop video cards.



NVIDIA GeForce 720A
102.74


ATI Radeon HD 2900 XT
102.28


AMD FireStream 9170
100.46


NVIDIA GeForce 8800 Ultra
100


NVIDIA GeForce GT 710
99.09


ATI Radeon HD 2900 PRO
97.72


NVIDIA GeForce 810A
96.8
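The list above is normalized so that the reviewed card defines 100 points; each competitor's raw score is scaled against it. A minimal sketch, with hypothetical raw scores:

```python
# Normalize raw benchmark scores so a chosen reference card equals 100.
def normalize_to_reference(reference_score: float, scores: dict) -> dict:
    return {name: round(100 * s / reference_score, 2)
            for name, s in scores.items()}

raw = {"GeForce 8800 Ultra": 642.0, "GeForce GT 710": 636.2}  # hypothetical
print(normalize_to_reference(raw["GeForce 8800 Ultra"], raw))
```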

AMD equivalent


We believe that the nearest equivalent to GeForce 8800 Ultra from AMD is FireStream 9170, which is nearly equal in speed and higher by 3 positions in our rating.


FireStream
9170




Here are some closest AMD rivals to GeForce 8800 Ultra:


AMD Radeon R8 M535DX
104.11


ATI Radeon HD 2900 XT
102.28


AMD FireStream 9170
100.46


NVIDIA GeForce 8800 Ultra
100


ATI Radeon HD 2900 PRO
97.72


AMD Radeon HD 7570
94.98


AMD Radeon R5 A240
89.5

Similar GPUs

Here is our recommendation of several graphics cards that are more or less close in performance to the one reviewed.


FireStream
9170




Radeon HD
2900 PRO




Radeon HD
2900 XT




GeForce GTS
250




GeForce
8800 GTX




GeForce
8800 GTS 512



Recommended processors

These processors are most commonly used with GeForce 8800 Ultra according to our statistics.


Xeon
X5492

17.6%


Core 2
Extreme QX9770

5.9%


Core i3
2328M

5.9%


Core i5
2300

5.9%


Core 2
Quad Q9650

5.9%


Pentium 4
HT 531

5.9%


Core 2
Quad Q9550

5.9%


Core 2
Quad Q8400

5.9%


E1
6010

5.9%


Core i7
10750H

5.9%


GeForce 8800 GTX vs GeForce GTX 275 Graphics Cards Comparison

Find out if it is worth upgrading your current GPU setup by comparing GeForce 8800 GTX and GeForce GTX 275. Here you can take a closer look at graphics card specs, such as core clock speed, memory type and size, display connectors, etc. Price, overall benchmark scores and gaming performance are usually the defining factors when choosing between GeForce 8800 GTX and GeForce GTX 275. Make sure the graphics card has compatible dimensions and will fit properly in your new or current computer case. These graphics cards may also have different system power recommendations, so take that into consideration and upgrade your PSU if necessary.

GeForce 8800 GTX

GeForce GTX 275


Main Specs

  GeForce 8800 GTX GeForce GTX 275
Power consumption (TDP) 155 Watt 219 Watt
Interface PCIe 1.0 x16 PCIe 2.0 x16
Supplementary power connectors 2x 6-pin 2x 6-pin
Memory type GDDR3 GDDR3
Maximum RAM amount 768 MB 896 MB
Display Connectors 2x DVI, 1x S-Video 2x DVI
 


  • GeForce GTX 275 has 41% higher power consumption than GeForce 8800 GTX.
  • GeForce 8800 GTX connects via PCIe 1.0 x16, while GeForce GTX 275 uses a PCIe 2.0 x16 interface.
  • GeForce GTX 275 has 128 MB more memory than GeForce 8800 GTX.
  • Both cards are used in desktops.
  • GeForce 8800 GTX is built on the Tesla architecture, GeForce GTX 275 on Tesla 2.0.
  • The core clock speed of GeForce GTX 275 is 828 MHz higher than that of GeForce 8800 GTX.
  • GeForce 8800 GTX is manufactured on a 90 nm process, GeForce GTX 275 on a 55 nm process.
  • GeForce 8800 GTX is 3 mm longer than GeForce GTX 275.
  • The memory clock speed of GeForce GTX 275 is 234 MHz higher than that of GeForce 8800 GTX.
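The bullet points above are simple ratios and differences taken from the spec table; for example:

```python
# How much larger a is than b, as a rounded percentage.
def percent_more(a: float, b: float) -> int:
    return round(100 * (a - b) / b)

print(percent_more(219, 155))  # TDP: GTX 275 draws 41% more than 8800 GTX
print(896 - 768)               # memory: 128 MB (megabytes) more
print(1404 - 576)              # clock speed: 828 MHz higher
```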

Game benchmarks

Where two figures appear in a row, they are the GeForce 8800 GTX and GeForce GTX 275 results respectively, in frames per second.

Assassin's Creed Odyssey
high / 1080p 0−1 6−7
ultra / 1080p 4−5
QHD / 1440p 0−1 0−1
low / 720p 1−2 16−18
medium / 1080p 0−1 8−9
The average gaming FPS of GeForce GTX 275 in Assassin's Creed Odyssey is 1600% higher than GeForce 8800 GTX's.

Battlefield 5
high / 1080p 10−12
ultra / 1080p 10−11
QHD / 1440p 0−1 0−1
low / 720p 0−1 24−27
medium / 1080p 12−14

Call of Duty: Warzone
low / 768p 50−55 45−50
high / 1080p 45−50 45−50
QHD / 1440p 0−1 0−1
The average gaming FPS of GeForce 8800 GTX in Call of Duty: Warzone is 6% higher than GeForce GTX 275's.

Counter-Strike: Global Offensive
low / 768p 60−65 130−140
medium / 768p 27−30 100−110
ultra / 1080p 7−8 50−55
QHD / 1440p 27−30
4K / 2160p 27−30
high / 768p 16−18 75−80
The average gaming FPS of GeForce GTX 275 in Counter-Strike: Global Offensive is 228% higher than GeForce 8800 GTX's.

Cyberpunk 2077
low / 768p 70−75 55−60
ultra / 1080p 0−1 18−20
medium / 1080p 45−50 45−50
The average gaming FPS of GeForce 8800 GTX in Cyberpunk 2077 is 15% higher than GeForce GTX 275's.

Dota 2
low / 768p 45−50 75−80
medium / 768p 10−11 55−60
ultra / 1080p 0−1 30−35
The average gaming FPS of GeForce GTX 275 in Dota 2 is 131% higher than GeForce 8800 GTX's.

Far Cry 5
high / 1080p 8−9
ultra / 1080p 7−8
4K / 2160p 3−4
low / 720p 0−1 18−20
medium / 1080p 8−9

Fortnite
high / 1080p 14−16
ultra / 1080p 10−11
low / 720p 21−24 60−65
medium / 1080p 0−1 21−24
The average gaming FPS of GeForce GTX 275 in Fortnite is 181% higher than GeForce 8800 GTX's.

Forza Horizon 4
high / 1080p 0−1 12−14
ultra / 1080p 10−12
QHD / 1440p 0−1 1−2
low / 720p 0−1 24−27
medium / 1080p 0−1 12−14

Grand Theft Auto V
low / 768p 18−20 50−55
medium / 768p 45−50
high / 1080p 0−1 12−14
ultra / 1080p 6−7
QHD / 1440p 0−1 0−1
medium / 720p 12−14
The average gaming FPS of GeForce GTX 275 in Grand Theft Auto V is 173% higher than GeForce 8800 GTX's.

Metro Exodus
high / 1080p 4−5
ultra / 1080p 3−4
4K / 2160p 0−1
low / 720p 0−1 12−14
medium / 1080p 6−7

Minecraft
low / 768p 75−80 90−95
high / 1080p 27−30 85−90
ultra / 1080p 80−85
medium / 1080p 90−95
The average gaming FPS of GeForce GTX 275 in Minecraft is 69% higher than GeForce 8800 GTX's.

PLAYERUNKNOWN'S BATTLEGROUNDS
high / 1080p 16−18
ultra / 1080p 14−16
low / 720p 8−9 30−35
medium / 1080p 18−20
The average gaming FPS of GeForce GTX 275 in PLAYERUNKNOWN'S BATTLEGROUNDS is 300% higher than GeForce 8800 GTX's.

Red Dead Redemption 2
ultra / 1080p 7−8
QHD / 1440p 0−1
low / 720p 0−1 10−12
medium / 1080p 10−12

The Witcher 3: Wild Hunt
low / 768p 0−1 24−27
medium / 768p 16−18
high / 1080p 9−10
ultra / 1080p 6−7

World of Tanks
low / 768p 45−50 85−90
medium / 768p 14−16 40−45
ultra / 1080p 0−1 18−20
high / 768p 12−14 35−40
The average gaming FPS of GeForce GTX 275 in World of Tanks is 120% higher than GeForce 8800 GTX's.

Full Specs

  GeForce 8800 GTX GeForce GTX 275
Architecture Tesla Tesla 2.0
Code name G80 GT200B
Type Desktop Desktop
Release date 8 November 2006 15 January 2009
Pipelines 128 240
Core clock speed 576 MHz 1404 MHz
Transistor count 681 million 1,400 million
Manufacturing process technology 90 nm 55 nm
Texture fill rate 36.8 billion/sec 50.6 billion/sec
Floating-point performance 345.6 GFLOPS 673.9 GFLOPS
Length 270 mm 267 mm (10.5″)
Memory bus width 384 Bit 448 Bit
Memory clock speed 900 MHz 1134 MHz
Memory bandwidth 86.4 GB/s 127.0 GB/s
DirectX 11.1 (10_0) 11.1 (10_0)
Shader Model 4.0 4.0
OpenGL 3.3 3.0
OpenCL 1.1 1.1
Vulkan N/A N/A
CUDA + +
CUDA cores 128 240
Bus support PCI-E 2.0
Height 4.376″ (111 mm) (11.1 cm)
SLI options + +
Multi monitor support +
Maximum VGA resolution 2048×1536
Audio input for HDMI S/PDIF
 


Similar compares

  • GeForce 8800 GTX vs Radeon HD 8730M
  • GeForce 8800 GTX vs GeForce GT 640M LE
  • GeForce GTX 275 vs Radeon HD 8730M
  • GeForce GTX 275 vs GeForce GT 640M LE
  • GeForce 8800 GTX vs GeForce 8800M GTX SLI
  • GeForce 8800 GTX vs Mobility Radeon HD 5850
  • GeForce GTX 275 vs GeForce 8800M GTX SLI
  • GeForce GTX 275 vs Mobility Radeon HD 5850

Nvidia’s GeForce 8800 GT graphics processor

This is an absolutely spectacular time to be a PC gamer. The slew of top-notch and hotly anticipated games hitting store shelves is practically unprecedented, including BioShock, Crysis, Quake Wars, Unreal Tournament 3, and Valve's Orange Box trio of goodness. I can't remember a time quite like it.

However, this may not be the best time to own a dated graphics card. The latest generation of high-end graphics cards brought with it pretty much twice the performance of previous high-end cards, and to add insult to injury, these GPUs added DirectX 10-class features that today’s games are starting to exploit. If you have last year’s best, such as a GeForce 7900 or Radeon X1900, you may not be able to drink in all the eye candy of the latest games at reasonable frame rates.

And if you’ve played the Crysis demo, you’re probably really ready to upgrade. I’ve never seen a prettier low-res slide show.

Fortunately, DirectX 10-class graphics power is getting a whole lot cheaper, starting today. Nvidia has cooked up a new spin of its GeForce 8 GPU architecture, and the first graphics card based on this chip sets a new standard for price and performance. Could the GeForce 8800 GT be the solution to your video card, er, Crysis? Let’s have a look.

Meet the G92

In recent years, graphics processor transistor budgets have been ballooning at a rate even faster than Moore’s Law, and that has led to some, um, exquisitely plus-sized chips. This fall’s new crop of GPUs looks to be something of a corrective to that trend, and the G92 is a case in point. This chip is essentially a die shrink of the G80 graphics processor that powers incumbent GeForce 8800 graphics cards. The G92 adds some nice new capabilities, but doesn’t double up on shader power or anything quite that earth-shaking.

Here’s an extreme close-up of the G92, which may convince your boss/wife that you’re reading something educational and technically edifying right about now. We’ve pictured it next to a U.S. quarter in order to further propagate the American hegemonic mindset. Er, I mean, to provide some context, size-wise. The G92 measures almost exactly 18 mm by 18 mm, or 324 mm². TSMC manufactures the chip for Nvidia on a 65nm fab process, which somewhat miraculously manages to shoehorn roughly 754 million transistors into this space. By way of comparison, the much larger G80—made on a 90nm process—had only 681 million transistors. AMD’s R600 GPU packs 700 million transistors into a 420 mm² die area.
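Those figures imply a noticeable jump in transistor density; a quick back-of-the-envelope comparison (the G80's roughly 484 mm² die size is a commonly cited figure, not stated above):

```python
# Millions of transistors per square millimeter, from die size and count.
def density(m_transistors: int, die_area_mm2: float) -> float:
    return round(m_transistors / die_area_mm2, 2)

print(density(754, 18 * 18))  # G92 at 65 nm: 2.33 Mtransistors/mm^2
print(density(681, 484))      # G80 at 90 nm (assumed ~484 mm^2): 1.41
print(density(700, 420))      # R600: 1.67
```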

Why, you may be asking, does the G92 have so many more transistors than the G80? Good question. The answer is: a great many little additions here and there, including some we may not know about just yet.

One big change is the integration of the external display chip that acted as a helper to the G80. The G92 natively supports twin dual-link DVI outputs with HDCP, without the need for a separate display chip. That ought to make G92-based video cards cheaper and easier to make. Another change is the inclusion of the VP2 processing engine for high-definition video decoding and playback, an innovation first introduced in the G84 GPU behind the GeForce 8600 lineup. The VP2 engine can handle the most intensive portions of H.264 video decoding in hardware, offloading that burden from the CPU.

Both of those capabilities are pulled in from other chips, but here’s a novel one: PCI Express 2.0 support. PCIe 2.0 effectively doubles the bandwidth available for communication between the graphics card and the rest of the system, and the G92 is Nvidia’s first chip to support this standard. This may be the least-hyped graphics interface upgrade in years, in part because PCIe 1.1 offers quite a bit of bandwidth already. Still, PCIe 2.0 is a major evolutionary step, though I doubt it chews up too many additional transistors.

So where else do the G92’s additional transistors come from? This is where things start to get a little hazy. You see, the GeForce 8800 GT doesn’t look to be a “full” implementation of G92. Although this chip has the same basic GeForce 8-series architecture as its predecessors, the GeForce 8800 GT officially has 112 stream processors, or SPs. That’s seven “clusters” of 16 SPs each. Chip designers don’t tend to do things in odd numbers, so I’d wager an awful lot of Nvidia stock that the G92 actually has at least eight SP clusters onboard.

Eight’s probably the limit, though, because the G92’s SP clusters are “fatter” than the G80’s; they incorporate the G84’s more robust texture addressing capacity of eight addresses per clock, up from four in the G80. That means the GeForce 8800 GT, with its seven SP clusters, can sample a total of 56 texels per clock—well beyond the 24 of the 8800 GTS and 32 of the 8800 GTX. We’ll look at the implications of this change in more detail in a sec.
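That cluster math works out as follows (cluster counts for the G80-based cards come from the surrounding paragraphs):

```python
# Texels sampled per clock = SP clusters * texture addresses per cluster.
def texels_per_clock(clusters: int, addresses_per_cluster: int) -> int:
    return clusters * addresses_per_cluster

print(texels_per_clock(7, 8))  # 8800 GT (G92, 8 addresses/cluster): 56
print(texels_per_clock(8, 4))  # 8800 GTX (G80, 8 clusters of 4): 32
print(texels_per_clock(6, 4))  # 8800 GTS (G80, 6 clusters of 4): 24
```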

Another area where the GeForce 8800 GT may be sporting a bit of trimmed down G92 functionality is in the ROP partitions. These sexy little units are responsible for turning fully processed and shaded fragments into full-blown pixels. They also provide much of the chip’s antialiasing grunt, and in Nvidia’s GeForce 8 architecture, each ROP has a 64-bit interface to video memory. The G80 packs six ROP partitions, which is why the full-blown GeForce 8800 GTX has a 384-bit path to memory and the sawed-off 8800 GTS (with five ROP partitions) has a 320-bit memory interface. We don’t know how many ROP partitions the G92 has lurking inside, but the 8800 GT uses only four of them. As a result, it has a 256-bit memory interface, can output a maximum of 16 finished pixels per clock, and has somewhat less antialiasing grunt on a clock-for-clock basis.

How many ROPs does G92 really have? I dunno. I suspect we’ll find out before too long, though.
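Since each GeForce 8 ROP partition carries a 64-bit memory channel and, per the figures above, outputs four pixels per clock, the bus widths and pixel rates fall straight out of the partition count:

```python
# Each GeForce 8 ROP partition = one 64-bit memory channel + 4 pixels/clock.
def bus_width_bits(rop_partitions: int) -> int:
    return rop_partitions * 64

def pixels_per_clock(rop_partitions: int) -> int:
    return rop_partitions * 4

print(bus_width_bits(6), pixels_per_clock(6))  # 8800 GTX: 384-bit, 24 px/clk
print(bus_width_bits(5), pixels_per_clock(5))  # 8800 GTS: 320-bit, 20 px/clk
print(bus_width_bits(4), pixels_per_clock(4))  # 8800 GT:  256-bit, 16 px/clk
```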

The 8800 GT up close

What the 8800 GT lacks in functional units, it largely makes up in clock speed. The 8800 GT’s official core clock speed is 600MHz, and its 112 SPs run at 1.5GHz. The card’s 512MB of GDDR3 memory runs at 900MHz—or 1.8GHz effective, thanks to the memory’s doubled data rate.

MSI’s NX8800GT

Here’s a look at MSI’s rendition of the GeForce 8800 GT. Note the distinctive MSI decal. This card is further differentiated in a way that really matters: it comes hot from the factory, with a 660MHz core clock and 950MHz memory. This sort of “overclocking” has become so common among Nvidia’s board partners, it’s pretty much expected at this point. MSI doesn’t disappoint.

I don’t want to give too much away, since we’ve measured noise levels on a decibel meter, but you’ll be pleased to know that the 8800 GT’s single-slot cooler follows in the tradition of Nvidia’s coolers for its other GeForce 8800 cards. The thing is whisper-quiet.

The sight of a single-slot cooler may be your first hint that this is not the sort of video card that will put an ugly dent in your credit rating. Here’s another hint at the 8800 GT’s mainstream aspirations. Nvidia rates the power consumption of the 8800 GT at 110W, which makes the single-slot cooler feasible and also means the 8800 GT needs just one auxiliary PCIe power connector, of the six-pin variety, in order to do its thing.

The 8800 GT sports a single six-pin PCIe aux power connector

Another place where the 8800 GT sports only one connector is in the SLI department. That probably means the 8800 GT won’t be capable of ganging up with three or four of its peers in a mega-multi-GPU config. Two-way SLI is probably the practical limit for this card.

Here’s the kicker, though. 8800 GT cards are slated to become available today for between $199 and $249.

Doing the math

So that’s a nice price, right? Well, like so many things in life—and I sure as heck didn’t believe this in high school—it all boils down to math. If you take the 8800 GT’s seven SP clusters and 112 SPs and throw them into the blender with a 1.5GHz shader clock, a 256-bit memory interface, along with various herbs and spices, this is what comes out:

Card                Peak pixel   Peak texel     Peak bilinear    Peak bilinear FP16  Peak memory  Peak shader
                    fill rate    sampling rate  texel filtering  texel filtering     bandwidth    arithmetic
                    (Gpixels/s)  (Gtexels/s)    (Gtexels/s)      (Gtexels/s)         (GB/s)       (GFLOPS)
GeForce 8800 GT     9.6          33.6           33.6             16.8                57.6         504
GeForce 8800 GTS    10.0         12.0           12.0             12.0                64.0         346
GeForce 8800 GTX    13.8         18.4           18.4             18.4                86.4         518
GeForce 8800 Ultra  14.7         19.6           19.6             19.6                103.7        576
Radeon HD 2900 XT   11.9         23.8           11.9             11.9                105.6        475

In terms of texture sampling rates, texture filtering capacity, and shader arithmetic, the 8800 GT is actually superior to the 8800 GTS. It’s also quicker than the Radeon HD 2900 XT in most of those categories, although our FLOPS estimate for the GeForce GPUs is potentially a little rosy—another way of counting would reduce those numbers by a third, making the Radeon look relatively stronger. Also, thanks to its higher clock speed, the 8800 GT doesn’t suffer much in terms of pixel fill rate (and corresponding AA grunt) due to its smaller ROP count. The 8800 GT’s most noteworthy numbers may be its texture sampling and filtering rates. Since its SPs can grab twice as many texels per clock as the G80’s, its texture filtering performance with standard 8-bit integer color formats could be more than double that of the 8800 GTS.

Performance-wise in graphics, math like this isn’t quite destiny, but it’s close. The only place where the 8800 GT really trails the 8800 GTS or the 2900 XT is in memory bandwidth. And, believe it or not, memory bandwidth is arguably at less of a premium these days, since games produce “richer” pixels that spend more time looping through shader programs and thus occupying on-chip storage like registers and caches.

Bottom line: the 8800 GT should generally be as good as or better than the 8800 GTS, for under 250 bucks. Let’s test that theory.
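As a check, the 8800 GT row in the table above can be recomputed from the specs discussed earlier (stock clocks, not MSI's factory overclock):

```python
# Peak theoretical rates for the GeForce 8800 GT from its stock specs.
CORE_MHZ, SHADER_MHZ = 600, 1500
ROPS, TEXELS_PER_CLOCK, STREAM_PROCESSORS = 16, 56, 112
BUS_BITS, EFFECTIVE_MEM_MHZ = 256, 1800  # 900 MHz GDDR3, double data rate

pixel_fill = ROPS * CORE_MHZ / 1000                    # Gpixels/s
texel_rate = TEXELS_PER_CLOCK * CORE_MHZ / 1000        # Gtexels/s
bandwidth = BUS_BITS / 8 * EFFECTIVE_MEM_MHZ / 1000    # GB/s
gflops = STREAM_PROCESSORS * SHADER_MHZ * 3 / 1000     # 3 flops/SP/clock;
                                                       # counting 2 gives 336

print(pixel_fill, texel_rate, bandwidth, gflops)  # 9.6 33.6 57.6 504.0
```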

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and the results were averaged.

Our test systems were configured like so:

Processor: Core 2 Extreme X6800 2.93GHz
System bus: 1066MHz (266MHz quad-pumped)
Motherboard: XFX nForce 680i SLI
BIOS revision: P31
North bridge: nForce 680i SLI SPP
South bridge: nForce 680i SLI MCP
Chipset drivers: ForceWare 15.08
Memory size: 4GB (4 DIMMs)
Memory type: 2 x Corsair TWIN2X20488500C5D DDR2 SDRAM at 800MHz
CAS latency (CL): 4
RAS to CAS delay (tRCD): 4
RAS precharge (tRP): 4
Cycle time (tRAS): 18
Command rate: 2T
Audio: Integrated nForce 680i SLI/ALC850 with RealTek 6.0.1.5497 drivers
Graphics:
  • GeForce 8800 GT 512MB PCIe with ForceWare 169.01 drivers
  • XFX GeForce 8800 GTS XXX 320MB PCIe with ForceWare 169.01 drivers
  • EVGA GeForce 8800 GTS OC 640MB PCIe with ForceWare 169.01 drivers
  • Radeon HD 2900 XT 512MB PCIe with Catalyst 7.10 drivers
Hard drive: WD Caviar SE16 320GB SATA
OS: Windows Vista Ultimate x86 Edition
OS updates: KB36710, KB938194, KB938979, KB940105, DirectX August 2007 Update

Please note that we’re using “overclocked in the box” versions of the 8800 GTS 320MB and 640MB, while we’re testing a stock-clocked GeForce 8800 GT reference card from Nvidia.

Thanks to Corsair for providing us with memory for our testing. Their quality, service, and support are easily superior to no-name DIMMs.

Our test systems were powered by PC Power & Cooling Silencer 750W power supply units. The Silencer 750W was a runaway Editor’s Choice winner in our epic 11-way power supply roundup, so it seemed like a fitting choice for our test rigs. Thanks to OCZ for providing these units for our use in testing.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

  • Crysis demo
  • Unreal Tournament 3 demo
  • Team Fortress 2
  • BioShock 1.0 with DirectX 10
  • Lost Planet: Extreme Condition with DirectX 10
  • FRAPS 2.9.2

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Crysis demo

The Crysis demo is still fresh from the oven, but we were able to test the 8800 GT in it. Crytek has included a GPU benchmarking facility with the demo that consists of a fly-through of the island in which the opening level of the game is set, and we used it. For this test, we set all of the game’s quality options at “high” (not “very high”) and set the display resolution to—believe it or not—1280×800 with 4X antialiasing.

Even at this low res, these relatively beefy graphics cards chugged along. The game looks absolutely stunning, but obviously it’s using a tremendous amount of GPU power in order to achieve the look.

The demo is marginally playable at these settings, but I’d prefer to turn antialiasing off in order to get smoother frame rates on the 8800 GT. That’s what I did when I played through the demo, in fact.

Notice several things about our results. Although the 8800 GT keeps up with the 8800 GTS 640MB in terms of average frame rates, it hit lower lows of around 10 FPS, probably due to its lesser memory bandwidth or its smaller amount of total RAM onboard. Speaking of memory, the card for which the 8800 GT is arguably a replacement, the 320MB version of the GTS, stumbles badly here. This is why we were lukewarm on the GTS 320MB when it first arrived. Lots of GPU power isn’t worth much if you don’t have enough video memory. GTS 320MB owners will probably have to drop to “medium” quality in order to run Crysis smoothly.

Unreal Tournament 3 demo

We tested the UT3 demo by playing a deathmatch against some bots and recording frame rates during 60-second gameplay sessions using FRAPS. This method has the advantage of duplicating real gameplay, but it comes at the expense of precise repeatability. We believe five sample sessions are sufficient to get reasonably consistent and trustworthy results. In addition to average frame rates, we've included the low frame rates, because those tend to reflect the user experience in performance-critical situations. In order to diminish the effect of outliers, we've reported the median of the five low frame rates we encountered.
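In other words, each card's reported "low" is computed like this (the per-session minimums here are hypothetical):

```python
# Median of the per-session minimum frame rates across five FRAPS runs.
from statistics import median

session_lows = [22, 25, 19, 24, 23]  # min FPS in each 60-second session
print(median(session_lows))  # 23
```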

Because the Unreal engine doesn’t support multisampled antialiasing, we tested without AA. Instead, we just cranked up the resolution to 2560×1600 and turned up the demo’s quality sliders to the max. I also disabled the demo’s 62 FPS frame rate cap before testing.

All of these cards can play the UT3 demo reasonably well at this resolution, the 8800 GT included. I noticed some brief slowdowns on the GTS 320MB right as I started the game, but those seemed to clear up after a few seconds.

Team Fortress 2

For TF2, I cranked up all of the game’s quality options, set anisotropic filtering to 16X, and used 4X multisampled antialiasing at 2560×1600 resolution. I then hopped onto a server with 24 players duking it out on the “ctf_2fort” map. I recorded a demo of me playing as a soldier, somewhat unsuccessfully, and then used the Source engine’s timedemo function to play the demo back and report performance.

The 8800 GT leads all contenders in TF2. Even at 2560×1600 with 4X AA and 16X aniso, TF2 is perfectly playable with this card, although that didn’t help my poor soldier guy much.

BioShock

We tested this game with FRAPS, just like we did the UT3 demo. BioShock’s default settings in DirectX 10 are already very high quality, so we didn’t tinker with them much. We just set the display res to 2560×1600 and went to town. In this case, I was trying to take down a Big Daddy, another generally unsuccessful effort.

A low of 23 FPS for the 8800 GT puts it right on the edge of smooth playability. The 8800 GT pretty much outclasses the Radeon HD 2900 XT here, amazingly enough. The 2900 XT couldn’t quite muster a playable frame rate at these settings, which my seat-of-the-pants impression confirmed during testing.

Lost Planet: Extreme Condition

Here’s another DX10 game. We ran this game in DirectX 10 mode at 1920×1200 with all of its quality options maxed out, plus 4X AA and 16X anisotropic filtering. We used the game’s built-in performance test, which tests two very different levels in the game, a snowy outdoor setting and a cave teeming with flying doodads.

Here’s another case where the 8800 GTS 320MB stumbles, while the 8800 GT does not. Although the Radeon HD 2900 XT lists for $399, it looks like an also-ran in most of our tests.

Power consumption

We measured total system power consumption at the wall socket using an Extech power analyzer model 380803. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement.

The idle measurements were taken at the Windows Vista desktop with the Aero theme enabled. The cards were tested under load running BioShock in DirectX 10 at 2560×1600 resolution, using the same settings we did for performance testing.

Nvidia has done a nice job with the G92’s power consumption. Our 8800 GT-based test system draws over 20 fewer watts at idle than any of the others tested. Under load, the story is similar. Mash up these numbers with the performance results, and you get a very compelling power efficiency picture.

Noise levels

We measured noise levels on our test systems, sitting on an open test bench, using an Extech model 407727 digital sound level meter. The meter was mounted on a tripod approximately 14″ from the test system at a height even with the top of the video card. We used the OSHA-standard weighting and speed for these measurements.

You can think of these noise level measurements much like our system power consumption tests, because the entire systems’ noise levels were measured, including the stock Intel cooler we used to cool the CPU. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

Nvidia's single-slot coolers have too often been gratuitously small and noisy in the past year or two, but the 8800 GT is different. This may be the quietest single-slot cooler we've ever tested (save for the passive ones), and it doesn't grow audibly louder under load. That's a pleasant surprise, since the thing can get very loud during its initial spin-up at boot time. Fortunately, it never visited that territory for us when running games.

Conclusions

You’ve seen the results for yourself, so you pretty much know what I’m going to say. The 8800 GT does a very convincing imitation of the GeForce 8800 GTS 640MB when running the latest games, even at high resolutions and quality settings, with antialiasing and high-quality texture filtering. Its G92 GPU has all of the GeForce 8800 goodness we’ve come to appreciate in the past year or so, including DX10 support, coverage-sampled antialiasing, and top-notch overall image quality. The card is quiet and draws relatively little power compared to its competitors, and it will only occupy a single slot in your PC. That’s a stunning total package, sort of what it would be like if Jessica Biel had a brain.

With pricing between $199 and $249, I find it hard to recommend anything else—especially since we found generally playable settings at 2560×1600 resolution in some of the most intensive new games (except for Crysis, which is in a class by itself). I expect we may see some more G92-based products popping up in the coming weeks or months, but for most folks, this will be the version to have.

The one potential fly in the ointment for the 8800 GT is its upcoming competition from AMD. As we were preparing this review, the folks from AMD contacted us to let us know that the RV670 GPU is coming soon, and that they expect it to bring big increases in performance and power efficiency along with it. In fact, the AMD folks sound downright confident they’ll have the best offering at this price point when the dust settles, and they point to several firsts they’ll be offering as evidence. With RV670, they expect to be the first to deliver a GPU fabbed on a 55nm process, the first to offer a graphics processor compliant with the DirectX 10.1 spec, and the first to support four-way multi-GPU configs in Windows Vista. DirectX 10.1 is a particular point of emphasis for AMD, because it allows for some nifty things like fast execution of global illumination algorithms and direct developer control of antialiasing sample patterns. Those enhancements, of course, will be pretty much academic if RV670-based cards don’t provide as compelling a fundamental mix of performance, image quality, and power efficiency as the GeForce 8800 GT. We’ll know whether they’ve achieved that very soon.

This concludes our first look at the 8800 GT, but it’s not the end of our evaluation process. I’ve been knee-deep in CPUs over the past month or so, culminating today with our review of the 45nm Core 2 Extreme QX9650 processor, and that’s kept me from spending all of the time with the 8800 GT that I’d like. Over the next week or so, I’ll be delving into multi-GPU performance, some image quality issues, HD video playback, more games, and more scaling tests. We may have yet another new video card for you shortly, too.

XFX Triple SLI — 8800 Ultras in 3-Way SLI

Introduction

 

SLI has been around for a few years, and Nvidia have pretty much been the driving force behind multi-GPU solutions since its inception. Last year they tried to ramp things up a notch with Quad SLI, but this failed to impress enthusiasts, review websites and pretty much the whole industry.

 

Now they’ve brought in another crazy solution: Triple SLI, or 3 Way SLI.

 

What you need for 3-way SLI, according to Nvidia, is:


3-way NVIDIA SLI-Ready GPUs:
NVIDIA GeForce 8800 Ultra
NVIDIA GeForce 8800 GTX

3-way NVIDIA SLI-Ready MCPs:
NVIDIA nForce 780i SLI for INTEL
NVIDIA nForce 680i SLI for INTEL

3-way NVIDIA SLI-Ready Power Supplies:
Please visit the SLI Zone Certified SLI-ready Power Supply website and choose a power supply model from the section “For Three GeForce 8800 Ultra or GeForce 8800 GTX.”

3-way NVIDIA SLI Cases:
Please visit the SLI Zone Certified SLI-ready Cases website and choose a case from the section “For Three GeForce 8800 Ultra or GeForce 8800 GTX.”

3-way NVIDIA SLI Connector:
3-way SLI requires a unique SLI connector in order to operate properly. These connectors may not have been included with your previous purchase of SLI-ready components or PCs. PCs specifically sold as 3-way SLI PCs will have this connector included and preinstalled.

 

SLI…so what is it?

 

SLI stands for Scalable Link Interface, Nvidia’s marketing name for a way of using two or more graphics processors in parallel. Using both the PCI Express bus and Nvidia’s proprietary SLI connector, the graphics cards communicate via dedicated scaling logic in each GPU. Load-balancing, pixel and display data are passed between the GPUs over the PCIe bus and the SLI connector; essentially, the cards share the workload.

 

SLI isn’t perfect, but it is improving as Nvidia revise and re-work their drivers and algorithms to get the best out of it. Most situations where SLI is supported see a 1.5-1.9x increase in performance, although unsupported games see no gain at all, and some even lose performance.

 

There are three different rendering, or load-balancing, modes for Tri SLI; below is an excerpt from Wikipedia with the details:

 

* Split Frame Rendering (SFR), the first rendering method. This analyzes the rendered image in order to split the workload 50/50 between the two GPUs. To do this, the frame is split horizontally in varying ratios depending on geometry. For example, in a scene where the top half of the frame is mostly empty sky, the dividing line will be lowered, balancing the geometry workload between the two GPUs. This method does not scale geometry processing or work as well as AFR, however.

* Alternate Frame Rendering (AFR), the second rendering method. Here, each GPU renders entire frames in sequence – one GPU processes even frames, and the second processes odd frames, one after the other. When the slave card finishes work on a frame (or part of a frame) the results are sent via the SLI bridge to the master card, which then outputs the completed frames. Ideally, this would result in the rendering time being cut in half, and thus performance from the video cards would double. In their advertising, NVIDIA claims up to 1.9x the performance of one card with the dual-card setup.

* SLI Antialiasing. This is a standalone rendering mode that offers up to double the antialiasing performance by splitting the antialiasing workload between the two graphics cards, offering superior image quality. One GPU performs an antialiasing pattern which is slightly offset to the usual pattern (for example, slightly up and to the right), and the second GPU uses a pattern offset by an equal amount in the opposite direction (down and to the left). Compositing both the results gives higher image quality than is normally possible. This mode is not intended for higher frame rates, and can actually lower performance, but is instead intended for games which are not GPU-bound, offering a clearer image in place of better performance. When enabled, SLI Antialiasing offers advanced antialiasing options: SLI 8X, SLI 16X, and SLI 32x (8800-series only). A Quad SLI system is capable of up to SLI 64X antialiasing.

Note that Tri SLI generally tends to use 3-GPU AFR rendering, and this certainly has the biggest performance benefit.
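The modes above can be sketched in a few lines of code. This is a hypothetical Python illustration (not Nvidia driver code) of how AFR assigns frames round-robin across GPUs, together with a simple model of the 1.5-1.9x scaling figure mentioned earlier; the 0.8 efficiency value is an assumed number for illustration:

```python
# Hypothetical sketch of Alternate Frame Rendering (AFR) scheduling:
# each GPU in an SLI chain renders every Nth frame, round-robin.
# Illustration only; not NVIDIA driver code.

def afr_schedule(num_frames, num_gpus):
    """Map each frame index to the GPU that renders it."""
    return [frame % num_gpus for frame in range(num_frames)]

# With 3-way SLI, frames 0..5 land on GPUs 0, 1, 2, 0, 1, 2:
print(afr_schedule(6, 3))  # [0, 1, 2, 0, 1, 2]

# Real games see roughly 1.5-1.9x per added card rather than perfect
# doubling; a simple efficiency model (0.8 is an assumed figure):
def effective_fps(single_gpu_fps, num_gpus, efficiency=0.8):
    return single_gpu_fps * (1 + (num_gpus - 1) * efficiency)

print(effective_fps(40, 2))  # 72.0, a 1.8x gain over one card
```

With three cards the same model gives 40 × 2.6 = 104 fps, which is why 3-GPU AFR carries the biggest benefit when a game scales at all.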

 

Triple SLI

 

Triple SLI works on the same principle as SLI, with three cards sharing the load. Unfortunately for the masses who went out and bought the excellent 8800 GT or 8800 GTS, Triple SLI supports only the 8800 GTX and the 8800 Ultra, meaning that those without these expensive top-end GPUs will not see the benefit of Tri-SLI.

 

What do we have here then?

 

XFX have kindly sent us three of their top-end 8800 Ultras to perform the review, along with their 780i SLI motherboard we reviewed previously.


Oh, and not forgetting the larger one of the two connectors in this picture:


We’ll take a brief look at what’s inside those rather large boxes, then get into the benchmarks!


NVIDIA GeForce 8800 Ultra Launch


Published on 2nd May 2007, written by Rys for Consumer Graphics — Last updated: 2nd May 2007

With AMD’s graphics group’s next
generation of Radeon products just about to break cover, NVIDIA have
decided to preempt any attack on their single-board performance
leadership by announcing GeForce 8800 Ultra today, usurping GeForce
8800 GTX as their premier graphics product.

Launching today with
availability on the 14th May, the same day that AMD will announce their
next generation line, GeForce 8800 Ultra is based on the same G80 architecture as the other 8800-series SKUs. It’s created by means of increased clocks in all three main clock domains compared to the GTX launched last November.

Click for a bigger version

GeForce 8800 Ultra

Coming
clocked at 612/1500/1080MHz (base/shader/mem), compared to 8800 GTX at
575/1350/900, 8800 Ultra differentiates itself in clocks and cooling
solution alone, sharing the same PCB and memory count as 8800 GTX.
NVIDIA have employed Samsung’s 512Mib K4J52324QE GDDR3 DRAM devices
again, but this time they’re rated to 0.8ns refresh, giving them a
rated clock of some 1200MHz. Running at 1080MHz, the memories are 20%
faster than on GTX (103.6GB/sec peak bandwidth), and with shader clock
up more than 10% and base clock up nearly 7%, NVIDIA claim an aggregate
performance increase of around 5-10% versus GTX over a range of titles
and at a range of resolutions.
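Those percentage gains follow directly from the quoted clocks; a quick sketch using the figures above:

```python
# Clock-domain deltas between 8800 Ultra and 8800 GTX,
# using the MHz figures quoted in the text.
gtx = {"base": 575, "shader": 1350, "mem": 900}
ultra = {"base": 612, "shader": 1500, "mem": 1080}

for domain in gtx:
    gain = (ultra[domain] / gtx[domain] - 1) * 100
    print(f"{domain}: +{gain:.1f}%")
# base: +6.4%, shader: +11.1%, mem: +20.0%, matching the
# "nearly 7%", "more than 10%" and "20%" claims above.
```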

Our
initial game test results confirm NVIDIA’s statements with 6-9% gains
over 8800 GTX at 2560×1600 with 4xAA/16xAF in S.T.A.L.K.E.R., Half-Life
2: Episode 1 and Oblivion. We’ll present the full spectrum of results
around the same time we’ll cover AMD’s next generation Radeon hardware.
Peak instruction rate testing shows the serial MUL in the SFU is still
unavailable for general shading, at least under Vista x64 with the
158.18 driver. Fillrates through the ROPs are scaling as expected with
base clock, and peak surface filtering rates (INT8 bilinear at least)
are ~5% higher than GTX in our tests.

NVIDIA
claim a set of process tweaks are what have allowed them to create the
Ultra, allowing them the higher clocks at the same rough power profile
as 8800 GTX. Indeed, NVIDIA state that worst-case power draw is
actually a couple of watts lower than GTX.

However, NVIDIA also state that the new power profile doesn’t apply to every die cut from the tweaked wafer starts, and they don’t say what the tweaks comprise. The net result is that it’s actually a bit of a lottery as to whether you’ll get an Ultra (or indeed a GTX or GTS, since the tweaked wafers are to be used for all 8800 production) with the new profile.

Our testing with an A3 revision 8800 Ultra shows it has substantially higher idle and peak power than an 8800 GTX in the same test system. The hardware platform for the power testing was nForce 680i SLI, Intel Core 2 Extreme X6800, 8GiB DDR2-800 and the same disk subsystem each time. The load condition consisted of 3DMark05 GT3 running at 1920×1200 with 4xAA/16xAF.

Power Testing Results

As you can see, 8800 Ultra consumes around 11% more on average in our 3DMark05 test, with the peak around 10% more. Idle power isn’t impressive compared to GTX either. Clocking the board down to GTX’s base and memory frequencies (the resulting shader clock is unknown, a limitation of the clock-adjustment tools) shows our test Ultra can’t match our GTX in the power stakes, clock-for-clock. There’s no way to tell if the G80-based product you’re buying will exhibit the new power profile in terms of consumption.

Board Physicals

Ultra
and GTX share a PCB with only minor board-level component differences.
The cooler is different however, NVIDIA moving the fan placement in the
cooler assembly to overhang the board edge, presumably to aid airflow.
Noise seems to be unchanged, our test Ultra no different to our GTX
under load conditions. If there are noise differences, we’re unable to
detect them. Neither DVI port is able to drive a dual-link receiver
with HDCP protection, and for those that care about the minutiae, the
entire Ultra product assembly is 30g heavier than a GTX.

Click for a bigger version

Click for a bigger version

Early Thoughts

Priced at €599 (~$820 / ~£470), we tentatively argue, before we run the full gamut of tests, that the performance increase offered by the Ultra certainly isn’t worth the asking price at the time of writing, given that you can find overclocked GTX offerings from NVIDIA’s AIB partners for much less money.

Factor in the power profile issue, where you’re not
guaranteed to get a G80 die that’s any better in that respect than
before, and the financial ask for the SKU on launch is too
much. However, as a clear boutique part, there’s little doubt that
NVIDIA will sell those it produces to the very few consumers that must
have the fastest no matter what it is or what it costs.

It remains to be seen if it’s enough to keep the performance crown, as we wait out the last week or so before the next generation of Radeon hardware breaks cover. We’ll pit the Ultra, GTX and GTS against the first new Radeon SKUs to see where things shake out, since we’ve not been able to spend a massive amount of time with the board to date, given NVIDIA’s sampling policy for the product.

NVIDIA GeForce 8800 Ultra review: GPU specs, performance benchmarks

Buy on Amazon

The GeForce 8800 Ultra video card was released by NVIDIA on 2 May 2007 at a launch price of $829. The card is designed for desktop computers and is based on the Tesla microarchitecture, with a GPU codenamed G80.

Core clock speed — 612 MHz. Texture fill rate — 39.2 billion/sec. Pipelines — 128. Floating-point performance — 387.1 GFLOPS. Manufacturing process technology — 90 nm. Transistor count — 681 million. Power consumption (TDP) — 171 Watt.

Memory type: GDDR3. Maximum RAM amount — 512 MB. Memory bus width — 384 Bit. Memory clock speed — 1080 MHz. Memory bandwidth — 103.7 GB/s.
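As a sanity check, the bandwidth and fill-rate figures follow from the clocks above. Here is a small Python sketch; note that the 64 texture-filtering-unit count for G80 is our assumption, not something stated in the spec list:

```python
# Derive the quoted memory bandwidth from memory clock and bus width.
mem_clock_hz = 1080e6      # GDDR3 command clock, 1080 MHz
bus_width_bits = 384
ddr_factor = 2             # GDDR3 transfers data twice per clock

bandwidth_gbs = mem_clock_hz * ddr_factor * (bus_width_bits / 8) / 1e9
print(round(bandwidth_gbs, 1))  # 103.7 (GB/s), as quoted

# Texture fill rate = core clock x texture filtering units.
# (64 units for G80 is an assumption; the spec list doesn't state it.)
core_clock_mhz = 612
texture_units = 64
fill_rate_gtexels = core_clock_mhz * texture_units / 1000
print(round(fill_rate_gtexels, 1))  # 39.2 (billion texels/sec)
```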

Benchmarks



PassMark — G3D Mark: 643
PassMark — G2D Mark: 252

Specifications (specs)

Architecture Tesla
Code name G80
Launch date 2 May 2007
Launch price (MSRP) $829
Place in performance rating 479
Type Desktop
Core clock speed 612 MHz
CUDA cores 128
Floating-point performance 387.1 GFLOPS
Manufacturing process technology 90 nm
Pipelines 128
Texture fill rate 39.2 billion / sec
Thermal Design Power (TDP) 171 Watt
Transistor count 681 million

Display Connectors 2x DVI, 1x S-Video
Interface PCIe 1.0 x16
Length 270 mm
Supplementary power connectors 2x 6-pin
DirectX 10.0
OpenGL 3.3
Maximum RAM amount 512 MB
Memory bandwidth 103.7 GB/s
Memory bus width 384 Bit
Memory clock speed 1080 MHz
Memory type GDDR3
SLI

Characteristics and reviews of NVIDIA GeForce 8800 ULTRA / Overclockers.ua


    Reviews of video cards NVIDIA GeForce 8800 ULTRA:

    • Fathers and Sons. GeForce 8800 Ultra vs GeForce 9800 GTX

      ASUS EN8800ULTRA/G/HTDP/768M/A


    GeForce 8800 Ultra video card [in 1 benchmark]

    NVIDIA
    GeForce 8800 Ultra

    • PCIe 1.0 x16 interface
    • Core clock 612 MHz
    • Video memory size 512MB
    • Memory type GDDR3
    • Memory frequency 1080MHz
    • Maximum resolution

    Description

    NVIDIA started GeForce 8800 Ultra sales on May 2, 2007 at a suggested price of $829. This is a desktop video card based on the Tesla architecture and a 90 nm manufacturing process, aimed at high-end gaming systems. It has 512 MB of GDDR3 memory at 1.08 GHz which, together with a 384-bit interface, creates a bandwidth of 103.7 GB/s.

    In terms of compatibility, this is a dual-slot PCIe 1.0 x16 card. The length of the reference version is 270 mm. The connection requires two 6-pin additional power cables, and the power consumption is 171 watts.

    It provides poor performance in tests and games, at 2.19% of the level of the leader, which is the NVIDIA GeForce RTX 3090 Ti.


    GeForce
    8800 Ultra

    or


    GeForce RTX
    3090 Ti

    General information

    Information about the type (desktop or laptop) and architecture of GeForce 8800 Ultra, as well as when sales started and cost at that time.

    Performance ranking 824

    Features

    GeForce 8800 Ultra’s general performance parameters such as number of shaders, GPU core clock, manufacturing process, texturing and calculation speed. They indirectly speak about GeForce 8800 Ultra’s performance, but for precise assessment you have to consider its benchmark and gaming test results.

    Number of stream processors

    Compatibility and dimensions

    Information on GeForce 8800 Ultra compatibility with other computer components. Useful for example when choosing the configuration of a future computer or to upgrade an existing one. For desktop video cards, these are the interface and connection bus (compatibility with the motherboard), the physical dimensions of the video card (compatibility with the motherboard and case), additional power connectors (compatibility with the power supply).

    RAM

    Parameters of memory installed on GeForce 8800 Ultra — type, size, bus, frequency and bandwidth. For video cards built into the processor that do not have their own memory, a shared part of the RAM is used.



    Overall benchmark performance

    This is our overall performance rating. We regularly improve our algorithms, but if you find any inconsistencies, feel free to speak up in the comments section, we usually fix problems quickly.

    8800 Ultra
    2.19

    • Passmark
    Passmark

    This is a very common benchmark included in the Passmark PerformanceTest package. It gives the graphics card a thorough evaluation, running four separate tests for Direct3D versions 9, 10, 11 and 12 (the latter in 4K resolution when possible), plus a few more tests using DirectCompute.

    Benchmark coverage: 26%

    8800 Ultra
    642


    Game tests

    FPS in popular games on the GeForce 8800 Ultra, as well as compliance with system requirements. Remember that the official requirements of the developers do not always match the data of real tests.

    Average FPS
    Popular games

    Relative performance

    Overall GeForce 8800 Ultra performance compared to its nearest desktop counterparts.


    NVIDIA GeForce 720A
    102.74

    ATI Radeon HD 2900XT
    102.28

    AMD FireStream 9170
    100.46

    NVIDIA GeForce 8800 Ultra
    100

    NVIDIA GeForce GT 710
    99.09

    ATI Radeon HD 2900 PRO
    97.72

    NVIDIA GeForce 810A
    96.8

    AMD competitor

    We believe that the nearest equivalent to GeForce 8800 Ultra from AMD is FireStream 9170, which is approximately equal in speed and is 3 positions higher in our rating.


    FireStream
    9170

    Compare

    Here are some of AMD’s closest competitors to the GeForce 8800 Ultra:

    AMD Radeon R8 M535DX
    104.11

    ATI Radeon HD 2900XT
    102.28

    AMD FireStream 9170
    100.46

    NVIDIA GeForce 8800 Ultra
    100

    ATI Radeon HD 2900 PRO
    97.72

    AMD Radeon HD 7570
    94.98

    AMD Radeon R5 A240
    89.5

    Other video cards

    Here we recommend several video cards that are more or less similar in performance to the reviewed one.


    GeForce GTS
    250

    Compare


    GeForce
    8800 GTX

    Compare


    GeForce
    9800 GTX

    Compare


    GeForce
    8800 GT

    Compare


    GeForce
    9800 GT

    Compare


    GeForce
    7900 GTX

    Compare

    Recommended processors

    According to our statistics, these processors are most often used with the GeForce 8800 Ultra.


    Xeon
    X5492

    17.6%


    Core 2
    Extreme QX9770

    5.9%


    Core i3
    2328M

    5.9%


    Core i5
    2300

    5.9%


    Core 2
    Quad Q9650

    5.9%


    Pentium 4
    HT 531

    5.9%


    Core 2
    Quad Q9550

    5.9%


    Core 2
    Quad Q8400

    5.9%


    E1
    6010

    5.9%


    Core i7
    10750H

    5.9%


    Video card NVIDIA GeForce 8800 Ultra

    Characteristics
    Drivers
    Price

    Memory type: GDDR3
    Memory: 768 MB GDDR3 (384-bit)
    Name: NVIDIA GeForce 8800 Ultra
    Series: GeForce 8
    GPU model: G80-450 (G80)
    CUDA cores: 128
    Base clock: 612 MHz
    Memory speed: 2.1 Gbps

    The NVIDIA GeForce 8800 Ultra graphics card is built on the 90 nm process and the G80-450 (G80) graphics processor.
    The card supports DirectX 10. NVIDIA fitted it with 768 megabytes of GDDR3 RAM, connected via a 384-bit interface.
    The graphics processor runs at 612 MHz and has 128 CUDA cores; the memory operates at 2100 Mbps per pin.

    The power consumption of the video card is 171W, and the recommended power supply is 500W.

    NVIDIA GeForce 8800 Ultra supports Microsoft DirectX 10 and OpenGL 3.3.

    Specifications for NVIDIA GeForce 8800 Ultra

    GPU specifications:
    Model: NVIDIA GeForce 8800 Ultra
    Series: GeForce 8
    GPU model: G80-450 (G80)
    Process: 90nm
    CUDA cores: 128
    Streaming Multiprocessors (SMs): 16
    Texture Units (TMUs): 32
    Base clock: 612MHz
    Number of transistors: 681 million
    Memory specifications:
    Memory capacity: 768 MB
    Memory type: GDDR3
    Memory bus: 384-bit
    Memory speed: 2100 Mbps (2.1 Gbps)
    Memory clock: 1080 MHz
    Texture Fill Rate: 19.6 GTexel/s
    Display support:
    Maximum digital resolution: 2560×1600
    Maximum VGA resolution: 2048×1536
    Standard connectors: DVI-I DualLink, S-Video
    Multi-monitor support: Yes
    HDMI: Yes, Via adapter
    Audio input for HDMI: SPDIF
    Thermal data:
    Maximum GPU temperature: 105℃
    Energy consumption (TDP): 171 W
    Recommended power supply: 500 W
    Additional power connectors: Two 6-pin
    Video card dimensions:
    Height: 13 cm
    Length: 27 cm
    Width: 2 slots
    Technologies and capabilities:
    CUDA: Yes
    SLI: Yes
    Purevideo: Yes
    DirectX: 10
    OpenGL: 3.3
    Bus: PCI-Express 1.0 x16
    Supported OS: Microsoft Windows 7-10, Linux, FreeBSD x86

    Please note: the table shows the reference characteristics of the video card, they may vary by different manufacturers.
    342.01 WHQL

    Status: Outdated

    MD5

    Operating system:
    Windows 8 64-bit, 8.1 64-bit, 7 64-bit, Vista 64-bit
    Release date:
    December 14, 2016

    342.01 WHQL

    Download (292.47 MB)

    Status: Outdated

    MD5

    Operating system: WHQL

    Download (225.04 MB)

    Status: Outdated

    MD5
    |
    Release Notes (PDF)

    The driver for the NVIDIA GeForce 8800 Ultra video card can be downloaded from the official site!

    Or use the GeForce Experience program — it will automatically select the necessary driver for your video card.

    Price in Russia

    NVIDIA GeForce 8800 Ultra

    FAQ

    What series is this video card?

    Video card series: GeForce 8

    What is the power consumption and power requirements?

    The maximum power consumption is: 171 W.

    Recommended power supply: 500W.

    Auxiliary power connectors: Two 6-pin.

    Where can I download the GeForce 8800 Ultra driver?


    Video cards with 768 MB memory:

    Video cards with GDDR3 memory type:

    Video cards with 384-bit bus:

    Review of EVGA e-GeForce 8800 Ultra video card GECID.com. Page 1


    09/11/2007


    The NVIDIA GeForce 8800 Ultra graphics processor was announced in early May of this year, but the recommended price of video cards based on it, from $829, is still impressive today. The new GPU is an updated, faster revision of the GeForce 8800 GTX that differs from it in higher core and stream processor clocks and in pairing with faster video memory. Naturally, performance has grown, but so has the price. Well, let's see what a video card on this new GPU looks like, and what it is capable of for this fabulous money.

    EVGA e-GeForce 8800 Ultra (768-P2-N881-AR)

    Model

    e-GeForce 8800 Ultra (768-P2-N881-AR)

    Graphics core

    GeForce 8800 Ultra

    Core frequency

    612 MHz (1500 MHz stream processors)

    Memory capacity

    768MB GDDR3

    Memory frequency

    1080 MHz (2160 MHz DDR)

    Memory bus

    384 bit

    Bus standard

    PCI Express x16

    Maximum resolution

    Up to 2560 x 1600

    Outputs

    2x DVI-I
    VGA (via DVI to VGA adapter)
    TV-Out (HDTV, S-Video and Composite)

    HD-Video hardware acceleration

    H.264, WMV/WMV-HD and MPEG-2 HD

    HDCP support

    Yes

    Drivers

    Fresh drivers can be downloaded from:
    — video card manufacturer’s website;
    — GPU manufacturer website.

    Manufacturer website

    http://www.evga.com/

    The video card comes in a fairly compact (for a product of this level) branded black-white-and-green cardboard box. The packaging is quite informative and carries almost all the information about the product, both as large icons on the front and as a more detailed specification on the back; only the exact operating frequencies are missing. The box also has a window through which you can make sure the coveted video card is inside. Next to the window are a list of the package contents and a notice that the graphics card is covered by a 10-year limited warranty. The minimum system requirements are printed on the side of the box.

    Please note that the owner will need at least a 500 W power supply capable of delivering 35 A on the 12 V line.

    But the bundle of the video card turned out to be quite modest for such a "mega-expensive" top product:

    • 2 adapters from VGA to DVI;
    • 2 adapters from two Molex power connectors to an additional 6-pin video power connector;
    • Component TV-Out;
    • S-Video Extender;
    • Video Card Quick Install Guide;
    • Driver CD for Windows XP.

    Interestingly, no drivers for Windows Vista are included in the package; they have to be downloaded from the Internet. This is not discrimination, though: at the time of the card's release even NVIDIA itself did not yet have drivers suitable for the GeForce 8800 Ultra. (Looking ahead, we note that during testing only the latest ForceWare 163.44 drivers ensured stable operation of the video card and adequate results.)

    The GeForce 8800 Ultra has no architectural differences from the GeForce 8800 GTX, apart from the higher clock frequencies and the faster memory chips it is paired with. And NVIDIA continues to control production itself, leaving its partners only the opportunity to bundle, sell and support the product.

    A visually noticeable distinguishing feature of the GeForce 8800 Ultra is the updated cooling system, which has become even larger. Besides the dimensions, there are design differences, the most important of which is more active cooling of the power supply module, something previously ignored on the GeForce 8800 GTX/GTS (only the hottest elements had contact with the radiator, and the board received no airflow).

    The video card traditionally occupies two slots. To connect to display devices, there are two DVI connectors and a TV-Out output on the side. To connect to monitors with an analog VGA input (15-pin D-Sub), a pair of DVI-VGA adapters are included.

    The G80-450 revision A3 graphics processor (aka GeForce 8800 Ultra) installed on the video card operates at 612 MHz, and its stream processors at 1500 MHz, which fully complies with the NVIDIA specification. Note that the new core revision makes this GPU more economical than the GeForce 8800 GTX by as much as 2 W despite the higher clock frequencies, although with a total power consumption of 175 W these 2 W are more of a marketing feature. The graphics core itself, like the GeForce 8800 GTX it displaces from the top spot, has 128 stream processors (unified shader pipelines) and provides full hardware support for DirectX 10.0 and OpenGL 2.0. A 384-bit bus is used to communicate with the video memory.

    The NVIO-I-A3 chip is responsible for the output ports; it supports two RAMDACs, two Dual-Link DVI outputs and HDTV, and also, according to some sources, near-native HDMI. Recall that all the interface blocks were moved to a separate chip because of the already huge size of the main graphics processor, but this also distanced them from sources of interference and thus promised better image quality on analog outputs.

    The video card carries 768 MB of video memory made up of twelve Samsung K4J52324QE-BJ08 GDDR3 chips with an access time of 0.8 ns, which according to the specification corresponds to a clock frequency of 2400 MHz DDR; on this card, however, the chips run slower, at 2160 MHz.

    Even at a fairly high load, the core temperature did not rise above 75°C, and the cooling system itself was not very noisy.

    GeForce 8800 ULTRA | GeForce 8 Series | Video cards NVIDIA

    Posted by myxxx323, . Published in GeForce 8 Series

    Even before NVIDIA’s official announcement of the GeForce 8800 Ultra, it was known that this high-end solution, intended to further strengthen the company’s position in the premium consumer graphics card sector, would be an overclocked version of the GeForce 8800 GTX. No wonder: developing a more powerful GPU, even based on the G80, would be too slow and expensive, especially since video adapters of this class make up an insignificant share of the discrete desktop graphics market. In addition, NVIDIA has done this more than once: just remember the GeForce 7800 GTX 512, not to mention earlier solutions like the GeForce 2 Ultra or GeForce 3 Titanium 500.

    So, it was clear from the start that the GeForce 8800 Ultra would use the same graphics processor as its predecessor; the only questions were the degree of overclocking, whether a new printed circuit board would be used (as was the case with the GeForce 7800 GTX 512), an improved cooling system and, possibly, fast GDDR-4 memory. Rumors put the core clock at about 650-700 MHz and the memory at 2000 MHz and higher, along with speculation about the price, but the situation gradually cleared up. It turned out that the GeForce 8800 Ultra would still use GDDR-3 memory with a total capacity of 768 MB, like the GeForce 8800 GTX, and the GPU frequency would be only 612 MHz for the main domain and 1.5 GHz for the shader processor domain, a 6% and 11% increase over the GeForce 8800 GTX, respectively. The memory frequency increased from the nominal 1800 MHz to 2160 MHz; by this parameter the novelty received the most solid increase.
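    The percentage gains quoted above are simple ratios; a quick check of the arithmetic, using the reference clocks named in this paragraph (576 to 612 MHz core, 1350 to 1500 MHz shaders, 1800 to 2160 MHz memory):

```python
# Relative clock gains of the 8800 Ultra over the 8800 GTX.
def gain(new, old):
    """Percentage increase of `new` over `old`."""
    return (new / old - 1) * 100

print(f"core:   {gain(612, 576):.0f}%")    # ~6%
print(f"shader: {gain(1500, 1350):.0f}%")  # ~11%
print(f"memory: {gain(2160, 1800):.0f}%")  # ~20%
```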

    Initially, the release of the GeForce 8800 Ultra was supposed to be a preemptive strike against AMD before it announced the R600 and cards based on it. Such a blow was indeed dealt on May 2, somewhat later than the announcement of the new mass-market solutions based on the G84 and G86 chips. However, the performance gain turned out to be quite modest, from 2% to 14%. In addition, ATI decided to counterattack in other sectors, and NVIDIA's new product simply had no one to compete with: the Radeon HD 2900 XT, at its price point, became a direct competitor to the GeForce 8800 GTS 640MB, a card belonging to a completely different price class.

    Characteristics NVIDIA GeForce 8800 ULTRA

    Name GeForce 8800 ULTRA
    Core G80
    Process (µm) 0.09
    Transistors (millions) 681
    Core clock 612
    Memory clock (DDR) 1080 (2160)
    Bus and memory type GDDR3 384 Bit
    Bandwidth (Gb/s) 103.7
    Unified shaders 128
    Unified shader frequency 1512
    TMU for conveyor 32 (total)
    ROP 24
    Textures per clock 32
    Texture per pass 32
    Shaders Model 4.0
    Fill Rate (Mpix/s) 14688
    Fill Rate (Mtex/s) 19584
    DirectX 10.0
    Anti-Aliasing (Max) SS & MS — 16x
    Anisotropic Filtering (Max) 16x
    Memory capacity (MB) 768
    Interface PCI-E
    RAMDAC 2×400
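    The fill-rate rows in the table follow from the core clock and the unit counts. A small sketch of the arithmetic, assuming the usual convention that theoretical fill rate equals clock times number of units:

```python
# Theoretical fill rates for the GeForce 8800 Ultra from the table above.
core_clock_mhz = 612
rops = 24          # raster output units (pixels written per clock)
tmus = 32          # texture mapping units (texels sampled per clock)

pixel_fill_mpix_s = core_clock_mhz * rops   # 14688 Mpix/s
texel_fill_mtex_s = core_clock_mhz * tmus   # 19584 Mtex/s
print(pixel_fill_mpix_s, texel_fill_mtex_s)
```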

    In a certain sense, the GeForce 8800 Ultra was a disappointment: more was expected from NVIDIA, especially given that many of the company's partners had long been offering overclocked versions of the GeForce 8800 GTX with parameters close to those of the new product. Nevertheless, despite the questionable timing of its release, this accelerator remained the king of 3D graphics for a long time.



    Comparison NVIDIA GeForce 8800 Ultra vs NVIDIA GeForce 8800 GTX which is better?

    NVIDIA GeForce 8800 Ultra

    vs

    NVIDIA GeForce 8800 GTX

    37%

    DeviceList Score

    We compared the specifications of the NVIDIA GeForce 8800 Ultra and the NVIDIA GeForce 8800 GTX and compiled a list of advantages and a comparison table for you. Find out which one to choose in 2022.

    Benefits NVIDIA GeForce 8800 Ultra

    Core frequency

    612 MHz

    At 36 MHz (6.3%) better than

    vs

    576 MHz

    Number of CUDA cores

    612

    37 (6.4%) better than

    vs

    575

    Memory frequency

    1080 MHz

    180 MHz (20%) better than

    vs

    900 MHz

    Memory bandwidth

    103.7

    17.3 (20%) better than

    vs

    86.4

    Benefits NVIDIA GeForce 8800 GTX

    Comparison winner

    Value for money

    10. 2%

    2% (24.4%) better than

    vs

    8.2%

    Release price

    $599

    -230$ (-27.7%) better than

    vs

    829 $

    Energy Demand (TDP)

    155 W

    -16 W (-9.4%) better than

    vs

    171 W

    Maximum memory

    0.768 GB

    0.268 GB (53.6%) better than

    vs

    0.5 GB

    General information

    Value for money

    The sum of all the advantages of the device divided by its price. The higher the percentage, the better the value per unit of price compared with all analogues.

    8.2% 10.2%

    2% (24.4%) better than

    Architecture

    Tesla Tesla

    Codename

    G80 G80

    Type

    Desktop Desktop

    Release price

    829 $ $599

    -230 $ (-27. 7%) better than

    Number of shader processors

    128 128

    Core clock

    612 MHz

    36 MHz (6.3%) better than

    576 MHz

    Number of transistors

    681 million 681 million

    Process

    90 nm 90 nm

    Interface

    PCIe 1.0 x16 PCIe 1.0 x16

    Power Demand (TDP)

    Thermal design power shows the average heat dissipation under load; the larger the value, the greater the cooling and power-supply requirements.

    171 W 155 W

    -16 W (-9.4%) better than

    Length

    270 mm 270 mm

    Additional power connectors

    2x 6-pin 2x 6-pin

    SLI support

    + +

    Number of CUDA cores

    A larger number of CUDA cores improves performance in graphics computing, especially anti-aliasing and lighting in games, as well as the speed of neural network training.

    612

    37 (6.4%) better than

    575

    Video connectors

    2x DVI, 1x S-Video 2x DVI, 1x S-Video

    DirectX

    11.1 (10_0) 11.1 (10_0)

    Floating point performance

    387.1 GFLOPS 345.6 GFLOPS
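    The floating-point figures in this row follow from the shader configuration of each card. A quick sketch, assuming the common counting of one multiply-add (2 operations) per CUDA core per shader-domain cycle:

```python
# Single-precision throughput sketch for G80-based cards.
def gflops(cores, shader_clock_mhz, ops_per_cycle=2):
    """GFLOPS = cores x shader clock (MHz) x ops per cycle / 1000."""
    return cores * shader_clock_mhz * ops_per_cycle / 1000

ultra = gflops(128, 1512)  # 8800 Ultra: shader domain at 1512 MHz
gtx = gflops(128, 1350)    # 8800 GTX: shader domain at 1350 MHz
print(round(ultra, 1), round(gtx, 1))  # 387.1 345.6
```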
    Benchmarks
    Memory

    Memory type

    GDDR3 GDDR3

    Maximum memory

    Large video memory allows you to run demanding games with a lot of textures,
    use high resolution monitors, provide more opportunities for cryptocurrency mining.

    0.5 GB 0.768 GB

    0.268 GB (53.6%) better than

    Memory bus width

    The wider the video memory bus, the more data is transferred to the GPU per unit of time and the better performance in demanding games.
