Overclocking the GeForce 8800 Series
Overclocking can take on many forms, and the practice can range from minor product improvements to a re-engineering project which completely alters the product. In this article, I will concentrate my efforts on achieving the most gain with the least amount of effort.
Some industry voices have called overclocking a hobby, while others have compared it to product misuse. However, I believe that if you are reading this article, you are probably one of the many computer enthusiasts who believe it is perfectly acceptable to get something more out of a product without it costing more money. When I think about it, everyone enjoys getting something for nothing; it’s human nature.
Additionally, it is human nature to blame someone else if something goes wrong. This is where I warn you, the reader of this article, that neither I nor Bjorn3d.com recommend that you overclock your video card. This article explains how I conducted these experiments on my own property. Bjorn3d.com and the author of this article will not be responsible for damages or injury resulting from experiments you choose to conduct on your own property. If you read beyond this point, you are accepting responsibility for your own actions and hold Olin Coles and the staff of Bjorn3d.com harmless.
For this project I have selected the FOXCONN NVIDIA GeForce 8800 GTS as my test subject, which I reviewed here at Bjorn3D back in November 2006. Presently, the 640MB version of the GeForce 8800 GTS is the second-best video card available on the market. Gamers and computer enthusiasts alike have already speculated on how the GTS could be made to perform to the same level as the GeForce 8800 GTX with some tweaking. Unfortunately, this just isn’t possible. What is possible, though, is taking a great product and making it even better, and doing it all for free.
Test Subject: FOXCONN 640MB GeForce 8800 GTS
By default, the FOXCONN 8800 GTS operates with a 575MHz G80 GPU core clock speed, a 1188MHz shader, and a 900MHz (1800MHz) GDDR3 RAM speed. I will utilize the free overclocking utility ATITool v0.26 to search out the best clock speeds and simulate heavy graphic loading to establish stability. I will then make use of NiBiTor v3.2 which is another free tool to program a custom flashable video BIOS. After creating the new custom video BIOS with NiBiTor, I will use yet another free application, nvFlash, to program the new custom video BIOS onto the GeForce 8800 GTS.
ATITool v0.26
Sure, the name says “ATI” Tool, but the author has made this a great tool for both ATI and NVIDIA products for several years now. I have personally used many different tools for overclocking in the past, but this has proven to be the best tool for all my needs. After a straightforward installation and reboot (to complete the installation of the driver-level service), I open ATITool and am confronted by a lot of options. Try not to be overwhelmed; the sheer number of options can strike fear into the inexperienced, but only a few are needed here.
The first screen displayed will be the only screen that is really needed. While the memory clock speed copies its adjustments across all three performance levels (2D, 3D low power, and 3D performance), the software allows the user to make individual speed settings to the GPU core clock speed. Initially, the default values are saved in the profile named “default”, but as you make changes you may save and delete profiles as needed. For this project, I kept raising my speeds and saving them into a profile named “MAX”.
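The profile model described above — a single memory clock that copies across all three performance levels, with a per-level GPU core clock — can be sketched as a small data structure (hypothetical names for illustration, not ATITool’s actual internals):

```python
from dataclasses import dataclass, field

LEVELS = ("2D", "3D low power", "3D performance")

@dataclass
class ClockProfile:
    """One saved profile: a shared memory clock, per-level core clocks."""
    memory_mhz: int
    core_mhz: dict = field(default_factory=dict)  # level -> core clock in MHz

    def set_memory(self, mhz):
        # The memory setting applies across all three performance levels.
        self.memory_mhz = mhz

# Start from the card's defaults, then save raised speeds into "MAX".
profiles = {"default": ClockProfile(900, {lvl: 575 for lvl in LEVELS})}
profiles["MAX"] = ClockProfile(1030, {**profiles["default"].core_mhz,
                                      "3D performance": 600})
print(profiles["MAX"].core_mhz["3D performance"])  # 600
```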
ATITool: Artifacts Indicate Unstable Settings
To begin the overclocking process, I start by raising the temperature of the GPU core by using the “Show 3D View” button to display a rotating fuzzy cube. It is critical that the video card attain the highest temperature possible prior to overclocking, because the results of a cold overclock may prove unstable during gaming conditions. After reaching the loaded operating temperature, I change to the “Scan for Artifacts” view by pressing the button. Although I could have used the “Find Max Core & Mem” buttons to have ATITool automatically work out the best settings, I choose to manually test each incremental improvement on my own.
Find the Best Speeds
Experience has taught me that overclocking the memory is the best starting point, since it has a very small impact on operating temperatures when compared to overclocking the GPU. I have also learned that clock speed improvements can be made in larger steps on RAM (10MHz steps) than on the GPU (3MHz steps). It is recommended that each clock be adjusted in small increments, alternating between the two. Do not find the maximum RAM clock first and then set out to find the best GPU clock; this will give skewed results at either end.
In my testing, I found that this particular card could operate at a maximum RAM clock speed of 1060MHz (2120MHz) with the default GPU core clock. Alternatively, the maximum GPU clock could be raised up to 618MHz while maintaining the default RAM clock speed. However, the best combination of the two yielded a stable 600MHz GPU core clock (a 25MHz improvement) with a final RAM clock speed of 1030MHz (a 130MHz improvement). These settings were saved to the profile I named “MAX” and tested for a ten-minute duration using the “Scan for Artifacts” function. After stability testing succeeded, I played some of my favorite video games for an hour to confirm real-world stability. Now that I have my numbers, it is time to program them into the card.
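The alternating small-step search described above can be sketched in Python. The `is_stable` callback here is a hypothetical stand-in for ATITool’s artifact scan — in practice the stability check is a manual, visual one:

```python
def find_max_clocks(is_stable, gpu_default=575, ram_default=900,
                    gpu_step=3, ram_step=10):
    """Alternate small RAM and GPU bumps, keeping the last stable pair.

    is_stable(gpu_mhz, ram_mhz) must return True if the card runs
    artifact-free at those clocks (stand-in for the artifact scan).
    """
    gpu, ram = gpu_default, ram_default
    improved = True
    while improved:
        improved = False
        if is_stable(gpu, ram + ram_step):   # try a RAM bump first
            ram += ram_step
            improved = True
        if is_stable(gpu + gpu_step, ram):   # then a GPU bump
            gpu += gpu_step
            improved = True
    return gpu, ram

# Simulated card that stays stable up to 600MHz core / 1030MHz RAM:
gpu, ram = find_max_clocks(lambda g, r: g <= 600 and r <= 1030)
print(gpu, ram)  # 599 1030
```

Because the core only moves in 3MHz steps, the search lands just under the simulated 600MHz ceiling; the real procedure works the same way, which is why it finds combined limits lower than the individual maximums.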
Once the maximum stable speeds for both GPU and RAM have been found and tested, there is a choice ahead: either continue using ATITool to maintain these speeds, or flash them onto the video card’s BIOS and make them permanent. Since I have already tested my overclocking results with both artificial loading and real-world usage, it is now safe enough for me to modify the video card BIOS file with my new overclocked settings and have the FOXCONN 8800 GTS deliver its enhanced performance without additional software. This will make the card functionally identical to factory-overclocked versions.
Unfortunately, part of this process requires that the system boot into MS-DOS from a 3.5″ floppy disk drive, which I normally don’t keep installed in my computer because it is an obsolete piece of legacy hardware. USB flash drives and recordable optical drives have proven to be very capable replacements, so locating a 3.5″ floppy disk drive will only become more difficult in the future. For the remainder of this project, a bootable USB flash drive or a properly created bootable CD/DVD could have been substituted, but I chose to avoid reinventing the wheel and retained the use of a spare floppy disk drive.
NiBiTor v3.2 Video Card BIOS Editor
To begin the (simple) process of creating a custom BIOS file, I utilized the free NiBiTor v3.2 program to save a copy of my original BIOS. There are two steps for this process:
- Tools → Read BIOS → Select Device (Choose the video card here)
- Tools → Read BIOS → Read Into File (Choose a safe backup location to save the original BIOS)
Save a backup
I should only continue after creating a backup of the original video BIOS, since this file will allow me to return my settings back to the factory defaults if it is ever required. It is highly recommended that this backup BIOS be copied and renamed so there will be both a working copy and the original backup file available. I named the original BIOS file “BACKUP.ROM”, and then created a copy of it which I named “8800GTS2.ROM”.
Up to this point, I have saved my modified video BIOS “8800GTS2.ROM” onto one floppy disk. I will save the nvFlash program, which is available for free download, onto this same floppy disk. On a second floppy disk I will use Windows XP to format and create an MS-DOS startup disk. It may not be necessary to split the project files and nvFlash from the MS-DOS startup disk given the file sizes and available space, but this is a safe practice which also decreases the chance of media problems.
Now that a backup of the video BIOS has been copied and stored for safe keeping, the next step for reading the working file and making changes begins with: File → Open BIOS (Choose the renamed copy “8800GTS2.ROM” of the video BIOS here).
New Speed Values are Added
Once again, the novice could become very concerned about the many tabs and options available in NiBiTor, but for my purposes I will only use the Clockrates tab to change the values for 3D speeds.
Using the values discovered and tested to be stable in my previous steps, I apply them to the appropriate 3D fields, replacing the original values. Once I have typed in the new Core and Memory values, I save this modified video BIOS file by choosing: File → Save BIOS (save the modified file onto a formatted floppy disk and name it something simple with eight characters or fewer, such as “8800GTS2.ROM”).
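Since nvFlash is run from MS-DOS, the BIOS filename needs to fit the old 8.3 naming convention. A quick, conservative check (letters, digits and underscore only — DOS actually permits a few extra punctuation characters this check leaves out) might look like:

```python
import re

def is_dos_83(name):
    """Conservative check that a filename fits MS-DOS 8.3 naming:
    up to 8 characters, a dot, then an extension of up to 3 characters."""
    return re.fullmatch(r"[A-Z0-9_]{1,8}\.[A-Z0-9]{1,3}", name.upper()) is not None

print(is_dos_83("8800GTS2.ROM"))        # True
print(is_dos_83("MYLONGBIOSNAME.ROM"))  # False (base name too long)
```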
Flash the BIOS with nvFlash v5.40
Now that I have prepared the new modified video BIOS file and saved it to a floppy disk, I am ready to make my video card operate as if it came from the factory with my new settings. I will now flash my working copy of the modified video BIOS “8800GTS2.ROM” onto the FOXCONN 8800 GTS using nvFlash.
Flash the new BIOS
Flashing a video BIOS is a very simple process; yet extra care and precaution must be taken, or the hardware being flashed may be rendered non-operational. I have taken steps to ensure my computer system’s stability will not be compromised by removing any system component overclocking (CPU, RAM, bus speeds), and I have placed my system on a 1500VA backup-battery UPS. With stability ensured, I am ready to move forward and flash the video BIOS.
The video card BIOS is flashed by:
- Boot into MS-DOS mode from a pre-formatted floppy disk
- Insert the floppy disk containing the BIOS file and nvFlash
- Type “nvflash 8800GTS2.ROM” (where “8800GTS2.ROM” is the name of the BIOS file)
- Press “Y” to confirm that the system is ready to flash the firmware
That’s it! The hard part is done. Once the system reboots after the successful BIOS flash, the video card is programmed with the new enhanced performance settings. But am I finished?
With all this new power there will come increased heat output, which is something the GeForce 8800 series already knows plenty about.
NVIDIA nTune Performance Application
The GeForce 8800 GTS already runs close to 90°C under full load, so I have taken an extra step to make sure temperatures don’t turn this product into a personal space heater. Using the free NVIDIA nTune Performance Application, I can adjust the fan speed from the default 60% output up to the desired 100% output.
Don’t be fooled
A closer look at the nTune utility will reveal that it offers the opportunity to overclock the video card through the GPU clock settings interface; but I had to discover the hard way that this was a very unstable and unsafe method which always resulted in system crashes. I have since avoided every feature offered in this utility except the GPU fan settings feature.
nTune: a necessary evil
If I could find a better program with a smaller footprint which would enable me to manually adjust 8800 series blower fan speeds, I would be using it. But since this is the only one I am aware of, it is a necessary evil.
With the NVIDIA nTune utility, I can manually raise (or lower) the fan controller output. The blower fan on the GeForce 8800 series is fairly quiet at the default 60% output, but it gets humming when set to 100%, which means noise may become a concern.
For the cost of the product and about an hour of time spent, I was able to take my FOXCONN 8800 GTS and overclock the G80 GPU up to 600MHz (575MHz default) along with a GDDR3 RAM speed increase up to 1030MHz (900MHz default). This amounts to a 25MHz GPU improvement and a 130MHz RAM improvement; and all for free!
Sure, these results did not transform my 8800 GTS into a GTX, but as I mentioned before, that is just plain impossible because of the architectural differences. When it was all said and done, I did make enough of an improvement to keep the 8800 GTS closer to the 8800 GTX. My video card was already a product pushed close to the limit from the factory, and now I have it operating and performing as well as it can. Just imagine what could be done to the GeForce 8800 GTX!
With this in mind, it could be very possible to find more performance available for the taking out of other NVIDIA video cards. Just remember, what you do with your property is your own business. At least now you know how I did it.
Releasing the Beasts — Overclocking the GeForce 8800’s
It’s time to Release the Beasts — we overclock a couple of GeForce 8800 graphics cards and share our results with you!
Published Nov 15, 2006 11:00 PM CST | Updated Tue, Nov 3 2020 7:04 PM CST
It’s clear that the power on offer from the new GeForce 8800GTS and GTX beasts is more than enough for most people, though honestly — who cares? There is nothing wrong with wanting more power and this is where overclocking comes in.
We did have intentions to do a little piece on overclocking in our XFX article, but we thought that we would dive into it more. We took some extra time to see exactly what we can expect from the GeForce 8800 graphics cards when it comes to overclocking. Since practically all cards are coming out of the same factory, you would think it’s safe to assume that most cards are going to sit quite close together when it comes to overclocking.
The main thing we want to know today is whether the 8800 GTS, which sells for significantly less, can come in and match or even beat GTX performance at stock speeds. And if it does, how much further can the 8800 GTX go to make it worth your while to spend those extra hard-earned dollars on it, as opposed to the GTS variant?
In this article we’re not going to look at the graphics cards, we’ve already done that. Instead we will have a quick look at our test system setup and what we got with our overclocks. From there it’s straight into the benchmarks as that’s all we really need to see here today.
Let’s get overclocking!
Benchmarks — Test System Setup and 3DMark05
Test System Setup
Processor(s): Intel Core 2 Duo E6600 @ 3150MHz (350MHz FSB with memory @ 1:1)
Motherboard(s): ASUS P5B Deluxe (Supplied by ASUS)
Memory: 2 X 1GB G.Skill HZ PC8000 @ 350MHz 4-4-4-12 (Supplied by Bronet)
Hard Disk(s): Hitachi 80GB 7200RPM SATA 2
Operating System: Windows XP Professional SP2
Drivers: nVidia ForceWare 96.97 (Reviewer Driver) and DX9c
With Coolbits not working on the latest version of the nVidia drivers, it was time to find another program to use so we could start overclocking our XFX graphics cards.
Surprisingly, the current beta version of ATI Tool worked almost without a hitch. We thought we would be slack and try the auto detect feature but as soon as it went about 4MHz up on the core, it crapped out and we had artifacts everywhere. Using the old manual method — increase the clock speeds, run a 3DMark, rinse and repeat until we crashed something or got artifacts, we got our maximum speed on both the GeForce 8800 GTX and GTS.
At default the 8800 GTS comes in at 500MHz on the core and 1600MHz DDR on the memory. The core increased to a truly outstanding 643MHz (a 143MHz increase, or about 29%) and the memory got a significant boost up to 1824MHz DDR (a 224MHz DDR increase, or about 14%).
The 8800 GTX comes in at 575MHz on the core and 1800MHz DDR on the memory. We got that to 654MHz (a 79MHz increase, or about 14%), which is just above what we got out of the GTS, and the memory had no problems breaking the 2GHz DDR mark, coming in at 2020MHz DDR (a 220MHz DDR increase, or about 12%).
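Working the quoted clocks through a quick script confirms the percentage gains (plain arithmetic on the numbers above, nothing card-specific):

```python
def pct_gain(default_mhz, oc_mhz):
    """Percentage increase of an overclocked speed over the default."""
    return 100.0 * (oc_mhz - default_mhz) / default_mhz

# GeForce 8800 GTS: 500 -> 643MHz core, 1600 -> 1824MHz DDR memory
print(round(pct_gain(500, 643)))    # 29
print(round(pct_gain(1600, 1824)))  # 14
# GeForce 8800 GTX: 575 -> 654MHz core, 1800 -> 2020MHz DDR memory
print(round(pct_gain(575, 654)))    # 14
print(round(pct_gain(1800, 2020)))  # 12
```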
Version and / or Patch Used: Build 120
Developer Homepage: http://www.futuremark.com
Product Homepage: http://www.futuremark.com/products/3dmark05/
3DMark05 is now the second-latest version in the popular 3DMark “Gamers Benchmark” series. It includes a complete set of DX9 benchmarks which test Shader Model 2.0 and above.
For more information on the 3DMark05 benchmark, we recommend you read our preview here.
We can see in our first benchmark that the massive overclock on the GTS brings it to a similar level of performance to the stock-clocked GTX.
The GTX overclock wasn’t as significant as what the GTS offered, so while we do see an increase, it isn’t nearly as big as what is on offer from the GTS, which makes the GTS even more impressive considering the price difference between the two cards.
Benchmarks — 3DMark06
Version and / or Patch Used: Build 102
Developer Homepage: http://www.futuremark.com
Product Homepage: http://www.futuremark.com/products/3dmark06/
3DMark06 is the very latest version of the “Gamers Benchmark” from FutureMark. The newest version of 3DMark expands on the tests in 3DMark05 by adding graphical effects using Shader Model 3.0 and HDR (High Dynamic Range) lighting, which will push even the best DX9 graphics cards to the extremes.
3DMark06 also focuses not just on the GPU but on the CPU, using the AGEIA PhysX software physics library to effectively test single and dual-core processors.
The more intensive 3DMark06 also sees the significant overclock stand out for the GTS. While it does come close to the GTX, when we overclock the most expensive card, we see another nice boost in performance for the top dog.
Benchmarks — Half Life 2 (Lost Coast)
Half Life 2 (Lost Coast)
Version and / or Patch Used: Unpatched
Timedemo or Level Used: Custom Timedemo
Developer Homepage: http://www.valvesoftware.com
Product Homepage: http://www.half-life2.com
By taking the suspense, challenge and visceral charge of the original, and adding startling new realism, responsiveness and new HDR technology, Half-Life 2: Lost Coast opens the door to a world where the player’s presence affects everything around him, from the physical environment to the behaviors and even the emotions of both friends and enemies.
We benchmark Half Life 2: Lost Coast with our own custom timedemos so as to avoid possible driver optimizations, recording with the “record demo_name” command and loading the timedemo with the “timedemo demo_name” command.
We can see in our non-HDR tests that the GTS wasn’t quite hitting the CPU wall but when we overclocked, it was sitting up with the GTX.
We can also see that the overclocked GTX has no performance gains at all here. As soon as we turn on HDR, though, and start moving up the resolutions again, we can see the difference a lot more easily, with the overclocked GTS just trailing behind a stock-clocked GTX.
Benchmarks — PREY
Version and / or Patch Used: Unpatched
Timedemo or Level Used: HardwareOC Custom Benchmark
Developer Homepage: http://www.humanhead.com
Product Homepage: http://www.prey.com
PREY is one of the newest games to be added to our benchmark line-up. It is based on the Doom 3 engine and offers stunning graphics surpassing what we’ve seen in Quake 4, putting quite a lot of strain on our test systems.
The GTX saw no gains when it came to overclocking; the GTS, on the other hand, rocks along and gets some very nice gains across the board.
Benchmarks — F.E.A.R.
Version and / or Patch Used: Unpatched
Timedemo or Level Used: Built-in Test
Developer Homepage: http://www.vugames.com
Product Homepage: http://www.whatisfear.com/us/
F.E.A.R. (First Encounter Assault Recon) is an intense combat experience with rich atmosphere and a deeply intense paranormal storyline presented entirely in first person. Be the hero in your own spine-tingling epic of action, tension, and terror…and discover the true meaning of F.E.A.R.
We see gains in the minimum frame rates on both cards, but when we move to the averages, the GTX doesn’t move much from 1280 x 1024. It’s only at the higher resolutions that we really see a difference.
Benchmarks — Quake 4
Version and / or Patch Used: 1.2
Timedemo or Level Used: HardwareOC Custom Benchmark
Developer Homepage: http://www.idsoftware.com
Product Homepage: http://www.quake4game.com
Quake 4 is one of the latest games to be added to our benchmark suite. It is based on the popular Doom 3 engine and as a result uses many of the features seen in Doom. However, Quake 4’s graphics are more intensive than Doom 3’s and should put more strain on different parts of the system.
For some reason, apart from 1600 x 1200, the overclock really didn’t affect either card. Both saw a significant increase at 1600 x 1200, but at 1280 and 1920 we see that the cards score almost identically.
Benchmarks — Company of Heroes
Company of Heroes
Version and / or Patch Used: Demo
Timedemo or Level Used: Built-in Test
Developer Homepage: http://www.relic.com
Product Homepage: http://www.companyofheroesgame.com
Company of Heroes, or COH as we’re calling it, is one of the latest World War II games to be released and also one of the newest in our lineup of benchmarks. It is a super-realistic real-time strategy (RTS) game with plenty of cinematic detail and great effects. Because of its detail, it will stress out even the most impressive computer systems with the best graphics cards, especially when you turn up all the detail. We use the built-in test to measure the frame rates.
The overclock for the GTX doesn’t see too much happening in COH though the GTS shows significant gains across the board especially when it comes to the minimum FPS, which is clearly the most important.
Benchmarks — High Quality AA and AF
High Quality AA and AF
Our high quality tests let us separate the men from the boys and the ladies from the girls. If the cards weren’t struggling before, they will start to now. ATI cards and the GeForce 8800 series are able to offer HDR and AA at the same time, unlike older nVidia cards.
When we start to increase the detail, we see a similar picture to 3DMark06: the overclocked GTS is just trailing a stock-clocked GTX, and the overclocked GTX is still consistently ahead.
In Lost Coast when we aren’t seeing a CPU limitation, we can see that it is able to make use of the new found performance from both cards.
While PREY only saw gains at 1280 x 1024 with regular quality settings, when we turn on AA and AF, we can see at 1920 x 1200 both cards offer a good increase in performance.
The GeForce 8800 GTS offers an absolutely huge overclock thanks to how generous nVidia and its partners have been with the core. Generally speaking, the overclock brings it pretty close to the performance of a stock-clocked GeForce 8800 GTX. With that said, though, the GTX doesn’t have any problems with overclocking either, giving it the ability to go above and beyond what it was doing at default.
If you were thinking about getting a GTX and have thought, “Oh, maybe I will get a GTS instead, due to the overclock on offer…”, our advice is this: if you can afford the GTX, continue with that purchase, because it will be nice to have this kind of performance without having to overclock your graphics card.
One of the biggest things that you have to take note of is the frame rates you’re getting with these overclocked cards. A GTX at default speeds with AA, AF and HDR on at 1920 x 1200 gets an average of 104 FPS, and when overclocked, 114 FPS. Is it worth the mucking around for an extra 10 FPS? Yes, it is roughly 10% faster, but you’re honestly not going to see a real-world difference in game, and all you’re doing is increasing the amount of work your GPU is doing. Heck, you could probably even underclock the GTX if you wanted… but that’s just silly.
You have to understand our point, though: move away from 3DMark, move away from the difference between a 150FPS average and a 175FPS average, and think, is it really worth even contemplating overclocking your graphics card? It’s pretty safe to say you will come to the conclusion that it’s not, but others will want to overclock anyway.
Yes, overclocking can be done with ease and you do get a nice speed bump on the GTS — though honestly, you have to decide if it’s even worth it. If the GTS is all you can afford and you want to bump the AA up another level but it runs a little too slow, then do some overclocking. Although, unless you’re gaming at 1920 x 1200 and above, stock performance is clearly going to be enough for the time being on the majority of the latest games.
Let’s wait for some new games to be released, such as Crysis or UT2007, and then you’ll probably want to overclock as far as you can as these games will be able to stress out the GPU more than the current batch of games.
BFG Tech GeForce 8800 GTS OC 512MB
Since the current version of RivaTuner doesn’t work with Nvidia’s latest drivers, we chose to use Nvidia nTune for our overclocking endeavours with BFG Tech’s GeForce 8800 GTS OC 512MB card.
As a quick reminder, BFG’s factory-overclocked 8800 GTS 512MB comes with default speeds of 675MHz core, 1,674MHz stream processor and 1,940MHz (effective) memory.
After a couple of hours of tweaking and stability testing using Crysis (it’s a hard life… I know), we found that we were able to increase these frequencies relatively successfully, with the core clocked at 719MHz and the memory at 2142MHz (effective).
This represents increases of 101MHz (202MHz effective) on the memory and 44MHz on the core. If you factor in that the BFG Tech card already has a 25MHz core speed increase by default, the overclock represents a 69MHz increase – that’s not too bad in the grand scheme of things. Sadly, we are unable to report the shader clock increase at the moment because RivaTuner version 2.06 doesn’t work correctly with the drivers used – none of the Forceware-specific options are available.
With the recent price drops on Nvidia’s GeForce 8800 GT, GeForce 8800 GTS 512MB and GeForce 8800 GTX, there are some interesting options on the market for gamers. At the same time, we can’t forget the Radeon HD 3870 X2 either, but as we stated during our review earlier this week, we’re hesitant to give it a solid recommendation because of its reliance on drivers.
The cheapest GeForce 8800 GTS 512MB we’ve found is priced at £194.17, including VAT and it’s clocked at Nvidia’s reference speeds. BFG Tech’s card, on the other hand, costs about £10 more at £205.50 (inc. VAT) and comes with a fairly modest factory overclock, but it’s an overclock nevertheless. The result is a relatively unnoticeable performance increase, but what it should mean is that there is the potential for a higher overclock as the chips are qualified to run at higher-than-standard speeds.
Update: it’s also available on OcUK for £204.44 (inc. VAT), but remember that you’ll get free delivery from Scan if you’re a regular contributor in the bit-tech forums.
In addition to that, BFG sweetens the deal with its 10-year warranty in Europe (and a Lifetime warranty in North America) – and there are few graphics board partners that match BFG on this front.
We couldn’t talk about the GeForce 8800 GTS 512MB without mentioning the alternatives—such as the GeForce 8800 GT and GeForce 8800 GTX—that are tentatively priced either side of the card we’re looking at here today. Ultimately, the choice on whether you need to spend £150, £200 or £250 will depend on your requirements.
1680×1050 seems to be the optimal resolution for the GeForce 8800 GTS 512MB, but performance doesn’t tail off too much at 1920×1200; therefore, if you’re gaming on a higher-resolution screen, we’d recommend plumping for a GeForce 8800 GTX or Radeon HD 3870 X2. However, if you’ve got a 1280×1024 screen you should probably save the cash and opt for a GeForce 8800 GT. It’s also worth mentioning that the GeForce 8800 GT is a pretty capable card at 1680×1050 as well – although it’s not as competent as the 8800 GTS 512MB, where it’s around 15 percent slower on average.
You’re probably wondering why I’ve not mentioned the alternatives from ATI yet – that’s because there really isn’t any alternative at this price point. The newly-released Radeon HD 3870 X2 typically retails for around £270 (inc. VAT)—some £65 more than the BFG Tech GeForce 8800 GTS 512MB. And at the other end of the scale, the Radeon HD 3870 is available for around £130 (inc. VAT) – that’s about £65 less than the cheapest 8800 GTS 512MB and it’s in a different performance class.
So, BFG Tech’s GeForce 8800 GTS OC 512MB appears to have hit a price point that can’t be matched by anything other than stock-clocked GeForce 8800 GTS 512MB cards and as such it earns a recommendation from us. However, it’s important to make sure that it’s going to be connected to a 1680×1050 or 1920×1200 display, as that will show the card in its best possible light. You can get away with running the BFG Tech GeForce 8800 GTS 512MB on higher or lower resolution screens, but the benefits of the card aren’t going to be quite so profound.
The MSI NX8800GTX-T2D768E-HD videocard is pretty darn quick in its own right, without anyone messing around with it. That doesn’t mean PCSTATS is going to spare this flagship videocard from the overclocking tests.
We’ll be starting today’s overclocking tests with the nVidia ‘G80’ GPU first. It is clocked at a default speed of 576 MHz, and that frequency will be increased in 5 MHz increments.

Using RivaTuner, the GeForce 8800GTX core turned out to be a pretty good overclocker and easily cracked 600 MHz. Shortly thereafter we passed 610, 620 and 630 MHz without any trouble. nVIDIA may not want videocard manufacturers to factory overclock the G80, but that doesn’t mean it doesn’t overclock well!
The MSI NX8800GTX-T2D768E-HD hit a top speed of 665 MHz before RivaTuner started to complain about it not passing the internal test. If we disabled the internal test, the core could probably go higher, but with stock cooling I didn’t want to push too hard. There are reports online that the G80 scales well with watercooling, TEC or phase change cooling solutions, but unfortunately none of those types of cooling were available at the time of this review.
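The stepwise search described above (raise the clock one increment, re-test, repeat until the driver’s stability check fails) can be sketched in a few lines of Python. The `passes_test` callback is a hypothetical stand-in for RivaTuner’s internal test, and the 665 MHz limit simply mirrors the result we saw:

```python
def find_max_clock(stock_mhz, step_mhz, passes_test, ceiling_mhz=2000):
    """Raise the clock one step at a time and keep the last speed
    that still passes the stability test."""
    clock = stock_mhz
    while clock + step_mhz <= ceiling_mhz and passes_test(clock + step_mhz):
        clock += step_mhz
    return clock

# Hypothetical stand-in for RivaTuner's internal test: pretend this
# particular G80 core is only stable up to 665 MHz, as in our sample.
best = find_max_clock(576, 5, lambda mhz: mhz <= 665)
print(best)  # 661, the last 5 MHz step under the 665 MHz ceiling
```

Note that stepping in fixed increments from 576 MHz means the search lands on the last multiple of 5 below the true limit, which is why finer increments near the ceiling can squeeze out a few extra megahertz.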
Next up was the 768MB of Samsung GDDR3 memory. This videocard’s memory is clocked at an effective 1800 MHz by default, or 1.8GHz. We were a bit more aggressive when it came to overclocking the videocard’s memory, and took it in 20 MHz strides. In the past we have seen Samsung GDDR3 memory overclock well this way, so we were anticipating some really fast results this time.
The MSI NX8800GTX-T2D768E-HD’s memory didn’t disappoint. We broke through 1900 MHz in no time, and closed in on 2000 MHz just as fast. The 2 GHz plateau came and went, as did the 2100 MHz barrier!

A few moments later we reached the end of the game, with the memory at its maximum speed of 2132 MHz. RivaTuner started to complain about failing the internal driver test, so we didn’t try for anything higher.
One of the interesting results of all of this overclocking was that it didn’t actually improve the benchmark results of the MSI NX8800GTX-T2D768E-HD videocard across the board. However, as you saw from the power draw tests, overclocking does have a noticeable impact on total system power consumption.

Aside from reaching for 3DMark records, the NX8800GTX-T2D768E-HD is fast enough for hardcore gamers at its stock speeds, even those of you who play at 1600×1200+ with AA/AF. It’s hard to believe, but overclocking really isn’t all that necessary with this GPU.
The details of how the MSI NX8800GTX-T2D768E-HD test system was configured for benchmarking (the specific hardware, software drivers, operating system and benchmark versions) are indicated below. In the second column are the general specs for the reference platforms this nVIDIA GeForce 8800GTX based videocard is to be compared against. Please take a moment to look over PCSTATS’ test system configurations before moving on to the individual benchmark results on the next page.
|PCSTATS Test System Configurations|
Seven GeForce 8800 series graphics cards compared
NVIDIA’S GEFORCE 8800 SERIES is a jaw-dropping marriage of performance and image quality that has raised the bar for PC graphics substantially. Not since ATI’s Radeon 9700 Pro have we been so impressed by a single graphics card. The G80 GPU is simply a marvel, and if you’re looking to buy a high-end graphics card today, it’s the only chip you want.
Of course, your quest for the best graphics card won’t end there; you also have to choose between GTS and GTX flavors of the GeForce 8800. And you’re still not done, because GeForce 8800 GTS and GTX cards are available from a wide variety of manufacturers, each of which tries to bring something unique to the table, be it through bundled extras, tweaked clock speeds, or exotic cooling.
As daunting as the selection of GeForce 8800 series graphics cards may be, choice is a good thing. To help you wade through the options, we’ve rounded up a collection of GeForce 8800 series cards from BFG Tech, EVGA, Foxconn, MSI, OCZ, PNY, and XFX to see how they stack up. Read on to see which cards rise to the top and which get lost in the reference card shuffle.
GT to the S… or X
Before diving into card-specific attributes, it’s worth taking a moment to highlight some of the key differences between GTS and GTX flavors of the GeForce 8800. If you haven’t already, I’d strongly suggest reading our initial coverage of the GeForce 8800, which explores the intricate details of the G80 architecture and why it’s such a radical departure from current GPU designs. We’ll stick to the basics here, starting with a breakdown of what Nvidia has lopped off the GeForce 8800 GTX to make the GTS.
|                  | Stream processors | ROPs | Core clock | Memory bus width | Memory clock |
|------------------|-------------------|------|------------|------------------|--------------|
| GeForce 8800 GTS | 96                | 5    | 500MHz     | 320-bit          | 1.6GHz       |
| GeForce 8800 GTX | 128               | 6    | 575MHz     | 384-bit          | 1.8GHz       |
The GeForce 8800 GTS only retains 96 of the GTX’s 128 stream processors, cutting the chip’s shader power by 25%. Nvidia further handicaps the GTS by trimming the number of ROPs from six to five, and by reducing the memory bus width from 384 to 320 bits.
Not content to rely solely on microsurgery to separate the GTS from the GTX, Nvidia also uses clock speeds to differentiate the two. The GTX’s 575MHz core clock is reduced to just 500MHz for the GTS, and effective memory speeds drop from 1.8GHz to 1.6GHz. Those clock speeds aren’t cut in stone, though. The first wave of GeForce 8800 series cards may have stuck with stock speeds, but several of the cards we’ll be looking at today offer higher out-of-the-box frequencies.
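Those cuts compound, because peak memory bandwidth is just bus width times effective data rate. A quick back-of-the-envelope sketch (figures derived from the specs above, not quoted from Nvidia):

```python
def memory_bandwidth_gbs(bus_width_bits, effective_clock_mhz):
    # bytes per transfer (bus width / 8) times transfers per second
    return bus_width_bits / 8 * effective_clock_mhz * 1_000_000 / 1e9

gts = memory_bandwidth_gbs(320, 1600)
gtx = memory_bandwidth_gbs(384, 1800)
print(f"GTS: {gts:.1f} GB/s, GTX: {gtx:.1f} GB/s")  # GTS: 64.0 GB/s, GTX: 86.4 GB/s
```

The narrower bus and slower memory together leave the GTS with roughly three quarters of the GTX’s peak bandwidth, which lines up neatly with its 25% shader-power cut.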
We should note that there are now two versions of the GeForce 8800 GTS: the original with 640MB of memory, and a new model with 320MB. The 320MB card can be had for around $300, which is about $90 cheaper than the most affordable 640MB card. However, we’ve found that the GeForce 8800 GTS 320MB’s reduced memory size can be a liability with newer games at higher resolutions. All the GTS cards we’ll be looking at today are 640MB models.
Referencing seven designs
GeForce 8800 series graphics cards revolve around the same Nvidia reference designs, and apart from the unique heatsink stickers offered by each manufacturer, you’d be hard-pressed to tell one card from another. Board vendors can’t be blamed for sticking to the reference designs, though. Instead of supplying its add-in board partners with graphics chips, Nvidia has GeForce 8800 series cards built by a contract manufacturer. Those cards are then sold to Nvidia’s board partners, effectively eliminating custom or tweaked board designs.
Nvidia says that add-in-card partners are free to offer their own customization, but that customization is effectively limited to coolers, clock speeds, and bundles. Interestingly, though, Nvidia won’t confirm whether it sells factory overclocked cards directly to board vendors or the vendors are doing the overclocking on their own. Nvidia apparently doesn’t release that much detail regarding its arrangements with add-in board partners.
So board partners may not have much freedom when it comes to card-specific features, but there’s still a little room for differentiation. Exhibit one:
| Card                       | GTS/GTX | Core clock | Memory clock | Memory size | Sticker            | Warranty length              | Street price |
|----------------------------|---------|------------|--------------|-------------|--------------------|------------------------------|--------------|
| BFG GeForce 8800 GTS       | GTS     | 513MHz     | 1.58GHz      | 640MB       | Brooding Mr. Clean | Lifetime                     |              |
| EVGA GeForce 8800 GTX ACS³ | GTX     | 626MHz     | 2.0GHz       | 768MB       | NA                 | Lifetime*                    |              |
| Foxconn FV-N88SMBD2-ONOC   | GTS     | 575MHz     | 1.8GHz       | 640MB       | 3D space scene     | 2 years                      |              |
| MSI NX8800GTX              | GTX     | 576MHz     | 1.8GHz       | 768MB       | Fairy princess     | 3 years parts, 2 years labor |              |
| OCZ GeForce 8800 GTX       | GTX     | 576MHz     | 1.8GHz       | 768MB       | Sports car         | Lifetime                     |              |
| PNY XLR8 GeForce 8800 GTS  | GTS     | 513MHz     | 1.58GHz      | 640MB       | XLR8 logo          | 5 years*                     |              |
| XFX GeForce 8800 GTX       | GTX     | 576MHz     | 1.8GHz       | 768MB       | Armored wolfman    | “Double lifetime”            |              |
Today’s contestants are split between three GTS cards and four GTXs, most of which are running at stock speeds (513MHz appears to be the actual stock clock speed for the 8800 GTS). There are a couple of exceptions, though. EVGA’s GeForce 8800 GTX ACS³ pushes the GTX’s clocks to 626MHz core and an effective 2.0GHz memory, while Foxconn’s FV-N88SMBD2-ONOC cranks the GTS up to GTX speeds.
Fortunately, so-called factory “overclocking” doesn’t affect warranty coverage—all cards are covered at their shipping clock speeds, regardless of whether those are higher than Nvidia’s default clocks for the GTS and GTX. There’s plenty of variety when it comes to warranties, too; some manufacturers only offer a few years of coverage, while others pledge lifetime support.
As one might expect, prices vary from card to card, as well. Higher-clocked models tend to be more expensive, but that’s not always the case. Bundled extras also alter the value proposition, and we’ll be detailing all the extra goodies you get in the box in a moment. We’ll also be taking a closer look at those always-exciting heatsink stickers.
BFG’s GeForce 8800 GTS
No OC, for a change
BFG Tech has made a name for itself by offering higher-than-stock clock speeds on virtually all of its graphics cards. Curiously, though, the company’s GeForce 8800 GTS isn’t “overclocked in the box.” Yes, the GPU’s 513MHz core clock is just a smidge higher than the 500MHz Nvidia recommends, but its memory actually runs a tad slower than the 800MHz prescribed by Nvidia, at 792MHz—an effective 1.58GHz once we take DDR’s clock-doubling effect into account. Don’t get too excited, though; that looks to be the de facto standard for GTS cards. Our stock-clocked PNY GeForce 8800 GTS also has a 513MHz core and 792MHz memory clock, and we’d expect all GTS cards to follow suit.
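The clock-doubling arithmetic is simple enough to show directly; GDDR3 moves data on both edges of the clock, so the marketed “effective” speed is twice the base clock:

```python
def effective_ddr_mhz(base_clock_mhz):
    # GDDR3 transfers data on both clock edges, hence the doubling
    return 2 * base_clock_mhz

print(effective_ddr_mhz(792))  # 1584 -> marketed as "1.58GHz"
print(effective_ddr_mhz(800))  # 1600 -> Nvidia's nominal 1.6GHz spec
```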
With Nvidia selling finished reference cards to its add-in board partners, BFG’s GeForce 8800 GTS looks just like everyone else’s. Well, apart from the brooding Mr. Clean sticker on the heatsink, that is. I’m not quite sure what BFG is getting at with the sticker (perhaps that the card’s performance is so beyond your comprehension that it will give you a headache), but it definitely sets the card apart.
Like every other GTS, this BFG model has a single six-pin PCIe power connector, a pair of dual-link DVI outputs, and a video port capable of standard and high-definition output.
BFG complements the card’s output ports with a small collection of cables, including a couple of DVI-to-VGA adapters, a molex power adapter, and a video dongle. The video dongle only features component outputs, but you can plug an S-Video cable directly into the card’s video output port.
Of course, the gravy train of extras doesn’t end there. BFG also throws in a pack of Teflon mouse feet and a rather nice black t-shirt. I’ve actually used BFG’s Teflon pads on some older mice, and they work pretty well. I’d wear the shirt, too, if it weren’t an extra large. That’s a little big for my frame, but probably just right for the stereotypical North American gamer.
The average gamer should also know what BFG stands for, but for those who don’t, there’s a helpful sticker in the box. Unfortunately, I’m far too out of touch with today’s 1337 gamers to have any clue what OMGWTFBFGSAUCE is, but I assume it’s spicy.
While BFG includes plenty of extras with its GeForce 8800 GTS, you don’t get much in the way of software—just a driver CD. There’s also an interesting note suggesting that instead of returning a defective card to the place of purchase, you should contact BFG directly. Apparently, BFG would rather you deal with their customer support than that of a retailer, and based on the experiences I’ve had with retailer support, that’s probably a good idea. BFG offers free 24/7 technical support with the card via a toll-free number, and when combined with the company’s lifetime warranty, that’s quite a lifeline.
EVGA’s GeForce 8800 GTX ACS³ Edition
Shrouding the stock cooler
We’ve seen enough Special Ultra Xtreme XXX Golden Sample Edition graphics cards to last a lifetime, but EVGA’s GeForce 8800 GTX ACS³ Edition is unique, and not because it has some fancy superscript in its name. No, the ACS³ is distinctive because it’s the only card in this round-up that’s done something really different with Nvidia’s stock cooler, and we don’t just mean another sticker.
In addition to its unique cooler, the ACS³ also has the distinction of being the fastest card in the roundup. EVGA pushes the card’s core clock speed to 626MHz, which is nearly a 9% jump. The memory clock speed has also been increased from an effective 1.8GHz to an even 2.0GHz, which works out to an 11% boost.
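Those percentages are easy to verify from the clocks in the card table; a quick sketch:

```python
def oc_gain_pct(stock_mhz, overclocked_mhz):
    # percentage gain of a factory overclock over the stock clock
    return (overclocked_mhz - stock_mhz) / stock_mhz * 100

print(round(oc_gain_pct(576, 626), 1))    # 8.7, the "nearly 9%" core jump
print(round(oc_gain_pct(1800, 2000), 1))  # 11.1, the memory boost
```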
But it’s the ACS³ that really grabs your attention. This is the third revision of EVGA’s Advanced Cooling System, and the company actually has a patent on the design, which is described as follows:
Graphics card apparatus with improved heat dissipation and including a planar metallic cover plate having an external perimeter configuration that generally corresponds to the plan-form of the printed circuit board used in the graphics card assembly, a plurality of thermal transfer blocks that can be selectively affixed to sources of thermal energy on the graphics card assembly and thermally coupled to the cover plate, and a fan and carriage therefor comprised of a heat sink and flow directing structure.
And also, it looks menacing. If Darth Vader had a graphics card, this would be it.
Flipping the card reveals that there’s a little ACS action on the underside, too. A beefy, finned back plate sits directly below the graphics and memory chips, allowing for better heat dissipation on both sides of the card.
One might be tempted to assume that the ACS³ is a radical departure from Nvidia’s stock GeForce 8800 series cooler, but that would be a mistake. The ACS³ is really more of a metal replacement for the plastic shroud that directs airflow with the stock cooler. It is a larger shroud, though.
As you can see in the picture above, a standard GeForce 8800 GTX cooler stops short of the edge of the card. The ACS³, however, extends the entire length of the card, with plenty of vents cut to encourage airflow.
Removing the ACS³ reveals a standard GeForce 8800 series cooler under the hood. Normally, we’d encourage graphics card makers to experiment with exotic heatsinks that deviate from the reference design, but Nvidia’s latest high-end graphics card coolers are among the best we’ve ever used. They do a heck of a job dumping warm air out the back of a case, and they barely make a sound in the process. EVGA hasn’t messed with what works here; they’ve just added a little twist of ACS to the equation.
Although not nearly as exciting as the Darth Vader cooler shroud, the EVGA card’s assortment of bundled extras is reasonably complete. In addition to a pair of DVI-to-VGA adapters, you also get two molex adapters for the card’s dual PCIe power plugs, a component video output dongle, and an S-Video cable.
EVGA throws in the requisite driver CD and a copy of Dark Messiah of Might and Magic, too. Most vendors are selling the game for around $40, so it’s a decent addition to the bundle, if adventure games are your thing. If they’re not, the game is new enough that you should be able to roll it on EBay and make a few bucks.
Like BFG, OCZ, and XFX, EVGA offers a lifetime warranty for its new graphics cards. However, to get the lifetime warranty, you have to register the card within 30 days of purchase, or you get stuck with only one year of coverage. Registration really isn’t a big deal, but setting a 30-day deadline is a little harsh, especially when no one else has a similar registration cut-off for their lifetime warranties.
To EVGA’s credit, the company offers its customers a unique step-up program that provides a measure of upgrade incentive protection. After buying an EVGA graphics card, you have 90 days to decide whether it’s fast enough. If you get the upgrade itch during that time—either to keep up with the Joneses or because a graphics chip refresh has juggled prices enough to let you jump up in performance without laying out too much cash—EVGA will let you trade in your card toward a more expensive model. You can’t get your money back, and you can only “step-up” once for each card you purchase. Still, it’s nice to know the option is there.
Foxconn’s FV-N88SMBD2-ONOC
Making a statement
Foxconn only recently got into the retail graphics card game, and despite the fact that the company is best known for relatively bland budget and OEM designs, its take on the GeForce 8800 GTS is anything but. Of course, the card can still be identified as a Foxconn; the company is known for incredibly awkward and cryptic product names, and FV-N88SMBD2-ONOC fits the bill on that front beautifully.
We have no clue what FV-N88SMBD2-ONOC actually means, but the OC at the end hints that the normally reserved and buttoned-down Foxconn is doing a little factory “overclocking.” Indeed, a rather innocuous little sticker on the card’s box indicates that this is an “Over-Clocking Version,” whatever over-clocking is.
Impressively, Foxconn isn’t messing around with modest clock speed hikes, either. The FV-N88SMBD2-ONOC is essentially a GeForce 8800 GTS running at GTX speeds—the core clock has been bumped from 500MHz to 575MHz, and the memory from 1.6GHz to an effective 1.8GHz. Don’t expect GTX performance, though; Foxconn can’t make up the GTS’ fewer stream processors, fewer ROPs, and narrower memory bus width.
Outside of its clock frequencies, this card is about as bland as the rest. I’m not even sure what to make of the heatsink sticker, other than that it looks like something an intern whipped up in 3D Studio in about 10 minutes, rendering time included. That would make the FV-N88SMBD2-ONOC the ugliest GeForce 8800 in this round-up. After seeing that EVGA card naked, exposing the reference heatsink in all its glory, I think Foxconn would’ve been better off with a clear plastic shroud and no sticker at all.
Things start to perk up for the Foxconn card when we begin digging around in the box, though. In addition to a couple of DVI-to-VGA adapters and a molex plug adapter for the card’s PCIe power connector, you also get a video output dongle with component, composite, and S-Video outputs. And that’s not all.
Dig deeper, and you’ll find a USB game controller with dual analog sticks and loads of buttons. We’ve seen graphics cards bundled with game controllers before, but perhaps not nearly often enough. I’d certainly rather have a game controller than a T-shirt that doesn’t fit or a game I’m not particularly interested in playing.
Speaking of software that doesn’t particularly interest me, Foxconn also throws in copies of RestoreIT 7 and VirtualDrive Pro 10. They make a big deal about it on the box, too, claiming that with the game controller, there’s $180 worth of extras included. The only problem is that VirtualDrive and RestoreIT are currently selling for $30 and $40, respectively. According to Foxconn, that makes what feels like a $25 game controller worth closer to $110. Maybe it’s the new math.
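Foxconn’s bundle math works out like this (street prices as quoted above):

```python
claimed_bundle_value = 180        # Foxconn's number, from the box
software_street_price = 30 + 40   # VirtualDrive Pro 10 + RestoreIT 7
implied_controller_value = claimed_bundle_value - software_street_price
print(implied_controller_value)   # 110, for a pad that feels closer to $25
```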
Perhaps that same new arithmetic is responsible for the FV-N88SMBD2-ONOC’s uninspired two-year warranty, as well. That’s the shortest coverage period in the bunch and the Foxconn card’s real Achilles’ heel. When competitors are offering variations on a lifetime warranty with their graphics cards, two years looks pretty shabby.
MSI’s NX8800GTX
Exactly as expected
Although better known for its motherboards, MSI has long offered a wide range of graphics cards. The company is one of only a handful that sells cards based on GPUs from both Nvidia and ATI/AMD, as well. What’s even more impressive than MSI’s penchant for playing both sides is the fact that the company’s graphics card lineup is peppered with interesting products and features, including factory “overclocked” cards, custom coolers, and HDMI output.
Unfortunately, you won’t find any of those exotic features on the NX8800GTX, which, despite a unique name, is about as bone stock as the GeForce 8800 GTX comes.
Surprisingly, though, the NX8800GTX is the only card in the bunch to leverage sex appeal on its heatsink sticker. At least MSI has been subtle about it; the fairy princess (RPG aficionados feel free to correct me here) is rather restrained in comparison to the leather-clad and half-naked ladies we’ve seen showcased by ATI and Nvidia. Heck, she almost looks angelic, making this card a far cry from an XXX Edition.
Even so, MSI takes care to cover each of the card’s DVI outputs with a plastic cap, presumably for, er, protection. That doesn’t strike us as entirely necessary, especially since MSI hasn’t bothered to protect the card’s video output port with a similar cap. If anything, we think that’s the port most likely to go unused.
If you do want to tap the video output port, MSI supplies a dongle with component and S-Video outputs. There’s also an S-Video cable in the box alongside a couple of DVI-to-VGA adapters.
On the software front, MSI packs in a number of little applications, including one that allows for automatic graphics card overclocking. Unfortunately, that app requires that you use MSI’s graphics drivers, so you can’t just run the latest ForceWare release. Since Nvidia has pledged to update its graphics drivers every month from here on out, you probably don’t want to lose out on fresh drivers just to use MSI’s overclocking utility.
We’re not particularly enamored with many of the little MSI-branded apps included with the NX8800GTX, but there are a few third-party software titles you might want to play with. CyberLink’s PowerCinema and Power2Go apps come in the box, and you also get a copy of Serious Sam II. The older Serious Sam SE was a staple of graphics card bundles long after the game hit the bargain bin, but Serious Sam II is at least recent enough to be selling online for around $25 still.
As entertaining as Serious Sam II is to play, I’d trade it in a heartbeat for a longer warranty on the NX8800GTX. Parts are covered for three years, but MSI only takes care of labor for the first two, making this the second-shortest warranty of the bunch.
OCZ’s GeForce 8800 GTX
Back in the game
The last time we had an OCZ graphics card in-house for testing was way back in 2001 with the Titan 3. Powered by a then-cutting-edge GeForce3 graphics chip, the Titan 3 was a revelation at the time; you got factory “overclocking,” an aftermarket Blue Orb cooler, and chunky aluminum ramsinks.
My, how things have changed.
OCZ is riding the GeForce 8800 GTX for its return to the graphics game, and the card bears little resemblance to the Titan 3 that defined what an enthusiast graphics card should be so many years ago. The OCZ GeForce 8800 GTX is, in fact, just another re-badged reference design. You won’t find any custom cooling solutions here, and the clock speeds are bone stock. OCZ does say that its cards have been “hand-selected,” though, and that it is committed to delivering “the highest possible headroom for overclocking.” We’ll see how that commitment pans out in our overclocking tests, but first, have a gander at the card.
Yeah, that’s definitely no Titan 3. This shrouded design no doubt offers better airflow than a naked heatsink, but it doesn’t look as good, at least not to those of us with a tendency to get hot and bothered by heatpipes and cooling fins.
In an attempt to give you something to look at through that case window, OCZ adorns the 8800 GTX heatsink with its own sticker depicting a bright green sports car that’s just ambiguous enough to avoid identification. This is the first heatsink sticker we’ve seen make a car analogy, and although bright green is an almost inhumane color for a sports car, at least they didn’t slap a spoiler onto the card. A carbon fiber heatsink shroud would have been trick, though.
What the OCZ card lacks in carbon it makes up with cables. Loads of them. Alongside a pair of DVI-to-VGA adapters and those all-important PCIe power connectors, you also get a video output dongle that provides component and S-Video outputs, and composite and S-Video cables. The composite video cable’s a little out of place, though; both the card and output dongle lack a composite output port, so there’s nothing to plug the cable into.
OCZ may give you a cable you don’t actually need, but they haven’t loaded the box up with software you’re not going to use. A simple driver CD is all that’s included, and that suits us just fine.
What we can’t provide on our own is warranty coverage, so it’s a good thing that OCZ offers a lifetime guarantee with the card. OCZ’s warranty support page doesn’t demand registration within a set time period to take advantage of the lifetime warranty coverage, either. A lifetime warranty doesn’t actually mean that one GeForce 8800 GTX is going to be more reliable than another—especially not with Nvidia’s contract manufacturer building all the cards—but it at least entitles you to a replacement anytime.
PNY’s XLR8 GeForce 8800 GTS
PNY is pushing its latest GeForce 8800 series graphics cards under a new XLR8 performance brand that manages to capture a couple of well-worn enthusiast clichés with just four characters. If there’s one thing enthusiasts need, it’s more capital Xs and numbers masquerading as letters.
Of course, we tend to scoff at most branding exercises, so it’s not surprising that XLR8 isn’t resonating with us. The fact that the card doesn’t offer much in the way of actual acceleration doesn’t help, either; this is a standard GeForce 8800 GTS running at 513MHz core and 1.58GHz memory. For some reason, those stock clock speeds warrant a “Performance Edition” moniker that PNY proudly scribes across the card’s heatsink sticker.
XLR8 branding is really all there is to the sticker, which looks like it was ripped from an iPod nano commercial. Not that looks matter much. Even with a case window, admirers are going to have a hard time seeing the heatsink sticker when the card is installed in a tower enclosure. Board vendors might as well put their logos along the top edge of the card, since that’s all you’re really going to see once the card is installed in a system.
Interestingly, the PNY card is one of only a few in this round-up to come on a green printed circuit board. Either Nvidia has more than one contract manufacturer, or they’ve switched board colors somewhere along the way.
Sifting through the array of goodies PNY has included in the box reveals a standard collection of adapters and an S-Video cable. A video output dongle is also included that provides component, composite, and S-Video outputs in a single block.
Things are pretty simple on the software front, with a driver CD and download coupon for System Mechanic 7. Since this is the least expensive card of the bunch, we don’t mind the lack of bundled extras. We don’t even mind the fact that there’s an extra step if you want to maximize your warranty coverage. The card is covered by a standard three-year warranty, but users can add two years to that just by registering their card with PNY. Registration effectively brings the XLR8 card’s warranty up to five years, which might not match the lifetime terms offered by some manufacturers, but should last the useful life of the card.
XFX’s GeForce 8800 GTX
We’ve poked fun at XFX’s “XXX Edition” cards on more than one occasion, but we have a decidedly more conservative GeForce 8800 GTX in today’s round-up. This particular model runs at stock clock speeds and sits at the bottom of XFX’s GTX range, just below Extreme and XXX versions that boast faster core and memory clocks. All three flavors feature the same G80 GPU, board design, and Nvidia cooler, though, and you’re free to overclock this stock version on your own. We certainly did, and with impressive results.
You’ve seen essentially the same 8800 GTX card four times now, and with the exception of EVGA’s fancy ACS³ aluminum shroud, not much has changed other than the sticker on the heatsink.
XFX’s artistic statement features some sort of armored wolfman. Or maybe it’s just a wolf, minus the man—vents on the cooler make it hard to tell one way or another.
Like all the others in this round-up, the XFX card’s back plate has a classy pewter finish that’s a pleasant departure from the boring black and silver, and gaudy gold, that dominate the PC world.
After seeing a game controller bundled with Foxconn’s GTS, I almost expected one to appear in the XFX GeForce 8800 GTX’s box. After all, it was a bundled game controller that won us over when we compared XFX’s GeForce 7800 GTX Overclocked with a couple of other 7800 GTXs a year and a half ago. Sadly, though, XFX doesn’t offer many extras with its standard GeForce 8800 GTX. You do get a couple of DVI-to-VGA adapters, a component output dongle, and an S-Video cable, but there are no PCIe power adapters in the box. Since the GeForce 8800 GTX has two six-pin PCIe power plugs, you’ll want to make sure your power supply has enough connectors or pick up some power adapters on your own.
A plain driver CD rounds out the card’s bundle in unspectacular fashion, but XFX does have an ace up its sleeve. While BFG, EVGA, and OCZ boast lifetime warranties, XFX offers what it calls a “double lifetime” warranty on its GeForce 8800 GTX. Double lifetime coverage doesn’t actually last two lifetimes; instead, it covers the second owner of the card, should it ever be resold. Registration is necessary for the card’s original and second owners, but that’s a small price to pay for what continued warranty coverage can add to the card’s resale value down the road.
Our testing methods
Since we’ve narrowed today’s focus to the unique attributes offered by each of Nvidia’s add-in board partners, we won’t spend too much time testing 3D gaming performance. For a more in-depth look at how the performance of the GeForce 8800 GTS and GTX compare to each other and a wide range of competitors, check out our initial review of the cards.
All tests were run at least twice, and their results were averaged, using the following test systems.
| Component               | Details                                                            |
|-------------------------|--------------------------------------------------------------------|
| Processor               | Core 2 Duo E6700 2.67GHz                                           |
| System bus              | 1066MHz (266MHz quad-pumped)                                       |
| North bridge            | Nvidia nForce 680i SLI SPP                                         |
| South bridge            | Nvidia nForce 680i SLI MCP                                         |
| Chipset drivers         | ForceWare 9.53                                                     |
| Memory size             | 2GB (2 DIMMs)                                                      |
| Memory type             | Corsair TWIN2X2048-8500C5 DDR2 SDRAM at 800MHz                     |
| CAS latency (CL)        | 4                                                                  |
| RAS to CAS delay (tRCD) | 4                                                                  |
| RAS precharge (tRP)     | 4                                                                  |
| Cycle time (tRAS)       | 12                                                                 |
| Audio                   | Integrated nForce 680i SLI MCP/ALC885 with Realtek HD 1.54 drivers |
| Graphics                | BFG GeForce 8800 GTS 640MB PCI-E; EVGA GeForce 8800 GTX ACS³ 768MB PCI-E; Foxconn FV-N88SMBD2-ONOC 640MB PCI-E; MSI NX8800GTX 768MB PCI-E; OCZ GeForce 8800 GTX 768MB PCI-E; PNY XLR8 GeForce 8800 GTS 640MB PCI-E; XFX GeForce 8800 GTX 768MB PCI-E |
| Graphics driver         | ForceWare 97.92 drivers                                            |
| Hard drive              | Western Digital Caviar RE2 400GB                                   |
| OS                      | Windows XP Professional                                            |
| OS updates              | Service Pack 2                                                     |
Thanks to Corsair for providing us with memory for our testing. 2GB of RAM seems to be the new standard for most folks, and Corsair hooked us up with some of its 1GB DIMMs for testing.
Also, all of our test systems were powered by OCZ GameXStream 700W power supply units. Thanks to OCZ for providing these units for our use in testing.
We used the following versions of our test applications:
- Futuremark 3DMark06 Build 1.02
- F.E.A.R. 1.08
- The Elder Scrolls IV: Oblivion 1.1
The test systems’ Windows desktop was set at 1280×1024 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.
All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.
We’ve narrowed our performance testing to 3DMark06’s Shader Model 3.0 tests and F.E.A.R.’s built-in performance benchmark. Both were run at 1920×1440—the highest resolution supported by our test monitor—with 4X antialiasing and 16X anisotropic filtering.
Obviously, the GeForce 8800 GTX cards have a considerable lead over the GTS models. What’s more interesting to see, however, is how much the higher clocked models manage to outpace their competition. The EVGA ACS³ makes more of its clock speed advantage over the rest of the GTX field than the Foxconn does over other GTS cards. That’s notable because it’s the Foxconn card that actually enjoys a greater clock speed jump, at least percentage-wise, over its stock-clocked compatriots.
The handy automatic overclocking utility built into Nvidia’s graphics drivers and then relocated to its nTune system utility doesn’t seem to be working properly with the GeForce 8800 series, so we had to kick it old-school with manual slider manipulation and loads of trial-and-error testing. Each of our overclocked configurations had to loop successfully through three iterations of 3DMark’s Shader Model 3.0 tests at 1920×1440 with 4X antialiasing and 16X aniso, and then endure ten minutes of Oblivion at the same graphics settings.
We were able to hit the following core and memory clock speeds with each card (doubling the memory clock speed gives you the effective memory clock):
- BFG GeForce 8800 GTS — 653MHz core, 962MHz memory
- EVGA GeForce 8800 GTX ACS3 — 643MHz core, 1048MHz memory
- Foxconn FV-N88SMBD2-ONOC — 650MHz core, 1060MHz memory
- MSI NX8800GTX — 629MHz core, 1053MHz memory
- OCZ GeForce 8800 GTX — 622MHz core, 1047MHz memory
- PNY XLR8 GeForce 8800 GTS — 653MHz core, 1059MHz memory
- XFX GeForce 8800 GTX — 659MHz core, 1060MHz memory
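The effective memory clocks and percentage gains implied by these results can be sketched quickly. The stock clocks used below (GTS: 512 MHz core / 800 MHz memory, GTX: 575 MHz core / 900 MHz memory) are taken from the reference specifications quoted elsewhere in this article, not from the list above, so treat them as assumptions:

```python
# Sketch: effective memory clocks and overclocking headroom for the cards above.
# Stock reference clocks (core MHz, memory MHz) are assumptions from the
# reference specs cited elsewhere in the article.
STOCK = {"GTS": (512, 800), "GTX": (575, 900)}

results = {
    "BFG GeForce 8800 GTS": ("GTS", 653, 962),
    "EVGA GeForce 8800 GTX ACS3": ("GTX", 643, 1048),
    "Foxconn FV-N88SMBD2-ONOC": ("GTS", 650, 1060),
    "MSI NX8800GTX": ("GTX", 629, 1053),
    "OCZ GeForce 8800 GTX": ("GTX", 622, 1047),
    "PNY XLR8 GeForce 8800 GTS": ("GTS", 653, 1059),
    "XFX GeForce 8800 GTX": ("GTX", 659, 1060),
}

for card, (model, core, mem) in results.items():
    s_core, s_mem = STOCK[model]
    print(f"{card}: effective memory {2 * mem} MHz, "
          f"core +{100 * (core - s_core) / s_core:.1f}%, "
          f"memory +{100 * (mem - s_mem) / s_mem:.1f}%")
```

Run this way, the Foxconn GTS shows a roughly 27% core and 32% memory gain over reference, which is the "greater clock speed jump, percentage-wise" discussed below.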
The XFX GeForce 8800 GTX managed a higher core clock speed than any other card, leading by 9MHz, and it also shared the memory clock crown with Foxconn’s FV-N88SMBD2-ONOC, albeit by only 1MHz over PNY’s GTS. Note that with only a couple of exceptions, all the cards hit about the same core and memory clock speeds, regardless of their GTS or GTX designation.
As is always the case, overclocking success is never guaranteed and is often just the luck of the draw. Because Nvidia takes care of the manufacturing for all of these cards, there probably isn’t much the add-in board partners can do to ensure greater overclocking success.
Pushing these cards to their limits changes how they stack up a little, with the factory “overclocked” cards no longer sitting in the lead. These results suggest the overclocking potential of stock-clocked cards isn’t being eroded by cherry-picking for faster models.
Noise levels were measured using an Extech 407727 Digital Sound Level meter placed along the edge of the motherboard 1″ from the graphics card and out of the direct path of airflow. We recorded noise levels after 10 minutes idling at the Windows desktop, and again after 10 minutes rendering this stunning scene from Oblivion at 1920×1440 with 4X antialiasing and 16X anisotropic filtering, and all the in-game eye candy cranked. Cards were tested at both their default and overclocked speeds.
At idle, only a decibel separates the quietest card from the loudest. As one might expect, the overclocked cards tend to run a little louder, but the GeForce 8800 series cooler is so quiet at idle that it’s hard to tell.
Things spread out a little under load, and this time it’s the GTS cards that prove to be the quietest. What’s particularly interesting here is that even the overclocked GTS cards are running quieter than the best of our GTX crowd. GTX noise levels are reasonably consistent under load, regardless of whether the cards are overclocked.
System power consumption was tested, sans monitor and speakers, at the wall outlet using a Watts Up power meter. We used the same idle and load conditions as our noise level tests.
It turns out that some GeForce 8800 series cards draw a little less power than others. The GTS models are predictably the most frugal when it comes to power consumption, but overclocking also makes a difference.
We tracked GPU temperatures using Nvidia’s nTune system utility, which can log temperatures to a text file. Again, we used the same idle and load conditions as our noise level tests.
Perhaps the most striking thing about these results is the fact that none of the cards saw much of a change in GPU temperature from idle to load. That’s curious to say the least, and it’s not like nTune can’t track changes in temperature—the app had no problem logging higher GPU temperatures when the cards were overclocked.
Since these cards are using the same graphics chips and reference coolers, we can’t draw too many conclusions beyond the fact that G80 operating temperature seems to vary from chip to chip. EVGA’s ACS³ cooling shroud doesn’t appear to have a significant impact on GPU temperatures, either.
Update — Several readers have written in to tell us that they’re seeing much higher load temperatures with GeForce 8800 series graphics cards monitored by third party apps like RivaTuner than we saw with Nvidia’s nTune system utility. It appears that nTune’s GPU temperature tracking isn’t working properly with the GeForce 8800 series, and we’ve contacted Nvidia regarding the issue.
If you’re struggling to decide between a GeForce 8800 GTS and a GTX, I suggest reading through our more detailed coverage of how the models compare. In terms of performance per dollar, both offer exceptional value; it’s just a question of how much you want to spend, and what resolution you want to run.
Then there’s the matter of selecting a board vendor. You’re essentially getting the same reference hardware regardless of which you choose. That kills diversity in the market a little, but it also implies consistent production quality. Picking favorites then becomes a matter of comparing warranties, pricing, extras, and clock speeds. I’ve selected a couple of stand-outs from the cards we’ve looked at today.
On the GTS front, I was first torn between the BFG and Foxconn cards. The Foxconn’s factory “overclocking” and bundled game controller weigh heavily in its favor, and it is $10 cheaper than the BFG. However, Foxconn’s warranty spoils the deal—a measly two years of coverage in a market filled with lifetime warranties is a joke. So surely the BFG, with its lifetime warranty, excellent tech support, and freebie t-shirt and mouse feet, would win me over. But not quite. You see, the BFG card is also the most expensive GeForce 8800 GTS of the bunch, but it runs at stock speeds, and our card’s memory didn’t overclock nearly as well as that of the others.
As it turns out, I was looking in the wrong direction all along. PNY’s XLR8 GeForce 8800 GTS may lack the flash of bundled extras, higher stock frequencies, or a lifetime warranty, but for just $385, it’s a heck of a deal. The money you save can purchase extras you actually want. As our overclocking results show, the fact that PNY doesn’t offer faster clock speeds out of the box doesn’t mean you can’t get there on your own. Couple that with a three-year warranty that goes up to five years with registration, and you have our first Editor’s Choice.
PNY XLR8 GeForce 8800 GTS
Finding the pick of the GeForce 8800 GTX litter was considerably easier. MSI gets knocked out of the running early for providing only a three-year parts, two-year labor warranty—Serious Sam II doesn’t even come close to making up for that, although the NX8800GTX is the cheapest GTX in the round-up by $10. At over $620 online, OCZ’s GeForce 8800 GTX is simply too expensive for a reference card running at stock speeds with few intriguing extras. We’re happy to see OCZ back in the graphics game, but we think they can do better, especially for a card that commands a higher price premium than the EVGA ACS³.
The EVGA card does look tempting, if not for the exotic heatsink shroud, then for the performance advantage that the card’s higher clock speeds provide. We really like the idea behind the Step-Up program, too, but EVGA’s warranty policy leaves a sour aftertaste. Lifetime warranties are great, but cutting coverage to a single year if users don’t register within 30 days of purchase is unnecessarily restrictive for a card that costs north of $600. That leaves us with XFX’s GeForce 8800 GTX, which just happens to be one of the more affordable GTX options on the market. You don’t get fancy bundled extras or tweaked clock speeds, but you save a bit of cash. With some luck, you should be able to hit the higher clock speeds of the EVGA card on your own. Throw in a warranty that enhances the resale value of the card by extending coverage to its second owner, and you have our second Editor’s Choice.
EVGA 8800GTX: Reference Card Overclocking
We have already had the opportunity to study the impressive architecture of NVIDIA’s latest chip, G80, and get acquainted with the first tests of video cards based on it.
Today we will continue our acquaintance with products based on this chip, using the GeForce 8800GTX video card provided by EVGA as an example. As you will see below, it is quite a typical 8800GTX sample, which makes it all the more interesting to find out what overclocking potential we can expect from reference 8800GTX cards. We will also evaluate the efficiency of the stock cooler and the thermal behavior of the video chip under various conditions. And we will start, as is our tradition, by studying the appearance of the EVGA 8800GTX video card and its bundle.
The packaging is done in EVGA’s typically austere style. The most important detail is the minimum power supply requirement printed on the right side of the box: 450 W, with at least 30 A on the 12 V line. Another interesting point is that you can win back the video card you bought in a lottery held by EVGA. How can that be if you have already bought the card? It is not that they will send you a second one to run in SLI mode. If you register on the EVGA website after purchasing the card, you are entered into a monthly raffle among buyers. The main prize is the price of the video card you bought; in other words, the winner becomes the owner of an EVGA video card for free.
On the back of the box, you can find a description of the features of the video card, among which a large part is occupied by the innovations that appeared in the G80 video chip. It also provides information about the product warranty. The warranty period has been drastically increased and is now 10 years. Also on the back there is a description of the scope of delivery:
- HDTV/S-Video-out adapter
- DVI/D-SUB adapter — 2 pcs.
- power adapter — 2 pcs.
- S-Video cable
- driver disc
- user manual
The cutout on the back of the box is there so that you can compare the markings on the packaging with the markings on the video card itself. It is unlikely that buyers are routinely being slipped the wrong product, but the concern for the consumer is definitely felt.
The video card itself looks quite ordinary, like many other 8800GTX cards. The sticker on the cooler says only that the card belongs to EVGA. As you can see in the photo, both auxiliary power connectors are 6-pin, so there will be no need to hunt for an 8-pin power adapter, as photos of the first 8800GTX samples might have suggested. Let us remind you once again that the EVGA 8800GTX is a mass-production retail product.
The reverse side of the PCB contains many small elements, which is not surprising, given the increased complexity and size of the video chip, increased memory bus width and a powerful power supply system. If you are confused by the «Made in China» sticker, don’t worry, it’s not a fake, it’s a consequence of globalization.
So we get to the cooler. As you can see, the base of the cooler is made from a single piece of aluminum, with a copper core embedded in the middle to cool the GPU. From the front we see only a corner of this core, but two heat pipes leading to the heatsink fins are clearly visible. One pipe runs in the plane of the heatsink base; the second puts the upper part of the cooling fins to work. Overall the design looks elegant, and even less bulky than the stock cooler on the 7900GTX with its four heat pipes. Nonetheless, this cooler is enough to tame the hot temper of the new video processor, the memory, and the power circuitry.
We deliberately did not peel off the thermal pads on the back of the heatsink, to make it clear that the protrusions on its base are not there by accident: they provide forced cooling for the board components. The leftmost thermal pad marks the location of the NVIO chip, which handles video output to display devices. It exists to offload the already complex video processor chip.
With the cooler removed, the 8800GTX video card looks no less impressive than assembled. The huge size of the video chip, with all the desire, is hard not to notice. Almost half of the PCB is occupied by the video card’s power subsystem — from the right edge of the board to the video memory chips.
The video chip itself is now fitted with a protective cover, which is not surprising given its size and cost: nobody wants to risk a careless cooler installation. The chip is marked G80-300-A2, which indicates the second revision. Interestingly, on the 8800GTS the chip marking read G80-100-KO-A2. Does this mean that a version of the chip numbered 200 may appear, occupying an intermediate position between the 8800GTS and 8800GTX? One can only guess. The nominal GPU frequency on the EVGA 8800GTX is, as it should be, 575 MHz; the GPU frequencies for 2D and 3D modes are the same.
The EVGA 8800GTX, like the other reference 8800GTX cards, is equipped with GDDR3 video memory. The chips are made by Samsung, with an access time of 1.1 ns, which corresponds to a nominal frequency of 1800 MHz DDR; at this frequency the memory works without issue. The total amount of video memory is 768 MB. The «non-roundness» of that figure (even from the point of view of the binary number system) is due to the atypical width of the GPU’s memory bus, which is 384 bits.
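The 768 MB figure follows directly from the bus layout. A minimal sketch, assuming twelve 512-Mbit GDDR3 chips on 32-bit channels (a chip count and density consistent with the 8800GTX reference design):

```python
# Sketch: why a 384-bit bus yields 768 MB of memory.
# Each GDDR3 chip has a 32-bit interface, so a 384-bit bus needs 12 chips;
# at 512 Mbit (64 MB) per chip, that gives 768 MB total.
# Chip count and density are assumptions, not stated in the text above.
bus_width_bits = 384
chip_interface_bits = 32
chip_capacity_mbit = 512

chips = bus_width_bits // chip_interface_bits      # 12 chips
total_mb = chips * chip_capacity_mbit // 8         # 768 MB
print(chips, total_mb)                             # 12 768
```

The same arithmetic explains the GTS: ten chips on a 320-bit bus give 640 MB, or 320 MB with 256-Mbit chips.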
This is where we will finish our acquaintance with the external features of the video card, and proceed to the study of thermal conditions, its overclocking potential and performance.
We increased the frequency of the graphics chip and video memory using the RivaTuner 2.0 utility. The stable frequencies that we were able to achieve on a sample of the video card that was in the laboratory are 650 MHz for the GPU and 2000 MHz DDR for the video memory. Interestingly, even when the cooler speed was increased to the maximum (with the help of the same RivaTuner), it turned out to be impossible for the GPU to overcome the bar of 675 MHz. The built-in test passed, and the applications worked for a while, but the system crashed quite quickly, although the GPU temperature remained quite normal. What could be the reason? Let’s not forget that the video chip is quite large in size, the internal units operate at different frequencies, and the temperature measured by the sensor in one place of the chip does not have to be the same for the entire chip. As for the temperature regimes, the screenshot below will tell you more clearly about this.
The marks shown on the graph correspond to the following modes (from left to right). The leftmost line is «idle»: the Windows desktop with no 3D applications running. The GPU temperature is 63 degrees, with the cooler speed set automatically. The second line from the left is the «warming up» of the video card by looping 3DMark’s 3D tests; the maximum GPU temperature rose to 81 degrees, while the cooler speed remained practically unchanged. The third line is the temperature under load while overclocked to 650/2000 MHz. The temperature rose only to 85 degrees, because the automatic fan control slightly increased the cooler speed, to 1850 rpm. The line on the far right is the temperature under load at the same 650/2000 MHz, but with the cooler speed manually set to maximum. As you can see, the efficiency of the stock cooler is beyond praise: the GPU temperature no longer exceeds 73 degrees.
The noise produced by the stock cooler deserves a special mention. In regular modes it is almost inaudible. Increasing the cooler rotation speed to the maximum also does not hurt the ear, although the noise becomes more noticeable. When the cover of the case of the system unit is closed, it is quite difficult to distinguish the noise of the video card cooler among the noise of other components, so we can safely recommend overclockers to increase the speed of the video card cooler during overclocking to the maximum, the comfort of the ears will not suffer, and the soul will not hurt for overheating of the video card. Well, now let’s move on to testing.
Testing was done with ForceWare 97.44 and Catalyst 6.12 drivers. It happens that once you finish the tests and prepare the material, both companies, ATI and NVIDIA, release new versions of drivers. 🙂
However, this is unlikely to drastically change the balance of power in our testing. We will compare the results shown by the EVGA 8800GTX video card in normal mode and overclocked. Also given are the results of Radeon X1950XTX, which is still carrying a heavy burden of a top AMD/ATI product in the absence of the R600. The results presented here are somewhat different from what we got during testing at the time of the 8800GTX announcement. And the point is not even in different versions of drivers, but in the fact that slightly different graphics quality settings were used in games, not to mention a different set of games themselves. And, of course, overclocking the 8800GTX to evaluate the frequency performance scalability.
There isn’t much to comment on here. Everything is obvious.
It seems that in the NFS Most Wanted tests the 8800GTX’s results ran into the performance of the central processor. This game’s engine is very CPU-dependent, although for now we will leave that claim without proof. The assumption that the fixed 75 Hz screen refresh rate played a role was not confirmed during testing.
In the more recent version of NFS, which is Carbon, there were no surprises and the results decrease almost linearly with increasing resolution.
The game Company of Heroes used the highest possible graphics quality settings of the engine, including the level of detail Ultra.
In Oblivion, the results at 1024×768 and 1280×1024 differ little; whether the CPU is the cause will still have to be clarified. Interestingly, overclocking the video card yields even better gains at 1280×1024 than at 1024×768, which is quite understandable: we used the FRAPS utility, so some measurement error is present in any case, and when performance is limited by the CPU this is a typical result.
The benchmarks of Battlefield 2 illustrate CPU limitation and nothing else.
In the rest of the shooters, we see a quite natural increase in the results of the overclocked 8800GTX, which ranges from 10% to 15%.
The EVGA 8800GTX video card showed good overclocking potential. Of course, in absolute terms the 13% increase in GPU frequency and 11% in memory frequency looks quite modest, but for a top product the result is quite decent and, paraphrasing the well-known advertising slogan, we can say: «even 10% is not superfluous.» Incidentally, EVGA has already released overclocked versions of the 8800GTX, the fastest of which runs at 626/2000 MHz for GPU and memory. But as we saw today, even a reference card at standard frequencies can achieve the same, or even slightly better, results.
GeForce 8800 GTS 320 MB: 7 overclocked models
The GeForce 8800 GTS graphics accelerator with 320 MB of video memory is probably the most successful NVIDIA product in the eighth generation of GeForce, combining excellent performance with a reasonable price among high-end solutions. The halved amount of memory has little effect on performance in most games; problems arise only at maximum texture settings combined with anti-aliasing, to say nothing of HDR. However, in the latest games that mode can bring to its knees not only the 640 MB version but also the «big brother», the GeForce 8800 GTX. Recall that the most powerful cards of the GeForce 8800 series, the GTX and Ultra, have not only more memory (768 MB) and a correspondingly wider bus (384 bits instead of 320), but also more stream processors (128 instead of 96).
In this test, seven GeForce 8800 GTS models are considered, six of which are equipped with 320 MB of memory, and one more with 640 MB for comparison. We will evaluate how important the frame buffer size is for the most popular games and test applications, and, of course, we will check the overclocking potential of video cards.
- ASUS EN8800GTS/HTDP/320M
- MSI NX8800GTS-T2D320E-HD-OC
- MSI NX8800GTS-T2D640E-HD-OC
- Point of View GeForce 8800 GTS 320 MB
- XFX GeForce 8800 GTS 320 MB
- EVGA GeForce 8800 GTS 320 MB KO ACS3 Edition
Product provided by Euro Plus
This video card was chosen for comparison with its «little brothers»; it differs only in having 640 MB of memory on board. As with the previous model, the initial frequencies are the standard 575/1700 MHz; during overclocking we managed to raise them to 684 MHz for the core and 2160 MHz for the memory. As you can see, the larger-capacity memory chips overclocked noticeably better than the smaller ones, and the frequency increase had no significant effect on heat output.
XFX’s product line includes GeForce 8800 GTS 320 MB models with both standard characteristics and higher frequencies; the version with reference frequencies and cooler took part in our testing. Nevertheless, the overclocking potential of this specimen pleased us: we reached 684 MHz for the core and 2160 MHz for the memory, one of the best results. Perhaps XFX selects the best specimens from the shipments of NVIDIA’s contract manufacturers for its cards. This should please budget overclockers, because the factory-overclocked XXX-series products sell at significantly higher prices, and our testing has shown that the «regular» GeForce 8800 GTS 320 MB from XFX can overclock just as well as the elite versions, given a lucky sample like ours. 92/2200 MHz, while the core temperature under load, according to RivaTuner, increased by only 2 degrees: from 54 to 56 °C. An excellent result indeed. As crafty overclockers would say: «just add a voltmod.»
How we tested
Our test program includes the most popular 3DMark05 and 3DMark06 packages, as well as several modern games: Call of Juarez, Prey, Quake 4, launched using the SmartFPS.com 1.5 utility. To create extreme conditions for video cards and eliminate possible restrictions from the CPU (in this case, Intel Core 2 Duo E6700), we additionally tested games at high resolutions (monitor — Samsung SyncMaster 275T) with full-screen anti-aliasing and anisotropic filtering enabled through the driver settings textures (Full Screen Antialiasing = 4x, Anisotropic Filtering = 16x). The video cards were overclocked using the Rivatuner 2.02 utility, while the room temperature was maintained at 21 ºС.
The 3DMark05 and 3DMark06 packages respond very well to overclocking of GeForce 8 series video cards. The test results are quite indicative: once accelerated to its maximum frequencies, the Sparkle Caliber GeForce 8800 GTS 320 MB overtakes all rivals.
Call of Juarez belongs to the latest generation of gaming applications focused on today’s unified graphics card architectures. Testing showed that without anti-aliasing and filtering the GeForce 8800 GTS 320 MB provides good graphics processing speed in all resolutions up to 1600×1200, but additional quality settings make the game uncomfortable. With active FSAAx4 and AFx16, an acceptable frame rate remains only at 1024×768, while again Sparkle Caliber is at the top of the podium, and the XFX GeForce 8800 GTS breathes down its neck.
The «heaviness» of the DOOM 3 engine, on which Quake 4 is based, comes from its reliance on CPU calculations: in Pure Speed mode, performance at 1024×768 and 1280×1024 is almost identical, and only at 1600×1200 do the video cards begin to unlock their potential. Note that we set all quality settings to Maximum rather than Ultra, as the latter requires a very large framebuffer; the 320 MB of memory on the tested cards cannot accommodate the ultra-high-resolution textures and shadow volumes used in Ultra mode.
The modified DOOM 3 engine used in the game Prey is no longer CPU intensive, and we can observe excellent performance of all video cards in the vast majority of modes. In addition, the game responds very well to overclocking, and even slightly prefers the XFX GeForce 8800 GTS over Sparkle.
In general, I would like to note once again that GeForce 8800 GTS video cards with 320 MB of memory on board are an excellent choice for a mid-range gaming computer. They provide excellent performance in all the most popular games, while maintaining good speed in the latest projects. In addition, these adapters support DirectX 10, which means that having spent about $300, now the user will not regret the purchase even after six months with the release of new games.
Anatomy of the GeForce 8800 GTS 320 MB
Like the GeForce 8800 GTX, this video card has a four-phase digital power converter, visible in the picture on the right (three rows of elements in parallel and one at the top). The card requires an additional 6-pin power connector; without it, the piezoelectric speaker located on the board will start beeping loudly when the PC starts up.
A large G80 core stands out on the PCB, surrounded by a mounting frame that protects the GPU from damage if the cooler is installed askew, although this matters less given the heat-spreading cover on the chip. The G80-100-K0 marking on the core testifies that this is the GeForce 8800 GTS. The «A3» symbols denote the fourth revision of the core (the first, engineering revision is marked «A0»), and it is this revision that is installed in the GeForce 8800 GTS 320 MB. In the standard configuration the core operates at 512 MHz. When overclocking, the GPU frequency changes in steps of 3, 5, or 9 MHz, while the stream processors move in 54 MHz increments. Therefore, when overclocking the board, you should always watch the readouts in specialized utilities (RivaTuner, for example), which show the real operating frequencies of the main and shader blocks of the core.
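The stepping described above means the clock you request is quantized to a grid before the hardware applies it. A minimal illustrative model (the round-down behaviour is an assumption; the real driver's rounding rules are not documented in the text):

```python
# Sketch: snapping a requested clock to the driver's frequency grid.
# On the 8800 series the shader domain moves in 54 MHz steps, so the clock
# you set with a slider is not always the clock you get. The round-down
# behaviour here is an assumption for illustration, which is exactly why
# the readouts in RivaTuner matter more than the slider position.
def snap(requested_mhz: int, step_mhz: int) -> int:
    """Round a requested clock down to the nearest multiple of step_mhz."""
    return step_mhz * (requested_mhz // step_mhz)

print(snap(1250, 54))   # a 1250 MHz shader request lands on 1242 MHz
```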
Memory chips are installed around the core. As you can see in the photo, the two rightmost contact pads for memory are empty. This is because the GeForce 8800 GTS PCB is a slightly simplified GeForce 8800 GTX board: the memory layout remained the same, hence the extra pads. The memory bus width of the 640 and 320 MB versions is the same, 320 bits; the GeForce 8800 GTS 320 MB simply uses chips of half the capacity (256 Mbit instead of 512 Mbit). Most often these boards are equipped with Samsung BC-12 GDDR3 memory, chips with a nominal access time of 1.2 ns and a rated frequency of 800 MHz (1600 MHz DDR), at which it usually works. There are also several GeForce 8800 GTS 320 MB models on the market with Hynix FP11 memory, rated for 1800 MHz (1.1 ns).
To the left of the memory chips sits another chip, NVIO, the RAMDAC controller: a digital-to-analog converter responsible for turning digital data from the graphics processor into the signal that goes to the monitor. It also handles signal output to TV-out, including HD video. Usually these elements are integrated directly into the core die, but in the case of the G80, NVIDIA’s engineers decided to move them to a separate chip to get rid of interference and noise.
GeForce 8800 GTS 320 MB cards are equipped with two digital DVI-I connectors for connecting monitors and a TV-out connector for other devices (TVs, plasma panels, and so on). The G80 core itself does not support HDCP signal encoding; however, the vast majority of video card manufacturers additionally install a memory chip with embedded HDCP keys, making their cards fully HD-compatible.
Absolutely all GeForce 8800 graphics adapters, whether GTS, GTX, or Ultra, are manufactured at NVIDIA’s contract factories, so products from different companies can differ only in their cooling systems and clock speeds; changing the printed circuit board design is not permitted.
NVIDIA GeForce 8800 GTX Review: Benchmarks and Specs
The NVIDIA GeForce 8800 GTX graphics card (GPU) sits at position 698 in our performance ranking. Manufacturer: NVIDIA. The GeForce 8800 GTX runs at a base clock speed of 576 MHz. It carries 768 MB of RAM clocked at 900 MHz, with a bandwidth of 86.4 GB/s.
Power consumption of NVIDIA GeForce 8800 GTX is 155 Watt, and the process technology is only 90 nm. Below you will find key compatibility, sizing, technology, and gaming performance test results. You can also leave comments if you have any questions.
Let’s take a closer look at the most important characteristics of the NVIDIA GeForce 8800 GTX. To have an idea of which video card is better, we recommend using the comparison service.
The base set of information will help you find out the release date of the NVIDIA GeForce 8800 GTX graphics card and its purpose (laptops or PCs), as well as the price at the time of release and the average current cost. This data also includes the architecture used by the manufacturer and the video processor code name.
- Performance rating position: 811
- Architecture: Tesla
- Code name: G80
- Type: Desktop
- Release date: November 8, 2006
- Starting price: $599
- Current price: $78 (0.1× MSRP)
- Value for money: 0.79
This is important information that determines all the performance characteristics of the NVIDIA GeForce 8800 GTX graphics card. The smaller the technological process of manufacturing a chip, the better (in modern realities). The clock frequency of the core is responsible for its speed (direct correlation), while signal processing is carried out by transistors (the more transistors, the faster the calculations are performed, for example, in cryptocurrency mining).
- Pipelines / CUDA cores: 128
- Core clock: 576 MHz
- Transistors: 681 million
- Process: 90 nm
- Power consumption (TDP): 155 Watt
- Texture fill rate: 36.8 billion texels/sec
- Floating-point performance: 345.6 GFLOPS
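The headline throughput figures can be reconstructed from the basic specs. A short sketch, assuming a 1350 MHz shader clock and 64 texture filtering units (figures from NVIDIA's published 8800 GTX reference specifications, not stated in the list above):

```python
# Sketch: deriving the 8800 GTX's throughput figures from its specs.
# The 1350 MHz shader clock and 64 texture units are assumptions taken
# from NVIDIA's reference specifications for this card.
stream_processors = 128
shader_clock_ghz = 1.35
flops_per_sp_per_clock = 2          # one MADD (multiply + add) per clock

gflops = stream_processors * shader_clock_ghz * flops_per_sp_per_clock
print(round(gflops, 1))             # 345.6 GFLOPS

texture_units = 64
core_clock_ghz = 0.575
gtexels = texture_units * core_clock_ghz
print(round(gtexels, 1))            # 36.8 billion texels/sec
```

Both derived numbers match the figures quoted in the spec list, which is a useful sanity check on database entries like this one.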
Dimensions, Connectors and Compatibility
PC cases and laptops come in many sizes, so it is important to know the length of your graphics card and how it connects (except for laptop versions). This makes upgrading easier, since not all cases can accommodate modern video cards.
Interface: PCIe 1.0 x16
Length: 270 mm
Supplementary power: 2x 6-pin
SLI support: yes
Memory (frequency and overclocking)
Onboard memory stores the data used in calculations. Modern games and professional graphics applications place high demands on both the amount and the speed of memory: the higher these parameters, the more powerful the video card. Below are the memory type, size, and bandwidth of the NVIDIA GeForce 8800 GTX.
Memory type: GDDR3
Maximum memory size: 768 MB
Memory bus width: 384-bit
Memory clock: 900 MHz
Memory bandwidth: 86.4 GB/s
Support for ports and displays
As a rule, modern video cards offer several types of display connections and additional ports, for example HDMI and DVI. Knowing which ports are available is important to avoid problems connecting the card to a monitor or other peripherals.
Display connections: 2x DVI, 1x S-Video
All APIs supported by the NVIDIA GeForce 8800 GTX are listed below. This is a minor factor that does not greatly affect overall performance.
DirectX: 10 (feature level 10_0)
OpenGL: 3.3
Overall gaming performance
All tests are based on FPS. The figures below follow the game developers' recommended system requirements and may differ from real-world situations.
NVIDIA GeForce 8800 GTX in benchmark results
Benchmarks help determine the NVIDIA GeForce 8800 GTX's performance in standardized tests. We have compiled results from the best-known benchmarks so you can get an accurate picture from each of them. Pre-testing a graphics card matters most under heavy load, where you can see how the GPU copes with computation and data processing.
Overall performance in benchmarks
Passmark is an excellent benchmark that is updated regularly and shows relevant graphics card performance information.
In the year since the release of video cards based on NVIDIA GeForce 8800 chips, the graphics accelerator market has been extremely unfavorable for the end customer. An enthusiast prepared to pay a tidy sum for a top-end video card simply had no alternative. ATI's (AMD's) competitor arrived later and ultimately could not compete with the GeForce 8800 GTX, let alone the later GeForce 8800 Ultra. NVIDIA's marketers understood that, in the absence of competition, there was no need to cut the prices of its top video cards. As a result, the GeForce 8800 GTX and Ultra stayed at the same very high price level throughout this period, and only a few could afford them.
However, the top price segment has never been the defining priority for makers of graphics chips and video cards. Leadership in this class is certainly prestigious for any company, but economically the mid-range is the most profitable. Nevertheless, as recent tests of the AMD Radeon HD 3850 and 3870, which claim to dominate the middle class, have shown, their performance is unsatisfactory in modern games and essentially unacceptable in quality modes. The NVIDIA GeForce 8800 GT is faster than that pair, but it also falls short of comfortable frame rates in DirectX 10 games. So what comes next, for those willing to pay extra? Until yesterday, effectively nothing — there was a literal chasm in price between the GT and the GTX.
But technical progress does not stand still: the new NVIDIA G92 chip, manufactured on a 65 nm process, allowed the company not only to attract overclockers with the quite successful GeForce 8800 GT, but also, yesterday (December 11 at 17:00 Moscow time), to announce a new card — the GeForce 8800 GTS 512 MB.
Despite its unassuming name, the new graphics accelerator differs significantly from the regular GeForce 8800 GTS. In today's article we will look at one of the first GeForce 8800 GTS 512 MB cards to reach the Russian market, check its thermals and overclocking potential, and, of course, study the newcomer's performance.
The technical characteristics of the newcomer are presented in the following table, in comparison with the other NVIDIA GeForce 8800 family cards (cells lost in the source are marked "—"):

| | 8800 GT | 8800 GTS | 8800 GTS 512 | 8800 GTX / Ultra |
|---|---|---|---|---|
| GPU | G92 (TSMC) | G80 (TSMC) | G92 (TSMC) | G80 (TSMC) |
| Process, nm | 65 (low-k) | 90 (low-k) | 65 (low-k) | 90 (low-k) |
| Die area, mm² | 330 | 484 | 330 | 484 |
| Transistors, mln | 754 | 681 | 754 | 681 |
| GPU clock, MHz | 600 | 513 | 650 | 575 / 612 |
| Shader clock, MHz | 1500 | 1188 | 1625 | 1350 / 1500 |
| Effective memory clock, MHz | 1800 | 1584 | 1940 | 1800 / 2160 |
| Memory size, MB | 256 / 512 | 320 / 640 | 512 | 768 |
| Memory type | GDDR3 | GDDR3 | GDDR3 | GDDR3 |
| Memory bus width, bit | 256 (4 x 64) | 320 (5 x 64) | 256 (4 x 64) | 384 (6 x 64) |
| Interface | PCI-Express x16 (v2.0) | PCI-Express x16 | PCI-Express x16 (v2.0) | PCI-Express x16 |
| Unified shader processors | 112 | 96 | 128 | 128 |
| Texture units | 56 (28) | 24 | 64 (32) | 32 |
| ROPs | 16 | 20 | 16 | 24 |
| Pixel / vertex shader version | 4.0 / 4.0 | 4.0 / 4.0 | 4.0 / 4.0 | 4.0 / 4.0 |
| Memory bandwidth, GB/s | ~57.6 | ~61.9 | ~62.1 | ~86.4 / ~103.7 |
| Pixel fill rate, Gpix/s | ~9.6 | ~10.3 | ~10.4 | ~13.8 / ~14.7 |
| Texture fill rate, Gtex/s | ~33.6 | ~24.0 | ~41.6 | ~36.8 / ~39.2 |
| Peak 3D power draw, W | ~106 | — | — | ~180 |
| Recommended PSU, W | ~400 | ~400 | ~400 | ~450 / ~550 |
| Reference dimensions (L x H x D), mm | 220 x 100 x 15 | 228 x 100 x 39 | 220 x 100 x 32 | 270 x 100 x 38 |
| Outputs | 2 x DVI-I | 2 x DVI-I | 2 x DVI-I | 2 x DVI-I |
| SLI support | yes | yes | yes | yes |
| Recommended price, USD | 199 / 249 | 349–399 | 299–349 | 499–599 / 699 |
The latest video card from a company well known to overclockers comes in a very compact box, decorated in dark colors.
Update: we decided to supplement the initial review with additional theoretical information, comparison tables, and test results from the American THG laboratory, where the original GeForce 8800 GTS also participated. In the updated article you will also find image quality tests.
GeForce 8800 GTX is head and shoulders above the competition.
You’ve probably heard of DirectX 10 and the wonders the new API promises over DX9. On the Internet, you can find screenshots of games that are still in development. But until now, there were no video cards with DX10 support on the market. And nVidia was the first to fix this shortcoming. Let’s welcome the release of DirectX 10 graphics cards in the form of nVidia GeForce 8800 GTX and 8800 GTS!
A unified architecture squeezes more out of the shader units, since they can now be scheduled more efficiently than in a fixed-function layout. The GeForce 8800 GTX opens a new era in computer graphics with 128 unified shader processors, while the GeForce 8800 GTS carries 96 such units. The days of dedicated pixel pipelines are finally over. But let's take a closer look at the new cards.
The die photo shows the G80 graphics core. The new GPU promises to deliver twice the performance of the GeForce 7900 GTX (G71). Its 681 million transistors add up to a huge die area, but when we asked about it, nVidia CEO Jen-Hsun Huang replied: «If my engineers said they could double the performance by doubling the die area, I wouldn't hesitate for a second!»
Experience has shown that doubling the area does not double the performance, but NVIDIA seems to have found the right balance between technological advances and silicon implementation.
GeForce 8800 GTX and 8800 GTS fully comply with the DX10 and Shader Model 4.0 standard, various data storage and transmission standards, support geometry shaders and stream output (Stream Out). How did nVidia implement all this?
Let’s start with the fact that nVidia has moved away from the fixed design that the industry has been using for the last 20 years in favor of a unified shader core.
Earlier, we showed similar slides illustrating the trend of increasing the power of pixel shaders. Nvidia is well aware of this trend and is moving towards balancing computing needs by implementing unified shaders through which data flows. This gives maximum efficiency and productivity.
nVidia states: «The GeForce 8800 development team was well aware that high-end DirectX 10 3D games would require a lot of hardware power to compute shaders. Although DirectX 10 specifies a unified instruction set, the standard does not require a unified GPU shader design. But the GeForce 8800 engineers believed that it was the unified shader architecture of the GPU that would effectively distribute the load of DirectX 10 shader programs, improving the architectural efficiency of the GPU and properly distributing the available power.»
GeForce 8800 GTX | 128 SIMD stream processors
The processor core operates at 575 MHz in the GeForce 8800 GTX and at 500 MHz in the GeForce 8800 GTS. While the rest of the core runs at 575 MHz (or 500 MHz), the shader core uses its own clock generator: 1350 MHz in the GeForce 8800 GTX and 1200 MHz in the 8800 GTS.
Each shader element of the core is called a streaming processor. The GeForce 8800 GTX uses 16 blocks of eight such elements, giving 128 stream processors in total. As with the pixel shader blocks of the ATi R580 and R580+ designs, nVidia plans to both add and remove blocks in future products — the GeForce 8800 GTS, with its 96 stream processors, is exactly that.
GeForce 8800 GTX | specification comparison table
nVidia hasn’t been able to do full-screen anti-aliasing with HDR lighting at the same time before, but that’s history. Each raster operations unit (ROP) supports framebuffer blending. Thus, with multisampling antialiasing, both FP16 and FP32 render targets can be used. Under D3D10 color and Z acceleration, up to eight multiple render targets can be used in ROPs, as well as new compression technologies.
The GeForce 8800 GTX can sample 64 textures per clock, which at 575 MHz gives 36.8 billion texels per second (GeForce 8800 GTS: 32 billion/s). The GeForce 8800 GTX has 24 raster operation units (ROPs), and at 575 MHz its peak pixel fill rate is 13.8 gigapixels/s. The GeForce 8800 GTS version has 20 ROPs and a peak fill rate of 10 gigapixels/s at 500 MHz.
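Both peak rates quoted above are simply the functional-unit count multiplied by the core clock. A short illustrative Python check, using the unit counts and clocks from the text (the function names are mine, not from any real API):

```python
# Peak fill rate = number of units x core clock (illustrative calculation).

def gigatexels(tmus: int, core_mhz: int) -> float:
    """Peak texture sampling rate in billions of texels per second."""
    return tmus * core_mhz / 1000

def gigapixels(rops: int, core_mhz: int) -> float:
    """Peak pixel fill rate in gigapixels per second."""
    return rops * core_mhz / 1000

# GeForce 8800 GTX: 64 texture samples/clock, 24 ROPs, 575 MHz
print(gigatexels(64, 575), gigapixels(24, 575))  # 36.8 13.8
# GeForce 8800 GTS: 20 ROPs at 500 MHz (texel figure as computed in the text)
print(gigatexels(64, 500), gigapixels(20, 500))  # 32.0 10.0
```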
nVidia GeForce specifications

| | 8800 GTX | 8800 GTS | 7950 GX2 | 7900 GTX | 7800 GTX 512 | 7800 GTX |
|---|---|---|---|---|---|---|
| Process (nm) | 90 | 90 | 90 | 90 | 110 | 110 |
| Core | G80 | G80 | G71 | G71 | G70 | G70 |
| Number of GPUs | 1 | 1 | 2 | 1 | 1 | 1 |
| Transistors per core (millions) | 681 | 681 | 278 | 278 | 302 | 302 |
| Vertex/shader clock (MHz) | 1350 | 1200 | 500 | 700 | 550 | 470 |
| Core clock (MHz) | 575 | 500 | 500 | 650 | 550 | 430 |
| Memory clock (MHz) | 900 | 600 | 600 | 800 | 850 | 600 |
| Effective memory clock (MHz) | 1800 | 1200 | 1200 | 1600 | 1700 | 1200 |
| Vertex units | 128 | 96 | 16 | 8 | 8 | 8 |
| Pixel units | 128 | 96 | 48 | 24 | 24 | 24 |
| ROPs | 24 | 20 | 32 | 16 | 16 | 16 |
| Memory bus width (bits) | 384 | 320 | 256 | 256 | 256 | 256 |
| GPU memory (MB) | 768 | 640 | 512 | 512 | 512 | 256 |
| Memory bandwidth (GB/s) | 86.4 | 48 | 38.4 | 51.2 | 54.4 | 38.4 |
| Vertices/sec (millions) | 10,800 | 7,200 | 2,000 | 1,400 | 1,100 | 940 |
| Pixel fill rate (ROPs x clock, Gpix/s) | 13.8 | 10 | 16 | 10.4 | 8.8 | 6.88 |
| Texture fill rate (pipelines x clock, Gtex/s) | 36.8 | 32 | 24 | 15.6 | 13.2 | 10.32 |
| RAMDAC (MHz) | 400 | 400 | 400 | 400 | 400 | 400 |
| Bus | PCI Express | PCI Express | PCI Express | PCI Express | PCI Express | PCI Express |
Pay attention to the width of the memory bus. As the diagram on the previous page shows, the GeForce 8800 GTX GPU uses six memory partitions, each with its own 64-bit memory interface, for a total width of 384 bits. The 768 MB of GDDR3 memory is attached to a memory subsystem built on a high-speed crossbar switch, as in the GeForce 7x GPUs. This crossbar supports DDR1, DDR2, DDR3, GDDR3 and GDDR4 memory.
GeForce 8800 GTX uses GDDR3 memory at 900 MHz by default (GTS version runs at 800 MHz). With a width of 384 bits (48 bytes) and a frequency of 900 MHz (1800 MHz effective DDR frequency), the throughput is a whopping 86.4 GB/s. And 768 MB of memory allows you to store much more complex models and textures, with higher resolution and quality.
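The 86.4 GB/s figure is just bus width times effective memory clock. A quick illustrative Python function (the name and signature are mine, for demonstration only):

```python
# Memory bandwidth = (bus width in bytes) x effective memory clock.

def memory_bandwidth_gb_s(bus_bits: int, mem_clock_mhz: int, pumps: int = 2) -> float:
    """Peak bandwidth in GB/s; pumps=2 models the double data rate of GDDR3."""
    bytes_per_transfer = bus_bits / 8   # 384 bits -> 48 bytes per transfer
    return bytes_per_transfer * mem_clock_mhz * pumps / 1000

print(memory_bandwidth_gb_s(384, 900))  # GeForce 8800 GTX -> 86.4
print(memory_bandwidth_gb_s(320, 800))  # GeForce 8800 GTS -> 64.0
```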
GeForce 8800 GTX | Nvidia knocks out ATI
We have good news and bad news. The good: the cards are faster than anything else, very quiet, and packed with technology that no available software can even exploit yet. The bad: they are not on sale yet. Well, something is always wrong with new hardware. Sparkle sells such cards for 635 euros — we are starting to get used to such prices for top-end hardware.
The board is 27 centimeters long, so it won't fit in every case. If your computer's hard drives sit directly behind the PCIe slots, installing a GeForce 8800 GTX will most likely be difficult. The disks can always be moved to a 5-inch bay with an adapter, but the problem itself is hardly pleasant.
Don't laugh at the technical implementation — it's the best piece of hardware you could buy your PC for Christmas. Why has the GeForce 8800 GTX drawn so much attention from the Internet community? Simple: record performance. In Half-Life 2: Episode One at 1600x1200, the frame rate on the GeForce 8800 GTX is as much as 60 percent higher than on the top Radeon X1000 cards (X1900 XTX and X1950 XTX).
Oblivion runs incredibly smoothly at all levels; with HDR rendering enabled, the speed never drops below 30 fps. In Titan Quest you never see less than 55 fps — to the point where you wonder whether the benchmark has hung or something has happened to the levels. Enabling full-screen anti-aliasing and anisotropic filtering has practically no effect on the GeForce 8800. Only the Radeon X1950 XTX in CrossFire mode catches up with the 8800 GTX here and there. So if you were wondering on which card Gothic 3, Dark Messiah and Oblivion don't slow down, here is the answer: the GeForce 8800 GTX.
GeForce 8800 GTX | Two power sockets
Power is supplied through two sockets at the top of the board, and both are necessary: remove the cable from the left one and 3D performance drops dramatically. Want to drive your neighbors crazy? Pull out the right one — the board will not start at all, and the shriek it emits would be the envy of your car alarm. Note that nVidia recommends at least a 450-watt power supply for the GeForce 8800 GTX, capable of delivering 30 amps on the 12-volt line.
GeForce 8800 GTX must have both power sockets connected.
The two power sockets are easy to explain: according to the PCI Express specification, a single PCIe slot may supply no more than 75 watts. Our test system consumes about 180 watts in 2D mode alone — a whopping 40 watts more than with a Radeon X1900 XTX or X1950 XTX. In 3D mode the system draws about 309 watts, while the same Radeon X1900/1950 XTX setups consume 285 to 315 watts. What the GeForce 8800 spends so much energy on in plain Windows, we do not understand.
Two more connectors are reserved for SLI mode. According to nVidia's documentation, SLI requires only one plug; the second is not used yet. In theory, with two connectors you could link more than two cards in a multi-board system. The second connector may also be tied to the growing popularity of hardware physics: perhaps another video card will be connected through it to compute the physics in a game engine, or perhaps it points to Quad SLI on four boards, or something like that.
An additional connector is reserved for SLI, but with the current driver version only one can be used.
GeForce 8800 GTX | Quiet cooling system
The GeForce 8800 features a very quiet 80mm turbo cooler. Like the Radeon X1950 XTX, it’s located at the very end of the board to force cool air across the entire surface of the GeForce 8800 and out. A special grille is installed at the end of the board, which releases hot air not only outside, through the hole that occupies the second slot, but also down, right into the case. In general, the system is simple, but there are a number of controversial points.
Warm air is expelled through the hole in the second slot, but some of it gets back into the case through the grille on the side of the GeForce 8800 GTX.
If the PCIe slots in your computer sit close together, two boards in SLI will end up with only a small gap between them, and the temperature in that spot will be considerable: the bottom card gets additional heating from the top one through that same side grille in the cooling system. What would happen with three cards is better not to think about — you would get an excellent household space heater, and in cold weather you could work next to an open window.
Installed on its own, the board's cooling system is impressive and does its job fully. Like the GeForce 7900 GTX, it works quietly: during our entire six-hour test run under constant high load, the board was never audible. Even at full load, the cooler copes with heat removal at medium speeds; put your ear close to the back of the computer and you will hear only a slight, soft rustle.
The 80 mm cooler runs quietly and never spins up to full capacity. The board's cooling system occupies two slots.
The special ForceWare 96.94 driver that nVidia prepared for the GeForce 8800 GTX does not output temperature monitoring data. Before this release you could choose between the classic and new interfaces, but the 96.94 press build contains only the new version of the settings panel. If you try to open the frequency and temperature settings, the driver sends you to the nVidia website to download the nTune utility, which is where these functions are configured. We downloaded the 30 MB archive and installed it; on first launch we got a complete freeze of the computer and Windows.
After installing nTune, choosing the frequency and temperature settings in the panel opens a special information page showing the motherboard settings — you will not find any frequency or temperature controls there. So we measured temperature the classical way, with an infrared thermometer: under full load the card reached 71 degrees Celsius, and in 2D mode it stayed between 52 and 54 degrees.
Let's hope nVidia releases a standard version of ForceWare for the GeForce 8800. The classic configuration interface is often more convenient; it displays temperature information, and coolbits can be used to adjust frequencies. The new driver paired with nTune weighs about 100 megabytes and is split across a considerable number of tabs and windows, which is not always convenient to work with.
The GeForce 8800 GTX chip has as many as 681 million transistors and is manufactured using 90 nanometer technology at the TSMC factory.
The G80 in the GeForce 8800 GTX has 681 million transistors — more than twice as many as the Conroe core of Intel Core 2 Duo processors or the GeForce 7 chip. The GPU runs at 575 MHz; the memory interface is 384-bit and serves 768 megabytes. For memory, nVidia used high-speed GDDR3 running at 900 MHz.
For comparison: GeForce 7900 GTX memory runs at 800 MHz, and GeForce 7950 GT at 700 MHz. The Radeon X1950 XTX graphics cards use GDDR4 memory at 1000 MHz. The GeForce 8800 GTS card has a core frequency of 500 MHz, a memory capacity of 640 MB with a frequency of 800 MHz.
The test results show that enabling full-screen anti-aliasing and anisotropic filtering finally costs almost nothing in performance. In resource-hungry games like Oblivion you used to have to watch these settings; now you can turn everything up to maximum. On previous-generation nVidia cards such games ran smoothly only at resolutions up to 1024x768, while HDR rendering with Shader Model 3.0 pixel shaders consumed enormous resources. The new cards are powerful enough that enabling 4xAA and 8xAF allows playing at up to 1600x1200 without problems. The G80 chip supports anti-aliasing up to 16x and anisotropic filtering up to 16x.
GeForce 8800 GTX supports 16x anti-aliasing and anisotropic filtering.
In single-card configurations, the GeForce 8800 GTX has no competitor from ATi. The new nVidia card can finally combine Shader Model 3.0 HDR rendering with anti-aliasing. HDR rendering allows extreme reflections and glare, simulating the blinding effect of stepping out of a dark room into bright light. Unfortunately, many older games — Half-Life 2: Episode One, Need for Speed: Most Wanted, SpellForce 2, Dark Messiah and others — use only Shader Model 2.0 for HDR effects, while newer games like Gothic 3 or Neverwinter Nights 2 still use the older Bloom method, as Black & White 2 did. And although Neverwinter Nights 2 can be configured for HDR rendering, the developer is cautious with these features so that players with ordinary hardware can still get acceptable frame rates. Oblivion does this right, offering both Bloom and outstanding Shader Model 3.0 HDR rendering effects.
The G80 supports Shader Model 4.0, and its most important innovation is the reworked rendering pipeline, which is no longer divided into pixel and vertex shaders. The new shader core can process any data: vertex, pixel, geometry and even physics. This has not hurt performance — Oblivion runs almost twice as fast as on a pixel-optimized Radeon X1900 XTX or X1950 XTX.
What the card offers for DirectX 10 cannot be tested yet: Windows Vista, DirectX 10 and games for it do not yet exist. On paper, however, everything looks more than decent: geometry shaders support displacement mapping, which will allow far more realistic rendering of things such as stereoscopic effects, trough-shaped objects and corrugated surfaces. Stream Output will enable even better shader effects for particles and physics. Quantum Effects technology handles the computation of smoke, fog, fire and explosions, offloading it from the CPU. All this together promises significantly more shader and physics effects in future games; how it will be implemented in practice, in which games and in what form, the future will show.
GeForce 8800 GTX | Boards in the test

Video cards based on nVidia:

| Video card | Chip | Memory | SLI | Vert./pix. shaders | Core clock | Memory clock |
|---|---|---|---|---|---|---|
| nVidia GeForce 8800 GTX | G80 | 768 MB GDDR3 | yes | 4.0 | 575 MHz | 1800 MHz |
| Asus + Gigabyte GeForce 7900 GTX SLI | G71 | 512 MB GDDR3 | yes | 3.0/3.0 | 650 MHz | 1600 MHz |
| Gigabyte GeForce 7900 GTX | G71 | 512 MB GDDR3 | yes | 3.0/3.0 | 650 MHz | 1600 MHz |
| nVidia GeForce 7950 GT | G71 | 512 MB GDDR3 | yes | 3.0/3.0 | 550 MHz | 1400 MHz |
| Asus GeForce 7900 GT Top | G71 | 256 MB GDDR3 | yes | 3.0/3.0 | 520 MHz | 1440 MHz |
| nVidia GeForce 7900 GS | G71 | 256 MB GDDR3 | yes | 3.0/3.0 | 450 MHz | 1320 MHz |
| Asus GeForce 7800 GTX EX | G70 | 256 MB GDDR3 | yes | 3.0/3.0 | 430 MHz | 1200 MHz |
| Gigabyte GeForce 7800 GT | G70 | 256 MB GDDR3 | yes | 3.0/3.0 | 400 MHz | 1000 MHz |
| Asus GeForce 7600 GT | G73 | 256 MB GDDR3 | yes | 3.0/3.0 | 560 MHz | 1400 MHz |
| nVidia GeForce 6800 GT | NV45 | 256 MB GDDR3 | yes | 3.0/3.0 | 350 MHz | 1000 MHz |
| Gainward GeForce 7800 GS+ GSa AGP | G71 | 512 MB GDDR3 | yes | 3.0/3.0 | 450 MHz | 1250 MHz |
The following table lists the ATi cards we tested.
Video cards based on ATi (chip, memory, vertex/pixel shaders, core clock, memory clock):
- Club 3D + Club 3D Radeon X1950 XTX CF: R580+, 512 MB GDDR4, 3.0/3.0, 648 MHz, 1998 MHz
- Club 3D Radeon X1950 XTX: R580+, 512 MB GDDR4, 3.0/3.0, 648 MHz, 1998 MHz
- HIS + HIS Radeon X1900 XTX CF: R580, 512 MB GDDR3, 3.0/3.0, 621 MHz, 1440 MHz
- Gigabyte Radeon X1900 XTX: R580, 512 MB GDDR3, 3.0/3.0, 648 MHz, 1548 MHz
- Power Color Radeon X1900 XT: R580, 512 MB GDDR3, 3.0/3.0, 621 MHz, 1440 MHz
- ATI Radeon X1900 XT: R580, 256 MB GDDR3, 3.0/3.0, 621 MHz, 1440 MHz
- Sapphire Radeon X1900 GT: R580, 256 MB GDDR3, 3.0/3.0, 574 MHz, 1188 MHz
- HIS Radeon X1650 Pro Turbo: RV535, 256 MB GDDR3, 3.0/3.0, 621 MHz, 1386 MHz
- Gecube Radeon X1300 XT: RV530, 256 MB GDDR3, 3.0/3.0, 560 MHz, 1386 MHz
GeForce 8800 GTX | Test configuration
We used three reference systems for testing, all based on identical components: a dual-core AMD Athlon 64 FX-60 processor at 2.61 GHz, 2 GB of Mushkin HP 3200 2-3-2 RAM, and two 120 GB Hitachi hard drives in a RAID 0 configuration. The difference lay in the motherboards. For tests of single cards and nVidia cards in SLI mode, we used the Asus A8N32-SLI Deluxe motherboard. To test video cards in CrossFire mode (indicated by the abbreviation CF in the graphs below), we used the same computer with an ATi reference motherboard based on the RD580 chipset. Finally, AGP video cards were tested on a computer in the same configuration but with an Asus AV8 Deluxe motherboard. The configuration data is summarized in the tables below.
For all nVidia graphics cards (including SLI) and single ATi cards:
- Processor: dual-core AMD Athlon 64 FX-60, 2.61 GHz
- Bus frequency: 200 MHz
- Motherboard: Asus A8N32-SLI Deluxe
- Chipset: nVidia nForce4
- Memory: Mushkin 2×1024 MB HP 3200 2-3-2
- Hard disk: Hitachi 2 × 120 GB SATA, 8 MB cache
- DVD: Gigabyte GO-D1600C
- LAN controller: Marvell
- Sound controller: Realtek AC97
- Power supply: Silverstone SST-ST56ZF 560W

For tests of ATi video cards in CrossFire mode:
- Processor: dual-core AMD Athlon 64 FX-60, 2.61 GHz
- Bus frequency: 200 MHz
- Motherboard: reference ATi
- Chipset: ATI RD580
- Memory: Mushkin 2×1024 MB HP 3200 2-3-2
- LAN controller: Marvell
- Sound controller: AC97

For testing AGP graphics cards:
- Processor: dual-core AMD Athlon 64 FX-60, 2.61 GHz
- Bus frequency: 200 MHz
- Motherboard: Asus AV8 Deluxe
- Chipset: VIA K8T800 Pro
- Memory: Mushkin 2×1024 MB HP 3200 2-3-2
- LAN controller: Marvell
- Sound controller: Realtek AC97
We used Windows XP Professional with SP1a to test single and nVidia cards in SLI mode. CrossFire boards and AGP video cards were tested on systems with Windows XP Professional SP2 installed. Driver and software versions are summarized in the following table.
Drivers and configuration
- ATi graphics cards: ATI Catalyst 6.6 (X1900 XTX, X1950 + CrossFire, X1650 + CrossFire, X1300 XT + CrossFire, CrossFire X1900, CrossFire X1600 XT); ATI Catalyst 6.7 (equivalent to Catalyst 6.8) for CrossFire X1600 Pro, CrossFire X1300 Pro, CrossFire X1300; ATI Catalyst 6.8
- nVidia graphics cards: nVidia ForceWare 91.31; 7900 GS: ForceWare 91.47; 7950 GT: ForceWare 91.47 (Special); 8800 GTX: ForceWare 96.94 (Special)
- Operating system: single cards and SLI: Windows XP Pro SP1a; ATI CrossFire and AGP graphics cards: Windows XP Pro SP2
- DirectX version: 9.0c
- Chipset driver: nVidia nForce4 6.86; AGP: VIA Hyperion Pro V509A
GeForce 8800 GTX | Test results
We received the reference board directly from nVidia at THG. For testing, we were provided with a special ForceWare 96.94 driver prepared exclusively for the press. The GeForce 8800 GTX is a card that supports DirectX 10 and Shader Model 4.0, and its performance in DirectX 9 applications using Pixel Shader 2.0 or 3.0 is staggering.
Enabling anti-aliasing and anisotropic filtering has almost no performance impact. In Half-Life 2: Episode One, the GeForce 8800 GTX cannot be slowed down: at 1600×1200 the chip is 60 percent faster than the Radeon X1950 XTX, and in Oblivion performance is twice that of the Radeon X1900 XTX or X1950 XTX. In Prey, the card at 1600×1200 is a whopping 55 percent faster than the Radeon X1950 XTX. In Titan Quest, the frame rate stays at 55 fps no matter what resolution you set.
In tests of Half-Life 2: Episode 1 with HDR rendering, the board’s results are impressive, but at low resolutions it loses to Radeon X1950 XTX and boards in CrossFire mode, being approximately on the same level with SLI solutions on the GeForce 7900 GTX. Note that at low resolutions, the video card is not the limiting factor. The higher we turn up the settings, the more interesting the result.
With anti-aliasing and anisotropic filtering enabled, the picture begins to change. All boards lose some performance, but the GeForce 8800 GTX drops only slightly, by about 10 fps on average, while the dual ATi Radeon X1950 XTX in CrossFire mode loses as much as 20 fps.
As soon as we step above 1280×1024 with anti-aliasing and anisotropic filtering turned on, the single GeForce 8800 GTX board takes the lead, exceeding the Radeon X1950 XTX by almost 35 fps. This is a significant difference.
It gets better. At 1600×1200 with anti-aliasing and anisotropic filtering, the gap over all the other boards becomes enormous: twice the performance of the GeForce 7900 GTX SLI, and only slightly behind the CrossFire Radeon X1950 XTX.
Finally, let's look at how fps declines as resolution and image quality increase. The GeForce 8800 GTX shows only a slight performance drop: from plain 1024×768 to 1600×1200 with anti-aliasing and anisotropic filtering, the difference is just over 20 fps. The previous top solutions from ATi and nVidia are left far behind.
Hard Truck: Apocalypse is demanding on both the graphics card and the CPU. This explains the virtually identical performance at 1024×768 when simple trilinear filtering is used and full-screen anti-aliasing is turned off.
As soon as you switch to 4xAA and 8x anisotropic filtering, the results start to diverge. The lower-end cards lose significant performance without delivering a corresponding improvement in picture quality.
At 1280×960 the difference increases even more, but the GeForce 8800 GTX demonstrates the same results. It is clearly seen that the Athlon 64 FX-60 is not capable of bringing this video card to its knees.
At 1600×1200 the performance of all single boards drops to unplayable levels, but the GeForce 8800 GTX, true to form, delivers 51 fps.
Consider the decrease in performance as settings increase. The CrossFire Radeon X1950 XTX and the GeForce 7900 GTX hang on, while the older-generation cards have long been on their knees begging for mercy.
In Oblivion, a game that loads the graphics card to the limit, the picture is initially depressing for all boards except the Radeon X1950 XTX in CrossFire. We collected statistics on how the video cards perform in open locations and when rendering indoors. In the open air, the GeForce 8800 GTX stands next to or slightly behind the dual Radeon X1950 XTX.
But when the resolution reaches 1600×1200, our GeForce 8800 GTX goes far forward. The gap is especially visible at closed levels.
Look at the decrease in performance as resolution and quality increase. The picture does not need comments. In closed locations, the speed is unshakable.
In Prey, the video card sits between the single ATi Radeon X1950 XTX and the same boards in CrossFire mode, and the higher the resolution, the better the GeForce 8800 GTX looks.
Comparing the GeForce 8800 GTX with single-board solutions from ATi and nVidia is pointless. The gap at high resolutions is huge, and even at 1024×768 with anti-aliasing it is impressive.
In Rise of Nations: Rise of Legends, the graphics card is the undisputed leader. Calculated as a percentage, the gap between the CrossFire Radeon X1950 XTX and the GeForce 8800 GTX is very large; counted in fps, the difference is less dramatic but still significant.
Note how the speed decreases with increasing resolution. At all settings, the GeForce 8800 GTX is a leader not only in comparison with single boards, but also with SLI/CrossFire solutions.
In Titan Quest, nVidia cards perform at their best. At the same time, fps does not change from 1024×768 to 1600×1200 with anti-aliasing and anisotropic filtering.
The picture of what is happening is well illustrated by the following graph. The performance of the GeForce 8800 GTX is at the same level regardless of the settings.
In 3DMark06 the card performs well in both the Shader Model 2.0 and Shader Model 3.0 tests. Note how small the performance penalty is when both anisotropic filtering and anti-aliasing are enabled.
Higher resolutions hold no terrors for the card either: it stays on par with SLI and CrossFire solutions and is well ahead of all previous single-card leaders.
To give you a better idea of gaming performance, we have rearranged the graphs. There is no comparison here, only the pure results of one video card. It is worth noting that the performance of the GeForce 8800 GTX barely changes from resolution to resolution. In all games, the limiting factor is the insufficiently fast AMD Athlon 64 FX-60 processor. In the future, with the release of much faster chips, the card will perform even better in the same games; we suspect that even the latest-generation Core 2 Quad cannot push the GeForce 8800 GTX to its limit.
So, having finished with the test results, let's try to rank the video cards by efficiency. To do this, we gather the results of all the gaming tests and compare them with the price of each solution. We take the recommended prices as a basis, that is, without the markups of specific stores. Of course, the cards will be very expensive at first, and many stores will build excess margin into the price. But prices will then drop, and you will probably be able to get a GeForce 8800 GTX at a more reasonable price fairly soon.
As we can see, GeForce 8800 GTX outperforms almost all solutions, including dual CrossFire and SLI. In absolute terms, the GeForce 8800 GTX is very fast. But what about the price?
The price is appropriate: the manufacturer asks 635 euros. That is a lot, but for two Radeon X1900 XTX boards in CrossFire mode you will have to pay more, 700 euros, and for two Radeon X1950 XTX or SLI GeForce 7900 GTX as much as 750 euros. Given that in some tests a single GeForce 8800 GTX beats these solutions, and in any case takes up less width in the case, there is something to think about.
Finally, let's divide fps by price. We see that this figure is better than that of the SLI and CrossFire setups. Of course, each fps costs more than on a GeForce 7800 GTX EX, and noticeably more than on a Radeon X1300 XT, but the board's performance justifies it. In terms of price-performance ratio, it is a very effective solution.
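The division itself is simple; the sketch below illustrates such a price-performance calculation. The euro prices are the recommended prices quoted in this article, while the average fps figures are hypothetical placeholders for illustration only.

```python
# Price-performance sketch: average fps across all game tests divided by
# recommended price in euros. Prices are those quoted in this article;
# the fps averages are placeholder values, not measured results.
cards = {
    "GeForce 8800 GTX":           {"price_eur": 635, "avg_fps": 80},
    "Radeon X1900 XTX CrossFire": {"price_eur": 700, "avg_fps": 70},
    "Radeon X1950 XTX CrossFire": {"price_eur": 750, "avg_fps": 75},
}

for name, d in cards.items():
    # Normalize to "fps per 100 euros" so the numbers are easy to compare.
    fps_per_100_eur = d["avg_fps"] / d["price_eur"] * 100
    print(f"{name}: {fps_per_100_eur:.1f} fps per 100 euros")
```

With these placeholder numbers the single card comes out ahead of both dual-card setups, which matches the article's conclusion.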
We decided to supplement our review with the test results of the American laboratory THG, where the GeForce 8800 GTS also participated. Please note that due to differences in the test configuration, you should not directly compare the results above with the results of the American laboratory.
The GeForce 8800 GTX is longer than the Radeon X1950 XTX and most other cards on the market; the 8800 GTS is somewhat shorter.
Like other graphics card tests in 2006, we tested on the AMD Athlon FX-60 platform. We will also show the results of multi-GPU configurations. In addition, let’s evaluate how new video cards behave when performance is limited by the CPU (low resolution and picture quality).
- Processor: AMD Athlon 64 FX-60, 2.6 GHz, 1.0 GHz HTT, 1 MB L2 cache
- Platform: nVidia: Asus AN832-SLI Premium, nVidia nForce4 SLI, BIOS 1205
- Memory: Corsair CMX1024-4400Pro, 2×1024 MB DDR400 (CL 3.0-4-4-8)
- Hard disk: Western Digital Raptor WD1500ADFD, 150 GB, 10,000 rpm, 16 MB cache, SATA150
- Network: onboard nForce4 Gigabit Ethernet
- Video cards: ATi Radeon X1950 XTX 512 MB GDDR4, 650 MHz core, 1000 MHz memory (2.00 GHz DDR)

nVidia cards:
nVidia GeForce 8800GTX 768 MB GDDR3, 575 MHz core, 1.350 GHz stream processors, 900 MHz memory (1.80 GHz DDR)
XFX GeForce 8800GTS 640 MB GDDR3, 500 MHz core, 1.200 GHz stream processors, 800 MHz memory (1.60 GHz DDR)
nVidia GeForce 7900GTX 512MB GDDR3, 675MHz core, 820MHz memory (1.64GHz DDR)
- Power supply: PC Power & Cooling Turbo-Cool 1000 W
- CPU cooler: Zalman CNPS9700 LED

System software and drivers:
- OS: Microsoft Windows XP Professional 5.10.2600 Service Pack 2
- DirectX: 9.0c (4.09.0000.0904)
- Graphics drivers: ATi Catalyst 6.10 WHQL; nVidia ForceWare 96.94 Beta
During the first 3DMark run, we ran tests at all resolutions, but with FSAA and anisotropic filtering turned off. In the second run, we enabled the 4xAA and 8xAF image enhancement options.
nVidia is clearly in first place in 3DMark05. GeForce 8800 GTX gives the same result at 2048×1536 as ATi Radeon X1950 XTX at the default resolution of 1024×768. Impressive.
Doom 3 is usually dominated by nVidia cards, whose designs are well suited to this game, although ATi was recently able to «take» this game with its newer cards.
This is where we first encounter the limits of CPU processing power: at low resolution the result plateaus at around 126 frames per second. The ATi card is capable of higher frame rates on this system configuration, and the reason lies in the drivers. ATi releases drivers that load the CPU less, so the CPU has more headroom and can give more performance to the graphics subsystem.
Overall, the winners are the new 8800 cards. Looking at the results at all resolutions, the new DX10 cards outperform the Radeon X1950 XTX at 1280×1024 and up.
GeForce 8800 GTX and GTS | F.E.A.R.
In F.E.A.R., nVidia cards usually lead the way, but again the lower CPU load of ATi's drivers is noticeable. Of course, the results will differ on a faster platform, but if your computer is not high-end, this test clearly shows how the G80 cards will behave on it. Apart from the 1024×768 test, though, the G80 simply kills the Radeon X1950 XTX. The GTX is a monster: no matter what load we throw at the GeForce 8800 GTX, the card always delivers over 40 frames per second.
Click on the picture to enlarge.
The second screenshot (below) is taken on an 8800 GTX with the same settings.
Click on the picture to enlarge.
The nVidia picture is much better than the ATi screenshot; Nvidia seems to be back in the lead in this regard. This is another advantage of nVidia cards based on the G80 chip.
Here is a table of new quality enhancements on G80 cards.
In addition to the new DX10 graphics cards, nVidia has also revealed several features that will be available on the G80 cards. And the first of them is a patented image quality improvement technology called Coverage Sampled Antialiasing (CSAA).
The new version of full-screen anti-aliasing uses an area of 16 subsamples. According to nVidia, the result is the ability to compress «redundant color and depth information into the memory and bandwidth of four or eight multisamples». The new quality level works more efficiently by reducing the amount of data per sample. If CSAA does not work with a particular game, the driver falls back to traditional anti-aliasing methods.
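A rough back-of-envelope sketch shows why storing 16 coverage samples in the footprint of only 4 full samples matters. The per-sample sizes below (4 bytes of color plus 4 bytes of depth) are simplified assumptions for illustration, not the G80's actual memory layout.

```python
# Back-of-envelope framebuffer sizes: classic 16x multisampling stores 16
# full color/depth samples per pixel, while 16xCSAA stores only 4 full
# samples plus compact coverage information (ignored here for simplicity).
# The 8-bytes-per-sample figure is an assumption, not the real G80 format.
width, height = 1600, 1200
bytes_per_sample = 4 + 4          # 32-bit color + 32-bit depth (assumed)

def framebuffer_mb(stored_samples):
    return width * height * stored_samples * bytes_per_sample / 2**20

msaa16 = framebuffer_mb(16)       # classic 16x multisampling
csaa16 = framebuffer_mb(4)        # 16xCSAA storing 4 full samples
print(f"16x MSAA: {msaa16:.0f} MB, 16x CSAA: ~{csaa16:.0f} MB")
```

Even under these simplified assumptions, the storage (and hence bandwidth) difference is roughly fourfold, which is the point of the technique.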
Click on the picture to enlarge.
Before we end this review, let's talk about two more aspects of video cards that have been in development for a long time and will become more important over time. The first is video playback. During the reign of the GeForce 7, the ATI Radeon X1900 cards led the way in video playback quality. But the situation has changed with the arrival of the unified shader architecture and a dedicated PureVideo core.
Thanks to clever algorithms and 128 compute units, the GeForce 8800 GTX managed to score 128 out of 130 in HQV. In the near future, we plan to release a more detailed article regarding picture quality, so stay tuned to our website.
Finally, a very strong point of the G80 is what NVIDIA calls CUDA. For years, scientists and enthusiasts have been looking for ways to squeeze more performance out of powerful parallel processors. Not everyone can afford a Beowulf cluster, of course, so ordinary mortals have proposed various ways to use a video card for computing.
The problem is that the GPU is good at parallel computing but handles branching poorly; this is where the CPU comes in handy. Until now, anyone who wanted to use a graphics card for computation had to program shaders the way game developers do. NVIDIA decided to take the lead once again by introducing the Compute Unified Device Architecture, or CUDA.
This is how CUDA can work to simulate fluid.
nVidia has released a C compiler whose resulting programs can scale to the GPU's processing power (e.g. 96 stream processors in the 8800 GTS or 128 in the 8800 GTX). Programmers now have the opportunity to create programs that scale across both CPU and GPU resources. CUDA will certainly appeal to distributed computing projects. However, CUDA can be used not only for such block computations but also to simulate other effects: volumetric fluids, cloth and hair. Through CUDA, the calculation of physics and potentially even other game aspects can be moved to the GPU.
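The data-parallel model that CUDA exposes can be sketched as follows: the same small function (a "kernel") is applied independently to every element of a data set, and the work divides evenly across however many stream processors are present (96 on the 8800 GTS, 128 on the GTX). The Python below is only a toy model of that scaling behaviour; real CUDA code is written in C and runs on the GPU itself.

```python
# Toy model of CUDA-style data parallelism: one kernel function applied
# independently to every element, partitioned across N stream processors.
# This mimics the scaling behaviour only; it is not CUDA code.

def kernel(x):
    # Per-element work, e.g. one step of a particle or fluid update.
    return 2.0 * x + 1.0

def run_on_gpu(data, num_processors):
    # Each "processor" handles a contiguous chunk; on real hardware all
    # chunks run concurrently, so wall time scales with the largest chunk.
    steps = -(-len(data) // num_processors)   # ceil division
    out = [kernel(x) for x in data]           # the element-wise result
    return out, steps

data = list(range(1024))
_, steps_gts = run_on_gpu(data, 96)    # GeForce 8800 GTS: 96 processors
_, steps_gtx = run_on_gpu(data, 128)   # GeForce 8800 GTX: 128 processors
print(steps_gts, steps_gtx)            # fewer parallel steps on the GTX
```

The same program automatically "speeds up" on the wider chip, which is exactly the scaling property the CUDA compiler promises.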
Developers will be provided with a full SDK.
GeForce 8800 GTX and GTS | Conclusion
Those who now upgrade from a GeForce 6 card will get almost a threefold increase in performance. It does not matter when DirectX 10 games come out, or that Shader Model 4.0 is still ahead of us: already today the GeForce 8800 GTX is the fastest chip. Games like Oblivion, Gothic 3 or Dark Messiah seemed to be waiting for the G80 chip and its video cards. It became possible to play without stuttering again: the GeForce 8800 GTX has enough power for all the latest games.
The cooling system is quiet: the 80 mm cooler on the reference card is practically inaudible, and even at full load its rotation speed stays low. It will be interesting to see how ATI responds. In any case, nVidia has done a damn good job of releasing a really powerful piece of hardware.
Drawbacks: The board is 27 centimeters long and takes up the space of two PCI Express slots. The power supply must be at least 450 watts (12V, 30A). For the GeForce 8800 GTS, the minimum will be a 400 watt PSU with 30 amps on the 12 volt bus.
Following long tradition, nVidia cards are already available in online stores. On the international market, the recommended price is $599 for the GeForce 8800GTX and $449 for the GeForce 8800GTS. DX10 games should appear soon. Just as important, you will also get a better picture in existing games.
This is what a DX10-bred supermodel might look like. Click on the picture to enlarge.
GeForce 8800 GTX and GTS | Editor’s opinion
I’m personally impressed with nVidia’s DX10/D3D10 implementation. Watching Crysis in real time and numerous demos is impressive. The implementation of CUDA allows you to turn a video card into something more than just a frame renderer. Now programs will be able to use not only the resources of the CPU, but also the full parallel power of the universal shader core of the GPU. Can’t wait to see these solutions in reality.
But the G80 leaves much to be desired. What? Of course, new games. Gentlemen developers, be so kind as to release DX10 games as soon as possible.
GeForce 8800 GTX | Photo gallery
To begin with, NVIDIA has installed the G80 on two video cards: the GeForce 8800 GTX and the GeForce 8800 GTS.
GeForce 8800 Series Specifications (GeForce 8800 GTX / GeForce 8800 GTS):
- Number of transistors: 681 million / 681 million
- Core clock (including allocator, texture units, ROP units): 575 MHz / 500 MHz
- Shader (stream processor) clock: 1350 MHz / 1200 MHz
- Number of shaders (stream processors): 128 / 96
- Memory clock: 900 MHz (1.8 GHz effective) / 800 MHz (1.6 GHz effective)
- Memory interface: 384-bit / 320-bit
- Memory bandwidth: 86.4 GB/s / 64 GB/s
- Number of ROPs: 24 / 20
- Memory capacity: 768 MB / 640 MB
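The bandwidth figures in the table follow directly from the bus width and the effective memory clock: bandwidth equals the bus width in bytes multiplied by the effective transfer rate.

```python
# Verify the table's bandwidth figures:
# bandwidth (GB/s) = (bus width in bits / 8) * effective memory clock in GHz
def bandwidth_gb_s(bus_bits, effective_ghz):
    return bus_bits / 8 * effective_ghz

gtx = bandwidth_gb_s(384, 1.8)   # 8800 GTX: 384-bit bus, 1.8 GHz effective
gts = bandwidth_gb_s(320, 1.6)   # 8800 GTS: 320-bit bus, 1.6 GHz effective
print(f"GTX: {gtx:.1f} GB/s, GTS: {gts:.1f} GB/s")
```

Both results match the table: 86.4 GB/s for the GTX and 64 GB/s for the GTS.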
As you can see, the number of transistors in the GeForce 8800 GTX and 8800 GTS is the same, because they use absolutely identical G80 GPUs. As already mentioned, the main difference between these GPU variants is two disabled banks of stream processors, 32 shaders in total, reducing the number of working shader units from 128 on the GeForce 8800 GTX to 96 on the GeForce 8800 GTS. NVIDIA also disabled one group of ROPs (rasterization units), cutting the count from 24 to 20.
The core and memory frequencies of the two video cards also differ: the core of the GeForce 8800 GTX runs at 575 MHz versus 500 MHz for the GeForce 8800 GTS, and the GTX's shader units operate at 1350 MHz versus 1200 MHz for the GTS. The GeForce 8800 GTS also uses a narrower 320-bit memory interface and 640 MB of slower memory running at 800 MHz, while the GeForce 8800 GTX has a 384-bit memory interface and 768 MB of memory at 900 MHz. And, of course, a completely different price.
The video cards themselves are very different:
As you can see in these photos, the GeForce 8800 reference boards are black (a first for NVIDIA). With their cooling modules, both the GeForce 8800 GTX and the 8800 GTS occupy two slots. The GeForce 8800 GTX is slightly longer than the GeForce 8800 GTS: 267 mm versus 229 mm. As previously stated, the GeForce 8800 GTX has two PCIe power connectors. Why two? The maximum power consumption of the GeForce 8800 GTX is 177 W. NVIDIA says this is an extreme case, reached only when all the GPU's functional units are loaded to the maximum; in normal games during testing, the video card consumed 116-120 W on average, with a maximum of 145 W.
Each external PCIe power connector on the video card is rated for a maximum of 75 W, and the PCIe slot is also rated for a maximum of 75 W, so one connector plus the slot (150 W) would not be enough to supply 177 W; hence the need for two external PCIe power connectors. The second connector gives the 8800 GTX solid headroom. The maximum power consumption of the 8800 GTS, by the way, is 147 W, so it can get by with one PCIe power connector.
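The arithmetic behind the second connector is worth spelling out, using the 75 W limits cited above.

```python
# Power budget for the 8800 GTX: each source is capped at 75 W by the
# PCI Express specification, and peak board draw is 177 W (per this review).
slot = 75            # PCIe x16 slot
connector = 75       # one external PCIe power connector
peak_draw = 177

assert slot + connector < peak_draw       # 150 W: one connector is not enough
assert slot + 2 * connector >= peak_draw  # 225 W: two connectors suffice
print("headroom with two connectors:", slot + 2 * connector - peak_draw, "W")
```

The 8800 GTS, with a 147 W peak, fits inside the 150 W available from the slot plus a single connector, which is why it ships with only one.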
Another feature added to the design of the GeForce 8800 GTX reference board is a second SLI slot, a first for NVIDIA GPUs. NVIDIA doesn’t officially announce the purpose of the second SLI connector, but the journalists managed to get the following information from the developers: “The second SLI connector in the GeForce 8800 GTX is intended for hardware support of a possible expansion of the SLI configuration. Only one SLI connector is used with current drivers. Users can connect the SLI bridge to both the first and second contact groups.”
Based on this, and the fact that nForce 680i SLI motherboards come with three PCI Express (PEG) slots, we can conclude that NVIDIA plans to support three SLI video cards in the near future. Another option would be to increase the power for the SLI physics, but that doesn’t explain why the GeForce 8800 GTS doesn’t have a second SLI connector.
It can be assumed that NVIDIA reserves its GX2 “Quad SLI” technology for the less powerful GeForce 8800 GTS, while the more powerful GeForce 8800 GTX will run in a triple SLI configuration.
If you remember, NVIDIA’s original Quad SLI video cards are closer in performance to GeForce 7900 GT than GeForce 7900 GTX, as 7900 GT video cards have lower power consumption/heat dissipation. It is natural to assume that NVIDIA will follow the same path with the GeForce 8800. Gamers with motherboards with three PEG slots will be able to increase the performance of the graphics subsystem by assembling a triple SLI 8800 GTX configuration, which in some cases will give them better performance than Quad SLI system, judging by the characteristics of the 8800 GTS.
Again, this is just a guess.
The cooling block of the GeForce 8800 GTS and 8800 GTX is dual-slot and ducted, venting hot air from the GPU outside the computer case. The cooler consists of a large aluminum heatsink, copper and aluminum heat pipes, and a copper plate that presses against the GPU. The whole structure is blown through by a large radial fan, which looks a little intimidating but is actually quite quiet. The 8800 GTX's cooling system is similar to the 8800 GTS's, except that the former has a slightly longer heatsink.
In general, the new cooler does a pretty good job of cooling the GPU while remaining almost silent, like those of the GeForce 7900 GTX and 7800 GTX 512MB video cards, although the GeForce 8800 GTS and 8800 GTX are slightly more audible. In some cases, you will have to listen closely to hear the fan at all.
All production of the GeForce 8800 GTX and 8800 GTS is carried out under an NVIDIA contract. This means that whether you buy a video card from ASUS, EVGA, PNY, XFX or any other manufacturer, they are all made by the same company. NVIDIA does not even allow manufacturers to overclock the first batches of GeForce 8800 GTX and GTS video cards: they all go on sale with the same clock speeds regardless of the manufacturer. But they are allowed to install their own cooling systems.
For example, EVGA has already released its e-GeForce 8800 GTX ACS3 Edition with its unique ACS3 cooler. The ACS3 video card is hidden in a single large aluminum cocoon. It has the letters E-V-G-A on it. For additional cooling, EVGA placed an additional heatsink on the back of the graphics card, directly opposite the G80 GPU.
In addition to cooling, manufacturers of the first GeForce 8800 video cards can differentiate their products only through warranty terms and bundles of games and accessories. For example, EVGA bundles its graphics cards with Dark Messiah, while BFG's GeForce 8800 GTS comes with a BFG T-shirt and a mouse pad.
It will be interesting to see what happens next — many NVIDIA partners believe that for future releases of GeForce 8800 video cards, NVIDIA’s restrictions will not be so strict, and they will be able to compete in overclocking.
Since all the video cards come off the same line, every GeForce 8800 has two dual-link DVI connectors with HDCP support. In addition, it has become known that NVIDIA does not plan to change the amount of memory on the GeForce 8800 GTX and GTS (for example, no 256 MB GeForce 8800 GTS or 512 MB 8800 GTX). At least for now, the standard configuration is 768 MB for the GeForce 8800 GTX and 640 MB for the GeForce 8800 GTS. NVIDIA also has no plans for an AGP version of the GeForce 8800 GTX/GTS.
NVIDIA has made several changes in the GeForce 8800 driver that we should mention. First of all, the traditional overclocking utility Coolbits has been removed, replaced by NVIDIA nTune: if you want to overclock a GeForce 8800 video card, you will need to download the nTune utility. This is probably good news for owners of nForce-based motherboards, since nTune can be used not only to overclock the video card but also to configure the system. Those who have upgraded to Core 2 on a 975X or P965 motherboard, however, will have to download a 30 MB application just to overclock their graphics card.
Another change in the new driver that we noticed is that there is no option to switch to the classic NVIDIA control panel. Hopefully NVIDIA will bring this feature back into their video driver as it was well liked by many, unlike the new NVIDIA control panel interface.
Let's get acquainted with the GeForce 8800GTS 512 boards, compare them with the cheaper GeForce 8800GT and the veteran GeForce 8800GTX, and along the way bring up a new test bench and catalogue the flaws in the DX10 drivers.
With the release of the new GeForce 8800GTS 512 series, NVIDIA has significantly strengthened its position. The new product replaced the more expensive, hotter and bulkier GeForce 8800GTX, and its only drawback compared to its predecessor is a narrower 256-bit memory bus (versus 384-bit on the GTX). However, the newcomer received not only cuts but also some improvements: the number of texture units was increased from 32 to 64, which partly compensates for the card's simplifications. The clock frequencies were also raised relative to the predecessor, and the video memory is easily expanded to 1 GB simply by installing larger-capacity chips, which some manufacturers have already begun to do. But although the GeForce 8800GTS 512 replaced the GeForce 8800GTX, its main competitor is not its predecessor but its closest relative, the GeForce 8800GT, and the whole point is the latter's lower price. The GeForce 8800GTS 512 and GeForce 8800GT differ little from each other, since the GeForce 8800GT is a cut-down version of the GeForce 8800GTS 512 that, oddly enough, appeared on the market before the full version. Both video cards carry 512 MB of video memory and, as today's study showed, the same memory chips. The main differences lie in the GPU: in the GT version, some of its functional blocks are disabled. See the table below for details:
As you can see, GeForce 8800GT differs from its older sister in the number of universal processors reduced to 112 and the number of texture units reduced to 56. Initially, the cards also differ in clock speeds, but this does not matter for our today’s review, since almost all cards have been factory overclocked. Let’s find out how the differences on paper are reflected in reality.
Leadtek 8800GTS 512
Designers from Leadtek chose a bright orange color to attract attention to their video card, and they were absolutely right: the new product will not go unnoticed.
The face of the new product is an image of a scene from a fictional shooter, beneath which are the technical specifications of the graphics card and a note about the bonus: a full version of the game Neverwinter Nights 2.
The reverse side of the box contains the specifications of the video card, a list of the package, and standard information from NVIDIA.
- S-video > S-video + component out splitter;
- DVI to D-sub adapter;
- CD with drivers;
- Power DVD 7 CD;
The Leadtek 8800GTS 512 graphics card is based on the reference design already familiar from GeForce 8800GT boards. Outwardly, the new product is distinguished by its «two-story» cooling system, which, unlike its predecessor's, exhausts hot air outside the computer case. The advantages of such a solution are obvious, and the reason for using the improved cooling system is most likely not that the «new» chip runs hotter, but that a buyer paying big money has every right to get the best product. To be honest, the GeForce 8800GT's reference cooler does not fulfil its duties in the best way.
The reverse sides of the GeForce 8800GTS 512 and GeForce 8800GT look almost identical, differing in that the 8800GTS 512 version has all its components mounted. We will see the differences later using the Leadtek 8800GT as an example; for now, let's look at the new product «under the hood».
Having removed the cooling system, we can once again verify that the boards are identical. However, pay attention to the right-hand side of the board, where the power subsystem is located. Where the GeForce 8800GT is empty, with only vacant solder pads, the Leadtek 8800GTS 512 is densely populated with components. The GeForce 8800GTS 512 thus has a more complex power subsystem than the GeForce 8800GT. This is not surprising: the GeForce 8800GTS 512 runs at higher operating frequencies and consequently places more stringent demands on power quality.
There are no external differences between the G92 chip in the Leadtek 8800GTS 512 and the G92 in GeForce 8800GT video cards.
The new video card uses the same Qimonda chips with 1.0 ns access time as the GeForce 8800GT. A set of eight chips forms 512 MB of video memory. The rated frequency for such chips is 2000 MHz DDR, although the actual frequency set on the video card is a little lower.
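The 2000 MHz DDR rating follows directly from the 1.0 ns access time: a 1.0 ns cycle corresponds to a 1000 MHz clock, and DDR memory transfers data on both clock edges.

```python
# Rated speed of 1.0 ns GDDR3 chips: cycle time -> clock -> DDR data rate.
cycle_ns = 1.0
clock_mhz = 1000 / cycle_ns     # 1 cycle per 1.0 ns = 1000 MHz
ddr_rate_mhz = 2 * clock_mhz    # two transfers per clock (DDR)
print(ddr_rate_mhz)             # 2000.0 MHz, matching the chips' rating
```

The 1944 MHz actually set on the Leadtek card thus leaves a small margin below the chips' rated speed.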
The graphics card's cooling system is aluminum with a copper plate. This combination of two materials has long been in use, achieving the desired efficiency with less weight and at lower cost. The machining of the copper «core» is at a satisfactory level.
After removing the casing from the cooling system, we see
amazing picture: they are engaged in heat removal from the copper base
as many as three heat pipes that go to different parts of the radiator from
aluminum plates. This scheme serves to evenly distribute
heat, and the large dimensions of the radiator should in the best way
affect the quality of cooling, which cannot be said about the reference system
cooling GeForce 8800GT. There are also three heat pipes, but their dimensions
noticeably smaller, as are the dimensions of the radiator itself.
Differences, overclocking and cooling efficiency
The differences from the GeForce 8800GT lie in the number of unified stream processors, increased from 112 to 128, and in the higher operating frequencies of the entire GPU.
The Leadtek 8800GTS 512's frequencies match NVIDIA's recommended ones: 650/1625 MHz for the GPU and 1944 MHz for the video memory.
Now, about the card's heating, which we will check using the game Oblivion at maximum settings.
The Leadtek 8800GTS 512 warmed up from 55 degrees at idle to 71 degrees, while the fan remained practically inaudible. This was not enough for overclocking, however, so using the familiar RivaTuner we raised the fan speed to 50% of its maximum. After that, the GPU temperature did not rise above 64 degrees while the noise level stayed low. The Leadtek 8800GTS 512 overclocked to 756/1890 MHz for the GPU and 2100 MHz for the video memory. Such high frequencies were out of reach for the GeForce 8800GT, apparently because of its simplified power subsystem.
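In relative terms this is a substantial overclock; a small sketch (Python, using the numbers from the text):

```python
# Percentage gain of the Leadtek 8800GTS 512 overclock over its stock clocks.
stock = {"core": 650, "shader": 1625, "memory": 1944}        # MHz
overclocked = {"core": 756, "shader": 1890, "memory": 2100}  # MHz

for domain in stock:
    gain = (overclocked[domain] / stock[domain] - 1) * 100
    print(f"{domain}: +{gain:.1f}%")
# core and shader gain about 16.3%, memory about 8.0%
```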
Well then, let's meet the next participant in today's testing: the ASUS EN8800GTS TOP video card.
ASUS EN8800GTS TOP
Looking at the packaging of powerful ASUS graphics cards, you might get the feeling that it holds not a video card but, say, a motherboard. It's all about the large dimensions: in our case, the box is noticeably bigger than that of the first participant in today's testing. The large front surface of the package fits a big image of the brand's archer girl and a sizable chart claiming a 7% speed advantage over the «regular» GeForce 8800GTS 512. The «TOP» suffix in the card's name indicates a factory overclock. A drawback of the packaging is that it is not obvious the card belongs to the GeForce 8800GTS 512 series, but that, by and large, is a minor point. At first it is surprising how little information there is on the box, but the truth reveals itself later, on its own: you only need to pick the box up by its handle, and at the first breath of a breeze it opens like a book. The information under the cover is devoted entirely to proprietary ASUS utilities, in particular ASUS Gamer OSD, which can now not only change brightness/contrast/color in real time, but also show the FPS value, record video and take screenshots. The second utility described, Smart Doctor, is designed to monitor the card's supply voltages and frequencies, and also allows you to overclock it. Notably, the ASUS utility can change two GPU frequencies, the core and the shader block, which brings it close to the well-known RivaTuner utility.
The back of the box contains a little of everything, including a short description of the Video Security utility, which lets the computer be used as a «smart» online video-surveillance system.
The card's bundle follows the principle of «nothing superfluous»:
- adapter for powering PCI-express cards;
- S-video > component out adapter;
- DVI to D-sub adapter;
- bag for 16 discs;
- CD with drivers;
- Documentation CD;
- brief instructions for installing a video card.
Externally, the video card is an almost exact copy of the Leadtek 8800GTS 512, which is not surprising: both cards follow the reference design and were most likely produced at the same factory to NVIDIA's order, only then being shipped to Leadtek and ASUS. In other words, today's card from Leadtek could easily have become a card from ASUS, and vice versa.
Naturally, the back side of the card is also no different from that of the Leadtek 8800GTS 512, apart from the stickers.
Under the cooling system there is nothing unusual either. The power subsystem on the right side of the board is fully populated; in the center sits the G92 graphics processor with 128 active stream processors, surrounded by eight memory chips forming 512 MB in total.
The memory chips are manufactured by Qimonda and have an access time of 1.0 ns,
which corresponds to a frequency of 2000 MHz.
The appearance of the GPU does not betray its noble origin, just as on the Leadtek 8800GTS 512.
The ASUS EN8800GTS TOP's cooling system is exactly the same as the Leadtek 8800GTS 512's: a copper «core» for removing heat from the GPU is mounted in an aluminum heatsink.
The polishing quality of the copper core is satisfactory, as on the first card. Heat from the copper core is spread across the aluminum fins by three copper heat pipes, a solution whose effectiveness we have already seen on the first card.
Rated frequencies and overclocking
As we have already said, the TOP suffix in the card's name denotes a factory overclock. The new card's nominal frequencies are 740/1780 MHz for the GPU (versus 650/1625 MHz for the Leadtek) and 2072 MHz for the video memory (versus 1944 MHz for the Leadtek). Note that for memory chips with a 1.0 ns access time, the nominal clock frequency is 2000 MHz.
We managed to overclock the card to the same frequencies as the Leadtek 8800GTS 512: 756/1890 MHz for the GPU and 2100 MHz for the video memory, at a fan speed of 50% of maximum.
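It is interesting to compare the factory clock uplift with the 7% performance advantage claimed on the box; a quick estimate (Python, illustrative only):

```python
# Factory overclock of the ASUS EN8800GTS TOP relative to reference clocks.
reference = {"core": 650, "shader": 1625, "memory": 1944}  # MHz
asus_top  = {"core": 740, "shader": 1780, "memory": 2072}  # MHz

for domain in reference:
    uplift = (asus_top[domain] / reference[domain] - 1) * 100
    print(f"{domain}: +{uplift:.1f}%")
# core +13.8%, shader +9.5%, memory +6.6%; an average speed gain of about 7%
# (as the box claims) is plausible, since games rarely scale linearly with clocks.
```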
Well, now let's step down a rung and get acquainted with two video cards of the GeForce 8800GT class.
Leadtek 8800GT
The Leadtek 8800GT is a typical representative of the GeForce 8800GT series and, in truth, differs little from the majority of them. The whole point is that GeForce 8800GT cards are cheaper than the «advanced» GeForce 8800GTS 512, which makes them no less interesting.
The Leadtek 8800GT's box is almost the same as that of the more expensive 8800GTS 512. The differences are the smaller thickness, the absence of a carrying handle and, of course, the name of the video card. The «extreme» inscription after the card's name refers to its factory overclock.
The back of the box carries brief information about the video card, its advantages and the bundle list, which, by the way, in our case omitted the Neverwinter Nights 2 game and the installation instructions. The bundle includes:
- adapter for powering PCI-express cards;
- S-video > S-video + component out splitter;
- DVI to D-sub adapter;
- CD with drivers;
- Power DVD 7 CD;
- CD with the full version of Neverwinter Nights 2;
- brief instructions for installing a video card.
The Leadtek 8800GT video card follows the reference design and outwardly differs only by the sticker on the cooling system's casing.
The back side of the card does not stand out either; however, after seeing the GeForce 8800GTS 512, the missing row of chip capacitors on the left side of the board catches the eye.
The cooling system follows the reference design and is well known to us from previous reviews.
Examination of the printed circuit board reveals the absence of components on the right side of the board which, as we saw, are mounted on the 8800GTS 512 version. Otherwise it is a perfectly ordinary board with a G92 GPU «cut down» to 112 stream processors and eight memory chips forming 512 MB in total.
As with the previous participants in today's testing, the memory chips on the Leadtek 8800GT are manufactured by Qimonda and have a 1.0 ns access time, which corresponds to 2000 MHz.
Rated frequencies and overclocking
As already mentioned, the Leadtek 8800GT comes factory overclocked as standard. Its nominal frequencies are 678/1700 MHz for the GPU and 2000 MHz for the video memory. Not bad; yet despite this considerable factory overclock, the card showed a less than stellar result in manual overclocking: only 713/1782 MHz for the GPU and 2100 MHz for the video memory. Recall that the participants of previous reviews overclocked to 740/1800 MHz for the GPU and 2000-2100 MHz for the video memory. Note also that we achieved this result at the cooling system's maximum fan speed since, as we have already said, the GeForce 8800GT reference cooler does not cope with its duties in the best way.
Now let's move on to the next participant in today's testing.
Palit 8800GT sonic
The face of the Palit 8800GT sonic box is a battle frog in a spectacular setting. Silly, but very funny! Then again, our life is made up of such nonsense, and being reminded of it once more does no harm. Moving from fun to business, pay attention to the lower right corner, where a sticker lists the card's frequencies and other characteristics. The newcomer's frequencies are almost the same as the GeForce 8800GTS 512's: 650/1625 MHz for the GPU and 1900 MHz for the video memory, only 44 MHz less than the 8800GTS 512.
The back of the box contains nothing remarkable; everything interesting is on the front side.
The package includes:
- adapter for powering PCI-express cards;
- S-video > component out adapter;
- S-video > RCA («tulip») adapter;
- DVI to D-sub adapter;
- DVI to HDMI adapter;
- CD with drivers;
- CD with the full version of Tomb Raider The Legend;
- brief instructions for installing a video card.
It should be noted that this is the first GeForce 8800GT-class video card with a DVI > HDMI adapter to pass through our test laboratory; previously, only a few AMD Radeon video cards came with such an adapter.
And here is the first surprise! The Palit 8800GT sonic is based on a printed circuit board of Palit's own design and is equipped with a proprietary cooling system.
The back side of the card also shows differences, but it is still hard for us to judge the pros and cons of the new design. What we can fully judge is the quality of the card's component mounting.
Because the standoffs between the GPU heatsink and the board are shorter than the gap between them, and the heatsink is fastened with screws without any damping pads, the board and the graphics chip's substrate are badly bent. Unfortunately, this can lead to their failure; the problem is not the strength of the laminate the board is made of, but the traces, which can crack under tension. It is by no means certain that this will happen, but the manufacturer should have paid more attention to how the cooling systems are fastened to its video cards.
The cooling system is made of painted aluminum and consists of three parts: for the graphics processor, the video memory and the power subsystem. The base of the GPU heatsink boasts no special finishing, and a thick gray mass is applied as the thermal interface.
The changes in the PCB design affected the power subsystem: small components were replaced with larger ones and their layout was changed. Otherwise we have the familiar GeForce 8800GT with a G92 graphics processor and eight video memory chips totaling 512 MB.
As with the other participants in today's testing, the memory chips are manufactured by Qimonda and have a 1.0 ns access time.
Cooling efficiency and overclocking
We will check the efficiency of the Palit 8800GT sonic's proprietary cooling system using, as always, the game Oblivion at maximum settings.
The video card warmed up from 51 to 61 degrees, which is, on the whole, a very good result. However, the fan speed increased noticeably, and as a result the already not-quiet cooling system became distinctly audible. The Palit card is therefore hard to recommend to lovers of silence.
Despite the changes in the power subsystem and the improved cooling, the Palit 8800GT sonic overclocked to the usual frequencies of 734/1782 MHz for the GPU and 2000 MHz for the video memory.
With that, we have finished getting acquainted with the participants of today's testing, so let's move on to the test results.
Testing and conclusions
Today's testing differs not only in that we compare four video cards with one another, but also in that we carried it out on a test bench different from the one you are familiar with, configured as follows:
The change of test platform is due to the fact that we initially planned to test the Leadtek 8800GTS 512 and ASUS EN8800GTS TOP in SLI mode; unfortunately, the ASUS card did not survive our abuse to the end of the tests, and the idea collapsed. We have therefore decided to move SLI testing into a separate article once we have the necessary hardware; for now, we will limit ourselves to tests of single video cards. We will compare seven video cards with one another, one of which is a GeForce 8800GTS 512 overclocked to 756/1890/2100 MHz. The GeForce 8800GT and GeForce 8800GTX run at NVIDIA's recommended frequencies. To make it easier to navigate, here is a table with the clock frequencies of all the test participants:
Video card (GPU clock core/shader, MHz; effective memory clock, MHz):
- Leadtek 8800GTS 512: 650 / 1625; 1944
- ASUS EN8800GTS TOP: 740 / 1780; 2072
- Leadtek 8800GT: 678 / 1674; 2000
- Palit 8800GT sonic: 650 / 1625; 1900
- Overclocked GeForce 8800GTS 512 (in the charts: 8800GTS 512 756/1890/2100): 756 / 1890; 2100
- GeForce 8800GT (in the charts: 8800GT): 600 / 1500; 1800
- GeForce 8800GTX (in the charts: 8800GTX): 575 / 1350; 1800
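For convenience, the clock table can be held as data and compared against the reference GeForce 8800GT; a sketch in Python (the card names are shortened by us):

```python
# Clock table from the text: (core MHz, shader MHz, effective memory MHz).
cards = {
    "Leadtek 8800GTS 512":        (650, 1625, 1944),
    "ASUS EN8800GTS TOP":         (740, 1780, 2072),
    "Leadtek 8800GT":             (678, 1674, 2000),
    "Palit 8800GT sonic":         (650, 1625, 1900),
    "OC 8800GTS 512":             (756, 1890, 2100),
    "GeForce 8800GT (reference)": (600, 1500, 1800),
    "GeForce 8800GTX":            (575, 1350, 1800),
}

ref = cards["GeForce 8800GT (reference)"]
for name, (core, shader, mem) in cards.items():
    core_uplift = (core / ref[0] - 1) * 100
    print(f"{name:28s} core {core} MHz ({core_uplift:+.0f}% vs reference 8800GT)")
```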
We used the ForceWare 169.21 and ForceWare 169.25 drivers for Windows XP and Windows Vista respectively. We will traditionally begin our review of the results with the 3DMark tests:
Based on the 3DMark results you can, of course, see who is stronger and who is weaker, but the differences are so small that there are no obvious leaders. It is worth noting, though, that the most expensive of the participants, the GeForce 8800GTX, took the last places. For a complete picture, turn to the results of the game tests, which, as before, we ran with 4x anti-aliasing and 16x anisotropic filtering.
In Call of Duty 4, it is notable that the Leadtek 8800GT is almost on a par with the Leadtek 8800GTS 512, while the ASUS EN8800GTS TOP almost keeps up with the overclocked GeForce 8800GTS 512. The Palit 8800GT came in penultimate, slightly outperforming the reference GeForce 8800GT. The winner was the GeForce 8800GTX, apparently thanks to its memory bus, which is wider than that of the other test participants.
In Call of Juarez under Windows XP, the Leadtek 8800GTS 512 runs almost level with the GeForce 8800GTX, which its wide memory bus no longer saves. Note that the Leadtek 8800GT does not lag behind them, and at 1024×768 even overtakes them thanks to its higher frequencies compared with the other two cards. The leaders are the ASUS card and the overclocked GeForce 8800GTS 512, and in penultimate place is again the card from Palit, just ahead of the GeForce 8800GT.
Call of Juarez under Windows Vista has problems at 1600×1200: large swings in speed and, at times, very heavy stuttering. We assume the problem lies in a lack of video memory in such a heavy mode; whether that is so, we will check in an upcoming review using an ASUS 8800GT with 1 GB of video memory as an example. We note right away that the GeForce 8800GTX had no such problems. Judging by the results at the two lower resolutions, the balance of power has not changed much compared with Windows XP, except that the GeForce 8800GTX remembered its noble origin, though it did not become the leader.
In Crysis under Windows XP the balance of power has shifted slightly, but in essence everything remains the same: the Leadtek 8800GTS 512 and Leadtek 8800GT are on the same level, the leaders are the ASUS EN8800GTS TOP and the overclocked GeForce 8800GTS 512, and last place goes to the GeForce 8800GT. Note also that as resolution grows, the gap between the overclocked GeForce 8800GTS 512 and the GeForce 8800GTX shrinks thanks to the latter's wider memory bus. Still, high clock frequencies prevail, and yesterday's champion remains out of work.
The Windows Vista problem at 1600×1200 did not pass Crysis by either, sparing only the GeForce 8800GTX. As in Call of Juarez, there were speed jerks and, in places, very sharp drops in performance, at times below one frame per second. Judging by the results at the two lower resolutions, this time the Leadtek 8800GTS 512 overtook its younger sister, taking third place. The first places went to the ASUS EN8800GTS TOP, the overclocked GeForce 8800GTS 512 and the GeForce 8800GTX, which at 1280×1024 finally took the lead.
In Need for Speed ProStreet the GeForce 8800GTX leads, and at 1024×768 by a wide margin. It is followed by the Leadtek 8800GTS 512, then the ASUS EN8800GTS TOP and the overclocked GeForce 8800GTS 512, with the last places left to the GeForce 8800GT and the Palit 8800GT sonic. Since the GeForce 8800GTX became the leader, we can conclude that the game depends strongly on video memory bandwidth. That, in turn, suggests why the overclocked versions of the GeForce 8800GTS 512 turned out slower than the non-overclocked one: apparently, the cause is increased video memory latency resulting from the raised clock frequency.
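The latency hypothesis only holds if the timings (measured in clock cycles) loosen faster than the clock rises; with fixed cycle counts, a higher clock actually lowers absolute latency. A sketch with invented cycle counts, purely to illustrate the mechanism:

```python
def latency_ns(cycles: int, real_clock_mhz: float) -> float:
    """Absolute latency of a timing expressed in memory clock cycles."""
    return cycles / real_clock_mhz * 1000

# GDDR3 real clock = effective DDR rate / 2. The cycle counts below are
# hypothetical: e.g. a CAS-like timing of 12 cycles at stock, loosened
# to 14 cycles when the memory is overclocked.
stock = latency_ns(12, 1944 / 2)        # ~12.3 ns
overclocked = latency_ns(14, 2100 / 2)  # ~13.3 ns: higher despite faster clock
print(f"{stock:.1f} ns -> {overclocked:.1f} ns")
```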
In Need for Speed Carbon a familiar picture emerges: the Leadtek 8800GTS 512 and Leadtek 8800GT are roughly on a par, first places go to the overclocked GeForce 8800GTS 512 and the ASUS EN8800GTS TOP, and last place belongs to the GeForce 8800GT. The GeForce 8800GTX looks decent, but nothing more.
In Oblivion it is striking that at 1024×768 the overclocked GeForce 8800GTS 512 and the ASUS EN8800GTS TOP took the last places. We hypothesized that the cause was the memory latency increased along with the frequency, and we turned out to be right: after lowering the overclocked GeForce 8800GTS 512's memory frequency to nominal, it showed a result of over 100 frames per second. As resolution grows, the situation returns to normal, and the former outsiders become the leaders. Incidentally, it is noteworthy that the Leadtek 8800GT overtakes the Leadtek 8800GTS 512, apparently owing to its higher shader-block frequency.
Prey turned out to be undemanding for all the video cards, which lined up according to their clock frequencies. Only the GeForce 8800GTX behaved a little differently, which is understandable: it has a wider memory bus, and the game depends heavily on its bandwidth.
The purpose of today's testing was to find out how much these video cards differ from one another, and how justified the high price of the «advanced» GeForce 8800GTS 512 is.
The results show that the cards are very close to one another, even though on paper the GeForce 8800GTS 512 outperforms the GeForce 8800GT, including in the number of active functional blocks inside the GPU. The obvious advantages of the new GeForce 8800GTS 512 cards are a high-quality, quiet cooling system and greater overclocking potential than the GeForce 8800GT's. Special mention goes to the card from ASUS which, thanks to its factory overclock, takes the leading position. Of course, you can overclock a card yourself, and most likely any GeForce 8800GTS 512 will «take» the frequencies of the ASUS card. On the whole, we note once again that the new family of video cards based on the G92 graphics chip has turned out very successful and may well replace the recent leader, the GeForce 8800GTX.
Pros and cons of the individual video cards:
Leadtek 8800GTS 512
Pros:
- good overclocking potential;
- solid bundle;
- bright and convenient packaging.
Cons:
- none noted.
ASUS EN8800GTS TOP
Pros:
- factory overclock;
- high-quality cooling system;
- good overclocking potential.
Cons:
- the packaging is too big and unwieldy.
Leadtek 8800GT
Pros:
- factory overclock;
- solid bundle.
Cons:
- none noted.
Palit 8800GT sonic
Pros:
- factory overclock;
- alternative cooling system;
- solid bundle.
Cons:
- badly bent board in the GPU area;
- noticeable fan noise.
Volt mod for XFX GeForce 8800 GT, 8800 GS and 9600 GSO PCI-E
This volt mod is suitable for three XFX video cards using their own PCB design — GeForce 8800 GT, GeForce 8800 GS and GeForce 9600 GSO. The XFX lineup also includes reference design graphics cards. They can be easily distinguished by the type of cooling system installed. For reference video cards, the cooler covers the entire board, while for video cards with XFX design, the right side with elements of the power system is left open:
All memory chips are located on the front side of the video card. The GeForce 8800 GT has eight of them, while the GeForce 8800 GS and GeForce 9600 GSO have only six:
Both controller chips that manage the voltages of the graphics chip and the memory are located on the left of the back side of the video card:
Another distinctive feature of the XFX GeForce 8800 GS and XFX GeForce 9600 GSO video cards is the use of slower Qimonda memory chips with an access time of 1.4 nanoseconds, rated for 1400 MHz, and that is the frequency at which the memory of these XFX cards runs, 200 MHz less than on the reference GeForce 8800 GS and GeForce 9600 GSO.
The default GPU voltage is 1.15V in both 2D and 3D modes.
The GPU voltage controller is a chip with the two-line marking «BN-9G» / «219». I could not identify the manufacturer of this chip or find a datasheet for it, so I had to determine on my own that pin 4 is ground, pin 14 is feedback, and pin 1 is phase. The controller's pins are small, so it is more convenient to perform the volt mod through the resistors located nearby.
GPU volt mod using a variable resistor: solder a 1 kOhm variable resistor (preset to its maximum resistance) in parallel with the resistor marked «Pencil Vgpu-mod» in the picture. The resulting voltage can be monitored at the capacitor legs indicated as «Vgpu measure points».
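Why a parallel resistor raises the voltage: switching controllers of this kind typically hold their feedback pin at an internal reference, with the output set by a resistor divider; paralleling a trimmer with the lower (feedback-to-ground) resistor reduces its effective value and raises the output. Since the «BN-9G» is undocumented, the reference voltage and divider values below are invented purely for illustration:

```python
# Feedback-divider model of the volt mod (illustrative; the real reference
# voltage and divider values on the XFX board are unknown).
def parallel(r1: float, r2: float) -> float:
    """Effective resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def vout(vref: float, r_top: float, r_bottom: float) -> float:
    # The controller drives the FB pin to vref; the divider sets the output.
    return vref * (1 + r_top / r_bottom)

VREF, R_TOP, R_BOTTOM = 0.8, 44.0, 100.0  # invented values giving ~1.15 V
print(f"stock: {vout(VREF, R_TOP, R_BOTTOM):.2f} V")
# 1 kOhm trimmer at maximum in parallel with the bottom resistor:
# a small first step (~1.19 V); dialing the trimmer down raises Vout further.
print(f"modded: {vout(VREF, R_TOP, parallel(R_BOTTOM, 1000.0)):.2f} V")
```

This also explains why the trimmer is preset to maximum resistance first: at maximum it perturbs the divider least, so the voltage starts near stock and can be raised gradually.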
GPU pencil volt mod: to increase the GPU voltage, paint over the resistor marked «Pencil Vgpu-mod» in the picture with a pencil. The resistance obtained after painting can be checked against this graph:
GPU reverse volt mod: to lower the GPU voltage, solder a 1 kOhm variable resistor in parallel with the resistor marked «Reverse Pencil Vgpu-mod» in the picture, or paint over that same resistor with a pencil.
Overcurrent protection (OCP):
Unfortunately, XFX graphics cards have overcurrent protection, with a trip threshold around 1.50V. For now, the OCP problem on these video cards remains unsolved owing to the lack of information about how this protection is implemented in the «BN-9G» controller.
The default memory voltage is 1.89V. The memory voltage controller is a SemTech SC2621A (pin 12 = GND, pin 4 = feedback, pin 13 = phase).
Memory volt mod using a variable resistor: solder a 20 kOhm variable resistor (preset to its maximum resistance) between pins 4 and 12 of the SemTech SC2621A. The memory voltage can be measured at the capacitor legs indicated in the picture as «Vmem measure points».
Memory pencil volt mod: to increase the memory voltage, paint over the resistor marked in the picture as «Pencil Vmem-mod». The resistance obtained after painting can be checked against this graph:
Reverse memory volt mod: to lower the memory voltage, solder a 20 kOhm variable resistor between pins 4 and 13 of the SemTech SC2621A, or paint over the resistor marked in the picture as «Reverse Pencil Vmem-mod» with a pencil.
Checking the volt mods in practice, and overclocking results:
An XFX GeForce 9600 GSO 580M (PV-T960-FDF4) 384Mb PCI-E video card was used for testing. Its nominal frequencies are 580(1450)/1400 MHz. Without the volt mod, the card overclocked to 740 MHz on the graphics chip, 1890 MHz on the shader domain and 2030 MHz on the memory. The idle temperature stood at +49°C, rising to +61°C under load.
For a «moderate» volt mod (intended for long-term use) the GPU voltage was raised only to 1.30V. The cooling system was replaced with a Zalman VF700-Cu. The GPU overclock grew to 820(1944) MHz, while the GPU temperature rose to +54°C at idle and +82°C under load.
For a volt mod aimed at short benchmarking runs, the GPU voltage was raised to 1.48V (beyond that, the overcurrent protection limited it). With running cold water used to cool the GPU, the overclock reached 864(2106) MHz.
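Summing up the voltage-versus-clock results above as percentages over the card's nominal core clock (Python, numbers taken from the text):

```python
# GPU core overclock vs. voltage for the XFX GeForce 9600 GSO sample.
results = [
    (1.15, 740),  # stock voltage, stock cooler
    (1.30, 820),  # "moderate" volt mod, Zalman VF700-Cu
    (1.48, 864),  # benchmarking volt mod, running-water cooling
]

nominal = 580  # MHz, rated core clock
for volts, mhz in results:
    gain = (mhz / nominal - 1) * 100
    print(f"{volts:.2f} V: {mhz} MHz ({gain:+.0f}% over nominal)")
# Returns diminish: the first +0.15 V bought 80 MHz, the next +0.18 V only 44 MHz.
```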
Raising the voltage on the memory from 1.