NVIDIA’s GeForce 6200 & 6600 non-GT: Affordable Gaming
by Anand Lal Shimpi on October 11, 2004 9:00 AM EST
Index: NV4x’s Video Processor – What Happened? · The Contenders · Power Consumption · The Test · Doom 3 Performance · Half Life 2 (Source) Visual Stress Test · Star Wars Battlefront Performance · The Sims 2 Performance · Unreal Tournament 2004 Performance · Battlefield Vietnam Performance · Halo Performance · FarCry Performance · Anti-Aliasing Performance · Anti-Aliasing Image Quality · Anisotropic Filtering Performance · Anisotropic Filtering Image Quality · Final Words
Although it’s seeing very slow adoption among end users, PCI Express platforms are getting out there and the two graphics giants are wasting no time in shifting the competition for king of the hill over to the PCI Express realm.
ATI and NVIDIA have both traded shots in the mid-range with the release of the Radeon X700 and GeForce 6600. Today, the battle continues in the entry-level space with NVIDIA’s latest launch — the GeForce 6200.
The GeForce 6 series is now composed of 3 GPUs: the high end 6800, the mid-range 6600 and now the entry-level 6200. True to NVIDIA’s promise of one common feature set, all three of the aforementioned GPUs boast full DirectX 9 compliance, and thus, can all run the same games, just at different speeds.
What has NVIDIA done to make the 6200 slower than the 6600 and 6800?
For starters, the 6200 features half the pixel pipes of the 6600, and 1/4 that of the 6800. Next, the 6200 will be available in two versions: one with a 128-bit memory bus like the 6600 and one with a 64-bit memory bus, effectively cutting memory bandwidth in half. Finally, NVIDIA cut the core clock on the 6200 down to 300MHz as the final guarantee that it would not cannibalize sales of their more expensive cards.
The 6200 is an NV43 derivative, meaning that it is built on the same 0.11-micron (110nm) process as the 6600. In fact, the two chips are virtually identical, with the 6200 having only 4 active pixel pipelines on its die. There is one other architectural difference between the 6200 and the rest of the GeForce 6 family: the lack of any color or Z-compression support in the memory controller. Color and Z-compression are wonderful ways of reducing the memory bandwidth overhead of bandwidth-hungry features such as anti-aliasing, so without them, we can expect the 6200 to take a bigger hit when turning on AA and anisotropic filtering. The mitigating factor here is that the 6200 doesn't have the fill rate or the memory bandwidth to run most games at higher resolutions in the first place, so those who buy the 6200 won't be playing at resolutions where the lack of color and Z-compression would really matter with AA enabled. We'll investigate this a bit more in our performance tests.
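To put the missing compression in perspective, here's a back-of-envelope sketch (my own illustration, not figures from this review; the 1024×768 resolution and 32-bit color plus 32-bit Z per sample are assumptions) of how multisampled AA multiplies framebuffer traffic when no color or Z-compression is available:

```python
# Rough estimate of multisampled framebuffer size without color/Z
# compression. Resolution and per-sample byte counts are illustrative
# assumptions, not numbers taken from the review.
def framebuffer_mb(width, height, samples, bytes_color=4, bytes_z=4):
    """Size in MB of the color + Z buffers at a given sample count."""
    return width * height * samples * (bytes_color + bytes_z) / 1e6

no_aa = framebuffer_mb(1024, 768, samples=1)   # ~6.3 MB
aa_4x = framebuffer_mb(1024, 768, samples=4)   # ~25.2 MB
print(f"no AA: {no_aa:.1f} MB, 4X AA: {aa_4x:.1f} MB "
      f"({aa_4x / no_aa:.0f}x the buffer traffic)")
```

Cards with color and Z-compression avoid reading and writing much of that 4x data; the 6200 has to pay the full cost, which is why AA should hit it harder.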
Here’s a quick table summarizing what the 6200 is and how it compares to the rest of the GeForce 6 family:
GPU | Manufacturing Process | Vertex Engines | Pixel Pipelines | Memory Bus Width |
GeForce 6200 | 0.11-micron | 3 | 4 | 64/128-bit |
GeForce 6600 | 0.11-micron | 3 | 8 | 128-bit |
GeForce 6800 | 0.13-micron | 6 | 16 | 256-bit |
The first thing to notice here is that the 6200 supports either a 64-bit or 128-bit memory bus, and as far as NVIDIA is concerned, cards with the two configurations will not be distinguished from one another. While NVIDIA insists that it cannot force its board partners to label the two configurations differently, we're more inclined to believe that NVIDIA simply would like all 6200-based cards to be known as a GeForce 6200, regardless of whether or not they have half the memory bandwidth. NVIDIA makes a "suggestion" to its card partners that they add the 64-bit or 128-bit designation somewhere on their boxes, model numbers, or websites, but the suggestion goes no further than just being a suggestion.
The next source of variability is clock speed. NVIDIA has "put a stake in the ground" at 300MHz as the desired core clock for 6200 GPUs regardless of configuration, and it does seem that add-in board vendors would have no reason to clock their 6200s any differently, since they are all paying for a 300MHz part. The real variability comes with memory speeds. The 6200 only supports DDR1 memory and is spec'd to run it at 275MHz (effectively 550MHz); however, as we've seen in the past, this is only a suggestion, and it is up to the manufacturers whether or not they will use cheaper, slower memory.
NVIDIA is also only releasing the 6200 as a PCI Express product — there will be no AGP variant at this point in time. The problem is that the 6200 is a much improved architecture compared to the current entry-level NVIDIA card in the market (the FX 5200), yet the 5200 is still selling quite well as it is not really purchased as a hardcore gaming card. In order to avoid cannibalizing AGP FX 5200 sales, the 6200 is kept out of competition by being a strictly PCI Express product. While there is a PCI Express version of the FX 5200, its hold on the market is not nearly as strong as the AGP version, so losing some sales to the 6200 isn’t as big of a deal.
In talking about AGP versions of recently released cards, NVIDIA has given us an update on the status of the AGP version of the highly anticipated GeForce 6600GT. We should have samples by the end of this month and NVIDIA is looking to have them available for purchase before the end of November. There are currently no plans for retail availability of the PCI Express GeForce 6800 Ultras — those are mostly going to tier 1 OEMs.
The 6200 will be shipping in November and what’s interesting is that some of the very first 6200 cards to hit the street will most likely be bundled with PCI Express motherboards. It seems like ATI and NVIDIA are doing a better job of selling 925X motherboards than Intel these days.
The expected street price of the GeForce 6200 is between $129 and $149 for the 128-bit 128MB version. This price range is just under that of the vanilla ATI X700 and the regular GeForce 6600 (non-GT), both of which are included in our performance comparison — so in order for the 6200 to truly remain competitive, its street price will have to be closer to the $99 mark.
The direct competition to the 6200 from ATI comes from the PCI Express X300 and X300SE (128-bit and 64-bit versions, respectively). ATI is at a bit of a disadvantage here because the X300 and X300SE are still based on the old Radeon 9600 architecture rather than a derivative of the X800 or X700. ATI is undoubtedly working on a 4-pipe version of its newer architecture, but for this review, the advantage is definitely in NVIDIA's court.
NV4x’s Video Processor – What Happened?
NVIDIA’s GeForce 6200 graphics processor
When NVIDIA first announced the GeForce 6800 series, the company boasted that its new graphics architecture would scale down to mid-range and value markets by the end of the year. GeForce 6 trickle-down has already spawned the GeForce 6600 series, whose performance and feature set are a revelation for the mid-range market. Today, NVIDIA extends the GeForce 6 series even further into the value segment with the GeForce 6200. This four-pipe GeForce 6 brings Shader Model 3.0 support to graphics cards in and around the $129 mark, giving cash-strapped gamers an intriguing new low-end option.
How does the GeForce 6200 fare against competition that includes ATI’s budget Radeons and Intel’s Graphics Media Accelerator 900? Read on to find out.
The GeForce 6200
The GeForce 6200 graphics chip is a four-pipe derivative of the NV43 GPU that powers the GeForce 6600 series. Like the rest of the GeForce 6 line, the 6200 utilizes a fragment crossbar to link pixel shaders and raster operators (ROPs) within the pixel pipeline. Rather than being bound to a single pixel shader, ROPs are free to tackle output from any of the chip’s pixel shaders. This rather promiscuous arrangement allows NVIDIA to pair eight pixel shaders with only four ROPs on the GeForce 6600, saving transistors without catastrophically bottlenecking performance. With the GeForce 6200, NVIDIA pairs four pixel pipes with four ROPs. There’s no transistor savings, but the fragment crossbar may offer a clock-for-clock performance advantage over more traditional designs.
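The fragment crossbar's economics can be sketched with a toy model (entirely my own construction, not NVIDIA's numbers): peak pixel output per clock is limited by whichever side is slower, so once shader programs take more than a cycle per pixel, four ROPs stop being the bottleneck for eight shaders.

```python
def pixels_per_clock(num_shaders, num_rops, shader_cycles_per_pixel):
    """Peak pixels/clock when any ROP can consume any shader's output.

    A deliberately simplified model: throughput is the lesser of the
    shaders' combined output rate and the ROP count.
    """
    shader_rate = num_shaders / shader_cycles_per_pixel
    return min(shader_rate, num_rops)

# GeForce 6600-style configuration: 8 pixel shaders feeding 4 ROPs
print(pixels_per_clock(8, 4, 1))  # trivial shader work: ROP-bound at 4
print(pixels_per_clock(8, 4, 2))  # 2-cycle shaders: both sides balanced
print(pixels_per_clock(8, 4, 4))  # long shaders: shader-bound, ROPs idle
```

Under this model, halving the ROPs only costs performance when shader work is trivial, which is consistent with the transistor-saving rationale described above.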
Like the GeForce 6600 series, the GeForce 6200 has full support for DirectX 9, Shader Model 3.0, and 32-bit floating point data types. The 6200 packs three vertex shader units, just like the 6600, as well. The two also share a programmable video processor that we’ll have more to tell you about soon. The GeForce 6200 differs from the rest of the GeForce 6 line when it comes to antialiasing, though: its Intellisample 3.0 implementation lacks color and Z-compression. Since low-end cards generally lack the pixel pushing horsepower to make antialiasing viable in games, the lack of Intellisample color and Z-compression isn’t a major flaw.
Looks like NV43 to me
The GeForce 6200 GPU is manufactured by TSMC on a 0.11-micron fabrication process. The die measures 12mm x 13mm according to my tape measure, making it identical in size to the NV43 GPU that powers the GeForce 6600. Isn’t that interesting? When we asked NVIDIA for the 6200’s code name according to the “NV4x” convention, the company would only say the chip was a “derivative” of the NV43. It’s entirely possible that the GeForce 6200 GPU is simply an NV43 with four pixel pipes and Intellisample color and Z-compression disabled. If this is the case, we may see enterprising enthusiasts attempt to unlock the extra pipelines with hardware or software modifications.
Unlike other members of the GeForce 6 line, there will only be one version of the GeForce 6200: no GT, XT, Ultra, or Turbo Golden Sample Special Edition. Clock speeds for the 6200 aren't written in stone, though. NVIDIA recommends a core clock of 300MHz, but board vendors are free to run faster. There's also flexibility on the memory clock front. Our GeForce 6200 reference card has DDR memory clocked at an effective 500MHz, which, thanks to the 6200's 128-bit memory bus, gives the card an even 8.0GB/s of memory bandwidth. Board manufacturers will be free to run higher or lower memory clocks, and they'll also be able to make cheaper cards that have a narrower 64-bit path to memory.
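The bandwidth arithmetic is simple enough to check. A quick sketch (the function name is mine, not NVIDIA's): effective data rate times bus width in bytes.

```python
def memory_bandwidth_gbs(effective_mhz, bus_bits):
    """Peak bandwidth in GB/s: effective (DDR) data rate x bus width in bytes."""
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

reference_128bit = memory_bandwidth_gbs(500, 128)  # reference card: 8.0 GB/s
budget_64bit = memory_bandwidth_gbs(500, 64)       # 64-bit card at same clock
print(f"128-bit: {reference_128bit:.1f} GB/s, 64-bit: {budget_64bit:.1f} GB/s")
```

This is why the unmarked 64-bit cards are worth watching out for: at identical clocks they have exactly half the bandwidth.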
Our GeForce 6200 reference card
As you can see, the GeForce 6200 reference card is a PCI Express affair. NVIDIA doesn't plan to make an AGP version of the GeForce 6200, leaving the existing GeForce FX products for AGP systems. Since PC builders are already producing lots of machines based on Intel's 900-series chipsets and PCI Express chipsets are coming soon for the Athlon 64, there should be a burgeoning market for PCI Express graphics cards in the coming months.
The GeForce 6200 is primarily targeted at major OEMs and system integrators, so retail products may not make it to store shelves at places like Best Buy, CompUSA, or Fry’s any time soon. Cards should be available from online retailers for between $129 and $149, if not less. Expect 64-bit flavors of the GeForce 6200 to be even cheaper and, hopefully, clearly marked.
Finally, notice that the GeForce 6200 card lacks “golden fingers.” The 6200 doesn’t support SLI, so you won’t be able to team up two cards in a single system.
Our testing methods
All tests were run three times, and the results were averaged, using the following test systems.
Processor | Intel Pentium 4 520 2.8GHz
Front-side bus | 800MHz (200MHz quad-pumped)
Motherboard | Gigabyte GA-8I915G-MF
North bridge | Intel 915G
South bridge | Intel ICH6
Chipset drivers | 6.0.1.1002
Memory size | 1GB (2 DIMMs)
Memory type | OCZ PC3200 EL Platinum Rev 2 DDR SDRAM at 400MHz
CAS latency | 2
Cycle time | 5
RAS to CAS delay | 2
RAS precharge | 2
Hard drives | Western Digital Raptor WD360GD 37GB
Audio | ICH6/ALC850
Graphics | ATI Radeon X600 Pro, ATI Radeon X300, NVIDIA GeForce 6200, Intel GMA 900
Graphics drivers | CATALYST 4.10 hotfix (ATI), ForceWare 66.81 (NVIDIA), 14.7 (Intel)
OS | Microsoft Windows XP Professional with Service Pack 2
We'll be comparing the GeForce 6200's performance with a couple of Radeons and Intel's Graphics Media Accelerator (GMA) 900. In lieu of a real Radeon X600 Pro, I used a Radeon X600 XT clocked at Pro speeds. I also had to underclock our Radeon X300 to get it running at the correct 325MHz core and 200MHz (400MHz effective) memory clock speeds; the reference card I received from ATI was running a 400MHz core and 290MHz memory clock, much faster than cards you can buy on the market.
I should also note that the X300 card has 256MB of memory. This is common practice for low-end cards as manufacturers try to dazzle less savvy buyers with higher numbers. However, we’ve found that low-end cards just don’t have the horsepower to take advantage of 256MB of memory, so the advantage is dubious at best.
We used the following versions of our test applications:
- DOOM 3 with trhaze and trdelta1 demos
- Unreal Tournament 2004 v3270 with trdemo1
- Counter-Strike Source with trdemo1
- Far Cry 1.2 with tr3-pier and tr1-volcano demos
- FutureMark 3DMark05 v110
- Xpand Rally single player demo with trtest
- FRAPS 2.2.5
The test systems’ Windows desktop was set at 1280×1024 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests and drivers were left at their default image quality settings. Both ATI and NVIDIA’s default image quality driver settings use adaptive anisotropic filtering algorithms.
All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.
Pixel filling power
We’ll kick things off with a look at theoretical peak fill rates and memory bandwidth. Theoretical peaks don’t necessarily determine in-game performance, but they’re a good place to start sizing up the GeForce 6200’s capabilities. I’ve sorted the list below, which includes an array of low-end and mid-range PCI Express graphics options, according to multitextured fill rate.
 | Core clock (MHz) | Pixel pipelines | Peak fill rate (Mpixels/s) | Texture units per pixel pipeline | Peak fill rate (Mtexels/s) | Memory clock (MHz) | Memory bus width (bits) | Peak memory bandwidth (GB/s) |
GeForce 6200 | 300 | 4 | 1200 | 1 | 1200 | 500 | 128 | 8.0 |
Radeon X300 SE | 325 | 4 | 1300 | 1 | 1300 | 400 | 64 | 3.2 |
Radeon X300 | 325 | 4 | 1300 | 1 | 1300 | 400 | 128 | 6.4 |
GMA 900 | 333 | 4 | 1333 | 1 | 1333 | 400 | 128 | 6.4 |
Radeon X600 Pro | 400 | 4 | 1600 | 1 | 1600 | 600 | 128 | 9.6 |
Radeon X600 XT | 500 | 4 | 2000 | 1 | 2000 | 740 | 128 | 11.8 |
GeForce 6600 | 300 | 8* | 2400 | 1 | 2400 | TBD | 128 | TBD |
Radeon X700 | 400 | 8 | 3200 | 1 | 3200 | 600 | 128 | 9.6 |
Radeon X700 Pro | 420 | 8 | 3360 | 1 | 3360 | 864 | 128 | 13.8 |
Radeon X700 XT | 475 | 8 | 3800 | 1 | 3800 | 1050 | 128 | 16.8 |
GeForce 6600 GT | 500 | 8* | 2000 | 1 | 4000 | 1000 | 128 | 16.0 |
In terms of fill rate, the GeForce 6200 brings up the rear. With four pixel pipes and a 300MHz core clock, it can’t even match the peak theoretical fill rates of the Radeon X300 series. The low core clock speed means that the card’s shader units are going to be running slower than the competition, too.
In the memory bandwidth department, the GeForce 6200 looks a little more competitive. The card's 128-bit memory bus and effective 500MHz memory clock yield 8GB/s of bandwidth, better than the X300s but shy of the Radeon X600 Pro.
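As a sanity check, the table's theoretical peaks can be reproduced from clocks, pipe counts, and bus widths alone. A quick sketch, with the card specs copied from the table above:

```python
# Reproduce a few of the table's theoretical peaks from first principles.
# Tuples are (core MHz, pixel pipes, effective memory MHz, bus width bits),
# taken from the table above.
cards = {
    "GeForce 6200":    (300, 4, 500, 128),
    "Radeon X300":     (325, 4, 400, 128),
    "Radeon X600 Pro": (400, 4, 600, 128),
}

for name, (core, pipes, mem, bus) in cards.items():
    fill = core * pipes                  # Mpixels/s (one texture unit per pipe)
    bandwidth = mem * (bus // 8) / 1000  # GB/s
    print(f"{name}: {fill} Mpixels/s, {bandwidth:.1f} GB/s")
```

All three match the table, and they make the 6200's position clear: lowest fill rate of the group, but mid-pack memory bandwidth.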
To see how these theoretical peaks pan out in the real world, let’s have a look at 3DMark05’s synthetic fill rate tests. Note that the drivers we’re using for the 6200, X600 Pro and X300, and even the GMA 900 aren’t approved by FutureMark for use with 3DMark05.
The GeForce 6200’s single texture fill rate is just a hair behind the X600 Pro, but when we start multitexturing, the 6200 is relegated to the back of the pack. Given that the 6200 has the slowest clock speed of the lot, its relatively modest performance isn’t surprising. What is surprising, however, is how close the Intel GMA 900 integrated graphics core gets to its theoretical peak fill rates.
Shader performance
While we’re looking at synthetic tests, let’s have a peek at how the GeForce 6200 fares in 3DMark05’s shader tests. I ran all the cards using 3DMark05’s Shader Model 2.0 code path. Since the 6200 also supports Shader Model 3.0, I also ran it using the SM 3.0 code path.
The 6200’s shader power is impressive even when running the Shader Model 2.0 codepath. Based on these scores, I wouldn’t expect much from the Intel GMA 900 in our game tests. It might have fill rate to spare, but shader power is sorely lacking.
DOOM 3
I used a couple of DOOM 3 gameplay demos to test the 6200’s performance in what’s arguably the most visually stunning game around. I ran the game with the High Quality detail setting, which enables 8X anisotropic filtering. High Quality might seem a little excessive for a low-end graphics card, but the GeForce 6200 handled it with aplomb.
The first demo takes place in the Delta Labs and is representative of the dark, shadow-filled environments you’ll encounter in the bulk of the game’s levels.
The 6200 wipes the floor with the competition. It’s not even close.
Next, we move onto our heat haze demo. This demo takes place in one of DOOM 3’s hell levels, which are visually quite different from the rest of the game. These levels make liberal use of some snazzy pixel shader-powered heat shimmer effects.
Again, the 6200 dominates. The race is a little closer this time, but not by much. The GMA 900 continues to stumble through DOOM 3 and has some serious problems displaying the heat haze effect properly.
Far Cry
Before DOOM 3 hit, Far Cry was arguably the best looking first-person shooter around. The game is loaded with shader effects and, perhaps more importantly, diverse indoor and outdoor environments. We’ll be looking at two of those environments today, both with the game’s high detail image quality setting.
First up we have the Pier level. Welcome to the jungle, folks.
The GeForce 6200 isn’t nearly as dominant in Far Cry as it was in DOOM 3. In fact, the Radeon X600 Pro beats it this time around. The 6200 is clearly faster than the Radeon X300, though, and undoubtedly superior to the bottom-dwelling GMA 900.
From lush jungles to underground interiors, the next Far Cry environment we’ll be looking at is the Volcano level. Like DOOM 3, this level employs heat shimmer effects in a number of places.
Again, the GeForce 6200 plays second fiddle to the Radeon X600 Pro. It’s pretty close, though, and the 6200 definitely has an edge over the Radeon X300.
As we saw in DOOM 3, the GMA 900 is way off the pace in Far Cry. The GMA 900 makes a visual mess of Far Cry’s heat effects shaders, too. The GMA 900’s DirectX 9 compatibility is an, er, far cry from DX9 competence.
Counter-Strike: Source
Counter-Strike: Source has officially been released and should hold anxious gamers over until Half-Life 2 hits. CS: Source also includes Valve's shader-filled video stress test, which showcases the material and shader effects found in Half-Life 2. We used CS: Source's high detail image quality settings and DirectX 9 code path for all but the GMA 900. The game refused to run on the GMA 900 with anything but the DX 8.1 code path, so scores on that front aren't directly comparable.
In the CS: Source video stress test, the GeForce 6200 is wedged between the Radeon X300 and X600 Pro. The GMA 900’s performance looks comparably better here.
Next, we’re looking at in-game Counter-Strike performance with a demo of online gameplay on the cs_italy map.
The GeForce 6200 is stuck between the Radeons again. The card isn’t much slower than the Radeon X600 Pro, although the game seems to be CPU-bound at lower resolutions.
Unreal Tournament 2004
Although DOOM 3 and Far Cry’s visuals are far more impressive than Unreal Tournament 2004, the engine has been licensed by scores of developers. A number of other titles already make use of the Unreal engine and we’re likely to see more in the coming months. Since Unreal Tournament 2004 is a little older, I was able to max out the in-game detail levels and still get playable frame rates in our custom-recorded demo of Onslaught gameplay.
Notice a pattern yet? The GeForce 6200 is faster than the Radeon X300 in Unreal Tournament 2004, but slower than the Radeon X600 Pro. Even in this older game engine, the GMA 900 is no match for the low-end graphics cards we’ve assembled. There’s definitely something wrong with the GMA 900’s performance at 800×600, too.
Xpand Rally
The Xpand Rally single-player demo is a new addition to our graphics benchmark suite. To test this game, I used FRAPS to capture frame rates during the replay of a custom-recorded demo, using the game's "balanced" image quality settings to achieve playable frame rates. The game uses a particularly nice color glow shader effect that doesn't appear to translate well to the screenshot below. It doesn't translate well to the GMA 900, either: the GMA 900 doesn't appear to be applying most of the game's shader effects, for whatever reason.
Ouch. The GeForce 6200 takes a bit of a beating in Xpand Rally and is just barely able to keep up with the Radeon X300. Let’s have a look at frame rates across the length of our 180-second replay.
The GeForce 6200’s frame rates are reasonably consistent across the length of the demo, at least when compared with its competition. Still, it’s disappointing that the 6200 can’t even best the Radeon X300, especially since Xpand Rally carries NVIDIA’s “The way it’s meant to be played” logo.
Antialiasing
To test the GeForce 6200’s antialiasing performance, I used the same Unreal Tournament 2004 demo as in our previous tests. I kept the same in-game detail levels and used a display resolution of 1024×768. The GeForce 6200 was tested with 2X, 4X, and 8X antialiasing while the Radeons were tested with 2X, 4X, and 6X AA. The GMA 900 can’t do antialiasing, so I’ve only included its score without AA.
The Radeons' antialiasing performance scales much better than the GeForce 6200's, perhaps in part because the 6200 lacks color and Z-compression. The difference in performance is especially glaring with 4X antialiasing, where the GeForce 6200 is trounced by the Radeon X600 Pro. Even the Radeon X300 squeaks ahead by a few frames per second.
That’s how the GeForce 6200’s antialiasing performs and here’s how it looks. Click on the images below to open an uncompressed PNG in a new window.
No antialiasing, 2X, 4X, 8X – GeForce 6200
No antialiasing, 2X, 4X, 6X – Radeon X600 Pro
For 2X and 4X antialiasing, the Radeon X600 Pro’s gamma-corrected antialiasing looks better than the GeForce 6200’s output. Comparing the GeForce 6200 at 8X to the X600 at 6X is a little more complicated because NVIDIA’s 8X antialiasing algorithm combines both multi and supersampling and affects more than just jagged edges.
Anisotropic filtering
For our anisotropic texture filtering tests, I used the same Unreal Tournament 2004 demo once more. Again, I left in-game detail levels at their highest setting and used a display resolution of 1024×768.
There isn't much difference in aniso scaling between the GeForce 6200, Radeon X600 Pro, and Radeon X300. The 6200 takes a slightly smaller performance hit between 4X and 16X, and nothing here approaches the drama of our antialiasing results.
Moving from performance to quality, here’s how the GeForce 6200’s anisotropic filtering looks up to 8X. Click on the images below to open an uncompressed PNG in a new window.
No aniso, 2X, 4X, 8X – GeForce 6200
No aniso, 2X, 4X, 8X – Radeon X600 Pro
Anisotropic filtering levels are comparable between the GeForce 6200 and Radeon X600 Pro, at least to my eye.
3DMark05 image quality – Game test 1
When NVIDIA extended its GeForce FX line down to the low end with the GeForce FX 5200, they resorted to all sorts of partial precision tricks to improve performance. Unfortunately, dropping precision degraded image quality. At launch, the FX 5200’s DirectX 9 image quality was noticeably inferior to that of high-end GeForce FX cards. To make sure that NVIDIA isn’t doing the same thing again with the GeForce 6200, I compared its output to a GeForce 6800 GT in 3DMark05’s game tests using the Shader Model 3.0 path. The X600 Pro can only use 3DMark05’s Shader Model 2.0 path, which will produce slightly different images by default, so I haven’t included it here.
Click on the images below to open an uncompressed PNG in a new window.
Game test 1 – GeForce 6200
Game test 1 – GeForce 6800 GT
Everything looks good in game test 1. At least to my eye, the 6200’s output doesn’t look any better or worse than the 6800 GT’s.
3DMark05 image quality – Game test 2
Game test 2 – GeForce 6200
Game test 2 – GeForce 6800 GT
The same goes for game test 2…
3DMark05 image quality – Game test 3
Game test 3 – GeForce 6200
Game test 3 – GeForce 6800 GT
Game test 3 also shows no apparent differences between the cards. If NVIDIA’s cutting precision, it’s not having a detrimental impact on image quality.
Power consumption
I broke out our trusty watt meter to measure overall system power consumption, sans monitor, at the outlet. Power consumption was measured at idle and under a 3DMark05 Return to Proxycon game test load.
System power consumption with the GeForce 6200 closely mirrors the Radeon X300. It’s particularly interesting to note that system power consumption with the integrated GMA 900 IGP isn’t much lower than with our discrete graphics cards.
Conclusions
When NVIDIA launched the GeForce 6800, they talked up the new architecture as a scalable design that would power a top-to-bottom line of graphics cards. With the GeForce 6200, that top-to-bottom line is complete, at least as far as add-in desktop graphics cards are concerned. The GeForce 6200 brings the impressive rendering capabilities of the GeForce 6 series to the budget space with fewer compromises than one might expect. The GeForce 6200’s strong performance in DOOM 3 shows that this budget card isn’t too short on pixel processing power.
Priced between $129 and $149, the GeForce 6200 will do battle with ATI’s Radeon X300 and Radeon X600 Pro. Against the X300, the GeForce 6200 is clearly superior. However, with the exception of DOOM 3, the X600 Pro’s performance is tough to match. The GeForce 6200 performs exceptionally well given its relatively low clock speeds, but against a card with a 100MHz core and memory clock advantage, there’s only so much it can do.
In some circles, the GeForce 6200 will also compete with Intel's Graphics Media Accelerator 900. Those who purchase PCs from the big PC makers will likely have the option of going with integrated graphics or trading up to something like the GeForce 6200. Based on the performance of the GMA 900, trading up looks like the only viable option for gaming, at least with newer titles. The GMA 900's lack of shader power has a devastating impact on performance, and it's rare that pixel shader effects are even displayed correctly. To make matters worse, the GMA isn't detected as a DirectX 9 graphics option by some games.
As I wrap things up, I can't help but be struck by the GeForce 6200 graphics chip's relatively large die size. It appears to be an NV43 with some of its pixel shaders disabled, and it's a big chip for what should be a very high volume part. I wouldn't be surprised to see a GeForce 6100 or 6300 emerge at some point down the road with a smaller die size and perhaps four pixel pipes bound to two ROPs using that fancy fragment crossbar.
Whatever happens in the future, right now the GeForce 6200 is a pretty compelling graphics card for budget-minded gamers. It's clearly a better option than the Radeon X300, but add-in board manufacturers are going to have to break out some Turbo Golden Sample Special Editions with higher clock speeds to catch the Radeon X600 Pro.
GeForce 6200 AGP — Technical City
NVIDIA
GeForce 6200 AGP
- Interface AGP 8x
- Core clock speed 230 MHz
- Max video memory 128 MB
- Memory type DDR
- Memory clock speed 132 MHz
- Maximum resolution
Summary
NVIDIA started GeForce 6200 AGP sales on 14 December 2003. This is a Curie-architecture desktop card built on a 110 nm manufacturing process and primarily aimed at gamers. 128 MB of DDR memory clocked at 0.13 GHz is supplied, which, together with the 64-bit memory interface, creates a bandwidth of 1.056 GB/s.
Compatibility-wise, this is a single-slot card attached via the AGP 8x interface. Its manufacturer default version has a length of 168 mm.
We have no data on GeForce 6200 AGP benchmark results.
General info
GeForce 6200 AGP's architecture, market segment and release date.
Place in performance rating | not rated | |
Architecture | Curie |
GPU code name | NV44A |
Market segment | Desktop | |
Release date | 14 December 2003 |
Current price | $162 |
Technical specs
GeForce 6200 AGP's general performance parameters, such as number of shaders, GPU base clock, manufacturing process, and texturing and calculation speed. These parameters indirectly indicate GeForce 6200 AGP's performance, but for a precise assessment you have to consider its benchmark and gaming test results.
Core clock speed | 230 MHz |
Number of transistors | 29 million |
Manufacturing process technology | 110 nm |
Texture fill rate | 0.92 GTexel/s |
Compatibility, dimensions and requirements
Information on GeForce 6200 AGP's compatibility with other computer components, useful when choosing a future computer configuration or upgrading an existing one. For desktop video cards, this covers the interface and bus (motherboard compatibility) and any additional power connectors (power supply compatibility).
Interface | AGP 8x | |
Length | 168 mm | |
Width | 1-slot | |
Supplementary power connectors | None |
Memory
Parameters of memory installed on GeForce 6200 AGP: its type, size, bus, clock and resulting bandwidth. Note that GPUs integrated into processors don’t have dedicated memory and use a shared part of system RAM.
Memory type | DDR |
Maximum RAM amount | 128 MB |
Memory bus width | 64 Bit |
Memory clock speed | 132 MHz |
Memory bandwidth | 1.056 GB/s |
Video outputs and ports
Types and number of video connectors present on GeForce 6200 AGP. As a rule, this section is relevant only for desktop reference video cards, since for notebook ones the availability of certain video outputs depends on the laptop model.
Display Connectors | 1x VGA, 1x S-Video |
API support
APIs supported by GeForce 6200 AGP, sometimes including their particular versions.
DirectX | 9.0c (Shader Model 3.0) |
OpenGL | 2.1 |
OpenCL | N/A | |
Vulkan | N/A |
Benchmark performance
Non-gaming benchmark performance of GeForce 6200 AGP. Note that overall benchmark performance is measured in points on a 0-100 scale.
Similar GPUs
Here is our recommendation of several graphics cards that are more or less close in performance to the one reviewed.
Recommended processors
These processors are most commonly used with GeForce 6200 AGP according to our statistics.
- Pentium 4 3.0: 13%
- Core 2 Duo E4500: 8.7%
- Athlon XP 3200+: 4.3%
- Pentium 4 630: 4.3%
- Athlon XP 1700+: 4.3%
- Pentium III 1133: 4.3%
- Pentium E5400: 4.3%
- Core 2 Duo E7300: 4.3%
- Athlon II X2 250: 4.3%
- Pentium D 945: 4.3%
XFX nVIDIA GeForce 6200 AGP Display Card PVT44AWANG B&H Photo
BH #XFGF6200256M • MFR #PVT44AWANG
Key Features
- 256MB DDR2 RAM
- DVI + VGA
- S-Video
The nVIDIA GeForce 6200 AGP Display Card from XFX delivers powerful 3D graphics to your desktop computer. The card installs in an available AGP slot and features DVI, VGA, and TV outputs. You can even connect a TV via S-Video. The card is fully compatible with standards such as DirectX 9 and OpenGL 2.0, making it an excellent choice for your computer.
XFX PVT44AWANG Overview
The video card has full support for the DirectX 9 Shader Model 3.0. This brings stunning graphics capabilities to gaming, geometry, vertex, physics, and pixel shading operations.
The card installs in an available AGP 8x slot. This interface is specifically designed with graphics applications in mind.
The video card can drive dual monitors. It features a DVI-I port and a VGA port; each port is capable of driving a 1600 x 1200 display.
This card is fully compatible with Microsoft’s Windows Vista operating system. It provides more than enough horsepower to drive Vista’s Aero graphics.
In the Box
- XFX nVIDIA GeForce 6200 AGP Display Card
- Software CD-ROM with Drivers
- Double Lifetime Protection Warranty
XFX PVT44AWANG Specs
Hardware | |
GPU | nVIDIA GeForce 6200 |
GPU Clock | Core: 350MHz |
Memory Amount | 256MB |
Memory Clock | 533MHz |
Memory Type | DDR2 |
Memory Interface | 128-bit |
Bus Type | AGP |
Bus Speed | 8X |
Rendering Pipelines | 4 |
Geometry Engines | None |
Geometry Rate | None |
Pixel Fill Rate | 1.4 Gpixels/sec |
RAMDAC | 2x 400MHz |
Multimedia Support | |
FM Tuner | No |
TV Tuner | No |
DTV Tuner | No |
HDTV Capable | No |
Hardware MPEG | None |
Display Support | |
Computer Analog | 1x VGA |
Computer Digital | 1x DVI-I |
Video | 1x S-Video |
Multiple Display Configuration | Dual Display |
Display Resolutions | |
Analog | 1600 x 1200 |
Digital | 1600 x 1200 |
Max Resolution | 1600 x 1200 |
I/O Connections | |
Analog (PC) | 1x VGA |
Digital (PC) | 1x DVI-I |
Video | 1x S-Video |
System Requirements | |
Software | Operating System: Windows XP, Vista |
Hardware | AGP slot; 300W power supply |
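The pixel fill rate quoted in the spec table can be cross-checked: peak fill rate is simply pixel pipelines times core clock, one pixel per pipe per cycle. A minimal sketch (the function name is ours; the 350 MHz clock is from the table, and four pixel pipelines is the NV43/NV44-class configuration):

```python
def pixel_fill_rate_gpixels(pipelines: int, core_clock_mhz: float) -> float:
    """Theoretical peak pixel fill rate in Gpixels/s (one pixel per pipe per clock)."""
    return pipelines * core_clock_mhz * 1e6 / 1e9

# Four pixel pipelines at the 350 MHz core clock listed above
print(pixel_fill_rate_gpixels(4, 350))  # -> 1.4
```

The result matches the 1.4 Gpixels/sec figure in the table, which is why the pipeline count there must be four rather than one.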
Review: NVIDIA’s GeForce 6200 Launch — Graphics
At the end of my recent preview of NVIDIA's GeForce 6600 GT, powered by their NV43 GPU, I remarked, "Now there's only one piece of the pie left", in reference to NVIDIA's claim that they'd offer top-to-bottom GeForce 6-series models before the end of 2004, based on the same technology that debuted this April with NV40 and 6800 Ultra.
With the mid-range 6600 and high-end 6800 range of hardware covered, the bit that’s left is 6200, which makes its initial mark today. With Euro-zone review samples held up at customs for the majority of last week and only just released to NVIDIA in Europe today, from what we can make out, it’s up to our American and Canadian counterparts to provide you with reviews. However, I can still go over the basics with you before our own sample arrives in due course.
PCI Express once again
The push to get everyone using PCI Express graphics cards on the desktop marches on at full speed. 6200 assists in that endeavour by being PCI Express only, for the foreseeable future at least. NVIDIA have no concrete plans for an AGP bridged version, especially since that requires a new GPU package and added cost, something that's not attractive at the 6200's price point.
NV43’s basic building blocks
Like GeForce 6600, 6200s use NV43 GPUs. In 6600 that’s eight fragment pipelines, three vertex shaders and four ROPs fed by a fragment crossbar. That means up to eight pixel fragments output per clock, into four output units, with three of NV40’s six vertex shaders. In 6200 everything’s the same as far as basic architecture is concerned, just four of the fragment pipelines are disabled, presumably after failing QA testing at the fab.
4 pipes, 4 ROPs, three vertex units. Easy peasy.
But there’s more
But on top of those basic building blocks, 6200 has other differentiating features that mark it out as more than just a four-pipe 6600. 6200 has no support for NVIDIA's SLI, meaning you won't be able to combine a pair of 6200s in a dual PCI Express system for more graphics power. It also has none of NV43's colour or Z-buffer optimisations enabled: no colour compression on pixel fragments, no Z-buffer compression, no fast Z-clear. You could argue that at this end of the market, where the GPU is less capable and clocked lower than its mid-range counterparts, those colour and Z optimisations, grouped under NVIDIA's Intellisample technology umbrella, are needed more than in the higher-performing parts.
6200 also has no support for 64-bit texture formats, meaning it can't do 64-bit texture blending, combining or rendering of any kind where a 64-bit precision surface is required, integer or otherwise. That has knock-on effects for certain rendering techniques and means 6200 is even less capable than a four-pipe 6600 would suggest.
It’s not all bad
NVIDIA UltraShadow II technology is complete and working in 6200, offering rendering speedups for graphics techniques that make heavy use of stencil shadows. You've also got full support for DirectX Shader Model 3.0, NVIDIA's newest 4X RGMS multi-sampling anti-aliasing, 16X angle-adaptive anisotropic filtering and the rest of what defines NV4x GPUs.
Cost
NVIDIA’s price points for 6200 boards sit at $129 for a 128MB version (all 6200 variants will have a 128-bit memory bus initially, with AIBs able to implement even lower cost 64-bit versions if they choose, at a later date) and $149 for a 256MB version. NVIDIA in the UK fully expect that to translate to
To sum up
Think 6600 and NV43. Think the same 0.11µ process production at TSMC. Think half the pixel power at the same clocks, without some optimisations before pixels are generated (Z opts) and after they’re written to their render buffers (colour opts), no SLI, but complete SM3.0. If that makes sense, you’re most of the way there.
With NVIDIA’s basic reference clocks for 6200 at 300MHz core and 275MHz (550MHz DDR) memory, it’s an X300/X600 competitor, and I’d expect 6600 GT and ATI’s X700 XT to be at least twice as fast in most situations. That pitches 6200 as the step up from integrated graphics that most people look at when they’re considering low-cost systems. It’ll be interesting to see how that works out.
Pretty pictures
While we didn’t get actual boards for today, we did get pretty pictures. So here’s some geek porn to satisfy that low-end graphics lust. Kleenex at the ready, budget fans!
It’s a single slot board using a small PCB and the cooler many of you might remember from some GeForce3 boards back in the day. Equipped with S-Video input and output, DVI-I and VGA, the pictured reference board doesn’t need any external power source and should run very cool and quiet, to steal a phrase from AMD.
We’ll have our own review up as soon as possible. In the meantime, The Tech Report were kind enough to let me preview their article, full of performance numbers, an hour or so before you’ll get to read this. It’s great stuff as always from those guys, so check it out here.
Their reference board has 128MB of memory clocked at 500MHz DDR, with the core at 300MHz, and Geoff pits it against GMA900 and a couple of PCI Express Radeons from the X300 and X600 camps. Performance sits just shy of X600 Pro in most cases. Be sure and read their review for full details.
GeForce 6 Tech Specs|NVIDIA
CineFX 3.0 Shading Architecture
- Vertex Shaders
- Support for Microsoft DirectX 9.0 Vertex Shader 3.0
- Displacement mapping
- Geometry instancing
- Infinite length vertex programs
- Pixel Shaders
- Support for DirectX 9.0 Pixel Shader 3.0
- Full pixel branching support
- Support for Multiple Render Targets (MRTs)
- Infinite length pixel programs
- Next-Generation Texture Engine
- Up to 16 textures per rendering pass
- Support for 16-bit floating point format and 32-bit floating point format
- Support for non-power of two textures
- Support for sRGB texture format for gamma textures
- DirectX and S3TC texture compression
- Full 128-bit studio-quality floating point precision through the entire rendering pipeline with native hardware support for 32bpp, 64bpp, and 128bpp rendering modes
64-Bit Texture Filtering and Blending
- Full floating point support throughout entire pipeline
- Floating point filtering improves the quality of images in motion
- Floating point texturing drives new levels of clarity and image detail
- Floating point frame buffer blending gives detail to special effects like motion blur and explosions
Intellisample 3.0 Technology
- Advanced 16x anisotropic filtering
- Blistering-fast antialiasing and compression performance
- New rotated-grid antialiasing removes jagged edges for incredible edge quality
- Support for advanced lossless compression algorithms for color, texture, and z-data at even higher resolutions and frame rates
- Fast z-clear
- High-resolution compression technology (HCT) increases performance at higher resolutions through advances in compression technology
UltraShadow II Technology
- Designed to enhance the performance of shadow-intensive games, like id Software’s Doom 3
TurboCache Technology
- Shares the capacity and bandwidth of dedicated video memory and dynamically available system memory for optimal system performance.
PureVideo Technology
- Adaptable programmable video processor
- MPEG video encode and decode
- High-definition MPEG-2 hardware acceleration
- High-quality video scaling and filtering
- DVD and HDTV-ready MPEG-2 decoding up to 1920x1080i resolutions
- Display gamma correction
- Microsoft® Video Mixing Renderer (VMR) supports multiple video windows with full video quality and features in each window
Advanced Display Functionality
- Dual integrated 400MHz RAMDACs for display resolutions up to and including 2048×1536 at 85hz
- Dual DVO ports for interfacing to external TMDS transmitters and external TV encoders
- Full NVIDIA® nView™ multi-display technology capability
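The dual 400 MHz RAMDACs comfortably cover the quoted 2048×1536 at 85 Hz maximum: the pixel clock a mode requires is width × height × refresh rate, plus blanking overhead. A rough worked check (the function and the ~30% blanking figure are our ballpark assumptions, not spec values):

```python
def required_pixel_clock_mhz(width: int, height: int, refresh_hz: int,
                             blanking_overhead: float = 0.30) -> float:
    """Approximate RAMDAC pixel clock a display mode needs, in MHz.

    blanking_overhead accounts for horizontal/vertical blanking intervals
    (roughly 30% for CRT timings; an assumed ballpark, not a spec value).
    """
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

# 2048x1536 at 85 Hz, the maximum mode quoted above
print(round(required_pixel_clock_mhz(2048, 1536, 85)))  # ~348 MHz, within 400 MHz
```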
Advanced Engineering
- Designed for PCI Express x16
- Support for AGP 8X including Fast Writes and sideband addressing
- Designed for high-speed GDDR3 memory
- Advanced thermal management and thermal monitoring
NVIDIA® Digital Vibrance Control™ (DVC) 3.0
- DVC color controls
- DVC image sharpening controls
Operating Systems
- Windows XP
- Windows ME
- Windows 2000
- Windows 9X
- Macintosh OS, including OS X
- Linux
API Support
- Complete DirectX support, including the latest version of Microsoft DirectX 9.0 Shader Model 3.0
- Full OpenGL support, including OpenGL 2.0
GeForce 6 Series GPUs Features Comparison:
|
1. GeForce 6200 models do not include compression technology.
2. NVIDIA SLI: Available on SLI-certified versions of GeForce 6800 Ultra, 6800 GT, 6800 GS, 6800, 6800 XT, 6800 LE, 6600 GT, 6600, and 6600 LE PCI Express GPUs only.
3. Features may vary by product. Some features may require additional software.
4. Available on select GeForce 6200 models only.
5. GDDR3 — GeForce 6800 Ultra, 6800 GT, 6800 GS, and 6600 GT only.
Turning NVIDIA GeForce 6200 into 6600
The PCI-Express x16 interface is gradually gaining popularity. There are already a huge number of PCI-E video cards from both graphics giants, ATI and NVIDIA. It also matters that PCI-Express x16 cards of the GeForce 6200/6600/6800 and Radeon X800 series are, as a rule, cheaper than their AGP counterparts: designed for the new interface from the start, they avoid PCI-E-to-AGP bridges and therefore use a simpler design.
In today's review you will see a video card from NVIDIA's lower price bracket: the Gigabyte GeForce 6200. The 6200 is sold in two versions, one for the AGP interface and, of course, one for PCI-Express x16. When the GeForce 6200 was being prepared for its official announcement, it was already known that it would be based on the NV43 chip (with half of its pixel pipelines disabled), the same chip used in GeForce 6600 series cards. The hope was that a card based on the GeForce 6200 could be painlessly converted into a GeForce 6600 without any "surgical" intervention. So today, now that the 6200 has flooded retail, there is an opportunity to test that assumption in practice.
Specifications NVIDIA NV43
Manufacturer | NVIDIA |
Core name | NV43 |
Process technology, microns | 0.11 |
Core frequency, MHz | 300 |
Number of pixel pipelines | 4/8 |
Number of texture units per pipeline | 1 |
Number of textures per pixel per pass | 16 |
Hardware T&L | + |
Memory type | DDR |
Memory bus width, bit | 128 |
Memory frequency, MHz | 250 (500 effective) |
Memory bandwidth, GB/s | 8 |
Anisotropic filtering level | 16x |
FSAA | Rotated-grid multisampling; MSAA and SSAA combinations |
SSAA coefficients | 2x |
MSAA sample counts | 2x-8x |
Vertex shader version | 3.0 |
Pixel shader version | 3.0 |
1st RAMDAC frequency, MHz | 400 |
2nd RAMDAC frequency, MHz | 400 |
Dual-output technology | nView |
Number of vertex pipelines | 3 |
As you can see, the number of pixel pipelines differs depending on whether the NV43 chip is used in the GeForce 6200 or the GeForce 6600: the GeForce 6200 uses a 4×1 formula and the 6600 an 8×1 formula, while the number of vertex pipelines is always three. It goes without saying that enabling all eight pipelines gives a significant jump in the GF6200's performance and puts it on par with the results shown by the GF6600.
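Those 4×1 and 8×1 formulas translate directly into theoretical texel fill rate (pipelines × TMUs per pipeline × clock), which is why unlocking doubles throughput at the same clock. A minimal sketch (function name is ours; clocks and configurations are from the table above):

```python
def texel_fill_rate_gtexels(pipes: int, tmus_per_pipe: int, clock_mhz: float) -> float:
    """Peak texel fill rate in GTexels/s for a pipes x TMUs configuration."""
    return pipes * tmus_per_pipe * clock_mhz * 1e6 / 1e9

print(texel_fill_rate_gtexels(4, 1, 300))  # GeForce 6200 (4x1) -> 1.2
print(texel_fill_rate_gtexels(8, 1, 300))  # unlocked 6600 (8x1) -> 2.4
```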
Now let’s move on to our specific instance — Gigabyte NVIDIA GeForce 6200.
So, the card is made on Gigabyte's standard blue PCB and has a PCI-Express x16 interface. Standard frequencies are 300/500MHz. Onboard is 128MB of Hynix DDR memory with a 4ns access time.
The stock cooling is a fairly nice copper cooler which, in terms of noise, resembles an overclocked Intel boxed cooler.
To be honest, for an ear spoiled by Zalman silence such noise is very hard to endure, so it was decided to replace the stock cooling with Zalman's cooling system for video cards, the ZM80C-HP, with the optional ZM-OP1 fan.
The Zalman ZM80C-HP (including the ZM-OP1) is not a powerful overclocking tool; it is simply a set of two heatsinks, a heat pipe and a quiet fan, which lets you keep the overclock achieved on efficient stock cooling while preserving the overall silence of the system. The price of this miracle of nature is around $30.
But back to our topic. Ultimately, after lengthy tests, the card settled at the following overclock: the core ran at 550MHz and the memory at 670MHz. To tell the truth, it has been a long time since I saw 4ns memory that, while already running almost at its rated maximum (500MHz), could add another 170MHz. A brilliant result.
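For context, the overclock reported above works out to sizeable relative gains over the card's stock clocks (a quick calculation; stock values as given earlier in the article):

```python
# Overclocking headroom achieved above, relative to the card's stock clocks
stock_core, oc_core = 300, 550   # MHz
stock_mem, oc_mem = 500, 670     # MHz (effective DDR rate)

core_gain = (oc_core - stock_core) / stock_core * 100
mem_gain = (oc_mem - stock_mem) / stock_mem * 100
print(f"core +{core_gain:.0f}%, memory +{mem_gain:.0f}%")  # core +83%, memory +34%
```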
Enabling eight texture pipelines
To spare newcomers the question of how, exactly, to enable these pipelines, I will briefly describe the modification process.
For a successful result, we need to have, first of all, the latest version of the RivaTuner program. At the time of this writing, v2.0 RC 15.4. After starting the program, we should open the «Low-Level system tweaks» window, which can be accessed by clicking on the «Low-Level system setting» button in the Customize section.
In the lower part of the window (Graphics processor configuration), enable the «Allow enabling hardware masked units» parameter, set «Custom» in the «Active pixel/vertex units configuration» item and click the Customize button.
In the window that appears, check one of the first two lines (Pixel unit 0 or Pixel unit 1, depending on which four pipelines are disabled).
After pressing the OK button and rebooting, we get a full-fledged GeForce 6600.
The EVEREST Home Edition utility also confirms that all eight pipelines are in use.
Testing
After the miraculous transformation of the GeForce 6200 into a GeForce 6600 and overclocking to 550/670MHz, let's compare the results.
Test stand:
- CPU: Intel Celeron D @ 190×19 (3.61GHz).
- MoBo: ASUS P5GD1 (LGA775; i915P/ICH6R).
- Memory: 2x256MB PC3200 Kingston KVR400X64C3A/256 (2.5/3/3/6).
- HDD: Samsung 160GB S-ATA.
- Video: Gigabyte (GV-NX62128D) NVIDIA GeForce 6200.
- MS Windows XP Service Pack 2.
- NVIDIA ForceWare 71.24.
- Intel Inf Update v.6.3.0.1007.
Synthetic tests:
Real gaming applications:
For comparison, the results obtained at 1280x1024x32 are shown; graphics settings were set to maximum, anti-aliasing was not used, and 16x anisotropic filtering was enabled in Half-Life 2. I want to note that we deliberately used such modes in order to test the card's prospects in future games.
Conclusions
Summing up, I want to say that the NVIDIA GeForce 6200 did not repeat the history of its predecessor, the GeForce FX 5200, and proved to be much better. Despite its low cost (less than $100 in Moscow retail), the card's performance is up to the mark, and taking into account its good overclockability and, moreover, the ability to enable all eight pipelines, the card will let you feel comfortable in modern games, even in heavy modes.
Attention! According to Unwinder, the author of RivaTuner, unlocking the pipelines is physically impossible on NV43 revision A4 (the card does not react at all to attempts to change the pipeline configuration). The card unlocked in this article used a revision A2 chip.
GeForce 6200 AGP — review. Benchmarks and specs
The GeForce 6200 AGP (GPU) is not yet ranked in our performance rating. Manufacturer: NVIDIA. The GeForce 6200 AGP runs at a base core clock of 230 MHz. It carries 128 MB of RAM clocked at 132 MHz, with a bandwidth of 1.056 GB/s.
The power consumption of the GeForce 6200 AGP is not listed, and it is built on a 110 nm process. Below you will find key compatibility, sizing, technology, and gaming performance test results. You can also leave comments if you have any questions.
Let's take a closer look at the most important characteristics of the GeForce 6200 AGP. To get an idea of which video card is better, we recommend using the comparison service.
General information
A basic set of information will help you find out the release date of the GeForce 6200 AGP video card and its purpose (laptops or PCs), as well as the price at the time of release and the average current cost. This data also includes the architecture used by the manufacturer and the video processor code name.
Performance Rating Position: | not rated |
Architecture: | Curie |
Release date: | 2004 |
Current price: | $141 |
GPU Code Name: | NV44A |
Market segment: | Desktop |
Specifications
This is important information that determines all the power characteristics of the GeForce 6200 AGP video card. The smaller the technological process of manufacturing a chip, the better (in modern realities). The clock frequency of the core is responsible for its speed (direct correlation), while signal processing is carried out by transistors (the more transistors, the faster the calculations are performed, for example, in cryptocurrency mining).
Core Clock: | 230 MHz |
Process: | 110 nm |
Texture fill rate: | 0.92 GTexel/s |
Number of transistors: | 75 million |
Dimensions, connectors and compatibility
Today there are many PC case form factors and laptop sizes, so it is extremely important to know the length of the video card and its connection types (except for laptop variants). This makes the upgrade process easier, as not all cases can accommodate modern video cards.
Interface: | AGP 8x | |||
Length: | 168mm | |||
Additional power: | None |
Memory (Frequency and Overclocking)
Internal memory is used to store data during calculations. Modern games and professional graphics applications place high demands on the amount and speed of memory: the higher these parameters, the more powerful and faster the video card. Below are the memory type, size and bandwidth for the GeForce 6200 AGP.
Memory type: | DDR | |||
Maximum RAM amount: | 128MB | |||
Memory bus width: | 64 Bit | |||
Memory frequency: | 132MHz | |||
Memory bandwidth: | 1.056 GB/s |
Port and display support
As a rule, all modern video cards have several types of connections and additional ports, for example HDMI and DVI. Knowing these features is very important in order to avoid problems connecting a video card to a monitor or other peripherals.
Display connections: | 1x VGA, 1x S-Video |
API support
All APIs supported by GeForce 6200 AGP are listed below. This is a minor factor that does not greatly affect the overall performance.
DirectX: | 9.0c (Shader Model 3.0) |
OpenGL: | 2.0 |
Vulkan: | N/A | |||
OpenCL: | N/A |
General gaming performance
All tests are based on FPS. Let’s see what place the GeForce 6200 AGP took in the gaming performance test (the calculation was made in accordance with the game developer’s recommendations for system requirements; it may differ from real situations).
Value | Description |
5 | Stutter: based on interpolated data from graphics cards of a similar performance level, the game is likely to stutter and display low frame rates. |
- | May Stutter: the performance of this video card in this game has not yet been studied enough; interpolated data from similar cards suggests the game is likely to stutter and display low frame rates. |
30 | Fluent: according to all known benchmarks at the specified graphics settings, the game is expected to run at 25 fps or more. |
40 | Fluent: according to all known benchmarks at the specified graphics settings, the game is expected to run at 35 fps or more. |
60 | Fluent: according to all known benchmarks at the specified graphics settings, the game is expected to run at 58 fps or more. |
- | May Run Fluently: not yet sufficiently studied; interpolated data from similar cards suggests smooth frame rates. |
? | Uncertain: testing this video card in this game showed unexpected results; a slower card could deliver higher and more consistent frame rates in the same reference scene. |
- | Uncertain: not yet sufficiently studied, and data cannot be reliably interpolated from similar cards in the same category. |
The value in each field reflects the average frame rate across the entire database.
GeForce 6200 AGP in benchmark results
Benchmarks help determine performance in standard GeForce 6200 AGP tests. We have compiled a list of the most famous benchmarks in the world so that you can get accurate results for each of them (see description). Pre-testing the graphics card is especially important when there are high loads, so that the user can see how the graphics processor copes with calculations and data processing.
NVIDIA GeForce 6200 AGP — review, specifications, and highlights
Description
The NVIDIA GeForce 6200 AGP graphics card, based on the Curie architecture, has 75 million transistors and is manufactured on a 110 nm process. The graphics core runs at 230 MHz. For memory, 0.256 GB of DDR2 is installed at 266 MHz, with a maximum bandwidth of 4.256 GB/s. For reference, the highest score in our database today is 260261 points.
The DirectX version is 9; the OpenGL version is 2.1.
In our tests, the video card scores 2809 points.
Why NVIDIA GeForce 6200 AGP is better than others
It has no particular merits:
- GPU base clock of 230 MHz: lower than in 86% of cards
- 0.256 GB of RAM: less than in 73% of cards
- Memory bandwidth of 4.256 GB/s: lower than in 79% of cards
- Memory frequency of 266 MHz: lower than in 76% of cards
- 110 nm process: coarser than in 89% of cards
- GDDR memory version 2: lower than in 75% of cards
- 75 million transistors: fewer than in 86% of cards
- DirectX 9: lower than in 93% of cards
NVIDIA GeForce 6200 AGP Review: Highlights
Parameter | GeForce 6200 AGP | Database average | Database best |
GPU base clock | 230 MHz | 938 MHz | 2457 MHz |
GPU memory frequency | 266 MHz | 1326.6 MHz | 16000 MHz |
Texture fill rate | 0.918 GTexel/s | 65.6 GTexel/s | 940 GTexel/s |
Architecture name | Curie | | |
GPU name | NV44 | | |
Memory bandwidth | 4.256 GB/s | 198.3 GB/s | 2656 GB/s |
RAM | 0.256 GB | 4.6 GB | 128 GB |
Maximum memory | 0.128 GB | 3.9 GB | 128 GB |
GDDR memory version | 2 | 4.5 | 6 |
Memory bus width | 64 bit | 290.1 bit | 8192 bit |
Release date | 2004 | | |
Process technology | 110 nm | 47.5 nm | 4 nm |
Number of transistors | 75 million | 5043 million | 80000 million |
Purpose | Desktop | | |
DirectX version | 9 | 11.1 | 12.2 |
OpenGL version | 2.1 | 4 | 4.6 |
Shader model version | 3.0 | 5.5 | 6.6 |
DVI outputs | 1 | 1.4 | 3 |
VGA output | Yes | | |
FAQ
How much RAM does the NVIDIA GeForce 6200 AGP have?
The NVIDIA GeForce 6200 AGP has 0.256 GB.
What type of memory does the NVIDIA GeForce 6200 AGP support?
The NVIDIA GeForce 6200 AGP supports GDDR2.
What is the architecture of the NVIDIA GeForce 6200 AGP?
Curie.
What version of DirectX does the NVIDIA GeForce 6200 AGP support?
DirectX 9.
Does the NVIDIA GeForce 6200 AGP support DVI?
Yes, it has 1 DVI port.
When was the NVIDIA GeForce 6200 AGP released?
December 2003.
Characteristics of NVIDIA GeForce 6200 NV43A / Overclockers.ua
Video card GeForce 6200 AGP — Technical City
NVIDIA
GeForce 6200 AGP
- AGP interface 8x
- Core frequency 230 MHz
- Video memory size 128 MB
- Memory type DDR
- Memory frequency 132 MHz
Description
NVIDIA started sales of the GeForce 6200 AGP on 14 December 2003. This is a Curie-architecture desktop card built on a 110 nm manufacturing process and aimed primarily at gamers. It carries 128 MB of DDR memory clocked at 0.13 GHz, which, combined with a 64-bit interface, yields 1.056 GB/s of throughput.
In terms of compatibility, this is a single-slot card connected via the AGP 8x interface. The reference version is 168 mm long.
We don’t have test results for the GeForce 6200 AGP.
General information
Information about the type (desktop or laptop) and architecture of GeForce 6200 AGP, as well as when sales started and cost at that time.
Place in the performance rating: not rated
Architecture: Curie
Core clock: 230 MHz (highest among all cards: 2610 MHz, Radeon RX 6500 XT)
Number of transistors: 75 million
Memory bandwidth: 1.056 GB/s
Video
Types and number of video connectors present on GeForce 6200 AGP. As a rule, this section is relevant only for desktop reference video cards, since for laptop ones the availability of certain video outputs depends on the laptop model.
Video connectors: not specified
Vulkan: N/A
Benchmark tests
These are the results of GeForce 6200 AGP rendering performance tests in non-gaming benchmarks. The overall score is set from 0 to 100, where 100 corresponds to the fastest video card at the moment.
Other video cards
Here we recommend several video cards that are more or less close in performance to the reviewed one.
Recommended processors
According to our statistics, these processors are most often used with the GeForce 6200 AGP.
- Pentium 4 3.0 GHz: 13%
- Core 2 Duo E4500: 8.7%
- Athlon XP 3200+: 4.3%
- Pentium 4 630: 4.3%
- Athlon XP 1700+: 4.3%
- Pentium III 1133: 4.3%
- Pentium E5400: 4.3%
- Core 2 Duo E7300: 4.3%
- Athlon II X2 250: 4.3%
- Pentium D 945: 4.3%
Video card Nvidia GeForce 6200 PCI
Home / Video cards / Nvidia GeForce 6200 PCI
- Issue date: January, 2008;
- Video card memory size: 256 MB;
- Video memory type: DDR2;
- GPU frequency: 280 MHz.
Specifications Nvidia GeForce 6200 PCI
GPU
GPU manufacturer: Nvidia
GPU name: NV43
Platform: Desktop
Clock frequency: 280 MHz
Dual-GPU card: No
Reference card: No
Performance
TMUs: 4
ROPs: 2
Pixel fill rate: 0.56 GPixel/s
Texture fill rate: 1.12 GTexel/s
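Both fill-rate figures are simply unit counts multiplied by the core clock (one pixel per ROP and one texel per TMU per clock), which makes them an easy consistency check on a spec sheet. A quick sketch using the numbers listed here (2 ROPs, 4 TMUs, 280 MHz core):

```python
def fill_rate_gops(units: int, core_clock_mhz: float) -> float:
    """Fill rate in billions of operations per second.

    Each unit (ROP for pixels, TMU for texels) completes one operation
    per clock, so rate = units * clock; dividing MHz by 1000 gives G/s.
    """
    return units * core_clock_mhz / 1000.0

pixel_fill = fill_rate_gops(units=2, core_clock_mhz=280)    # ROPs -> 0.56 GPixel/s
texture_fill = fill_rate_gops(units=4, core_clock_mhz=280)  # TMUs -> 1.12 GTexel/s
print(pixel_fill, texture_fill)
```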
Memory
Memory clock: 200 MHz
Effective memory clock: 400 MHz
Memory bus width: 64-bit
Video memory size: 256 MB
Memory type: DDR2
Memory bandwidth: 3.2 GB/s
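The memory rows here are internally consistent: DDR2 transfers data twice per clock, so a 200 MHz memory clock gives a 400 MHz effective rate, and 400 MT/s across a 64-bit (8-byte) bus yields 3.2 GB/s. A small sketch of that check:

```python
base_clock_mhz = 200   # listed memory clock
bus_width_bits = 64    # listed bus width

effective_mhz = base_clock_mhz * 2  # DDR2: two transfers per clock
bandwidth_gbs = effective_mhz * 1e6 * (bus_width_bits // 8) / 1e9

print(effective_mhz, bandwidth_gbs)  # 400 MHz effective, 3.2 GB/s
```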
Comparison of the GeForce 6200 PCI with similar cards
Effective memory clock frequency
The frequency at which memory can be read from and written to:
- GeForce 6200 PCI: 400 MHz
- GeForce 210: 800 MHz
- Radeon HD 5450 PCI: 666 MHz
Memory bandwidth
The speed at which data can be read from or stored in the onboard memory:
- GeForce 6200 PCI: 3.2 GB/s