GeForce FX 5200 [in 1 benchmark]
NVIDIA
GeForce FX 5200
- Interface AGP 8x
- Core clock speed 250 MHz
- Max video memory 128 MB
- Memory type DDR
- Memory clock speed 400 MHz
- Maximum resolution
Summary
NVIDIA started GeForce FX 5200 sales on 6 March 2003 at a recommended price of $69.99. It is a Celsius-architecture desktop card built on a 150 nm manufacturing process and aimed primarily at gamers. It carries 128 MB of DDR memory clocked at 400 MHz effective, which together with the 128-bit memory interface yields a bandwidth of 6.4 GB/s.
Compatibility-wise, this is a single-slot card attached via the AGP 8x interface.
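The 6.4 GB/s figure follows directly from the bus width and the memory clock. A minimal sketch in Python, assuming (as is common for spec sheets of this era) that the quoted 400 MHz is the effective DDR transfer rate, i.e. a 200 MHz physical clock:

```python
def memory_bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak memory bandwidth: bytes moved per transfer times transfers per second."""
    bytes_per_transfer = bus_width_bits / 8   # 128-bit bus -> 16 bytes
    transfers_per_s = effective_clock_mhz * 1e6
    return bytes_per_transfer * transfers_per_s / 1e9

# GeForce FX 5200: 128-bit bus, 400 MHz effective DDR
print(memory_bandwidth_gb_s(128, 400))  # 6.4
```

The same formula reproduces the bandwidth line of any card in the tables below, which is a quick sanity check when a spec page's numbers disagree.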
We have no data on GeForce FX 5200 benchmark results.
General info
Some basic facts about GeForce FX 5200: architecture, market segment, release date etc.
Place in performance rating | not rated | |
Architecture | Celsius (1999−2005) | |
GPU code name | NV18 C1 | |
Market segment | Desktop | |
Release date | 6 March 2003 (19 years ago) | |
Launch price (MSRP) | $69.99 | |
Current price | $95 (1.4x MSRP) | of 49999 (A100 SXM4) |
Technical specs
GeForce FX 5200’s general performance parameters such as number of shaders, GPU base clock, manufacturing process, texturing and calculation speed. These parameters indirectly speak of GeForce FX 5200’s performance, but for precise assessment you have to consider its benchmark and gaming test results.
Core clock speed | 250 MHz | of 2610 (Radeon RX 6500 XT) |
Number of transistors | 29 million | of 14400 (GeForce GTX 1080 SLI Mobile) |
Manufacturing process technology | 150 nm | of 4 (GeForce RTX 4080 Ti) |
Texture fill rate | 1.000 GTexel/s | of 959.6 (Radeon RX 7900 XTX) |
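The fill rate value is consistent with the core clock multiplied by the texture-unit count; the review text later in this page cites 4 texture units for the NV34, which is taken here as an assumption:

```python
def texture_fill_rate_gtexel_s(core_clock_mhz: float, texture_units: int) -> float:
    """Peak texture fill rate: one texel per texture unit per core clock."""
    return core_clock_mhz * 1e6 * texture_units / 1e9

# GeForce FX 5200: 250 MHz core, 4 texture units (assumed from the review text)
print(texture_fill_rate_gtexel_s(250, 4))  # 1.0
```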
Compatibility, dimensions and requirements
Information on GeForce FX 5200’s compatibility with other computer components. Useful when choosing a future computer configuration or upgrading an existing one. For desktop graphics cards it’s interface and bus (motherboard compatibility), additional power connectors (power supply compatibility).
Interface | AGP 8x | |
Width | 1-slot | |
Supplementary power connectors | None |
Memory
Parameters of memory installed on GeForce FX 5200: its type, size, bus, clock and resulting bandwidth. Note that GPUs integrated into processors have no dedicated memory and use a shared part of system RAM instead.
Memory type | DDR | |
Maximum RAM amount | 128 MB | of 128 (Radeon Instinct MI250X) |
Memory bus width | 128 Bit | of 8192 (Radeon Instinct MI250X) |
Memory clock speed | 400 MHz | of 22400 (GeForce RTX 4080) |
Memory bandwidth | 6.4 GB/s | of 14400 (Radeon R7 M260) |
Video outputs and ports
Types and number of video connectors present on GeForce FX 5200. As a rule, this section is relevant only for desktop reference graphics cards, since for notebook ones the availability of certain video outputs depends on the laptop model, while non-reference desktop models can (though not necessarily will) bear a different set of video ports.
Display Connectors | 1x DVI, 1x VGA, 1x S-Video |
API support
APIs supported by GeForce FX 5200, sometimes including their particular versions.
DirectX | 8.0 | |
OpenGL | 1.3 | of 4.6 (GeForce GTX 1080 Mobile) |
OpenCL | N/A | |
Vulkan | N/A |
Benchmark performance
Non-gaming benchmark performance of GeForce FX 5200. Note that overall benchmark performance is measured in points in 0-100 range.
- Passmark
Passmark
This is probably the most ubiquitous benchmark, part of the Passmark PerformanceTest suite. It gives the graphics card a thorough evaluation under various loads, providing four separate benchmarks for Direct3D versions 9, 10, 11 and 12 (the last run at 4K resolution when possible), plus a few more tests exercising DirectCompute capabilities.
Benchmark coverage: 26%
FX 5200: 7 points
Recommended processors
These processors are most commonly used with GeForce FX 5200 according to our statistics.
- Pentium N4200: 4.1%
- Pentium 4415U: 3.1%
- Pentium 4 2.4 GHz: 2.1%
- Celeron N4020: 2.1%
- Pentium Silver N5030: 2.1%
- Pentium D 945: 1.6%
- Pentium 4 HT 631: 1.6%
- Pentium 4 P4 3.0: 1.6%
- Core i3 3220: 1.6%
- Pentium Dual-Core E2140: 1.6%
NVIDIA GeForce FX 5200 — review. GPU Benchmark & Specs
The NVIDIA GeForce FX 5200 graphics card (also called a GPU) is not ranked in our performance rating. It runs at a core clock speed of 250 MHz and is equipped with 128 MB of memory at a 400 MHz clock speed and 6.4 GB/s bandwidth.
The card is built on a 150 nm fabrication process. Below you will find the main data on compatibility, dimensions, technologies and gaming performance test results. You can also read and leave comments.
Let's take a closer look at the most important specifications of the graphics card. To get a good idea of which graphics card is best, we recommend using a comparison service.
3.8
Out of 20
Hitesti score
General info
The basic set of information will help you find out the graphics card NVIDIA GeForce FX 5200 release date and its purpose (laptops or PCs), as well as the price at the time of the release and the average current price. This data also includes the architecture employed by the producer, and the chip’s codename.
Place in performance rating: | not rated | |||
Value for money (0-100): | 0.11 | |||
Architecture: | Celsius | |||
Code name: | NV18 C1 | |||
Type: | Desktop | |||
Release date: | 6 March 2003 (18 years ago) | |||
Launch price (MSRP): | $69.99 | |||
Price now: | $107 (1.5x MSRP) | |||
Technical specs
This is the important information that defines the graphics card’s capacity. The simpler the device production process, the better. The core’s power frequency is responsible for its speed (direct correlation) while the elaboration of signals is performed by the transistors (the more transistors, the faster the computations are carried out).
Core clock speed: | 250 MHz | |||
Transistor count: | 29 million | |||
Manufacturing process technology: | 150 nm | |||
Texture fill rate: | 1.000 GTexel/s | |||
Compatibility, dimensions and requirements
Today there are numerous form factors for PC cases, so it is extremely important to know the length of the graphics card and the types of its connection. This will help facilitate the upgrade process.
Interface: | AGP 8x | |||
Supplementary power connectors: | None |
Memory
The onboard memory is used for storing data during computation. Contemporary games and professional graphics applications place high demands on memory volume and throughput: the higher these parameters, the more capable the graphics card. Below are the memory type, capacity and bandwidth for the NVIDIA GeForce FX 5200.
Memory type: | DDR | |||
Maximum RAM amount: | 128 MB | |||
Memory bus width: | 128 Bit | |||
Memory clock speed: | 400 MHz | |||
Memory bandwidth: | 6.4 GB/s |
Video outputs and ports
As a rule, all contemporary graphics cards feature several connection types and additional ports. Knowing these peculiarities is crucial for avoiding problems with connecting the graphics card to the monitor or other peripheral devices.
Display Connectors: | 1x DVI, 1x VGA, 1x S-Video |
API support
All APIs supported by the NVIDIA GeForce FX 5200 are listed below.
DirectX: | 8.0 | |||
OpenGL: | 1.3 |
Overall gaming performance
All tests are based on an FPS counter. Let's look at where the NVIDIA GeForce FX 5200 places in gaming performance tests (placement is calculated from game developers' recommended system requirements and can differ from real-world situations).
Benchmark
Benchmarks help determine the NVIDIA GeForce FX 5200's performance in standard tests. We have listed the world's best-known benchmarks so that you can obtain accurate results in each (see the descriptions). Preliminary testing under high load is especially important, as it shows how well the graphics processing unit copes with heavy computation.
Overall benchmark performance
Passmark: graphics card performance test result.
Characteristics of NVIDIA GeForce FX 5200 / Overclockers.ua
- 5800 (Ultra) — NV30. High-end replacement for Ti4800
- 5600 (Ultra) — NV31. Middle class, replacement for Ti4200
- 5200 (Ultra) — NV34. Low-end replacement for MX440
- 0.13 Micron Process Technology — allows more semiconductor elements to be placed on a chip and the frequency of the 256-bit core to be raised. The FX 5200 series, however, is made on a 0.15 micron process.
- Intellisample Technology — a new anti-aliasing technology that smooths jagged edges up to 50% better than before. It also allows adjusting the color gamut to account for the difference between how the eye perceives light and color and how the monitor reproduces them. In addition, it brings new and improved anisotropic filtering, which reduces texture distortion by dynamically adjusting the image. The FX 5200 lacks the Z-compression and color-compression parts of this technology, and it could not have them: the chip is simply not powerful enough to implement them.
- 8 Pixel Pipelines — output of up to 8 pixels per clock. In our (5200) case, only 4.
- 400 MHz RAMDAC — the digital-to-analog converter that drives the display; in the 5200 series it operates at only 350 MHz.
- DDR II memory — instead of the newer DDR II memory, the FX 5200 uses conventional DDR.
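As a rough sketch of what a RAMDAC frequency buys: the pixel clock a display mode needs is approximately width × height × refresh rate × blanking overhead. The ~1.3× blanking factor below is a common rule of thumb, not an exact standard timing:

```python
def required_pixel_clock_mhz(width: int, height: int, refresh_hz: int,
                             blanking_overhead: float = 1.3) -> float:
    """Approximate pixel clock a mode needs; it must stay below the RAMDAC frequency."""
    return width * height * refresh_hz * blanking_overhead / 1e6

# 1600x1200 at 85 Hz needs roughly 212 MHz, comfortably within a 350 MHz RAMDAC
print(round(required_pixel_clock_mhz(1600, 1200, 85)))  # 212
```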
nVidia GeForce FX5200 (NV34) / Graphics cards
Author: Artyom Semenkov
The NV30 was announced long ago, but video cards based on the nVidia GeForce FX 5800 (NV30) carry a high price that many simply cannot afford, or that users do not want to pay for speed they do not need. In such cases, as a rule, stripped-down versions follow shortly after the flagship's release, which eventually lets the manufacturer cover all sectors of the market.
As of May 2003, nVidia has three chips based on the FX architecture in production; to be precise, FX 5600 video cards will go on mass sale only next month, and production stock is being built up right now. And to be entirely accurate, the NV30, as we predicted, will never reach the mass market and has been reoriented exclusively toward the professional Quadro line, where price matters far less than in the consumer market. In all board makers' roadmaps the top NV30 chip has already been replaced by the NV35. Nevertheless, in our comparison table we list the NV30: much will change before the NV35 actually goes on sale.
FX family comparison table:
Level | High End | Middle | Low end |
Chip | NV30 | NV30 Ultra | NV31 | NV31 Ultra | NV34 | NV34 Ultra |
Description | FX 5800 | FX 5600 | FX 5200 |
Technology, micron | 0.13 | 0.13 | 0.15 |
Transistors, million | 125 | 75 | 45 |
Pixel pipelines | 4 | 2 | 2 |
Texture blocks | 8 | 4 | 4 |
Core frequency, MHz | 400 | 500 | 300 | 350 | 250 | 325 |
Memory bus frequency, MHz | - | - | - |
Memory bus width, bit | 128 (DDR II) | 128 (DDR) | 128 (DDR) |
RAMDAC, MHz | 2×400 | 2×400 | 2×400 |
Power supply | Mandatory | Desirable | Optional |
Note: The recommended core and memory frequencies for NV31 and NV34 chips have changed several times.
Today we present benchmarks only for the nVidia GeForce FX 5200 (NV34); an article about the mid-range GeForce FX 5600 (NV31) will follow soon.
NV34 Chip Differences
Currently, nVidia is shipping a line of three FX-family GPUs for three market sectors, each chip coming in two versions, regular and Ultra (the lineup is listed at the top of this article).
As you can see, nVidia has stopped using the MX label for its low-end products. You can forget about MX…
Reducing the cost of the chip and card requires sacrifices. The main differences between the NV34 and the older NV30, in other words what the NV34 lacks, are listed at the top of this article.
nVidia GeForce FX 5200 chip
The chip has 45 million transistors and is manufactured using a 0.15 micron manufacturing process.
The GeForce FX 5200 has 2 pixel pipelines and 4 texture units. However, all this is somewhat conditional: today it is difficult to judge a video card by pipeline counts alone, because the driver configures the chip's operation separately for each individual scene of a particular game.
In general, we can say that the 3D capabilities of the NV34 do not differ from those of the NV30/31. The NV34 supports the DirectX 9 API and hence Shaders 2.0 and 2.0+. However, there is a difference: the GeForce FX 5200 chip does not have IntelliSample optimization.
The memory interface of the GeForce FX 5200 also differs from that of its older brother, the NV30. The chip uses a standard DDR memory controller, which theoretically will lead to a significant drop in performance, especially when anisotropic filtering and full-screen anti-aliasing are enabled.
The chips of the GeForce FX 5200 family do not support HDTV, but they do include a built-in TV encoder, a TMDS transmitter, and two integrated 350 MHz RAMDACs, although today that no longer surprises anyone.
Over its long development and short existence, the GeForce FX 5200 family has seen its «recommended» core and memory clocks change ten times. The problem is that the first tests of working samples showed such astonishingly low performance that releasing cards in that state would have been not merely unreasonable but commercially fatal. We hesitate to publish those very first results, as it would be unfair to nVidia, but believe us, they were significantly below the MX440. The reason lies above all in unpolished Detonator drivers for a completely new chip architecture and in the genuinely low clock speeds of the first chips. Gradually the situation leveled out: the recommended frequencies were raised, each new driver version lifted the cards' performance, and chip yields reached reasonable technological norms. In our comparison table at the beginning of the article we give the «original» core and memory clock speeds; what you can actually buy in your nearest store may differ considerably, since video card manufacturers are not shy about varying these values over a wide range.
Direct competitors of the new chips are the Radeon 9000 and 9000 Pro, which will soon be replaced by the Radeon 9200 and 9200 PRO.
Video card Daytona GEF FX5200
Video cards from this manufacturer have always been characterized by low prices and mediocre workmanship.
The video card has an AGP x2/x4/x8 interface. The layout is non-standard, which is not at all strange: boards from Daytona usually have their own layout and design. Cooling is standard and passive, a medium-sized needle heatsink.
The effectiveness of such cooling on this chip is very debatable, since the video card still got quite hot, and during overclocking it is strongly recommended to replace it with a normal fan. The memory chips are not covered by radiators.
The video card has 128 megabytes of memory with a 6ns access time. Memory manufactured by PMI is marked as HP58C2128164SAT-6. Here is the explanation for the lack of cooling on the memory — the slow 6 ns memory does not heat up.
Daytona GEF FX5200 core and memory frequencies are 250 MHz and 150 (300DDR) MHz respectively. Attention! The memory frequency of this board is lower than it should be according to the latest nVidia recommendations (recommended frequency is 200 (400DDR) MHz).
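The practical cost of that below-spec memory clock is easy to quantify. A sketch, with the 128-bit bus assumed from the FX 5200 reference spec:

```python
BUS_BYTES = 128 // 8  # 16 bytes per transfer on a 128-bit bus

spec_gb_s = BUS_BYTES * 400e6 / 1e9      # 400 MT/s as recommended by nVidia
daytona_gb_s = BUS_BYTES * 300e6 / 1e9   # 300 MT/s as shipped on this board
deficit = 1 - daytona_gb_s / spec_gb_s   # fraction of peak bandwidth given up

print(round(spec_gb_s, 1), round(daytona_gb_s, 1), round(deficit, 2))  # 6.4 4.8 0.25
```

In other words, the Daytona board gives up a quarter of the reference card's peak memory bandwidth before any benchmark is run.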
The card has a standard set of outputs: analog, digital (DVI) and TV-OUT. TV-OUT is implemented using the GeForce FX 5200 chip itself, as it already has integrated tools for implementing TV-OUT.
Sometimes the card comes in a retail box, but the bundle is the same as the OEM version: the Daytona GEF FX5200 video card and a driver disk.
Testing
Test stand: | |
Motherboard: | FIC AD11 (AMD 761+ VIA 686B) |
Processor: | AMD Athlon XP 2000+ (Tbred) |
Memory: | 512MB PC2100 NCP DDR SDRAM CL 2.0; |
Hard disk: | Maxtor Fireball 3 30 GB; |
Video cards: | Daytona GEF FX5200 (nVidia GeForce FX 5200); Chaintech (nVidia GeForce 4 MX480); Sapphire Radeon 9000 (ATI Radeon 9000) |
Operating system, test programs and drivers | |
OS | Microsoft Windows XP SP1 |
Driver for ATI graphics cards: | Catalyst 3.1 |
Driver for nVidia video cards: | Detonator 41.09 |
Test programs: | — MadOnion 3DMark2001 SE; — MadOnion 3DMark2003; — Codecreatures; — Unreal Tournament 2003; — Serious Sam The Second Encounter (OpenGL). |
Test results
Unreal Tournament 2003
The FX 5200 loses significantly both to its competitor, the ATI Radeon 9000, and to its predecessor, the GeForce4 MX480; the Radeon 9000 and GeForce4 MX480 show approximately equal results.
In the botmatch test the standings are unchanged, though the gap between the nVidia GeForce FX and the Radeon 9000 / GeForce4 MX480 has narrowed; it is still not small, 4-8 fps on average.
Serious Sam The Second Encounter
The nVidia GeForce FX lags behind the ATI Radeon 9000 and loses even more clearly to the nVidia GeForce4 MX480. The GeForce4 MX480 wins this test, ahead of its competitor, the Radeon 9000, by a fairly large margin.
3D Mark 2001 SE
nVidia GeForce FX lags behind ATI Radeon 9000 and loses to nVidia GeForce 4 MX480.
3D Mark 2001 detailed results
Game Test 4 — Nature
Test results for the nVidia GeForce FX are almost identical to the ATI Radeon 9000's. The nVidia GeForce4 MX480 fails this test due to its lack of pixel shader support.
Fill rate
The fill rate tests, in which the nVidia GeForce FX 5200 came out clearly ahead, made us doubt that FX 5200 based video cards really use a 2×2 pixel pipeline design.
High poly count — 8 lights
The results are clearly not in favor of the nVidia GeForce FX 5200.
Vertex program speed
Here the standings repeat: 1st place for the nVidia GeForce4 MX480, 2nd for the ATI Radeon 9000, and 3rd for the nVidia GeForce FX 5200.
Pixel program speed
The nVidia GeForce FX 5200 does much better with pixel programs.
Speed of advanced pixel programs
There is no confusion with the results here: the nVidia GeForce FX 5200 is very far behind its competitor, the ATI Radeon 9000. The point is that video cards optimized for DirectX 9 support pixel shaders version 1.4; if a card does not support 1.4, this test falls back to pixel shaders version 1.1, which require more passes. It is not yet clear what causes this result, whether the driver or the 3DMark test package itself.
Codecreatures
This benchmark uses DirectX 8.1 generation pixel shaders, and the nVidia GeForce FX handles them better than the ATI Radeon 9000 does. The gap is not large, but it is there. The nVidia GeForce4 MX480 cannot run this test at all, as it does not support pixel shaders.
Codecreatures — average number of polygons
These results let us compare the average number of polygons rendered per second.
Image quality
The nVidia GeForce FX 5200 is slightly behind the nVidia GeForce4 MX480; things would have looked much better had the FX 5200 not been stripped of the IntelliSample engine. The ATI Radeon 9000's results could not be captured due to ATI driver issues with Unreal Tournament 2003.
Conclusion
The GeForce FX 5200 leaves two impressions. On the one hand, its performance is no better than the GeForce4 MX's, and is sometimes much lower than that of both its competitors and its predecessor.
On the other hand, it has a low price of ~$90 (though the price should drop further, since cards based on this graphics chip can really compete only with the GeForce4 MX and the Radeon 9000), good DirectX 9 optimizations and excellent pixel shader performance, even if there were incidents such as the Advanced Pixelshaders results, which are most likely driver problems and should be fixed soon.
In general, the FX 5200 is the previous MX with more functionality in the form of DX9 support. But who will buy a weak video card for demanding DX9 games that do not yet exist, only to end up watching a slideshow? Gamers do not need it, they need more serious solutions, and everyone else does not really need DX9 support.