How to Overclock the GeForce FX 5200
flamer2204
Posts: 28 +0
#1
I was just wondering how I could overclock my GeForce FX 5200.
Thanks.
AMD2800+
Posts: 50 +0
#2
Ditto that. What program should you use, and how far can you safely go over stock settings?
dopefisher
Posts: 436 +0
#3
Go to guru3d.com and download one of the GeForce tweak utilities (GeForce Tweak, RivaTuner, etc.). I have overclocked mine to 301/422 safely; it simply will not go any higher. And it doesn't help at all; in fact it probably hurts it. There is an option to keep your fan on at all times, however.
vnf4ultra
Posts: 1,360 +2
#4
Search "coolbits" on Google and find some info on that. It's a registry change that lets you change your core and memory speeds in the graphics properties box. You can usually OC until you get artifacts (odd colored jaggies in games), but make sure your card doesn't overheat or it will die.
It might be easier to just download RivaTuner.
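For reference, the Coolbits tweak is a single registry value. On Detonator-era drivers, a .reg file along these lines enabled the extra clock-frequency page; the exact DWORD value varied between driver versions, so treat this as a sketch rather than a guaranteed recipe:

```reg
Windows Registry Editor Version 5.00

; Hypothetical example: value 3 enabled the clock controls on many
; Detonator-era drivers. Back up the registry before merging this.
[HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\NVTweak]
"CoolBits"=dword:00000003
```

Double-clicking the file merges the value; the clock sliders then appear under the driver's advanced display properties.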
AtK SpAdE
Posts: 1,343 +6
#5
Hooray for Coolbits!
My brother has a 5200 and I OCed it for him. I used Coolbits. You can find out about Coolbits here.
Sean
flamer2204
Posts: 28 +0
#6
How do I prevent my card from overheating?
tbrunt3
Posts: 308 +0
#7
Well, the key is to use Coolbits and test the settings. Don't overclock the card much. Go to the clock frequency settings (you will see this on the left once Coolbits is installed), move the slider to 3D, then hit "Detect optimal settings"; the card will detect the settings it can run at. You need to have proper cooling; I suggest getting a PCI slot fan and placing it in your PCI slot. DO NOT go crazy overclocking this card, it can burn up.
flamer2204
Posts: 28 +0
#8
Is there software that can show what the card's temperature is? What should it be normally?
AMD2800+
Posts: 50 +0
#9
Efftastic, people, thank you very much. Cooling will be an issue for me, I think, because the CPU (see name) already hits 65 °C under load and the case itself hits 40 °C+. I'd better look at that.
AMD2800+
Posts: 50 +0
#10
Actually, I'll go for RivaTuner because I really don't want to touch the registry.
tbrunt3
Posts: 308 +0
#11
AMD2800+ said:
Actually, I'll go for RivaTuner because I really don't want to touch the registry.
You don't; all you do is install it, it's that simple. Then you can unlock the clocks. It is a really good program.
rskjackson
Posts: 23 +0
#12
Excuse my lack of knowledge, but what does overclocking your video card do, and why would you want to do it? Just curious…
vnf4ultra
Posts: 1,360 +2
#13
Well, overclocking makes your graphics card run faster. It's like getting a little graphics upgrade for free, but it can fry the card if it gets too hot, and it also voids the warranty.
rskjackson
Posts: 23 +0
#14
I have downloaded the RivaTuner utility as mentioned above; will that help me keep the card from overheating and find the proper settings, as described? Thanks…
rliu
Posts: 55 +0
#15
With the Coolbits thing, am I right to presume that the card won't overheat if you select "detect optimal settings"? It's an option available in the more recent NVIDIA drivers.
dopefisher
Posts: 436 +0
#16
OK, I have OCed my card to 306/424 with Coolbits 2.0 from guru3d.com. Like I said, I don't think it'll help, and it seems to slow it down, but with all this talk I'll try it out again.
vnf4ultra
Posts: 1,360 +2
#17
Run a benchmark before and after you oc. Aquamark 3 is what I use.
dopefisher
Posts: 436 +0
#18
vnf4ultra said:
Run a benchmark before and after you oc. Aquamark 3 is what I use.
Here's my two cents about Aquamark: on the first option in Aquamark, I get about 11,200 with the OC and 10,987 without. Is it worth OCing?
dopefisher
Posts: 436 +0
#19
OK, sorry about the number of posts, but I'm so excited, lol. Got my FX 5200 up to 324/424!
vnf4ultra
Posts: 1,360 +2
#20
Yeah, that isn't much improvement, less than 1 fps. But getting over 10,000 for a 5200 is good; I think many are below 10,000. On my 6600GT, OCing from 500/1,000 to 564/1,160 only increased my frame rate by 3 fps. So I went back to stock; it's fast enough at stock anyway.
tbrunt3
Posts: 308 +0
#21
Aquamark is a good program; that's what I got 39,574 with, and I only tweaked a little bit. That's a great score for a socket A. To answer your question, overclocking is meant to increase frame rate; sometimes it helps, but if you don't gain over 10% it is not worth it.
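That 10% rule of thumb is easy to check against the Aquamark scores posted earlier in the thread; the helper below is just illustrative arithmetic, not part of any tool mentioned here:

```python
def gain_percent(stock_score, oc_score):
    """Relative benchmark improvement from overclocking, in percent."""
    return (oc_score - stock_score) / stock_score * 100

# Aquamark runs quoted above: 10,987 stock vs. roughly 11,200 overclocked.
print(round(gain_percent(10987, 11200), 1))  # 1.9 -- far below the 10% mark
```

By that yardstick, a sub-2% gain is not worth the extra heat.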
Gary King
#22
What are your 3DMark2001 scores, though?
Isn't Aquamark for more high-end systems? That's the good thing about 3DMark: it has three different versions.
dopefisher
Posts: 436 +0
#23
Gary King said:
What are your 3DMark2001 scores, though?
Isn't Aquamark for more high-end systems? That's the good thing about 3DMark: it has three different versions.
If by versions you mean three tests, Aquamark has three different types of tests. I was pleased with the program. I didn't really like 3DMark03, and I never tried 01.
tbrunt3
Posts: 308 +0
#24
Gary King said:
What are your 3DMark2001 scores, though?
Isn't Aquamark for more high-end systems? That's the good thing about 3DMark: it has three different versions.
Both are good programs. My 03 score is in my sig; it's a good score for a socket A with a 333 FSB CPU overclocked a little. I just have a very good video card, an FX 5800 Ultra; they don't make them anymore. I haven't run 01 in a long time.
bushwhacker
Posts: 766 +1
#25
My video card, a FORSA GeForce FX 5200, is capable of clocking at 310/475, but I decreased it to 305/455 due to the heat from the stock fan. I prefer to keep it at the stock 250/400. I used TaiwanTech PowerStrip v3.14, a very nice third-party utility!
Lucan
P.S. 3DMark2001 SE Professional score = 12,503
Extreme overclocking of the GeForce FX 5200 (WinFast A340 TDH) from Leadtek
This time for extreme overclocking, I chose the GeForce FX 5200 (WinFast A340 TDH) from Leadtek. I liked this video card because it has Samsung memory with an access time of 4 ns. The nominal core/memory frequencies are 250/400 MHz; the company itself allows for a 100 MHz increase in memory frequency, and it would be foolish not to use it. I was also interested to find a temperature sensor between the GPU and the cooler, and three wires running from the cooler to the board, which means it has a tachometer. The CD supplied with the video card contains the WinFox II utility, Leadtek's own control panel. Describing all of its functions would take up a lot of space, but among other things there is dynamic monitoring of the temperature and voltages on the GPU and memory, fan speed readout, and overclocking functions. Otherwise the card is not much different from comparable ones: support for AGP 8x and DirectX 9, connectors for external DVI and VGA monitors, and TV-out (S-video).
In a stylish box with corporate design, along with the board, I found: a description, three CDs with games and drivers, an S-video cable, an RCA cable, an S-video to RCA adapter, and a DVI to D-Sub adapter.
First I tested our GeForce FX 5200 at the nominal core/memory frequencies of 250/400 MHz. Then, using the usual method with RivaTuner v2.0 RC14, I managed to reach 300 MHz on the core and 520 MHz on the memory. As tests, I ran 3DMark 2003 v3.3.0 and 3DMark 2001 SE with default quality settings, plus Quake3 Arena (demo FOUR) and UT2003 (flyby inferno), each three times. All tests were carried out at resolutions of 1024×768 and 1600×1200. The test stand was assembled from the following components:
- Motherboard — Abit NF7, BIOS version — 18.
- Processor — AMD Athlon XP 1700+, operating at 2241 MHz (166×13.5).
- Memory — 2 x 512 MB DDR 333 modules from Samsung.
- Hard drive — IBM DTLA 305020.
- Cooler — Thermaltake Volcano 7.
- Thermal grease — KPT-8.
- OS — Microsoft Windows XP (Detonator 45.23).
After all the usual means had been exhausted, I decided to increase the voltage on the processor and memory. The main thing is to find the voltage regulator. On this board it is built around the APW 7060 chip; it is a dual regulator designed specifically for video cards, supplying 1.5 V to the processor and 2.5 V to the memory chips.
By changing the parameters of the feedback resistor divider, you can change the supply voltage to the processor and memory. You can check these voltages with a multimeter on the legs of the corresponding electrolytic capacitors.
Now let's figure out who is responsible for what: the 6th leg regulates the voltage on the memory, the 9th the voltage on the processor. Solder two variable resistors (I used 22 kOhm each) between these legs and the nearest grounded points on the board. Before soldering, set the resistors to their maximum resistance. Then everything is very simple: set the voltage 10% above nominal and run the tests.
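The effect of such a mod can be estimated from the standard feedback-divider relation Vout = Vref × (1 + Rtop/Rlow): the added resistor sits in parallel with the lower divider leg, lowering Rlow, so the regulator raises its output to keep the FB pin at Vref. The reference voltage and divider values below are invented round numbers (the article does not give the actual divider around the APW 7060), chosen only so the stock output comes out at 1.5 V:

```python
def vout(vref, r_top, r_bottom, r_mod=None):
    """Regulator output for a feedback divider; r_mod is the variable
    resistor soldered from the FB pin to ground, in parallel with r_bottom."""
    r_low = r_bottom if r_mod is None else (r_bottom * r_mod) / (r_bottom + r_mod)
    return vref * (1 + r_top / r_low)

# Hypothetical divider: Vref = 0.8 V, Rtop = 8.75 kOhm, Rbottom = 10 kOhm.
print(round(vout(0.8, 8.75e3, 10e3), 2))        # 1.5  (stock)
print(round(vout(0.8, 8.75e3, 10e3, 22e3), 2))  # 1.82 (22 kOhm pot at full resistance)
```

Dialing the pot below 22 kOhm pulls Rlow down further and the output climbs, which is why the text insists on starting from maximum resistance.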
With the new voltages (1.7 V on the processor and 2.9 V on the memory chips), I managed to overclock the processor to 375 MHz and the memory to 580 MHz. But the very first test, 3DMark 2003 v3.3.0 (1024×768), showed a lower result than with normal overclocking, even though the frequencies had increased significantly. Something was slowing the card down from the inside. It seems the increased voltage caused excessive heating of the memory chips and the processor.
I had to change the overclocking method. Having chosen Quake3 Arena (demo FOUR) as the shortest of the tests, I first increased the frequency on the processor; as soon as the test started to fail, I gradually increased the voltage. The same operations had to be carried out with the memory. Each time, after increasing the frequency by 10 MHz, I ran the test and compared the new result with the previous one. When the resulting value once again turned out to be lower than the previous one, I stopped. The new overclocking results for the GeForce FX 5200 turned out to be more modest (core frequency 340 MHz at Vgpu = 1.57 V, memory 580 MHz at Vmem = 2.75 V), but the performance of the video card noticeably increased.
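The raise-by-10-MHz-until-the-score-drops procedure described above is a simple hill climb. It can be sketched as follows, with a toy stand-in for the Quake3 timedemo score (the real procedure obviously measures actual hardware):

```python
def find_best_clock(start_mhz, step_mhz, benchmark):
    """Raise the clock in fixed steps until the benchmark score drops,
    then return the last clock that still improved the score."""
    best_clock = start_mhz
    best_score = benchmark(start_mhz)
    clock = start_mhz + step_mhz
    while True:
        score = benchmark(clock)
        if score <= best_score:          # score fell: we are past the sweet spot
            return best_clock
        best_clock, best_score = clock, score
        clock += step_mhz

# Toy stand-in for demo FOUR: the score rises up to 340 MHz, then falls.
def fake_benchmark(mhz):
    return mhz if mhz <= 340 else 680 - mhz

print(find_best_clock(300, 10, fake_benchmark))  # 340
```

In the article, the same loop is interleaved with voltage bumps whenever the test starts failing, rather than stopping at the first instability.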
Not so long ago I tested the GeForce FX 5200 and GeForce4 MX440-8X from ASUS; it was planned that those products would be modded and retested, but, as usual, they were not at hand at the right time. Why two identical video cards (let's consider only the GeForce FX 5200 for now) showed such different results, I still find it difficult to answer. The platform on which testing was carried out remained the same. The new Detonator 45.23 may have affected the performance of Leadtek's product, but not to that extent. I had to change the BIOS version of the motherboard, since the FSB frequency floated with the previous version, which Doors4ever recently wrote about in detail. In fact, there were more questions than answers.
Extreme overclocking of the GeForce FX 5600 from SUMA
There is already an article on our website about extreme overclocking of the RadeOn 9600 Pro. Now it's the turn of the GeForce FX 5600. My choice fell on the product from SUMA due to the presence of two coolers, one installed on the front side of the board and the second on the back. The cooler on the reverse side does not do much for cooling the core, but it blows air over the adjacent heatsinks on the memory chips. To provide airflow to the back of the GPU, I would drill more holes in the aluminum base of the heatsink.
Another reason why I chose this particular card is that it is equipped with not the worst video memory from Samsung with an access time of 3.3 ns.
And briefly, the characteristics of the video card: nominal core/memory frequencies of 325/600 MHz, 128 MB of memory, a 128-bit memory bus, DirectX 9.0 and AGP 8x support, combined TV input/output (S-video), and DVI and VGA connectors. The package includes: a DVI to VGA adapter, an RCA cable, a combo-connector adapter (RCA/S-video input + RCA/S-video output), a CD with drivers, and a description.
The test bench consisted of the following components:
- Motherboard — Abit NF7, BIOS version — 17.
- Processor — AMD Athlon XP 1700+, operating at 2241 MHz (166×13.5).
- Memory — 2 x 512 MB DDR 333 modules from Samsung.
- Hard drive — IBM DTLA 305020.
- Cooler — Thermaltake Volcano 7.
- Thermal grease — KPT-8.
- OS — Microsoft Windows XP (Detonator 44.90).
Standard tests: 3DMark2003 v3.3.0, 3DMark 2001 SE, Quake3 Arena (demo FOUR) and UT2003 (flyby-inferno), all at default quality settings. The tests were carried out at two resolutions, 1024×768 and 1600×1200, and with 8x antialiasing and 8x anisotropic filtering enabled at 1024×768.
First the GeForce FX 5600 was tested at the nominal frequencies: core/memory 325/600 MHz. Then, using the usual, "non-extreme" overclocking, I managed to reach 380 MHz on the core and 680 MHz on the memory. After getting results from the card overclocked in the usual way, I proceeded to extreme overclocking. The essence of this process is to increase the voltage on the GPU and memory by about 10%, which makes it possible to raise the core/memory frequencies further.
There are two voltage regulators on the board: an ICL 6529 supplies the video processor with 1.3 V, and the 2.5 V for the memory comes from a HIP 6012. From the ICL 6529 description we find the contact we need (FB, the 9th leg of the chip), to which the resistive divider is connected. Between this contact and any nearby contact soldered to ground (for example, the 3rd leg of the same chip), we solder a variable resistor with a resistance of 22 kOhm. We carry out the same operation with the HIP 6012 (FB is the 5th leg, GND the 7th).
The resulting voltage can be monitored with a multimeter on the respective chokes or on the terminals of the electrolytic capacitors.
Before powering up the test bench, make sure the variable resistors are set to maximum resistance. To adjust the resistance, use a dielectric tool: touching the circuit with metal objects can disturb the stable operation of the power devices and distort the image on the monitor.
After trying different options, I got the following end result: GPU voltage — 1.6 V, memory voltage 2.8 V, maximum core/memory frequency — 435/690 MHz.
An interesting fact: even without any intervention in the circuitry of the SUMA GeForce FX 5600, when any 3D application starts, the voltage on the GPU immediately rises by about 10%. When the supply voltage is raised with the variable resistor, the voltage in 3D mode also increases, but its maximum value, as it turned out, is 1.7 V. That is, if you set the GPU voltage to 1.7 V in 2D mode after loading Windows, the value will not change when a 3D application starts.
Test results are shown below:
I recently wrote about testing the GeForce FX 5600 from CHAINTECH. I tried to keep the stand configuration the same, so now you can compare the results of the various tests at 1024×768. From the extreme overclocking of our GeForce FX 5600 I expected a more significant performance boost, though I was pleased with the outcome.