
Why do I not fry a GPU when its PCIe power is connected to a second PSU?


Asked · Modified 5 years, 7 months ago · Viewed 776 times

I see lots of Bitcoin/altcoin mining rigs powering their graphics cards from a second power supply. They simply connect the second PSU to the card's PCIe power connectors, without tying the two PSUs' grounds together. Is this really the correct way to do it? I have a bit of a background in electrical engineering and was wondering how the two grounds are separated in a graphics card's design. Can someone explain this to me?

  • power-supply
  • gpu







It depends on the voltage differences of the 5 V power combined into the card and the ESR of the entire path. The load is R = V²/P = 100 mΩ, and thus the source impedance is about 1% of this. Your wiring might be another 2 to 5% of that 100 mΩ, so doubling up cables from the same supply improves things.

BUT if you had two different PSUs with ΔV = 1% and connected them in parallel with a total loop cable resistance of, say, 50 mΩ, then 1% × 5 V / 50 mΩ = 1 A of circulating current flows between the supplies rather than into the load. It can get worse: the slope is not linear at higher currents, and with a bigger voltage difference, or a lower-resistance cable at 10 mΩ, you get five times the circulating current.
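As a sketch, the circulating-current estimate above is just Ohm's law applied to the setpoint mismatch. The function name is ours; the figures are the ones used in this answer:

```python
# Circulating current between two paralleled supplies: the setpoint
# mismatch divided by the total loop (cable + connector) resistance.

def circulating_current(v_nominal, mismatch_fraction, loop_resistance_ohms):
    """Current that flows between the supplies instead of into the load."""
    delta_v = v_nominal * mismatch_fraction
    return delta_v / loop_resistance_ohms

print(circulating_current(5.0, 0.01, 0.050))  # ~1 A, as in the answer
print(circulating_current(5.0, 0.01, 0.010))  # ~5 A with 10 mOhm cabling
```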

















graphics card — Did I fry my motherboard or my power supply?


Asked · Modified 9 years, 8 months ago · Viewed 12k times

I just installed a second video card (GeForce 7900 series) alongside my older GeForce 6700 Series. After the computer was powered on for about a day, I came back to find the unit dead, with the power button not responding at all. After changing the plug from a power strip to a different wall socket, it seemed to partially power on, with these results:

  • Two of the three case fans were operational
  • The motherboard’s green LED was on
  • The GeForce 6700’s fan was operational
  • The DVD drive made standard bootup sounds

However, the CPU fan was not on, one case fan was not on, and the GeForce 7900’s fans were not operating. Additionally, after pressing the power button to turn the computer on, it could not be turned off by anything except flicking the PSU’s power switch.

Here are the machine specs:

  • Rosewill Capstone-650 (650 watt power supply)
  • Asus P8B75-V motherboard
  • Two graphics cards (as described)

I removed both video cards, but nothing changed. I have a strong feeling I fried some combination of the motherboard, the PSU, and possibly the graphics cards — possibly because my PSU wasn’t powerful enough to handle the load. Can anyone shed any insight on this?

  • graphics-card
  • motherboard
  • power-supply
  • fan







Most likely it’s the power supply.

Video cards can draw a lot of current from the 12 V rails. Some cheaper, lower-wattage power supplies can't deliver it, especially with a two-card setup like yours. So I'd start by looking there. First remove all the cards and see if the motherboard powers on (that will verify a card hasn't come loose and isn't shorting the PCIe bus).

If not, have a look at one of the many how-tos on the web for testing your power supply,
e.g. http://pcsupport.about.com/od/toolsofthetrade/f/powersupplytest.htm If you don't have a tester, open it up and check the fuses. See below:

Personally, I think 650 W is too low for a two-card setup (particularly with those cards; they are toward the high end of current draw). You should have at least 800 W. For a three-card setup, 1200 W+ is recommended.

I should add that most power supplies have overload or short circuit protection.
See: http://en.wikipedia.org/wiki/Power_supply

Overload protection

Power supplies often have protection from short circuit or overload
that could damage the supply or cause a fire. Fuses and circuit
breakers are two commonly used mechanisms for overload protection.[6]

A fuse contains a short piece of wire which melts if too much current
flows. This effectively disconnects the power supply from its load,
and the equipment stops working until the problem that caused the
overload is identified and the fuse is replaced. Some power supplies
use a very thin wire link soldered in place as a fuse. Fuses in power
supply units may be replaceable by the end user, but fuses in consumer
equipment may require tools to access and change.

A circuit breaker contains an element that heats, bends and triggers a
spring which shuts the circuit down. Once the element cools, and the
problem is identified the breaker can be reset and the power restored.

Some PSUs use a thermal cutout buried in the transformer rather than a
fuse. The advantage is it allows greater current to be drawn for
limited time than the unit can supply continuously. Some such cutouts
are self resetting, some are single use only.

If your PSU appears to be dead, carefully remove the screws (CAUTION! High voltage; be very careful what you touch). You may see some fuses inside. Are they still intact?
If not, what likely happened is that too much current was drawn and a fuse popped. You can replace the fuse and use the PSU in another machine. Now go out and buy a higher-wattage PSU!







What brand is the power supply? As with most equipment, budget brands tend to over-rate capacity, measuring only under perfect conditions. If something in it has blown, you may be able to detect a faint smoky smell (worth checking). It is also worth using a digital multimeter to confirm your voltages are in spec, usually +/-5% of the rated voltage. Connect the black probe of your meter to a black wire on a Molex connector and the red probe to the red wire, and then to the yellow wire. You should see roughly 4.9 to 5.1 volts, and 11.8 to 12.2 volts. If a voltage is far off from there, it's probably a bad supply, or another component causing excessive draw.
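As a quick sketch, the multimeter check above amounts to comparing each reading against a tolerance band (the function name is ours; the ±5% default is this answer's rule of thumb):

```python
# Tolerance check for multimeter readings on PSU rails.
# Default band is +/-5% of nominal, per the answer above.

def rail_ok(measured, nominal, tolerance=0.05):
    """True if a measured rail voltage is within tolerance of nominal."""
    return abs(measured - nominal) <= nominal * tolerance

print(rail_ok(5.02, 5.0))   # True: well inside the 5 V band
print(rail_ok(11.1, 12.0))  # False: more than 5% below 12 V
```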

















AMD FreeSync technology can be used with NVIDIA GeForce graphics cards

And then everything is relatively simple. You must connect the monitor to the Radeon graphics card (or to the motherboard if using an APU), and in the BIOS graphics settings select the AMD GPU as the primary one. Next, enable FreeSync in Radeon Software Adrenalin Edition, then go to the NVIDIA Control Panel and force the game or application to use the GeForce graphics card by default.

And the next time you start the game, it should use the NVIDIA graphics card, while FreeSync keeps the picture output smooth. In essence, this approach has the NVIDIA graphics card doing all the graphics processing, while the AMD GPU handles the FreeSync function.

Our Western colleagues have already verified this trick, and they note that despite routing the GeForce card's output "through" the Radeon card for FreeSync operation, it does not significantly affect performance in games. Although there are occasional compatibility issues, such as with Unigine's Valley and Deus Ex: Mankind Divided, FreeSync "works great" in "a number of other games."

This feature will be of interest not only to owners of AMD hybrid processors using NVIDIA discrete graphics, but also to many other users. After all, NVIDIA positions its G-Sync technology as a premium feature, and monitors supporting it are by no means the most affordable. FreeSync, in turn, is a derivative of the VESA Adaptive-Sync standard, which is part of the specifications for video connectors such as DisplayPort 1.2 and later. So AMD FreeSync support is found even in very budget monitor models.

Therefore, this method of using FreeSync on NVIDIA video cards opens up good prospects for mainstream NVIDIA users who cannot afford a G-Sync monitor. That said, NVIDIA may well try to close this loophole in a future driver.


Permanent URL: https://3dnews.ru/974617

Tags:
amd, nvidia, graphics card, freesync, pascal



Using FreeSync with NVIDIA graphics cards — i2HARD

Articles

Evgeny Serov

January 20, 2019

To the surprise of many, Nvidia has stepped back from its G-Sync stronghold and allowed Nvidia GPU owners to use adaptive sync with a wide range of FreeSync-enabled monitors. This feature was announced during CEO Jensen Huang’s keynote at CES 2019 and was included in the latest GeForce drivers this week.

You will probably object: "a wide range is too strong a phrase, because Nvidia only announced support for 12 monitors!" But actually that is not the case. The wording of Nvidia's announcement is a little misleading, so in this article we will try to clarify this point.

Nvidia’s driver support for adaptive-sync monitors has four levels. Yes, four…

First up is G-Sync Ultimate, the new name for G-Sync HDR.

G-Sync Ultimate certified monitors have built-in Nvidia G-Sync HDR and support the full range of HDR features. G-Sync Ultimate monitors include the Acer Predator X27, Asus ROG Swift PG27UQ, and the new HP Omen X Emperium 65.


Photo: Tim Schiesser, techspot.com

Then — the usual G-Sync.

These are the monitors that have served us for many years and include a G-Sync module, but do not support G-Sync HDR. They are adaptive-sync monitors that work only with Nvidia graphics cards, and they are more expensive than their FreeSync counterparts.


Photo: Tim Schiesser, techspot.com

Next, we move on to "G-Sync compatible" monitors.

These are the FreeSync monitors that meet all of Nvidia’s stringent requirements. They don’t have a G-Sync module, but they do support the VESA Adaptive Sync standard, so they also work with AMD GPUs. The latest Nvidia drivers allow these certified monitors to work with adaptive sync on Nvidia GPUs by default.

So far, Nvidia has announced 12 monitors that are fully G-Sync compatible, you can see a list of them in the picture above. If you own any of these monitors and you have installed the latest Nvidia driver, adaptive sync will be enabled automatically and you can use it just like you would with any G-Sync monitor.

According to Nvidia, G-Sync compatibility is still inferior to regular G-Sync. The table below shows that G-Sync monitors are certified with a wide range of image-quality tests, a full variable refresh rate (VRR) range, Variable Overdrive, and factory color calibration. However, nothing prevents a G-Sync compatible monitor from also being factory calibrated or having a full VRR range. It's just that Nvidia does not require these of the manufacturer in order to grant the "G-Sync compatible" sticker, while G-Sync monitors must have these features.


Photo: Tim Schiesser, techspot.com

And finally, the fourth level, which Nvidia mentions only in passing, is the ability to use any FreeSync or VESA Adaptive Sync monitor with an Nvidia graphics card using a switch in the Nvidia control panel.

According to Nvidia, this applies to those “VRR monitors that have not yet been tested for G-Sync compatibility,” and when the feature is enabled, “it may work, or it may work partially or not at all.” Of course, only certified monitors are guaranteed to work, and based on Nvidia’s results — 12 supported monitors out of 400 tested — your chances may look bleak, however the drivers don’t limit you and in fact all monitors with adaptive sync are supported. All you have to do is enable the feature in the settings.

During its keynote at the show, Nvidia tried to convince people that their G-Sync compatible monitor certification program is necessary, as uncertified monitors are allegedly rife with problems. They showed flickering and dimming monitors, trying to tarnish the entire FreeSync ecosystem with these examples. Nvidia states that G-Sync compatible monitors they have certified do not have these issues, while any non-certified monitors may.


Photo: Tim Schiesser, techspot.com

As soon as we saw this, we immediately considered this statement to be nonsense. Simply because the problems demonstrated have nothing to do with the FreeSync or VESA Adaptive Sync standards; they are not problems inherent in the technology. These are just problems related to the quality of monitors. It’s no secret that some FreeSync monitors, especially early models, aren’t great and do have flaws like flickering even on AMD GPUs.

But these monitors are just garbage. It seems to us that if you buy a flickering or fading monitor, then such a defective product is simply eligible for return. Of course, there is a chance that adaptive-sync monitors that work great on AMD GPUs will have some issues on Nvidia GPUs. And then one could reproach Nvidia for the inappropriate implementation of adaptive sync support, but, as with all implementations, some errors and problems are always possible.

Enable G-Sync on a FreeSync monitor


Photo: Tim Schiesser, techspot.com

First, let's look at exactly how to enable adaptive sync support for non-certified monitors. Open the Nvidia Control Panel, go to the "Set up G-Sync" section and select your FreeSync monitor. Then make sure both boxes are checked: "Enable G-Sync, G-Sync Compatible" and "Enable settings for the selected display model". The second checkbox will not appear if your monitor is G-Sync certified. Then click "Apply" and your monitor will switch to adaptive sync.

In some cases, you may also need to go to the global 3D settings (Manage 3D settings) and select "G-Sync Compatible" from the "Monitor technology" drop-down list, but this was not required on all the monitors we tested. It's also important to note that FreeSync must be enabled on the monitor itself (usually via the OSD). Some monitors have a switch to enable or disable FreeSync or adaptive sync; set it to "on".

And one last remark. Unlike G-Sync monitors, which work with Nvidia GPUs as far back as the GeForce 600 series, G-Sync compatible and FreeSync monitors only work with Nvidia GTX 10-series cards or later. We did most of our testing with a GeForce RTX 2080 Ti, but Pascal and later cards should work as well. We believe this is because Pascal is the first GPU architecture to support both adaptive sync and G-Sync, while older architectures only supported G-Sync.

Testing FreeSync on GeForce

So far, we've tested seven FreeSync monitors with Nvidia GPUs. All of these monitors have been tested by us before and work flawlessly with AMD GPUs: no flickering, blanking, or other problems. With over 500 FreeSync monitors on the market, we'd love to test more, but for now these seven are what we have available. Still, that's a reasonable sample size for today.

The goal of testing was to see if there are any differences between enabled and disabled adaptive sync with an Nvidia GPU, and if there are any differences when compared to FreeSync on a monitor connected to an AMD GPU. This included testing the monitor across the frame rate range to see how it performs within and outside the refresh rate range.


Photo: Tim Schiesser, techspot.com

The first monitor we tested was the Acer KG251QF, a budget 24-inch 1080p monitor with a refresh rate range of 30 to 144 Hz. It's a great monitor for the price, and it's safe to say we didn't find any issues with it with adaptive sync enabled on the Nvidia GPU. No flickering, no blanking, nothing. It worked exactly the same as when connected to an AMD GPU. So, a pass.

The second monitor was the BenQ EL2870U, a 60 Hz 4K panel with a narrow refresh rate range of 40 to 60 Hz. This monitor also worked great, although the range is too narrow to support low frame rate compensation (LFC). So when the frame rate dropped below 40 FPS, adaptive sync no longer functioned and there was tearing or lag, depending on whether Vsync was on or off. This behavior was expected and occurs with AMD GPUs as well. Therefore, we again rate the result a success.

The next monitor was the Viotek GN24C, another 24-inch 1080p monitor, but this time with a VA panel and a refresh rate range of 48 to 144 Hz. Again, this monitor worked great, and thanks to its wide refresh rate range, it also supported LFC.

Low Framerate Compensation (LFC) is a feature we were especially curious to see working properly on GeForce GPUs. If Nvidia had been lazy, it could simply have disabled adaptive sync when the frame rate drops below the 48 Hz lower bound of the refresh range. That is not the case: Nvidia properly supports LFC on monitors that have it, so when the frame rate drops below 48 FPS, the monitor runs at a multiple of the frame rate, duplicating frames.

For example, if the game’s frame rate is 37 FPS, the monitor will refresh at 74 Hz, showing each frame twice. So it’s good to see that one of the key adaptive sync features that Nvidia has successfully implemented for G-Sync monitors also works here without any restrictions.
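The frame-duplication behaviour described above can be sketched in Python. This is our own illustrative sketch, not Nvidia's actual driver algorithm; the function name and the clamping logic are assumptions:

```python
# Frame duplication as described above: below the VRR floor, show each
# frame an integer number of times so the refresh rate lands back in range.
# Note: LFC is commonly said to require vrr_max >= 2 * vrr_min.

def lfc_refresh(fps, vrr_min, vrr_max):
    """Refresh rate the monitor runs at for a given game frame rate."""
    if fps >= vrr_min:
        return min(fps, vrr_max)  # in range: refresh simply tracks the game
    multiple = 2
    while fps * multiple < vrr_min:
        multiple += 1
    return fps * multiple  # each frame is shown `multiple` times

print(lfc_refresh(37, 48, 144))   # 74 Hz: every frame shown twice
print(lfc_refresh(100, 48, 144))  # 100 Hz: no duplication needed
```

This also illustrates why the narrow-range panels above (40 to 60 Hz and 48 to 60 Hz) cannot support LFC: for a frame rate just below the floor, no integer multiple lands inside such a narrow range.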


Photo: Tim Schiesser, techspot.com

Two other monitors performed similarly: the AOC C27G1, a 27-inch 1080p monitor with a refresh rate of 48 to 144Hz, and the Viotek GN32LD, a 32-inch 1440p monitor, also with a refresh rate of 48 to 144Hz. Both monitors worked perfectly, including in LFC mode.

Then came the Philips Momentum 43, a 43-inch 4K monitor with a 48 to 60 Hz refresh rate and HDR support. It doesn’t have LFC due to the screen’s narrow refresh rate range, but otherwise this panel worked as expected with no issues.

We are pleased to note that adaptive sync support works even with HDR enabled; choosing one or the other doesn’t matter, HDR doesn’t affect adaptive sync capabilities, which is good news for those who are interested in an HDR monitor but don’t want to buy a G-Sync Ultimate display.

One of the FreeSync monitors we tested refused to work with adaptive sync on an Nvidia GPU, but that’s not a big surprise. The fact is that the Viotek NB24C only supports adaptive sync over HDMI, while Nvidia GPUs only support adaptive sync over DisplayPort. AMD GPUs can perform adaptive sync over both HDMI and DisplayPort, so this monitor has adaptive sync with AMD GPUs, but not with Nvidia GPUs.

The lack of adaptive sync over HDMI will also disappoint those looking to pair an Nvidia GPU with the range of FreeSync-enabled TVs that have hit the market in recent years. Most FreeSync TVs only have HDMI ports, so once again Nvidia GPU owners are left out.


Photo: Tim Schiesser, techspot.com

So, out of the seven monitors we tested, six performed flawlessly, and the seventh could not use adaptive sync at all because it only supports FreeSync over HDMI, which Nvidia does not support. We were also able to confirm that low frame rate compensation (LFC) and HDR work with adaptive sync on Nvidia GPUs just as they do on AMD GPUs.

We believe our test results will hold true for the vast majority of FreeSync monitors. If the monitor is known to work fine with AMD GPUs over DisplayPort — and has no flickering issues — it should also work fine with Nvidia GPUs with the option enabled. The reverse is also true — if a monitor has problems with the Nvidia GPU, it probably has problems with the AMD GPU, and such a monitor should be returned to the store.

Would the monitors that passed our testing qualify for "G-Sync compatible" certification? It's hard to tell without knowing Nvidia's strict testing guidelines. However, any monitor without LFC support automatically fails, and there are many such FreeSync monitors, including two we tested today. But it's important to emphasize that you don't need to purchase a "G-Sync compatible" monitor to get adaptive sync with your Nvidia graphics card. A "G-Sync compatible" monitor guarantees proper operation and support for features such as LFC, but ordinary FreeSync monitors will also work just fine.

As for weak points: we don't have any of those twelve "G-Sync compatible" monitors to test and compare, but having tested many G-Sync monitors before, we got the same great results with these non-certified FreeSync monitors. The "G-Sync compatible" badge simply guarantees a level of quality that you won't get with, say, the cheapest FreeSync models.

If you are interested in the issue of input lag, then we did not find a noticeable difference in its value between enabled and disabled adaptive sync on Nvidia GPUs. Enabling adaptive sync does not appear to increase processing time on the part of the GPU, be it Nvidia or AMD.

You’re not limited to just one display that supports adaptive sync — you can have multiple displays connected to the same Nvidia GPU, but adaptive sync will only work on one at a time. This is unlikely to be a problem for most people, but let’s say you have two games running on two different adaptive-sync displays, in which case only one of those monitors will receive an adaptive-sync signal from the corresponding game.

We haven't tested any FreeSync 2 monitors with Nvidia GPUs. FreeSync 2 is an AMD-exclusive HDR pipeline that lets the game communicate directly with the monitor for the lowest-latency HDR processing, and thus this feature is unlikely to work with Nvidia GPUs. However, as we mentioned, that won't stop regular HDR from working in conjunction with adaptive sync on Nvidia GPUs. So those who have bought or are about to buy a FreeSync 2 monitor will still get HDR functionality, just not FreeSync 2 HDR, which only a few games support anyway.

In summary, Nvidia's decision to support FreeSync is a good thing for consumers and the industry as a whole. It gives owners of the most popular GPUs on the market a far wider choice of adaptive-sync monitors. Why pay extra for a G-Sync module when an equivalent FreeSync monitor exists? Just make sure, before worrying about VRR technology, that the gaming monitor you buy really meets a high quality standard.