Temperature, power consumption & overclocking: AMD Radeon RX 580 vs. NVIDIA GeForce GTX 1060 OC: Mid-range GPU shootout
Page 4 of 5 — Temperature, power consumption & overclocking
Temperature and power consumption
Since both the cards we’re reviewing today use the same MSI cooler, their temperature figures should be quite comparable. The Radeon RX 580 ran 5°C hotter than the GeForce GTX 1060, which is probably unsurprising given its much higher power draw and TDP (the NVIDIA card has just a 120 watt TDP compared to the Radeon RX 580’s 155 watts).
The Radeon RX 580 also posted a total system power consumption figure of 300 watts, higher than even the Radeon RX 480.
As it turns out, these results are a testament to how efficient NVIDIA's Pascal architecture is: the GeForce GTX 1060 managed to run cooler and consume less power while offering similar performance.
Overclocking
We used MSI’s Afterburner utility to overclock the cards, and here’s a table summarizing the clock speeds we managed to achieve.
| Card | Base clock | Boost clock | Memory clock |
| --- | --- | --- | --- |
| MSI Radeon RX 580 Gaming X+ 8G | 1,500MHz | — | 8,880MHz |
| MSI GeForce GTX 1060 Gaming X+ 6G | 1,669MHz | 1,895MHz | 9,448MHz |
| Aorus Radeon RX 570 4G | 1,440MHz | — | 7,600MHz |
| PowerColor Red Devil Radeon RX 480 | 1,330MHz | — | 8,100MHz |
The Radeon RX 580 overclocked surprisingly well, a nice change considering the lackluster results we had with the Radeon RX 480. What's more, the Radeon RX 570 took quite well to overclocking too, so AMD has clearly made some much-appreciated tweaks in its second-generation Polaris architecture.
Ultimately, we achieved a 5 and 6 per cent increase in Fire Strike scores for the GeForce GTX 1060 and Radeon RX 580 respectively. The proportionate increases were similar in Fire Strike Extreme, so there's still a bit of extra performance to squeeze out of these cards with overclocking.
Having said that, the Radeon RX 570 deserves special mention, as we managed to get a 12 per cent performance boost in Fire Strike after overclocking, which is quite impressive for a card of its class.
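For readers who want to reproduce the arithmetic, here is a quick sketch of how the memory-overclocking headroom in the table works out. The stock memory clocks used below are assumptions based on typical specifications for these cards, not figures from this review:

```python
# Overclocking-headroom calculator for the clocks in the table above.
# Stock memory clocks are assumptions (typical specs), not review figures.
def oc_gain(stock_mhz: float, oc_mhz: float) -> float:
    """Percentage increase from stock to overclocked frequency."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

cards = {
    # name: (assumed stock memory clock MHz, achieved memory clock MHz)
    "MSI Radeon RX 580 Gaming X+ 8G": (8000, 8880),
    "MSI GeForce GTX 1060 Gaming X+ 6G": (9000, 9448),
}
for name, (stock, oc) in cards.items():
    print(f"{name}: +{oc_gain(stock, oc):.1f}% memory overclock")
```

Memory gains in this 5 to 11 per cent range line up with the 5 to 12 per cent Fire Strike improvements described above.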
HDX-card, Radeon RX 580 8GB and power consumption [Archive]
stargazer
06-17-2020, 02:48 AM
I have an HDX card in my MacPro5,1.
Looking at a Radeon RX 580 8GB Graphics Upgrade for Mac Pro 2010-2012.
https://macpatric.com/products/amd-radeon-rx-580-8gb-graphics-upgrade-for-mac-pro-2010-2012?_pos=76&_sid=762c29d8b&_ss=r
The graphics card needs both auxiliary power connectors on the motherboard (PCIE AUX A / PCIE AUX B), but the HDX card already uses the first one.
Anybody with a similar setup and a solution?
AE
06-17-2020, 11:52 AM
in my (admittedly hazy) recollection i believe i sold the graphics card that was supplied with my MP and replaced it with a card that only required a single power connection. i don’t recall that there was an option that supported the upgrade you’re looking at.
stargazer
06-17-2020, 01:20 PM
Searched this forum, but got more confused…
Some possible solutions I can think of:
1. Compatible graphics card that supports 3840×2160 that only needs one power cable.
2. Expansion chassis.
3. External power for the graphics card.
4. Share one of the connectors (PCIE AUX A or PCIE AUX B) if there is enough power for both cards.
5. eGPU (But I guess there is no way to install that on a MacPro5,1)
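A rough sanity check on option 4 (sharing a connector) can be done as a simple power budget. This is only a sketch: the connector limits come from the PCIe spec, while the RX 580 board power (~185 W) and HDX draw (~40 W) are typical published figures, not measurements:

```python
# Back-of-the-envelope power budget for a MacPro5,1 with an RX 580 + HDX.
# All figures are assumptions: slot and aux limits per the PCIe spec,
# card draws are typical published numbers.
SLOT_W = 75          # power available through the PCIe slot itself
AUX_6PIN_W = 75      # rating of each mini 6-pin aux connector (PCIE AUX A/B)

RX580_BOARD_W = 185  # typical RX 580 board power
HDX_W = 40           # approximate HDX card draw

total_supply = SLOT_W + 2 * AUX_6PIN_W   # GPU-side supply with both aux feeds
shared = total_supply - HDX_W            # sharing one aux with the HDX

print(f"GPU budget with both aux connectors: {total_supply} W")
print(f"GPU budget when the HDX taps one aux: {shared} W "
      f"(headroom over RX 580: {shared - RX580_BOARD_W} W)")
```

Zero headroom on paper is why, as later posts suggest, sharing only works in practice when the workload never pushes the card to its rated board power.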
AE
06-17-2020, 01:52 PM
i ended up with the Radeon HD 5770; works fine driving two monitors but only at 1920 x 1080.
stargazer
06-17-2020, 02:13 PM
Is the Radeon HD 5770 Mojave compatible (Metal)?
What's the solution if you run only one display?
JFreak
06-17-2020, 02:27 PM
Is the Radeon HD 5770 Mojave compatible (Metal)?
Whats the solution if you run only one display?
No (https://www.pro-tools-expert.com/production-expert-1/2018/6/12/how-do-you-find-mojave-compatible-graphics-cards-for-cheese-grater-apple-mac-pro-51-computers).
stargazer
06-17-2020, 02:51 PM
Ok, thanks!
DetroitT
06-20-2020, 06:41 AM
Hi. I use the flashed 580 from macvideocards.com.
The two mini 6-pin PCIe power feeds go to the 8-pin on the video card,
with the power split/spliced to the Avid 4-pin HDX.
2 EVO SSDs, 2 10TB spinners,
HDX1.
I've been using this split for 8 years or so, since the
original 5870 card, then the 7950, now the 580.
Mac Video Cards also offers a single power cable solution (6-pin to 8-pin).
stargazer
06-20-2020, 08:44 AM
Thanks DetroitT!
I'm going to contact Mac Video Cards.
stargazer
06-24-2020, 02:40 AM
Are there any Mojave-compatible graphics cards available that only need one power cable?
DetroitT
06-24-2020, 06:22 AM
Are there any Mojave compatible graphics card that only needs one power cable available?
Apple has a list.
Some are not flashed, like the AMD Radeon Pro WX7100 (one 6-pin power connector).
Padje
07-08-2020, 09:07 AM
MSI Radeon 560 Aero ITX 4G
No startup screen. Not flashable as I know, but besides that, works like a charm on a Mac Pro 5.1 with HDX and UAD, all drive bays occupied, 4k screen.
epm
07-22-2020, 07:11 AM
Are there any Mojave compatible graphics card that only needs one power cable available?
As Padje mentioned the MSI Radeon 560 Aero ITX 4G is (imo) the best solution.
4k Support, fully PCI powered.
No startup screen, but Recovery mode works so I don’t see any need for flashing. I’d keep your 5770 or whatever mac card you have just in case, but the 560 is a perfect fit for these machines, and can be found for about $170ish.
RossH
09-27-2020, 02:03 PM
Hi. I use the 580 mac video card.com flashed
The two mini 6pin pcie power to 8 pin video.
Power split / spliced to avid 4 pin hdx
2-evo ssds, 2-10tb spinners
Hdx1
Ive been using this split 8 yr or so since the
Original 5870 card, 7950 now the 580
Mac video cards also offers a single power cable solution (6 pin to 8 pin)
Hi DetroitT, I've seen you mention your "power split" solution in a couple of threads on here. Can you provide more detail on how you did that?
I have a 2012 Mac Pro that I want to upgrade the 5770 to an RX580, but am struggling to figure out the power requirements as I also have a single HDX card. And I don’t want to add an external power supply, I’m just trying to get another year out of my system.
Thanks for any info.
Hugh-H
09-30-2020, 02:25 PM
Hello RossH,
For our MP5,1 + RX580 + HDX3 installations initially I wired the harnesses, one MP 6-pin to the RX580, the other MP 6-pin to both the RX580 and the HDX3 harness. The thought in my head was that the RX580 requires high current and the reason most power harnesses are dual-6pin is to allow for the high currents, less voltage drop.
However I did some voltage and current testing and perhaps the dual 6-pin source is advised for high end graphics software but for audio software and video rendering, it’s not necessary. So to make it easier for our other users we purchased single 6-pin to 8-pin harnesses for the RX580 installations, and the other 6-pin on the MP is for the HDX3. Works fine.
Here is the link we purchased that works —
https://www.amazon.com/gp/product/B00OSLGQGE/ref=ppx_yo_dt_b_asin_title_o09_s00?ie=UTF8&psc=1
Good luck.
Hugh
RossH
10-02-2020, 08:39 AM
Hi Hugh, thanks for the response.
Which RX580 are you using? The ones I’ve seen only have one 8 pin power port. I’m planning to get a Sapphire RX580, I guess mostly because it’s on Apple’s (now archived) list of supported cards.
I do use Adobe Creative Cloud apps on this Mac, so I think I will need both mini 6 pin ports for the single 8 pin connector. I also want to drive at least 3 monitors (can only use two on my 5770) if that matters.
I’ll plan to power the HDX card off a SATA port or two. I’ve contacted Avid support to try and track down the HDX power cable for PC with the Molex connector on it, I don’t seem to have the original box anymore. But I haven’t heard back just yet. There is a US-based site that sells them online, and there’s one from the UK on eBay. Can’t find any in Canada.
clubber
03-31-2021, 07:58 PM
Folks, I am about to pull the trigger on an HDX card, and I have an RX580 4GB in a 5,1. I am not very familiar with the power cables and such that need to be connected to the mainboard and the cards. Please help me out.
I remember having to buy this power cable in order to connect the RX580: https://www.amazon.com/gp/product/B07V4GGS43/ref=ppx_yo_dt_b_asin_title_o06_s00?ie=UTF8&psc=1
clubber
04-01-2021, 06:23 AM
Hi. I use the 580 mac video card.com flashed
The two mini 6pin pcie power to 8 pin video.
Power split / spliced to avid 4 pin hdx
2-evo ssds, 2-10tb spinners
Hdx1
Ive been using this split 8 yr or so since the
Original 5870 card, 7950 now the 580
Mac video cards also offers a single power cable solution (6 pin to 8 pin)
Hi, can you please elaborate on this "power split" solution? How do you do that?
Thank you.
hugoleitao
04-05-2021, 06:42 PM
Hi
Just got an HDX card, but my nVidia GTX 680 2GB Mac EFI is using both PSU ports.
I'm on a 2009 Mac Pro 3.46.
Can anyone tell me how to get power working for both, please?
innerbooty
04-21-2021, 09:03 AM
Hi,
Chiming in here also looking for, I believe, the splitter solution. I am new to HDX and have been slowly configuring a 2009 4,1 Mac Pro for HDX. I flashed it to 5,1, got High Sierra installed, HDX installed, Pro Tools 2021, all working great. I bought an AMD Radeon HD 7970 3GB GPU pre-flashed for Mac that I thought I would install today so I could get a second drive running Mojave. I quickly hit the roadblock of realizing this GPU requires both PCIe power connectors, and one is taken up by the HDX card. Is there a way to work around this? Specific details would be super appreciated! Mike Thornton on Pro Tools Expert describes buying a 6-to-8-pin adapter, but he glosses over the installation of that part, and I am not sure which adapter to get or where it is supposed to be plugged in. Presumably the port the current stock graphics card is using? The new GPU requires one 8-pin and one 6-pin connection.
Thanks in advance for any advice!
drums77
04-21-2021, 11:21 AM
MSI Radeon 560 Aero ITX 4G
No startup screen. Not flashable as I know, but besides that, works like a charm on a Mac Pro 5.1 with HDX and UAD, all drive bays occupied, 4k screen.
Same here.
innerbooty
04-21-2021, 11:35 AM
So, do your GPUs only need one of the mini 6-pin PCIe power ports? My Radeon needs two, and I'm trying to figure out how to make that work with the HDX taking one. Thanks!!
drums77
04-21-2021, 01:12 PM
So, do your GPUs only need one of the mini 6 pin PCie power ports? My Radeon needs two, and I’m trying to figure out how to make that work with the HDX taking one. Thanks!!
My MSI Radeon RX 560 doesn't require any extra power, only the PCI Express slot connection.
LDS
04-21-2021, 03:08 PM
So, do your GPUs only need one of the mini 6 pin PCie power ports? My Radeon needs two, and Im trying to figure out how to make that work with the HDX taking one. Thanks!!
It's really just a case of looking at the power consumption specs of your specific GPU and calculating how much power you actually need. The wattage provided by 6-pin and 8-pin PCIe aux power is standardised (can't remember the exact spec, but it's easily googleable). An HDX card only needs something like 40 watts.
Powering options are 1× 6-pin to 2× 6-pin splitters, 4-pin Molex to 6-pin to use the power in the 5.25in drive bay, or even SATA to 6-pin to use the power from a spare hard drive slot. The cable you need really depends on how much power your GPU plus HDX card draws. Pay close attention to the male/female orientation of the cables. It's a bit of a minefield.
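The calculation described above can be sketched in a few lines of code. The connector ratings follow the PCIe spec; the example draws (an RX 580 at ~185 W board power, of which the slot covers 75 W) are illustrative assumptions:

```python
# Check whether a set of aux power sources can cover what the PCIe slot cannot.
# Connector ratings per the PCIe spec; draw figures are assumptions.
SLOT_W = 75
SOURCE_LIMITS_W = {"6-pin": 75, "8-pin": 150}

def aux_budget_ok(sources, gpu_board_w, other_aux_w=0):
    """True if the listed aux sources can supply the GPU's above-slot draw
    plus any other aux loads (e.g. an HDX card) hung off the same sources."""
    aux_needed = max(gpu_board_w - SLOT_W, 0) + other_aux_w
    return sum(SOURCE_LIMITS_W[s] for s in sources) >= aux_needed

# A single 6-pin source feeding an RX 580 via a 6-to-8-pin harness is,
# on paper, over its 75 W rating (the card may ask for ~110 W of aux power):
print(aux_budget_ok(["6-pin"], gpu_board_w=185))            # False
# Both 6-pin sources to the GPU, with the HDX powered from SATA/Molex:
print(aux_budget_ok(["6-pin", "6-pin"], gpu_board_w=185))   # True
```

This matches Hugh-H's earlier observation: a single 6-pin source is out of spec only on paper, and works in practice for audio workloads because the card never reaches its rated board power.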
RX 500 — Power consumption RX 580 8gb Micron
makduk
Own person
#1
Good afternoon,
Recently, after updating ethOS to version 1.3.1, AMD cards began to report their power consumption. I have 7× RX 580 8GB Red Dragon V2 cards with Micron memory. So here is my question: looking at ethOS, my cards consume 120 to 140 watts, depending on the card.
If we do the math, that comes to about 910 watts, plus ~70 watts for the system, so roughly 1,000 watts total. But the wattmeter at the socket shows 1,240 watts.
So the question is where the remaining 240 watts go. If anyone has suggestions, I will be sure to hear them out and check.
Best regards, Dmitry.
mechislav
#2
In HWiNFO64 you can configure what it shows as card consumption.
bahek332
Forum friend
#3
And what kind of cards are these, 6GB?
makduk
Own person
#4
bahek332 said:
And what kind of cards are these 6gb
Mixed that up a bit, they're 8GB.
vdebrist
Own person
#5
The undervolt readings may be lying. On Windows I undervolted with WattTool, since Afterburner doesn't undervolt these cards at all, and setting the voltages in Claymore gives a slightly worse result than WattTool. My 8× RX 470 4GB draw 1,080 watts at the socket together with the system.
areht
Lawyer
#6
PSU efficiency.
makduk
Own person
#7
mechislav said:
HWiNFO64 set what it will show in card consumption
Can this program even be installed on ethOS? As far as I know it's Windows-only. Besides, the other cards show 80 watts each, and the socket shows about the same for them.
makduk
Own person
#8
areht said:
PSU efficiency
The two power supplies are 750 watts each; the cards are split as 3 cards plus the system on one PSU, and the remaining 4 cards on the other.
makduk
Own person
#9
vdebrist said:
Downvolt may be lying, on Windows I did it with a watttool, since AB with such cards does not downvolt from the word at all, and prescribing in a claymore is a slightly worse result than with a watttool. 8×474 eat from a 1080 watt outlet along with the system.
When I tried it on Windows, Afterburner didn't cut consumption at all; it simply wouldn't accept the power settings. Through Claymore, the system immediately froze. I tried another program and managed to get it down to 120, but after 10 minutes the cards started eating at maximum again.
areht
Lawyer
#10
makduk said:
2 power supplies cost 750 watts each, cards are scattered according to the principle of 3 cards + system and the rest of the power supply, the remaining 4 cards
Why do you need those numbers? You only need to watch the socket; that is what you pay for.
If it eats 1,200 watts at the socket, then about 15% goes to heat in the PSUs, which works out to your 1,000 watts, plus or minus.
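That estimate can be written out explicitly. A minimal sketch, where the 87% PSU efficiency (roughly 80 Plus Gold at this load) and the per-card figures from the thread are assumptions, not measurements:

```python
# Expected wall draw from software-reported card power plus system overhead,
# passed through an assumed PSU efficiency. All inputs are assumptions.
def wall_draw_w(card_w, n_cards, system_w, psu_efficiency):
    dc_load = card_w * n_cards + system_w   # what the PSUs must deliver
    return dc_load / psu_efficiency         # what the wall socket sees

# 7 cards at ~130 W each plus ~70 W for the rest of the system:
print(round(wall_draw_w(130, 7, 70, 0.87)))   # 1126
```

Even with PSU losses, that is still short of the 1,240 W reading, which is consistent with the later replies: the software reports only core power, and cheap wattmeters can be off by 10-15%.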
MinerPro
Forum friend
#11
makduk said:
Good afternoon,
Recently, after updating ethos to version 1.3.1, amd cards began to show their consumption. I have x7 cards rx 580 8gb red dragon v2 on micron memory. So here is my question. If you look through ethos, you can see that my cards consume from 120 to 140, depending on the card.
If we do the math, it will come out to about 910 watts + ~70 watts for the system and end up with about 1000 watts. And if you look at the consumption sockets, then you can see a number of 1240 watts there.
And then the question arose where the remaining 240 watts go. Maybe someone has suggestions, be sure to listen to them and check.
Best regards, Dmitry.
Post your full configuration, and test the wattmeter against a 100-watt lamp.
Nicolaym
Own person
#12
mechislav said:
HWiNFO64 set what it will show in card consumption
HWiNFO shows the GPU core consumption, not the whole card, or am I wrong?
makduk
Own person
#13
MinerPro said:
Full configuration in the studio, and test on a 100 watt lamp wattmeter
The complete system configuration? I can't check with a lamp, but when I bought the wattmeter I tested it with a 10-watt bulb and it showed 11-12.
Zeet
Own person
#14
makduk said:
2 power supplies cost 750 watts each, the cards are scattered according to the principle of 3 cards + system and the rest of the power supply remaining 4 cards
You answered your own question: the efficiency of the power supplies.
Vinc37
Experienced
#15
makduk said:
I can’t check the light bulb, but when I bought it I checked a 10 watt bulb, it showed 11 — 12
I tested my wattmeter on different light bulbs, from 40 to 100 watts. The result was sad:
the higher the wattage, the bigger the error. A miracle of Chinese engineering; in my case it under-read.
That was my first and last wattmeter; it went off to Avito.
makduk
Own person
#16
ZeeT said:
answered his own question . . The efficiency of three blocks.
Are you saying that two power supplies giving 1,500 watts in total are not enough for this rig?
Aresgard
Local resident
#17
makduk said:
Looking through ethos you can see that my cards consume between 120 and 140
This is most likely just the core consumption. Although it's possible that a static value is added to the core consumption so that the final figure resembles the real draw.
PSU efficiency must be taken into account, about 10% loss; that alone makes it at least 1,100 watts. And wattmeters lie a lot: for them an error of 10-15% is considered normal. So it comes out to about 1,200 watts.
The surest way is to use a PSU with power monitoring (such as a Corsair RMi) or to measure the current at the PSU input with a clamp meter.
pulemyot
Forum friend
#18
makduk said:
Are you saying that 2 power supplies that give a total of 1500 watts are not enough for this installation?
Post the model of your power supplies. The fact that they total 1,500 watts doesn't mean anything by itself.
makduk
Own person
#19
pulemyot said:
You write the model of power supplies. The fact that they are 1500 watt does not mean anything at all.
EVGA SuperNOVA 750 G2
makduk
Own person
#20
Aresgard said:
This is most likely only the kernel consumption. Although it is possible that a static value is added to the consumption of the core, so that the final figure would be similar to the real consumption.
Unit efficiency of 10% must be taken into account. This is at least 1100 watts. Well, wattmeters are very fib very much. For them, an error of 10-15% is considered the norm. So it turns out about 1200 watts.
The surest way is to connect the RMi block or measure the current strength at the input to the PSU with a chain / tongs.
But if you look at the other cards that eat 80 watts each, the system works out to 630 watts, and the socket shows me 640-650 watts.
PowerColor RX 570 8GB: reducing power consumption
VadimV
Own person
#1
Good afternoon
On HiveOS, with RX 570 8GB cards, I often see on YouTube that Hive reports consumption between 64 and 78 watts.
I can't get below 80 watts, and even then only on one card; almost all of my 570 8GB cards report 83-85 watts, which is quite a lot.
The BIOS is flashed; I'll attach it just in case.
I have a wattmeter, and judging by it, changing VDD from 750 to 950 makes no difference at all. But if I put the original BIOS back, consumption grows by an average of 22 watts per card.
Also, if I tick the aggressive power reduction box in Hive and apply it to the flashed card, then judging by the wattmeter, consumption again grows by 20 watts per card. Why is that checkbox even there? Does anyone know how it works?
Core: 1160; VDD: 760; DPM: 1; MEM: 2100
Can anyone tell me if I can reduce the consumption on these cards any further? Maybe there's a setting I don't know about.
All Black
Local resident
#2
Humble yourself.
Skif1
Local resident
#3
Who the hell knows what your HiveOS is doing there. You wrote it yourself: set it and forget it. Well then, forget it.
MiningFamily(ECPiCo)
Cadenza
Own person
#4
I have 900mV set on my RX 570 in Hive, and the consumption shows 118W; it seems the settings are simply not applied. An RX 470 in the same rig shows 79W. When I launched Raven, at the same voltage, consumption on the 570 dropped to 80W.
Mashkovtsev2006
Forum friend
#5
VadimV said:
Can anyone tell me if I can still reduce the consumption on video cards? Maybe I don’t know what setting.
The day before yesterday I started a similar topic.
Read it first; maybe you'll find something useful.
Bogdanoff
Own person
#6
VadimV said:
Good afternoon
On HIVE OS, with cards of the 578 series, on YouTube I often see that the consumption in hive is written in the interval between 64-78 watts.
I don’t get less than 80 watts, and then only on one card. almost all 578 cards write 83-85 watts, which is quite a lot.
BIOS sewn, I’ll put it just in case.I have a wattmeter, judging by it, changes in VDD from 750 to 950 do not give any results. But if you put the original BIOS, then the consumption grows by an average of 22 watts from the card.
Also, if you check the box for aggressive power reduction in the hive and apply it to the flashed card, then judging by the watt meter, the consumption again grows by 20 watts from the card, why is this checkbox there at all? Does anyone know how it works?Core — 1160 ; VDD — 760; DPM — 1, MEM — 2100
Can anyone tell me if I can still reduce the consumption on video cards? Maybe I don’t know what setting.
Friendly advice. First, remove all the undervolt values from the cards in Hive; don't enter any numbers, just tick the aggressive undervolt checkbox and reboot. Only after that do you dial in the millivolts, not the other way around. Read carefully what I wrote and do it. And remove the DPM value; there's nothing to enter there.
Kalinka008
Dancing with a tambourine
#8
Personally I had around 76-78 watts, but when I changed the miner it went up to 82-84 watts, so try changing the miner...
Nothing ventured, nothing gained :)
ross52
Experienced
#9
Honestly, who knows why identical cards show different consumption with the same settings.
I have 24 RX 570 8GB Armor cards in operation, and consumption ranges from 57 to 87 watts. The settings are the same.
I only downvote in return.
Zembat
Dancing with a tambourine
-
-
#10
I have 588 cards, they eat 60-66, but 578 also eats about 80, now 77.
dmitriev11
Forum friend
#11
ross52 said:
There are 24 cards of 578 armor in operation and consumption from 57 to 87 watts. The settings are the same.
Cadenza said:
Rx 470 in the same rig 79w. Launched a rave, at the same voltage, consumption at 570 dropped to 80w
Unbelievable. These people have been mining since 2017 and still haven't learned to use a wattmeter?
How is that even possible??
mr.ashka
Experienced
#12
GODS! You've dredged up my flashbacks from 2017! For AMD cards, the consumption reported in software is a spherical horse in a vacuum! Stop discussing numbers pulled out of thin air! Only a wattmeter will show you the actual values, and that is approximately 110-120W per card at a voltage around 850mV, give or take.
ross52
Experienced
#13
dmitriev11 and mr.ashka, do you even understand what is meant by the cards' consumption here?
Good grief, it's just the consumption of the video card core. Naturally, to find out a video card's total consumption it's better to measure it with a wattmeter. But we're talking about something else, and plenty of people understand that besides you.
If the consumption of the card cores isn't important to you, then it's too early for you to be mining.
You don't have to be a mathematician to work out the difference between card consumption and core consumption.
mr.ashka
Experienced
#14
ross52 said:
dmitriev11 and mr. ashka, do you really think what is meant by the consumption of cards ??????
My mother is a woman, this is just the consumption of the video card core. Naturally, in order to find out the consumption of a video card, it is better to measure it with a wattmeter. But we are talking about something else, and many people understand this except you.
If consumption of card cores is not important for you, then it is still too early for you to engage in mining.
You don't have to be a mathematician to calculate the difference between card consumption and core consumption.
Oh, look, a clever one :) "Too early to be mining"... I'm not arguing, just stating the fact that you can ignore those numbers.
Core consumption isn't important to me, because there are cases like this (GPU5-GPU8):
cards on the same BIOS and settings show different core consumption, and 99% of the time that comes down to "golden hands" picking at the BIOS.
OP, don't mess around: flash the stock ROM with only the timings changed. In Hive, tick the aggressive undervolt; you don't need to set DPM, and in MDPM you can enter the memory controller voltage. The reduction in draw is only slightly above the error level, but it's there, though of course you won't see it in the core reading.
All of this was chewed over back in 2017!!!
ross52
Experienced
#15
mr.ashka, we're talking past each other.
Nobody is questioning your cards, and no one disputes that they're matched within 13W of each other. For others, the spread can be serious: 30-40 watts or even more.
Of course you can equalize the consumption, but then there will be a difference in the hashrate.
mr.ashka
Experienced
#16
ross52 said:
mr.ashka you are talking about Thomas and you are talking about yerema.
There is no question about your cards and no one argues that they are selected with a difference of 13W. For others, the difference can be serious by 30-40 watts or even more.
Of course, you can adjust the consumption, but then there will be a difference in the hashes. On top of that, many people run mixed rigs.
What difference does it make? What difference in hashes, what are you on about? In Windows, PowerColor cards on Micron memory show 105W and MSI on Samsung show 86W; in Hive, PowerColor on Samsung shows just 71-75W, all at roughly 31MH/s. You can't take these numbers as a reference point; even within one vendor, different cards report consumption differently!!!
He's being told plainly that the problem is his own tinkering, and he still keeps arguing...
ross52
Experienced
#17
mr.ashka, don't act like you're the only smart one here and everyone else is a fool.
Everyone is just trying to find the middle ground where the cards eat little and put out more.
mr.ashka
Experienced
#18
ross52 said:
mr. ashka don’t be smarter than a locomotive. And then there are fools and you are the only smart one.
Everyone is trying to find a middle ground so that they eat little and give out more.
You really are dense; this has already been chewed over and laid out:
- there is no need to look at software-reported consumption on AMD cards
- there is no need to pick at the voltages in the BIOS; everything is regulated from software
ross52
Experienced
#19
mr.ashka, read the first message in the topic carefully.
mr.ashka
Experienced
#20
ross52 said:
mr.