The Best Graphics Cards for Compact PCs in 2022
Editor’s Note: As you’ll see from some of the prices above, GPU availability and pricing are anything but "normal" right now, and have been skewed since the start of the pandemic. If you plan to buy a card soon, also see this buying-strategies guide for advice on finding cards at a fair price. If you’d rather wait it out a bit longer, check out this how-to tutorial on getting the most performance from the GPU you already own.
The Best Graphics Card Deals This Week*
*Deals are selected by our commerce team
-
MSI GeForce RTX 3090 24GB Graphics Card
— $979.99
(List Price $2,009.99)
-
MSI GeForce RTX 3090 Ti 24GB Graphics Card
— $1,269.60
(List Price $1,400)
-
EVGA GeForce RTX 3090 FTW3 Ultra Graphics Card
— $1,126.99
(List Price $1,919.99)
-
Zotac Gaming RTX 3060 Twin Edge OC 12GB Graphics Card
— $420.57
(List Price $549.99)
-
Zotac Gaming RTX 3080 Ti Trinity OC Graphics Card
— $869.99
(List Price $1,149.99)
More About Our Picks
Zotac GeForce GTX 1650 Super Twin Fan
4.0 Excellent
Best Compact Graphics Card for Budget 1080p Play
Bottom Line:
Zotac’s punchy GeForce GTX 1650 Super Twin Fan is markedly better than the non-"Super" GTX 1650 and a solid, small version of this mainstream GPU for budget-focused 1080p gaming.
Pros
- Much faster than original non-Super GeForce GTX 1650 in 1080p and 1440p gaming.
- Runs quiet.
- Priced competitively.
- Impressively small in our Zotac test sample.
Cons
- Underperforms on some games.
- Runs hotter than the non-Super GTX 1650.
Read Our Zotac GeForce GTX 1650 Super Twin Fan Review
Zotac GeForce GTX 1660 Super Twin Fan
4.0 Excellent
Best Compact Graphics Card for Mainstream 1080p Play
Bottom Line:
If you’ve been holding off on a mainstream video card for 1080p gaming, the GeForce GTX 1660 Super could be your trigger to buy: It’s a solid-playing, popularly priced waypoint between the GTX 1660 and GTX 1660 Ti.
Pros
- Solid price-to-performance ratio for 1080p gaming.
- Surprisingly good overclocking ceiling.
Cons
- Driver wrinkles mean a few test games show scant improvement over the GTX 1660.
Read Our Zotac GeForce GTX 1660 Super Twin Fan Review
EVGA GeForce RTX 3050 XC Black Gaming 8G
4.0 Excellent
Best Compact Graphics Card for High-Refresh 1080p Play (Plus Ray-Tracing)
Bottom Line:
The GeForce RTX 3050 is a strong junior entry into Nvidia’s peerless lineup of "Ampere"-powered RTX 30 Series GPUs, and this EVGA XC Black card is a corker for 1080p play at a near-budget price.
Pros
- Compact, twin-fan design
- Full array of video ports in our test sample
- Good price-to-performance ratio for its segment
- Strong results in ray-tracing benchmarks
- High overclock ceiling
Cons
- Not as far ahead of AMD’s Radeon RX 6500 XT in some tests as we would have hoped
- Relatively high power consumption for its class
Read Our EVGA GeForce RTX 3050 XC Black Gaming 8G Review
XFX Speedster SWFT 210 AMD Radeon RX 6600
3.5 Good
Best Compact Graphics Card for Budget 1440p Play
Bottom Line:
The XFX Speedster SWFT 210, based on AMD’s midrange Radeon RX 6600 GPU, is an able-enough short card for lower-end 1440p gaming with newer AAA games.
Pros
- Competitive with GeForce RTX 3060 in frame rates and list price
- Lower power requirements
Cons
- Better with newer games than old
- No significant overclocking headroom
- Ran hot during our stress testing
Read Our XFX Speedster SWFT 210 AMD Radeon RX 6600 Review
EVGA GeForce RTX 3060 XC Black Gaming 12G
3.5 Good
Best Compact Graphics Card for Mainstream 1440p Play
Bottom Line:
Suited to 1080p and 1440p gaming, EVGA’s XC Black Gaming 12G version of the GeForce RTX 3060 is an able performer, though Nvidia’s own RTX 3060 Ti outshines it on value.
Pros
- Fast frame rates in multiplayer titles
- Compact card design
- Reasonable overclocking performance
Cons
- Nontrivial performance drop versus the GeForce RTX 3060 Ti
- Minor gains over previous (RTX 2060) generation
- Pricey for its performance class
Read Our EVGA GeForce RTX 3060 XC Black Gaming 12G Review
Zotac GeForce RTX 2060 Amp
4.0 Excellent
A Solid Previous-Gen Alternative to the GeForce RTX 3060
Bottom Line:
A fine value in the previous-generation GeForce RTX 20 lineup, the RTX 2060 is a worthy-enough 1440p pick if you can’t land an RTX 3060.
Pros
- Faster than the GeForce GTX 1070 for less money.
- Compact two-slot design.
- Quiet fans.
- Headroom for mild overclocking.
Cons
- Unlike Nvidia RTX 2060 Founders Edition, lacks a VirtualLink USB Type-C port.
- Priced higher than the GeForce GTX 1060 it replaces.
- 8GB of video memory would have been ideal.
Read Our Zotac GeForce RTX 2060 Amp Review
Nvidia GeForce RTX 3060 Ti Founders Edition
4.5 Outstanding
Best Semi-Compact Graphics Card for High-Refresh 1440p Play
Bottom Line:
If you want the best marriage of price, performance, and features for 4K and 1440p gaming, Nvidia’s GeForce RTX 3060 Ti Founders Edition is matched only by its own step-up RTX 3070 sibling.
Pros
- Beats the RTX 2080 Super in most benchmarks
- Great price-to-performance ratio
- Stable launch drivers
- Runs cool
- Short PCB, redesigned cooling system make for a compact card
Cons
- RTX 3070 gives an extra margin for 4K today and tomorrow
Read Our Nvidia GeForce RTX 3060 Ti Founders Edition Review
Nvidia GeForce RTX 3070 Founders Edition
5.0 Exemplary
Best Semi-Compact Graphics Card for 4K Gaming
Bottom Line:
A fierce follow-up to its killer GeForce RTX 3080, Nvidia’s beautifully engineered RTX 3070 Founders Edition is a practically perfect graphics card. Gamers aiming for high-resolution, high-refresh 1440p or 4K play will find today’s best price-for-performance engine right here.
Pros
- Incredibly fast for the price
- Beautiful design
- Great 4K gaming results
- Innovative cooling system
- Doesn’t run too hot
Cons
- Expect a run on this card to rival the one on the RTX 3080
Read Our Nvidia GeForce RTX 3070 Founders Edition Review
AMD Radeon RX 6700 XT
3.5 Good
Best Graphics Card for 4K Power in Recent AAA Games (If It Fits!)
Bottom Line:
AMD’s Radeon RX 6700 XT is a solid competitor to Nvidia’s lineup of midrange GPUs—if you stick to recent, optimized titles.
Pros
- Competitive frame rates, in games that are optimized
- SAM testing sees frame-rate gains in select titles
- Debut MSRP is $20 below that of Nvidia’s GeForce RTX 3070
Cons
- DirectX 11-based games may run significantly slower than on competing Nvidia cards
- Higher temperatures under stress than competing Nvidia cards
- Radeon reference design is underwhelming compared with Nvidia’s RTX 30 Series Founders Editions
- Coil whine under load in our sample card
Read Our AMD Radeon RX 6700 XT Review
A hulking, full-tower PC is always your best option if you want room for multiple graphics cards, general upgrades, or all the terabytes of storage you could ever reasonably afford. But these days, you don’t necessarily need one if you want a powerful PC that can handle editing high-def or 4K video, playing new AAA games at 4K, and powering virtual-reality (VR) headsets.
As more and more PC enthusiasts and boutique-PC builders have shown interest in compact performance systems, many case makers have offered up comparatively compact chassis with room for midsize or even full-size graphics cards. For instance, the shoebox-shaped SilverStone Sugo 14 has room for a graphics card over a foot long. That means the largest and most powerful cards available, such as the Nvidia GeForce RTX 3080, should have just enough clearance in that case.
A handful of similarly compact PC cases have an even smaller footprint, yet enough room for beastly graphics cards. (You’ll want to check the specs.) Just note that specialized compact cases like these usually require a MicroATX or Mini-ITX motherboard, rather than a more standard (and often less expensive) full-size ATX motherboard. Their unusual proportions mean that they may be able to take big video cards, but not big mainboards.
It’s possible to build a gaming-ready system even smaller, using a truly trim case such as the Cooler Master Elite 110 (which is just 11 by 10 by 11 inches). It can include an upper-end processor and a graphics card like the Zotac GeForce GTX 1650 Super Twin Fan or one based on the GeForce GTX 1660 or GTX 1660 Super. Those cards aren’t extreme performers, but they do support VR gaming and can handle today’s most demanding non-VR titles on high settings at 1080p (1,920-by-1,080-pixel) resolution. (Interested in which cards are best for your VR headset? Check out our roundup of those GPUs for the best picks.)
Measure Twice: How to Tell Which Graphics Cards Will Fit Your Case
For graphics-card considerations specifically (which is likely why you landed here), the most important thing is how much card clearance the PC chassis that you own (or are considering buying) has available. This is often listed on the chassis’ product page under "specifications," or in an online manual. But in a pinch, you can pop off the side panel and do a rough estimate yourself, with a tape measure or ruler.
Inside the case, find the area where the PCI Express card expansion brackets are, usually at the back. This is the spot where the video-output ports on a graphics card will show through the back of the chassis. Measure from there, parallel to the PCI Express slot into which you’ll install your card, to the first obstruction you run across. That’s your maximum card length, assuming your card does not have power connectors sticking out of its trailing end. If it does, you’ll need to compensate for that. (Most of the time, these days, the connectors are on the top edge of the card.)
(Photo: Zlata Ivleva)
If you want to fit a high-end gaming card, also be sure there’s room for mounting the card across two expansion slots, because just about all such cards occupy at least two bracket positions. Some very compact PC cases may not have much space between the PCI Express slot you’ll use for a card and the nearby case wall, which might prevent you from installing a card at all. If you’re upgrading an existing system, an eyeball check, at minimum, is strongly advised.
The distance between the bracket area and the nearest object on the other side of the case (often a hard drive mount or bay, or the wall of the case itself) will determine how long a graphics card your system can handle. If the case has 10.5 inches of clearance or more, you’ll be able to fit at least some high-end cards, including a subset of ultra-high-end cards based on, say, Nvidia’s GeForce RTX 3070 or GeForce RTX 3080. This also includes AMD’s high-end options like the Radeon RX 6800 and RX 6800 XT, both of which measure exactly 10.5 inches in their reference versions. But some third-party versions of these cards are much longer, so you need to check the details. Example: We’ve seen a few GeForce RTX 3070 cards under 10 inches, and some RTX 3060 Ti cards above 12 inches. Check those specs.
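The measure-and-compare step above boils down to a simple check. Here’s a minimal sketch; the function name, the one-inch connector margin, and the example figures are our own illustration, not measured values:

```python
# Sketch of the clearance check described above. Card lengths and the
# connector margin are illustrative; always verify exact specs on the
# card maker's product page.
def card_fits(card_length_in, case_clearance_in, rear_power_connector=False,
              connector_margin_in=1.0):
    """Return True if a card of the given length fits the case clearance.

    If the card's power connectors protrude from its trailing end,
    reserve extra room for the cables (connector_margin_in is a guess).
    """
    needed = card_length_in
    if rear_power_connector:
        needed += connector_margin_in
    return needed <= case_clearance_in

# Example: a 10.5-inch reference card in a case with exactly 10.5 inches
# of clearance fits only if the power plugs sit on the top edge.
print(card_fits(10.5, 10.5))                             # True
print(card_fits(10.5, 10.5, rear_power_connector=True))  # False
```

In other words, a trailing-edge power connector effectively adds to the card’s length, which is why top-edge connectors are the safer bet in tight cases.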
(Photo: Zlata Ivleva)
For smaller PC cases that still have room for a full-height graphics card, though, you’ll need an air-cooled card that may be 8 inches long or less. These are often Mini-ITX-mainboard cases that are very tight on space. That 8 inches may sound generous, but as gaming-class graphics cards go, it’s not much wiggle room. Even here, though, you have a few options these days.
Can Your Power Supply Handle a New Video Card?
Another factor to consider when building a compact PC is power supply (PSU) cable routing. While many cards on the market keep their power connectors on top of the card (and just as many compact or Mini-ITX cases are made to support this design), a few put the connector on the rear-facing edge of the card instead.
If you’re already working in a constrained space and every inch of the total card measurement matters, make sure that wherever your card plugs into the PSU (if it does at all; more on that below) leaves room for all the other parts and pieces you’re trying to squeeze into a confined area.
That said, many compact cards won’t even ask you for a dedicated PSU power connection in the first place. Several options we have tested, including the 75-watt Zotac GeForce GTX 1650 OC, don’t need an external connection because they already pull all the watts they need from the PCI Express slot they’re plugged into.
Keep in mind that these cards have limitations on the amount of graphical horsepower and overclocking capacity they can support without an external power source. But if those aren’t major concerns of yours, then you might want to go this route to avoid any cabling issues in the first place.
Low-Profile Graphics Cards: A Low-Clearance Alternative
You’ll note that we’ve talked about card length, but not much about card height. A few thin PC cases (usually flat, broad ones meant for home-theater-PC use) accept only what are called «low-profile» expansion cards, among them low-profile PCI Express graphics cards. These cards are much shorter in the vertical dimension than an ordinary video card, and they can be outfitted with what’s known as a low-profile or «half-height» bracket. (You twist a screw or two to install the shorter bracket in place of the ordinary one.) This enables the card to mount on the vertically smaller PCI-slot frame.
There’s usually room for just two ports on a half-height bracket, or in a few cards we’ve seen, the half-height bracket is two slots wide, allowing for a third port. So know that multi-display connectivity is often a bit compromised on these cards, especially with the low-profile bracket attached.
Because low-profile boards are much smaller in surface area (and thus the room for a graphics chip, power circuitry, and cooling apparatus is reduced), they are budget-minded, basic cards, meant as a step up from CPU-integrated graphics or to add support for multiple displays.
So, Which Compact Graphics Card Should I Buy?
As always, size and features for video cards based on a given graphics chip can vary significantly, depending on the model and the card maker. Nvidia and AMD make «reference cards» based on their graphics processors. Third-party partners—MSI, Sapphire, EVGA, Asus, and many others—then make and sell their own branded cards, some of which adhere to the design of these reference boards. They also offer «custom» versions with slight differences in shape and size, the configuration of the ports, the amount and speed of onboard memory, and the cooling fans or heat sinks installed.
Compact video cards fall into that custom class. Because of that, if you’re shopping for a card for a compact system, or you have a particular case in mind, be sure the size, power, and cooling demands of the card you’re buying match up with the chassis you’re planning on putting it in. Few things in the gaming world are more frustrating than getting a promising new graphics card in the mail, or carting it home from the local superstore, only to find out it doesn’t fit in your PC, or your power supply doesn’t have the juice (or requisite connectors) to get it going.
If you have a bit more room to play with in your PC case, check out our roundup of the best graphics cards for 4K gaming, which will be bigger cards. (Also check out our master guide to the best graphics cards overall, heedless of size.) And complete your custom build with one of the top M.2 solid-state drives we’ve tested. These tiny SSDs are a perfect match if you’re space-strapped.
Cost Per Frame: Best Value Graphics Cards Right Now
For those who missed it, we recently published our GPU pricing update for April 2022, which saw prices for graphics cards continue to trend in the right direction… downward. We’ve been monitoring GPU prices for well over a year now and it’s great to see prices getting closer and closer to the MSRP.
But as we mentioned in that feature, it’s hard to say just how exciting that is for prospective buyers, considering most of these products are now 18 months old and will likely be superseded by much more powerful next-gen hardware by the end of this year. Then again, we do believe that if gamers had had the opportunity to buy an RTX 3070 for $730 or an RX 6700 XT for $570 earlier this year, they would have jumped at it. So while pricing still sucks compared to 2019, it could be worse, a lot worse. If you need evidence of that, just go back and check our GPU pricing update for May 2021, when the 3070 was closer to $1,600(!).
Anyway, for those of you who are ready to buy now and aren’t willing to hold out for next-gen products to arrive — because let’s be honest, pricing will almost certainly be inflated and availability will be poor around launch day — what should you buy? For that, we’ve come up with some fresh data and created a cost per frame analysis.
The goal today is to test all current-generation AMD and Nvidia GPUs to establish FPS performance and, using that data, create some cost per frame comparisons. In total, there are 17 current-generation GPUs (or up to 18 if you include the RX 6400, but we don’t have one of those yet), and we don’t think we’re missing anything of true value there.
Because that’s a ton of GPUs to cover, we’ve only tested them in six games, but we’ve chosen the titles carefully based on our recent 50-game benchmarks: Red Dead Redemption 2, Rainbow Six Siege, Far Cry 6, Hitman 3, Dying Light 2, and Shadow of the Tomb Raider.
Instead of going through the individual game data, we’ve calculated the geomean for the six games and will use that to report cost per frame. We used medium quality settings in almost all the games to allow the entry-level models to achieve reasonable levels of performance; at the high end, we can look at the 4K data.
We tested at 1080p, 1440p, and 4K resolutions using the Ryzen 7 5800X3D with DDR4-3600 CL16 memory and Resizable BAR enabled. We used pricing information gathered on April 20, which means some price movement is to be expected, though the bulk of the data should hold up.
With that said, the most valuable information here is the frame rate data. Simply pick a performance tier you’re interested in, take the current price you’re looking at, and divide it by the frame rate to get your cost per frame. Even if you’re not in the United States (we usually base recommendations on US pricing), you can check pricing for the relevant products in your region and use the same formula to find the option that makes the most sense for you.
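The two calculations involved, the geometric mean of the per-game results and the price-over-FPS division, can be sketched in a few lines. The frame rates and price below are placeholders for illustration, not our test data:

```python
from math import prod

def geomean(frame_rates):
    """Geometric mean of per-game frame rates (the averaging used here)."""
    return prod(frame_rates) ** (1 / len(frame_rates))

def cost_per_frame(price, avg_fps):
    """Current price divided by average FPS across the test suite."""
    return price / avg_fps

# Hypothetical numbers: six per-game FPS results for one card, and a
# local price you've looked up.
fps_per_game = [120, 98, 142, 110, 87, 133]
avg = geomean(fps_per_game)
print(f"average fps: {avg:.1f}")
print(f"cost per frame at $380: ${cost_per_frame(380, avg):.2f}")
```

A geomean is preferred over a plain average here because it keeps one outlier title from dominating the result.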
Best Value at 1080p
The best value 1080p GPU right now is the Radeon RX 6600, coming in at a cost of $2.98 per frame in our testing. That makes it 16% cheaper than the 6500 XT per frame. What’s shocking about this data is that the 6500 XT would need to be priced at ~$170 just to match the cost per frame of the RX 6600, and it would be doing so with half the VRAM, half the PCIe bandwidth, no hardware encoding, and no AV1 decoding.
In other words, even as the Radeon 6500 XT approaches its MSRP, it continues to be a horrible product that should have never sold for a dollar over $150 — in fact, $100 or less is far more fitting for this class of product. It’s crazy to think this GPU was selling for $270+ just months ago and some reviewers were recommending it simply because it was the cheapest new GPU you could buy.
The Radeon RX 6600 also makes a mockery of the RTX 3050 as the GeForce GPU costs 26% more per frame. It might be $10 cheaper, according to our pricing data, but because it’s 20% slower, it’s not a great deal.
The GeForce RTX 3060 Ti stacks up better despite costing more per frame. It’s $10 more than the 6700 XT and only ~5% slower, so you could argue features such as DLSS help offset that margin. To compare the higher-end graphics cards, let’s move to 1440p.
Best Value at 1440p
The margin between the Radeon 6700 XT and GeForce RTX 3060 Ti remains about the same at 1440p with the GeForce GPU costing roughly 7% more per frame. As we move beyond the mid-range though towards the higher-end models, AMD stacks up well.
The Radeon RX 6800 XT offers 5% more performance than the 12GB RTX 3080, suggesting that the six-game sample is slightly more favorable toward AMD than our 50-game testing. We’re only talking about a 5% discrepancy, but do keep that in mind.
In terms of overall FPS performance, the 6800 XT, RTX 3080, 3080 12GB, 3080 Ti and even the 3090 are all pretty similar. The 3080 Ti and 6800 XT are a good match up here as they both averaged 157 fps in our testing, but the Radeon GPU is currently 24% cheaper in the US, making it more appealing in terms of cost per frame.
Best Value at 4K
As we’ve come to expect, Nvidia RTX Ampere GPUs stack up better at the higher 4K resolution and now the 6800 XT is on par with the 12GB 3080. With a wider range of games the RTX 3080 would pull ahead by around a 5-7% margin, and we know that because we recently tested them, but for the purpose of this feature it was not feasible to test 17 graphics cards across 50 games.
The data for the mid-range to lower-end GPU models was spot on with what we saw across our 50-game testing, so we suspect the lower quality settings are favoring AMD’s high-end models a little more in this scenario.
Here we’re seeing half a dozen AMD and Nvidia GPUs in that 90-105 FPS range, which includes the Radeon RX 6800 XT, 6900 XT, GeForce RTX 3080, 3080 12GB, 3080 Ti and 3090. The most affordable GeForce option is the original RTX 3080 for $1,000, whereas the 6800 XT is down at $920. That’s not a huge saving, so as usual it will come down to the importance you put on extra features, namely ray tracing performance and DLSS.
Where the GeForce range becomes ridiculous is with the RTX 3090 series. The 3090 at $1,700 is dumb, and the 3090 Ti at $2,000 is equally silly. The high-end battle is fought out between the 6900 XT and RTX 3080 Ti: both offer a similar level of performance, while the GeForce GPU costs 21% more.
Back in February we compared the 6900 XT and 3080 12GB head to head in 50 games. At the time the GeForce GPU was priced between $1,600 and $1,800, while the 6900 XT was closer to $1,500 to $1,600. The 3080 12GB was between $100 and $200 more expensive, which is still the case today. We think ray tracing and DLSS allow the GeForce GPU to command a 10% premium, but 20% is getting a bit too steep for us.
Best Value at 1440p (Australia)
Although a big portion of our audience is US-based, we thought it’d be interesting to check out pricing trends in a few other regions, starting with Australia. The best value option here is the 6600 XT, narrowly beating out the RX 6600; basically, both Radeon 6600 series GPUs represent a similar level of value. Then we have the 6700 XT, which just beat the 6500 XT in terms of value, though we take the 6500 XT’s cost per frame with a grain of salt given all the issues with that product. We also tested it with PCIe 4.0, meaning it would stack up far worse had we used PCIe 3.0.
The best value GeForce GPUs in Australia are the RTX 3060 Ti and 3060. The 3060 Ti costs just 2% more than the 6700 XT per frame, so depending on the features you’re interested in, the GeForce GPU could be the better value choice.
For those after a high-end GPU in Australia, the Radeon 6800 XT appears to be the way to go at $1350 AUD as it’s 23% cheaper than the RTX 3080 Ti for the same level of performance. Even if you go off our 50 game data where the 3080 12GB and 6800 XT delivered the same level of performance, the 6800 XT would amount to $9 per frame, making it 16% cheaper per frame than the 3080 12GB. That’s a large premium for superior ray tracing performance and DLSS support, but of course, it’s up to you to decide if those features are worth it.
Best Value at 1440p (Europe)
We also have some Euro pricing data, and here we see some significant changes in pricing trends. Using prices from Mindfactory, the 6500 XT has the lowest sticker price at €210, but of course, the RX 6600 is significantly better value per frame despite costing 86% more.
Interestingly, the RTX 3060 Ti ranks very well here and is the best value mid-tier product, offering 6700 XT-like performance at a 10% discount. The RTX 3060 is also competitive with the 6600 XT.
As we get to the high-end parts, AMD does stack up well. The 6800 XT can be had for €950, while the original RTX 3080 costs €1140, making it a whopping 36% more expensive. Then when compared to the RTX 3080 12GB, we see that the GeForce GPU is fetching almost 50% more per frame and if we go off the 50 game data and say that these two GPUs are a match in terms of performance, the 12GB 3080 still comes out costing 41% more per frame.
Based on Mindfactory’s pricing, without question you would purchase the 6800 XT over any of the 3080 or 3090 series GPUs from Nvidia.
Closing Notes
That’s where things stand on the GPU front as of April 2022. Of course, pricing is highly volatile at the moment and may have changed for some models by the time you read this. Our advice is to work out which performance tier you’re after and then compare current pricing for those products.
The numbers here serve as a rough guide, but if you want highly accurate data for certain matchups, make sure to check out our 50-game benchmarks. We’ve updated the data for most GPUs at this point. The only graphics card we strongly recommend you avoid is the Radeon RX 6500 XT — and possibly the new RX 6400, which we might look at soon.
The Radeon RX 6600 is among the best values you’ll get right now in a budget graphics card; we wouldn’t go with anything below it. On the other side of the spectrum, the RTX 3090 and 3090 Ti should also be avoided, not because they’re bad products, but because their prices are still inflated and they do not offer great value. We hope this guide is helpful to those of you buying a new graphics card right now.
Shopping Shortcuts:
- AMD Radeon RX 6700 XT on Amazon
- AMD Radeon RX 6800 XT on Amazon
- AMD Radeon RX 6600 on Amazon
- Nvidia GeForce RTX 3060 on Amazon
- Nvidia GeForce RTX 3070 on Amazon
- Nvidia GeForce RTX 3080 on Amazon
- Nvidia GeForce RTX 3090 on Amazon
GPUCards | Scientific Volume Imaging
This page gives an overview of NVIDIA cards that can be used with the Small, Medium, Large, or Extreme GPU option in the latest version of Huygens for GPU acceleration. This page will be refreshed over time; for an overview of cards that work with older versions of Huygens, please contact us.
For cards not included on this page, the option required in your license depends on the card’s number of CUDA cores and amount of video RAM, according to the following specifications:
Small GPU option: for cards that have up to 1024 CUDA cores and up to 6 GB of video RAM (included free of charge with every Huygens license).
Medium GPU option: for cards that have up to 3072 CUDA cores and up to 8 GB of video RAM.
Large GPU option: for cards that have up to 8192 CUDA cores and up to 24 GB of video RAM.
Extreme GPU option: for cards that have up to 24576 CUDA cores and up to 64 GB of video RAM.
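Going by the thresholds above, the tier lookup for a card not listed in the tables below can be sketched as follows. The function name is our own illustration, not part of Huygens, and where the published card tables differ from these raw thresholds, the tables take precedence:

```python
def huygens_gpu_option(cuda_cores, vram_gb):
    """Map a card's CUDA core count and VRAM to the required GPU option,
    per the thresholds listed above. Returns None if the card exceeds
    even the Extreme limits."""
    tiers = [
        ("Small",    1024,  6),
        ("Medium",   3072,  8),
        ("Large",    8192, 24),
        ("Extreme", 24576, 64),
    ]
    for name, max_cores, max_vram in tiers:
        if cuda_cores <= max_cores and vram_gb <= max_vram:
            return name
    return None

# Examples: GTX 1080 (2560 cores, 8 GB) and Titan RTX (4608 cores, 24 GB)
print(huygens_gpu_option(2560, 8))   # Medium
print(huygens_gpu_option(4608, 24))  # Large
```

A card must fall within both limits of a tier; exceeding either the core count or the VRAM cap pushes it up to the next option.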
For the ideal combination with your CPU power check the various Performance Options in which we bundle your GPU and CPU power to let Huygens run extremely fast through your biggest data.
Huygens GPU acceleration is supported for Nvidia cards on Windows and Linux, but not on macOS. If you have a valid GPU card that isn’t recognized by Huygens, check out the GPU checklist.
Small GPU cards
The Small GPU option is required in your Huygens license in order to use small GPU cards. A small GPU option is included in every Huygens base license from 15.10 and later. Download the latest Huygens now.
GPU card | CUDA cores | VRAM |
---|---|---|
GeForce GTX 1660 Ti | 1536 | 6GB |
GeForce GTX 1660 Super | 1408 | 6GB |
GeForce GTX 1660 | 1408 | 6GB |
GeForce GTX 1650 Super | 1280 | 4GB |
GeForce GTX 1650 | 1024 | 4GB |
GeForce GTX 1650 | 896 | 4GB |
GeForce GTX 1060 3GB | 1280 | 3GB |
GeForce GTX 1060 6GB | 1280 | 6GB |
GeForce GTX 1050 Ti | 768 | 4GB |
GeForce GTX 1050 (3GB) | 768 | 3GB |
GeForce GTX 1050 (2GB) | 640 | 2GB |
GeForce GTX 960 | 1024 | 2GB |
GeForce GTX 950 | 768 | 2GB |
GeForce GTX 780 Ti | 2880 | 3GB |
GeForce GTX 780 | 2304 | 3GB |
GeForce GTX 750 Ti | 640 | 2 GB |
GeForce GTX 750 | 512 | 1GB or 2 GB |
Quadro P2000 | 1024 | 5GB |
Quadro P1000 | 640 | 4GB |
Quadro M2000 | 768 | 4GB |
Quadro K2200 | 640 | 4GB |
Quadro T2000 | 1024 | 4 GB |
Quadro T1000 | 768 | 4GB |
Medium GPU cards
The Medium GPU option is required in your Huygens license in order to use medium GPU cards. Request Quote
GPU card | CUDA cores | VRAM |
---|---|---|
GeForce RTX 3070 | 5888 | 8GB |
GeForce RTX 2080 SUPER | 3072 | 8GB |
GeForce RTX 2080 | 2944 | 8GB |
GeForce RTX 2070 SUPER | 2560 | 8GB |
GeForce RTX 2070 | 2304 | 8GB |
GeForce RTX 2060 SUPER | 2176 | 8GB |
GeForce RTX 2060 | 1920 | 6GB |
GeForce GTX 1080 | 2560 | 8GB |
GeForce GTX 1070 Ti | 2432 | 8GB |
GeForce GTX 1070 | 1920 | 8GB |
GeForce GTX 980 Ti | 2816 | 6GB |
GeForce GTX 980 | 2048 | 4GB |
GeForce GTX 970 | 1664 | 4GB |
Quadro RTX 4000 | 2304 | 8GB |
Quadro P4000 | 1792 | 8GB |
Quadro M5000 | 2048 | 8GB |
Quadro M4000 | 1664 | 8GB |
Quadro P2200 | 1280 | 5GB |
Tesla K20 | 2496 | 5GB |
Large GPU cards
Large GPU option is required in your license in order to use Large GPU cards. Request Quote
GPU card | CUDA cores | VRAM |
---|---|---|
Titan RTX | 4608 | 24GB |
Titan V | 5120 | 12GB |
Titan-Xp (Pascal Generation) | 3584 | 12GB |
GeForce GTX Titan-X | 3072 | 12GB |
GeForce RTX 3090 | 10496 | 24GB |
GeForce RTX 3080 | 8704 | 10GB |
GeForce RTX 3080 12GB | 8960 | 12GB |
GeForce RTX 3080 Ti | 10240 | 12GB |
GeForce RTX 2080 Ti | 4352 | 11GB |
GeForce GTX 1080 Ti | 3584 | 11GB |
Tesla V100 (16GB version) | 5120 | 16GB |
Tesla P100 | 3584 | 16GB |
Tesla P100 NVLINK | 3584 | 16GB |
Tesla P40 | 3840 | 24GB |
Tesla M60 | 4096 | 16 GB |
Tesla M40 | 3072 | 12GB or 24 GB |
Tesla K80 | 4992 | 2 x 12GB |
Tesla K40 | 2880 | 12GB |
Quadro GP100 | 3584 | 16GB |
RTX A5000 | 8192 | 24GB |
RTX A4000 | 6144 | 16GB |
Quadro RTX 6000 | 4608 | 24GB |
Quadro RTX 5000 | 3072 | 16GB |
Quadro P6000 | 3840 | 24GB |
Quadro P5000 | 2560 | 16GB |
Quadro M6000 24GB | 3072 | 24GB |
Quadro M6000 | 3072 | 12GB |
Quadro K6000 | 2880 | 12GB |
Extreme GPU cards
Extreme GPU option is required in your license in order to use the GPU cards below. Request Quote
GPU card | CUDA cores | VRAM |
---|---|---|
Tesla V100 (32GB version) | 5120 | 32GB |
Quadro RTX 8000 | 4608 | 48GB |
Quadro GV100 | 5120 | 32GB |
Titan V CEO Edition | 5120 | 32GB |
RTX A6000 | 10752 | 48 GB |
Huygens 20.04
Huygens versions up to and including 20.04 support Nvidia graphics cards with a Compute Capability of 3.0 or higher and a CUDA Toolkit version of 7.0 or higher.
Huygens 20.10
Compute Capability lower than 3.5 and CUDA Toolkit versions older than 8.0 are now deprecated; Huygens 20.10 no longer supports these for GPU acceleration. CPU computation and display on a monitor via these cards will continue to be supported.
Affected cards that are still supported in Huygens 20.04 but not in Huygens 20.10:
GeForce GTX 770, GeForce GTX 760, GeForce GT 740, GeForce GTX 690, GeForce GTX 680, GeForce GTX 670, GeForce GTX 660 Ti, GeForce GTX 660, GeForce GTX 650 Ti BOOST, GeForce GTX 650 Ti, GeForce GTX 650, GeForce GTX 880M, GeForce GTX 780M, GeForce GTX 770M, GeForce GTX 765M, GeForce GTX 760M, GeForce GTX 680MX, GeForce GTX 680M, GeForce GTX 675MX, GeForce GTX 670MX, GeForce GTX 660M, GeForce GT 750M, GeForce GT 650M, GeForce GT 745M, GeForce GT 645M, GeForce GT 740M, GeForce GT 730M, GeForce GT 640M, GeForce GT 640M LE, GeForce GT 735M, GeForce GT 730M;
Quadro K5000, Quadro K4200, Quadro K4000, Quadro K2000, Quadro K2000D, Quadro K600, Quadro K420, Quadro K500M, Quadro K510M, Quadro K610M, Quadro K1000M, Quadro K2000M, Quadro K1100M, Quadro K2100M, Quadro K3000M, Quadro K3100M, Quadro K4000M, Quadro K5000M, Quadro K4100M, Quadro K5100M, NVS 510, Quadro 410;
Tesla K10, GRID K340, GRID K520.
See also the Wikipedia page on CUDA.
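The support policy above can be expressed as a small check. This is a hedged sketch, not SVI code: version numbers, Compute Capability, and CUDA Toolkit versions are compared as (major, minor) tuples, and the function name is illustrative.

```python
# Sketch of the Huygens GPU-support policy described above (illustrative only).
# All arguments are (major, minor) tuples.

def gpu_supported(huygens_version, compute_cap, cuda_toolkit):
    """Return True if a card can be used for GPU acceleration.

    Up to Huygens 20.04: Compute Capability >= 3.0 and CUDA Toolkit >= 7.0.
    From Huygens 20.10:  Compute Capability >= 3.5 and CUDA Toolkit >= 8.0.
    """
    if huygens_version <= (20, 4):
        return compute_cap >= (3, 0) and cuda_toolkit >= (7, 0)
    return compute_cap >= (3, 5) and cuda_toolkit >= (8, 0)

# A GTX 770 (Compute Capability 3.0) is accepted by 20.04 but not by 20.10:
print(gpu_supported((20, 4), (3, 0), (7, 0)))   # True
print(gpu_supported((20, 10), (3, 0), (8, 0)))  # False
```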
Besides the GPU options for Huygens, SVI also offers Performance options. Multi-GPU acceleration requires the Performance Plus, Mega, or Extreme option. Multi-GPU support was introduced in Huygens version 16.10.0p8 in the Batch Processor and Huygens Core, to deconvolve a queue of images on multiple GPU devices simultaneously. With Huygens 17.10 (Linux) and Huygens 18.04 (Windows), support was added for running a single image deconvolution on multiple GPUs in Huygens Professional. We offer the following Performance Packages to get the most out of your workstation:
Performance Option: Included as standard in Huygens; allows up to 16 CPU cores (32 logical when hyper-threaded) and 1 Small GPU card.
Performance Plus: Allows you to use up to 32 CPU cores (64 logical cores when hyper-threaded) and 2 Large GPU cards.
Performance Mega: Allows you to use up to 64 CPU cores (128 logical cores when hyper-threaded) and 4 Large GPU cards.
Performance Extreme: Allows you to use up to 128 CPU cores (256 logical cores when hyper-threaded) and 8 Large GPU cards.
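Picking the right package amounts to finding the smallest tier whose limits cover your workstation. A minimal sketch using the limits listed above (the function is illustrative, and the Small vs. Large GPU distinction is ignored here):

```python
# Hedged sketch: smallest Performance package covering a given workstation.
# Limits come from the list above; the helper itself is illustrative.

PACKAGES = [
    # (name, max physical CPU cores, max GPU cards)
    ("Standard", 16, 1),
    ("Performance Plus", 32, 2),
    ("Performance Mega", 64, 4),
    ("Performance Extreme", 128, 8),
]

def required_package(cpu_cores, gpu_cards):
    """Return the first (smallest) package whose limits cover the workstation."""
    for name, max_cores, max_gpus in PACKAGES:
        if cpu_cores <= max_cores and gpu_cards <= max_gpus:
            return name
    raise ValueError("workstation exceeds all listed packages")

print(required_package(24, 2))  # Performance Plus
```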
Below is a table that shows which options and commands in Huygens can use single GPU acceleration or multi-GPU acceleration in batch mode.
Feature | Accelerated with GPU (CUDA) |
---|---|
Deconvolution algorithms | |
CMLE deconvolution | + Multi GPU support: Operation Window, Batch Processor & Scripting |
QMLE deconvolution | + Multi GPU support: Operation Window, Batch Processor & Scripting |
GMLE deconvolution | + Multi GPU support: Operation Window, Batch Processor & Scripting |
Tikhonov-Miller deconvolution | |
Workflow Processor | + Multi GPU support |
Deconvolution Wizard | |
Deconvolution Express | |
Batch Feeder | |
Localization algorithms for Huygens Localizer | |
MLE particle fitting | |
LSQ particle fitting | |
Center of mass particle fitting | |
Registration & Deconvolution | |
Object Stabilizer | + Multi GPU support: Batch Processor |
Drift correction | + Multi GPU support: Batch Processor |
Stitching & Decon Wizard | + Multi GPU support: HuCore/HRM |
Light-Sheet Decon & Fusion Wizard | |
Image processing commands | |
Image processing filters (min, max, variance ppu, avg, gauss) | |
Image properties functions (hist, stat, range) | |
Image processing commands 1 (estbg, phaseCorr) | |
Image processing commands 2 (getpix, setpix, cp, slice, multiMip, miniMip, sum) | |
Image processing commands 3 (shift, resample, mirror, hist2ch, label, perc, optrep) | |
Visualization & Analysis | |
2D slice renderers | |
3D SFP renderer | Since v19.04 |
3D MIP renderer | Since v19.04 |
3D Surface renderer | Since v19.04 |
Colocalization Analysis | + Multi GPU support: HuCore/HRM |
Object Analyzer | GPU support for labeling and rendering |
GeForce Graphics Cards Now 14% Over MSRP, Radeon at Just 6% Over MSRP
The latest NVIDIA GeForce and AMD Radeon GPU pricing update has been published by 3DCenter, and it once again shows that we are a step closer to MSRP prices for gaming graphics cards.
NVIDIA GeForce & AMD Radeon GPU Prices Now Just 10% Over MSRP: Graphics Cards In Stock Everywhere!
In the latest report by 3DCenter, we can see that GPU prices for both NVIDIA GeForce and AMD Radeon graphics cards continue to fall, which shouldn't be a surprise as that's the trend we have witnessed since the end of 2021. The NVIDIA GeForce RTX 30 series now averages around 14% over MSRP, while AMD's Radeon RX 6000 series sells for an average of 6% over MSRP.
AMD Radeon & NVIDIA GeForce Graphics cards are back to normal MSRP prices & GPU availability is better than ever. (Image Credits: 3DCenter)
In addition, GPU supply is abundant; currently there's hardly a retail outlet anywhere without graphics cards on its shelves (which wasn't the case a few quarters back). The red and green teams have also announced various promos, such as NVIDIA's 'Restocked and Reloaded' campaign, while AMD has highlighted that its Radeon RX 6000 series cards are available at MSRP-level prices.
AMD Radeon & NVIDIA GeForce Graphics Card Price Trend (Image Credits: 3DCenter):
| Dec 12 | Jan 2 | Jan 23 | Feb 13 | Mar 6 | Mar 27 | Apr 17 | May 8 |
---|---|---|---|---|---|---|---|---|
AMD Radeon RX 6000 | +83% | +78% (-5PP) | +63% (-15PP) | +45% (-18PP) | +35% (-10PP) | +25% (-10PP) | +12% (-13PP) | +6% (-6PP) |
NVIDIA GeForce RTX 30 | +87% | +85% (-2PP) | +77% (-8PP) | +57% (-20PP) | +41% (-16PP) | +25% (-16PP) | +19% (-6PP) | +14% (-5PP) |
Radeon RX 6700 XT, 6800 & 6800 XT and GeForce RTX 3060, 3060 Ti, 3070 & 3080 10GB only | +105% | +101% (-4PP) | +88% (-13PP) | +68% (-20PP) | +55% (-13PP) | +38% (-17PP) | +26% (-12PP) | +22% (-4PP) |
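The percentages in the trend table are premiums over MSRP, and the "PP" figures are changes in percentage points between reports. A quick sketch of how such a row is derived; the retail prices below are made-up illustrations, not 3DCenter data:

```python
# Hedged sketch: premiums over MSRP and percentage-point (PP) deltas.
# The prices here are hypothetical, chosen only to illustrate the arithmetic.

def premium_pct(retail, msrp):
    """Premium over MSRP, in percent."""
    return (retail / msrp - 1.0) * 100.0

msrp = 499.0                       # hypothetical list price
prices = [935.0, 570.0, 530.0]     # hypothetical retail prices across reports

premiums = [premium_pct(p, msrp) for p in prices]
deltas = [b - a for a, b in zip(premiums, premiums[1:])]  # change in PP

print([round(x) for x in premiums])  # [87, 14, 6]
print([round(d) for d in deltas])    # [-73, -8]
```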
Talking about AMD prices first: once again, almost all graphics cards within the Radeon RX 6000 series are now available between +1% and +10% of MSRP. Only the Radeon RX 6800 series cards are still selling for an average of 30-35% over MSRP, which has been the case since their launch.
AMD Radeon RX 6000 Series Graphics Card Prices (RDNA 2 GPUs) via 3DCenter:
6400 | 6500XT | 6600 | 6600XT | 6700XT | 6800 | 6800XT | 6900XT | |
---|---|---|---|---|---|---|---|---|
Geizhals | 182-230€ | 199-320€ | 359-480€ | 425-570€ | 599-1199€ | 889-1209€ | 929-1470€ | 1114-1949€ |
Alternate | N/A | 220-299€ | 374-439€ | 479-579€ | 629-799€ | 925€ | 1019-1099€ | 1129-1399€ |
Caseking | 188-211€ | 228-305€ | 399-438€ | 498-575€ | 640-957€ | 907-1145€ | 989-1207€ | 1199-1679€ |
Computeruniverse | 186-211€ | 215-260€ | 375-446€ | 479-660€ | 630-752€ | 937-1080€ | 966-1335€ | 1349-1767€ |
Hardwarecamp24 | N/A | 224€ | 389-449€ | 458-494€ | 769€ | 959€ | 1129-1149€ | 1249-1259€ |
MediaMarkt | 190€ | 205-310€ | 420-435€ | 450-560€ | 649-900€ | 1039€ | 1132-1229€ | 1206-1349€ |
Mindfactory | 182-195€ | 199-260€ | 359-412€ | 425-556€ | 599-690€ | 889-949€ | 929-1031€ | 1114-1339€ |
Notebooksbilliger | 199€ | 199-269€ | 369-509€ | 449-529€ | 629-819€ | 889-999€ | 999-1180€ | 1160-1949€ |
Proshop | 193-230€ | 222-309€ | 399-428€ | 514-575€ | 683-840€ | 950-1127€ | 1016-1349€ | 1300-1820€ |
List Price | $179 | $199 | $329 | $379 | $479 | $579 | $649 | $999 |
Surcharge | from –10% | from –11% | from –3% | from -1% | from +11% | from +36% | from +27% | from -1% |
Change vs. Apr 17 | — | -6PP | +2PP | -7PP | -3PP | -5PP | -4PP | -1PP |
Availability | ★★★☆☆ | ★★★★★ | ★★★★★ | ★★★★★ | ★★★★★ | ★★★★☆ | ★★★★☆ | ★★★★★ |
The NVIDIA lineup averages around +14% over MSRP, but only three cards at the moment, the RTX 3050, RTX 3060 Ti, and RTX 3070, are priced more than 20% over MSRP. The enthusiast RTX 3080 Ti can actually be found below MSRP, which is impressive given that it offers performance similar to the RTX 3090 Ti, which costs several hundred dollars more.
NVIDIA GeForce RTX 30 Series Graphics Card Prices (Ampere GPUs) via 3DCenter:
3050 | 3060 | 3060Ti | 3070 | 3070Ti | 3080 -10GB | 3080Ti | 3090 | |
---|---|---|---|---|---|---|---|---|
Geizhals | 348-500€ | 420-790€ | 620-917€ | 679-1079€ | 749-1491€ | 899-1491€ | 1299-1976€ | 1839-3074€ |
Alternate | 369-419€ | 449-499€ | 619€ | 789-799€ | 799-949€ | 999-1079€ | 1299-1699€ | 1999-2499€ |
Caseking | 358-419€ | 486-651€ | 623-733€ | 796-898€ | 829-928€ | 965-1294€ | 1379-1850€ | 1999-2168€ |
Computeruniverse | 378-429€ | 469-790€ | 590-749€ | 799-961€ | 806-1491€ | 941-1140€ | 1416-1903€ | 1889-3164€ |
Hardwarecamp24 | 418€ | N/A | 649€ | 779-899€ | 788-869€ | 999-1279€ | 1399-1529€ | 1879-1994€ |
MediaMarkt | 359-421€ | 450-620€ | 811€ | 700€ | 799-1039€ | 960-1249€ | 1300-1869€ | 1800-2699€ |
Mindfactory | 372-398€ | 449-479€ | N/A | 745-769€ | 789-899€ | 969-1064€ | 1298-1498€ | 1839-1990€ |
Notebooksbilliger | 358-500€ | 449-729€ | 630-710€ | 679-899€ | 749-1049€ | 899-1199€ | 1329-1532€ | 1879-2099€ |
Proshop | 369-497€ | 470-624€ | 675-701€ | 770-983€ | 825-975€ | 1099-1200€ | 1399-1949€ | 2000-2249€ |
List Price | $249 | $329 | $399 | $499 | $599 | $699 | $1199 | $1499 |
Surcharge | from +24% | from +13% | from +31% | from +21% | from +11% | from +14% | from -4% | from +6% |
Change vs. Apr 17 | +7PP | -5PP | -1PP | -6PP | -7PP | -7PP | -5PP | -8PP |
Availability | ★★★★★ | ★★★★★ | ★★★★★ | ★★★★★ | ★★★★★ | ★★★★★ | ★★★★★ | ★★★★★ |
We recently saw the AMD Radeon RX 6900 XT being sold at $100 US below its MSRP over at Newegg US. A few cards are still in demand and will take a few more months to hit their MSRPs, but the vast majority of the NVIDIA GeForce RTX 30 and AMD Radeon RX 6000 lineups are currently selling at normal prices. With prices now back at or below MSRP, the GPU market is past the worst, and prices and availability can return to normal.
Best GeForce RTX 3080 Ti Graphics Cards Available — Which One To Get?
Q3 is almost finished and we are slowly but surely moving into Q4, yet graphics card prices continue to fall. We have seen the RTX 3090 Ti selling for MSRP, but for a high-end graphics card, I think the GeForce RTX 3080 Ti is still the best pick, although some of the best RTX 3080 (non-Ti) cards are not bad either. The GeForce RTX 3080 Ti is a cheaper version of the RTX 3090 and is arguably the better buy, especially if you don't need 24GB of VRAM. So, if you are in the market wondering which RTX 3080 Ti you should get, we have listed some of the best RTX 3080 Ti graphics cards available.
UPDATE: As of September 17, 2022, EVGA has announced that it will no longer work with NVIDIA, ending the partnership immediately. EVGA will continue to support and sell RTX 30 series cards until its stock is sold, and will keep parts on hand for replacement and warranty purposes. Once those are consumed, the company is finished with the graphics card business.
Best RTX 3080 Ti Graphics Cards Today
All GeForce RTX 3080 Ti graphics cards on the market are powered by the same 8nm GA102 GPU based on the Ampere architecture. It has 10240 CUDA cores, 320 TMUs, 112 ROPs, and 12GB of GDDR6X memory on a 384-bit memory interface. These specs are common to all RTX 3080 Ti graphics cards.
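As a quick sanity check on those memory specs, peak bandwidth follows from the bus width and the effective memory data rate. The 19 Gbps GDDR6X rate used below is the RTX 3080 Ti's published figure, an assumption not stated in the paragraph above:

```python
# Hedged sketch: peak memory bandwidth from bus width and effective data rate.
# 19 Gbps is the RTX 3080 Ti's effective GDDR6X rate (an added assumption).

def bandwidth_gb_s(bus_bits, rate_gbps):
    """Peak memory bandwidth in GB/s: bits per transfer * transfers/s / 8."""
    return bus_bits * rate_gbps / 8.0

print(bandwidth_gb_s(384, 19))  # 912.0 (GB/s)
```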
The main differences lie in the cooler and PCB design. Some AIBs advertise or highlight the boost clock speed of their graphics cards, but at the end of the day, all of them will boost higher thanks to NVIDIA's GPU Boost technology. The graphics card only needs sufficient power and enough cooling to keep the GPU's temperature in check; with those conditions met, the GPU will boost higher on its own.
Below is a summary of all the graphics cards mentioned in this list, in alphabetical order. Later in the article, I'll categorize them according to their unique features, show you some performance numbers, and answer some frequently asked questions about the RTX 3080 Ti GPU.
Model | Boost Clock | Cooler Design | PCI Slots | Power Connectors | Availability |
---|---|---|---|---|---|
ASUS ROG Strix GeForce RTX 3080 Ti OC Edition | 1815 MHz | Tri-fan air cooling | 2.9 | 3x 8-pin | Check here |
ASUS ROG Strix LC GeForce RTX 3080 Ti OC Gaming | 1860 MHz | Hybrid (Liquid + Air) | 2.6 | 3x 8-pin | Check here |
ASUS TUF Gaming GeForce RTX 3080 Ti OC Edition | 1785 MHz | Tri-fan air cooling | 2.7 | 2x 8-pin | Check here |
EVGA GeForce RTX 3080 Ti FTW3 Ultra Gaming | 1800 MHz | Tri-fan air cooling | 2.75 | 3x 8-pin | Check here |
EVGA GeForce RTX 3080 Ti XC3 Ultra Gaming | 1725 MHz | Tri-fan air cooling | 2.2 | 2x 8-pin | Check here |
EVGA GeForce RTX 3080 Ti XC3 Ultra Hybrid Gaming | 1725 MHz | Hybrid (Liquid + Air) | 2 | 2x 8-pin | Check here |
Gigabyte Aorus GeForce RTX 3080 Ti Master 12G | 1770 MHz | Tri-fan air cooling | 3.5 | 3x 8-pin | Check here |
Gigabyte GeForce RTX 3080 Ti Gaming OC 12G | 1710 MHz | Tri-fan air cooling | 2.7 | 2x 8-pin | Check here |
Gigabyte GeForce RTX 3080 Ti Vision OC 12G | 1710 MHz | Tri-fan air cooling | 2.7 | 2x 8-pin | Check here |
MSI GeForce RTX 3080 Ti Gaming X Trio 12G | 1770 MHz | Tri-fan air cooling | 2.7 | 3x 8-pin | Check here |
MSI GeForce RTX 3080 Ti Ventus 3X 12G OC | 1695 MHz | Tri-fan air cooling | 2.7 | 2x 8-pin | Check here |
MSI GeForce RTX 3080 Ti SUPRIM X 12G | 1830 MHz | Tri-fan air cooling | 2.75 | 3x 8-pin | Check here |
NVIDIA GeForce RTX 3080 Ti Founders Edition | 1670 MHz | Dual Fan Flow-Through | 2 | 2x 8-pin | Check here |
GeForce RTX 3080 Ti Performance
To give you an idea of how fast the RTX 3080 Ti is, here are some performance benchmarks at 4K resolution with maximum or ultra graphics presets. Note that some RTX 3080 Ti cards will perform slightly faster than others, and some slightly slower; GPU cooling, power limits, and the silicon lottery are also at play. But at the end of the day, all RTX 3080 Ti cards perform within the same range, and most of the time the difference is minimal and negligible. NVIDIA controls how these GPUs perform.
RTX 3080 Ti FAQs
Which RTX 3080 Ti has the best cooler and cooling performance?
From the list, the Asus ROG Strix LC and EVGA XC3 Ultra Hybrid definitely have the best cooling solutions: the GPU is liquid-cooled, while the other components are air-cooled. Unfortunately, they are costly compared to air-cooled RTX 3080 Ti cards. The good news is that most RTX 3080 Ti cards with a tri-fan cooling system offer anywhere from sufficient to excellent cooling performance.
Which RTX 3080 Ti is the fastest out of the box?
Out of the box, the Asus ROG Strix LC RTX 3080 Ti, EVGA RTX 3080 Ti FTW3, and MSI RTX 3080 Ti SUPRIM X have the highest boost clock speeds, as they come with factory overclock settings. But as I mentioned earlier, if the graphics card has enough power and the GPU is sufficiently cooled, it will eventually boost higher anyway. My MSI SUPRIM X can reach around 2,000MHz without additional tweaking or manual overclocking.
Is it worth it to buy an expensive RTX 3080 Ti graphics card?
Yes and no; it depends on how you define "worth". Some people are okay with spending more on the design, looks, and aesthetics. But even a liquid-cooled RTX 3080 Ti will not perform significantly better than an air-cooled one. It will certainly have an edge, since it has the better cooling solution, but is that worth it for you?
Most of the time, even the simplest RTX 3080 Ti, like the NVIDIA RTX 3080 Ti Founders Edition, is already enough.
What’s the best CPU for the RTX 3080 Ti?
Generally speaking, you can pair the RTX 3080 Ti with any CPU, but a slower or entry-level CPU will bottleneck it. Since you are willing to spend thousands of dollars on an RTX 3080 Ti, it would only be wise not to skimp on the CPU.
I would recommend AMD's Ryzen 9 5950X, 5900X, or Ryzen 7 5800X. For Intel, the latest 12th Gen Core CPUs, like the Core i9-12900(K/F) and Core i7-12700(K/F), are currently some of the best on the market, especially for gaming. The older 10th Gen and 11th Gen CPUs are not bad either, but if you're building a new system, I recommend jumping to the latest 12th Gen Intel CPUs, since the performance difference from their predecessors is substantial.
Should I get an RTX 3080 Ti if I have a 1080p monitor?
No, definitely not! As we experienced in our RTX 3080 review, the GPU is bottlenecked at 1080p: performance is left on the table, and the card is simply not designed for lower resolutions.
What’s the best monitor for the RTX 3080 Ti?
4K (2160p) and 1440p (QHD) would be the ideal resolutions for the RTX 3080 Ti. You can check out our article on the best 4K gaming monitors here.
So, which RTX 3080 Ti should I get?
The RTX 3080 Ti graphics cards listed here are all good; it really comes down to your preference and your budget.
Best GeForce RTX 3080 Ti Graphics Cards in General
EVGA GeForce RTX 3080 Ti FTW3 Ultra Gaming
EVGA’s RTX 3080 Ti FTW3 Ultra Gaming is another beast of a graphics card. It offers a higher factory overclock setting and a thick and substantial aluminum heatsink to cool the GPU and VRAM. The heatsink is cooled by three fans and there is RGB lighting on the front-side portion.
EVGA RTX 3080 Ti FTW3 Ultra Gaming available on Amazon.com here
For UK, check at Amazon UK here
Asus ROG Strix GeForce RTX 3080 Ti OC Edition
Aside from EVGA's FTW3 Ultra, Asus' ROG Strix series is also popular, and the Asus ROG Strix GeForce RTX 3080 Ti OC Edition is no slouch: it's one of the best 3080 Ti cards on the market. It has a 2.9-slot design with three Axial-tech fans and nice-looking RGB lighting, and its aesthetics are simply a head-turner. It also feels premium in hand, and its cooling solution is quite effective.
Asus ROG Strix RTX 3080 Ti OC Edition check on Amazon.com here.
MSI GeForce RTX 3080 Ti SUPRIM X 12G
Just like the EVGA FTW3, the MSI RTX 3080 Ti SUPRIM X is also a beast of a graphics card. It requires three 8-pin power connectors and features a 20-phase power design. There's a switch near the power connectors that toggles between "silent" and "gaming" modes. Finally, it has some RGB lighting, and the card feels heavy and premium in hand as well.
MSI GeForce RTX 3080 Ti SUPRIM X available on Amazon.com here.
For UK, check at Amazon UK here.
NVIDIA GeForce RTX 3080 Ti Founders Edition
The NVIDIA GeForce RTX 3080 Ti Founders Edition is one of the best RTX 3080 Ti cards on the market. Not only does it come at the base price, it's also professional-looking, sleek, and clean. The flow-through design is excellent at cooling the GPU and other components, and as expected from a Founders Edition, it feels premium in hand.
NVIDIA RTX 3080 Ti Founders Edition available on Amazon.com here
For UK, check at Amazon UK here
Gigabyte Aorus GeForce RTX 3080 Ti Master 12G
This is definitely the "thiccest" RTX 3080 Ti on this list. The Aorus RTX 3080 Ti Master has three fans, with the middle one spinning in the opposite direction. It also has RGB lighting, but the unique features of the RTX 3080 Ti Master are its dual BIOS, six output ports, and the small LCD display on the front side near the rear end. Aside from displaying the GPU temperature, users can upload GIFs and customize what is shown on the screen.
Aorus GeForce RTX 3080 Ti Master 12G available on Amazon.com here
For UK, check on Amazon UK here.
Also Good RTX 3080 Ti Graphics Card Alternatives
Below are more RTX 3080 Ti graphics cards that are not as premium as the cards mentioned above but are generally cheaper and good options as well. These three feature tri-fan cooling solutions and perform as expected. The Gigabyte Vision is the odd one out, with an all-white color scheme. Just choose whichever you prefer when it comes to aesthetics.
- MSI GeForce RTX 3080 Ti Gaming X Trio 12G available on Amazon.com here or Amazon UK here
- Gigabyte GeForce RTX 3080 Ti Gaming OC 12G available on Amazon.com here
- Gigabyte GeForce RTX 3080 Ti Vision OC 12G available on Amazon.com here
Best RTX 3080 Ti with No (RGB) Lighting
Understandably, not all of us are fans of RGB lighting; some users don't want any kind of lighting at all. While you can disable the RGB lighting on the graphics cards mentioned above, here are three RTX 3080 Ti cards that don't have RGB lighting. The only exception is the Asus TUF RTX 3080 Ti, since the TUF logo on the rear end has a small RGB light. Nevertheless, all of these cards feature an all-black color scheme.
- ASUS TUF Gaming GeForce RTX 3080 Ti OC Edition available on Amazon.com here or Amazon UK here
- EVGA GeForce RTX 3080 Ti XC3 Ultra Gaming available on Amazon.com here
- MSI GeForce RTX 3080 Ti Ventus 3X 12G OC available on Amazon.com here
Best Water-cooled RTX 3080 Ti Graphics Cards
For those who don’t mind spending more, money no object, and simply want the coolest and best performing RTX 3080 Ti, these two liquid-cooled RTX 3080 Ti would be your option.
ASUS ROG Strix LC GeForce RTX 3080 Ti OC Edition Gaming
The Asus ROG Strix LC RTX 3080 Ti OC features a hybrid cooling solution. Its GPU is cooled by a 240mm radiator and the tube connecting the radiator and GPU copper plate is 560mm long. It also has RGB lighting on the fans and on the graphics card itself. In addition to the liquid cooling, there’s a blower-type fan that cools the aluminum heatsink that is in contact with the VRM and other components.
ASUS ROG STRIX LC RTX 3080 Ti OC available on Amazon.com here.
For UK, check on Amazon UK here.
EVGA GeForce RTX 3080 Ti XC3 Ultra Hybrid Gaming
EVGA’s RTX 3080 Ti XC3 Ultra Hybrid is somewhat similar to Asus’ ROG Strix LC. But the XC3 Ultra Hybrid doesn’t have a head-turner RGB lighting. Only the logo on its side and at the back have RGB lighting. It generally has an all-black aesthetic, including the two 120mm fans cooling the radiator.
And unlike Asus’ liquid-cooled RTX 3080 Ti, the EVGA RTX 3080 Ti XC3 Ultra Hybrid occupies only two PCI slots and requires only two 8-pin PCIe power connectors.
EVGA RTX 3080 Ti XC3 Ultra Hybrid check on Amazon.com here
For UK, check on Amazon UK here.
There you have it. Which do you think is the best RTX 3080 Ti graphics card for your use case? If the RTX 3080 Ti is too expensive and out of your budget, you may be interested in some of the best RTX 3080 graphics cards instead. On the other hand, AMD's flagship Radeon RX 6900 XT is also a 4K-capable graphics card, not to mention cheaper than the RTX 3080 Ti. Check out some of the best RX 6900 XT cards here.
GPU | Price | Mining Profit (24h) | Payback | Algorithm | Hashrate | Coin (pools listed) |
---|---|---|---|---|---|---|
NVIDIA GeForce RTX 3080 Ti | $1,200 | $0.54 (15.21 RVN) | 74 mo. | KAWPOW | 58.9 Mh/s | Ravencoin (3) |
NVIDIA GeForce RTX 3090 | $1,500 | $0.48 (13.43 RVN) | 105 mo. | KAWPOW | 52 Mh/s | Ravencoin (3) |
NVIDIA GeForce RTX 3080 | $860 | $0.43 (12.14 RVN) | 66 mo. | KAWPOW | 47 Mh/s | Ravencoin (3) |
AMD Radeon VII | $1,000 (used) | $0.39 (10.85 RVN) | 87 mo. | KAWPOW | 42 Mh/s | Ravencoin (3) |
NVIDIA GeForce RTX 3070 Ti | $780 | $0.37 (10.33 RVN) | 71 mo. | KAWPOW | 40 Mh/s | Ravencoin (3) |
NVIDIA GeForce RTX 2080 Ti | $700 | $0.33 (9.299 RVN) | 71 mo. | KAWPOW | 36 Mh/s | Ravencoin (3) |
AMD Radeon RX 6800 | $790 | $0.30 (8.447 RVN) | 88 mo. | KAWPOW | 32.7 Mh/s | Ravencoin (3) |
AMD Radeon RX 6800 XT | $800 | $0.30 (8.447 RVN) | 89 mo. | KAWPOW | 32.7 Mh/s | Ravencoin (3) |
AMD Radeon RX 6900 XT | $900 | $0.30 (8.447 RVN) | 100 mo. | KAWPOW | 32.7 Mh/s | Ravencoin (3) |
NVIDIA GeForce RTX 3060 Ti | $530 | $0.28 (7.879 RVN) | 63 mo. | KAWPOW | 30.5 Mh/s | Ravencoin (3) |
NVIDIA GeForce RTX 3070 | $660 | $0.28 (7.827 RVN) | 79 mo. | KAWPOW | 30.3 Mh/s | Ravencoin (3) |
NVIDIA GeForce RTX 2080 | $600 | $0.28 (7.749 RVN) | 73 mo. | KAWPOW | 30 Mh/s | Ravencoin (3) |
NVIDIA GeForce GTX 1080 Ti | $250 (used) | $0.23 (6.458 RVN) | 36 mo. | KAWPOW | 25 Mh/s | Ravencoin (3) |
AMD Radeon Vega 64 | $450 | $0.23 (6.458 RVN) | 65 mo. | KAWPOW | 25 Mh/s | Ravencoin (3) |
AMD Radeon Vega 56 | $350 | $0.23 (6.458 RVN) | 51 mo. | KAWPOW | 25 Mh/s | Ravencoin (3) |
AMD Radeon RX 5700 | $390 | $0.23 (6.406 RVN) | 57 mo. | KAWPOW | 24.8 Mh/s | Ravencoin (3) |
NVIDIA GeForce RTX 2070 Super | $520 | $0.22 (3.010 AE) | 78 mo. | CuckooCycle | 8.22 Gps | Aeternity (2) |
NVIDIA GeForce RTX 3060 | $440 | $0.21 (5.864 RVN) | 70 mo. | KAWPOW | 22.7 Mh/s | Ravencoin (3) |
AMD Radeon RX 6700 XT | $490 | $0.21 (5.864 RVN) | 78 mo. | KAWPOW | 22.7 Mh/s | Ravencoin (3) |
NVIDIA GeForce RTX 2070 | $500 | $0.20 (2.747 AE) | 82 mo. | CuckooCycle | 7.5 Gps | Aeternity (2) |
NVIDIA GeForce RTX 2060 Super | $280 | $0.20 (5.735 RVN) | 46 mo. | KAWPOW | 22.2 Mh/s | Ravencoin (3) |
NVIDIA GeForce GTX 1070 Ti | $180 (used) | $0.17 (2.271 AE) | 36 mo. | CuckooCycle | 6.2 Gps | Aeternity (2) |
NVIDIA GeForce GTX 1080 | $200 (used) | $0.16 (2.197 AE) | 41 mo. | CuckooCycle | 6 Gps | Aeternity (2) |
NVIDIA P104-100 | $170 (used) | $0.16 (2.161 AE) | 35 mo. | CuckooCycle | 5.9 Gps | Aeternity (2) |
NVIDIA GeForce RTX 2060 | $260 | $0.16 (2.161 AE) | 54 mo. | CuckooCycle | 5.9 Gps | Aeternity (2) |
NVIDIA GeForce GTX 1070 | $160 (used) | $0.14 (0.007 BTG) | 38 mo. | Equihash 144_5 | 55.5 Sol/s | Bitcoin Gold (7) |
AMD Radeon RX 5600 XT | $280 | $0.14 (3.890 RVN) | 68 mo. | KAWPOW | 15.06 Mh/s | Ravencoin (3) |
NVIDIA GeForce GTX 1660 Ti | $270 | $0.14 (3.875 RVN) | 65 mo. | KAWPOW | 15 Mh/s | Ravencoin (3) |
NVIDIA GeForce GTX 1660 Super | $240 | $0.13 (3.668 RVN) | 61 mo. | KAWPOW | 14.2 Mh/s | Ravencoin (3) |
AMD Radeon RX 6600 XT | $300 | $0.12 (3.358 RVN) | 84 mo. | KAWPOW | 13 Mh/s | Ravencoin (3) |
AMD Radeon RX 590 | $180 (used) | $0.11 (3.100 RVN) | 54 mo. | KAWPOW | 12 Mh/s | Ravencoin (3) |
AMD Radeon RX 580 | $160 (used) | $0.10 (2.841 RVN) | 53 mo. | KAWPOW | 11 Mh/s | Ravencoin (3) |
AMD Radeon RX 480 | $110 (used) | $0.10 (2.841 RVN) | 36 mo. | KAWPOW | 11 Mh/s | Ravencoin (3) |
NVIDIA GeForce GTX 1660 | $220 | $0.10 (1.355 AE) | 73 mo. | CuckooCycle | 3.7 Gps | Aeternity (2) |
AMD Radeon RX 570 | $150 (used) | $0.09 (2.583 RVN) | 54 mo. | KAWPOW | 10 Mh/s | Ravencoin (3) |
AMD Radeon RX 470 | $100 (used) | $0.09 (2.583 RVN) | 36 mo. | KAWPOW | 10 Mh/s | Ravencoin (3) |
NVIDIA GeForce GTX 1060 | $120 (used) | $0.09 (0.004 BTG) | 46 mo. | Equihash 144_5 | 35 Sol/s | Bitcoin Gold (7) |
NVIDIA GeForce GTX 1650 | $210 | $0.07 (2.092 RVN) | 94 mo. | KAWPOW | 8.1 Mh/s | Ravencoin (3) |
NVIDIA GeForce GTX 1050 Ti | $100 (used) | $0.06 (1.808 RVN) | 52 mo. | KAWPOW | 7 Mh/s | Ravencoin (3) |
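The payback column above is essentially the card price divided by the monthly mining profit. A minimal sketch of that arithmetic; the flat 30-day month is an assumption, and the site's own rounding of the displayed profit can shift a result by a month:

```python
# Hedged sketch: estimating the "payback" column from price and 24h profit.
# A flat 30-day month is assumed; the table's rounding may differ slightly.

def payback_months(price_usd, daily_profit_usd, days_per_month=30):
    """Months needed for mining profit to cover the card's purchase price."""
    return price_usd / (daily_profit_usd * days_per_month)

# RTX 3080 Ti: $1,200 at $0.54/day -> about 74 months
print(round(payback_months(1200, 0.54)))  # 74
```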
List-table of Nvidia GeForce video cards | AMD news
Description
Here is a detailed list-table of Nvidia GeForce video cards. The newest models are listed at the top of the list, the oldest ones at the bottom.
List table of Nvidia GeForce(GF) RTX4000 Series video cards
Nvidia GeForce RTX4000 video cards are based on the 5nm Ada Lovelace architecture.
GeForce | GPU Name | Clock (Base / Boost) | Memory | PCIe | Bus (bits) | CUDA Cores | FP32 | TDP (W) |
---|---|---|---|---|---|---|---|---|
RTX4090 | AD102-300 | 2235MHz / 2520MHz | 24GB GDDR6X | 4.0 | 384 | 16384 | 82.6 TFLOPs | 450 |
RTX4080 | AD103-300 | 2205MHz / 2508MHz | 16GB GDDR6X | 4.0 | 256 | 9728 | 48.8 TFLOPs | 320 |
RTX4080 12GB | AD104-400 | 2310MHz / 2685MHz | 12GB GDDR6X | 4.0 | 192 | 7680 | 40.1 TFLOPs | 285 |
List table of Nvidia GeForce(GF) RTX3000 Series video cards
Nvidia GeForce RTX3000 video cards are based on the 8nm Ampere architecture.
GeForce | GPU Name | Clock (Base / Boost) | Memory | PCIe | Bus (bits) | CUDA Cores | FP32 | TDP (W) |
---|---|---|---|---|---|---|---|---|
RTX3090Ti | GA102-350 | 1560MHz / 1860MHz | 24GB GDDR6X | 4.0 | 384 | 10752 | 40 TFLOPs | 450 |
RTX3090 | GA102-300 | 1395MHz / 1695MHz | 24GB GDDR6X | 4.0 | 384 | 10496 | 35.6 TFLOPs | 350 |
RTX3080Ti | GA102-225 | 1365MHz / 1665MHz | 12GB GDDR6X | 4.0 | 384 | 10240 | 34.1 TFLOPs | 350 |
RTX3080 | GA102-200 | 1440MHz / 1710MHz | 10GB GDDR6X | 4.0 | 320 | 8704 | 29.8 TFLOPs | 320 |
RTX3070Ti | GA104-400 | 1575MHz / 1770MHz | 8GB GDDR6X | 4.0 | 256 | 6144 | 21.75 TFLOPs | 290 |
RTX3070 | GA104-300 | 1500MHz / 1725MHz | 8GB GDDR6 | 4.0 | 256 | 5888 | 20.3 TFLOPs | 220 |
RTX3060Ti | GA104-200 | 1410MHz / 1665MHz | 8GB GDDR6 | 4.0 | 256 | 4864 | 16.2 TFLOPs | 200 |
RTX3060 | GA106-300 | 1320MHz / 1777MHz | 8GB GDDR6 | 4.0 | 192 | 3584 | 12.7 TFLOPs | 170 |
RTX3050 | GA106-150 | 1552MHz / 1777MHz | 8GB GDDR6 | 4.0 | 128 | 2560 | 9.1 TFLOPs | 130 |
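The FP32 column follows directly from the CUDA core count and boost clock: each core performs two floating-point operations per cycle (a fused multiply-add), so FP32 TFLOPs = 2 × cores × boost clock. A quick check against two rows from the table:

```python
# FP32 throughput: 2 FLOPs (fused multiply-add) per CUDA core per cycle.

def fp32_tflops(cuda_cores, boost_mhz):
    """Theoretical FP32 throughput in TFLOPs."""
    return 2 * cuda_cores * boost_mhz / 1e6

print(round(fp32_tflops(10752, 1860), 1))  # 40.0  (RTX 3090 Ti)
print(round(fp32_tflops(10240, 1665), 1))  # 34.1  (RTX 3080 Ti)
```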
List table of Nvidia GeForce(GF) RTX2000 Series video cards
Nvidia GeForce RTX2000 video cards are based on 12nm Turing architecture.
GeForce | GPU Name | Speed (Turbo) | Memory | PCIe | Bits | CUDA Cores | FP32 | TDP (W) |
RTX2080Ti | TU102-300 | 1350Mhz 1545Mhz | 11Gb GDDR6 | 3.0 | 352 | 4352 | 13.4 TFLOPs | 250 |
RTX2080 Super | TU104-450 | 1650Mhz 1815Mhz | 8Gb GDDR6 | 3.0 | 256 | 3072 | 11.1 TFLOPs | 250 |
RTX2080 | TU104-400 | 1515Mhz 1710Mhz | 8Gb GDDR6 | 3.0 | 256 | 2944 | 10.1 TFLOPs | 215 |
RTX2070 Super | TU104-410 | 1605Mhz 1770Mhz | 8Gb GDDR6 | 3.0 | 256 | 2560 | 9.1 TFLOPs | 215 |
RTX2070 | TU106-400 | 1410Mhz 1620Mhz | 8Gb GDDR6 | 3.0 | 256 | 2304 | 7.5 TFLOPs | 175 |
RTX2060 Super | TU106-410 | 1470Mhz 1650Mhz | 8Gb GDDR6 | 3.0 | 256 | 2176 | 7.2 TFLOPs | 175 |
RTX2060 12Gb | TU106-300 | 1470Mhz 1650Mhz | 12Gb GDDR6 | 3.0 | 192 | 2176 | 7.2 TFLOPs | 185 |
RTX2060 | TU106-300 | 1365Mhz 1680Mhz | 6Gb GDDR6 | 3.0 | 192 | 1920 | 6.5 TFLOPs | 160 |
List table of Nvidia GeForce(GF) GTX1600 Series video cards
Nvidia GeForce GTX1600 video cards are based on 12nm Turing architecture.
GeForce | GPU Name | Speed (Turbo) | Memory | PCIe | Bits | CUDA Cores | FP32 | TDP (W) |
GTX1660Ti | TU116-400 | 1500Mhz 1770Mhz | 6Gb GDDR6 | 3.0 | 192 | 1536 | 5.4 TFLOPs | 120 |
GTX1660 Super | TU116-300 | 1530Mhz 1785Mhz | 6Gb GDDR6 | 3.0 | 192 | 1408 | 5.1 TFLOPs | 125 |
GTX1660 | TU116-300 | 1530Mhz 1785Mhz | 6Gb GDDR5 | 3.0 | 192 | 1408 | 5.0 TFLOPs | 120 |
GTX1650 Super | TU116-250 | 1530Mhz 1725Mhz | 4Gb GDDR6 | 3.0 | 128 | 1280 | 4.4 TFLOPs | 100 |
GTX1650 | TU117-300 | 1485Mhz 1665Mhz | 4Gb GDDR5 | 3.0 | 128 | 896 | 3.0 TFLOPs | 75 |
GTX1630 | TU117-150 | 1740Mhz 1785Mhz | 4Gb GDDR6 | 3.0 | 64 | 512 | 1.83 TFLOPs | 75 |
List table of Nvidia GeForce(GF) GTX1000 Series video cards
Nvidia GeForce GTX1000 video cards are based on 16nm Pascal architecture.
GeForce | GPU Name | Speed (Turbo) | Memory | PCIe | Bits | CUDA Cores | FP32 | TDP (W) |
GTX1080Ti | GP102-350 | 1480Mhz 1582Mhz | 11Gb GDDR5X | 3.0 | 352 | 3584 | 11.3 TFLOPs | 250 |
GTX1080 | GP104-400 | 1607Mhz 1733Mhz | 8Gb GDDR5X | 3.0 | 256 | 2560 | 8.9 TFLOPs | 180 |
GTX1070Ti | GP104-300 | 1607Mhz 1683Mhz | 8Gb GDDR5 | 3.0 | 256 | 2432 | 8.2 TFLOPs | 180 |
GTX1070 | GP104-200 | 1506Mhz 1683Mhz | 8Gb GDDR5 | 3.0 | 256 | 1920 | 6.5 TFLOPs | 150 |
GTX1060 6Gb | GP106-400 | 1506Mhz 1708Mhz | 6Gb GDDR5 | 3.0 | 192 | 1280 | 4.4 TFLOPs | 120 |
GTX1060 5Gb | PG410 | 1506Mhz 1708Mhz | 5Gb GDDR5 | 3.0 | 160 | 1280 | 4.3 TFLOPs | 120 |
GTX1060 3Gb | GP106-300 | 1506Mhz 1708Mhz | 3Gb GDDR5 | 3.0 | 192 | 1152 | 3.9 TFLOPs | 120 |
GTX1050Ti | GP107-400 | 1291Mhz 1392Mhz | 4Gb GDDR5 | 3.0 | 128 | 768 | 2.1 TFLOPs | 75 |
GTX1050 3Gb | PG210 | 1392Mhz 1518Mhz | 3Gb GDDR5 | 3.0 | 96 | 768 | 2.3 TFLOPs | 75 |
GTX1050 | GP107-300 | 1354Mhz 1455Mhz | 2Gb GDDR5 | 3.0 | 128 | 640 | 1.9 TFLOPs | 75 |
GT1030 | GP108-300 | 1227Mhz 1468Mhz | 2Gb GDDR5 | 3.0 | 64 | 384 | 1.1 TFLOPs | 30 |
List table of Nvidia GeForce(GF) GTX900 Series video cards
Nvidia GeForce GTX900 video cards are based on 28nm Maxwell architecture.
GeForce | GPU Name | Speed (Turbo) | Memory | PCIe | Bits | CUDA Cores | FP32 | TDP (W) |
GTX980Ti | GM200-310 | 1000Mhz 1076Mhz | 6Gb GDDR5 | 3.0 | 384 | 2816 | 6.1 TFLOPs | 250 |
GTX980 | GM204-400 | 1126Mhz 1216Mhz | 4Gb GDDR5 | 3.0 | 256 | 2048 | 5.0 TFLOPs | 165 |
GTX970 | GM204-200 | 1051Mhz 1178Mhz | 4Gb (3.5+0.5) GDDR5 | 3.0 | 224 | 1664 | 3.9 TFLOPs | 145 |
GTX960 | GM206-300 | 1127Mhz 1178Mhz | 2Gb GDDR5 | 3.0 | 128 | 1024 | 2.4 TFLOPs | 120 |
GTX950 | GM206-250 | 1024Mhz 1188Mhz | 2Gb GDDR5 | 3.0 | 128 | 768 | 1.8 TFLOPs | 90 |
List table of Nvidia GeForce(GF) GTX700 Series video cards
Nvidia GeForce GTX700 video cards are based on 28nm Kepler architecture.
GeForce | GPU Name | Speed (Turbo) | Memory | PCIe | Bits | CUDA Cores | FP32 | TDP (W) |
GTX780Ti | GK110-425 | 875Mhz 928Mhz | 3Gb GDDR5 | 3.0 | 384 | 2880 | 5.4 TFLOPs | 250 |
GTX780 | GK110-300 | 863Mhz 900Mhz | 3Gb GDDR5 | 3.0 | 384 | 2304 | 4.2 TFLOPs | 250 |
GTX770 | GK104-425 | 1046Mhz 1084Mhz | 2Gb GDDR5 | 3.0 | 256 | 1536 | 3.3 TFLOPs | 230 |
GTX760Ti | GK104-325 | 915Mhz 980Mhz | 2Gb GDDR5 | 3.0 | 256 | 1344 | 2.6 TFLOPs | 170 |
GTX760 | GK104-225 | 980Mhz 1033Mhz | 2Gb GDDR5 | 3.0 | 256 | 1152 | 2.4 TFLOPs | 170 |
GTX750Ti | GM107-400 | 1020Mhz 1085Mhz | 2Gb GDDR5 | 3.0 | 128 | 640 | 1.4 TFLOPs | 60 |
GTX750 | GM107-300 | 1020Mhz 1085Mhz | 1Gb GDDR5 | 3.0 | 128 | 512 | 1.1 TFLOPs | 55 |
GT740 | GK107-425 | 993Mhz | 1Gb GDDR5 | 3.0 | 128 | 384 | 0.8 TFLOPs | 65 |
GT730 | GF108-400 | 700Mhz | 1Gb GDDR3 | 2.0 | 128 | 96 | 0.1 TFLOPs | 50 |
GT720 | GK208 | 797Mhz | 1Gb GDDR3 | 2.0 | 64 | 192 | 0.3 TFLOPs | 20 |
List table of Nvidia GeForce(GF) GTX600 Series video cards
Nvidia GeForce GTX600 video cards are based on 28nm Kepler architecture.
GeForce | GPU Name | Speed (Turbo) | Memory | PCIe | Bits | CUDA Cores | FP32 | TDP (W) |
GTX690 | 2*GK104-400 | 915Mhz 1019Mhz | 4Gb GDDR5 | 3.0 | 512 | 3072 | 6.3 TFLOPs | 300 |
GTX680 | GK104-400 | 1006Mhz 1058Mhz | 2Gb GDDR5 | 3.0 | 256 | 1536 | 3.3 TFLOPs | 195 |
GTX670 | GK104-325 | 915Mhz 980Mhz | 2Gb GDDR5 | 3.0 | 256 | 1344 | 2.7 TFLOPs | 170 |
GTX660Ti | GK104-300 | 915Mhz 980Mhz | 2Gb GDDR5 | 3.0 | 192 | 1344 | 2.6 TFLOPs | 150 |
GTX660 | GK106-400 | 980Mhz 1033Mhz | 2Gb GDDR5 | 3.0 | 192 | 960 | 2.0 TFLOPs | 140 |
GTX650Ti | GK106-220 | 928Mhz | 1Gb GDDR5 | 3.0 | 128 | 768 | 1.4 TFLOPs | 110 |
GTX650 | GK107-450 | 1058Mhz | 1Gb GDDR5 | 3.0 | 128 | 384 | 0.8 TFLOPs | 64 |
GT640 | GK107-300 | 900Mhz | 2Gb GDDR3 | 3.0 | 128 | 384 | 0.7 TFLOPs | 65 |
GT630 | GF108-400 | 810Mhz | 1Gb GDDR3 | 2.0 | 128 | 96 | 0.2 TFLOPs | 65 |
GT620 | GF108-100 | 700Mhz | 1Gb GDDR3 | 2.0 | 64 | 96 | 0.13 TFLOPs | 49 |
GT610 | GF119-300 | 810Mhz | 1Gb GDDR3 | 2.0 | 64 | 48 | 0.08 TFLOPs | 29 |
List table of Nvidia GeForce(GF) GTX500 Series video cards
Nvidia GeForce GTX500 video cards are based on 40nm Fermi 2.0 architecture.
GeForce | GPU Name | Speed (Turbo) | Memory | PCIe | Bits | CUDA Cores | FP32 | TDP (W) |
GTX590 | 2*GF110 | 612Mhz | 3Gb GDDR5 | 2.0 | 768 | 1024 | 2.5 TFLOPs | 365 |
GTX580 | GF110 | 782Mhz | 1.5/3Gb GDDR5 | 2.0 | 384 | 512 | 1.6 TFLOPs | 244 |
GTX570 | GF110 | 742Mhz | 1/2.5Gb GDDR5 | 2.0 | 320 | 480 | 1.4 TFLOPs | 219 |
GTX560Ti | GF114 | 900Mhz | 1/2Gb GDDR5 | 2.0 | 256 | 384 | 1.3 TFLOPs | 170 |
GTX560 | GF114 | 810Mhz | 1/2Gb GDDR5 | 2.0 | 256 | 336 | 1.2 TFLOPs | 150 |
GTX550Ti | GF116 | 910Mhz | 1Gb GDDR5 | 2.0 | 192 | 192 | 0.7 TFLOPs | 116 |
GT530 | GF119 | 700Mhz | 1/2Gb GDDR3 | 2.0 | 128 | 96 | 0.3 TFLOPs | 50 |
GT520 | GF119 | 810Mhz | 1/2Gb GDDR3 | 2.0 | 64 | 48 | 0.2 TFLOPs | 29 |
GT510 | GF119 | 523Mhz | 1/2Gb GDDR3 | 2.0 | 64 | 48 | 0.1 TFLOPs | 25 |
List table of Nvidia GeForce(GF) GTX400 Series video cards
Nvidia GeForce GTX400 video cards are based on the 40nm Fermi architecture.
GeForce | GPU Name | Speed (Turbo) | Memory | PCIe | Bits | CUDA Cores | FP32 | TDP (W) |
GTX480 | GF100 | 700Mhz | 1.5Gb GDDR5 | 2.0 | 384 | 480 | 1.3 TFLOPs | 250 |
GTX470 | GF100 | 608Mhz | 1.2Gb GDDR5 | 2.0 | 320 | 448 | 1.1 TFLOPs | 215 |
GTX465 | GF100 | 608Mhz | 1.2Gb GDDR5 | 2.0 | 256 | 352 | 0.9 TFLOPs | 200 |
GTX460 256bit | GF104 | 675Mhz | 1/2Gb GDDR5 | 2.0 | 256 | 336 | 0.9 TFLOPs | 160 |
GTX460 192bit | GF104 | 675Mhz | 0.7/1.5Gb GDDR5 | 2.0 | 192 | 336 | 0.9 TFLOPs | 150 |
GTS450 GDDR5 | GF106 | 783Mhz | 1/2Gb GDDR5 | 2.0 | 128 | 192 | 0.6 TFLOPs | 106 |
GTS450 GDDR3 | GF106 | 783Mhz | 1/2Gb GDDR3 | 2.0 | 128 | 192 | 0.6 TFLOPs | 106 |
GT440 GDDR5 | GF108 | 810Mhz | 0.5/1Gb GDDR5 | 2.0 | 128 | 96 | 0.35 TFLOPs | 65 |
GT440 GDDR3 | GF108 | 810Mhz | 1/2Gb GDDR3 | 2.0 | 128 | 96 | 0.35 TFLOPs | 65 |
GT430 | GF108 | 700Mhz | 1/2Gb GDDR3 | 2.0 | 128 | 96 | 0.27 TFLOPs | 49 |
GT420 | GF108 | 700Mhz | 1/2Gb GDDR3 | 2.0 | 128 | 48 | 0.12 TFLOPs | 50 |
GT405 | GT218 | 589Mhz | 0.5/1Gb GDDR3 | 2.0 | 64 | 16 | 0.07 TFLOPs | 25 |
List table of Nvidia GeForce(GF) GTX300 Series video cards
GeForce | GPU Name | Speed (Turbo) | Memory | PCIe | Bits | CUDA Cores | FP32 | TDP (W) |
GT340 | GT215 | 550Mhz | 0.5/1Gb GDDR5 | 2.0 | 128 | 96 | 0.4 TFLOPs | 69 |
GT330 256bit | G92b | 550Mhz | 2Gb GDDR3 | 2.0 | 256 | 112 | 0.45 TFLOPs | 75 |
GT330 128bit | G92b | 500Mhz | 0.25-1Gb GDDR2 | 2.0 | 128 | 96 | 0.36 TFLOPs | 75 |
GT320 | GT215 | 540Mhz | 1Gb GDDR3 | 2.0 | 128 | 72 | 0.3 TFLOPs | 43 |
GT315 | GT216 | 475Mhz | 0.5/1Gb GDDR3 | 2.0 | 64 | 48 | 0.16 TFLOPs | 33 |
GT310 | GT218 | 589Mhz | 0.5/1Gb GDDR2 | 2.0 | 64 | 16 | 0.07 TFLOPs | 31 |
List table of Nvidia GeForce(GF) GTX200 Series video cards
List table of Nvidia GeForce(GF) GT100 Series video cards
GeForce | GPU Name | Speed (Turbo) | Memory | PCIe | Bits | CUDA Cores | FP32 | TDP (W) |
GTS150 | G92b | 738Mhz | 1Gb GDDR3 | 2.0 | 256 | 128 | 0.7 TFLOPs | 141 |
GT140 | G94b | 650Mhz | 0.5/1Gb GDDR3 | 2.0 | 256 | 64 | 0.3 TFLOPs | 96 |
GT130 | G94b | 500Mhz | 0.7/1.5Gb DDR2 | 2.0 | 192 | 48 | 0.18 TFLOPs | 75 |
GT120 | G96b | 500Mhz | 0.5/1Gb DDR2 | 2.0 | 128 | 32 | 0.13 TFLOPs | 50 |
G100 | G98 | 567Mhz | 0.2/0.5/1Gb DDR2 | 2.0 | 64 | 8 | 0.03 TFLOPs | 35 |
Programs for testing a video card
The speed of modern computer games depends heavily on the performance of the video adapter. A test run in a dedicated program will help you evaluate it, and such diagnostics are also worthwhile when buying a second-hand device, to assess the condition of the video memory and the graphics processor. Below are the best programs for checking a video card, from simple utilities to serious tools.
DxDiag
The fastest way to test a video card is the DxDiag utility built into Windows. It is installed together with DirectX, the component that controls graphics in applications, and is available in Windows 7, 8 and 10. To launch it, press Win + R and enter the dxdiag command.
On the Display tab you will see the exact parameters of the video adapter. Check the «Notes» field: it lists any faults found, or states that there are none. Keep in mind that DxDiag detects only obvious failures that show up in normal operation.
Nvidia Inspector
Nvidia video cards can be conveniently examined with the Nvidia Inspector utility. It shows the device's detailed characteristics, lets you monitor the GPU frequency and the cooling system, change the voltage and fan speed, and overclock the hardware. It works on Windows XP, 7 and above, and requires no installation.
Note that Nvidia Inspector does not include a stress test; the utility only provides status information and statistics.
Intel Iris Plus Graphics G7
Intel Iris Plus Graphics G7 is a GPU integrated into Intel's 10th-generation mobile processors based on the Ice Lake architecture. It contains 64 execution units running at up to 1.1 GHz, which yields a peak compute performance of 1.12 teraflops for FP32 operations and twice as much (2.24 teraflops) at reduced precision, FP16. The actual frequency of the G7 depends on the CPU model, and graphics performance is strongly affected by the TDP limit set by the manufacturer, as well as the quality of the laptop's power delivery and cooling.
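The quoted figures follow from the execution-unit layout: each Ice Lake EU contains 8 FP32 ALUs, each capable of 2 FLOPs per cycle (FMA), and FP16 runs at double rate. A sketch of the arithmetic (rounding lands slightly above the truncated 1.12 figure quoted above):

```python
def igpu_tflops(eus: int, ghz: float, fp16: bool = False) -> float:
    """Peak throughput: EUs x 8 ALUs x 2 FLOPs/cycle, doubled for FP16."""
    flops = eus * 8 * 2 * ghz * 1e9
    if fp16:
        flops *= 2
    return flops / 1e12

print(round(igpu_tflops(64, 1.1), 2))        # 1.13 TFLOPS FP32
print(round(igpu_tflops(64, 1.1, True), 2))  # 2.25 TFLOPS FP16
```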
In a computer game
You can check a video card's performance in a modern computer game with demanding graphics. It will load the device to the maximum, after which you can evaluate how the card copes. Before you start, install the GPU-Z program: this application monitors a critical parameter, the temperature under load.
Download and run GPU-Z and, in the lower-left corner, select the video adapter model. The program shows detailed information about the video memory, DirectX version, shaders and driver. Go to the «Sensors» tab and look at the «GPU Temperature» field: this is the current temperature of the graphics adapter. Double-click the value to show the maximum; «Max» will appear in the corner.
Now launch your favorite game without closing GPU-Z. Play for 30-40 minutes and check the temperature reading: it should not exceed 90-95 degrees. If the value rose above 100 during testing, the device is not in the best condition.
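If you log the temperatures during the session (GPU-Z has a log-to-file option, and on Nvidia cards `nvidia-smi --query-gpu=temperature.gpu --format=csv` prints the same sensor), a few lines of Python can apply the 90-95/100 degree rule to the whole log. The thresholds and sample readings below are illustrative:

```python
def verdict(temps_c, warn=90, fail=100):
    """Classify a gaming session by its peak GPU temperature."""
    peak = max(temps_c)
    if peak >= fail:
        return peak, "faulty: artifacts or shutdown likely"
    if peak >= warn:
        return peak, "running hot: clean the cooler"
    return peak, "ok"

# Readings taken every few minutes during a 30-40 minute game session
print(verdict([52, 67, 78, 84, 88, 91, 89]))  # (91, 'running hot: clean the cooler')
```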
Early 2021 graphics card hierarchy
Tom’s Hardware has tested hundreds of Nvidia and AMD graphics cards from various manufacturers. That experience allowed them to build a rating based on the performance of the different card models.
The rating includes cards of the current and previous generations, including the most powerful ones. Whether you are gaming or working on performance-heavy tasks such as 4K video editing, a serious graphics card is essential; the processor plays only a secondary role in such cases.
The table in the article is based on scores from graphics card benchmarks. The best graphics cards are described in other articles based on a variety of factors, including price, power consumption, and efficiency. The ranking also includes the recently introduced Radeon RX 6800 XT and RX 6800 graphics cards.
To help you decide which video card you need, use the table with dozens of test results. Cards are ranked from fastest to slowest, based on results from nine games at medium and ultra graphics settings at 1080p, 1440p and 4K. For comparison, the fastest card is given a score of 100% and the rest are rated relative to it.
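The «fastest card = 100%» normalization used in the ranking is easy to reproduce: average each card's FPS over the test games, then divide by the best average. The card names and FPS numbers below are made up for illustration:

```python
def rank_relative(avg_fps: dict[str, float]) -> dict[str, float]:
    """Score each card as a percentage of the fastest one, fastest first."""
    best = max(avg_fps.values())
    return {card: round(100 * fps / best, 1)
            for card, fps in sorted(avg_fps.items(), key=lambda kv: -kv[1])}

# Hypothetical average FPS over nine games
print(rank_relative({"Card A": 144.0, "Card B": 108.0, "Card C": 72.0}))
# {'Card A': 100.0, 'Card B': 75.0, 'Card C': 50.0}
```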
FurMark
A real stress test of the video card is performed by the FurMark program. This free utility loads the adapter as heavily as possible, evaluating its stability and performance. How to check a video card in FurMark:
- Close other applications, including any file downloads.
- Run the program; it will detect the device model, the available resolution and the current temperature.
- Set the standard resolution in the «Resolution» field and open the settings by clicking the «Settings» button.
- In the test options, check the boxes for «Dynamic background» and «Burn-in». Do not use the «Xtreme burn-in» mode unless you understand the intricacies of computer hardware: it can burn out video memory.
- Return to the main screen and click «Burn-in test».
- Confirm that you accept the possible risks, such as the PC rebooting due to the intensity of the load.
- The video card performance test starts.
- You will see the temperature graph: in the first minutes it rises sharply, then it stabilizes.
In a healthy device the temperature will not exceed 100 degrees. At higher values, color artifacts will appear on the screen, the computer will restart, or the video driver will shut down.
Upon successful completion of the test you will see a window with the main indicators: the laptop or PC parameters, the maximum temperature and the FPS. To compare your result with those of users with similar devices, click «Submit». A browser window will open with a table in which you can evaluate your score.
If the FurMark diagnostic fails, try basic troubleshooting. Inspect the device; if dust has caked onto the heatsink, remove it with cotton swabs or blow it off with compressed air. Renew the thermal paste on the chip yourself or contact a service center.
Comparative table of video cards. Compare video cards, characteristics.
An updated video card comparison table, new participants, and the same old need to compare video cards before buying; or the comparative characteristics that answer the perennial question: did I buy a good video card or not. Hello. I gathered my strength and decided to update the comparative characteristics of video cards.
As you already know, new Radeon and GeForce video cards have appeared on the market, the unhealthy «miracles» with prices have subsided, and sensible buyers have started taking an interest in the new cards. So it is time to compare video cards in a slightly new way and with new participants. I am sure that the comparison table of Radeon and GeForce video cards will, as before, help you understand the importance of particular characteristics, reveal what rubbish a seller is trying to push, and save you from a foolish purchase. So, here is the table of video cards and what it all means.
Why do we need a comparison table of video cards? Many of you, deciding which video card to buy, ask: why does the build recommend a GeForce GTX 560 Ti, while the store clerk suggested a GeForce GTX 560 SE, saying «everything will fly»? The answer is that a computer is an exact science, and here the numbers rule.
When assembling a computer and choosing PC components, forget about temptations and super discounts, and in particular, when studying the comparative characteristics of video cards, ignore sales patter: «super», «cool», «everything will fly», and so on. Such words will not help you compare video cards or weigh the characteristics that actually affect performance, but they help the seller very well.
The general comparison table plus the performance table is the main food for your brain: if you want a good video card, shut your ears, switch on your brain, and compare video cards before buying. That will spare you the question of whether you bought a good video card or not.
Regrettably, the lion's share of users first buy a video card and only then discover that a comparison table exists, that cards need to be compared before the purchase rather than after, and that comparative characteristics are not just numbers but an accurate description of the capabilities and needs of your future video card.
When assembling a computer, I always counted on three years of service, not on a «cool» build with a pointless quarterly replacement of «unsuccessful» components, and that is what I wish for you. And don't ask whether a Radeon HD 5550 / HD 6450 / Radeon R7 240 or a GeForce GT 520 / GT 610 / GeForce GT 720 is a good video card: the performance table plus the comparison table, with price taken into account, will answer that.
For beginners overwhelmed by the table's compute units and frequencies: below the table you will find explanations of the columns, covering what each comparative characteristic affects and what it is responsible for.
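One practical use of the TDP and PSU columns: before buying, check that your power supply covers the card plus the rest of the system with some headroom. The 150 W system estimate and 1.25× headroom factor below are my rule-of-thumb assumptions, not values from the table:

```python
def psu_ok(psu_watts: int, card_tdp: int, rest_of_system: int = 150,
           headroom: float = 1.25) -> bool:
    """True if the PSU covers card + system load with the given headroom."""
    return psu_watts >= (card_tdp + rest_of_system) * headroom

# GTX 560 Ti from the table: 170 W TDP, 500 W recommended PSU
print(psu_ok(500, 170))  # True: (170 + 150) * 1.25 = 400 W needed
print(psu_ok(350, 170))  # False
```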
Comparative table of video cards Radeon and GeForce.
-= GeForce =- Card Models | GPU Crystal | Blocks | GPU/memory clocks, MHz | Memory | Bus | TDP, W | PSU, W |
— | — | — | — | — | — | — | — |
RTX 2080 Ti 11gb | TU102 | 4352/272/88 | 1350/1545/14000 | GDDR6 | 352 bit | 250w | 650W |
RTX 2080 8gb | TU104 | 2944/184/64 | 1515/1710/14000 | GDDR6 | 256 bit | 215w | 650W |
RTX 2070 8gb | TU106 | 2304/144/64 | 1410/1620/14000 | GDDR6 | 256 bit | 175w | 550W |
GTX 1660 Ti 6gb | TU116 | 1536/96/48 | 1500/1770/12000 | GDDR6 | 192 bit | 120w | 450W |
Titan Xp 12gb | GP102 | 3840/224/96 | 1405/1582/11000 | GDDR5X | 384 bit | 250w | 650W |
Titan X 12gb | GP102 | 3584/224/96 | 1417/1530/10000 | GDDR5X | 384 bit | 250w | 650W |
GTX 1080 Ti 11gb | GP102 | 3584/224/88 | 1481/1582/11000 | GDDR5X | 352 bit | 250w | 650W |
GTX 1080 8gb | GP104 | 2560/160/64 | 1607/1733/10000 | GDDR5X | 256 bit | 180w | 550W |
GTX 1070 Ti 8gb | GP104 | 2432/152/64 | 1607/1683/8000 | GDDR5 | 256 bit | 180w | 550W |
GTX 1070 8gb | GP104 | 1920/120/64 | 1506/1683/8000 | GDDR5 | 256 bit | 150w | 500W |
GTX 1060 6gb | GP106 | 1280/80/48 | 1506/1709/8000 | GDDR5 | 192 bit | 120w | 450W |
GTX 1060 3gb | GP106 | 1152/72/48 | 1506/1709/8000 | GDDR5 | 192 bit | 120w | 450W |
GTX 1050 Ti 4gb | GP107 | 768/48/32 | 1290/1392/7008 | GDDR5 | 128 bit | 75w | 350W |
GTX 1050 3gb | GP107 | 768/48/24 | 1392/1518/7008 | GDDR5 | 96 bit | 75w | 300W |
GTX 1050 2gb | GP107 | 640/40/32 | 1354/1455/7008 | GDDR5 | 128 bit | 75w | 300W |
GT 1030 2gb | GP108 | 384/24/16 | 1227/1468/6008 | GDDR5 | 64 bit | 30w | 300W |
— | — | — | — | — | — | — | — |
Titan X 12gb | GM200 | 3072/192/96 | 1000/1075/7000 | GDDR5 | 384 bit | 250w | 600W |
GTX 980 Ti 6gb | GM200 | 2816/176/96 | 1000/1075/7000 | GDDR5 | 384 bit | 250w | 600W |
GTX 980 4gb | GM204 | 2048/128/64 | 1126/1216/7000 | GDDR5 | 256 bit | 165w | 500W |
GTX 970 4gb | GM204 | 1664/104/56 | 1050/1178/7000 | GDDR5 | 256 bit | 145w | 500W |
GTX 960 2gb | GM206 | 1024/64/32 | 1127/1178/7000 | GDDR5 | 128 bit | 120w | 400W |
GTX 950 2gb | GM206 | 768/48/32 | 1024/1188/6600 | GDDR5 | 128 bit | 90w | 350W |
— | — | — | — | — | — | — | — |
-= GeForce =- Card Models | GPU Crystal | Blocks | GPU/memory clocks, MHz | Memory | Bus | TDP, W | PSU, W |
— | — | — | — | — | — | — | — |
Titan Z 12gb | 2xGK110 | 5760/448/96 | 705/876/7000 | GDDR5 | 2×384 | 375w | 700W |
GTX 780 Ti 3gb | GK110 | 2880/240/48 | 875/928/7000 | GDDR5 | 384 bit | 250w | 600W |
Titan Black 6gb | GK110 | 2880/240/48 | 889/980/7000 | GDDR5 | 384 bit | 250w | 600W |
GTX Titan 6gb | GK110 | 2688/224/48 | 836/876/6008 | GDDR5 | 384 bit | 250w | 600W |
GTX 780 3gb | GK110 | 2304/192/48 | 863/900/7010 | GDDR5 | 384 bit | 250w | 600W |
GTX 770 2gb | GK104 | 1536/128/32 | 1046/1085/7010 | GDDR5 | 256 bit | 230w | 600W |
GTX 760 2gb | GK104 | 1152/96/32 | 980/1033/6008 | GDDR5 | 256 bit | 170w | 500W |
GTX 750 Ti 2gb | GM107 | 640/40/16 | 1020/1085/5400 | GDDR5 | 128 bit | 60w | 300W |
GTX 750 1gb | GM107 | 512/32/16 | 1020/1085/5000 | GDDR5 | 128 bit | 55w | 300W |
GT 740 1gb | GK107 | 384/32/16 | 993/5000 | GDDR5 | 128 bit | 64w | 350W |
GT 730 1gb | GK208 | 384/16/8 | 902/5000 | GDDR5 | 64 bit | 25w | 300W |
GT 730 1gb | GF108 | 96/16/4 | 700/1800 | GDDR3 | 128 bit | 49w | 300W |
— | — | — | — | — | — | — | — |
GTX 690 4gb | 2xGK104 | 3072/256/64 | 915/1019/6008 | GDDR5 | 2×256 | 300w | 650W |
GTX 680 2gb | GK104 | 1536/128/32 | 1006/1058/6008 | GDDR5 | 256 bit | 195w | 550W |
GTX 670 2gb | GK104 | 1344/112/32 | 915/980/6008 | GDDR5 | 256 bit | 170w | 500W |
GTX 660 Ti 2gb | GK104 | 1344/112/24 | 915/980/6008 | GDDR5 | 192 bit | 150w | 450W |
GTX 660 2gb | GK106 | 960/80/24 | 980/1033/6008 | GDDR5 | 192 bit | 140w | 450W |
GTX 650 Ti Boost 2gb | GK106 | 768/64/24 | 980/1033/6008 | GDDR5 | 192 bit | 134w | 450W |
GTX 650 Ti 1gb | GK106 | 768/64/16 | 925/5400 | GDDR5 | 128 bit | 110w | 400W |
GTX 650 1gb | GK107 | 384/32/16 | 1058/5000 | GDDR5 | 128 bit | 64w | 400W |
GT 640 2gb | GK107 | 384/32/16 | 900/1784 | GDDR3 | 128 bit | 65w | 350W |
GT 630 1gb | GF108 | 96/16/4 | 810/1620/3200 | GDDR5 | 128 bit | 65w | 300W |
GT 620 1gb | GF108 | 96/16/4 | 700/1400/1800 | GDDR3 | 64 bit | 49w | 300W |
GT 610 1gb | GF119 | 48/8/4 | 810/1620/1800 | GDDR3 | 64 bit | 29w | 300W |
— | — | — | — | — | — | — | — |
GTX 590 3gb | 2xGF110 | 1024/128/96 | 607/1215/3414 | GDDR5 | 2×384 | 365w | 700W |
GTX 580 1.5gb | GF110 | 512/64/48 | 772/1544/4008 | GDDR5 | 384 bit | 244w | 600W |
GTX 570 1.2gb | GF110 | 480/60/40 | 732/1464/3800 | GDDR5 | 320 bit | 219w | 550W |
GTX 480 1.5gb | GF100 | 480/60/48 | 701/1402/3700 | GDDR5 | 384 bit | 250w | 600W |
GTX 470 1.2gb | GF100 | 448/56/40 | 607/1215/3348 | GDDR5 | 320 bit | 215w | 550W |
GTX 560 Ti 448C | GF110 | 448/56/40 | 732/1464/3600 | GDDR5 | 320 bit | 210w | 550W |
GTX 560 Ti 1gb | GF114 | 384/64/32 | 822/1644/4008 | GDDR5 | 256 bit | 170w | 500W |
GTX 560 1gb | GF114 | 336/56/32 | 675/1620/4004 | GDDR5 | 256 bit | 160w | 450W |
GTX 560 SE 1gb | GF114 | 288/56/32 | 736/1472/3828 | GDDR5 | 192 bit | 150w | 450W |
GTX 465 1gb | GF100 | 352/44/32 | 607/1215/3208 | GDDR5 | 256 bit | 200w | 550W |
GTX 460 v2 1gb | GF114 | 336/56/32 | 778/1556/4004 | GDDR5 | 256 bit | 160w | 450W |
GTX 460 1gb | GF104 | 336/56/32 | 675/1350/3600 | GDDR5 | 256 bit | 160w | 450W |
GTX 460 SE 1gb | GF104 | 288/56/32 | 650/1300/3400 | GDDR5 | 256 bit | 150w | 450W |
GTX 550 Ti 1gb | GF116 | 192/32/24 | 900/1800/4104 | GDDR5 | 192 bit | 116w | 400W |
GTS 450 1gb | GF106 | 192/32/16 | 783/1566/3600 | GDDR5 | 128 bit | 106w | 400W |
GT 440 1gb | GF108 | 96/16/4 | 810/1620/3200 | GDDR5 | 128 bit | 65w | 300W |
GT 430 1gb | GF108 | 96/16/4 | 700/1400/1800 | GDDR3 | 64 bit | 49w | 300W |
— | — | — | — | — | — | — | — |
GTX 295 1792mb | 2xGT200b | 480/160/56 | 576/1242/2000 | GDDR3 | 2×448 | 289w | 650W |
GTX 285 1gb | GT200b | 240/80/32 | 648/1476/2485 | GDDR3 | 512 bit | 204w | 550W |
GTX 280 1gb | GT200 | 240/80/32 | 602/1296/2215 | GDDR3 | 512 bit | 236w | 550W |
GTX 275 896mb | GT200b | 240/80/28 | 633/1404/2270 | GDDR3 | 448 bit | 219w | 550W |
GTX 260 896mb | GT200 | 216/72/28 | 576/1242/2000 | GDDR3 | 448 bit | 182w | 500W |
GTS 250 1gb | G92b | 128/64/16 | 738/1836/2200 | GDDR3 | 256 bit | 150w | 450W |
GT 240 1gb | GT215 | 96/32/8 | 550/1340/3400 | GDDR5 | 128 bit | 69w | 300W |
GT 220 1gb | GT216 | 48/16/8 | 625/1360/1580 | GDDR3 | 128 bit | 58w | 300W |
GT 210 1gb | GT218 | 16/8/4 | 589/1402/1000 | DDR2 | 64 bit | 30w | 300W |
9800 GTX+ 1gb | G92b | 128/64/16 | 738/1836/2200 | GDDR3 | 256 bit | 150w | 450W |
9800 GT 1gb | G92 | 112/56/16 | 600/1500/1800 | GDDR3 | 256 bit | 105w | 400W |
9600 GT 1gb | G94 | 64/32/16 | 650/1625/1800 | GDDR3 | 256 bit | 96w | 400W |
9500 GT 1gb | G96 | 32/16/8 | 550/1400/1600 | GDDR3 | 128 bit | 50w | 300W |
8400 GS 512mb | G86 | 16/8/4 | 450/900/800 | DDR2 | 64 bit | 40w | 300W |
— | — | — | — | — | — | — | — |
-= Radeon =- Card Models | GPU Crystal | Blocks | GPU/memory clocks, MHz | Memory | Bus | TDP, W | PSU, W |
— | — | — | — | — | — | — | — |
Radeon VII 16gb | Vega 20 | 3840/240/64 | 1400/1750/2000 | HBM2 | 4096 bit | 300w | 700W |
RX Vega 64 8gb | Vega 10 | 4096/256/64 | 1274/1546/1890 | HBM2 | 2048 bit | 295w | 700W |
RX Vega 56 8gb | Vega 10 | 3584/224/64 | 1156/1471/1600 | HBM2 | 2048 bit | 210w | 650W |
RX 590 8gb | Polaris | 2304/144/32 | 1469/1545/8000 | GDDR5 | 256 bit | 185w | 600W |
RX 580 8gb | Polaris | 2304/144/32 | 1257/1340/8000 | GDDR5 | 256 bit | 185w | 600W |
RX 570 4-8gb | Polaris | 2048/128/32 | 1168/1244/7000 | GDDR5 | 256 bit | 150w | 450W |
RX 560 2-4gb | Polaris | 1024/64/16 | 1175/1275/7000 | GDDR5 | 128 bit | 80w | 350W |
RX 550 2gb | Polaris | 512/32/16 | 1100/1183/7000 | GDDR5 | 128 bit | 50w | 300W |
— | — | — | — | — | — | — | — |
RX 480 8gb | Polaris | 2304/144/32 | 1120/1266/8000 | GDDR5 | 256 bit | 150w | 500W |
RX 470 4-8gb | Polaris | 2048/128/32 | 926/1206/6600 | GDDR5 | 256 bit | 120w | 450W |
RX 460 2-4gb | Polaris | 896/48/16 | 1090/1200/7000 | GDDR5 | 128 bit | 75w | 300W |
— | — | — | — | — | — | — | — |
R9 Fury X 4gb | Fiji | 4096/256/64 | 1050/1000 | HBM | 4096 bit | 275w | 650W |
R9 Nano 4gb | Fiji | 4096/256/64 | 1000/1000 | HBM | 4096 bit | 175w | 650W |
R9 Fury 4gb | Fiji | 3584/224/64 | 1000/1000 | HBM | 4096 bit | 275w | 650W |
R9 390X 8gb | Hawaii | 2816/176/64 | 1050/6000 | GDDR5 | 512 bit | 275w | 650W |
R9 390 8gb | Hawaii | 2560/160/64 | 1000/6000 | GDDR5 | 512 bit | 275w | 650W |
R9 380 4gb | Tonga | 1792/112/32 | 970/5700 | GDDR5 | 256 bit | 190w | 550W |
R7 370 2gb | Pitcairn | 1024/64/32 | 975/5600 | GDDR5 | 256 bit | 110w | 400W |
R7 360 2gb | Bonaire | 768/48/16 | 1050/6500 | GDDR5 | 128 bit | 100w | 400W |
R7 350X 2gb | C-Verde | 640/40/16 | 1000/4500 | GDDR5 | 128 bit | — | — |
R7 350 2gb | C-Verde | 512/32/16 | 925/4500 | GDDR5 | 128 bit | — | — |
— | — | — | — | — | — | — | — |
R9 295X2 8gb | 2xHawaii | 5632/352/128 | 1018/5000 | GDDR5 | 2×512 | 500w | 800W |
R9 290X 4gb | Hawaii | 2816/176/64 | 1000/5000 | GDDR5 | 512 bit | 280w | 650W |
R9 290 4gb | Hawaii | 2560/160/64 | 947/5000 | GDDR5 | 512 bit | 250w | 600W |
R9 280X 3gb | Tahiti | 2048/128/32 | 1000/6000 | GDDR5 | 384 bit | 250w | 600W |
R9 285 2gb | Tonga | 1792/112/32 | 918/5500 | GDDR5 | 256 bit | 190w | 600W |
R9 280 3gb | Tahiti | 1792/112/32 | 933/5000 | GDDR5 | 384 bit | 250w | 600W |
R9 270X 2gb | Pitcairn | 1280/80/32 | 1050/5600 | GDDR5 | 256 bit | 180w | 500W |
R9 270 2gb | Pitcairn | 1280/80/32 | 925/5600 | GDDR5 | 256 bit | 150w | 450W |
R7 265 2gb | Pitcairn | 1024/64/32 | 925/5600 | GDDR5 | 256 bit | 140w | 500W |
R7 260X 2gb | Bonaire | 896/56/16 | 1100/6500 | GDDR5 | 128 bit | 115w | 400W |
R7 250X 1gb | C-Verde | 640/40/16 | 1000/4500 | GDDR5 | 128 bit | 80w | 400W |
R7 250 1gb | Oland | 384/24/8 | 1050/4600 | GDDR5 | 128 bit | 59w | 350W |
R7 240 1gb | Oland | 320/20/8 | 780/1800 | GDDR3 | 128 bit | 30w | 300W |
— | — | — | — | — | — | — | hd 9002 64 | 900/1000/6000 | GDDR5 | 2 × 384 | 40023 | 750W | HD 7970 3 GB | 2048/128/128/128/128/128/128/128/128/128/128/AP | 384 bit | 250w | 550W | |||||||||||||||||||||||
HD 7950 3 gb | Tahiti | 1792/112/32 | 800/5000 | GDDR5 | 384 bit | 200w | 520W | |||||||||||||||||||||||||||||||||
HD 7870 2 GB | Pitcairn | 1280/80/32 | 1000/4800 | GDDR5 | 256 Bit | 175W | ||||||||||||||||||||||||||||||||||
Pitcairn | 1024/64/32 | 860/4800 | GDDR5 | 256 bit | 130w | 450W | ||||||||||||||||||||||||||||||||||
HD 7790 1 gb | Bonaire | 896/56/16 | 1000/6000 | GDDR5 | 128 BIT | 85W | 40023 | |||||||||||||||||||||||||||||||||
HD 7770 1 GB | C-REDE | /40/16 | 1000/400/400/400/400/400/400/400/400/400/400/ALS | 80w | 400W | |||||||||||||||||||||||||||||||||||
HD 7750 1 gb | C-Verde | 512/32/16 | 800/4500 | GDDR5 | 128 bit | 55w | 350W | |||||||||||||||||||||||||||||||||
— | — | — | — | — | — | — | — | |||||||||||||||||||||||||||||||||
CAYMAN | 3072/192/64 | 2×256 | 375w | 750W | ||||||||||||||||||||||||||||||||||||
HD 6970 2 gb | Cayman | 1536/96/32 | 880/5500 | GDDR5 | 256 bit | 250w | 600W | |||||||||||||||||||||||||||||||||
HD 6950 2 GB | Cayman | 1408/88/32 | 800/5000 | GDDR5 | 256 Bit | 5503 | Cayman | 1280/80/32 | 750/4800 | GDDR5 | 256 bit | 180w | 500W | |||||||||||||||||||||||||||
HD 6870 X2 2gb | Barts | 2240/112/64 | 900/4200 | GDDR5 | 2 × 256 | 310W | 70023 | |||||||||||||||||||||||||||||||||
HD 6870 1 GB | BARTS | /56/32 | 900/4 900/4 900/4 900/4 900/4 900/4 900/4 900/4 900/4 900 900/4200 900/4200 900/4 | 151w | 500W | |||||||||||||||||||||||||||||||||||
HD 6850 1 gb | Barts | 960/48/32 | 775/4000 | GDDR5 | 256 bit | 127w | 450W | |||||||||||||||||||||||||||||||||
HD 6790 1 GB | Barts | 800/40/16 | 840/4200 | GDDR5 | 256 Bit | 150W | 50023 | Jun 16 | 850/4800 | GDDR5 | 128 bit | 108w | 400W | |||||||||||||||||||||||||||
HD 6750 1 gb | Juniper | 720/36/16 | 700/4600 | GDDR5 | 128 bit | 86W | 400W | |||||||||||||||||||||||||||||||||
HD 6670 1 GB | Turks | 480/24/8 | 800/4000 | GDDR5 | 128 BIT 900W 900W 900W 900W 900W BITS | HD 6570 1 gb | Turks | 480/24/8 | 650/4000 | GDDR5 | 128 bit | 60w | 350W | |||||||||||||||||||||||||||
HD 6450 1 gb | Caisoc | 160/8/4 | 625/3600 | GDDR5 | 64 BIT | 27W | 300W | HD 6350 1 GB | 80/8/4 | 80/4/4/4/4 GDDR3 | 64 bit | 19w | 300W | |||||||||||||||||||||||||||
— | — | — | — | — | — | — | — | |||||||||||||||||||||||||||||||||
HD 5970 2 gb | Cypress | 3200/160/64 | 725/4000 | GDDR5 | 2 × 256 | 294W | 650W | |||||||||||||||||||||||||||||||||
HD 5870 1 GB | CyPress | 1600/32 900/3 | GDDR5 | 256 bit | 188w | 500W | ||||||||||||||||||||||||||||||||||
HD 5850 1 gb | Cypress | 1440/72/32 | 725/4000 | GDDR5 | 256 bit | 170w | 500W | |||||||||||||||||||||||||||||||||
HD 5830 1 GB | CyPress | 1120/56/16 | 800/4000 | GDDR5 | 256 BIT | 175W | Juniper | 800/40/16 | 850/4800 | GDDR5 | 128 bit | 108w | 400W | |||||||||||||||||||||||||||
HD 5750 1 gb | Juniper | 720/36/16 | 700/4600 | GDDR5 | 128 BIT | 86W | 40023 | |||||||||||||||||||||||||||||||||
HD 5670 1 GB | Redwood | 400/20 | 775/475/475/475/475/475/475/475/475/475/475/475 775/4EN0023 | 61w | 300W | |||||||||||||||||||||||||||||||||||
HD 5570 1 gb | Redwood | 400/20/8 | 650/1800 | GDDR3 | 128 bit | 43w | 300W | |||||||||||||||||||||||||||||||||
HD 5550 1 GB | Redwood | 320/16/8 | 550/1800 | GDDR3 | 128 BIT | 39W | 300W | |||||||||||||||||||||||||||||||||
0. 5 GB | CD0020 80/8/4 | 600/1600 | GDDR3 | 64 BIT | 19W | 300W | ||||||||||||||||||||||||||||||||||
— | — | — | — | — | — | — 9002 | ||||||||||||||||||||||||||||||||||
HD 4890 1 gb | RV790 | 800/40/16 | 850/3900 | GDDR5 | 256 bit | 190w | 500W | |||||||||||||||||||||||||||||||||
HD 4870 1 gb | RV770 | 800/40/16 | 750/ | GDDR3 | 256 bit | 110w | 400W | |||||||||||||||||||||||||||||||||
HD 4830 1 gb | RV770 | 640/32/16 | 575/1800 | GDDR3 | 256 bit | 100w | 400W | |||||||||||||||||||||||||||||||||
HD 4770 1 gb | RV740 | 640/32/16 | 750/3200 | GDDR5 | 128 bit | 75w | 350W | |||||||||||||||||||||||||||||||||
HD 4670 1 gb | RV730 | 320/32/8 | 750/2000 | GDDR3 | 128 BIT | 59W | 300W | |||||||||||||||||||||||||||||||||
— | — | — | — | — | — | — | — | — | — | — | — | — | — | — | — | — | — | — | — | — | — | — | -0023 | — | ||||||||||||||||
-= Radeon =- | GPU | Comp. | Frequency GPU | memory | Shina | TDP | BP | |||||||||||||||||||||||||||||||||
Crystal | /memory MHZ | WITH |
How to read the comparison table of video cards: what each column indicates and why these comparative characteristics matter. In other words, comparing video cards with an understanding of the matter.
Card model — the first column names the developer, Nvidia GeForce or AMD Radeon, and reflects the hierarchy of models by power within a single line of video cards (the first digit of the model number).

GPU — the name of the GPU die. Sometimes the very same die is sold to us twice under different names, so it never hurts to compare video cards anyway.

Compute units — in this table, the raw power of the graphics processor: shader processors / texture units / ROPs (rasterization units). Their number largely determines the performance of the video card. Put simply, from these figures alone you can gauge how many «brains» and how much potential the cards from the GeForce GT 720 to the GeForce GTX Titan Z have; the same comparison across the AMD Radeon R7 240 to Radeon R9 290X line is just as useful.

Important! The number of compute units in Radeon and GeForce video adapters cannot be compared directly; the architectures are not identical.
GPU frequency — the nominal (stock) frequency of the video card. For GeForce video cards up to the 600 series, the values are core / shader unit / memory frequency in MHz. From the 600 series on, the format changed slightly: GPU frequency / boost clock / memory frequency. In Radeon video cards the core and its blocks run at the same frequency, hence only two values: GPU frequency / memory frequency.
Memory — the memory standard specified by the developer; the higher the number, the faster the memory. In cheaper models, manufacturers may reduce the amount of memory, or, on the contrary, double it while fitting previous-generation memory (sometimes even ancient DDR2), which does the card's performance no favors.

Memory bus — the channel for data exchange between the graphics processor and the card's memory. In cheaper graphics cards (under $100), manufacturers may cut its width. Important! Never choose a video card by bus width alone (this sometimes reaches the point of absurdity); check the model's reference specifications instead.

TDP, W — the peak, maximum power consumption of the video card. The table gives the consumption at maximum load; in no way does it mean the card draws that power constantly. All modern video cards have power-saving modes: the lower the load, the lower the power consumption.

PSU, W — the recommendation for choosing a power supply, given as the total wattage of the PSU. If you plan to overclock the system, add another 50-100 W to give the power supply a margin of safety.
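The PSU rule above is simple arithmetic; here is a minimal Python sketch of it. The function name and the wattage passed in are illustrative assumptions, not vendor data.

```python
# Rough PSU sizing per the rule above: start from the recommended PSU
# wattage for the video card and add a 50-100 W margin for overclocking.

def recommended_psu(gpu_psu_rec_w: int, overclock_margin_w: int = 0) -> int:
    """Return the PSU wattage to shop for."""
    return gpu_psu_rec_w + overclock_margin_w

# A card with a 500 W recommendation, in a build that will be overclocked:
print(recommended_psu(500, overclock_margin_w=100))  # 600
```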
Those are all the main comparative characteristics of video cards worth monitoring. I hope this comparison table keeps you off the rake that caught unlucky buyers of the GeForce GTX 460 SE and, later, the GeForce GTX 560: the name is the same, but the performance is different.

And that is without mentioning the cheaper models, where manufacturers change the characteristics however they like, usually for the worse, yet sellers present them as an unprecedented achievement of science and technology.

Try to compare the video cards in your store not only by price but also against the reference specifications published by AMD Radeon or Nvidia GeForce. And, of course, cross-check the comparison against a video card performance table.
Best regards, Denker.
Aida 64
The Aida 64 program evaluates the performance of the computer as a whole — the parameters of the video adapter, processor, hard drive. Stress tests are available to diagnose component stability. How to check the video card for serviceability in Aida 64: in the application, go to the «Tools» menu, from there run the «System stability test». In the settings, check the box next to «Stress GPU» to test the video card.
OCCT
System stability can also be evaluated with the OCCT program. It is used when overclocking video cards and for assessing the possible harm from overloading the device. OCCT displays frequency, voltage, and temperature, and builds and saves graphs of parameter changes. It works with DirectX 9 and 11.
How to check the performance of a video card in OCCT: go to the GPU tab and set:

- test duration — 10-15 minutes is enough;
- resolution;
- shader complexity;
- the error-checking checkbox.
During testing, you will see the current FPS and the temperature of the video adapter. At the end, a summary table of totals is displayed. If the Errors field shows 0, no errors occurred during the test.
Intel Iris Xe Graphics G7
Intel Iris Xe Graphics G7 is a GPU integrated into some Intel Tiger Lake processors. It has 96 or 80 execution units and uses Intel's 12th-generation graphics architecture. Depending on the processor, the operating frequency of the integrated Xe Graphics G7 ranges from 400 MHz to 1350 MHz. The slowest variant is found in the Intel Core i5-1130G7, where the maximum frequency is 1100 MHz.
List of Nvidia GeForce video cards
Testing a new video card operation mode.
Next, you need to make sure the video card runs correctly and stably. To test it, we will use the free FurMark utility. The program loads the video accelerator and renders a test scene on screen, from which you can judge the smoothness of fine detail, the frame rate, and the GPU's temperature, frequency, and voltage.

Download, install, and run FurMark. Press the «BURN-IN test» button and watch the rotating object. If, for one or several minutes, there are no stutters, distortions, freezes, or similar artifacts in the image, the video accelerator has passed the test at the new, increased frequency.
Now you can proceed to the next stage of overclocking the video card — increasing the frequency of the device’s memory.
Practical work on replacing memory chips on a video card
High-quality replacement of memory chips on a video card requires a full-fledged soldering station with lower and upper board heating, stencils for the chips, the usual soldering supplies (flux, desoldering tools, solder balls of the required diameter for reballing, a microscope, etc.), and diagnostic tools (a POST card, a multimeter). After installing the chips and starting the video card, it is advisable to check the video memory subsystem with the MATS/MODS program (for Nvidia video cards).
Reballing donor GDDR5 memory chips (0.45 mm balls) on a special tool:
There are articles and videos on the Internet describing successful upgrades of video memory capacity, for example, installing Samsung K4G80325FB-HC25 chips (8 Gb density) in place of the 4 Gb K4G41325FE-HC25 on an MSI GTX1060 AERO ITX 3G OC video card (the article “Upgrading GTX1060 3Gb to 6Gb“). In that case, besides replacing the memory chips, three 100 kΩ control resistors were moved: R91->R84, R94->R86, R93->R85:
MSI GTX1060 AERO ITX 3G OC: | Location of resistors on a stock and modified video card MSI GTX1060 AERO ITX 6G: |
The same article shows a photo of the resistor configuration for running 6 GB of video memory on Palit 1060 Dual / StromX video cards:
Another documented case is an upgrade from 1 GB to 2 GB: eight Elpida memory chips (128 MB each) were replaced with Hynix MFR chips (256 MB each), a different BIOS was flashed, and the card worked perfectly.
Soldering memory chips:
The presence of a number of articles and videos on the Internet indicates that it is quite possible in practice to upgrade video cards by installing memory chips with increased memory capacity on them.
Practice shows that the most successful memory upgrade of AMD video cards is possible when using BIOS from older models with the same video chip. For Nvidia video cards, the ability to upgrade depends on the performance of the configuration obtained by rearranging the resistors that control the operation of the memory.
How to overclock the Nvidia GeForce 9800 GT
Overclocking the Nvidia GeForce 9800 GT raises its clock frequencies, increasing the card's performance and in-game fps. It still will not let you run games whose minimum requirements the card does not meet.
Special utilities such as MSI Afterburner or Nvidia Inspector will help you overclock the Nvidia GeForce 9800 GT video card.
Mining on an overclocked 9800 GT is possible but not recommended. Even with the advent of Bitcoin Gold, a cryptocurrency that can be mined on GPUs, the performance is too low even to pay for the electricity, especially at such a high TDP.
Benefits
Reasons to choose NVIDIA GeForce 9600 GSO
- Newer graphics card, release date difference 1 year(s) 0 month(s)
- Core frequency 2.5 times higher: 1375 MHz vs 540 MHz
- 3.1 times faster texturing speed: 26.4 billion / sec vs 8.6 billion / sec
- Number of shader processors 3 times more: 96 vs 32
- 3.5 times better floating point performance: 264 gflops vs 76.16 gflops
- A newer manufacturing process for the video card allows it to be more powerful, but with lower power consumption: 65 nm vs 80 nm
- Memory clock 14% faster: 800 MHz vs 700 MHz
- PassMark — G3D Mark 3x better performance: 353 vs 117
- Performance in PassMark — G2D Mark about 74% better: 134 vs 77
- Performance in GFXBench 4.0 benchmark — T-Rex (Frames) about 55% better: 2160 vs 1397
- Performance in GFXBench 4.0 benchmark — T-Rex (Fps) about 55% more: 2160 vs 1397
Features | |
Issue date | 28 April 2008 vs 17 April 2007 |
Core frequency | 1375 MHz vs 540 MHz |
Texturing speed | 26.4 billion/sec vs 8.6 billion/sec |
Number of shaders | 96 vs 32 |
Floating point performance | 264 gflops vs 76.16 gflops |
Process | 65 nm vs 80 nm |
Memory frequency | 800 MHz vs 700 MHz |
Benchmarks | |
PassMark — G3D Mark | 353 vs 117 |
PassMark — G2D Mark | 134 vs 77 |
GFXBench 4.0 — T-Rex (Frames) | 2160 vs 1397 |
GFXBench 4.0 — T-Rex (Fps) | 2160 vs 1397 |
Reasons to choose NVIDIA GeForce 8600 GT
- 2.2 times less power consumption: 47 Watt vs 105 Watt
- The maximum memory size is about 33% larger: 512 MB vs 384 MB
Power consumption (TDP) | 47 Watt vs 105 Watt |
Maximum memory size | 512 MB vs 384 MB |
Feature comparison
 | NVIDIA GeForce 210 | NVIDIA GeForce 9600 GSO |
---|---|---|
Architecture | Tesla 2.0 | Tesla |
Codename | GT218 | G92 |
Production date | October 12, 2009 | April 28, 2008 |
Price at first issue date | $29.49 | $49.99 |
Place in the ranking | 1505 | 1195 |
Price now | $32.99 | $49.99 |
Type | Desktop | Desktop |
Price/performance ratio (0-100) | 6.81 | 13.43 |
Core frequency | 1402MHz | 1375 MHz |
Number of CUDA conveyors | 16 | 96 |
Floating point performance | 39.36 gflops | 264 gflops |
Process | 40nm | 65nm |
Maximum temperature | 105 °C | 105 °C |
Number of shaders | 16 | 96 |
Texturing speed | 4.16 GTexel/s | 26.4 GTexel/s |
Power consumption (TDP) | 30.5 Watt | 105 Watt |
Number of transistors | 260 million | 754 million |
Audio input for HDMI | Internal | S/PDIF |
Video connectors | 1x DVI, 1x DisplayPort, 1x VGA | 2x DVI, 1x S-Video |
HDMI | ||
Maximum resolution VGA | 2048×1536 | 2048×1536 |
Multi-monitor support | ||
Tire | PCI-E 2.0 | PCI-E 2.0 |
Height | 2.731″ (6.9 cm) | 4.376″ (11.1 cm) |
Interface | PCIe 2.0 x16 | PCIe 2.0 x16 |
Length | 6.60″ (16.8 cm) | 9″ (22.9 cm) |
Additional power connectors | None | 6-pin |
SLI support | 2-way | |
DirectX | 10.1 | 10.0 |
OpenGL | 3.1 | 2.1 |
Maximum memory size | 512MB | 384MB |
Memory bandwidth | 8.0 GB/s | 38.4 GB/s |
Memory bus width | 64 Bit | 192 Bit |
Memory frequency | 500MHz | 800MHz |
Memory type | GDDR2 | GDDR3 |
CUDA |
How to reinstall the video driver for the GeForce 9800 GT
The video adapter requires properly working driver software to function correctly. There are three ways to download and install new drivers on a computer with a 9800 GT card:
- Download from the manufacturer’s official resource. The only option that guarantees the correct operation and security of the PC.
- Downloading from third-party resources. A method by which you may not only get the driver for the Nvidia GeForce 9800 GT but also infect your computer with a virus.
- Using special utilities such as DriverPack Solution, DriverHub or Driver Booster Free. In this case, the driver may be out of date.
On the official website of Nvidia, you can find new versions of control programs for the video card, designed for various operating systems. The list of platforms supported by the card includes Windows 7 32 and 64 bit, Windows 10 and Linux. On other resources, you can find drivers for such rare operating systems as Solaris.
For Windows 10
File size: ~290.00 MB; driver version: 342.01 (WHQL), dated 12/14/2016; language: Russian; operating systems: Windows 10 32/64-bit, Windows 8.1 32/64-bit, Windows 8 32/64-bit, Windows 7 32/64-bit, Windows Vista.
Some background information
Memory chips with a density of 4 Gb (512 MB per chip) are installed in video cards with 3 GB of video memory (six chips per card), while 8 Gb chips are used in the 6 GB versions.
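The chip-density arithmetic above is easy to sanity-check; this small sketch (the function name is mine, not from any tool) converts chip density and chip count into total VRAM.

```python
# Chip density in Gbit -> megabytes per chip -> total VRAM in GB.
# Six 4 Gb (512 MB) chips give 3 GB; six 8 Gb (1 GB) chips give 6 GB.

def total_vram_gb(density_gbit: int, chip_count: int) -> float:
    mb_per_chip = density_gbit * 1024 // 8  # 1 Gbit = 128 MB
    return chip_count * mb_per_chip / 1024

print(total_vram_gb(4, 6))  # 3.0
print(total_vram_gb(8, 6))  # 6.0
```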
Specifications table for selected Hynix GDDR5 memory chips
Memory marking | Density, Gbit / MB per chip | Operating frequencies, Gbps | Voltage VDD (device) / VDDQ (I/O), V
---|---|---|---
H5GQ4h34AJR | 4/512 | 7-8 | 1.55/1.35
— | 4/512 | 7-8 | 1.35/1.5 (1.55)
— | 4/512 | 6-7 | 1.35/1.5
— | 4/512 | 4 | 1.5/1.5
— | 4/512 | 5 | 1.35/1.35
H5GC8h34MJR | 8/1024 | 6-7 | 1.35/1.5
— | 8/1024 | 5-6 | 1.35/1.55
H5GQ4h34MFR-R2C | 4/512 | 7-8 | 1.35/1.55
H5GC4h34MFR-T2C | 4/512 | 5-6 | 1.35/1.55
H5GQ2h34BFR-R2C (T2C) | 2/256 | 7 (5-6) | 1.55/1.55
H5GQ2h34AFR-R2C | 2/256 | 7 | 1.6/1.6
The frequency of operation of Hynix memory chips is coded by the following designations, Gbps:
- R4C — 8;
- R2C — 7;
- R0C — 7;
- T2C — 5-6.
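The suffix list above can be turned into a tiny lookup. The function below is a sketch of mine; the mapping is taken directly from the list, and the sample marking is one quoted earlier in the text.

```python
# Decode the speed bin (Gbps) from the suffix of a Hynix GDDR5 marking,
# using the suffix -> speed mapping listed above.

SPEED_BY_SUFFIX = {
    "R4C": "8",
    "R2C": "7",
    "R0C": "7",
    "T2C": "5-6",
}

def hynix_speed_gbps(marking: str) -> str:
    suffix = marking.rsplit("-", 1)[-1]
    return SPEED_BY_SUFFIX.get(suffix, "unknown")

print(hynix_speed_gbps("H5GQ2h34BFR-R2C"))  # 7
```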
Marking of Samsung memory chips:
The fourth letter on the Samsung memory chip marking indicates their type:
- N — GDDR2 SDRAM;
- W — GDDR3 SDRAM;
- J — GDDR3 SGRAM;
- G — GDDR3 SGRAM;
Samsung memory density is determined by the fifth and sixth characters:
- 51 — 512 Mb, 8K/64ms;
- 52 — 512 Mb, 8K/32ms;
- 10 — 1G, 8K/32ms;
- 1G — 1G, 8K/64ms;
- 2G — 2G, 8K/64ms;
- 4G — 4G, 8K/64ms;
- 41 — 4G, 16K/32ms;
Samsung GDDR5 (170 BGA) Memory Specification Chart
Memory marking | Density, Gbit / MB per chip | Operating frequency, Gbps | Voltage, V
---|---|---|---
K4G80325FB-HC25 | 8/1024 | 8 | 1.5
K4G20325FC | 2/256 | 8 | 1.5
K4G20325FD-FC03 (FC03/04) | 2/256 | 8 | 1.5
K4G41325FE-HC25 | 4/512 | 8 | 1.5
Micron GDDR5 memory specification table:
Memory marking | Revision | Density, Gbit / MB per chip | Operating frequencies, Gbps | Voltage, V
---|---|---|---|---
D9TCB (MT51J256M32HFB) | — | 8/1024 | 6, 7, 8 | 1.5/1.35
Table with information about installed GDDR5 video memory in some Nvidia video cards:
Video card name | Video memory size, GB | Memory chips / notes
---|---|---
Asus GeForce DUAL-GTX1060-3G | 3 | SK Hynix H5GQ4h34AJR
Palit GeForce GTX 1060 Super | 6 | Samsung K4G80325FB-HC25
Palit Dual GTX1060 | 6 | Micron
Palit GeForce GTX 1060 JetStream | 3 | SK Hynix H5GQ4h34AJR
Inno3D GeForce GTX1060 Compact | 3 | Samsung K4G41325FE-HC25
MSI GTX 1060 OCV1 (v328 ver. 6.2 and 7.0) | 3 | Memory-strap resistors (100 kΩ) R84 R85 R86 R87 R88 R8…: …00 — Samsung, 011100 — Hynix
MSI GTX 1060 OCV1 (v328 ver. 6.2 and 7.0) | 6 | R84-R8…: …00 — Samsung
MSI GTX 1070 Armor (v330 ver. 6.0) | 8 | R831 R833 R837 R840 R843 R845: 000100 — Samsung, 010100 — Micron
MSI GTX1060 AERO ITX 3G OC | 3 | R91, R94, R93 — Samsung
MSI GTX1060 AERO ITX OC | 6 | R84, R86, R85 — Samsung
Picture of control resistors on Palit Dual GTX1060:
Currently, the following types of GDDR5 memory with 1 GB modules are most available:
- Samsung — K4G80325FB — HC25 and HC28;
- Hynix — H5GQ8h34MJR with different suffixes;
- Micron — D9TCB, D9VVR, D9XSD series with different markings;
- Elpida — does not produce high-density modules.
Video card temperature control.
FurMark can also monitor the heating of the video card: the application interface shows a graph of the current GPU temperature. It should not exceed 90 °C. If it does, take steps to improve the cooling of the video card and the other components of the system unit. This may mean cleaning the case and fans of dust, or installing an additional or more powerful cooler on the computer case, the main processor, or the graphics accelerator.

If a problem appears, take a step back: end the test with the Escape key, return to the Nvidia Inspector program, lower the Shader Clock, and repeat the FurMark test.
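If you would rather watch the temperature from a script than from the FurMark window, the standard `nvidia-smi` query interface can be polled. The 90 °C threshold mirrors the advice above; the helper names are my own, and the query requires an installed Nvidia driver.

```python
# Poll GPU temperature via nvidia-smi and flag readings above 90 C.
import subprocess

TEMP_LIMIT_C = 90

def gpu_temperature_c() -> int:
    """Read the current GPU temperature (requires an Nvidia driver)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"])
    return int(out.decode().strip().splitlines()[0])

def needs_better_cooling(temp_c: int, limit_c: int = TEMP_LIMIT_C) -> bool:
    return temp_c > limit_c

print(needs_better_cooling(94))  # True
print(needs_better_cooling(80))  # False
```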
What games can Nvidia GeForce 9800 GT
Game tests of the GeForce 9800 GT, carried out at the time, showed the card suits budget gaming PCs. The minimum matching configuration is a motherboard with a PCI-Express x16 slot, 512-1024 MB of RAM, and a 500 W power supply. Installing the DirectX 10 package is also recommended.
The test results are as follows:
- In Crysis (2009) at a resolution of 1280×1024, the 512 MB model delivers 22 to 30 fps, roughly on par with the HD 4770.
- When you start the Stalker game (resolution 1680×1050 pixels), the picture change rate reaches 13-25 frames per second if you use an adapter with 512 MB GDDR5, and up to 30 if you install a gigabyte version on your computer.
- Skyrim will not start at all on the 512 MB card, while the gigabyte modification shows up to 65 fps at minimum settings.
Games released after 2011-2012 are not recommended to run with GeForce 9800 GT 512 MB. Most of them will show no more than 20 fps, the rest will not work. A version with 1 GB of memory is suitable, but it is also unlikely to provide an acceptable quality of the gameplay.
Instructions for overclocking the video card GeForce
Before starting, you need to determine at what frequencies the graphics adapter is currently operating. This can be done using the convenient and free GPU-Z program.
Download and install the program. After launching, the GPU-Z utility will show the operation data of your video accelerator.
We are interested in information about the frequency of the graphics core (GPU Clock), the frequency of shader units (Shader), the frequency of the video card’s memory (Memory).
Screenshot of the GPU-Z program.
Performance parameters are highlighted in green in the picture, and the frequency to be increased is in red. Raise the frequency by no more than 10-15%; otherwise the risk of overheating and failure of the video card is high.
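The 10-15% ceiling above is easy to apply numerically. This is a sketch with sample clocks, not measurements of any particular card.

```python
# Highest clock worth trying under the 10-15% rule of thumb above.

def max_safe_clock_mhz(stock_mhz: int, limit: float = 0.15) -> int:
    return int(stock_mhz * (1 + limit))

print(max_safe_clock_mhz(1375))        # 1581 (at +15%)
print(max_safe_clock_mhz(1375, 0.10))  # 1512 (at +10%)
```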
This program does not require installation. Download the program archive, unpack it and run the nvidiaInspector.exe file.
After starting, in the new window press the «Show Overclocking» button, then press «Yes».
You should see a window with sliders and buttons for overclocking your nVidia video card.
Screenshot of the Nvidia Inspector program.
In the right part of the program, first we will raise the value of the core clock (GPU Clock) and the frequency of the shader units (Shader Clock) will automatically rise, because these parameters are interconnected.
It is necessary to increase the Shader Clock frequency by no more than 15% of the initial operating frequency of the processor; to do this, move the slider to the right.
For the changes to take effect, they must be confirmed by pressing the button «Apply Clock&Voltage».
Benefits
Reasons to choose NVIDIA GeForce 210
- Newer graphics card, release date difference 1 year(s) 5 month(s)
- Core clock about 2% faster: 1402 MHz vs 1375 MHz
- A newer manufacturing process for the video card allows it to be more powerful, but with lower power consumption: 40 nm vs 65 nm
- 3.4 times less power consumption: 30. 5 Watt vs 105 Watt
- The maximum memory size is about 33% larger: 512 MB vs 384 MB
Production date | 12 October 2009 vs 28 April 2008 |
Core frequency | 1402 MHz vs 1375 MHz |
Process | 40 nm vs 65 nm |
Power consumption (TDP) | 30.5 Watt vs 105 Watt |
Maximum memory size | 512 MB vs 384 MB |
Reasons to choose NVIDIA GeForce 9600 GSO
- 6.3x faster texturing speed: 26.4 GTexel/s vs 4.16 GTexel/s
- Number of shader processors 6 times more: 96 vs 16
- 6.7 times better floating point performance: 264 gflops vs 39.36 gflops
- Memory clock 60% faster: 800 MHz vs 500 MHz
- 3.6 times better performance in PassMark — G3D Mark benchmark: 353 vs 97
- 2.8x better performance in PassMark — G2D Mark benchmark: 134 vs 48
- 3. 1 times better performance in GFXBench 4.0 — T-Rex (Frames): 2160 vs 688
- 3.1x better performance in GFXBench 4.0 — T-Rex (Fps): 2160 vs 688
Features | |
Texturing speed | 26.4 GTexel/s vs 4.16 GTexel/s |
Number of shaders | 96 vs 16 |
Floating point performance | 264 gflops vs 39.36 gflops |
Memory frequency | 800 MHz vs 500 MHz |
Benchmarks | |
PassMark — G3D Mark | 353 vs 97 |
PassMark — G2D Mark | 134 vs 48 |
GFXBench 4.0 — T-Rex (Frames) | 2160 vs 688 |
GFXBench 4.0 — T-Rex (Fps) | 2160 vs 688 |
Characteristics of GeForce 9800 GT video card
Graphics adapter parameters fully meet the requirements of gaming applications in 2008-2009. The main characteristics of the Nvidia GeForce 9800 GT are as follows:
- G92-270 GPU;
- GPU frequency — 550 to 600 MHz;
- Memory frequency — 1400-1800 MHz;
- Bit depth — 256 bits;
- Maximum data transfer rate — 57.6 GB / s;
- Supported image resolution is up to 2560×1600.
The video card supports Nvidia technologies.

Overview of the GeForce 9800 GT
The power consumption of the 9800 GT is quite high — at 105 W, so it requires a powerful power supply for its operation. The manufacturer recommends using at least 450 watts. To run modern games, you should choose a more productive PSU — 500 or 600 watts.
In order to maintain the normal temperature of the GeForce 9800 GT video card, all modifications are equipped with active cooling systems — usually with one cooler.
The video adapter has the following connectors for connecting peripheral devices:
- 2 DVI, to which you can also connect ordinary VGA and HDMI cables through adapters;
- TV-Out for analog signal output;
- MIO that can combine two cards.
The maximum effective frequency of the video card is 2000 MHz, which allows overclocking by 11-30%, depending on the model. Keep in mind that a noticeable performance increase can come with severe overheating.
Temperature control.
An overclocked device raises the temperature inside the entire system unit. At first, therefore, you should monitor the temperature sensors not only of the video card but also of the motherboard and the main processor. The most convenient way to do this is with the free SpeedFan program.
This program not only displays the temperatures of all sensors of the system unit: processor, memory, graphics accelerator, hard drive, motherboard chipset, but also allows you to configure the mode of operation of the cooling system.
If the temperature of any sensor is high, then care must be taken to improve air cooling in the computer case. To do this, clean the cracks and air ducts from dust. If necessary, install or change cooling fans — coolers for intake and exhaust.
Power consumption table for NVIDIA GeForce GTX and RTX
11/17/2021
Power consumption is one of the main characteristics of any video card. The fact is that video cards, like central processing units, are one of the main consumers of energy in a computer. Current mid-range models typically require 150 to 250 watts of power, while flagship models can require up to 300 watts.
Naturally, these requests must be satisfied by the power supply. Otherwise, the computer may not work stably, blue screens, sudden reboots, and other problems are possible. Therefore, when assembling a computer, it is extremely important to take into account power consumption and choose a power supply that can cope with such a load.
In this article, we provide a table of power consumption for NVIDIA GeForce GTX and RTX graphics cards, starting from the GTX 400 series, released in 2010, and ending with the RTX 2080 Ti and TITAN RTX, which appeared in 2019.
Table of Contents
- Power Consumption Chart
- Power consumption of NVIDIA TITAN
- Power consumption for NVIDIA Quadro
- Power consumption for RTX 3090, 3080, 3070 graphics cards
- RTX 2060, 2070, 2080 (Super) graphics card power consumption
- Power consumption for GTX 1650, 1660 (Super, Ti)
- Power consumption for GTX 1050, 1060, 1070, 1080 (Ti)
- Power consumption for GTX 950, 960, 970, 980 (Ti) graphics cards
- Power consumption for GTX 750, 760, 770, 780 (Ti)
- Power consumption for GTX 650, 660, 670, 680, 690 (Ti, Ti Boost)
- Power consumption for GTX 550, 560, 570, 580, 590 (Ti)
- Power consumption for GTX 450, 460, 465, 470, 480 graphics cards
Table of power consumption of video cards
The table shows two values, «Power consumption» and «Recommended PSU power».
- Power consumption is the power consumption of the video card that will be required from the power supply, this parameter can be used to estimate the total power consumption of the entire computer.
- The recommended PSU power is the power of the power supply for a computer with this video card, as recommended by NVIDIA. The recommended power supply capacity is calculated based on typical components and in most cases is sufficient for stable operation of the computer.
For most work and gaming computers with mid-range processors and without overclocking, a power supply with the power recommended by NVIDIA will be sufficient. But, if you are using a flagship processor, plan to overclock components or upgrade the system, then you need to choose a more powerful PSU. Usually in such cases, 100-150 watts above the recommended value will be more than enough.
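Putting that advice into code: the sketch below takes NVIDIA's recommended PSU wattage (a few values copied from the tables in this article) and adds headroom for a flagship CPU or planned overclocking. The function name and the 150 W figure are my own choices within the 100-150 W range suggested above.

```python
# Recommended PSU wattage (per NVIDIA) plus optional headroom.

RECOMMENDED_PSU_W = {
    "GeForce RTX 3090": 750,
    "GeForce RTX 3070": 650,
    "GeForce GTX 1660": 450,
}

def psu_for_build(gpu: str, headroom_w: int = 0) -> int:
    return RECOMMENDED_PSU_W[gpu] + headroom_w

# Flagship CPU or overclocking planned: add 150 W of headroom.
print(psu_for_build("GeForce RTX 3070", headroom_w=150))  # 800
```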
Power consumption of NVIDIA TITAN graphics cards
Model name | Power consumption (W) | Recommended PSU power (W) |
NVIDIA TITAN RTX | 280 | 650 |
NVIDIA TITAN V | 250 | 600 |
NVIDIA TITAN Xp | 250 | 600 |
NVIDIA TITAN X (Pascal) | 250 | 600 |
NVIDIA GeForce GTX TITAN X | 250 | 600 |
NVIDIA GeForce GTX TITAN Black | 250 | 600 |
NVIDIA GeForce GTX TITAN | 250 | 600 |
Power consumption of NVIDIA Quadro graphics cards
Model name | Power consumption (W) | Recommended PSU power (W) |
NVIDIA Quadro RTX 8000 | 295 W | — |
NVIDIA Quadro RTX 6000 | 295 W | — |
NVIDIA Quadro RTX 5000 | 265 W | — |
NVIDIA Quadro RTX 4000 | 160 W | — |
Power consumption for RTX 3090, 3080, 3070
Model name | Power consumption (W) | Recommended PSU power (W) |
NVIDIA GeForce RTX 3090 | 350 | 750 |
NVIDIA GeForce RTX 3080 | 320 | 750 |
NVIDIA GeForce RTX 3070 | 220 | 650 |
Power consumption for RTX 2060, 2070, 2080 (Super)
Model name | Power consumption (W) | Recommended PSU power (W) |
NVIDIA GeForce RTX 2080 Ti | 260 | 650 |
NVIDIA GeForce RTX 2080 Super | 250 | 650 |
NVIDIA GeForce RTX 2080 | 225 | 650 |
NVIDIA GeForce RTX 2070 Super | 215 | 650 |
NVIDIA GeForce RTX 2070 | 175 | 550 |
NVIDIA GeForce RTX 2060 Super | 175 | 550 |
NVIDIA GeForce RTX 2060 | 160 | 500 |
Power consumption for GTX 1650, 1660 (Super, Ti)
Model name | Power consumption (W) | Recommended PSU power (W) |
NVIDIA GeForce GTX 1660 Ti | 120 | 450 |
NVIDIA GeForce GTX 1660 Super | 125 | 450 |
NVIDIA GeForce GTX 1660 | 120 | 450 |
NVIDIA GeForce GTX 1650 Super | 100 | 350 |
NVIDIA GeForce GTX 1650 | 75 | 300 |
Power consumption for GTX 1050, 1060, 1070, 1080 (Ti)
Model name | Power consumption (W) | Recommended PSU power (W) |
NVIDIA GeForce GTX 1080 Ti | 250 | 600 |
NVIDIA GeForce GTX 1080 | 180 | 500 |
NVIDIA GeForce GTX 1070 Ti | 180 | 500 |
NVIDIA GeForce GTX 1070 | 150 | 500 |
NVIDIA GeForce GTX 1060 | 120 | 400 |
NVIDIA GeForce GTX 1050 Ti | 75 | 300 |
NVIDIA GeForce GTX 1050 | 75 | 300 |
NVIDIA GeForce GT 1030 | 30 | 300 |
Power consumption for GTX 950, 960, 970, 980 (Ti)
Model name | Power consumption (W) | Recommended PSU power (W) |
NVIDIA GeForce GTX 980 Ti | 250 | 600 |
NVIDIA GeForce GTX 980 | 165 | 500 |
NVIDIA GeForce GTX 970 | 145 | 500 |
NVIDIA GeForce GTX 960 | 120 | 400 |
NVIDIA GeForce GTX 950 | 90 | 350 |
Power consumption for GTX 750, 760, 770, 780 (Ti)
Model name | Power consumption (W) | Recommended PSU power (W) |
NVIDIA GeForce GTX 780 Ti | 250 | 600 |
NVIDIA GeForce GTX 780 | 250 | 600 |
NVIDIA GeForce GTX 770 | 230 | 600 |
NVIDIA GeForce GTX 760 | 170 | 500 |
NVIDIA GeForce GTX 750 Ti | 60 | 300 |
NVIDIA GeForce GTX 750 | 55 | 300 |
Power consumption for GTX 650, 660, 670, 680, 690 (Ti, Ti Boost)
Model name | Power consumption (W) | Recommended PSU power (W) |
NVIDIA GeForce GTX 690 | 300 | 650 |
NVIDIA GeForce GTX 680 | 195 | 550 |
NVIDIA GeForce GTX 670 | 170 | 500 |
NVIDIA GeForce GTX 660 Ti | 150 | 450 |
NVIDIA GeForce GTX 660 | 140 | 450 |
NVIDIA GeForce GTX 650 Ti Boost | 134 | 450 |
NVIDIA GeForce GTX 650 Ti | 110 | 400 |
NVIDIA GeForce GTX 650 | 64 | 400 |
NVIDIA GeForce GTX 645 | 130 | 450 |
NVIDIA GeForce GT 640 (GDDR5) | 49 | 300 |
NVIDIA GeForce GT 640 (DDR3) | 65 | 350 |
NVIDIA GeForce GT 630 | 65 | 300 |
NVIDIA GeForce GT 620 | 49 | 300 |
Power consumption for GTX 550, 560, 570, 580, 590 (Ti)
Model name | Power consumption (W) | Recommended PSU power (W) |
NVIDIA GeForce GTX 590 | 365 | 700 |
NVIDIA GeForce GTX 580 | 244 | 600 |
NVIDIA GeForce GTX 570 | 219 | 550 |
NVIDIA GeForce GTX 560 Ti | 170 | 500 |
NVIDIA GeForce GTX 560 | 150 | 450 |
NVIDIA GeForce GTX 550 Ti | 116 | 400 |
NVIDIA GeForce GT 520 | 29 | 300 |
Power consumption for GTX 450, 460, 465, 470, 480
Model name | Power consumption (W) | Recommended PSU power (W) |
NVIDIA GeForce GTX 480 | 250 | 600 |
NVIDIA GeForce GTX 470 | 220 | 550 |
NVIDIA GeForce GTX 465 | 200 | 550 |
NVIDIA GeForce GTX 460 | 160 | 450 |
NVIDIA GeForce GTS 450 | 106 | 400 |
What do the numbers and letters in the names of Nvidia graphics cards mean?