NVIDIA GeForce 7800GTX (G70) / Video cards

Authors: Musyaka, Kuzin Andrey

Introduction

The G70 announcement arrived when no one expected it: in the middle of summer. It is hardly the season for launches, but for NVIDIA being first is a matter of principle. Early leaks of preliminary information on the R520 and G70 had already caused a stir; people were waiting and growing indignant. The first fragmentary details surfaced even before CeBIT in March, and after the show reviewers went home with a more or less accurate picture of the upcoming products. The graphics industry has settled on a clear course of increasing the number of pipelines, and until DX10 arrives (expected only with Longhorn), no revolutionary breakthroughs in graphics are anticipated.

Both companies have a mature, debugged, working pipeline, and no one doubts it has been polished down to the last comma, both in the silicon layout and in the drivers. The evidence: in synthetic tests the competing chips have long scored roughly the same, and the marketing battles are fought over games like HL2 and DOOM 3. That is life, and that is fine. When development runs into a technological ceiling, the marketers roll in with their heavy artillery.

It was known from the start that the G70 was designed for the 0.11 µm process, which means no more than 24 pipelines would fit in the available die area. With ATI the intrigue was greater: the R520 was designed for the 0.09 µm process! Elementary arithmetic suggested that all 32 pipelines could fit, which ATI immediately denied («not the time»). In the end, the new generation of graphics chips from both leaders shares the same 24pp/8vp formula. Once again: with the release of DX10, these conventions and the division of pipelines into pixel and vertex units will die out on their own, as pipelines become unified and truly fully programmable.

As became known back in early May, ATI’s announcement was postponed until mid-summer: the same rake NVIDIA already stepped on while mastering an ultra-fine process, plus the debugging and launch of its SLI analogue, a technology called CrossFire.

Before moving on to real tests of the new product, an important note must be made. The manufacturers’ eight-year race to expand the capabilities and performance of graphics accelerators has frankly run into insufficient CPU performance. Intel and AMD have been banging their heads against the wall for a year now: 64-bit support, multiple cores, security features, and new steppings are being introduced, but the essence does not change; the real computing power of systems is not growing. More precisely, it is growing, but out of all proportion to the money and effort spent. Moreover, under the pressure of circumstances, both companies have abandoned their traditional processor naming schemes.

The creators of graphics chips have driven themselves into a corner in this situation. On the one hand, as new process nodes are mastered they can quite calmly and organically keep adding pipelines... CPU manufacturers face far more serious problems, above all instilling «multi-threaded thinking» in programmers. Cards of the 16pp/6vp generation (GeForce 6800 Ultra and X850 XT) are already overkill for most gaming applications at any graphics quality settings, even the most aggressive. The industry is driven forward (and kept alive) by a handful of companies that can be counted on the fingers of one hand: id Software, Valve, Crytek... the rest sit idle and wait for money to fall from the sky for implementing any feature at all, even just DX9.0c.

Buying today’s novelty, every extreme gamer should be absolutely clear about two things:

— only the fastest CPU can fully load a 24pp/8vp video card;

— the advantage of the current hi-end is the ability to play at maximum resolutions with maximum quality settings.

Such is the state of affairs. Now we are ready to inspect and test the NVIDIA GeForce 7800GTX card.

Key generation differences:

Card                       Core     Memory                 Pipes  Process  Transistors  Retail price
X850 XT Platinum Edition   540 MHz  1.18 GHz (37.8 GB/s)   16/6   0.13 µm  160 million  $500
GeForce 6800 Ultra         400 MHz  1.10 GHz (35.2 GB/s)   16/6   0.13 µm  222 million  $550
GeForce 7800GTX            430 MHz  1.20 GHz (38.4 GB/s)   24/8   0.11 µm  302 million  $599
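
The bandwidth figures in parentheses follow directly from the 256-bit bus width and the effective memory clock. A minimal sketch of the arithmetic, with clocks taken from the table above:

```python
# Memory bandwidth = (bus width in bits / 8) bytes per transfer
# * effective (DDR) memory clock. All three cards use a 256-bit bus.
BUS_BITS = 256

cards = {
    "X850 XT Platinum Edition": 1.18e9,  # effective clock, Hz
    "GeForce 6800 Ultra": 1.10e9,
    "GeForce 7800GTX": 1.20e9,
}

for name, ddr_clock in cards.items():
    gb_per_s = BUS_BITS / 8 * ddr_clock / 1e9
    print(f"{name}: {gb_per_s:.1f} GB/s")
# -> 37.8, 35.2 and 38.4 GB/s, matching the table
```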

First of all, I would like to note the new cooling system. For a more objective assessment of its advantages, recall NVIDIA’s not very successful experiment with the NV30, whose far from modest heat output forced a very bulky and, on top of that, noisy cooler. NVIDIA then worked on its mistakes, the result being the NV40 (we leave aside the intermediate NV35), but reference-design boards (not only NVIDIA’s own samples but the vast majority of retail cards from partners) again used a two-slot cooler. This was justified in terms of efficiency and allowed the noise level to be reduced, but it was still one of the points ATI used when touting the advantages of the R420 over the competing NV40: the RX800XT, after all, had a single-slot cooling system.

NVIDIA GeForce 7800GTX (G70)

All these factors appear to have been taken into account in NVIDIA’s current flagship, the 7800GTX: the reference card is equipped with a single-slot cooling system that is quieter than the one fitted to the 6800 Ultra (of course, any assessment of noise is inherently subjective). Temperatures are quite acceptable too: 53 °C under load at stock frequencies, bearing in mind that the G70 is made on the 0.11 µm process, versus 0.13 µm for the NV40/45. Later, when retail G70 boards from the company’s partners go on sale, we will probably see other cooling designs.

As for the design of the cooling system itself, it is an aluminum heatsink with soldered fins covering the GPU and the memory chips on the front side of the board. On top, the heatsink carries a heat pipe for more efficient cooling of the video memory chips.

A shroud with a fan and a rather originally engraved NVIDIA logo sits on top of the assembly, while the voltage regulators are cooled by a separate aluminum heatsink.

Voltage regulators

On the reverse side of the board there is an aluminum plate used to secure the cooling system. A separate plate covers the four video memory chips located on the back of the board, along with empty pads for four more chips that will be fitted on the 512 MB version of the board.

The board carries 256 MB of GDDR3 video memory on a 256-bit bus, in the form of eight Samsung chips with a 1.6 ns access time, corresponding to a rated frequency of 625 MHz (1250 MHz DDR). Out of the box, the memory on the GeForce 7800GTX runs at 1200 MHz DDR. At 38.4 GB/s, memory bandwidth is even more impressive than the NV40’s 35.2 GB/s.
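
The rated frequency follows from the access time by simple arithmetic; a quick sketch (the 1.4 ns figure anticipates the faster chips mentioned below):

```python
# Rated memory clock from chip access time: f = 1 / t_access.
# GDDR3 is double data rate, so the effective figure is twice the base clock.
for t_ns in (1.6, 1.4):
    base_mhz = 1_000 / t_ns  # nanoseconds -> MHz
    print(f"{t_ns} ns -> {base_mhz:.0f} MHz base ({2 * base_mhz:.0f} MHz DDR)")
# 1.6 ns -> 625 MHz (1250 MHz DDR)
# 1.4 ns -> ~714 MHz (~1429 MHz DDR, marketed as 1430 MHz DDR)
```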

It is possible that «extreme» modifications of the 7800GTX will appear later, equipped with faster 1.4 ns memory (rated at 1430 MHz DDR) and higher factory clocks (most likely Gainward will be among the first to make such changes, with its next Goes Like Hell solution). GeForce 7800GTX cards with 512 MB of video memory should also see the light of day soon, as the empty pads for eight more memory chips on the PCB best attest.

Now let’s look at the G70 GPU itself. According to the manufacturer, the transistor count reaches 302 million (recall that the NV40 GPU had 222 million). The nominal operating frequency is 430 MHz, with twenty-four pixel pipelines and eight vertex units.
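
Those numbers translate into raw texturing throughput as follows; a rough sketch, assuming one texture unit per pixel pipeline (the norm for this generation):

```python
# Theoretical peak texel fill rate = pixel pipelines * core clock,
# assuming one texture unit per pipeline (a simplification).
configs = {
    "GeForce 6800 Ultra (16pp @ 400 MHz)": (16, 400),
    "GeForce 7800GTX (24pp @ 430 MHz)": (24, 430),
}
for name, (pipes, mhz) in configs.items():
    print(f"{name}: {pipes * mhz / 1000:.2f} Gtexel/s")
# 6.40 vs 10.32 Gtexel/s: roughly a 61% jump in raw texturing power
```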

As of today, the 7800GTX is the absolute champion in board dimensions. Its PCB is longer than the 6800 Ultra’s, and even outdoes the AGP version of the RX800XL, which until now had proudly held the record in this respect.

As for power consumption, the GeForce 7800GTX draws about 110 W, and NVIDIA recommends a power supply of at least 350 W for a single 7800GTX and 500 W for an SLI configuration. These requirements are much more modest than those of the previous flagship, the GeForce 6800 Ultra, which called for a roughly 480 W unit. The latter was partly because the GeForce 6800 Ultra AGP had two auxiliary power connectors, so the PSU needed a corresponding number of free Molex leads (each connector required its own power cable), something only high-wattage units could offer.
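
The 350 W recommendation is easy to sanity-check against the card’s draw. A rough budget, where every non-GPU figure is an illustrative assumption for a 2005-era system rather than a measurement:

```python
# Rough power budget behind the 350 W PSU recommendation.
gpu_w = 110   # 7800GTX draw, from the text
cpu_w = 90    # e.g. an Athlon 64 under load (assumption)
rest_w = 60   # motherboard, memory, drives, fans (assumption)

total = gpu_w + cpu_w + rest_w
print(f"Estimated load: {total} W of 350 W ({total / 350:.0%})")
# ~74% utilization: sensible headroom, since PSUs run least efficiently
# (and least reliably) near their rated limit
```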

The GeForce 7800GTX is also equipped with two digital outputs and TV-out, plus a connector for joining two 7800GTX cards in SLI.

ForceWare 77.xx drivers

A new ForceWare driver branch, 77.6x, has been released for the 7800 series (we used version 77.62).

The standard image quality settings offer the following FSAA modes: Application Controlled (default) / off / 2x / 2x Quincunx / 4x / 8xS.

The GeForce 7800GTX supports anisotropic filtering up to 16x inclusive. The following levels are available: Application Controlled (default) / off / 2x / 4x / 8x / 16x.

In addition, the advanced settings submenu lets you select the adaptive AA mode (off, multisampling, supersampling).

The effect of each mode on image quality is covered a little further down, in the corresponding section of this article.

The driver options also include a menu for texture filtering quality, with a choice of four modes: High Performance / Performance / Quality (default) / High Quality.

In addition, thermal monitoring and frequency control tabs are available (the latter is unlocked via Coolbits).
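
Coolbits is the classic registry switch that unlocks the clock controls. Here is a sketch of the commonly circulated tweak for ForceWare-era drivers; the key path and value are the widely cited ones, so treat them as an assumption and verify against your driver version (administrator rights required):

```python
# Enable the ForceWare clock-frequency tab via the classic CoolBits value.
# Key path and value follow the commonly circulated tweak for this driver
# generation; verify against your specific driver version.
import winreg

KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"

with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 3)

print("CoolBits set; reopen the NVIDIA control panel to see the new tab.")
```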

Luna and Mad Mod Mike Demo

As usual, the announcement of a new graphics solution is accompanied by a new demo program designed to show off all the qualities and technological advantages of the product. The lead character, as with the GeForce 6800 series, is a charming girl: this time a siren with a pronounced oriental appearance.


Here’s how the developers describe the plot of this demo program:

«Gatekeepers stand at the gates on the border between the real and the immaterial world. This trio of disembodied beings floats against the background of a huge eyeball, moving along a spiral that unwinds into a luminous void. They direct the gaze of this unblinking eye with their giant hands, and now we are peering into a black hole that reveals an unknown world, more mysterious than the one surrounding us in our prosaic reality.

These three beings are servants of the High Priestess Luna, who commands the magical eye. Out of the cold white light appears Luna, a charming mythical siren, beckoning us into her gloomy, unreal world. Shimmering with the warm green tones that light her path, she flies through cave passages toward unknown images of the subconscious.

Forget your familiar world; follow Luna and visit a vast game world of chaos. You will see what you have never seen before. Believe in this fabulous world, seeing it all with your own eyes.»


Well, what can one say: impressive. It remains only to add that the demo makes active use of version 3.0 pixel shaders, as well as the following effects:

Translucence — the rendering not only of the pale red glow emitted from under Luna’s skin, but also of the brighter tones emitted by the gatekeepers’ bodies. The intensity of the emitted light is calculated taking into account the depth of the gatekeepers’ bodies, their bones and arteries.

Displacement Mapping — a new displacement mapping method, presented in GPU Gems 2: the use of volumetric textures for computing surfaces, allowing convex objects to be blended.

Real-Time Hair — simulation of the finest hairs on Luna’s head in real time, including their shadows, computed from the amount of hair between each pixel and the light source.

The second character is a young man named Mad Mod Mike, the owner of a very impressive and serious appearance.


«Mad Mod Mike is a hero who is always ready to help the gamer. When night falls, he climbs into your exhausted computers and turns them into real gaming monsters. And with what graphics card? The one NVIDIA offers!» Such is the grandiloquent description of this most curious character. Like the Luna demo, it uses PS 3.0 features along with the Radiant Lighting and Depth of Field effects.

Radiant Lighting — more realistic shadows and light from Mad Mod Mike’s rocket launcher, natural lighting of the environment, and light reflections from objects. Note that the reflection on Mad Mod Mike’s helmet and rocket launcher is a separately rendered scene; up close, subtle reflections are visible on his face.

Depth of Field — Mad Mod Mike demonstrates the camera’s (or human eye’s) ability to focus at a given depth, with objects beyond the focal point smoothly defocused.

Testing

Test stand:

  • CPU: AMD Athlon 64 4000+
  • Motherboard: ASUS A8N-SLI Deluxe
  • Memory: Kingston HyperX PC3200, 2×512 MB
  • OS: WinXP + SP2 + DirectX 9.0c
  • PSU: Hiper 525W

ATI cards were tested with the latest WHQL drivers (Catalyst 5.6); NVIDIA cards with the latest ForceWare 77.62 beta drivers.

So, the graph most important from the point of view of every video card company’s marketing department 😉 3DMark 2005 Pro:

NVIDIA regains the lead, and who would have doubted it? With 24 pipelines one might have expected a sharper margin, but it is what it is; the early driver version says it all.

Now let’s look at the performance drop with quality settings enabled: the card practically did not notice even the maximum 8xS AA/16xAF mode. Genuinely impressive.

And now let’s add data for the Radeon X850 XT Platinum and GeForce 6800 Ultra:

The older synthetic test, 3DMark 2003 Pro:

And again we check the drop at high-quality settings. This time the fall-off is quite noticeable:

Overclocking

Considering the rumors circulating on the Internet about G70-based boards overclocking to 508/1370 MHz (with additional cooling), we already knew roughly what to expect. To obtain the most representative results, however, we decided not to use extra airflow or other cooling modifications, showing the board’s overclocking as it is.

As a result we managed to reach 500/1350 MHz, which is a very good showing. Of course, overclockers and PC enthusiasts with the right cooling gear can certainly raise the bar even higher.

In our case, a GPU overclock of about 16% and a memory overclock of 12.5% look quite good for a top-end product. Recall that for the previous flagship, the GeForce 6800 Ultra, 450 MHz (+12.5%) was considered a very good GPU overclock on stock cooling, and the memory rarely went above 1200 MHz DDR (+9%).
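
The percentages are straightforward; a minimal sketch of the calculation:

```python
# Overclock headroom as a percentage over stock clocks (GeForce 7800GTX).
stock = {"core": 430, "memory": 1200}    # MHz, factory defaults
reached = {"core": 500, "memory": 1350}  # MHz, achieved on stock cooling

for part in stock:
    gain = (reached[part] - stock[part]) / stock[part]
    print(f"{part}: {stock[part]} -> {reached[part]} MHz (+{gain:.1%})")
# core: +16.3%, memory: +12.5%, matching the figures above
```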

Image quality

We evaluated the image quality in two currently relevant games:

  • Far Cry (patch 1.3)
  • Half-Life 2 (d1_trainstation).

All screenshots were taken at 1024×768 resolution.

Far Cry screens:

no AA / 1xAF

8xS AA / 16xAF

8xS AA / 16xAF (multisampling)

8xS AA / 16xAF (supersampling)

16xAF (supersampling)

16xAF (multisampling)

HDR level 11 / 16xAF

Half-Life 2 screens:

no AA / 1xAF

4xAA / 8xAF

8xS AA / 16xAF

8xS AA / 16xAF (multisampling)

8xS AA / 16xAF (supersampling)

16xAF (supersampling)

16xAF (multisampling)

Performance analysis and conclusions

Traditionally, NVIDIA has played the pioneer of the new generation of video accelerators, being first to present a solution built on the 24pp/8vp formula (the previous generation used 16pp/6vp). Although the performance gap between the previous and current generations is not as large as it was a year ago with the release of the GeForce 6800 Ultra (NV40), the newcomer still shows a clear and quite convincing step forward.

With the release of the GeForce 7800GTX, NVIDIA intends to correct one of the most important mistakes of the previous generation, committed by both NVIDIA and ATI: G70-based boards go on sale immediately after the NDA lifts, that is, effectively today. Of course, this gratifying efficiency should not be taken too literally; an official launch does not mean boards will magically appear at the appointed hour on the shelves of every store in the world, since they first have to simply be shipped and delivered. Still, by all indications the unprecedented shortages we observed with the previous generation’s high-end cards are not expected this round, which can only be welcomed.

Since we are talking about buying, prices are worth mentioning. The recommended price of the 256 MB 7800GTX is $599, so retail prices in the first weeks after the announcement will probably be around $650 and up. The 512 MB version will most likely start at $700 or higher, although exact figures have not yet been announced.

One may question the wisdom of releasing a new generation now, when today’s top cards already deliver more than acceptable performance in all current games. But remember that these boards are a kind of reserve for the future, in particular for games on Unreal Engine 3 and other major innovations in the gaming industry. Of course, owners of a GeForce 6800 Ultra or RX850XT PE have no reason to worry or chase the novelty, and here lies another advantage of today’s announcement: with the release of the G70, the previous generation’s boards did not instantly turn into junk, as happened last year with the 9800XT and 5950 Ultra (those who bought them two or three months before the April announcement, at roughly $450 and up, were especially «pleased»).

Only one thing can be said for sure: the sooner the new cards hit the shelves, the faster the astronomical prices of the previous hi-end from NVIDIA and ATI will start to fall.
