nVidia NV40, NV41, NV45 & Co. Information

2004-01-27, 20:15:55

I consider this highly important; it should take priority over everything else.

Reason: If we have an English version, the English-speaking web can link to it. If we don't, they will write their own text and only link to us in passing.

2004-01-28, 10:59:09
Hmm... then I'd say the masks article rests for now and this one gets worked on... I'll do page 1 and, if possible, page 2... but proofreaders are mandatory then ;-).

Regards, Maik

2004-01-28, 11:34:02
Originally posted by Jason15
Hmm... then I'd say the masks article rests for now and this one gets worked on...

Yes, please. The web is already linking to the German version, which is unfortunate.

2004-01-28, 18:02:48
So, about the problem that "the web" links to the German version: couldn't we put a note above such important articles saying that an English version will be available shortly, and that webmasters should point this out? Webmasters should then do it like this:
"At 3DCenter there is new information on the NV40 etc. ... the whole article is here: 'link'; English version: 'link'" (or something along those lines).

Anyway, here are my "achievements" so far (the article is really huge):


There are some rumours surrounding nVidia's next-generation graphics chip NV40, none of which have been formally confirmed to date. Unfortunately, there are also speculations in circulation that have been elevated to the status of rumour, although no statement was ever heard on the matter, not even unofficially. Today we want to summarize our knowledge about the nVidia NV40, peppered with a set of brand-new information that could not be read anywhere else so far.

On the classification of the nVidia NV40 project: The zero in the code name points to a completely new architecture regarding the feature set, as has usually been the case with nVidia (NV10 = GeForce1 with DirectX7 feature set, NV20 = GeForce3 with DirectX8 feature set, NV30 = GeForceFX with DirectX9 Shader 2.0 feature set). The NV40 chip will still be DirectX 9.0 (and not DirectX 9.1), but raises the shader capability to level 3.0.

According to nVidia's original planning, a whole new architecture regarding the feature set is to come to market every year, along with a matching refresh chip every half year. However, nVidia could in fact only adhere to this planning with the Riva TNT 1/2 and GeForce 1/2; the GeForce3 chip and even more so the GeForceFX chip were already substantially late, which naturally had corresponding effects on the following projects. Because of the delay of the GeForceFX, the NV40 chip was planned for autumn 2003.

I'll keep working...

Regards, Maik

2004-01-28, 20:00:46
Originally posted by Jason15
So, about the problem that "the web" links to the German version: couldn't we put a note above such important articles saying that an English version will be available shortly, and that webmasters should point this out?

In principle, yes. For this particular article it's pointless, though.

2004-01-28, 20:37:19
Well, then it's a matter of translating quickly... PS: I found a "direct translator" here... and ran the two pages through it. The results aren't exactly great, but they're English :D. Some passages have to be corrected and many parts reworded (literal translation). Anyway, here are the two pages:


(it's my own server, because I didn't know how to post the pages here directly) So just read over them and correct what's wrong.

Regards, Maik

Lost Prophet
2004-01-29, 02:46:50
well, the machine translations are completely worthless

I'll see what I can do.

@leo: for anything of similar importance, in the future you could simply let the translation team read it 1-2 days in advance (send the address via PM), so that the whole thing comes out (more or less) simultaneously

cya, axel

Lost Prophet
2004-01-29, 09:02:12
a poor translation, as it was done hastily

page 1


nVidia NV40, NV41, NV45 & Co Information

February 27th 2004 / by aths & Leonidas / page 1 of 2

There are lots of rumours around nVidia's next-gen graphics chip NV40, but absolutely nothing has been confirmed officially yet. Unfortunately there are also speculations which gained rumour status, although nothing has been said on the matter, not even unofficially. Today we want to sum up our knowledge about the NV40, along with some brand-new information which hasn't been published anywhere else yet.

To categorise the NV40 project: The zero in the codename indicates an entirely new architecture concerning the features, as has always been the case with nVidia (NV10 = GeForce 1 with DirectX7 features, NV20 = GeForce3 with DirectX8 features, NV30 = GeForce FX with DirectX9 Shader 2.0 features). The NV40 will still be DirectX9 (and not DirectX9.1) [the linked article is currently being translated], but will raise the shader specs to the 3.0 level.

According to nVidia's original planning, a new architecture should be released every year, and the matching refresh chip every half year. But nVidia could only stick to this plan with the Riva TNT 1 & 2 and GeForce 1 & 2; already the GeForce3, and even more so the GeForce FX chip, were delayed heavily, which of course also affected the succeeding projects. Due to the delay of the GeForce FX, the NV40 chip was finally scheduled for autumn 2003.

This plan didn't work out either, so nVidia released the NV38 (GeForce FX 5950), a second, initially unplanned refresh chip of the original NV30 (GeForce FX 5800/Ultra). When the NV38 was launched, it was still unclear when the NV40 was due, but now we know that nVidia will present the NV40 either at CeBIT (18th-24th of March) or at the Game Developers Conference (22nd-26th of March), while the first purchasable NV40 graphics cards can be expected around the end of April or the beginning of May.

Now about the technological changes of the NV40 compared to the NV38: It is highly probable that the pixelshader architecture has been improved heavily, and that the shaders work much more efficiently than on GeForce FX cards. There are two ways to achieve this: enlarging the temp register files (to minimise idle cycles of the pipeline), and splitting the Vector4 calculation units into Vector3+scalar units (the Register Combiners and the R300 shaders achieve a big plus in performance by doing that). Of course these alterations cost transistors, but the gained performance would more than justify that. Therefore we are pretty sure that the NV40 will receive changes along these lines.
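The gain from such a Vector3+scalar split can be illustrated with a toy issue model (purely a sketch for illustration; the function names and the greedy pairing rule are our own assumptions, not actual NV40 behaviour):

```python
# Toy model of shader instruction issue (hypothetical, for illustration only).
# Instructions are given by their component width: 4 = vec4, 3 = vec3, 1 = scalar.

def cycles_vec4(instructions):
    # A plain Vector4 unit issues exactly one instruction per cycle,
    # regardless of how many components the instruction actually uses.
    return len(instructions)

def cycles_vec3_plus_scalar(instructions):
    # A split Vector3+scalar unit can co-issue an independent <=3-component
    # op together with a scalar op in the same cycle (greedy pairing).
    cycles = 0
    i = 0
    while i < len(instructions):
        if i + 1 < len(instructions):
            a, b = instructions[i], instructions[i + 1]
            if (a <= 3 and b == 1) or (a == 1 and b <= 3):
                cycles += 1
                i += 2
                continue
        cycles += 1
        i += 1
    return cycles

# Example shader: three vec3 lighting ops, each followed by a scalar op.
program = [3, 1, 3, 1, 3, 1]
print(cycles_vec4(program))              # 6 cycles on a plain Vector4 unit
print(cycles_vec3_plus_scalar(program))  # 3 cycles with co-issue
```

On code that alternates vec3 colour math with scalar ops, the split unit finishes in half the cycles; on pure vec4 code it gains nothing, which is why the split costs transistors but rarely hurts.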

It's likely that the NV40 has at least double the number of calculation units for the pixelshader compared to the NV38. Architecture improvements and widening together would mean a massive increase in pixelshader performance, at least in versions 2.0 and 2.X. Shader version 3.0 will be supported, but we expect that the use of jump commands in 3.0 will slow the process down quite strongly. The reason: up to now, four pixels were rendered together (either simultaneously or sequentially), but these dependencies can't be exploited in pixelshader 3.0, because the commands can be different for each pixel.

The spec term "pipelines x TMUs" is probably outdated with the NV40, but we think that with multitexturing the results are similar to an 8x2 architecture. It is also likely that 16 Z/stencil tests can be done per cycle, which of course would help in games that use a separate Z pass (Doom III). We ask you, though, to remain highly cautious about the "8x2 architecture" part.
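To put rough numbers on these per-cycle figures (a sketch; the 550 MHz core clock is an assumed midpoint of the 500-600 MHz estimate quoted elsewhere in the article, everything else is the speculated layout):

```python
# Back-of-the-envelope fillrate figures for the speculated NV40 layout.
# All inputs are the article's estimates; the core clock is an assumption.

core_clock_hz = 550e6     # assumed midpoint of the estimated 500-600 MHz range
pixels_per_clock = 8      # "8x2" -> 8 pixel pipelines
tmus_per_pipe = 2         # -> up to 16 texture samples per clock
z_tests_per_clock = 16    # separate Z/stencil-only rate (useful for a Z pass)

texel_fillrate = core_clock_hz * pixels_per_clock * tmus_per_pipe
z_fillrate = core_clock_hz * z_tests_per_clock

print(f"texel fillrate: {texel_fillrate / 1e9:.1f} GTexel/s")  # 8.8
print(f"Z/stencil rate: {z_fillrate / 1e9:.1f} Gtests/s")      # 8.8
```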

The NV40 is assumed to have 175 million transistors, which seems believable for a new architecture, bearing the current growth rates in mind. In our opinion this offers enough space for additional pixel processors and texture mapping units. But this would pretty much use up all the resources, so that beyond the performance increase and support of pixel and vertex shader 3.0, not many new features should be expected. There are also some points which lead us to believe that the NV40 was altered during development (instead of trying to build the original design and testing it). Whether that is enough to beat the R420 and R423, the next-gen chips of ATi, remains to be proven.

To increase overall performance it isn't enough to just multiply arithmetic power. Fillrate remains as important as ever, which is also why we speculate about an 8x2 architecture in multitexturing (with up to 16 textures per cycle). But to be able to make use of such an enormous fillrate, gigantic bandwidth would be required: we think nVidia shouldn't have problems with 600 MHz or even faster DDR2 memory. This would also follow nVidia's tradition of always using the fastest memory available. It is even possible that the Ultra version could have memory with up to 800 MHz physical clockspeed, but nothing is certain in this case.
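The bandwidth behind these clockspeeds is simple to derive (a sketch; the 256-bit bus width is our assumption, the article does not state it):

```python
# Rough memory bandwidth estimate for the speculated DDR2 configurations.
# The 256-bit bus width is an assumption; the article only gives clockspeeds.

def ddr_bandwidth_gbs(physical_clock_mhz, bus_width_bits=256):
    """Peak bandwidth in GB/s for double-data-rate memory."""
    # DDR transfers data on both clock edges -> factor 2.
    bytes_per_second = physical_clock_mhz * 1e6 * 2 * (bus_width_bits / 8)
    return bytes_per_second / 1e9

print(ddr_bandwidth_gbs(600))  # 38.4 GB/s at 600 MHz
print(ddr_bandwidth_gbs(800))  # 51.2 GB/s for the rumoured 800 MHz Ultra
```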

The TMUs are likely to be able to generate one bilinear sample per cycle each, as has been the case before. Trilinear and anisotropic filtering should also be implemented the way they are on the GeForce FX (while S3's DeltaChrome can filter trilinearly from one MIP map, which will be the way to do it in the long term). Whether nVidia sticks to 8x AF, or whether the NV40 implements 16x AF or even higher, remains unclear.

In our view, the GeForce FX's 8x AF in quality mode is still up to date. Of course a non-"optimised", GeForce4-like 16x AF would be interesting, to have a direct comparison to the Radeon. Whether the new chip will permit full trilinear filtering is also unknown, but we are worried that nVidia will continue to force "brilinear" filtering, although the hardware would be capable of full trilinear filtering.

A side note concerning AF: quality "by the book" is extremely difficult, which is why nVidia (and S3 in their DeltaChrome) accept optimisations at certain angles in 4x mode or higher. In comparison to the strong angle dependency of ATi's AF this can nearly be disregarded, though; we accept a certain weakness at 45° angles in the higher modes. The "optimisations" can most probably be activated in the NV40 again, although this appears pointless to us, since more quality is burnt than performance gained. Well, we're used to nVidia's chips developing in an evolutionary fashion.

Another point on which there is no information at this time is antialiasing. Facing the good 6x "sparsed" mask [the linked article is currently being translated] which ATi has been offering since the R300 (Radeon 9500/9700), an improvement in this area is crucial for the GeForce series. We know that nVidia is introducing at least one new antialiasing mode with the NV40, but unfortunately we don't know where to place it: similar quality with better performance, or better edge smoothing.

It's probable that the NV40 will still be marketed as GeForce. Though parts of the nVidia team wanted a new name, the marketing department objected. Hence the GeForce FX series continues, according to our information still with a four-digit number appendix beginning with a "6".

Summary of what we know (or think to know) about NV40 at the moment:

nVidia NV40
* 175 million transistors, manufactured in 130nm
* 8x2 architecture, but 16 Z/stencil tests per cycle
* DirectX9 architecture, supports shaders 3.0
* Pixelshaders doubled in number compared to the NV38, and more efficient
* Supports DDR2, GDDR2, GDDR3
* Internal interface is AGPx8
* Exact clockspeeds: unknown; estimated 500-600 MHz core and 600-800 MHz memory
* Improvements for antialiasing: (at least) one new mode, its subpixel-mask is still unknown though
* Improvements for anisotropic filtering: unknown
* Presentation: GDC or CeBIT; end of March
* Market entry: end of April or beginning of May 2004
* Name: GeForce FX 6XXX

We dare to make a prognosis about the NV40: at least in terms of performance, nVidia will excel once more. Whether that's enough to beat the strong competition, we'll have to wait for the GDC and CeBIT to find out, because with the R420 and R423, ATi is also expected to present new high-end graphics chips.


@ zeckensack: help needed

cya, axel

Lost Prophet
2004-01-29, 09:22:51
hm, I'm running out of time for page 2, sorry

I won't have time again for at least 24 hours.

good luck with it
well, if it's not finished when I get back, I'll get on it (provided I have time).


cya, axel

2004-01-29, 10:23:03
I'm afraid that the NV41 (in the original article) is classified incorrectly.

2004-01-29, 12:01:37
Page 2

The decision to equip the NV40 with an internal AGPx8 interface was probably made at an early development stage, and this cannot be changed in a rush, at least not without a substantial delay. Furthermore, during the planning of the NV40 it could not be foreseen when the age of PCI Express would begin. However, we now know that Intel will start it with the mainboard chipsets Alderwood and Grantsdale on March 28, 2004.

Here nVidia surely doesn't want to fall behind, so they developed a PCI Express bridge, which is soldered onto the graphics board as an extra chip. It converts the AGPx8 signals of the graphics chip into PCI Express x16 signals, so that the graphics board can have a PCI Express x16 interface even if the actual graphics chip is AGPx8. This chip is ready at nVidia and will be used to make the current AGPx8 graphics cards capable of PCI Express.

However, the nVidia graphics chips that are equipped with this PCI Express bridge get special code names, even though the graphics chips themselves remain the same. The code name of the NV40 with the PCI Express bridge will be NV41. This chip will be entirely identical to the NV40, with nVidia only adding an extra PCI Express bridge chip to the graphics board manufacturers' delivery.

All succeeding chips of the NV4X series will carry only internal PCI Express interfaces, including the NV45, which will be the refresh chip of the NV40/NV41. Given the delay of the NV40, the NV45 will follow it to market quite quickly, supposedly in the first half of the year. This will probably result in a situation similar to the original GeForceFX series, where the NV35 chip (GeForceFX 5900/Ultra/XT/SE) also came to market only a few months after the NV30 (GeForceFX 5800/Ultra) ;-)

Major technical changes in the NV45 are not to be expected, though. Technologically, nVidia can hardly go beyond the feature set of DirectX 9.0, since this would make little sense: new features could not be used for effective marketing without DirectX support, and they would also hardly be accepted by game developers. And since DirectX10 is still on the far horizon, nVidia has to live with the features specified in DirectX 9.0, for which the NV40 will probably already have complete support.

In addition, nVidia will introduce new graphics chips from the NV4X family for the mainstream and low-cost segments in the course of the summer. Planned for the mainstream segment is the NV43 chip at the end of the second quarter, which will succeed the NV36 (GeForceFX 5700/Ultra). This chip will probably still be based on the NV40/NV41, which points to a market entry of the NV43 and NV45 at about the same time, since it is rather unlikely that nVidia would have a stripped-down variant ready already at the market entry of the original chip. But contrary to the NV40/NV41, the NV43 will already be equipped with an internal PCI Express x16 interface, like all further nVidia graphics chips after the NV40/NV41.

The NV43 will primarily be a stripped-down NV40/NV41. Whether nVidia achieves this by halving or by quartering the original design remains the big question: the first would result in a really impressive mainstream chip, which with the right clock speeds could match nVidia's current high-end chips. The second solution would not really be inspiring, of course, because then one could exceed the level of the current mainstream chip NV36 (GeForceFX 5700/Ultra) almost only through higher clock speeds.

Furthermore, the NV42 chip is known, which is to come to market in the third quarter and is apparently intended for the low-cost segment. It will probably still be based on the NV40/NV41, but is supposed to already have an internal PCI Express interface. With the NV42, one can expect even more extensive cuts than with the NV43, similar to the NV34 (GeForceFX 5200/Ultra), which was clearly stripped down compared to the NV31 (GeForceFX 5600/Ultra/SE). The basic shader 3.0 capability, however, will probably not be omitted by nVidia, because with the NV42 they apparently want to place the first shader 3.0 capable chip in the low-cost market, similar to the NV34, which up to now is the only shader 2.0 capable chip in the low-cost segment (and thus gives nVidia a very high share of the total DirectX9 market in terms of quantities).

But since none of these mainstream and low-cost chips of the NV4X family are to be expected before midyear, and nVidia naturally needs corresponding PCI Express graphics cards right after the PCI Express launch on March 28, some of the current graphics chips will be made "PCI Express capable" with the aforementioned PCI Express bridge.

Again, new code names are assigned, although, as said before, the graphics chips are not changed at all by this: the NV36 chip (GeForceFX 5700/Ultra/SE) bundled with a PCI Express bridge will be sold as the NV39 (this variant was so far known as "NV36X"), while an NV34 chip (GeForceFX 5200/Ultra) paired with a PCI Express bridge results in an NV37, and an NV18 chip (GeForce4 MX) paired with a PCI Express bridge in an NV19.

This results in the following supposed nVidia roadmap for the first part of 2004 (the individual chips are arranged by their supposed presentation date, not by their market entry):

Finally, a few words about the NV5X graphics chip series, which will succeed the NV4X family and will be based on the NV50 chip: DirectX Next ("DirectX 10") was originally supposed to come out at the same time as the Windows XP successor "Longhorn". Due to the looming large delay of Longhorn, possibly to the year 2006, DirectX Next could become available earlier. But the specifications of this new DirectX version are not yet certain at this point.

Thus the big question remains how long the NV4X line must carry on before it is replaced by the NV5X chips, whereby the NV50 chip is with high probability already in development. We consider it practically certain that there will be some "NV48" to bridge the gap by then, which will be a clock-optimized NV45 variant in the 110nm process. However, both large IHVs will first gather 110nm experience with smaller chips. At nVidia this could possibly happen with new mainstream and low-cost graphics chips, then based on the NV45, towards the end of 2004.

We would like to thank our sources without naming them. It should also be noted that the information given in this article is not verified and that some parts result from our own interpretations. The preceding lines reflect our present assessment of the situation, which of course can change daily.

Page 2 is fully translated. Proofreading would certainly be very useful, though ;)


2004-02-01, 19:17:31
Sorry Aqualon, I hadn't seen that you had updated your post.

2004-02-11, 15:18:06
Hello friends. I'm glad about these translations, but somewhat disappointed that nobody took on the work of proofreading and that the whole thing has been sitting around here for several days now. I've now had to produce the English article from the unproofread text. I had actually wanted to publish this article quite soon after the German one, but unfortunately nothing came of that.

2004-02-11, 15:28:24
well, next time, as already mentioned, let us know a bit earlier about something this important... I would have proofread it, but I'm definitely not suited for that (someone else would have had to read over it again anyway).

Regards, Maik