Review of the Review


aths
2005-08-17, 19:24:01
Please read through and correct serious language errors.

________________________________________________________________

High-End Chip G70: Only flickering AF?


With the launch of the GeForce 7800 GTX, Nvidia showed that the strong performance of its former SM3 flagship, the GeForce 6800 Ultra, can still be topped. Besides more and improved pipelines, the G70 offers higher clock speeds. Useful new anti-aliasing modes were added as well.

In terms of texture quality, however, Nvidia seems to rate the demands of its high-end customers fairly low. The new high-end chip G70 produces texture shimmering with anisotropic filtering (AF) enabled. Just as a reminder: the NV40 (GeForce 6800 series) behaves the same way at standard driver settings, but there the problem can be remedied by activating the "High Quality" mode.

On the G70, activating "High Quality" does <I>not</I> bring the desired effect–the card still shows texture shimmering. Has Nvidia already accustomed users to AF texture flickering with the NV40's standard setting, so that the G70 no longer offers a real option to produce flicker-free AF?

The anisotropic filter is actually supposed to improve texture quality–but on your new card for 500 bucks, the result is texture shimmering. One can hardly consider this a real "quality improvement". Good quality naturally has an impact on rendering speed. "Performance" in the sense of "power" is "the amount of work W done per unit of time t"; "performance" in the sense of "act" or "appearance" includes its quality. That is why performance does <I>not</I> increase with such "optimized" AF.

Someone spending some 100 bucks on a new card probably doesn't want to play without anisotropic filtering. One should also know that the 16x AF mode of the G70 chip, as well as of the NV40 chip, renders some areas with at most 2x AF, even though 4x+ AF would produce visibly better textures there. 16x AF naturally does not mean that every texture in the image is sampled with 16x AF; the applied level depends on the degree of distortion of the resampled texture. But for some angles, even a 16:1 distortion gets only 2x AF, resulting in blurry textures there.
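To make the "degree of distortion" point concrete, here is a minimal sketch in C++ of how an enabled AF level relates to the actual anisotropy of a pixel's texture footprint, with a deliberately simplified, hypothetical angle clamp bolted on to mimic the behaviour described above. The clamp rule and the sample numbers are illustrative assumptions, not Nvidia's or ATI's real hardware logic.

#include <algorithm>
#include <cmath>
#include <cstdio>

// Illustrative model: derive the effective AF degree from the texture-coordinate
// derivatives of one pixel, then optionally apply a made-up "angle optimization"
// that caps most footprint orientations at 2x, as the tunnel images suggest.
int effectiveAfLevel(float dudx, float dvdx, float dudy, float dvdy,
                     int requestedAf, bool angleOptimized)
{
    // Lengths of the pixel footprint's axes in texture space.
    float lenX = std::sqrt(dudx * dudx + dvdx * dvdx);
    float lenY = std::sqrt(dudy * dudy + dvdy * dvdy);
    float major = std::max(lenX, lenY);
    float minor = std::min(lenX, lenY);

    // Degree of distortion (anisotropy ratio), capped at the requested level.
    float ratio = (minor > 0.0f) ? major / minor : (float)requestedAf;
    int level = (int)std::min((float)requestedAf, ratio);

    if (angleOptimized) {
        // Hypothetical clamp: full AF only near 45-degree steps of the footprint
        // orientation, otherwise no more than 2x.
        float angleDeg = std::fabs(std::atan2(dvdx, dudx)) * 180.0f / 3.14159265f;
        float d = std::fmod(angleDeg, 45.0f);
        if (d >= 5.0f && d <= 40.0f)
            level = std::min(level, 2);
    }
    return std::max(level, 1);
}

int main()
{
    // A 16:1 distorted footprint rotated to roughly 22.5 degrees:
    printf("%d\n", effectiveAfLevel(0.924f, 0.383f, -0.0239f, 0.0577f, 16, true));  // 2
    printf("%d\n", effectiveAfLevel(0.924f, 0.383f, -0.0239f, 0.0577f, 16, false)); // 16
}

Even with 16x AF enabled, the "optimized" path of this toy model answers 2x for a strongly distorted surface at an unfavourable angle, which is exactly the blurry-texture case described above.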

That may have been acceptable in the days of the R300 (Radeon 9500/9700 series); today it is of course outdated (sadly, the R420 chip suffers from this restriction as well; in fact, since the NV40, Nvidia has mimicked ATI's already poor AF pattern). For SM3 products like the NV40 or G70, we don't understand such trade-offs regarding AF quality.

Right at the launch of the GeForce 7800 GTX, only one site on the entire net described this problem of flickering textures: the <a href="http://www.hardware.fr/articles/574-5/nvidia-geforce-7800-gtx.html" target="_blank">article on Hardware.fr</a> by Damien Triolet and Marc Prieur. For the launch, we at 3DCenter offered, besides benchmarks in the Quality mode only (out of carelessness, we believed what Nvidia wrote in the Reviewer's Guide), only technical material such as the improvements in the pixel shader. But as long as this texture problem exists on the G70, we don't see a reason to publish more about the shader hardware: first the multitexturing has to be satisfactory, then we can talk about other things, too.

In older news we stated that the cause of the G70's AF flickering was undersampling. Thanks to Demirug's investigations, we now know the matter is more complicated than initially thought. The necessary texel data is actually read from the cache, <I>all needed texels</I> are read, but they are combined in the wrong way. Nvidia's attempt to produce the same quality with less work was a good idea, but unfortunately they failed here. This could be a hardware bug or a driver error.
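For readers who want to picture what "all texels are read, but combined in a wrong way" could mean, here is a minimal sketch of the textbook approach to anisotropic filtering: several trilinear probes along the footprint's major axis, averaged with equal weights. The procedural checker texture, the probe count and the skewed weight set are made-up illustrations, not a description of the G70's actual datapath; the point is only that the same texel data, combined with the wrong weights, yields a result that shifts as the footprint moves, which on screen shows up as shimmering.

#include <cmath>
#include <cstdio>
#include <vector>

struct Vec2 { float u, v; };

// Stand-in for a trilinear texture lookup: a procedural checker whose contrast
// drops with rising LOD, roughly like a mipmapped texture would behave.
static float trilinearSample(float u, float v, float lod)
{
    float cell = std::floor(u * 64.0f) + std::floor(v * 64.0f);
    float checker = (std::fmod(cell, 2.0f) == 0.0f) ? 1.0f : 0.0f;
    float contrast = 1.0f / (1.0f + lod);
    return 0.5f + (checker - 0.5f) * contrast;
}

// Average several probes placed along the footprint's major axis. Correct
// filtering uses the uniform weight 1/N for every probe; any other weight set
// reads the same texels but over/underweights parts of the footprint.
static float anisotropicSample(Vec2 center, Vec2 majorAxis, float lodMinor,
                               const std::vector<float>& weights)
{
    float result = 0.0f;
    int probes = (int)weights.size();
    for (int i = 0; i < probes; ++i) {
        float t = (probes > 1) ? (float)i / (probes - 1) - 0.5f : 0.0f;
        result += weights[i] * trilinearSample(center.u + t * majorAxis.u,
                                               center.v + t * majorAxis.v, lodMinor);
    }
    return result;
}

int main()
{
    Vec2 c{0.31f, 0.47f};
    Vec2 axis{0.125f, 0.0f};                       // strongly stretched footprint
    std::vector<float> uniform(8, 1.0f / 8.0f);    // correct: equal weights
    std::vector<float> skewed{0.25f, 0.05f, 0.20f, 0.05f, 0.20f, 0.05f, 0.15f, 0.05f};
    printf("uniform: %.3f  skewed: %.3f\n",
           anisotropicSample(c, axis, 1.0f, uniform),
           anisotropicSample(c, axis, 1.0f, skewed));
}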

In the end it does not matter: that the G70, delivering a texel fillrate of more than 10 gigatexels per second, uses only 2x AF for some angles while the user enabled a mode called "16x AF", and on top of that tends to produce flickering textures, is evidence of the G70's shortcomings. We also have to criticize Nvidia's statement in the Reviewer's Guide. It reads (we also quote the bold type):

<B>"Quality"</B> mode offers users the highest image quality while still delivering exceptional performance. <B>We recommend all benchmarking be done in this mode.</B>

This setting results in flickering on the NV40 and on the G70, yet Nvidia calls it "the highest image quality". An image which claims to offer "quality" (even without the "high") <I>must of course not tend to texture flickering</I>–but it does on Nvidia's SM3 product line. For High Quality, Nvidia writes:

<B>"High Quality"</B> mode is designed to give discriminating users images that do not take advantage of the programmable nature of the texture filtering hardware, and is overkill for everyday gaming. Image quality will be virtually indistinguishable from <B>"Quality"</B> mode, however overall performance will be reduced. Most competitive solutions do not allow this level of control. Quantitative image quality analysis demonstrates that the NVIDIA "Quality" setting produces superior image fidelity to competitive solutions <B>therefore "High Quality" mode is not recommended for benchmarking.</B>

However, we cannot confirm this either: the anisotropic filter of the G70 chips <I>does</I> flicker even under High Quality, while the current Radeon cards do <I>not</I> flicker even with "A.I." optimizations enabled. We would be really interested to know how Nvidia's "quantitative image quality analysis" examines image quality.




<B>MouseOver 16xAF à la GeForce 6/7 vs. 8xAF à la GeForce 3/4/FX</B>


The image shows a tunnel built of 200 segments (causing the "fan out" effect at the border of the picture). Each color represents a new <a href="http://www.3dcenter.org/artikel/grafikfilter/index3.php" target="_blank">MIP level</a>. The nearer to the center a new color begins, the more detailed (with higher AF levels) the texture is filtered. At several angles, e.g. 22.5°, the MIPs on the GeForce 6/7 start very early (near the border), because only 2x AF is applied at these angles. The GeForce 3/4/FX has an angle "weakness" at 45° but already shows a far more detailed image at 8x AF, because the colored MIPs appear later, closer to the center.

Essentially, the GeForce 3/4/FX delivers the selected 8x AF for most of the image, while just a few parts are treated with 4x AF. By contrast, the GeForce 6/7 delivers the selected 16x AF only for minor parts of the image; only the 90° and 45° angles really get 16x AF. Most of the other angles are filtered far less finely than the enabled 16x AF would suggest. Large areas of the image are filtered with just 2x and 4x AF, so the overall quality improvement is not as good as the 8x AF provided by the GeForce 3/4/FX. The overall quality gain of the NV40's and G70's so-called 16x AF is in fact much smaller than that of 8x AF on the GeForce3 through FX. ATI also does not reach the traditional GeForce quality, but at least ATI improved the capabilities of its AF with each genuinely new chip generation (the R420 is a beefed-up R300 and no real new generation), while Nvidia lowered the hardware capabilities of its AF implementation compared to its own previous generation. The Reviewer's Guide is silent about that fact; it only highlights the availability of 16x since the NV40, making the reviewer think this must be better than the 8x provided by traditional GeForce products.

This angle dependency of the GeForce 6/7's AF adds an extra "sharpness turbulence" to the image: imagine very well filtered textures (with 16x AF) right next to only weakly filtered ones (with 2x AF); this catches even the untrained eye unpleasantly. The geometry of "World of Warcraft" is a good example of such artifacts (on any Radeon as well as any GeForce since the NV40). Shouldn't an SM3 chip be able to do better, at least to provide the texture quality of 2001's GeForce3?

Now this is very important: the AF tunnel allows no conclusion about underfiltering; it only shows which MIP level is used where. Normally, you should be able to tell which particular AF level is used by comparing the pattern with a LOD-biased pattern. But in the case of undersampling, fewer texels are sampled than actually needed. For instance, undersampled 8x AF is no true 8x AF: the right MIP map is indeed chosen, but the wrong number of texels is used. As already stated, this tunnel only shows which MIP level is used where, not whether the AF is implemented correctly.
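To underline why the tunnel can only reveal the MIP choice, here is a minimal sketch of the standard LOD computation, loosely following the formula of the EXT_texture_filter_anisotropic extension: the selected MIP level depends only on this LOD value (footprint size, AF limit, LOD bias), not on how many texel probes are actually taken afterwards. The footprint numbers in main() are illustrative assumptions.

#include <algorithm>
#include <cmath>
#include <cstdio>

struct LodResult { float lod; int mip; };

// Pick the LOD (and thus the MIP level) for one pixel footprint. With AF, the
// LOD is based on the footprint's major axis divided by the allowed anisotropy,
// i.e. effectively on the minor axis -- which is what keeps the texture sharp.
LodResult selectMip(float dudx, float dvdx, float dudy, float dvdy,
                    int maxAf, float lodBias, int mipCount)
{
    float lenX = std::sqrt(dudx * dudx + dvdx * dvdx);
    float lenY = std::sqrt(dudy * dudy + dvdy * dvdy);
    float major = std::max(lenX, lenY);
    float minor = std::max(std::min(lenX, lenY), 1e-6f);

    float ratio = std::min(major / minor, (float)maxAf);
    float lod = std::log2(std::max(major / ratio, 1e-6f)) + lodBias;

    int mip = (int)std::round(std::max(lod, 0.0f));
    return { lod, std::min(mip, mipCount - 1) };
}

int main()
{
    // The same 16:1 footprint, once allowed 16x AF and once clamped to 2x AF:
    // the clamped case lands on a coarser MIP, so its color band in the tunnel
    // starts earlier -- but neither result tells us how many texels were taken.
    LodResult a = selectMip(8.0f, 0.0f, 0.0f, 0.5f, 16, 0.0f, 11);
    LodResult b = selectMip(8.0f, 0.0f, 0.0f, 0.5f, 2, 0.0f, 11);
    printf("16x AF: lod %.2f (mip %d)   2x AF: lod %.2f (mip %d)\n",
           a.lod, a.mip, b.lod, b.mip);
}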


==== Page 2 ====


<B>Videos to demonstrate the effect of undersampling on GeForce 6/7 series graphics cards</B>

Note: If you experience stuttering during video playback, you can lower the playback speed by pressing Ctrl+Cursor Down in <a href="http://sourceforge.net/project/showfiles.php?group_id=82303&package_id=84358" target="_blank">Media Player Classic</a>. The required codec can be obtained by installing the latest version of <a href="http://www.3dcenter.org/downloads/fraps.php" target="_blank">Fraps</a>. Media Player Classic should be configured to repeat the video automatically. During the first run the video will probably stutter, but on subsequent runs it should play smoothly.

We also advise disabling overlay playback and using VMR instead; this ensures the best 1:1 rendering of the video stream.

The videos were made by Damien Triolet; we have his permission to publish them here, and we'd like to thank him for his efforts. The videos were captured in Unreal Tournament 2003. The first texture layer gets full trilinear filtering in HQ mode (or with A.I. off), while the other texture layers get only reduced trilinear filtering. Since this video uses a demo with only one texture layer, that layer is fully trilinear filtered in HQ mode, but real games use more layers. This looks like a special application "optimization" (meaning quality reduction); Nvidia is silent about it, since AF test tools show full trilinear filtering on every layer. Remark: do not trust texture quality test tools, since the driver treats games differently.
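As an aside on what "reduced trilinear" means in practice, here is a minimal sketch of the generic technique (often nicknamed "brilinear"): instead of blending the two nearest MIP levels across the whole fractional LOD range, the blend is squeezed into a narrow band around each MIP transition and pure bilinear filtering is used elsewhere. The band width of 0.25 is an arbitrary illustration, not the actual ForceWare setting.

#include <algorithm>
#include <cstdio>

// Weight of the coarser MIP level as a function of the fractional LOD.
// blendBand = 1.0 reproduces full trilinear filtering; smaller values blend
// only near the MIP transition, which saves samples but creates visible bands.
float mipBlendWeight(float lodFraction, float blendBand)
{
    float start = 0.5f * (1.0f - blendBand);   // where blending begins
    float w = (lodFraction - start) / std::max(blendBand, 1e-6f);
    return std::clamp(w, 0.0f, 1.0f);
}

int main()
{
    for (float f = 0.0f; f < 1.0f; f += 0.125f)
        printf("lod fraction %.3f   full trilinear %.3f   reduced %.3f\n",
               f, mipBlendWeight(f, 1.0f), mipBlendWeight(f, 0.25f));
}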


= New mouseover =

Normal: prev_layer1.png, with MouseOver: prev_layer2.png

[Caption] For easier viewing, these images are scaled down. You can see both pictures in full resolution here: [Link: layer1.png and layer2.png] These images show the difference in how texture stages 1 and 2 are treated. The primary texture stage still gets full trilinear filtering in HQ mode. If you check with a texture test tool, this appears to hold for every texture stage; in UT, however, every non-primary stage gets only heavily reduced trilinear filtering.

The videos were not rendered with UT2003's default LOD bias, but with a (correct) LOD bias of 0. This means: if the texture filter works correctly, there should not be any flickering. All videos were rendered with 8x AF enabled. The video image size is 1024x768; the original speed was 20 fps, captured using a slow-motion tool, and the playback speed was set to 30 fps.

We advise you <B>to download just one single video first</B> to check whether your machine can play it properly. Both the high video resolution and the lossless codec result in a high system load. Therefore we also offer a short description of what can be seen in each video.


"Quality" on a GeForce 6800 results in flickering. Furthermore, one can see the only partially applied trilinear filter: "Flickering bands" are followed by "Blurry bands" (areas where the texture is too blurry). In our opinion, this mode shouldn't be named "Quality.", but Nvidia decided to offer this "quality" as standard and advise do all benchmark with such poorly rendered textures.


"High Quality" on the GeForce 6800 is a borderline case: The textures looks if they are just starting to flicker, while they in fact do not flicker. Like all cards of the NV40 and G70 series, the 6800 also shows angle-dependant differences in sharpness, caused by the inferiour AF pattern compared to the GeForce3-FX series graphic cards.


Nvidia's new card features far greater raw texture power than the GeForce 6800, but shows remarkably worse textures as well: the annoying flickering is obvious. According to Nvidia's Reviewer's Guide, though, this mode delivers "the highest image quality while still delivering exceptional performance." In our opinion, this quality is too poor to be offered to anyone.


When using the GeForce 7800's "High Quality" mode, flickering is reduced and the result now looks better than the GeForce 6800's standard mode (which, however, delivers poor image quality). Yet the GeForce 6800's just barely flicker-free HQ mode is not matched: the GeForce 7800 cannot be configured by the user to apply AF without flickering textures.


ATI's Radeon X800, even at standard settings, already appears far superior to any GeForce 6800 or 7800. There are areas which tend to flicker faintly, but altogether only the angle-dependent AF reduction in the tunnel is distracting. The GeForce 7800's "High Quality" mode is clearly surpassed.


When turning off A.I. on the X800, no remarkable differences to activated A.I. can be seen.


As a reference, a GeForce FX in "High Quality" mode. This shows us two things: not all GeForce cards show flickering; the ground and wall textures are absolutely "stable". Furthermore, thanks to the superior AF implementation, the whole tunnel is textured as sharply as it should be with 8x AF.



<B>Conclusion:</B>

ATI's Radeon X800 shows: even with activated "optimizations" (meaning quality reductions), there are no flickering textures. While no full trilinear filtering is used, this is not noticed as quickly. Even though ATI's texture filtering hardware does not compute as exactly as a GeForce's, the overall image quality is better, because there are not as many questionable "optimizations". Angle dependency with AF, however, should no longer be considered acceptable for modern graphics cards; ATI's advertising of "High Definition" gaming can thus be seen as an unfulfilled promise straight from the marketing department.

Nvidia, with its current 7800 series, offers graphics cards that cannot be recommended to lovers of texture quality–even though texel performance was increased by a factor of 2.5 compared to the GeForce FX 5800 Ultra! Added to the angle dependency (inspired by ATI's R300), there is now the tendency toward texture flickering. The GeForce 6800 (or GeForce 6600) has to be configured to "High Quality" to avoid texture flickering as much as possible. With the 7800 even this seems to be useless: even in "High Quality", the new chip tends to texture flickering.


The quoted passages from Nvidia's Reviewer's Guide can easily be disproved: Nvidia makes claims which are clearly refuted by the videos above. That means all benchmarks at standard settings, no matter whether GeForce 7800 or 6800, against a Radeon are wrong: Nvidia currently offers by far the worse AF quality. Radeon standard settings are better (in terms of image quality) than 6800 standard settings, while the 7800's standard settings are even worse. Thus the so-called "performance" should not be compared either. One should also not compare 7800 standard vs. 6800 standard or 7800 HQ vs. 6800 HQ, since the 7800's texture quality is lower. Real "performance" includes on-screen image quality.

What good is 16x AF if certain angles get at most 2x AF and you can get texture flickering while other cards provide flicker-free textures? All benchmarks using the standard setting for NV40 and G70 against the Radeon are <b>invalid</b>, because the Nvidia cards use general undersampling, which can result in texture flickering. The GeForce 7 series cannot be configured to deliver flicker-free AF textures, while the GeForce 6 series and the Radeon series can (of course) render flicker-free AF quality.

If there should be any changes with new driver versions, we will try to keep our readers up-to-date.

huha
2005-08-17, 22:47:38
High-End Chip G70: Only flickering AF?


With the launch of the GeForce 7800 GTX, Nvidia showed that the strong performance of its former SM3 flagship, GeForce 6800 Ultra, may still be topped. Besides more and improved pipelines, the G70 offers higher clock speeds. Useful new anti-aliasing modes were added as well.

In terms of texture quality, however, Nvidia seems to rate the demands of its high-end customers fairly low. The new high-end chip G70 seems to produce texture flickering with activated anisotropic filtering (AF). Just as a reminder: this applies to the NV40 (GF 6800 series) in standard driver settings as well, but there it can be remedied by activating the "High Quality" mode.

For the G70 chip, the activation of "High Quality" does <I>not</I> bring the desired effect–the card still shows texture flickering. Was Nvidia already able to accustom the consumer to texture flickering with the NV40, so that its next chip, the G70, no longer offers an option to produce flicker-free AF?

The anisotropic filter should, in fact, improve the quality of textures–but on your new card for 500 bucks, the result is texture flickering. One can hardly consider this a real "quality improvement." Good quality naturally does have an impact on rendering speed. "Performance" in the sense of "power" is "the amount of work W done per unit of time t"; "performance" in the sense of "acting" or "appearance" includes the quality of it. That is why the performance does <I>not</I> increase with such "optimized" AF.

Someone spending some 100 bucks on a new card probably doesn't want to play without anisotropic filtering. One should also know that the 16x AF mode of the G70 chip, as well as of the NV40 chip, renders some areas using at most 2x AF, even though 4x+ AF would be far more helpful in terms of producing decent textures.

16x AF naturally doesn't mean that every texture in the image is sampled with 16x AF; it depends on the degree of distortion of the resampled texture. But for some angles, even a 16:1 distortion gets only 2x AF, resulting in blurry textures there.

That was maybe acceptable in the days of the R300 (Radeon 9500/9700 series); today it is of course outdated (sadly, the R420 chip also suffers from this restriction; in fact, since the NV40, Nvidia has mimicked ATI's already poor AF pattern). For SM3 products like the NV40 or G70 we don't understand such trade-offs regarding AF quality.

Right at the launch of the GeForce 7800 GTX only one site in the entire net described the problem of flickering textures: the <a href="http://www.hardware.fr/articles/574-5/nvidia-geforce-7800-gtx.html" target="_blank">article on Hardware.fr</a> by Damien Triolet and Marc Prieur. For the launch, 3DCenter only offered, besides benchmarks in Quality mode (by carelessness, we believed what Nvidia wrote in the Reviewer's Guide), technical aspects like the improvements in the pixel shader. But as long as the texture problem exists on the G70, we don't see a reason to publish more about the shader hardware: first of all, the multi-texturing has to be satisfactory, then we could write about other stuff, too.

In older news we stated the cause for the G70's AF flickering would be undersampling. We owe to Demirug's investigations our current knowledge that the matter is more complicated than initially thought. The necessary texel data is actually read from the cache, <I>all required texels</I> are read, but they are combined in a wrong way. Nvidia's attempt to produce the same quality with less work was a good idea, but unfortunately they failed here. This could be a hardware bug or a driver error.

After all, it does not matter: that the G70, delivering a texel power of more than 10 gigatexels per second, uses 2x AF for some angles only–while the user enabled a mode called "16xAF"–and on top of that tends to produce flickering textures, is evidence of the G70's incapability. We also have to criticize Nvidia's statement in the Review Guidelines, which reads (we also quote the bold typing):

<B>"Quality"</B> mode offers users the highest image quality while still delivering exceptional performance. <B>We recommend all benchmarking be done in this mode.</B>

This setting results in flickering on the NV40 and on the G70, but Nvidia calls this "the highest image quality". An image which claims to show "quality" (even though without "high") <I>must of course not tend to texture flickering</I>–but it does on Nvidia's SM3 product line. For "High Quality", Nvidia writes:

<B>"High Quality"</B> mode is designed to give discriminating users images that do not take advantage of the programmable nature of the texture filtering hardware, and is overkill for everyday gaming. Image quality will be virtually indistinguishable from <B>"Quality"</B> mode, however overall performance will be reduced. Most competitive solutions do not allow this level of control. Quantitative image quality analysis demonstrates that the NVIDIA "Quality" setting produces superior image fidelity to competitive solutions <B>therefore "High Quality" mode is not recommended for benchmarking.</B>

However, we also cannot confirm this: the anisotropic filter of the G70 chips <I>does</I> show flickering textures under "High Quality" as well, while the current Radeon cards do <I>not</I> flicker even with "A. I." optimizations enabled. We would be really interested how Nvidia's "quantitative image quality analysis" examines the image quality.




MouseOver 16xAF à la GeForce 6/7 vs. 8xAF à la GeForce 3/4/FX


The image shows a tunnel built of 200 segments (causing the "fan out" effect at the border of the picture). Each color represents a new <a href="http://www.3dcenter.org/artikel/grafikfilter/index3.php" target="_blank">MIP level</a>. The nearer to the center the new color begins, the more detailed (with higher AF levels) the texture is being filtered. At several angles like e.g. 22.5°, the MIPs on the GeForce 6/7 start very early (near the border), because only 2x AF is provided for these angles. The GeForce 3/4/FX has an angle "weakness" at 45° but shows a far more detailed image already at 8x AF, due to the fact that the colored MIPs appear later, closer to the center.

Essentially, the GeForce 3/4/FX offers the chosen 8x AF for most of the image, while just a few parts are treated with 4x AF. By contrast, the GeForce 6/7 offers (when choosing 16x AF) 16x AF only for minor parts of the image; only the 90° and 45° angles are really getting 16x AF. Most of the other angles are filtered far less finely than the actually enabled 16x AF, though. Large areas of the image are just filtered with 2x and 4x AF, so the overall quality improvement is not as good as the 8x AF provided by the GeForce 3/4/FX series. The overall quality improvement of NV40's and G70's so-called 16x AF is, in fact, much smaller compared to 8x AF on the GeForce3 through FX. ATI's AF also does not reach the traditional GeForce quality, but at least ATI improved the capabilities of their AF with each real new chip generation (the R420 is a beefed-up R300 and no real new generation), while Nvidia lowered the hardware capabilities of their AF implementation compared to their own previous generation. The Reviewer's Guide is silent about that fact; it only highlights the possibility of 16x since the NV40, making the reviewer think this must be better than the 8x provided by older GeForce series.

This angle dependency of the GeForce 6/7's AF adds an extra "sharpness turbulence" to the image: imagine very well filtered textures (with 16x AF) being close to only weakly filtered ones (with 2x AF), which catches even the untrained eye unpleasantly. The geometrical environment of "World of Warcraft" is a good example of such artifacts (on any Radeon as well as any GeForce since the NV40). Shouldn't an SM3 chip be able to do better, at least be able to provide the texture quality of 2001's GeForce3?


---

Removed a few errors, reworked some phrasings. And spent a while hashing things out with aths ;)

-huha

huha
2005-08-17, 23:18:08
<B>Videos to demonstrate the effect of undersampling on GeForce 6/7 series graphics cards</B>

Note: If you experience stuttering during video playback, you can lower playback speed by pressing Ctrl+Cursor Down in <a href="http://sourceforge.net/project/showfiles.php?group_id=82303&package_id=84358" target="_blank">Media Player Classic</a>. The required codec can be obtained by installing the latest version of <a href="http://www.3dcenter.org/downloads/fraps.php" target="_blank">Fraps</a>. Media Player Classic should be configured to automatically repeat the video. During the first run the video will probably stutter, but the next time it should run smoothly.

We also advise to disable the overlay playback and use VMR instead. This ensures (the best) 1:1 rendering of the video stream.

The videos were made by Damien Triolet; we have his permission to publish them here. We'd like to thank him for his efforts. The videos were captured in Unreal Tournament 2003, a game that undergoes a special "optimization" by the ForceWare drivers which cannot be turned off. (This is the forced reduction of trilinear filtering: Nvidia prevents you from playing UT2003/2004 with full trilinear filtering. This also applies to the "High Quality" mode and–as far as we know–is not mentioned in any of Nvidia's Reviewer's Guides.)

The videos were not captured using the standard LOD bias of UT2003, but rather using a (correct) LOD bias of 0. This means: if the texture filter works correctly, there shouldn't be any flickering effects. All videos were rendered with 8x AF enabled. The video image size is 1024x768, the original speed was 20 fps using a slow-motion tool, and the playback speed was set to 30 fps.

We advise you <B>to download just one single video first</B>, to check whether your machine can play it appropriately. The high video resolution and lossless codec result in a high system load. Therefore we also offer a short description of what can be seen in each video.


"Quality" on a GeForce 6800 results in flickering. Furthermore, one can see the only partially applied trilinear filter: "Flickering bands" are followed by "Blurry bands" (areas where the texture is too blurry). In our opinion, this mode shouldn't be named "Quality.", but Nvidia decided to offer this "quality" as standard and advise do all benchmarking with such poorly rendered textures.


"High Quality" on the GeForce 6800 is a borderline case: Textures already tend to flicker, while they actually just do not by a tiny margin. Like all cards of the NV40 and G70 series, the 6800 also shows angle-dependant differences in sharpness, caused by the inferior AF pattern compared to the GeForce3-FX series graphic cards.


Nvidia's new card features far greater raw texture power than the GeForce 6800, but shows remarkably worse textures as well: the annoying flickering is obvious. According to Nvidia's Reviewer's Guide, though, this mode delivers "the highest image quality while still delivering exceptional performance." In our opinion, this quality is too poor to be offered to anyone.


When using the GeForce 7800's "High Quality" mode, flickering is reduced and it now does look better than the GeForce 6800's standard mode (which, however, delivers poor image quality). Yet the GeForce 6800's just barely flicker-free HQ mode cannot be matched: the GeForce 7800 cannot be configured by the user to use AF without flickering textures.


ATI's Radeon X800, even when using standard settings, already seems far superior to any GeForce 6800 or 7800. There are areas which tend to flicker faintly, but altogether, only the angle-dependent AF reduction in the tunnel is distracting. The GeForce 7800's "High Quality" mode is clearly surpassed.


When turning off A.I. on the X800, no remarkable differences to activated A.I. can be seen.


As the reference card, a GeForce FX in "High Quality" mode was used. This shows us two things: not all GeForce cards show flickering; see the ground and wall textures: they are absolutely "stable". Furthermore, the whole tunnel is textured as sharply as it should be when using 8x AF, because of the superior AF implementation.



<B>Conclusion:</B>

ATI's Radeon X800 shows: even with activated "optimizations" (meaning quality reduction), there are no flickering textures. While there is no full trilinear filtering used, this cannot be noticed so quickly. Even though ATI's texture filtering hardware does not compute as exactly as a GeForce's, the overall image quality is better, for there are not as many questionable "optimizations." Angle dependency when using AF, however, should not be considered a feature of modern graphics cards any more; ATI's advertising of "High Definition" gaming can thus be seen as an unfulfilled promise straight from the marketing department.

Nvidia, with its current 7800 series, offers graphics cards that cannot be recommended to lovers of texture quality–even though texel performance was increased by a factor of 2.5 compared to the GeForce FX 5800 Ultra! Added to the angle dependency (inspired by ATI's R300), there is now the tendency to texture flickering. The GeForce 6800 (or GeForce 6600) has to be configured to use "High Quality" to circumvent texture flickering as much as possible. With the 7800, this seems to be useless; even when using "High Quality", the new chip tends to texture flickering.

The quoted passages from Nvidia's Reviewer's Guide can be easily disproved. Nvidia makes claims which are clearly disproved by the upper videos. That means: All benchmarks using standard settings, no matter if GeForce 7800 or 6800, against a Radeon, are wrong: Nvidia offers, at this time, the by far worse AF quality. Radeon standard settings are better (speaking in terms of image quality) than 6800 standard settings, whilst the 7800's standard settings are even worse. Thus, the so called "performance" should not be compared either. One should also not compare 7800 standard vs. 6800 standard or 7800 HQ vs 6800 HQ, since the 7800's texture quality is lower. Real "performance" includes on-screen image quality.

What advantage do you have of 16x AF if you get at max 2x AF at certain angles only and if you can have texture flickering while other cards provide flicker-free textures? All benchmarks using the standard setting for NV40 and G70 against the Radeon are <b>invalid</b>, because the Nvidia cards are using general undersampling which can (and, as we see, does) result in texture flickering. The GeForce 7 series cannot be configured to deliver flicker-free AF textures while the GeForce 6 series and the Radeon series can (of course) render flicker-free AF quality.

If there should be any changes with new driver versions, we will try to keep our readers up-to-date.


---

that was the second part.

Nicky
2005-08-18, 09:16:39
A few suggestions:
"The anisotropic filter should, in fact, improve the quality of textures-– but on your new card for 500 bucks, the result is texture flickering. This can hardly be considered "quality improvement." Good quality naturally does have an impact on rendering speed"

"16x AF naturally doesn't mean to have "all textures" sampled with 16x AF for the "

"That was maybe acceptable at times " -> That may have been acceptable

"For the launch, 3DCenter only offered (, besides loeschen) benchmarks in Quality mode (by carelessness, we believed what Nvidia wrote in the Reviewer's Guide)(,loeschen) and technical aspects like the improvements in the pixel shader. ("But" loeschen, but nie am Satzanfang gebrauchen) As long as the texture problem (with loeschen)exists on the G70, we don't see a reason to publish more about the shader hardware: Without satisfactory multi texturing we will not write about other stuff ."

"We have to owe Demirug's investigations our current knowledge that the matter is more complicated than initially thought"

"plus the tendency to have texture with a tendency to flicker, is an evidence of the G70's incapability. " incapability to do what? -> incapacity

More to follow later, when work allows it again

moeb1us
2005-08-18, 10:33:27
repost from the other thread, since the review is supposed to take place here:

ax corresponds to the paragraph number in the first post of the thread

a1 - "end of the flag pole" - ich bezweifle, dass die redewendung im englischen existiert, wäre interessant zu wissen ob es verständlich ist für englisch-sprachige
edit: huhas alternative scheint mir sehr geeignet

a2 - "As a reminder" statt "just to memorize"

a3 - "with the G70-Chip" - kA wie das genau ist, für mich klingt das richtiger (tolle begründung eh ^^)
- "yield" statt "bring" iirc steht das bring eher für physisches bringen, tragen
- "the card is inclined to texture flickering further on."
- "has nvidia accomplished to accustom the user to AF-texture-flickering with the nv40-standard - so that there is no optional flicker-free AF-texturing anymore?"

a4 - "if the textures produced by 500-euro-hardware flicker in the end, you can talk hardly about a "quality improvement"
- "Good quality naturally costs speed."
- "power" statt "achievement" - technische leistung ist hier gemeint
- "The "optimized" AF costs less work - thus, the performed power is not increased." - rise ist auch mehr physisch

a5 - "Anyone willing to spend several 100-euros"
- "probably"
- "It is remarkable that the G70-Chip (as well as the nv40) with activated 16xAF treats some areas as 2xAF at most."
- "today, however, it is no longer up-to-date (the R420-chip suffers under this restriction, too)."
- "Given a SM3 product like the nv40 or G70, such compromises regarding the AF are difficult to understand (if at all).

a6 - "Concerning the G70-flickering, this was solely described by a single site by the time of the 78GTX launch" - zeit am ende iirc
- "in the article by Tridam on hardware.fr"
- "We at 3dc as well offered just descriptions of the improved shader-technology besides benchmarks in quality-mode (we relied on nvidia's reviewer's guide by carelessness)"
- "however, as long as ..."
- "we see no reason to offer full particulars of the shader-technology: as a start, the multitexturing has to function satisfactorily, then you can delve into continuative things."

a7 - "In prior news, we simply mentioned underfiltering (? s.o.) the cause for the G70-AF-flickering."
- "Due to Demirug's investigations we have the insight that the cause may originate elsewhere:"

a8 - "Ultimately, this is irrelevant: given that the G70-Chip, which [...], uses partly only 2xAF dependent of the angle (with appointed 16xAF), and tends towards texture-flickering in addition, is simply an evidence of
incapacity."


and huha, regarding your version of my paragraph: for the most part I don't think the changes are justified, e.g. border<->margin, nearer<->inlying, start<->commence etc., or was the vocabulary deliberately reduced? "more in centre" is also a fixed expression rather than "more in center", it's simply a technical term :)
furthermore I consider "appointed" more suitable than "chosen", it doesn't have that "chosen one" connotation

"the overall quality improvement is not that good as 8xAF provides on the GeForce 3/4/FX series" - as good as und provided by

ATI's AF also does not reach the traditional GeForce quality, but at least ATI improved the capabilities of their AF with each real new chip generation (R420 is a beefed-up R300 and no real new generation) while Nvidia lowered the hardware capabilities for their AF implementation comparing to their own previous generation. The Reviewer's Guide is silent about that fact, it only highlights the possibility of 16x since NV40, making the Reviewer think this must be better than 8x provided by older GeForce series.

ehm, and where did you see that in the original text? oO

well, I don't want to be fussy, hrhr, but I think my version is sufficient; I would rather have experts clarify terms like "Unterfilterung" (underfiltering) and "Unruhe" (unrest) etc. ;)

huha
2005-08-18, 14:07:43
moeb1us:

I can't help it that aths takes the translations, tears constructions apart, adds his own content, posts that here and then keeps changing it over time.
So my text corresponds to the improved version of what aths initially posted here.

-huha

btw: Since my text has become fairly useless now that aths has changed half the thing again, I'm now going to work on aths' current text.

huha
2005-08-18, 14:56:08
High-End Chip G70: Only flickering AF?


With the launch of the GeForce 7800 GTX, Nvidia showed that the strong performance of its former SM3 flagship, GeForce 6800 Ultra, may still be topped. Besides more and improved pipelines, the G70 offers higher clock speeds. Useful new anti-aliasing modes were added as well.

In terms of texture quality, however, Nvidia seems to rate the demands of its high-end customers fairly low. The new high-end chip G70 seems to produce texture flickering with activated anisotropic filtering (AF). Just as a reminder: this applies to the NV40 (GF 6800 series) in standard driver settings as well, but there it can be remedied by activating the "High Quality" mode.

For the G70 chip, the activation of "High Quality" does <I>not</I> bring the desired effect–the card still shows texture flickering. Was Nvidia already able to accustom the consumer to texture flickering with the NV40, so that its next chip, the G70, no longer offers options to produce flicker-free AF?

The anisotropic filter should, in fact, improve the quality of textures–but on your new card for 500 bucks, the result is texture flickering. One can hardly consider this a real "quality improvement". Good quality naturally does have an impact on the rendering speed; however, "performance" in the sense of "power" is "the amount of work W done per unit of time t", while "performance" in the sense of "acting" or "appearance" includes the quality of it. That is why the performance does <I>not</I> increase with such "optimized" AF.

Someone spending some 100 bucks on a new card probably doesn't want to play without anisotropic filtering. One should also know that the 16x AF mode of the G70 chip, as well as of the NV40 chip, renders some areas with at most 2x AF, even though consistent 4x+ AF would be far more helpful in terms of producing decent-looking textures. 16x AF naturally doesn't mean that every texture in the image is sampled with 16x AF; it depends on the degree of distortion of the resampled texture. But for some angles, even a 16:1 distortion gets only 2x AF, resulting in blurry textures there.

That was probably acceptable in the days of the R300 (Radeon 9500/9700 series); today this is of course outdated (sadly, the R420 chip also suffers from this restriction; in fact, since the NV40, Nvidia has mimicked ATI's already poor AF pattern). For SM3 products like the NV40 or G70, we don't understand such trade-offs regarding AF quality.

Right at the launch of the GeForce 7800 GTX only one site in the entire net described the problem of flickering textures: The <a href="http://www.hardware.fr/articles/574-5/nvidia-geforce-7800-gtx.html" target="_blank">article on Hardware.fr</a> by Damien Triolet and Marc Prieur.
Besides benchmarks done in "Quality" mode (by carelessness, we believed what Nvidia wrote in their Reviewer's Guide), 3DCenter only wrote about technical aspects such as the improvements in the pixel shader for the launch. But as long as the G70's texture problem exists, we do not see any reason to publish more about the shader hardware: first of all, the implementation of multi-texturing has to be satisfactory, then we could write about other stuff, too.

In older news we stated the cause for the G70's AF flickering would be undersampling. We owe to Demirug's investigations our current knowledge that the matter is more complicated than initially thought. The necessary texel data is actually read from the cache, <I>all required texels</I> are read, but they are combined in a wrong way. Nvidia's attempt to produce the same quality with less work was a good idea, but unfortunately they failed here. This could be a hardware bug or a driver error.

After all, it does not matter: that the G70, delivering a texel power of more than 10 gigatexels per second, uses 2x AF for some angles only–while the user enabled a mode called "16xAF"–and on top of that tends to produce flickering textures, is evidence of the G70's incapacity. We also have to criticize Nvidia's statement in the Review Guidelines, which reads (we also quote the bold typing):

<B>"Quality"</B> mode offers users the highest image quality while still delivering exceptional performance. <B>We recommend all benchmarking be done in this mode.</B>

This setting results in flickering on the NV40 and on the G70, but Nvidia calls this "the highest image quality". An image which claims to show "quality" (even though without "high") <I>must of course not tend to texture flickering</I>–but it does on Nvidia's SM3 product line. For High Quality, Nvidia writes:

<B>"High Quality"</B> mode is designed to give discriminating users images that do not take advantage of the programmable nature of the texture filtering hardware, and is overkill for everyday gaming. Image quality will be virtually indistinguishable from <B>"Quality"</B> mode, however overall performance will be reduced. Most competitive solutions do not allow this level of control. Quantitative image quality analysis demonstrates that the NVIDIA "Quality" setting produces superior image fidelity to competitive solutions <B>therefore "High Quality" mode is not recommended for benchmarking.</B>

However, we also cannot confirm this: the anisotropic filter of the G70 chips <I>does</I> show flickering textures under "High Quality" as well, while the current Radeon cards do <I>not</I> flicker even with "A. I." optimizations enabled. We would be really interested how Nvidia's "quantitative image quality analysis" examines the image quality.




<B>MouseOver 16xAF à la GeForce 6/7 vs. 8xAF à la GeForce 3/4/FX</B>


The image shows a tunnel built of 200 segments (causing the "fan out" effect at the border of the picture). Each color represents a new <a href="http://www.3dcenter.org/artikel/grafikfilter/index3.php" target="_blank">MIP level</a>. The nearer to the center the new color begins, the more detailed (i.e. with higher AF levels) the texture is being filtered. At several angles like e.g. 22.5°, the MIPs on the GeForce 6/7 start very early (near the border), because only 2x AF is provided for these angles. The GeForce 3/4/FX has an angle "weakness" at 45° but shows a far more detailed image already at 8x AF, due to the fact that the colored MIPs appear later, closer to the center.

Essentially, the GeForce 3/4/FX offers the selected 8x AF for most of the image, while just a few parts are treated with 4x AF. By contrast, the GeForce 6/7 offers the selected 16x AF only for minor parts of the image; only the 90° and 45° angles are really getting 16x AF. Most of the other angles are filtered far less finely than the actually enabled 16x AF, though. Large areas of the image are just filtered with 2x and 4x AF, so the overall quality improvement is not as good as the 8x AF provided on the GeForce 3/4/FX. The overall quality improvement of the NV40's and G70's so-called 16x AF is, in fact, much less compared to 8x AF on the GeForce3 through FX series. ATI's AF also does not reach the traditional GeForce quality, but at least ATI improved the capabilities of their AF with each real new chip generation (the R420 is a beefed-up R300 and no really new generation), while Nvidia lowered the hardware capabilities of their AF implementation compared to their own previous generation. The Reviewer's Guide is silent about that fact; it only highlights the possibility of 16x since the NV40, making the reviewer think this must be better than the 8x provided by older GeForce series.

The angle dependency of the GeForce 6/7's AF adds an extra "sharpness turbulence" to the image: imagine the case of very well filtered textures (with 16x AF) being close to only weakly filtered ones (with 2x AF), which catches even the untrained eye unpleasantly. The geometrical environment of "World of Warcraft" is a good example to see such artifacts (on any Radeon as well as any GeForce since the NV40). Shouldn't an SM3 chip be able to do better, at least be able to provide the texture quality of 2001's GeForce3?

A very important note: this AF tunnel does not allow any conclusion regarding underfiltering; it only shows which MIP level is used and where. Normally, you should be able to tell which particular AF level is used by comparing the pattern with a LOD-biased pattern. But in the case of undersampling, there are fewer texels sampled than actually required. For instance, undersampled 8x AF is no "true" 8x AF. The right MIP map is chosen indeed, but the wrong number of texels is used. As already stated, this tunnel only shows which MIP level is used and where, not whether the AF is implemented correctly.

---

This was the first prank (page), and the second follows right away :uup:

-huha

Nicky
2005-08-18, 15:04:41
But this angel-dependency -> angle

Nicky
2005-08-18, 15:12:02
But this angel-dependency -> angle
Please also delete the "But"; "but" at the beginning of a sentence is a no-go in English, just like "one can" or "one should". If need be, use a passive construction (not great either), a form like "You should", or an imperative.

huha
2005-08-18, 15:27:06
<B>Videos to demonstrate the effect of undersampling on GeForce 6/7 series graphics cards</B>

Note: If you experience stuttering during video playback, you can lower playback speed by pressing Ctrl+Cursor Down in <a href="http://sourceforge.net/project/showfiles.php?group_id=82303&package_id=84358" target="_blank">Media Player Classic</a>. The required codec can be obtained by installing the latest version of <a href="http://www.3dcenter.org/downloads/fraps.php" target="_blank">Fraps</a>. Media Player Classic should be configured to automatically repeat the video. During the first run the video will probably stutter, but the next time it should run smoothly.

We also advise to disable the overlay playback and use VMR instead. This ensures (the best) 1:1 rendering of the video stream.

The videos were made by Damien Triolet; we have his permission to publish them here. We'd like to thank him for his efforts. The videos were captured in Unreal Tournament 2003. The first texture layer gets full trilinear filtering in HQ mode (or with A.I. off), while the other texture layers get only reduced trilinear filtering. Since this video uses a demo with only one texture, it is fully trilinear filtered in HQ mode. However, real games use more texture layers. This looks like a special application "optimization" (meaning quality reduction). Nvidia is silent about that, since AF test tools show full trilinear filtering on every layer. Remark: do not trust texture quality test tools, since the driver treats games differently.


= Neues Mouseover =

Normal: prev_layer1.png, mit MouseOver: prev_layer2.png

[Caption] For easier viewing, these images are scaled down. You can see both pictures in full resolution here: [Link: layer1.png and layer2.png] You can see the difference between the treatment of texture stages 1 and 2 in these images. The primary texture stage still gets full trilinear filtering in "High Quality" mode. If you check with a texture test tool, this appears to apply to every texture stage; in UT, however, any non-primary stage gets only heavily reduced trilinear filtering.

The videos were not rendered using the standard LOD bias of UT2003, but rather using a (correct) LOD bias of 0. This means: if the texture filter works correctly, there shouldn't be any flickering effects. All videos were rendered with 8x AF enabled. The video image size is 1024x768, the original speed was 20 fps using a slow-motion tool, and the playback speed was set to 30 fps.

We advise you <B>to download just one single video first</B>, to check whether your machine can play it appropriately. The high video resolution and lossless codec result in a high system load. Therefore we also offer a short description of what can be seen in each video.


"Quality" on a GeForce 6800 results in flickering. Furthermore, one can see the only partially applied trilinear filter: "Flickering bands" are followed by "blurry bands" (areas where the texture is too blurry). In our opinion, this mode shouldn't be named "Quality", but Nvidia decided to offer this "quality" as standard and advise to do all benchmarking with such poorly rendered textures.


"High Quality" on the GeForce 6800 is a borderline case: The textures look if they are just starting to flicker, while they in actually just don't by a tiny margin. Like all cards of the NV40 and G70 series, the 6800 also shows angle-dependant differences in sharpness, caused by the inferior AF pattern compared to GeForce3-FX series graphic cards.


Nvidia's new card features far greater raw texture power than the GeForce 6800, but shows remarkably worse textures as well: the annoying flickering is obvious. According to Nvidia's Reviewer's Guide, though, this mode delivers "the highest image quality while still delivering exceptional performance." In our opinion, this quality is too poor to be offered to anyone.


When using the GeForce 7800's "High Quality" mode, flickering is reduced and it now does look better than the GeForce 6800's standard mode (which, however, delivers poor image quality). Yet the GeForce 6800's just barely flicker-free HQ mode cannot be matched: the GeForce 7800 cannot be configured by the user to use AF without flickering textures.


ATI's Radeon X800, even when using standard settings, already seems far superior to any GeForce 6800 or 7800. There are areas which tend to flicker faintly, but altogether, only the angle-dependent AF reduction in the tunnel is distracting. The GeForce 7800's "High Quality" mode is clearly surpassed.


When turning off A.I. on the X800, no remarkable differences to activated A.I. can be seen.


As the reference card, a GeForce FX in "High Quality" mode was used. This shows us two things: Not all GeForce cards show flickering, see ground and wall textures: They are absolutely "stable." Furthermore, the whole tunnel is textured as sharply as it should be when using 8x AF, because of the superior AF implementation.



<B>Conclusion:</B>

ATI's Radeon X800 shows: even with activated "optimizations" (meaning quality reduction), there are no flickering textures. While there is no full trilinear filtering used, this cannot be noticed so quickly. Even though ATI's texture filtering hardware does not compute as exactly as a GeForce's, the overall image quality is better, for there are not as many questionable "optimizations." Angle dependency when using AF, however, should not be considered a feature of modern graphics cards any more; ATI's advertising of "High Definition" gaming can thus be seen as an unfulfilled promise straight from the marketing department.

Nvidia, with its current 7800 series, offers graphics cards that cannot be recommended to lovers of texture quality–even though texel performance was increased by a factor of 2.5 compared to the GeForce FX 5800 Ultra! Added to the angle dependency (inspired by ATI's R300), there is now the tendency to texture flickering. The GeForce 6800 (or GeForce 6600) has to be configured to use "High Quality" to circumvent texture flickering as much as possible. With the 7800, this seems to be useless; even when using "High Quality", the new chip tends to texture flickering.


The quoted passages from Nvidia's Reviewer's Guide can be easily disproved. Nvidia makes claims which are clearly disproved by the upper videos. That means: All benchmarks using standard settings, no matter if GeForce 7800 or 6800, against a Radeon, are wrong: Nvidia offers, at this time, the by far worse AF quality. Radeon standard settings are better (speaking in terms of image quality) than 6800 standard settings, whilst the 7800's standard settings are even worse. Thus, the so called "performance" should not be compared either. One should also not compare 7800 standard vs. 6800 standard or 7800 HQ vs 6800 HQ, since the 7800's texture quality is lower. Real "performance" includes on-screen image quality.

What advantage do you have of 16x AF if you get 2x AF at maximum at certain angles only, are exposed to texture flickering while other cards provide flicker-free textures? All benchmarks using the standard setting for NV40 and G70 against the Radeon are <b>invalid</b>, because the Nvidia cards are using general undersampling which can (and does) result in texture flickering. The GeForce 7 series cannot be configured to deliver flicker-free AF textures while the GeForce 6 series and the Radeon series can (of course) render flicker-free AF quality.

If there should be any changes with new driver versions, we will try to keep our readers up-to-date.


---

So, done.
-huha

huha
2005-08-18, 15:29:27
Nicki: Both corrected in the text above. By now I'm sticking more closely to the original version, because that simply makes it easier to improve the stuff the author has added in the meantime.

-huha

moeb1us
2005-08-18, 20:54:59
In the neighbouring thread it was credibly asserted that the usual wording is "shimmering" instead of "flickering".

ot: @huha: only now do I consciously notice your avatar, hehe. My flatmate is also a Naruto fan who has watched all 15x episodes, which has sort of rubbed off on me -g-. He did say, though, that it has become harder to get new episodes, probably for licensing reasons.

regards, moe

eXodia
2005-08-18, 21:11:47
the by far worse AF quality - at the very bottom of part B

Imo that should be "worsest".

Regards

Nicky
2005-08-18, 23:39:03
Bad - Worse -Worst
ergo worst

huha
2005-08-18, 23:54:47
Bad - Worse -Worst
ergo worst

No, in this case it really is "worse". It is not about the very worst AF quality in living memory, but only about AF quality that is far worse than that of the previous series.

-huha

Nicky
2005-08-19, 00:15:12
No, in this case it really is "worse". It is not about the very worst AF quality in living memory, but only about AF quality that is far worse than that of the previous series.

-huha
ok, got it. I do wonder, though, whether the sentence structure isn't Denglish then; IMO the "the" is then out of place, but I'm not sure.
One more thing: you use "disproved" twice in a row in the paragraph in question; I would replace the first one with "disputed".

Mike
2005-08-19, 13:14:12
I would probably leave out the "the by"? Nvidia offers far worse AF quality. Although it should also work with "by".
Or simply: Nvidia offers, at this time, by far the worst AF quality [of the currently available graphics chips].

eXodia
2005-08-19, 14:03:04
ok, got it. I do wonder, though, whether the sentence structure isn't Denglish then; IMO the "the" is then out of place, but I'm not sure.
One more thing: you use "disproved" twice in a row in the paragraph in question; I would replace the first one with "disputed".

I've been told that in the English-speaking world it is completely normal to use the same word twice. Two different ones are of course still nicer ;)

Regards

Nicky
2005-08-19, 14:59:18
@Gotteshand
A clear yes and no to that ;) IMO you are referring more to American English

In this specific case, however:
disputed: the claim made by nV is contested
disproved: the contestation is backed up with evidence
Regards