
Thread: NVIDIA Q & A

  1. #1
    Your Friendly Editor
    Join Date
    Oct 2002
    Posts
    82

    NVIDIA Q & A

    SysOpt's Illustrious Editor

  2. #2
    Ultimate Member Rugor's Avatar
    Join Date
    Dec 2001
    Location
    Pacific Northwest, Earth
    Posts
    2,694
    I wish someone had asked him why PS 1.4 and PS 2.0 weren't important when they first came out but PS 3.0 is.
    "Dude you're getting a Dell." Obscure curse from the early 21st Century, ascribed to a minor demon-spirit known as "Stephen?" [sp].

  3. #3
    Ultimate Member Cyan's Avatar
    Join Date
    Sep 2000
    Location
    Salt Lake City
    Posts
    1,327
    Originally posted by Rugor
    I wish someone had asked him why PS 1.4 and PS 2.0 weren't important when they first came out but PS 3.0 is.
    Because today, we whine a lot about image quality - something we didn't use to do.

  4. #4
    Ultimate Member Rugor's Avatar
    Join Date
    Dec 2001
    Location
    Pacific Northwest, Earth
    Posts
    2,694
    OK, just checking. I knew it couldn't have been because Nvidia didn't introduce or support those two models very well for a long time after they came out.
    "Dude you're getting a Dell." Obscure curse from the early 21st Century, ascribed to a minor demon-spirit known as "Stephen?" [sp].

  5. #5
    Ultimate Member
    Join Date
    Sep 2001
    Posts
    18,631
    Originally posted by Rugor
    I wish someone had asked him why PS 1.4 and PS 2.0 weren't important when they first came out but PS 3.0 is.
    We weren't allowed to pose questions which could be deemed 'confrontational', sadly.

    --Jakk

  6. #6
    Banned
    Join Date
    Nov 2001
    Posts
    29
    Originally posted by Rugor

    I wish someone had asked him why PS 1.4 and PS 2.0 weren't important when they first came out but PS 3.0 is.
    You don't even need him for that, try something else worth asking.

    PS_1.4 = At the time, a GF3 with PS_1.1 could run pixel shader ops in the available games at 30 FPS at best, while the GF3's supposed killer, the R8500 with PS_1.4, performed at 1/2 to 1/3 the speed of a GF3. 10 FPS is useless for games and only good for cartoon-like demos. The R8500 is still exactly the same today compared to any GF3 for PS_1.4 ops; no driver miracles will fix that.

    PS_2.0 = Broken crap, since nothing, and exactly nothing, not even PS_1.1, let alone miracles, can do anything with the broken pixels underneath. My super-duper 9700 Pro lacked the very simplest base hardware feature: perspective-correct anisotropic texture mapping. But for plain old trilinear filtering, even the OLD GF2 had proper perspective-correct filtering.


    Pixel shaders don't like 2D-side-scroller QUAKITROPIC filtering, PERIOD. PS_2.0 is currently good for blurry trilinear crap at 24-bit accuracy, for cartoonish art, and, what a benchmark surprise, WATER.

    Nalu is complete fiction made to look believably realistic.

    Ruby is a human imitation made into a cartoonish dream.
    Last edited by nam_ng; 05-12-2004 at 07:43 PM.

  7. #7
    Ultimate Member Rugor's Avatar
    Join Date
    Dec 2001
    Location
    Pacific Northwest, Earth
    Posts
    2,694
    Let me try and respond:

    PS_1.4 = At the time, a GF3 with PS_1.1 could run pixel shader ops in the available games at 30 FPS at best, while the GF3's supposed killer, the R8500 with PS_1.4, performed at 1/2 to 1/3 the speed of a GF3. 10 FPS is useless for games and only good for cartoon-like demos. The R8500 is still exactly the same today compared to any GF3 for PS_1.4 ops.
    I don't know your source for saying the R8500 had 1/2 to 1/3 the performance of the GF3. All the references I have seen show it performing roughly on par with a GF3 on release, and the later drivers brought its speed up to near parity with a GF4 Ti4200. It was a bit slower, but not a lot. Since the GF3 did not support PS_1.4, it would always run any game with multiple shader paths on the PS_1.1 path. A GF4 Ti would run either the PS_1.1 or PS_1.3 path. Neither had hardware support for PS_1.4 and so would never run a PS_1.4 path. In fact, most benchmarks would compare both cards running a PS_1.1 path, which shows nothing of PS_1.4 performance.
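
    For what it's worth, here's a minimal sketch in C of how a game with multiple shader paths picks one, using the Direct3D 9 caps structure. The ChooseShaderPath function and the PATH_* names are my own inventions for illustration, not any particular engine's code:

        #include <d3d9.h>

        typedef enum {
            PATH_FIXED, PATH_PS11, PATH_PS13, PATH_PS14, PATH_PS20
        } ShaderPath;

        /* Return the best shader path the installed card can actually run,
         * based on the pixel shader version it reports. */
        ShaderPath ChooseShaderPath(IDirect3DDevice9 *device)
        {
            D3DCAPS9 caps;
            IDirect3DDevice9_GetDeviceCaps(device, &caps);

            if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
                return PATH_PS20;   /* Radeon 9700, GeForce FX */
            if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
                return PATH_PS14;   /* Radeon 8500 lands here */
            if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 3))
                return PATH_PS13;   /* GF4 Ti */
            if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
                return PATH_PS11;   /* GF3 never gets past this line */
            return PATH_FIXED;      /* no pixel shaders at all */
        }

    Since a GF3 reports 1.1 and a GF4 Ti reports 1.3, neither ever reaches the PS_1.4 branch, which is exactly why their benchmarks tell you nothing about PS_1.4 performance.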

    Regardless of the differences, very few games ever supported PS_1.4, and so the 8500's hardware support never came to much. Still, the comparative performance between a Radeon 8500 and a Gf3 when running PS_1.4 shaders hasn't changed. The Radeon can run them, the Gf3 can't.


    PS_2.0 = Broken crap, since nothing, and exactly nothing, not even PS_1.1, let alone miracles, can do anything with the broken pixels underneath. My super-duper 9700 Pro lacked the very simplest base hardware feature: perspective-correct anisotropic texture mapping. But for plain old trilinear filtering, even the OLD GF2 had proper perspective-correct filtering.
    Yes, the Radeon 9700 Pro uses an adaptive algorithm for anisotropic filtering. This has never been a secret, and it was a deliberate choice on ATI's part to boost performance. It was a compromise between image quality and performance that seems to have worked well enough that Nvidia has gone to an adaptive mode for the NV40. Still, it's well known that this choice has cost ATI sales.

    However, I fail to see the relevance between ATI's imperfect AF implementation and PS_2.0 support, since ATI's AF is going to be exactly the same whether the application uses shaders or not. It's a valid complaint against ATI, but it has nothing to do with the question I asked.

    The fact remains that many more cards have been shipped with PS_1.4 and PS_2.0 support than PS_3.0 support, but PS_3.0 is the one that Nvidia is saying is important to take up immediately.
    "Dude you're getting a Dell." Obscure curse from the early 21st Century, ascribed to a minor demon-spirit known as "Stephen?" [sp].

  8. #8
    Banned
    Join Date
    Nov 2001
    Posts
    29
    Originally posted by Rugor
    Let me try and respond:

    I don't know your source for saying the R8500 had 1/2 to 1/3 the performance of the GF3. All the references I have seen show it performing roughly on par with a GF3 on release, and the later drivers brought its speed up to near parity with a GF4 Ti4200. It was a bit slower, but not a lot. Since the GF3 did not support PS_1.4, it would always run any game with multiple shader paths on the PS_1.1 path. A GF4 Ti would run either the PS_1.1 or PS_1.3 path. Neither had hardware support for PS_1.4 and so would never run a PS_1.4 path. In fact, most benchmarks would compare both cards running a PS_1.1 path, which shows nothing of PS_1.4 performance.
    I needed no other sources: I have a brain, I had an 8500, and I worked on graphics hardware for a living.

    Regardless of the differences, very few games ever supported PS_1.4, and so the 8500's hardware support never came to much. Still, the comparative performance between a Radeon 8500 and a Gf3 when running PS_1.4 shaders hasn't changed. The Radeon can run them, the Gf3 can't.
    No one bothered because the 8500 can't do PS_1.4 worth a sh*t, and still can't, except for cartoons. Was it because nVIDIA couldn't do dependent features? Nope, process technology was insufficient for the feature.

    Yes, the Radeon 9700 Pro uses an adaptive algorithm for anisotropic filtering. This has never been a secret, and it was a deliberate choice on ATI's part to boost performance. It was a compromise between image quality and performance that seems to have worked well enough that Nvidia has gone to an adaptive mode for the NV40. Still, it's well known that this choice has cost ATI sales.
    All anisotropic filters are adaptive; their job is to adapt to non-isotropic surfaces. nVIDIA just provided what the dumbsh*ts wanted - the 2D-side-scroller adaptive version alongside their own superior version.
    However, I fail to see the relevance between ATI's imperfect AF implementation and PS_2.0 support, since ATI's AF is going to be exactly the same whether the application uses shaders or not. It's a valid complaint against ATI, but it has nothing to do with the question I asked.
    Nah, all PS_2.0 pixel-shaded 3D objects had to have trilinear hacks per object. All of them are blurry crap at 24-bit accuracy.

    The fact remains that many more cards have been shipped with PS_1.4 and PS_2.0 support than PS_3.0 support, but PS_3.0 is the one that Nvidia is saying is important to take up immediately.
    The fact remains that the ignorant dumbsh*ts waved pom-poms and weren't good for much else.
    Last edited by nam_ng; 05-12-2004 at 08:20 PM.

  9. #9
    Banned
    Join Date
    Mar 2004
    Posts
    5
    And when is it coming out???

  10. #10
    Ultimate Member Rugor's Avatar
    Join Date
    Dec 2001
    Location
    Pacific Northwest, Earth
    Posts
    2,694
    A few points:

    PS_2.0 requires a minimum of FP24 for so-called "full precision." ATI runs it right at that minimum, as they figured it was the best compromise of speed, accuracy and transistor count for their cards. Nvidia chose a different compromise: FP32 full precision and FP16 partial precision.

    When the FX5800 Ultra was originally released, before Nvidia started aggressively optimizing their drivers, it was fully capable of running PS_2.0 in 32-bit precision with trilinear anisotropic filtering.

    PS_2.0 is not always blurry, not always FP24, and does not always use hacked trilinear. Even ATI's much maligned implementation is capable of performing trilinear anisotropic filtering on all texture stages on the entire screen. The catch is that different degrees of anisotropy are applied at certain angles where it's harder to notice. It's not a perfect solution, but it works for most people.
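
    To make the "adaptive" part concrete, here's a rough C sketch of the usual way hardware estimates how much anisotropy a pixel needs from its screen-space texture gradients. The names are mine, and real chips use fixed-point approximations of this, so treat it as an illustration rather than any vendor's actual circuit:

        #include <math.h>

        /* Estimate how many samples anisotropic filtering needs for one
         * pixel, given the texture-coordinate derivatives across screen
         * x and y (the pixel's "footprint" in texture space). */
        float AnisoSamples(float dudx, float dvdx,
                           float dudy, float dvdy, float max_aniso)
        {
            float px = sqrtf(dudx * dudx + dvdx * dvdx);
            float py = sqrtf(dudy * dudy + dvdy * dvdy);
            float pmax = fmaxf(px, py);   /* long axis of the footprint  */
            float pmin = fminf(px, py);   /* short axis of the footprint */
            if (pmin <= 0.0f)
                pmin = 1e-6f;             /* guard against divide-by-zero */
            /* A ratio of 1.0 is isotropic; bigger ratios need more
             * samples along the long axis, up to the hardware cap. */
            return fminf(ceilf(pmax / pmin), max_aniso);
        }

    An angle-dependent implementation like ATI's effectively caps the sample count for footprints at awkward angles, trading accuracy it hopes you won't notice for speed.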

    However, once again, neither trilinear nor anisotropic filtering modes are dependent on the Pixel Shader model in use, and any attempt to link ATI's trilinear implementation to PS_2.0 is at best incorrect and at worst misleading.

    Also, if the 24-bit floating point precision of PS_2.0 generates "blurry cr*p," then what about previous pixel shader versions that ran at 12-bit integer precision? Shouldn't those generate an even blurrier image with worse trilinear?
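
    As a back-of-the-envelope check, here's a small C program comparing the step sizes. I'm assuming FP24 carries a 16-bit mantissa (as on the R300) and that the old 12-bit integer format is sign/1.10 fixed point with a range of roughly [-2, 2), so treat the exact layouts as my assumptions:

        #include <math.h>
        #include <stdio.h>

        /* Round x to a float with 'mant_bits' explicit mantissa bits. */
        static double QuantizeFloat(double x, int mant_bits)
        {
            int e;
            double m = frexp(x, &e);    /* m is in [0.5, 1) */
            double scale = ldexp(1.0, mant_bits + 1);
            return ldexp(round(m * scale) / scale, e);
        }

        /* Round x to 12-bit fixed point: sign, 1 integer bit, 10
         * fraction bits, clamped to the format's [-2, 2) range. */
        static double QuantizeFixed12(double x)
        {
            double q = round(x * 1024.0) / 1024.0;
            return fmin(fmax(q, -2.0), 2.0 - 1.0 / 1024.0);
        }

        int main(void)
        {
            double c = 0.7001234;   /* an arbitrary color value */
            printf("FP24 (16-bit mantissa): %.7f\n", QuantizeFloat(c, 16));
            printf("FX12 (10 fraction bits): %.7f\n", QuantizeFixed12(c));
            /* Steps near 1.0: about 2^-17 (0.0000076) for FP24 versus
             * 2^-10 (0.00098) for the fixed-point format. */
            return 0;
        }

    Near 1.0 the floating-point steps come out over a hundred times finer, so by that logic the old integer pipelines should have looked far worse.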

    Also, Nvidia often forced 16-bit precision in PS_2.0 even when 24-bit or higher was called for, yet still used full trilinear (before it was disabled through the driver). How was this possible?

    I still want to know why Nvidia believes we need to take up PS_3.0 immediately when not only is PS_2.0 much more widely spread but, thanks to the FX5200, the majority of the PS_2.0 capable install base is Nvidia cards.
    "Dude you're getting a Dell." Obscure curse from the early 21st Century, ascribed to a minor demon-spirit known as "Stephen?" [sp].

  11. #11
    Ultimate Member Someone Stupid's Avatar
    Join Date
    Oct 2002
    Posts
    3,133
    nam, cool off. Make your points, but don't talk like the other person is completely ignorant because you disagree. Losing the aggression would help you get your points across. You could be completely right, and you could have made your argument in the post above, but with posts like that, eventually nobody will read them. Chill out a bit or you'll get the thread closed.

  12. #12
    Banned
    Join Date
    Nov 2001
    Posts
    29
    Originally posted by Someone Stupid

    nam, cool off. Make your points, but don't talk like the other person is completely ignorant because you disagree. Losing the aggression would help you get your points across. You could be completely right, and you could have made your argument in the post above, but with posts like that, eventually nobody will read them. Chill out a bit or you'll get the thread closed.
    You're right, I'll stop talking now.

    I suppose the webmasters of this site invited nVIDIA's manager here to answer questions, posed in superior English, from pom-pom waving ignorant stupid f*ck fanATIcs, and that there are people here who actually want to read them.
    Last edited by nam_ng; 05-13-2004 at 07:06 AM.

  13. #13
    Ultimate Member MadPistol's Avatar
    Join Date
    Aug 2003
    Location
    In your thoughts, or AL
    Posts
    1,359
    Now you just sound stupid. Cool down a little. The cards aren't even out yet, and you're getting mad over them. You probably need to watch the cursing too; that gets you banned.

  14. #14
    Your Friendly Editor
    Join Date
    Oct 2002
    Posts
    82
    <<You're right, I'll stop talking now. >>

    Please do. While you may know a thing or two about graphics hardware, don't presume to know why I do things on this site outside of the forums. If you'd ever worked in print or web publishing, you would know that just because you have a question for a company doesn't mean they have to answer to you.
    SysOpt's Illustrious Editor

  15. #15
    Banned
    Join Date
    Nov 2001
    Posts
    29
    Originally posted by MadPistol

    Now you just sound stupid.
    I suppose yours is filled with intelligence.
    Cool down a little. The cards aren't even out yet, and you're getting mad over them. You probably need to watch the cursing too; that gets you banned.
    And what else is new? Yes, it is normal in most places to be a pom-pom waving ignorant stupid f*ck fanATIc.

    However, talking back to pom-pom waving ignorant stupid f*ck fanATIcs is cursing and a heinous crime.
