
Thread: ATI Driver Cheating???

  1. #46
    Ultimate Member Strawbs's Avatar
    Join Date
    Sep 2001
    Posts
    4,706
Whilst I agree the end result seems to be better or equal IQ without any performance hit, the fact that ATi appear not only to have missed a golden opportunity to market this technology as a triumph, but to have tried to hide it from the reviewers, has me smelling ...something or other. The Q&As don't give a valid reason for that, and this whole episode is very strange.

I'm now left wondering if they intended to market this later as a "New" feature, whether this particular method is not yet patented, whether it may be a method that nVidia or any other competitor could copy without the need of a license, or whether maybe it's part of ATi's "Secret Recipe" uncovered.
    Last edited by Strawbs; 05-25-2004 at 05:42 PM.

  2. #47
    Ultimate Member morpheus kain's Avatar
    Join Date
    Nov 2001
    Location
    My Own Personal Hell
    Posts
    1,440
To hell with it, ATI all the way. Better IQ AND better framerate!
    -"Don't touch that!!!!!" -ZAPPPPP!- Hehe yet another excuse to upgrade-

  3. #48
    Banned zybch's Avatar
    Join Date
    Jun 2002
    Posts
    2,292
I'm not sure they purposefully hid it; perhaps they just never thought people would cry foul about tricksy filters when the quality remains the same but speed increases. Some people are never happy.
    They should have marketed it as a feature though.

I know that id first demoed Doom 3 on a 1.5-year-old Radeon, which should mean it'll play fine on most cards less than a year old, but I read somewhere that OpenGL wasn't as fast on ATI as on nVidia.

  4. #49
    Ultimate Member Rugor's Avatar
    Join Date
    Dec 2001
    Location
    Pacific Northwest, Earth
    Posts
    2,694
    ATI is on record as saying that they are awaiting a patent on the algorithm used, and are doing it in software. They will not release the details of how they are doing it in order to protect their intellectual property.
    "Dude you're getting a Dell." Obscure curse from the early 21st Century, ascribed to a minor demon-spirit known as "Stephen?" [sp].

  5. #50
    Ultimate Member Vampiel's Avatar
    Join Date
    Sep 1999
    Location
    Dark side of the house
    Posts
    2,760
    Originally posted by morpheus kain
    Better IQ AND better framerate!
You're basing that on one single generation of cards. If you look at the history, NVIDIA has them hands down.

  6. #51
    Ultimate Member morpheus kain's Avatar
    Join Date
    Nov 2001
    Location
    My Own Personal Hell
    Posts
    1,440
Actually it's two generations of cards (the 9700/9800 series and the new X800s), thank you very much, AND I'm not saying that's a permanent decision, just something that applies to the current situation. The tech market changes so fast it would be stupid to make a decision about anything in the future based on the past.

Also, ATI is trying to keep this under wraps to protect their own intellectual property, like Rugor said, because the technique they use obviously works very well since there is no visual difference. And don't tell me there IS a difference, because if you need a frikkin microscope to tell, you'd better be playing games with that same microscope too.
    Last edited by morpheus kain; 05-25-2004 at 07:03 PM.
    -"Don't touch that!!!!!" -ZAPPPPP!- Hehe yet another excuse to upgrade-

  7. #52
    Ultimate Member Vampiel's Avatar
    Join Date
    Sep 1999
    Location
    Dark side of the house
    Posts
    2,760
It's yet to be seen who wins the new war. Calling it for the X800 is a premature decision, as the final drivers are not even out yet and nVidia's true power has yet to be tested.

My unbiased opinion would be that if you are going to keep the card for a length of time, get the nVidia rather than the X800 because of SM3.0. With framerates this close there's no clear-cut case here, so that's just my opinion, but you can't really say it beats the nVidia in the benchmark tests by any noticeable amount, if any.
Again, this is a premature decision.
    Last edited by Vampiel; 05-25-2004 at 07:36 PM.

  8. #53
    Banned zybch's Avatar
    Join Date
    Jun 2002
    Posts
    2,292
    It always ends up the same. No matter which card you buy, it'll be out of date and too slow by the time games come out that can make use of the new features.
    Just look at the GeForce 4Ti cards. They were replaced by the FX range well before pixel shaders became commonplace in most games.

  9. #54
    Ultimate Member Rugor's Avatar
    Join Date
    Dec 2001
    Location
    Pacific Northwest, Earth
    Posts
    2,694
    Here are my opinions-- take what bias you think or don't think is there:

Best card to buy right now: Radeon 9800Pro 128MB. It's fast enough to be CPU limited on all but the very highest-end systems, and you can find it for around US$175.

Best next-gen card to buy right now: It's a no-brainer. The X800 Pro is the only one on store shelves; it's the best purchase because you can't get any of the others.

    Best next-gen card to get overall-- I can't tell you that and won't until they've been out at least a few months and we can see what the games are telling us.

If history is any guide, the fact that Nvidia has SM3.0 support won't mean much, if anything. ATI was first with PS1.4 support, only to find the Radeon 8500 series thoroughly thrashed by the Gf4Ti, which only supported PS1.3. ATI was also first with PS2.0, but with the NV3x's lackluster performance in PS2.0, uptake has been slow there as well. SM3.0 is the first shader version since PS1.1 that Nvidia has managed to get out the door before ATI.

    The truth of the matter is that most if not all SM3.0 effects can be handled with SM2.0 and that those few that can't involve shaders that are too long for any current hardware to run at reasonable speeds. So until SM3.0 actually hits the market in both hardware and software I don't think I will worry too much about it.
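Just to illustrate what I mean (a rough C sketch, not anything ATI or Nvidia actually does; the 64 below is roughly the ps_2_0 arithmetic instruction limit, and passes_needed is a made-up function):

Code:
#include <stdio.h>

/* Illustrative only: an effect "too long" for SM2.0 can often be split
   into several SM2.0-sized rendering passes instead. */
#define SM20_ARITH_SLOTS 64  /* approximate ps_2_0 arithmetic slot limit */

static int passes_needed(int shader_length)
{
    /* round up: each pass runs at most SM20_ARITH_SLOTS instructions */
    return (shader_length + SM20_ARITH_SLOTS - 1) / SM20_ARITH_SLOTS;
}

int main(void)
{
    /* e.g. a 150-instruction effect would take 3 SM2.0 passes */
    printf("150-instruction shader: %d SM2.0 passes\n", passes_needed(150));
    return 0;
}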

    Until then I'll worry more about playability and image quality in new games and that will take a few months to settle out.
    "Dude you're getting a Dell." Obscure curse from the early 21st Century, ascribed to a minor demon-spirit known as "Stephen?" [sp].

  10. #55
    Ultimate Member morpheus kain's Avatar
    Join Date
    Nov 2001
    Location
    My Own Personal Hell
    Posts
    1,440
PS3.0 isn't going to mean anything because the new Nvidia card isn't going to have the power to run it.

All of the reviews that I've read on the X800 show it flexing quite a large amount of raw pixel power, and a decent bit more than Nvidia has been able to produce, too.
    Last edited by morpheus kain; 05-25-2004 at 09:56 PM.
    -"Don't touch that!!!!!" -ZAPPPPP!- Hehe yet another excuse to upgrade-

  11. #56
    Ultimate Member Vampiel's Avatar
    Join Date
    Sep 1999
    Location
    Dark side of the house
    Posts
    2,760
I'd like to see those reviews. Sure, maybe it gets .01 fps more in Word, but nothing that makes a big difference.

Originally posted by Rugor
Best card to buy right now: Radeon 9800Pro 128MB. It's fast enough to be CPU limited on all but the very highest-end systems, and you can find it for around US$175.
Link please, from a non-shady dealer.

Originally posted by Rugor
Best next-gen card to buy right now: It's a no-brainer. The X800 Pro is the only one on store shelves; it's the best purchase because you can't get any of the others.
Bzzz, try again: the 6800 is here on the 27th.

  12. #57
    Ultimate Member Rugor's Avatar
    Join Date
    Dec 2001
    Location
    Pacific Northwest, Earth
    Posts
    2,694
    Newegg had the 128-bit edition for $175, though the much better 256-bit version is $206. So it is a bit higher right now than I thought. However, since faster cards are largely CPU limited that's still a good deal and the 9800Pro is a better card than the FX59x0 series.

As to the 6800 series being available on the 27th: reality check, please. It's not the 27th yet, and I specifically said "right now" for a reason.
    "Dude you're getting a Dell." Obscure curse from the early 21st Century, ascribed to a minor demon-spirit known as "Stephen?" [sp].

  13. #58
    Banned zybch's Avatar
    Join Date
    Jun 2002
    Posts
    2,292
It would seem from several sites that the cards are pretty evenly matched; that is, until you turn on AA/AF filtering, and then the ATI cards have a small advantage.
Once you get past a certain frame rate, though, it's simply not possible to tell which card is faster unless you run something like Fraps to actually tell you.
Anything above 30 is playable, and anything above 60-75 isn't going to be noticeably different from one card to the next; you will in fact lose visible frames depending on what refresh rate your monitor is set to. A monitor refreshing at 75Hz simply cannot display more than 75fps unless you don't mind tearing and stuff like that.
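To put rough numbers on that (a toy C sketch; visible_fps is made up for illustration, not from any real API):

Code:
#include <stdio.h>

/* Toy model: with vsync on, the monitor's refresh rate caps how many
   distinct frames per second you can actually see. */
static double visible_fps(double render_fps, double refresh_hz, int vsync)
{
    if (vsync)
        return render_fps < refresh_hz ? render_fps : refresh_hz;
    /* With vsync off the GPU can push more frames, but the extras show
       up as torn partial frames, not extra smoothness. */
    return refresh_hz;
}

int main(void)
{
    printf("120 fps rendered on a 75 Hz monitor: %.0f fps visible\n",
           visible_fps(120.0, 75.0, 1));
    printf(" 60 fps rendered on a 75 Hz monitor: %.0f fps visible\n",
           visible_fps(60.0, 75.0, 1));
    return 0;
}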

  14. #59
    Member jamil5454's Avatar
    Join Date
    Oct 2003
    Location
    houston
    Posts
    279
The only place SM3.0 might be useful is in cinema-studio rendering, because using shader lengths of over 100 on current hardware is unplayable. The next-gen ATI cards focus more on increasing the horsepower so that you can have shader lengths of 150+ and still be able to render in real time. Even Half-Life 2 uses shader lengths no longer than ~50, while the Ruby demo uses somewhere around 100 (I think).

But there is one thing about SM3.0 that could be useful: displacement mapping. From my understanding, this feature is like bump mapping except it actually raises polygons out of the texture instead of just adding shadows to make objects in the texture look raised. This way, you can create multiple polygons from one texture, so that when you move close to the texture you can actually see the objects raised up, as opposed to bump mapping, where the objects appear flat.
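Here's a toy CPU-side sketch of the difference (in C; the names, types, and math are purely illustrative, not how any card or driver does it):

Code:
#include <math.h>
#include <stdio.h>

typedef struct { float x, y, z; } Vec3;

/* Bump mapping: the geometry never moves; we only tilt the normal
   using the heightmap's gradient so lighting *looks* bumpy. Up close,
   the surface and its silhouette stay flat. */
static Vec3 bump_normal(Vec3 n, Vec3 height_gradient)
{
    Vec3 out = { n.x - height_gradient.x,
                 n.y - height_gradient.y,
                 n.z };
    float len = sqrtf(out.x*out.x + out.y*out.y + out.z*out.z);
    out.x /= len; out.y /= len; out.z /= len;
    return out;
}

/* Displacement mapping: the vertex itself is pushed along its normal
   by the height sampled from the texture, so the raised detail is
   real geometry you can still see from the side. */
static Vec3 displace_vertex(Vec3 p, Vec3 n, float height, float scale)
{
    Vec3 out = { p.x + n.x * height * scale,
                 p.y + n.y * height * scale,
                 p.z + n.z * height * scale };
    return out;
}

int main(void)
{
    Vec3 n = { 0.0f, 0.0f, 1.0f };   /* flat surface normal */
    Vec3 g = { 0.3f, 0.1f, 0.0f };   /* heightmap gradient at this point */
    Vec3 p = { 1.0f, 2.0f, 0.0f };   /* vertex position */
    Vec3 bn = bump_normal(n, g);
    Vec3 dp = displace_vertex(p, n, 0.5f, 2.0f);
    printf("bumped normal:     (%.2f, %.2f, %.2f)\n", bn.x, bn.y, bn.z);
    printf("displaced vertex:  (%.2f, %.2f, %.2f)\n", dp.x, dp.y, dp.z);
    return 0;
}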

So basically I think there is still a ton of headroom (longer shader lengths) for image quality and shader improvements in SM2.0; we just need more horsepower to be able to benefit from it. That's why ATI basically made an overpowered R360 for their R420 series. It's also nice to be able to play at 1600x1200 w/ 4xAA & 8xAF.
    Last edited by jamil5454; 05-26-2004 at 12:51 AM.
    SUPER COMPUTER

    palomino 1800+
    ECS K7VMM+ (VIA KM266)
    Hercules fortissimo 7.1
    Maxtor 60gig 7200 2mb
    bfg asylum g4mx440-SE @ 340, 410
    crappy 17" monitor (1280x1024)

  15. #60
    Ultimate Member Strawbs's Avatar
    Join Date
    Sep 2001
    Posts
    4,706
Anyone buying the 6800 had better have a very hefty PSU handy; that card wants two power leads to run it! ATi, on the other hand, have managed to reduce the amount of power the X800 needs in comparison to the 9800XT! The 6800 will be of little use to anyone building a small-form-factor computer.
