ATI Radeon 9800 Pro vs. nVIDIA GeForce FX 5900

Discussion in 'General Hardware' started by Unrealer, May 18, 2003.


ATi vs nVIDIA once again

Poll closed Jun 17, 2003.
  1. ATI radeon 9800Pro

    22 vote(s)
    50.0%
  2. nVIDIA Geforce FX 5900

    22 vote(s)
    50.0%
  1. Sivo

    Sivo Master Guru

    Messages:
    726
    Likes Received:
    0
    GPU:
    Radeon 9600
    Just to bump this back up and keep it going:

    So, the ATi Radeon 9800 Pro currently runs HL2 better than the nVidia GeForce FX 5900 Ultra.

    Am I also right that this means, in theory, that all DX9 games using the same form of shading as HL2 will run better on the ATi card?

    I have no technical knowledge when it comes to this, so if I've made the wrong assumption then please correct me.

    The main gist of this post is: is it possible for nVidia to completely fix the current problem?
     
  2. fallenx

    fallenx Ancient Guru

    Messages:
    1,981
    Likes Received:
    0
    GPU:
    Galaxy 8800GT
    No, it's not possible. Not without "cheating".

    Notice the quotation marks; I'm using the term rather loosely. Because nVidia renders the shader effects at 32-bit (rather slowly, might I add), its quality will be higher than ATi's 24-bit precision. What nVidia could do is use code in the drivers to "lower" image quality in some areas, while the inherent 32-bit advantage covers it up.

    Basically all DX9 games will use the same form of shading as HL2, though; what DX9 does is set a "standard" to use.
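
    To put rough numbers on that precision gap, here's a minimal C sketch (purely illustrative; not shader or driver code) that rounds a 32-bit float's mantissa down to the widths of the formats being discussed - 10 bits for FP16 and 16 bits for the R3xx's FP24, against the full 23 bits of FP32 - and prints the rounding error each one introduces. Exponent range and denormals are ignored for simplicity.

        #include <stdio.h>
        #include <stdint.h>
        #include <string.h>

        /* Round a float's 23-bit mantissa to `bits` bits, round-to-nearest.
           Assumes 0 < bits < 23; the exponent field is left untouched. */
        static float quantize_mantissa(float x, int bits)
        {
            uint32_t u;
            memcpy(&u, &x, sizeof u);          /* reinterpret the bits safely */
            int drop = 23 - bits;              /* mantissa bits to discard */
            u = (u + (1u << (drop - 1))) & ~((1u << drop) - 1u);
            memcpy(&x, &u, sizeof u);
            return x;
        }

        int main(void)
        {
            float v = 0.123456789f;                /* a typical shader intermediate */
            float v24 = quantize_mantissa(v, 16);  /* FP24-style: 16-bit mantissa */
            float v16 = quantize_mantissa(v, 10);  /* FP16-style: 10-bit mantissa */
            printf("FP32: %.9f\n", v);
            printf("FP24: %.9f (error %g)\n", v24, v - v24);
            printf("FP16: %.9f (error %g)\n", v16, v - v16);
            return 0;
        }

    The FP16 error works out to roughly one part in two thousand per operation, versus about one part in a hundred thousand for FP24 - small either way, but a very real gap once a shader chains many operations together.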
     
  3. WeaZel

    WeaZel Guest


    Gabe Newell even said that just because Valve spent so much time optimizing for NV3x doesn't mean other companies are going to - like smaller game companies. Also, those optimizations are HL2-specific, so they can't be used by games based on HL2's engine.
     
  4. serAph

    serAph Guest

    You mean FP32. And actually, it won't be. That's what the 5x.xx drivers are doing: taking FP32 precision and downgrading it to FP16 and 12-bit on the fly.
     

  5. WeaZel

    WeaZel Guest

    I still don't see that as optimizing.

    Optimizing means making better. The image quality isn't better. The speed is.

    I hope everything turns out all right for the guys that bought NV3x. Valve shouldn't be punished for nVidia's eager-beaverness (32-bit precision instead of the DirectX 9 standard used across gaming hardware).
     
  6. serAph

    serAph Guest

    It's optimizing because it does that on every single app it runs. If it only did that for HL2, then it'd be a cheat.

    It IS making it better, because you get a significant FPS increase from doing it and it'll look virtually no different than FP24.
    Whoops, didn't mean to let that cat outta the bag :D


    Anyway, it's not the 32-bit thing that's the problem. Even at 16-bit, the NV3x doesn't shade anywhere near as fast as the R3xx series. The 32-bit thing is just the nail in the coffin - well, at least for Direct3D apps...
     
    Last edited by a moderator: Sep 23, 2003
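
    For a back-of-the-envelope check on "virtually no different", this small C snippet compares the smallest representable step near 1.0 for each shader format against one step of an 8-bit-per-channel framebuffer (mantissa width is the only format detail modeled here):

        #include <stdio.h>
        #include <math.h>

        int main(void)
        {
            printf("FP16 step near 1.0:     %g\n", pow(2, -10)); /* ~0.00098 */
            printf("FP24 step near 1.0:     %g\n", pow(2, -16)); /* ~0.000015 */
            printf("FP32 step near 1.0:     %g\n", pow(2, -23)); /* ~0.00000012 */
            printf("8-bit framebuffer step: %g\n", 1.0 / 255);   /* ~0.0039 */
            return 0;
        }

    A single FP16 result quantizes finer than an 8-bit display can even show, so per-operation the claim holds up; where FP16 can become visible is in long chains of dependent instructions, and in values outside the 0-1 range - which is exactly where HDR lives.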
  7. WeaZel

    WeaZel Guest

    That's the thing: it doesn't look the same.

    They're trading DirectX 9 code for DirectX 8/8.1 code. Water refraction looks totally different when comparing 8.0 vs 9.0.

    http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/hl2_followup/shotdx.htm

    I knew I'd find it.

    Also look at how crisp the textures are in DirectX 9 vs 8.

    And again there's the whole HDR lighting, which you won't get in that mixed FX mode.

    So is me running without v-sync on an optimization? Screen tearing doesn't add to the game... well, it adds an annoyance.
     
  8. serAph

    serAph Guest

    Sorry WeaZel, you can't abide by some standards and not others. By standard definition, an "optimization" cannot be application-specific, else it's a cheat. The recompiled code paths are not app-specific.

    Do you know which aspects of HL2 will be run in DX9 mode when in mixed mode, by chance?
     
  9. Sivo

    Sivo Master Guru

    Messages:
    726
    Likes Received:
    0
    GPU:
    Radeon 9600
    So... am I right in saying that the nVidia is the better card to go for, if you don't mind the FP precision being 16 instead of 24?
     
  10. serAph

    serAph Guest

    hmm... perhaps after 5x.xx gets refined...
     

  11. Sivo

    Sivo Master Guru

    Messages:
    726
    Likes Received:
    0
    GPU:
    Radeon 9600
    Any idea what the current turnaround rate for new nVidia drivers is? I know at one point it was very low. I'll be buying a new card in January, so this is just speculation.
     
  12. serAph

    serAph Guest

    Well, they're already overdue with 5x.xx, so I can't see it taking them much longer.

    Also keep in mind that IBM is starting to build chips for nVidia too - starting with the NV36, and supposedly the NV36 is something worth keeping an eye on...
     
  13. TheDigitalJedi

    TheDigitalJedi Ancient Guru

    Messages:
    1,851
    Likes Received:
    161
    GPU:
    RTX 3090 + CX OLED
    *This thread shall rise from the ashes like the Phoenix*

    :p

    What are everyone's theories regarding the beta Dets and shader performance? I'm sure this question has been answered thousands of times in other threads. I wish the newbs would do a search before posting a new thread. LoL.

    Before I go back up north, I'm going to put together some pics of current games on both pieces of hardware.
     
  14. Cudda Kine

    Cudda Kine Member Guru

    Messages:
    158
    Likes Received:
    0
    GPU:
    PNY Verto® GeForce™ FX 5700 Ultra DDRII - 128MB
    What Detonator version will the 5x.xx officially be from nVidia?

    Will it be 50.xx or 51.xx or whatever?
     
  15. TheDigitalJedi

    TheDigitalJedi Ancient Guru

    Messages:
    1,851
    Likes Received:
    161
    GPU:
    RTX 3090 + CX OLED
    According to rumors, there may be a release this month. It'll probably be Detonator 52.xx, but don't quote me on that.
     

  16. Grov

    Grov Guest

    Why oh why did ya bump this?

    Anyway, I've got something that might cause some discussion: it would be interesting to see what the results would be if this poll were run now. :D
     
  17. s3R!4LK!LL3r

    s3R!4LK!LL3r Guest

    There's no way you can give a card more performance just by making new drivers... I mean, nVidia has changed and adjusted its code before, and now the only thing left to adjust is cutting out quality to gain performance... that is a totally ridiculous action by a company.

    EXAMPLE:

    nVidia Detonator 45.23 (Official), AA = 4x and AF = Auto
    http://www.extremepc.com.br/imagebank/bagulho/Gp4nvidia4523.jpg

    nVidia ForceWare 52.13 (Beta), AA = 4x and AF = Auto
    http://www.extremepc.com.br/imagebank/bagulho/Gp4nvidia5213.jpg


    SOURCE - *** ForumPCs ***

    You can clearly see how blurred the distant textures are!
     
  18. serAph

    serAph Guest

    Actually, drivers give cards large performance increases quite frequently - both Dets and Cats. nVidia has recently delivered lossless performance gains with its variable-precision code, which is a perfect example of leveraging hardware features for performance. The ATi hardware simply isn't capable of rendering multiple shader precisions like the NV3x series is, and thus cannot benefit from doing such things.

    What makes you think performance gains from the software that interfaces with the hardware are impossible? :confused:

    Besides that, the problem you're referring to is only AF quality/speed. Should they remove that optimization, the drivers would still be significantly faster in applications involving PS 2.0.
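
    Since the variable-precision point keeps coming up, here's a toy C sketch of the concept: a "driver" pass walks a simplified shader instruction list and marks which ops could run at reduced precision. The instruction names and the eligibility rule are invented for illustration - nothing here reflects nVidia's actual recompiler.

        #include <stdio.h>

        typedef enum { OP_TEXCOORD, OP_TEXLOAD, OP_COLOR_MUL, OP_COLOR_ADD } Op;
        typedef struct { Op op; int fp16_ok; } Instr;

        static const char *name(Op op)
        {
            switch (op) {
            case OP_TEXCOORD:  return "texcoord";
            case OP_TEXLOAD:   return "texload";
            case OP_COLOR_MUL: return "color_mul";
            default:           return "color_add";
            }
        }

        /* Made-up heuristic: color arithmetic tolerates FP16, address math
           does not (texture coordinates need full precision). Real drivers
           analyze actual shader bytecode with far more care. */
        static void pick_precision(Instr *prog, int n)
        {
            for (int i = 0; i < n; i++)
                prog[i].fp16_ok = (prog[i].op == OP_COLOR_MUL ||
                                   prog[i].op == OP_COLOR_ADD);
        }

        int main(void)
        {
            Instr prog[] = { {OP_TEXCOORD, 0}, {OP_TEXLOAD, 0},
                             {OP_COLOR_MUL, 0}, {OP_COLOR_ADD, 0} };
            int n = (int)(sizeof prog / sizeof prog[0]);

            pick_precision(prog, n);
            for (int i = 0; i < n; i++)
                printf("%-9s -> %s\n", name(prog[i].op),
                       prog[i].fp16_ok ? "FP16" : "FP32");
            return 0;
        }

    The design idea is simply that color math usually tolerates FP16 while address math doesn't, so a per-instruction choice can buy speed without an obvious quality hit.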
     
  19. Sivo

    Sivo Master Guru

    Messages:
    726
    Likes Received:
    0
    GPU:
    Radeon 9600
    Man, back from the dead.

    To answer Grov: I think it would be roughly the same. I mean, currently the poll states ATi 51 and nVidia 34. At one point I think the ATi votes would have skyrocketed even further, but now that people are finally realising that the nVidia card, although not the best, isn't a failure, the scores would be about the same.

    My opinion on drivers and performance is: yes, you most definitely can improve performance, IQ, whatever. Why? Because it's been done - with the GeForce 2, and on a bigger scale with all the old ATi cards. A **** driver will make a top-end card run like ****. So a good driver will show the full potential of the card. BUT, of course, you could optimise drivers in one area and decrease their effect in another area if the card isn't doing well. That is an optimisation not normally found in driver releases.
     
  20. s3R!4LK!LL3r

    s3R!4LK!LL3r Guest

    Performance only by cutting stuff out...
    They are only optimizing for synthetic benchmarks - trading quality away for a performance increase. And look, that's a game above, not a synthetic benchmark.
    Aniso filtering is one of the most important things in games - or do you want to see blur in the distance??? Play BF1942 and you'll know what I mean.
    Only the NV38 has full Pixel Shader 2.0 support without bugs.
     
