Nvidia sucks in DirectX 9

Discussion in 'Videocards - NVIDIA GeForce' started by Nustian, Sep 15, 2003.

  1. Nustian

    Nustian Guest

    Hey friends,
    Nvidia hardware really does suck in DirectX 9. The new Detonator 51.75 drivers do increase DirectX 9 performance, but not by much, and Nvidia hardware won't be able to catch up with ATi hardware in HL2 because the performance delta is too big. This shows the weakness of Nvidia's DirectX 9 hardware. The obvious question is: "Then why is Nvidia hardware faster in Doom 3?" I saw a website that e-mailed John Carmack about exactly this, asking why Nvidia-based cards were faster than ATi ones in Doom 3 even though Nvidia hardware is slower in HL2. His reply was that Doom 3 uses a special code path for Nvidia hardware which runs partial-precision shaders, using 16-bit or 12-bit precision instead of 24-bit (24-bit precision is the PS 2.0 standard). ATi hardware, on the other hand, uses 24-bit precision, so naturally Nvidia hardware comes out faster in Doom 3. He further says that if full precision is used for both Nvidia and ATi hardware, ATi cards are considerably faster in Doom 3.
    The problem with Nvidia's DirectX 9 cards is that they do partial precision at 16 bits and full precision at 32 bits, so 24-bit precision simply isn't possible on them. Developers end up mixing 16-bit and 32-bit (partial/full precision) shaders, which hurts image quality as well as speed. Nvidia should have built 24-bit precision shaders into their DirectX 9 products, and if they wanted to go beyond the DirectX 9 spec of 24 bits (since their hardware does 32-bit at full precision), they should have made the hardware powerful enough to run games at silky-smooth frame rates even with full 32-bit precision shaders. So all the blame goes to Nvidia.
    ATi hardware, on the other hand, runs 24-bit at full precision, so there is no problem for ATi hardware in HL2.
    This goes to show that Nvidia hardware really lacks the power to run DirectX 9 titles, especially with higher precision for PS 2.0. We also see in HL2 that if the special Nvidia codepath is used, there is a considerable increase in performance for Nvidia cards. That is the piece of information I wanted to bring to you. You can judge for yourselves whether this is a driver problem for Nvidia-based cards or a lack of muscle in the hardware to run DirectX 9 titles. Don't think I am an ATi fan. I have an ASUS GeForce4 Ti 4200 128MB card myself!
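    To make the precision point concrete, here is a rough sketch of my own (not code from Valve, id or Nvidia) of what partial precision looks like at the source level: in DirectX 9 HLSL the "half" type is a hint that 16-bit precision is enough, while "float" requests full precision (24-bit on ATi, 32-bit on NV3x). The small C program below just holds two hypothetical pixel shader variants as strings; the sampler and constant names are made up for illustration.

    /* Hypothetical illustration: the same pixel shader written with full-precision
     * and partial-precision data types. On NV3x the "half" variant can run in the
     * faster 16-bit mode; ATi R3xx hardware runs both at its fixed 24-bit precision. */
    #include <stdio.h>

    static const char *full_precision_ps =
        "sampler baseMap;                               \n"
        "float4  tint;                                  \n"
        "float4 main(float2 uv : TEXCOORD0) : COLOR0 {  \n"
        "    float4 c = tex2D(baseMap, uv);             \n" /* 32-bit on NV3x, 24-bit on R3xx */
        "    return c * tint;                           \n"
        "}                                              \n";

    static const char *partial_precision_ps =
        "sampler baseMap;                               \n"
        "half4   tint;                                  \n"
        "half4 main(half2 uv : TEXCOORD0) : COLOR0 {    \n"
        "    half4 c = tex2D(baseMap, uv);              \n" /* 16-bit hint, much faster on NV3x */
        "    return c * tint;                           \n"
        "}                                              \n";

    int main(void)
    {
        /* In a real app these strings would be handed to the HLSL compiler,
         * e.g. with the ps_2_0 profile or the Nvidia-oriented ps_2_a profile. */
        printf("full precision shader:\n%s\n", full_precision_ps);
        printf("partial precision shader:\n%s\n", partial_precision_ps);
        return 0;
    }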

    Here is a review of the much-talked-about Detonator 50 drivers. It shows that although there are performance gains (10-15%), they are not enough to outpace the Radeon 9800 Pro, and the gains come at the cost of image quality.
    Here is the link:
    http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_rel50/001.htm

    Thanx for reading and take care.

    NUSTIAN

    System Specs:
    AMD Athlon(tm) 2400+
    Gigabyte GA-7VAXP KT400 Motherboard
    640MB of PC2100 Kingston RAM CAS 2.5
    ASUS GeForce4 Ti4200 128MB card (275/550)
    Creative Sound Blaster Live! 5.1
    20GB Maxtor 531DX ATA100 Hard Drive
    Philips 104S 14" Monitor
    Windows XP w/SP1
    VIA Hyperion v4.46
    Nvidia Detonator Driver v45.23
     
    Last edited by a moderator: Sep 16, 2003
  2. This has been gone over in great detail already, but some of your facts are plain wrong. The DX9 spec calls for 24-bit shaders, which ATI uses 100% of the time. The FX cards use 32-bit shaders in full precision and 16-bit shaders in partial precision. The Nvidia codepath in Doom 3 (and in HL2) uses a combination of Nvidia full/partial precision shaders. The reason Nvidia cards are slower in a pure DX9 codepath is twofold. One: the Nvidia cards use a 4x2 pipeline arrangement instead of an 8x1, which is inherently worse for this workload. Two: Nvidia cards can't run at 24-bit precision, and when running at the higher-than-standard 32-bit precision, performance becomes sub-par to say the least. Finally, the performance increase with the 51.75 drivers is substantial. In DX8 benchmarks no real change is, or should be, seen, but PS 2.0 performance is up 20-30%, and that obviously pushes DX9 performance forward significantly.
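    For reference, a quick sketch of what those precision figures mean as bit layouts; these are the commonly cited formats (IEEE-style FP16/FP32, ATI's internal FP24) and are background information rather than anything stated in this thread.

    /* Rough background sketch: the shader float formats under discussion. */
    #include <stdio.h>

    struct fp_format {
        const char *name;
        int sign_bits, exponent_bits, mantissa_bits;
        const char *used_by;
    };

    int main(void)
    {
        const struct fp_format formats[] = {
            { "FP16", 1, 5, 10, "NV3x partial precision" },
            { "FP24", 1, 7, 16, "ATI R3xx full precision (DX9 minimum)" },
            { "FP32", 1, 8, 23, "NV3x full precision" },
        };

        for (int i = 0; i < 3; i++) {
            const struct fp_format *f = &formats[i];
            printf("%s: %d sign + %d exponent + %d mantissa = %d bits (%s)\n",
                   f->name, f->sign_bits, f->exponent_bits, f->mantissa_bits,
                   f->sign_bits + f->exponent_bits + f->mantissa_bits, f->used_by);
        }
        return 0;
    }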
     
  3. luke929

    luke929 Member Guru

    Messages:
    178
    Likes Received:
    0
    GPU:
    Leadtek A400 Ultra TDH - Geforce 6800 Ultra
    Doom3's engine is based on OpenGL; it has nothing to do with DX9.
    They are two different APIs.
    It is all about shader performance.
    Keep in mind that DX9 isn't the only API with shaders; OpenGL has them too.
    HL2 performance is superior on ATI cards because the game is based on DX9.
    OpenGL, on the other hand, has always been Nvidia's stronger field compared with ATI, so I don't find it strange that the FX5900 outperforms the 9800 Pro in Doom3.
    However, the two cards use different modes in Doom3.
    The FX5900 is slower when using something called "ARB" (I don't remember exactly).
    Go search for articles about Doom3 and you should be able to find out for yourself.
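    To illustrate the "different modes" point, here is a minimal sketch of how an OpenGL renderer of that era could pick a fragment-program back end, in the spirit of Doom3's ARB2 vs. NV30 paths. This is my own illustration, not id Software's code; a real engine would also tokenize the extension string properly rather than using strstr().

    #include <stdio.h>
    #include <string.h>
    #include <GL/gl.h>

    /* Returns nonzero if the extension name appears in the driver's extension string.
     * Assumes a current GL context has already been created. */
    static int has_extension(const char *name)
    {
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);
        return ext != NULL && strstr(ext, name) != NULL;
    }

    int main(void)
    {
        if (has_extension("GL_NV_fragment_program")) {
            /* NV3x path: operations can be specified as 12-bit fixed or 16-bit half,
             * trading precision for speed. */
            printf("Using NV30-style back end (partial precision)\n");
        } else if (has_extension("GL_ARB_fragment_program")) {
            /* Standard path: the hardware's full precision (24-bit on R3xx,
             * 32-bit on NV3x, which is where NV3x slows down). */
            printf("Using ARB2-style back end (full precision)\n");
        } else {
            printf("Falling back to older vendor-specific or fixed-function paths\n");
        }
        return 0;
    }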
     
  4. Rudy

    Rudy Guest

    DX9 vs OGL ?

    I'm not sure if I'm a "black helicopter" conspiracy theorist, but consider this. OpenGL is not controlled by Microsoft, but DX9 is. Nvidia also lost the deal for the next Xbox, and ATI won it. Is Microsoft rewarding ATI with inside DX9 information to sabotage Nvidia's future prospects? Does ATI have an inside advantage from Microsoft revealing more of the internals of DX9 to them? Does this help Xbox 2? I don't know, but it sounds interesting.


    Rudy
     

  5. BladeRunner

    BladeRunner Guest

    Messages:
    1,938
    Likes Received:
    1
    GPU:
    Asus 1080Ti STRIX
    What is the point of these kinds of threads, with topics like 'Nvidia sucks in DirectX 9' :D or 'Nvidia is owned by ATi in HL2'? It has already been said a million times, and there are other posts that are already creating headaches for mods and normal users alike. Threads like these belong in another forum, not Guru3D. People sign up here just to make one little post, cause a bit of annoyance, look at the flaming replies, and then run away or get banned, like Violence Factor. I would definitely lock posts like this in the future.
     
  6. Nustian

    Nustian Guest

    Nvidia really sucks in DirectX 9

    Hello,
    Nvidia really isn't doing all that well in DX9. Here's the proof.

    This is what Gamers Depot has to say:
    "Even the folks at Gas Powered Games have chimed in regarding this issue. Here's a quote from James Loe head-tech-pimp-daddy over at GPG:

    James Loe: “Precision has a pretty big impact on the current nVidia chips, so it’s important to use half precision data types and the nVidia specific 2.0a HLSL compiler when running on NV3X hardware. Keeping those limitations in mind, it hasn’t been too difficult for us to maintain an acceptable level of performance.”

    We also emailed Gabe Newell from Valve Software about his experience with Pixel Shader Performance:

    Gamers Depot: We've started some initial DX9 Shader tests, and have seen, so far that NVIDIA NV3x hardware has a hard time.. Will it'll be similar w/ the HL2 Benchmark?

    Gabe Newell: "Yep."

    We emailed id Software guru, John Carmack about his experience with NV3x hardware on Pixel Shading performance, and this was his reply:

    Gamer Depot: John, we've found that NVIDIA hardware seems to come to a crawl whenever Pixel Shader's are involved, namely PS 2.0.. Have you witnessed any of this while testing under the Doom3 environment?

    John Carmack: "Yes. NV30 class hardware can run the ARB2 path that uses ARB_fragment_program, but it is very slow, which is why I have a separate NV30 back end that uses NV_fragment_program to specify most of the operations as 12 or 16 bit instead of 32 bit."

    Who are we to argue with NVIDIA’s wishes? They’ve been insisting that synthetic benchmarks such as 3DMark 2003 don’t reflect actual game play."

    Here is another (Also from Gamers Depot):
    "After spending the last couple of days with the team from Valve, and gathering comments from other top developers like id Software and Gas Powered Games, the fundamental performance increase on NV3x hardware only comes when the developer uses a lower bit-precision path 16 or 12bit vs. 24bit on ATI hardware."

    Mind you, I am not one of those people who just make a post and run away. I have proof of what I say.

    Here are the URLs if you want to see it for yourself:
    http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/001.htm
    http://www.gamersdepot.com/h...ti_vs_nvidia/dx9_desktop/hl2_followup/001.htm

    NUSTIAN.
     
    Last edited by a moderator: Sep 16, 2003
  7. BladeRunner

    BladeRunner Guest

    Messages:
    1,938
    Likes Received:
    1
    GPU:
    Asus 1080Ti STRIX
    Err, hello?! This has been discussed a million times already. Posts like these start flame wars; why do people never read the posts that warn them before they go ahead with this? GamersDepot, IMO, is not a professional review site - with words like "crap", "stupid" and "cheating" often appearing in their articles, I would never trust them.
     
  8. RejZoR

    RejZoR Ancient Guru

    Messages:
    4,211
    Likes Received:
    0
    GPU:
    Sapphire HD4870 Silent
    I don't trust anyone except myself. When I see HL2 stuttering on my screen, then I can confirm to myself that the GeForce FX sucks. Until then... (I still have an overclocked GF4, so...)

    And the DX8 graphics in HL2 are not much worse. In the tech trailer there were way too many reflections when using DX9. In DX8 only the necessary objects (glass, metal and maybe alien skin) have reflective properties, which looks more realistic to me.
     
  9. reever

    reever Master Guru

    Messages:
    239
    Likes Received:
    0
    GPU:
    9700 Pro softmod
    Except R&D money, which doesnt all have to go to Xbox2 development. And it would seem obvious that Microsoft IS giving ATI more crap since just about EVERY piece of documentation, guides, and tutorials on how to effectively use dx9 and it's shaders, is written by ATI/MS
     
  10. DirtyDirty

    DirtyDirty Guest

    You will see: Nvidia WILL run HL2 as well as ATI.

    It's a nice PR stunt executed by ATI + Valve (worth $9 million).
     

  11. DirtyDirty

    DirtyDirty Guest

  12. reever

    reever Master Guru

    Messages:
    239
    Likes Received:
    0
    GPU:
    9700 Pro softmod
    And the notion that Nvidia will get some magic extra performance and be equal to ATI cards in DX9 is nothing more than a nice PR stunt.
     
  13. eynmyn

    eynmyn Master Guru

    Messages:
    368
    Likes Received:
    0
    Half-Life 2 isn't even out yet

    The game isn't even out yet and you guys are already comparing the two cards. Well, for one thing, I didn't go out and buy an FX card, which I'm glad about, because my VisionTek Ti 4600 still blows away most of those cards.
     
  14. DeadFish

    DeadFish Member Guru

    Messages:
    195
    Likes Received:
    0
    GPU:
    2 x NVIDIA GeForce 6800ultra (SLI)
    Well - have you checked the AquaMark3 review?

    It seems the 51.75 drivers have improved performance - now the 9800 and 5900 are about the same.
     
  15. Darkest

    Darkest Guest

    Messages:
    10,097
    Likes Received:
    116
    GPU:
    3060ti Vision OC V2
    Odd, the reviews I saw showed that NVIDIA gained an increase in performance, but still didn't beat ATI. You've also got to wonder what else ATI can squeeze out of the 9800 with newer drivers. Nvidia needs to move on and get new hardware out ASAP; they've messed up with the current FX line, in my opinion.

    Though I could be wrong, and only time can tell. Let's hope that I am; I wouldn't want to see anyone spend a lot of money and get a bad deal.

    Also, as has been previously stated several times, -please stop the HL2 threads- we have a million of them on the forums already. If you want to talk about it, search back for the original thread rather than spamming the boards with new ones; it's childish.
     

  16. GrAC3R

    GrAC3R Member

    Messages:
    22
    Likes Received:
    0
    GPU:
    X800XT PE
    Dear god! GamersDepot are really catering to the FanATiC demographic; the number of anti-NV articles in the last fortnight is astounding.
    Enough already! I actually find it odd, because the ATi zealots have more interest in (an unhealthy fixation on) the goings-on at Nvidia than most 5900 users do... you paranoid politician diva!
     
  17. ugapug

    ugapug Member

    Messages:
    41
    Likes Received:
    0
    GPU:
    EVGA GTX 580 Classified
    "Nvidia sucks in DirectX 9". And Nustian sucks everywhere else. TROLL! Get back under your bridge.
     
  18. Fat Drew

    Fat Drew Guest

    Hey ugapug, how's that Albatron working out for you? I've got a 256MB one coming in the mail (hopefully).
     
  19. ugapug

    ugapug Member

    Messages:
    41
    Likes Received:
    0
    GPU:
    EVGA GTX 580 Classified
    Very well, thanks. I'm able to run stably at 475/960. I think if I put a 120mm YS Tech fan blowing over the core I can go even higher. It runs quite well (and it has a great bundle). Oh, and it looks cool too, and we all know that's what is really important.
     
  20. Dazz

    Dazz Maha Guru

    Messages:
    1,010
    Likes Received:
    131
    GPU:
    ASUS STRIX RTX 2080
    YOU GUYS JUST DON'T GET IT, DO YOU? The reason the ATi cards are so far ahead is that they have DOUBLE the pixel shader units (8 vs 4). NOTHING can be done to bring the NV cards up to speed short of heavy overclocking or flat-out cheating, and the latter is what they are currently doing. A GeForceFX would need to be clocked at at least 600MHz core in order to compete with a 380MHz Radeon in applications that also contain PS 2.0 code.
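    As a rough back-of-the-envelope check of that claim (my own numbers, assuming one PS 2.0 op per shader unit per clock, which ignores the NV3x's extra texture units, register pressure and driver shader replacement):

    /* Back-of-the-envelope peak pixel-shader throughput: units x core clock.
     * One op per unit per clock is a simplification, not a measured figure. */
    #include <stdio.h>

    static double peak_gops(int shader_units, double clock_mhz)
    {
        return shader_units * clock_mhz * 1e6 / 1e9;
    }

    int main(void)
    {
        printf("Radeon 9800 Pro (8 units @ 380 MHz)      : %.2f Gops/s\n", peak_gops(8, 380.0));
        printf("GeForce FX 5900 Ultra (4 units @ 450 MHz): %.2f Gops/s\n", peak_gops(4, 450.0));
        printf("Hypothetical FX at 600 MHz (4 units)     : %.2f Gops/s\n", peak_gops(4, 600.0));
        return 0;
    }

    Even at 600MHz a 4-unit part still trails the 8-unit part on this simple metric (2.40 vs 3.04 Gops/s), which fits the "at least 600MHz" estimate above.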
     
