ATI vs nVidia...it's all about the shaders..

Discussion in 'General Hardware' started by dnottis, Oct 31, 2003.

  1. dnottis

    dnottis Master Guru

    Messages:
    695
    Likes Received:
    0
    GPU:
    EVGA SSC 970 GTX 4GB
  2. reborn

    reborn Master Guru

    Messages:
    430
    Likes Received:
    0
    GPU:
    GTX460 1GB @Stock
    hope that's true.........
    DX 9.1 will crank up my FX performance
     
  3. mjmaskrey

    mjmaskrey Master Guru

    Messages:
    300
    Likes Received:
    0
    GPU:
    Creative Labs FX 5900 Ultra/BFG 6800 Ultra OC
    Apparently, DX 9.1 plus an FX card will be the 'golden combination'....

    Let's hope it's true

    :) :)
     
  4. Drumphil

    Drumphil Guest

    Messages:
    774
    Likes Received:
    0
    GPU:
    Bitchin'fast!3D 2000
    My first post basically said he assumed way too much, then guessed a lot on top of that. Now, though, I have done some further research: it's all bull****. It reads straight out of the NVIDIA apologetics textbook.
     
    Last edited: Oct 31, 2003

  5. dnottis

    dnottis Master Guru

    Messages:
    695
    Likes Received:
    0
    GPU:
    EVGA SSC 970 GTX 4GB
    Well, here's the thing: FP24 precision won't be the "standard" for very long. nVidia has already developed hardware that can handle FP16 and FP32; we got screwed because, for some reason, MS didn't feel FP32 precision was necessary right now (how stupid). ATI has not. So six months from now, when the next generation of cards comes along, FX users will already own hardware that can handle the specs set forth in DX 9.1. Also, since nVidia has already developed for this technology, its next-generation cards will build on the FX technology, while ATI will have to move to PS/VS 3.0, which will most definitely support FP16 and FP32, NOT FP24, rendering the whole ATI R3xx series obsolete.

    I guess the real problem is that nVidia is ahead of its time, and unfortunately this is biting them in the ass in the short run.
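
    To put the three precision tiers in numbers: FP16 carries 10 mantissa bits, FP24 carries 16 (going by the R3xx's s16e7 layout), and FP32 carries 23. Here's a rough Python sketch of the smallest step each format can resolve just above 1.0; numpy's float16/float32 stand in for the GPU formats, and since FP24 has no CPU equivalent, its step is computed straight from the bit count:

    Code:
        import numpy as np

        # Smallest representable step above 1.0 for each shader precision tier.
        # numpy float16/float32 match the GPU FP16/FP32 layouts; FP24 has no
        # CPU equivalent, so its epsilon comes straight from its 16 mantissa bits.
        print("FP16 eps:", np.finfo(np.float16).eps)  # 2**-10, ~9.8e-4
        print("FP24 eps:", 2.0 ** -16)                # 2**-16, ~1.5e-5
        print("FP32 eps:", np.finfo(np.float32).eps)  # 2**-23, ~1.2e-7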
     
  6. reever

    reever Master Guru

    Messages:
    239
    Likes Received:
    0
    GPU:
    9700 Pro softmod
    What makes you think MS will comply with specifications set only by NV and undermine the other half of the market? Seriously, think about that.

    As long as the cards out there support it, it will be the minimum of any standard that gets set.

    Do you even know what is in DX 9.1? Has Microsoft told ANYBODY exactly what they will be putting in with regard to PS/VS 3.0 and shader precision? If DX 9.1 sets the standard at 32-bit precision, FX cards will have REALLY low performance, even more so than they do now. The whole point of the optimizations people think are going into DX 9.1 for nVidia (if they are going in at all) is not to make every shader run at FP32 with no performance difference.



    The author of the article admits he was wrong on many points. An actually intelligent conversation about the technical aspects, and about the misinformation spread in the article, can be found here:

    http://www.beyond3d.com/forum/viewtopic.php?t=8757
     
  7. Drumphil

    Drumphil Guest

    Messages:
    774
    Likes Received:
    0
    GPU:
    Bitchin'fast!3D 2000
    double post.. delete me!
     
  8. Drumphil

    Drumphil Guest

    Messages:
    774
    Likes Received:
    0
    GPU:
    Bitchin'fast!3D 2000
    Last edited: Nov 4, 2003
  9. dnottis

    dnottis Master Guru

    Messages:
    695
    Likes Received:
    0
    GPU:
    EVGA SSC 970 GTX 4GB
    "What makes you think MS only will comply to specifications set only by NV, and undermine the other half of the market? Seriously, think about that"

    MS isn't complying with standards set by nVidia... do you really think they will stick with FP24 precision forever? For MS to choose FP24 was just stupid. Let's try to set the bar a little higher next time so that consumers don't have to keep upgrading their video cards... a little foresight would be nice.


    "If DX9.1 sets the standard for 32 bit precision, FX cards will have REALLY low performance, even moreso than they do now"

    How do you figure? ATI's current hardware wouldn't even be able to run it, so why would the NV cards be the ones with REALLY low performance? That's speculation on your part.


    I read the Beyond3D posts... it's just more speculation by people, so what? My personal feeling on the whole thing, regardless of how nVidia's shaders were developed (they may have screwed up too, i.e. pipeline decisions), is that using FP24 for DX9 was stupid. Obviously MS knew that eventually 32-bit would be the standard, so why not move toward it back then and let the hardware developers worry about optimizing for it?
     
  10. Drumphil

    Drumphil Guest

    Messages:
    774
    Likes Received:
    0
    GPU:
    Bitchin'fast!3D 2000
    dnottis, you don't know what you are talking about. When I am sober again, I'll type a proper reply.

    edit: we already know how slow the FX cards are at FP32, because that's what they run whenever you run DirectX in full-precision mode, and they are too slow to use.

    FURTHER EDIT: PLEASE don't start on the n00b 3D math that says "32 is better, m'kay, because it somehow makes better sense in binary math"... WHICH IS A LOAD OF BULL. Damn, where is Humus when you need him? Can we get some programmers in this thread, please?

    OK, and unlike the guy who wrote the article in question, many of the people at Beyond3D have enough knowledge that they don't have to speculate about a lot of these things.

    ANYWAY, EVERYONE READ THE NVNEWS AND BEYOND3D threads I linked to a couple of posts up. Lol, next thing, people here are going to be accusing the programmers at Beyond3D of being biased too.
     
    Last edited: Oct 31, 2003

  11. ginfest{USA}

    ginfest{USA} Member

    Messages:
    19
    Likes Received:
    0
    GPU:
    EVGA 7800GT CO

    Perhaps you should read the forums at B3D more often (and sober).
    Most of the people who post there have an agenda which doesn't favor NV (not the site, the forum posters).
    In fact, you'll see some of the same names at different web forums, all singing the same tune.
    And as far as the author of the article goes: with the bullying that goes on whenever anyone writes anything that in any way seems to favor NV, he is immediately attacked.
    Case in point: Kyle from HardOCP was lambasted across the web just a few months ago when the trolls thought he was in NV's pocket; now that he seems to favor ATI, these same people are suddenly praising him!
    A bunch of BS which has gone on for quite a while across the web.

    Mike G
     
  12. Drumphil

    Drumphil Guest

    Messages:
    774
    Likes Received:
    0
    GPU:
    Bitchin'fast!3D 2000
    So, ginfest, what is your technical viewpoint on this issue then?

    QUOTE GINFEST:
    "Perhaps you should read the forums at B3D more often (and sober).
    Most of the people who post there have an agenda which doesn't favor NV (not the site, the forum posters).
    In fact, you'll see some of the same names at different web forums, all singing the same tune.
    And as far as the author of the article goes: with the bullying that goes on whenever anyone writes anything that in any way seems to favor NV, he is immediately attacked.
    Case in point: Kyle from HardOCP was lambasted across the web just a few months ago when the trolls thought he was in NV's pocket; now that he seems to favor ATI, these same people are suddenly praising him!
    A bunch of BS which has gone on for quite a while across the web."


    ------------
    Perhaps you should read B3D more too, and maybe you will find out that when a bunch of knowledgeable people all sing the same tune, there is a fair chance that it's the right tune. Now, if it isn't, feel free to explain why...

    Now, what does the way people turned around on the whole Kyle thing have to do with this? Are you suggesting that people shouldn't say "good editorial, Kyle" when they see a good editorial, just because they didn't like what he had to say before? Did it enter your mind at all that the people you are referring to might actually be right? OMG, what a shock, eh? LOL, Kyle writes something good, and then the people you refer to are somehow bad because they say it's a good article. ANYWAY, what does this have to do with the technical points they picked up on in the article in question? Have a look at B3D; the author of the article is posting over there now...

    ARGH... why am I debating with n00bs again? I'm gonna shut up now. If you read those threads and still don't understand WHY the article is wrong, either you will never understand the 3D tech behind this stuff, or you're just a fanboy who doesn't want to know.
     
    Last edited: Oct 31, 2003
  13. RejZoR

    RejZoR Ancient Guru

    Messages:
    4,211
    Likes Received:
    0
    GPU:
    Sapphire HD4870 Silent
    Yay, I got FP16/24/32; yay, I can't use them anywhere!? Stupid, if you ask me. The same thing is happening as happened with DX8 technology: after two years I'm still waiting for it... HL2 and Halo aside, nothing revolutionary will happen until maybe the NV40 and R400 series...
     
  14. Drumphil

    Drumphil Guest

    Messages:
    774
    Likes Received:
    0
    GPU:
    Bitchin'fast!3D 2000
    Er, there are a couple of games using PS 2.0 already. How long do you think it's gonna take for more to come out, given we already have some now and some nearly done? Anyway, I'll be playing HL2 the day it comes out (and unless that takes more than six months, I'll be getting it before I get a new GFX card, so the performance of what I have now matters a lot to me; feel free to upgrade when it comes out if you don't want a card that does PS 2.0 well now).
     
    Last edited: Oct 31, 2003
  15. Princess_Frosty

    Princess_Frosty Master Guru

    Messages:
    624
    Likes Received:
    0
    GPU:
    MSI GTX580 Twin Frozr II
    As Gabe Newell said, Nvidia can cope at the moment by rendering at only 16-bit precision where possible, without any IQ loss, and rendering everything else at 32-bit.

    As shaders and effects get more complex, more of them will need to run at 32-bit, which is bad for Nvidia because doing everything at 32-bit is slower. But the thing is, ATI can't run at that precision at all, so anyone planning on keeping their card for any reasonable length of time is going to find their ATI card becoming rather redundant.
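
    A toy illustration of why longer shaders push toward higher precision (just a Python sketch standing in for the GPU, not real shader code): accumulate thousands of tiny contributions, as a long multi-pass effect might, and the 16-bit sum stalls once its step size exceeds the contribution, while 32-bit stays on target.

    Code:
        import numpy as np

        # Toy model of a long shader: add 10,000 tiny contributions.
        # Once the fp16 accumulator's step size (~0.00024 near 0.25)
        # exceeds the contribution, additions round away to nothing and
        # the sum stalls; fp32 keeps resolving them.
        step = 0.0001
        acc16, acc32 = np.float16(0.0), np.float32(0.0)
        for _ in range(10000):
            acc16 = np.float16(acc16 + np.float16(step))
            acc32 = acc32 + np.float32(step)
        print("fp16 total:", acc16)  # stalls around 0.25
        print("fp32 total:", acc32)  # ~1.0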

    I seriously don't think we'll see the effects of this for a while, maybe DX10, and they might not even be that bad. The real problem lies with R&D for ATI. They need to step up a level: make the transition from 24-bit to 32-bit precision, move their core to the 0.13-micron process, and do all of this with 8 pipelines. It's going to take a lot of development, which at the very least means they're going to need a lot of money or a lot of time. Knowing their problems handling drivers and suchlike, it wouldn't surprise me if they have a hard time getting through these heavy changes.

    Nvidia have given their cards a far better head start; they attacked the problem early on, which in the short run was costly because it meant late releases and some problems with the FX cards.

    If I had to bet on it, I'd say Nvidia will make a comeback when the DX specs move to a 32-bit minimum precision, and ATI will have a hard time keeping up.

    If you think the Nvidia hardware is no good, or their silicon is broken, and all this crap, then wait until S.T.A.L.K.E.R. comes out, which has been developed closely alongside Nvidia. If you've not seen screenshots, then go searching; trust me, it looks totally awesome. The engine will make full use of Nvidia's hardware, and probably Cg, which is going to give it a lot of speed improvements over the Radeon cards. To be totally honest, it looks better than both HL2 and Doom3, and it has HUGE poly counts from what I've seen.
     

  16. Drumphil

    Drumphil Guest

    Messages:
    774
    Likes Received:
    0
    GPU:
    Bitchin'fast!3D 2000
    QUOTE PRINCESS FROSTY: "If you think the Nvidia hardware is no good, or their silicon is broken, and all this crap, then wait until S.T.A.L.K.E.R. comes out, which has been developed closely alongside Nvidia. If you've not seen screenshots, then go searching; trust me, it looks totally awesome. The engine will make full use of Nvidia's hardware, and probably Cg, which is going to give it a lot of speed improvements over the Radeon cards. To be totally honest, it looks better than both HL2 and Doom3, and it has HUGE poly counts from what I've seen."

    That's a non-argument. Cg??? That's like saying 3dfx cards are better because they run so well under Glide. Yeah, a big advantage, I'm sure...


    Yeah, and what if they had done a deal with ATI and optimised for their cards, eh? Would that mean the NVIDIA hardware was no longer good (if it's supposed to be good because STALKER looks great on it)? How do you know it wouldn't be even faster if it was optimised for ATI hardware? And how do you know it wouldn't be slower on NVIDIA hardware if they made it MORE graphics-card intensive? In fact, how do you know anything about STALKER yet? What do we have so far? A couple of screenshots and a bunch of NVIDIA PR. I hate PR (but what would I know... after all, I LOOOVE ATI so much; of course, anyone who thinks that has never followed my posts at Rage3D).

    Damn, I will LMAO if either Doom III or STALKER runs faster on ATI cards after all this.

    EDIT: To quote someone from the NVNews forums:

    "I'd seriously go read that B3D thread before you push the article too hard, the author of the article you posted up is apologizing for his mistakes in it."
     
    Last edited: Oct 31, 2003
  17. funkymonkey

    funkymonkey Ancient Guru

    Messages:
    5,512
    Likes Received:
    0
    GPU:
    GF 6600GT/ 6800GT went for RMA
    Actually, I don't care anymore.
    People say NV loses IQ when it uses mixed mode,
    but playing games, it doesn't matter to me.
    My 5900 Ultra makes up for any loss in IQ with superior AF quality, which looks much crisper and better than the 9800 Pro's.
    There is no point in bashing NVIDIA or ATI.
    Play the games; forget about the PS 2.0 stuff. All games will support mixed mode. While playing a game, it doesn't matter.
    It's the same as what many ATI users said when someone pointed out the flat textures on the 9800: they said they don't stop to look at wall textures during a game. Same here; there is no time to hunt for the difference between mixed mode and pure FP24. Both look so similar that during gameplay the difference is not noticeable, and actually I sometimes prefer my 5900 for games that look much better with good AF, like MotoGP2, UT2K3, NFS:HP2, Halo, etc.
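
    Rough numbers on why one pass of mixed mode is invisible (a Python toy, assuming an ordinary 8-bit-per-channel monitor): a single FP16 round-off on a colour value is an order of magnitude below the smallest brightness step the display can even show.

    Code:
        import numpy as np

        # One fp16 round-off on a colour value vs. the smallest brightness
        # step an 8-bit-per-channel display can show. A single quantisation
        # sits well below what the monitor resolves; it only shows up when
        # errors pile up across long shaders.
        c = 0.73519                            # arbitrary colour intensity
        err = abs(float(np.float16(c)) - c)    # fp16 quantisation error
        print("fp16 round-off:", err)          # ~2e-4 at worst near 1.0
        print("8-bit display step:", 1 / 255)  # ~3.9e-3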
    There is no point in theoretical capabilities; it's what we see on that damn monitor that matters to me. And frankly, I can't choose a winner between my 5900 Ultra and 9800 Pro.
    Where the 5900 lacks, it makes up with superior AF, and where ATI lacks in AF, it makes up with AA.
    With the 52.16 drivers, the colour and lighting problems on the 5900 are totally gone.
    So I am happy with both cards.



    About S.T.A.L.K.E.R.: I don't think it's gonna use Cg. And even if it does, ATI can run Cg if ATI makes a good realtime compiler for it; Cg is platform-independent, like OpenGL. NVIDIA has taken great care that Cg will run satisfactorily even on Radeons.
    I agree with one thing, though: S.T.A.L.K.E.R. has the best graphics I have ever seen in any game. Even the screenshots look so much better and more ahead of their time than any other game's; I wonder how the real game will feel...


    Play games god damn it....
    Lol :p
     
    Last edited: Oct 31, 2003
  18. RejZoR

    RejZoR Ancient Guru

    Messages:
    4,211
    Likes Received:
    0
    GPU:
    Sapphire HD4870 Silent
    And the main, big, huge problem is the developers themselves.
    The best example is the PlayStation 2 (PS2). Just check out Gran Turismo 3!!!
    And now check the hardware specs and the age of that hardware!??!?! The game looks awesome (compared even to all those powerful and fancy shaders we have on PC), but the PS2 has never heard of shaders. Got the point?
    We should stick with DX9 for like 3 or 5 years so we can get the most out of it, not just a few stupid tech demos and one game and then, hey look, DX10 is about to be released. Stupid if you ask me. Two years of waiting just to get one slightly better DX8!!!! title (Doom3)? If this happens with DX9 again, I'll switch to the PS2, damn it! (Or the PS3, when it comes out.)
     
  19. SaberJ2X

    SaberJ2X Ancient Guru

    Messages:
    2,546
    Likes Received:
    52
    GPU:
    Zotac 3070
    Cg is universal... it's not a proprietary technology... it's based on HLSL from Microsoft; it just comes with some preset NV3x tweaks/optimizations.

    (GT3 is the bomb... that game ownz any racing game on PC)
     
  20. Drumphil

    Drumphil Guest

    Messages:
    774
    Likes Received:
    0
    GPU:
    Bitchin'fast!3D 2000
    "Play the games; forget about the PS 2.0 stuff. All games will support mixed mode. While playing a game, it doesn't matter."

    This is where I disagree, funkymonkey, and it's also why I think PS 2.0 performance matters so much... I really don't think everyone is going to code a mixed mode for the FX cards. Anyway, I certainly wouldn't take a punt on it when buying a $500 GFX card.
     
