Detonator 51.75 & AquaMark 3 results

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Hilbert Hagedoorn, Sep 11, 2003.

  1. Morb

    Morb Guest

duh... why didn't I see that?
     
  2. Animationmonkey

    Animationmonkey Member

    Messages:
    13
    Likes Received:
    0
It's amazing. I've read this topic like a million times over the last few days, and it's the same everywhere, and the same problem has been evident since the Futuremark fiasco.
The FX cards can't render Pixel Shader 2.0 as fast as ATI. They just can't. They sacrificed speed for 32-bit precision. Such is life; there may be more to it, like the 2x4 vs. 1x8 pipeline problem, but for the most part this may be the biggest problem they have. Meaning it's inoperable, it's incurable, it's written right here in the press release. :D
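For anyone wondering what the "2x4 vs. 1x8" talk means in numbers, here's a rough sketch. It assumes the commonly reported configurations (NV3x described as 4 pixel pipes with 2 TMUs each, R3xx as 8 pipes with 1 TMU each) and the stock 5900 Ultra / 9800 Pro core clocks; the figures are theoretical peaks, not measurements:

```python
# Theoretical fillrate sketch for the "4x2 vs 8x1" pipeline debate.
# Assumed configs: NV35 ~ 4 pipes x 2 TMUs @ 450 MHz, R350 ~ 8 pipes x 1 TMU @ 380 MHz.
def fillrates(pipes, tmus_per_pipe, clock_mhz):
    pixel_fill = pipes * clock_mhz                  # Mpixels/s, single-texturing
    texel_fill = pipes * tmus_per_pipe * clock_mhz  # Mtexels/s, multi-texturing
    return pixel_fill, texel_fill

nv35 = fillrates(pipes=4, tmus_per_pipe=2, clock_mhz=450)  # (1800, 3600)
r350 = fillrates(pipes=8, tmus_per_pipe=1, clock_mhz=380)  # (3040, 3040)
print(nv35, r350)
```

The point: under multi-texturing the NV35 layout looks fine on paper, but when each pixel needs one long shader rather than many texture lookups, the 8-pipe layout pushes more pixels per clock.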

I'd like to see what happens next year. Maybe shader code will get longer, possibly benefiting NVIDIA when ATI has to break the code into chunks. Who knows? Problem is, tomorrow is a hard thing to bet on, and you never know which horse is going to throw its rider. So you pays your money and you takes your chance.

I'm happy with what I have, and I'm going to leave it at that.
     
    Last edited: Sep 13, 2003
  3. bsoft

    bsoft Guest

    One thing about the HL2 benchmarks:

They don't make sense. I can understand that NVIDIA's architecture is far different from ATI's. I can also understand that ATI's shaders do much better than generic DX9 code. However, ATI's advantage in pixel shader execution speed almost never reaches 2x or more IPC compared to CineFX (this is an oversimplification, but it is true). Since NV35 has a higher fillrate and more memory bandwidth than the R350, you should almost never see more than a 2x per-clock advantage on the ATI side.

Yet, in the Half-Life 2 benchmarks Valve released, the Radeon 9800 Pro is performing ~2.6 times faster per clock than the GeForce FX 5900 Ultra. Even running unoptimized code, the GeForce FX shouldn't be doing that poorly, especially considering the results in AquaMark3 and related benchmarks. It would be like a 1 GHz Athlon beating a 2.6 GHz P4.
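That "per clock" comparison is easy to check on the back of an envelope. Assuming the stock core clocks (9800 Pro ~380 MHz, 5900 Ultra ~450 MHz) and purely illustrative frame rates, not Valve's actual numbers:

```python
# Back-of-the-envelope per-clock comparison (illustrative fps values only).
# Stock core clocks: Radeon 9800 Pro ~380 MHz, GeForce FX 5900 Ultra ~450 MHz.
def per_clock_ratio(fps_a, clock_a_mhz, fps_b, clock_b_mhz):
    """How much faster card A is than card B once clock speed is factored out."""
    return (fps_a / clock_a_mhz) / (fps_b / clock_b_mhz)

# Hypothetical Valve-style result: 9800 Pro at 60 fps vs 5900 Ultra at 27 fps.
ratio = per_clock_ratio(60, 380, 27, 450)
print(round(ratio, 2))  # prints 2.63
```

Since the 5900 Ultra is clocked higher, a raw fps gap of ~2.2x already works out to ~2.6x per clock, which is the figure being argued about.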

    There's something fishy here. Valve first releases a statement that FSAA won't work on the GFFX, then they take it back. Now they release a statement that GFFX performance is poor and that it's a fatal flaw in the card.

I'm sorry, but I don't see that fatal flaw. There is no fatal flaw. NV35 is like Itanium: it requires optimized code to reach full performance. No, NV35's shader performance probably isn't as good as the R350's. But it's certainly not 2.6 times slower per clock.

There's something very wrong here. NV35 does great with DOOM3, AquaMark, and most recent DX8.1 games. I have to ask myself: why can't Valve make a game that runs worth **** on NVIDIA hardware? Valve is making a huge mistake by making their game unplayable for 30% of the market. DOOM3 shows that it's clearly possible to code a next-generation, great-looking game that runs like a bat out of hell on the NV35. If id can do it, why not Valve?

    Something doesn't smell right here.
     
  4. Alessandro_BRZ

    Alessandro_BRZ Active Member

    Messages:
    59
    Likes Received:
    0
    GPU:
    GeCube R9600 PRO 128MB
    quote:
    --------------------------------------------------------------------------------
    Originally posted by Ultrag

    You people are fighting over THIS?! You people are fighting over 1 MEASLEY FRAME PER SECOND?!?!?!? 20 pages of bitching and moaning over 1 FRAME!!!!???
Oh wait, no, at about page 3 the ATI mofos start talking about Half-Life 2. So what are you complaining about now? 2 frames per second?
    Ooo look, Doom 3 comes in at page 12, and now the crown has gone to nVidia for nicer images and...3? 3 frames? WOW that's a HUGE difference!
    -------------------------------------------------------------------------------


That's enough... I thought this explanation from Ultrag was the best you guys could come up with on this thread... Thanks Ultrag :)
     

  5. TeuTonic

    TeuTonic Member

    Messages:
    41
    Likes Received:
    0
    GPU:
    Asus GTX680 DirectCUiiTOP
If there is something that doesn't smell right, it's NVIDIA with their FX cards, right from the start. People should have gotten a good whiff when NVIDIA was bitching about Futuremark's DX9 benchmark program.

Well, it's all coming together about NVIDIA's con job, isn't it?

Hell, I've never heard of a gaming company slag a video card company like Valve did. They must have had a good reason, IMHO.
     
  6. shaten

    shaten Guest

    RE: Animationmonkey

    >maybe shader code will get longer ..

Once DX10 or OpenGL 2.0 arrives, yes; but under DX9, no.

ATI meets exactly the number of instructions that DX9 allows. The 5900 may have a lifespan into the DX10 era because of its extra instructions, but like any graphics jump so far, it more than likely won't.

The funny thing is that ATI built straight to DX9 and OpenGL 2.0 (draft), and NVIDIA got fancy. This is the reverse of the GeForce 4 and the 8500.
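Shaten's instruction-count point can be made concrete. A minimal sketch, using approximate slot limits from the DX9 HLSL shader profiles (treat the exact numbers as illustrative, not authoritative):

```python
# Rough sketch of DX9 pixel-shader instruction limits (figures approximate,
# from the HLSL shader-profile docs; illustrative, not authoritative).
SLOT_LIMITS = {
    "ps_2_0": 96,   # baseline DX9: ~64 arithmetic + 32 texture slots (R3xx native)
    "ps_2_a": 512,  # extended profile targeting NV3x's longer programs
}

def fits(profile, instruction_count):
    """Would a shader of this length compile under the given profile?"""
    return instruction_count <= SLOT_LIMITS[profile]

# A 200-instruction shader overflows baseline ps_2_0 but fits the NV3x profile:
print(fits("ps_2_0", 200))  # False
print(fits("ps_2_a", 200))  # True
```

Which is the whole argument: within DX9 as games actually ship it, shaders stay inside ps_2_0's limits, so the FX's extra headroom doesn't get used.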
     
  7. zittwaredotcom

    zittwaredotcom Active Member

    Messages:
    91
    Likes Received:
    0
    GPU:
    PNY Geforce FX 5800 Ultra
    These posts / threads just make me laugh.

    Yeah, I bought a 5800 ultra ... yeah I spent $400.
    But it was only as a temp solution to get better frame rates with current games and to have a better platform than most when the D3 engine came out.

Now that D3 is on its usual "it's done when it's done" kick, I'm looking for a little diversion with HL2.

I think it's just stupid for people to whine, because *ANY* high-end card will suck when a new engine/game comes out. It's just asinine for someone to spend $400 expecting that card to support radically changing game engines.

D3 / HL2 are the next-generation engines, people... just like Q3 was... and all of id's titles beforehand.

    Expect video cards to be obsolete overnight.
    Expect hardware to be obsolete overnight.

This is the way things are... no hardware that started its design cycle over a year ago is going to be a real contender 6 months after these games are released.

This is a good thing, people; "killer" software drives "killer" improvements in GPUs and CPUs. Without these games we'd all be sitting in front of a lame-ass *nix terminal, typing chats in some MUD-type invention.

While NVIDIA was pushing the envelope prior to the recent Radeon engines, it's a good thing that NVIDIA has screwed up or fallen behind the Radeon machine. They got sloppy or cocky... whichever you want to call it. IMHO, they diversified too much... chipsets for AMD, the Xbox chipset, and other projects to expand their business.

They also spent WAY TOO MUCH time trying to "integrate" the 3dfx engineers into their team. This is obvious from the 5800 Ultra fan screwup... and the Cg compiler hype. I think they spent too many resources trying to make their GPU "programmable" instead of focusing on maintaining a leadership role in the graphics industry.

    I don't seriously believe that Nvidia is ignoring these mistakes. I honestly believe (call it strong hope) their designers are already off making a new GPU which will take care of the Radeon family once again.

I'm sure NVIDIA has a whole team of "architects" analysing the current HL2 and D3 engines to make their next DX9-ready GPU "stellar". They've obviously been successful in the past, and they'll surprise us again with their next major GPU.

I'm sure ATI is doing the exact same thing... but I also think ATI is trying to model their business after NVIDIA... this is evident in their recent "diversifying" into the chipset market and their "win" of the Xbox 2 design. I predict ATI has bitten off more than they can chew and they will make a major stumble like NVIDIA already has. However, only time will tell.

Both companies have a MAJOR redesign coming, in that AGP's days are numbered. Both companies are already working on PCI Express-based GPUs. Anyone who hopes that their existing computer will have extendability more than 6 months down the road is living a dream.

IMHO, NVIDIA will beat ATI to market with a higher-performing PCI Express GPU. The AGP revolution made NVIDIA what it is... I think they have the technical know-how to make PCI Express the next big thing. ATI will be playing catch-up for years. But again, only time will tell.

Too many things are "on the horizon":
Intel's "Prescott" / ?Pentium 5? core
AMD Athlon 64
PCI Express replaces AGP
DDR2 replaces DDR
SATA replaces ATA-100 (looking forward to 10,000 RPM as standard <grin>)

With these changes "coming", your computer/GPU is already obsolete.

    My guidelines are simple:

    Buy what *you* want today. Don't purchase on the hope that your system will be "upgradeable" in the future. Doing anything else is just pissin' in the wind.
     
  8. Fat Drew

    Fat Drew Guest

    AAAAHHHHHH!!!! I can't stand this argument anymore. Go grab some knives and bats and go out in the street and beat the poo out of each other.

    Xabre owns all.
     
  9. funkymonkey

    funkymonkey Ancient Guru

    Messages:
    5,512
    Likes Received:
    0
    GPU:
    GF 6600GT/ 6800GT went for RMA


    Hahahahahahaha!!!!!!!!!!!!!! :banana:
I feel like that myself... but you know what, I'd rather do it in forums than on the streets, wouldn't you? That's what forums are for... :invasion: :zap:
     
  10. BladeRunner

    BladeRunner Guest

    Messages:
    1,938
    Likes Received:
    1
    GPU:
    Asus 1080Ti STRIX
You've gone off topic now. This isn't funny; it may be funny for ATI and Valve, but it surely isn't funny for NVIDIA users, so please respect that. If you want to show off emoticons, you are welcome to do it in another forum, or better yet in the ATI forum.
     

  11. RejZoR

    RejZoR Ancient Guru

    Messages:
    4,211
    Likes Received:
    0
    GPU:
    Sapphire HD4870 Silent
Yeah, but it's stupid to rush to the stores now for an R9800 Pro if you already have an FX5900 Ultra. That would be perfectly idiotic.
     
  12. mighty_fc

    mighty_fc Member Guru

    Messages:
    101
    Likes Received:
    0
    GPU:
    XFX 7900GT 470M 256mb
Couldn't have said it better myself :D I'm waiting eagerly for the new drivers... but if they don't do their job... it's hello Radeon, 'cause I figure I'll survive a while on that if I get an Athlon 64 =)
     
  13. amakusashiro

    amakusashiro Banned

    Messages:
    354
    Likes Received:
    0
    GPU:
    two HD5770 in crossfire
I picked up and tried out a Radeon 9600 Pro this last week from Best Buy. 3DMark 2001: around 9200 points; 3DMark03: around 3100. I was very impressed. Then I tried out 3DMark 2000 (DirectX 7) just to see how it ran, using Catalyst 3.7; when I knocked the test up to 32-bit color, it crashed my rig. Catalyst 3.4 in the same test at the same high settings caused it to exit halfway into the test. And Battle Engine Aquila had really bad texture problems no matter what drivers I used. I took the card back for a full refund. I am now using a 5600 Ultra; I'm going to test it heavily and see how it does for me.
     
  14. TheDigitalJedi

    TheDigitalJedi Ancient Guru

    Messages:
    3,991
    Likes Received:
    3,221
    GPU:
    2X ASUS TUF 4090 OC
I presented a challenge 2 days ago, and from what I've seen, the fanboys refuse to accept. It's very clear now that people want to talk a lot of garbage, but when it's time to put up or shut up, they choose to shut up.


    Wise choice!

    /me withdraws his Lightsaber.
     
  15. mighty_fc

    mighty_fc Member Guru

    Messages:
    101
    Likes Received:
    0
    GPU:
    XFX 7900GT 470M 256mb

Ehh, something seems wrong there... I get 3368 in 3DMark03 with my FX 5600U.
     

  16. Monrad

    Monrad Guest

    Messages:
    1,008
    Likes Received:
    0
    GPU:
    Sapphire Radeon 9800 Pro =)))
Hey TheDigitalJedi, I accept! I can beat your 5900U with my 5600 :)
Do you accept? Are you frightened?
OK, just joking; I'd love to do some benchmarking, but I don't have the hardware to compete with you.
Oh... and when HL2 comes out, I'll play it at FULL HIGH QUALITY (without AF or AA).
I'm planning to do some overclocking with 2x256 OCZ PC3700 Gold memory... and maybe I'll reach almost 3.0 GHz... that's a powerful machine to play HL2, despite the GPU.
And for you, ZITTWAREDOTCOM: congratulations on your post, I really liked it :)
Users of the FX5900U, 5800U, 5600U, etc.: don't waste your time and money; your GPUs are more than OK, they are great, and you shouldn't bother because of one single game. You'll be able to play great games like Doom3, STALKER, etc. without problem.
And for the guy who said Doom3 was DX8... I don't know if it was in this thread... but there are almost 3 or 4 threads about HL2 and the FX; you are a F****** NOOB, Doom3 is not DX8!!!!!!!!!!!!!!!!!!!!!!!!!!!!

    Good Luck and Have Fun.

PS: If you are going to sell your FX5900U, please send it to me; I'll pay shipping.
     
  17. Animationmonkey

    Animationmonkey Member

    Messages:
    13
    Likes Received:
    0
It's a shame; I'm not waiting for Doom 3, never have :) Oh well. But I'm not going to continue to beat around the bush till I get hold of a copy of Half-Life 2 myself, or read a review from a reputable games mag / hardware mag that categorically states that the performance of the cards is poop under Half-Life 2; then I'll believe it. However, 8 tests in 1 hr 30 mins means I can't believe that the setups were verifiable. I'm in agreement that the GeForce FX GPU does seem underpowered somehow. But just like the NVIDIA demo of Doom 3, where they didn't allow ATI to update the driver, we shall see.
     
  18. amakusashiro

    amakusashiro Banned

    Messages:
    354
    Likes Received:
    0
    GPU:
    two HD5770 in crossfire
    "ehh something seems wrong there.. i get 3368 in 3dmark03
    with my fx 5600u" by mighty fc


To answer that, FC: I'm only using an Athlon XP 1700+, while you are using an Athlon 2600+. Nothing in my PC specs has changed except the card, which I refuse to OC at the moment. I run a nice set of tests to see how well my rig runs overall, in both Windows 2000 and Windows ME, and while I have a copy of XP Pro and sometimes test in it, I rarely do much else with that version of Windows, because I am just a tad more in love with Win 2k. Really, all of you: if worse comes to worst, buy the card that runs the games YOU want to play. I know it sucks to have to choose between Doom 3 and HL2, but this has happened before, back in the Voodoo days; you all should be used to it by now. Graphics cards come and go every year, just as the games do. I myself just was not happy with the Radeon 9600 Pro. Even with higher FPS in tests, it just was not as well-rounded a package. If it can't run Battle Engine Aquila for me, then I have no need for it in my PC, especially considering even a Radeon 8500 runs that game fine texture-wise. Granted, it's a driver problem for sure, but when it comes to ATI, driver problems have always been the case; it's never been the great hardware they make, it's just been their driver support.
     
  19. bingo13

    bingo13 Guest

  20. mighty_fc

    mighty_fc Member Guru

    Messages:
    101
    Likes Received:
    0
    GPU:
    XFX 7900GT 470M 256mb

Sorry mate, I thought you had a 2700+; my eyes are screwed.
     
