GTX 295 Holds On To The Crown

Discussion in 'Videocards - NVIDIA GeForce' started by LedHed, Sep 23, 2009.

Thread Status:
Not open for further replies.
  1. WaroDaBeast

    WaroDaBeast Ancient Guru

    Messages:
    1,963
    Likes Received:
    0
    GPU:
    Gigabyte HD7950
    I couldn't care less as to which card produces the highest FPS, especially when the newest card on the block uses beta drivers. Besides, I live in a pretty hot place and I care about saving energy, so heat and energy consumption also matter to me.

    That said, if nVidia manages to release a DX11 card along with some proper drivers for it, ATi will have to face tough competition, especially now that we're seeing the "true awesomeness" of PhysX, so to speak.
     
  2. mitzi76

    mitzi76 Guest

    Messages:
    8,738
    Likes Received:
    33
    GPU:
    MSI 970 (Gaming)
    I think the reviews have shown it to be near a 295 in some games. At 1920x1200 I don't think you need 5870s in CrossFire. It's not about beating the 295; it's about getting excellent FPS without dual-GPU headaches.

    Actually, a single 285 isn't enough today for some games, especially the very graphically demanding ones. It also depends on your personal tastes: how much eye candy you turn on, the type of game you play, and whether FPS dips and such annoy you.
     
  3. mitzi76

    mitzi76 Guest

    Messages:
    8,738
    Likes Received:
    33
    GPU:
    MSI 970 (Gaming)
    Actually, PhysX is somewhat of a fail at the moment. The one game it has been released for in a big way is Batman, and it lowers your FPS quite a lot.

    I for one find playing with it off and more AA and AF better than with it on. Interestingly, the 5870 lets you add lots of MSAA and SSAA, although I know nHancer lets you change multisampling and supersampling too, so perhaps that's not much different.

    Perhaps the Havok engine along with DX11 will come into play more now. By Christmas we will know the state of play regarding Nvidia and the 300 series, and what Havok/PhysX/DX11 have to offer. Perhaps PhysX will be implemented better, but it seems a bit of a con that people need one card for graphics and another for PhysX.

    (The Witcher 2 looks like it will use the Havok engine; interesting.)
     
    Last edited: Sep 24, 2009
  4. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    It's not a fail, as it generated a lot of buzz, made the 360/PS3 crowd jealous, and probably helped sell the game to people who couldn't even run it.

    PhysX is just like any high setting in a game. I can get 60 FPS in Crysis if I lower the settings; I just choose to crank things up to custom Very High settings. It's the same with Batman: AA: even though my lows are about 30 FPS with PhysX on high, I get an average of 51 FPS at 1920x1080 with 4xAA.

    Two 285s in SLI would be a decently priced setup that is a good bit faster than the 295 or the 5870.

    I for one hope something gets sorted with GPU physics soon. Havok or PhysX, I have no preference, as CPU-based physics are either so underwhelming you don't know they are there, or fake and over the top like in Wolfenstein.

    The 5870 definitely has the power for it; if ATI decided to let PhysX be used on their cards, it would hold a rock-solid 60 FPS for sure.
     

  5. WhiteLightning

    WhiteLightning Don Illuminati Staff Member

    Messages:
    30,785
    Likes Received:
    3,959
    GPU:
    Inno3d RTX4070
    It's not ATI's decision to make, is it? :)
     
  6. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    Just saw something funny in an article over at Tom's Hardware, where Nvidia claims that special-purpose software relying on GPGPU will propel GPU sales, not PC gaming.
    The irony is that at NVISION 08, several Nvidia employees called Intel's SIGGRAPH paper about Larrabee "marketing puff" and told the press that the Larrabee architecture was "like a GPU from 2006".

    So either they want to tell us "Don't buy a DX11 card before we have one ready", or they have jumped on the Intel idea, since there are benchmarks with Larrabee performing as well as a GTX 285.

    I believe the first, though, based on info from SemiAccurate showing that Nvidia GT300 yields are under 2%.

    Because in the end we buy new cards so we can play prettier games.
     
  7. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    Nvidia offered to port CUDA to ATI cards and they declined, probably because they have a deal with Havok.

    The problem is, I remember an interview with AMD's tech director late last year. He said PhysX wasn't going to take off, that it was not used by any big game developers, and that GPU Havok physics on Radeon cards was possibly coming end of 2008 or early 2009.

    With three months left of 2009, a few big PhysX games out so far, and some more from Sega and Capcom appearing soon, methinks they got it wrong.

    Just found the article, and he states they want to do GPU physics more for gameplay rather than eye candy. Well, people like eye candy, and using it for gameplay isn't going to work in this multi-format gaming world.
     
  8. Labyrinth

    Labyrinth Ancient Guru

    Messages:
    4,413
    Likes Received:
    92
    GPU:
    Tri-X R9 290 4G
    I doubt Nvidia offered to do it for free anyway, plus Havok has a stronger line-up at the moment.
     
  9. mitzi76

    mitzi76 Guest

    Messages:
    8,738
    Likes Received:
    33
    GPU:
    MSI 970 (Gaming)
    Generating buzz and making console owners jealous does not make PhysX a win, in my opinion.

    I have tried Batman a lot with PhysX on and off, and I can 100 percent say I prefer it off. If it didn't cause lag I would be impressed, as the effects are very nice. However, it's something that promises a lot but fails to deliver. That's not to say it won't get better.
     
  10. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    Of course they didn't offer it for free. Would you take something you paid money for and then work on offering it to someone else for free? But I did hear the price was very reasonable and would only add a couple of dollars to each GPU.

    Generating buzz and making console owners jealous is a win. Gaming is a billion-dollar business, and good publicity is good for Nvidia, the game devs, and even PC gaming as a whole when a game is critically acclaimed everywhere and the PC gets the best version.
    It's not lag, it's low FPS, and mine doesn't go lower than 30 FPS with the newest PhysX software and the 191.00 drivers, which is the highest the 360 and PS3 versions go.
     

  11. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    Not really, since Microsoft built tessellation and compute shaders into DirectX 11.
    That means graphics card designers won't have to worry about PhysX, since the compute shader can be used by game developers to bring GPU-based physics, ray tracing, better AI, and more.
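    Purely as an illustration (not something posted in the thread), here is a minimal sketch of the kind of data-parallel kernel that "GPU-based physics" boils down to. It is written in CUDA, since that is what PhysX runs on; a DX11 compute shader or OpenCL version would have the same structure. All names and numbers in it are hypothetical, and it only does naive per-particle integration with a crude ground bounce.

        // particle_step.cu -- hypothetical minimal GPU physics sketch, one thread per particle.
        // Advances each particle by one frame with semi-implicit Euler under gravity.
        #include <cstdio>
        #include <cuda_runtime.h>

        struct Particle { float3 pos; float3 vel; };

        __global__ void step_particles(Particle* p, int n, float dt)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i >= n) return;

            const float gravity = -9.81f;        // m/s^2, straight down on the y axis

            // integrate velocity, then position
            p[i].vel.y += gravity * dt;
            p[i].pos.x += p[i].vel.x * dt;
            p[i].pos.y += p[i].vel.y * dt;
            p[i].pos.z += p[i].vel.z * dt;

            // crude ground-plane collision: bounce off y = 0 with damping
            if (p[i].pos.y < 0.0f) {
                p[i].pos.y = 0.0f;
                p[i].vel.y *= -0.5f;
            }
        }

        int main()
        {
            const int n = 1 << 16;                       // 65,536 particles
            Particle* d_p = nullptr;
            cudaMalloc(&d_p, n * sizeof(Particle));
            cudaMemset(d_p, 0, n * sizeof(Particle));    // start everything at rest at the origin

            const float dt = 1.0f / 60.0f;               // one 60 Hz frame
            step_particles<<<(n + 255) / 256, 256>>>(d_p, n, dt);
            cudaDeviceSynchronize();

            printf("stepped %d particles\n", n);
            cudaFree(d_p);
            return 0;
        }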
     
  12. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    But DX11 games aren't out yet, and none of the ones that have been announced mention GPU physics in any form, and they won't for a while: for DX11 to take off it can't alienate too many DX10 customers, just as DX10 couldn't alienate DX9 ones.

    And PC gaming still isn't at the point where more money and time goes into its version.
     
  13. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    DX10 was a bit of a flop; many game developers even skipped it and went straight from DX9 to DX11.
    So DX11 won't alienate the game developers.
    And then you will see gamers move from Windows XP to Windows 7 because of it too, since there are still many of them left.
    The point is that DX11 brings so many more features that benefit all ends of the market, unlike DX10.
    It is possible to use DX11 to create physics, so if AMD, Intel and other graphics chip developers can make it perform on par with or even better than PhysX, do you think game developers will say no to using it?
    I think they will skip Nvidia PhysX completely at that point, since it is a lot easier to code against one standard (only on Microsoft's OS, though) than to cover separate vendor-specific requirements.
     
    Last edited: Sep 24, 2009
  14. jmpnop

    jmpnop Maha Guru

    Messages:
    1,410
    Likes Received:
    0
    GPU:
    GTX 260 Core 216
    Yes, PhysX has no future; everything is moving to OpenCL and Havok.
     
  15. GC_PaNzerFIN

    GC_PaNzerFIN Maha Guru

    Messages:
    1,045
    Likes Received:
    0
    GPU:
    EVGA GTX 580 + AC Xtreme
    Actually, it is. NVIDIA offered to lend a hand to get PhysX fully running on AMD hardware, but they refused.
     

  16. mitzi76

    mitzi76 Guest

    Messages:
    8,738
    Likes Received:
    33
    GPU:
    MSI 970 (Gaming)
    Most likely because they asked for xxxx million pounds...
     
  17. GC_PaNzerFIN

    GC_PaNzerFIN Maha Guru

    Messages:
    1,045
    Likes Received:
    0
    GPU:
    EVGA GTX 580 + AC Xtreme
    Why would they ask for money to do something that benefits them?
     
  18. WhiteLightning

    WhiteLightning Don Illuminati Staff Member

    Messages:
    30,785
    Likes Received:
    3,959
    GPU:
    Inno3d RTX4070
    Because nothing is free in this world.
     
  19. GC_PaNzerFIN

    GC_PaNzerFIN Maha Guru

    Messages:
    1,045
    Likes Received:
    0
    GPU:
    EVGA GTX 580 + AC Xtreme
    You didn't answer my question: making PhysX a widely supported standard would have been far more profitable in the long term.
     
  20. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,140
    Likes Received:
    1,743
    GPU:
    GTX 1080 Ti
    Yeah, if every other game out there used it. Most games I've seen that use PhysX are UE3 games.
     