Is Nvidia futureproof atm?

Discussion in 'Videocards - NVIDIA GeForce' started by RighteousRami, Nov 9, 2013.

  1. Veteran

    Veteran Ancient Guru

    Messages:
    12,094
    Likes Received:
    21
    GPU:
    2xTitan XM@1590Mhz-CH20
Yeah, I think I have about 5.5 years left; that's good enough, I guess. I bet the darn thing breaks right when the warranty runs out after all that time, typical :)
     
  2. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
Still, that would be 7 years out of the PSU; that's pretty good in my book.
     
  3. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,438
    Likes Received:
    3,119
    GPU:
    PNY RTX4090
I would say that at the moment it's far too hard to decide who is future proof.

Nvidia is planning to release the Maxwell GTX 800 series at the end of the first quarter next year (I still think they will push it back again to the end of the year, though).

We also don't know how well AMD's Mantle will work. More games are supporting it now, which is a VERY good sign, but nobody knows yet how it will perform; it very well could catapult AMD's 7000, R7 and R9 series (any GCN core) past any of Nvidia's GPUs. AMD has made some very strong claims about it, saying that Mantle will ridicule even the GTX Titan in games that use it.

We also have no idea how G-SYNC will perform, how much the monitors will cost, or whether it will be tied to Nvidia cards or licensed out for AMD to buy and use.

At this point in time, if you REALLY need a new GPU and you have the money, then go for the 780 Ti, no questions asked; it is a beast of a card. But with aftermarket coolers coming for the R9 290 and 290X, the temperature issue could very well be sorted and we could see overclocked 290s and 290Xs matching or beating the 780 Ti.
     
  4. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
You got an explanation of why Glide died.
As for EAX, it was never HW-related; that is why the technology was later sold as a "software licence" to other sound card manufacturers. (The 1st generation of cards had HW capable of accelerating it; all cards made later used CPU power for that.)
Dedicated PhysX... it could have run all effects on the CPU back then, just as it can today. The thing is, nVidia will not allow it, nor allow it to use more than one thread. (They reassured everyone about SDK 3.xx and its ability to fully utilize the CPU.)
Do you actually remember the era of those dedicated cards? Here in Europe they cost more than high-end GPUs, because they were also meant for scientific purposes. They never took off due to the extreme price, while offering little more than extra clutter and some nice rigid-body physics.

And you still haven't got it? Mantle is not locked to GCN HW, just as OpenGL is not locked to specific HW. DirectX is the only graphics library locked to a software platform, and even it is not locked to a single HW platform.
Intel/nV can get in and implement support the same way they do for OpenGL, OpenCL or DirectX; the small sketch below shows what that looks like from the application side.
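For example, this is roughly what a vendor-neutral API looks like to a program: a minimal OpenCL sketch in C (assuming a stock Khronos OpenCL SDK; nothing here is Mantle-specific), where the application never asks whose GPU it is running on.

```c
/* Minimal sketch: enumerating OpenCL platforms in C.
 * Illustrates that an open API is not tied to one vendor's hardware --
 * any vendor (AMD, NVIDIA, Intel) that ships a driver simply shows up here.
 * Assumes an OpenCL SDK is installed; link with -lOpenCL. */
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;

    if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS || num_platforms == 0) {
        fprintf(stderr, "No OpenCL platforms found\n");
        return 1;
    }

    for (cl_uint i = 0; i < num_platforms; ++i) {
        char vendor[256], name[256];
        clGetPlatformInfo(platforms[i], CL_PLATFORM_VENDOR, sizeof vendor, vendor, NULL);
        clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME,   sizeof name,   name,   NULL);
        /* The application never asks "is this GCN?" -- it just uses the API. */
        printf("Platform %u: %s (%s)\n", i, name, vendor);
    }
    return 0;
}
```

Any vendor that ships a conformant driver shows up in that list; the same would apply to Mantle if other vendors chose to implement it.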

It's the same as saying that because all prehistoric species died out, mankind must be going the same way: "Press Panic Button and Run!" (Please, learn to make decisions based on an understanding of cause; then you will not compare bugs with apples.)

DirectX is the main gaming platform; do you know how much more you can get from it than you are already getting?
The demoscene proves every year where high-level APIs drag us.
http://www.youtube.com/watch?v=4oPpcSZa3NE
If you are able to cut the overhead, you can fit this into a 59 KB single executable with no additional resources. But that takes great coding skills, because if you used standard DirectX development tools to do it, you would end up with bloated code where 98% of it is not working on rendering but on something else entirely.
     

  5. Veteran

    Veteran Ancient Guru

    Messages:
    12,094
    Likes Received:
    21
    GPU:
    2xTitan XM@1590Mhz-CH20
Yeah, you're right.
     
  6. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    "Future proof", would imply that the future has no effect. So, a graphics card that is "future proof" would never have to be replaced.
     
  7. ESlik

    ESlik Guest

    Messages:
    2,417
    Likes Received:
    24
    GPU:
    EVGA x2 1080 Ti SC Black
    ^^ Bingo!
     
  8. inkarnat3

    inkarnat3 Guest

    Messages:
    22
    Likes Received:
    0
    GPU:
    GTX 780TI SLI
    yasamoka, thanks for the reply. I pretty much agree with you / don't have anything to really disagree with!

    I will say that competition overall is good and AMD did push Nvidia to drop prices and add a game bundle. All of which is good for the consumer! A possible low-level API war...ehhh. We shall see, but I agree it will probably be a bad thing.

    Whichever side of the fence you are on, as long as AMD pushes Nvidia and Nvidia pushes AMD the consumer should benefit!

    --Matt
     
  9. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    I'm glad Nvidia have become sensitive to pricing. They've been price setters for far too long. They still are, but less so now.

Nvidia's game bundle has impressed me far more than AMD's past Never Settle bundles, as all 3 games are excellent. Opinions on the others were more divided, although I do appreciate all of the ones AMD offered as well (got Crysis 3, Far Cry 3, Hitman Absolution, Sleeping Dogs and Bioshock Infinite <-- these two...).

Carmack is saying they can get close to the metal with Nvidia's OpenGL extensions. Both companies are working on better OpenGL support (AMD in particular) because of SteamOS, so we might see a surge in Linux gaming and a more efficient OpenGL pathway?
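For what it's worth, you can see which vendor-specific hooks your driver exposes just by asking the context for its extension strings. A rough sketch assuming GLFW and an OpenGL dev environment (the GL_NV_ prefix is the family of extensions being referred to; exactly which ones appear depends on the driver):

```c
/* Hedged sketch: checking whether the driver advertises any GL_NV_* OpenGL
 * extensions -- the kind of vendor-specific extension used to get "close to
 * the metal" through OpenGL.
 * Assumes GLFW 3 and OpenGL dev packages; link with -lglfw -lGL. */
#include <stdio.h>
#include <string.h>
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit()) return 1;

    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);            /* off-screen context */
    GLFWwindow *win = glfwCreateWindow(64, 64, "ext", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    /* Classic single-string extension query; enough for a quick look at
       what the driver advertises in a default (compatibility) context. */
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    printf("Renderer: %s\n", (const char *)glGetString(GL_RENDERER));
    if (ext && strstr(ext, "GL_NV_"))
        printf("Vendor-specific NV extensions are available on this driver.\n");

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```

Whatever shows up in that list is driver-specific, which is exactly the trade-off: closer to the metal, but tied to one vendor's extensions.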

    Really interesting times :)
     
    Last edited: Nov 10, 2013
  10. inkarnat3

    inkarnat3 Guest

    Messages:
    22
    Likes Received:
    0
    GPU:
    GTX 780TI SLI
    Just to add fuel to the fire with Mantle:

It is confirmed that they are working on supporting Mantle in Battlefield 4, but in reality they are claiming to add support through the Frostbite 3 engine that Battlefield 4 is built on. I read that DICE/EA stated there are 15 or so announced games running on that engine, including racing games, RPGs and even RTS games.

I'm actually a bit more curious how this will pan out, because we are potentially talking about some rather large franchises here. Battlefield 4, Need for Speed Rivals (11/19/13), the next Mass Effect and Dragon Age 3 are among the titles. They also mention a Command & Conquer game, but I'm not sure if that is the F2P one that got cancelled or whatever....

It will be interesting to see what kind of adoption it gets and what the benefit actually amounts to.

    If you really want to geek out, check out the following:
    http://www.eurogamer.net/articles/digitalfoundry-carmack-sweeney-andersson-interview
     

  11. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
Yes, the new C&C went back to the drawing board; they got very negative feedback from the closed beta test.
Since they have the engine and the models, it's "just" a question of doing it right this time.
I personally don't care much about games based on Mantle; I want that API out in the wild for the public to play with.
     
  12. Vxheous

    Vxheous Guest

    Messages:
    1,348
    Likes Received:
    0
    GPU:
    MSI GTX 1070 X
I'm using an Antec Nine Hundred Two V3, and it's definitely not futureproof. I can fit a Corsair H80i in it, but it doesn't have room for any of the larger coolers.
     
  13. toniglandyl

    toniglandyl Guest

    Messages:
    33
    Likes Received:
    0
    GPU:
    8800GT 512
    Any component is future proof as long as you take care of it and understand what it's supposed to do.

Am I complaining that my 8800GT can't play Arkham Asylum or Arkham City at full graphics at 1920x1200? No! Was this card still a great purchase? Hell yeah!

The interesting thing about the latest generation, for me, is the integrated video *********. This feature is built into the card, so you're sure to be able to use that functionality in the future.
If you grab a card today, you know what to expect of it, and that won't change with the next generation of cards. Games haven't really been pushing the limits of our hardware just yet (except badly optimized ones).
Sure, if you want to game on triple 3D 4K screens you won't be able to play the latest games at the best settings, but who really needs that kind of stuff?

Grab a card, know what to expect from it and take care of it. If you don't feel the need to keep all settings maxed out, it will last you for years. That's how things are, whether it's AMD or Nvidia.

For general-purpose "future proof" stuff, I agree that a good PSU will last as long as the PC itself.
     
  14. Red River

    Red River Guest

    Is Nvidia futureproof atm

One word: G-Sync.
Next year is gonna be Nvidia's year, and AMD will be heading the Voodoo way...
     
  15. Sota

    Sota Ancient Guru

    Messages:
    2,520
    Likes Received:
    2
    GPU:
    MSI GTX 980 Gaming 4G
Nvidia adopters aren't going to give a damn about Mantle because of G-Sync. Nvidia users will have smooth, stutter-free and lag-free gameplay at max settings... even if the fps drops below 60. True, you have to buy a monitor, but once you get it you won't need another unless you simply want more.

    Mantle is great, but it will have no bearing on Nvidia adopters.
     

  16. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
The majority of NVidia users either won't know about G-Sync or won't care.

Mantle itself won't have any real impact on sales of NVidia or AMD cards.

Not likely to happen. AMD has chips in all 3 consoles, which gives them a leg up with developers. Any developer looking to release games on the consoles will already have to heavily optimize for AMD's hardware, just like they've done for NVidia's in the past.

    Until it gets wider adoption by display makers, G-sync will be nothing more than a gimmick like PhysX. G-Sync isn't going to drive sales of NVidia cards for anyone other than hardcore NVidia loyalists.
     
  17. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
As I showed in the G-Sync thread with hard numbers, G-Sync looks like a good upgrade over a 60Hz screen, but it brings little to people who already have the 120/144Hz screens it is based on.
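The gist can be sketched with a toy calculation (my own illustrative figures, not the measurements from that thread): a game rendering at a steady 45 fps on a fixed 60Hz V-Synced display versus on a variable-refresh display.

```c
/* Rough numeric sketch (toy numbers): how a steady 45 fps render rate is
 * presented on a fixed 60 Hz V-Synced display versus on a variable-refresh
 * (G-Sync-style) display. With a fixed refresh, each frame has to wait for
 * the next 16.7 ms scanout, so on-screen frame times jump between 16.7 ms
 * and 33.3 ms; with variable refresh they stay at ~22.2 ms.
 * Build with: cc pacing.c -lm */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double refresh_ms = 1000.0 / 60.0;  /* fixed 60 Hz scanout interval        */
    const double frame_ms   = 1000.0 / 45.0;  /* GPU finishes a frame every ~22.2 ms */

    double prev_fixed = 0.0, prev_vrr = 0.0;
    for (int i = 1; i <= 8; ++i) {
        double ready = i * frame_ms;                     /* when the frame is done   */
        double slot  = ceil(ready / refresh_ms - 1e-9);  /* next 60 Hz refresh slot  */
        double shown_fixed = slot * refresh_ms;          /* V-Sync presentation time */

        printf("frame %d: fixed 60 Hz interval %5.1f ms | variable refresh %5.1f ms\n",
               i, shown_fixed - prev_fixed, ready - prev_vrr);
        prev_fixed = shown_fixed;
        prev_vrr   = ready;
    }
    return 0;
}
```

Below 60 fps the fixed-refresh intervals keep jumping between 16.7 ms and 33.3 ms while the variable-refresh ones stay at ~22.2 ms; on a 120/144Hz panel the fixed slots are already so close together that the difference largely disappears, which is the point above.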
     
  18. Spets

    Spets Guest

    Messages:
    3,500
    Likes Received:
    670
    GPU:
    RTX 4090
Until they drop below 60fps, which is where G-Sync shines even more:
    http://www.guru3d.com/articles_pages/nvidia_g_sync_preview_article,1.html
    http://www.anandtech.com/show/7436/nvidias-gsync-attempting-to-revolutionize-gaming-via-smoothness
    http://www.pcper.com/reviews/Graphi...Refresh-Rate/Potential-Benefits-New-Interface

    @sykozis; Asus, Philips, BenQ and ViewSonic are on-board so far.
I don't think everyone agrees with you that a lag-free, stutter-free, silky-smooth gaming experience all round is a gimmick, but I also don't think this will push sales heavily.
     
  19. RighteousRami

    RighteousRami Guest

    Messages:
    191
    Likes Received:
    0
    GPU:
    2 MSI 980 SLI
In regards to AMD using Mantle: it isn't some proprietary program they are using. Nvidia could use it, as could Intel, if they implemented the correct hardware to make it work as programmed, which is something like CUDA cores.

My case is an Antec Eleven Hundred.

If there are performance gains over DirectX, I see more developers making use of this. As for G-Sync, I'm not going to upgrade my 3x 1920x1200 monitors, as I also have a 27" 120Hz 3D monitor above my 3 screens for when I want good frames in FPS games.
     
    Last edited: Nov 14, 2013
  20. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
I call it a gimmick because currently it only works with Kepler-based cards and will only be available in a fraction of the displays from 4 manufacturers (3 of which I personally wouldn't pay $10 for a display from). For those still running Fermi cards or older, that means buying a new graphics card and a new monitor. Cost is going to be a major put-off for most people, even without the requirement for a Kepler card. You're buying a hardware "tech" that NVidia could drop support for at any time if they decide it's inconvenient to code for or decide to drop support in a future graphics card line....
     
