Is Nvidia futureproof atm?

Discussion in 'Videocards - NVIDIA GeForce' started by RighteousRami, Nov 9, 2013.

  1. RighteousRami

    RighteousRami Guest

    Messages:
    191
    Likes Received:
    0
    GPU:
    2 MSI 980 SLI
    Hey guys, wanting to upgrade my graphics card in the next few months.

    I am a bit confused about why people would go with Nvidia cards going forward if this whole Mantle thing is going to be utilized by a lot of developers.

    Yes, I do know that not all developers will use it, but BF4, like it or not, is a huge game which would apparently benefit from being processed through the Mantle API.

    I am not going to get a new card until at least the R9 290 has a non-reference cooler, as I hate high thermals. Or, if I'm feeling adventurous, I might try to water-cool it myself (I have only used an H80i on my CPU, haha).

    I have a 660 Ti SLI setup and I am noticing the 2GB of memory holding me back in my triple-screen setup lately. I've also seen somewhere that BF4 exceeds even 3GB on a triple-screen setup.

    Nvidia may need to start putting SPUs on their cards so that they can utilize Mantle. No DirectX update is going to work with the hardware as closely as Mantle will, and trying that would break everything up too much, IMO.

    If Mantle doesn't turn out to be more widely adopted, I would probably buy a 780 Ti, provided it had 4GB of RAM.

    So..... I don't want to go with AMD, but if anyone has any thoughts on the topic, cool.
     
  2. inkarnat3

    inkarnat3 Guest

    Messages:
    22
    Likes Received:
    0
    GPU:
    GTX 780TI SLI
    If I were you and the timeframe of "a couple of months" puts you anywhere near February, I would wait until the spring release of cards.

    The little I know or understand about Mantle gives me zero indication that it is "going to take off". I can't predict the future any better than the next man, but it is honestly a bad thing if Nvidia and AMD both go down separate paths with ideas such as Mantle. All we end up with is a strong divide between games that are "The way it's meant to be played" and whatever AMD calls their dealio. Doesn't mean this won't happen, but I'll buy into it only when it becomes the standard....

    From my perspective, trusting AMD to do anything with drivers is a fool's game. Yes, they have gotten better, but I won't purchase a card that requires per-game optimization just to not run like utter crap. It hasn't even been 6 months since they got their act together. No thanks. You can entice me with a game bundle and a few hundred dollars less, but I will take something that works better all around any day of the week.

    I just bought a pair of 780 Tis, but trust me, I'm not a fanboy and my judgment isn't skewed. The cards I just bought could absolutely be faster, and I wish they had more memory. With that said, when it came to spending my money, the AMD R9 290X had a lot of little asterisks to go along with it. Thankfully it pushed Nvidia to drop prices, include a game bundle, and release the 780 Ti. Competition is great! But I happily paid more for something that isn't a leaf blower / electric heater, and I can't complain about the overall build quality, performance, and price. I paid a little more, but I got a little more (and I don't mean performance with that statement).

    As far as future-proofing, it depends on what you are looking to do in the near future. If you are on the cutting edge of 4K displays or triple monitors, and especially if you throw the word 3D next to either of those, then nothing short of a couple thousand dollars of video cards will get you to "playable, but yearning for more". 5760x1080 is about three-quarters the resolution of a 4K display, and I can wholeheartedly say that you need to spend cash and lower your expectations if you want to game at those resolutions. On a single 1080p screen, you have a lot of options to run "max settings".
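
    (For reference: 5760x1080 is 6,220,800 pixels versus 8,294,400 for 3840x2160, which works out to exactly 75%.)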

    I have 2 of the "fastest gaming card available" and my setup can still easily run into "oh yeah, can't do that" territory. So "future proof" isn't exactly something I'm worried about at the moment..... I'm still chasing the dream!

    --Matt
     
  3. munwaal

    munwaal New Member

    Messages:
    6
    Likes Received:
    0
    GPU:
    NVidia GTX 780 6GB SLI
    Personally, it would take a lot for me to switch back to AMD. I've just installed a second GTX 770 and it has seriously impressed me. I game at 1080p and can run anything at max settings. I have no driver issues whatsoever. For me, everything just works with NVidia. Can't say I've heard the same about AMD.

    Just because a game supports Mantle doesn't mean it's going to run like crap on Nvidia hardware. Going over to AMD may introduce a whole new set of problems.

    I'm staying green for the foreseeable future.

    PS: I'm not a fan boy. I've had plenty of AMD/ATI and NVidia cards over my many years of gaming.
     
  4. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    As for drivers: lots of people have no issues on either side, but both the nV and AMD driver sections of this forum will disagree that either side is foolproof.

    As for Mantle: I do not know if AMD will deliver what they promise, but DirectX is hitting an overhead wall, and games will not look much better on it than they already do.
    That is where Mantle comes in. AMD did a slightly sad little presentation with the planet-core analogy for GCN, but the truth is that Mantle is not built around GCN. It's a layer cake, and both nV and AMD have to write proper drivers separately to act on the render calls coming from the API.

    They like to call Mantle a low-level API close to the hardware. The truth is, it's layered just like DirectX, and therefore usable by Intel/nVidia too. Where it differs is overhead, which should allow more complicated effects and scenery, or a bit better performance.
    And it will at least decrease CPU requirements.
    But swapping 2x 770 for anything else from either side is a waste of money for a single 1080p screen.
    Yes, by the end of next year 2GB of VRAM may become a very limiting factor for new games. But by then we will have 20nm GPUs and a whole different performance-per-watt picture.
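
    To put the overhead point in rough numbers, here is a toy model (not real API code; the per-call costs are invented purely for illustration) of why cutting the CPU cost of each draw call raises the fps ceiling:

    Code:
    # Toy CPU-side cost model -- illustration only, per-call costs are invented.
    DRAW_CALLS = 5000  # a busy scene submits thousands of draw calls per frame

    def cpu_fps_ceiling(per_call_ms):
        frame_cpu_ms = DRAW_CALLS * per_call_ms  # CPU time to submit one frame
        return 1000.0 / frame_cpu_ms             # fps limit imposed by the CPU

    print("high-overhead API:", round(cpu_fps_ceiling(0.010)), "fps ceiling")  # -> 20
    print("low-overhead API :", round(cpu_fps_ceiling(0.002)), "fps ceiling")  # -> 100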
     

  5. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    Guys, if you haven't extensively tried the newest AMD drivers, then don't judge.
     
  6. Violentum

    Violentum Guest

    Messages:
    326
    Likes Received:
    0
    GPU:
    ASUS GTX760 OC
    Have you ever heard the term "Glide"? Well, in a nutshell, that was 3dfx's Mantle. It lasted for a few years and a few games kicked @ss using it, but that was about it. I wouldn't get your hair in a bun over Mantle...
     
  7. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Many of us used several 3dfx adapters; "heard" suggests not actually knowing it.
    Back in 1995, when it was developed, and 1996, when it was released, there was no standardization at all in how manufacturers implemented OpenGL/DirectX routines in hardware. There were per-game overrides for features just to make things work without glitches on different hardware.
    It was chaos, into which 3dfx arrived bringing a little more instability of its own, and it lived for quite a few years. It died for a simple reason: DirectX evolved and offered something new, and OpenGL evolved too, while the 3dfx library did not bring much innovation to the table.

    Back then, the CPU was not bottlenecking GPUs as much as it does now, and better GPUs gave an adequate performance improvement.
    Now you can see games not getting more fps even when you lower the resolution, because we are at a wall caused by DirectX (the number of scene elements is usually not affected by resolution); see the sketch below.
    Mantle simply has a completely different rationale behind it, and it is a sound one; Glide was just about 3dfx grabbing its share of the market in chaotic times.
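
    To make that wall concrete, a quick toy model (numbers invented, not taken from any real engine): once the CPU-side submission cost dominates, lowering the resolution stops buying you fps.

    Code:
    # Toy model of a CPU-bound game -- numbers are invented for illustration.
    CPU_MS = 25.0  # fixed per-frame cost of submitting draw calls on the CPU

    def fps(pixels):
        gpu_ms = pixels / 300000.0  # assume GPU time scales with pixel count
        return 1000.0 / max(CPU_MS, gpu_ms)

    for w, h in [(3840, 2160), (2560, 1440), (1920, 1080)]:
        print(f"{w}x{h}: {fps(w * h):.0f} fps")  # fps plateaus once CPU-bound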

    As I wrote, we do not know if AMD will deliver what they promised. But if they do, DirectX graphics will sit in two years at a similar place to where it is now, while the Mantle API gets a drastic improvement over it.
     
  8. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    The last AMD drivers I had were on my 6950, which I sold a month and a half ago; I think I was using 13.6. Anyway, the drivers were stable for the most part, aside from Skyrim still being broken and very low GPU usage in AC3 (I suspect the game engine on that one, as SLI has negative scaling there too). But my biggest gripe with AMD's driver set is CCC; that thing is a colossal disaster IMO. The Nvidia control panel is much more intuitive and easier to use.
     
  9. Violentum

    Violentum Guest

    Messages:
    326
    Likes Received:
    0
    GPU:
    ASUS GTX760 OC
    What a pile of bull! If you have been following PC gaming history, there is one thing that repeats itself time and time again: every time a company tries to release a technology or API that is specific to its own hardware, it has failed, not lasted, or not taken off as intended. Here are but a few examples:

    GLIDE
    EAX
    Dedicated PhysX
    Mantle?

    But hey prove me wrong...
     
  10. harkinsteven

    harkinsteven Guest

    Messages:
    2,942
    Likes Received:
    119
    GPU:
    RTX 3080
    DirectX has lasted. But that's maybe not the best example.
     
    Last edited: Nov 9, 2013

  11. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    5,219
    Likes Received:
    1,589
    GPU:
    RTX 3060 12GB
    I wouldn't worry about either AMD or nVidia.
     
  12. inkarnat3

    inkarnat3 Guest

    Messages:
    22
    Likes Received:
    0
    GPU:
    GTX 780TI SLI
    Honestly, not trying to bicker, but you do not need first-hand experience with AMD's products to get a good idea of how messed up their drivers were prior to 2013. There are many trusted review sites that paint a clear picture. The micro-stuttering and frame-time latency issues have been WELL documented and, for the most part, fixed, depending on which games you refer to. The way I understand it, they are fixed on a game-by-game basis, so if a particular game has not been optimized, your mileage varies, sometimes substantially. I do have a good friend who went down the AMD road, and he has often expressed concerns and issues, but in general he loves his graphics card. In the meantime, he reaffirms why I stick with Nvidia....

    From my POV, and this is just my opinion, AMD has a hard time getting the little things right. Because of this they have to add value and can't charge what Nvidia charges. If they could, they would, trust me. Last I checked, AMD doesn't run a charity!

    I feel that AMD has to add value by offering a price lower than the competition's and then throwing in an enticing game bundle. Instead of choosing AMD based solely on the merits of its card vs the competition, most people wind up saying, "That is $XXX worth of games, it is a little cheaper, and it benchmarks very similarly to Nvidia's XXXX card." I'm not saying that bundles don't add value for either camp, or that getting more for your money isn't a good thing; I'm just not one for sacrifice. Let's face it, the game bundle is always brought into the decision when comparing AMD vs Nvidia. Either you don't have the games and want them, so it adds value, or you have no use for the games and wish they sold a cheaper version with no bundle (if you really want the AMD card).

    Even the launch of the R9 290X couldn't go off without a hitch: issues with retail samples not performing like review samples, all of which required a fix by what? A driver! So I'm not trying to be naive and bash AMD and call them trash like a fanboy/troll. I will just say that the facts and reviews I read on a daily basis from tech sites have helped me form an opinion and educated me as to where to spend my hard-earned money.

    Getting back on topic, AMD's body of work in the software department isn't giving me the impression that Mantle will be a huge success. It more or less sounds like a way to "game the system" and distance themselves from apples-to-apples comparisons with their competition. Fix DirectX, or come up with another true standard that others can make software and hardware for. Proprietary things always have a funny way of becoming their own worst enemy...

    --Matt
     
    Last edited: Nov 9, 2013
  13. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    Nvidia is no more future-proof than AMD. The stronger the cards you buy, the longer they will stay relevant.
     
  14. Undying

    Undying Ancient Guru

    Messages:
    25,332
    Likes Received:
    12,743
    GPU:
    XFX RX6800XT 16GB
    Especially when you don't know when it's time to upgrade a GPU. The timing has to be right for it to last you longer.
     
  15. ESlik

    ESlik Guest

    Messages:
    2,417
    Likes Received:
    24
    GPU:
    EVGA x2 1080 Ti SC Black
    NOTHING is future proof.
     

  16. Veteran

    Veteran Ancient Guru

    Messages:
    12,094
    Likes Received:
    21
    GPU:
    2xTitan XM@1590Mhz-CH20
    A PSU is, until it breaks. :)
     
  17. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    My PSU has a 7-year warranty with 4.5 years left; that's about as future-proof as I can get.
     
  18. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    Computer case?
     
  19. ESlik

    ESlik Guest

    Messages:
    2,417
    Likes Received:
    24
    GPU:
    EVGA x2 1080 Ti SC Black
    I suppose it depends on how far into the future "future proof" is supposed to mean. "Near future" might be the better term.
     
  20. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    No, frame pacing is a general solution that works for every DX11 game (up to 1600p for the 7000 series and the R9 280X, and at higher resolutions for the R9 290(X)). The few games that might not play very nice are targeted individually in drivers; that's what Nvidia is doing as well, I'm sure. You will never get 100% of games to work with something.
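
    To illustrate what frame pacing actually does (toy timestamps, not driver code): in AFR the two GPUs can finish frames in bursts, and pacing simply spaces the presented frames out evenly.

    Code:
    # Toy illustration of frame pacing in AFR -- timestamps are invented.
    raw = [0.0, 3.0, 16.7, 19.7, 33.3, 36.3]  # bursty frame completion times (ms)
    step = (raw[-1] - raw[0]) / (len(raw) - 1)
    paced = [raw[0] + i * step for i in range(len(raw))]
    print("raw gaps  :", [round(b - a, 1) for a, b in zip(raw, raw[1:])])    # 3.0, 13.7, 3.0, ... (microstutter)
    print("paced gaps:", [round(b - a, 1) for a, b in zip(paced, paced[1:])])  # even ~7.3 ms intervals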

    It's becoming more and more about brand name rather than driver reliability: people *feeling* more comfortable with having Nvidia GPUs. If we're going to base GPU purchases on gut feelings, then we're most likely going to be ripped off at some point. VSync, which was basically the only way to get a tear-free and smooth experience at a constant FPS (pre G-Sync), was broken for many at the 600 series release, and even now some suffer from VSync issues. If the surefire way of making a game smoother fails to work, then I have no guarantee of running all my games smoothly. That would be irritating.

    The release of the R9 290X made Nvidia drop the GTX 780's price from $650 to $500, release the GTX 780 Ti, and toss in a sick game bundle: Batman: Arkham Origins, Splinter Cell: Blacklist, and Assassin's Creed IV: Black Flag.

    Or in the case of the 7970 vs. the 680: the AMD card is also faster at higher resolutions, has 3GB of VRAM and higher memory bandwidth, scales better in CrossFire than the 680 does in SLI, and has better compute. And it's $100 cheaper. And I get sick AAA games.

    Like the GTX680 release? Mythical card at the time. Out of stock everywhere.

    We'll see how Mantle goes. I'm not optimistic about splitting the market in half if Nvidia hacks up its own low-level API too.
     
    Last edited: Nov 10, 2013
