Nvidia Pascal-Based Titan 50% Faster than GeForce GTX 1080?

Discussion in 'Videocards - NVIDIA GeForce' started by pharma, Jul 5, 2016.

  1. holystarlight

    holystarlight Master Guru

    Messages:
    792
    Likes Received:
    62
    GPU:
    Nvidia 4090
    And here I am, still waiting for my 1080 preorder to go through, lol. At this rate the Titan P will be released first.
     
  2. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    Not "to the max" I'm pretty sure I said I MAY need an upgrade.

    The article in the OP did, however, claim an older i7 would bottleneck.

    Here is the exact quote, which uses the word "likely".

    It all remains to be seen.
     
    Last edited: Jul 8, 2016
  3. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    Hmm, I think going from a GTX 1070 to a Titan P would still be an awesome upgrade. Twice as fast, at least.

    I reckon the UK price will be around £1300. Might as well just get another GTX 1070 and pretend it's just as good. :bang:
     
  4. Assnfiks

    Assnfiks Guest

    Messages:
    5
    Likes Received:
    0
    GPU:
    GTX 980Ti Sli
    If it's true, will my i7-5820K bottleneck two Pascal Titans at 144Hz?
     
    Last edited: Jul 9, 2016

  5. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    At anything under 4K, yes.

    Two 1080s in SLI are bottlenecking on Hilbert's test bench.
     
  6. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    He's still on the Gx-1x4 conspiracy :3eyes:

    Amazing.
     
  7. GhostXL

    GhostXL Guest

    Messages:
    6,081
    Likes Received:
    54
    GPU:
    PNY EPIC-X RTX 4090
    Man, just one GTX 1080 at 2.05-2.1GHz is blowing 1440p 144Hz G-Sync out of the water with games maxed out AND max AA.

    I can't imagine what this is needed for....(5K+ if anything).

    I think I'd still rather have two GTX 1080s in SLI for 4K+.

    Cost effective.
     
  8. Carfax

    Carfax Ancient Guru

    Messages:
    3,971
    Likes Received:
    1,462
    GPU:
    Zotac 4090 Extreme
    Depends. If your CPU is stock, then definitely yes. If it's overclocked to at least 4.2GHz, the bottleneck will be much smaller. It also depends on the game's settings, the efficiency of the SLI scaling, and last but not least, which engine/API it's using.

    A well-designed DX11 engine with good SLI scaling, like Frostbite 3 for instance, will utilize the CPU very effectively, so the CPU won't be a huge bottleneck for the GPUs. Some engines, though, might scale well with SLI but use the CPU poorly, which will result in bottlenecks.

    And then there's DX12. With DX12, CPU overhead is reduced dramatically and the rendering is made fully parallel, which should theoretically eliminate CPU bottlenecks, even with high-end SLI setups.
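    To illustrate what "fully parallel" rendering means in practice, here's a minimal C++ sketch (mine, not from the article) of the D3D12 pattern: several CPU threads each record their own command list, and everything is submitted in a single ExecuteCommandLists call. Real setup (swap chain, pipeline state, barriers, fences) is omitted, and the thread count is just an illustrative value.

    ```cpp
    // Minimal sketch of multithreaded D3D12 command-list recording.
    // Omits swap chain, PSOs, barriers and fences; link against d3d12.lib.
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <thread>
    #include <vector>

    using Microsoft::WRL::ComPtr;

    int main() {
        ComPtr<ID3D12Device> device;
        D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

        D3D12_COMMAND_QUEUE_DESC queueDesc = {};
        queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
        ComPtr<ID3D12CommandQueue> queue;
        device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

        const int kThreads = 4;  // illustrative: one recording thread per core
        std::vector<ComPtr<ID3D12CommandAllocator>> allocators(kThreads);
        std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kThreads);

        // Each thread gets its own allocator + command list, so recording
        // needs no locks and no single "driver thread".
        for (int i = 0; i < kThreads; ++i) {
            device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                           IID_PPV_ARGS(&allocators[i]));
            device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                      allocators[i].Get(), nullptr,
                                      IID_PPV_ARGS(&lists[i]));
        }

        // Record in parallel. A real engine would issue draw calls for its
        // slice of the scene here; we only Close() to keep the sketch short.
        std::vector<std::thread> workers;
        for (int i = 0; i < kThreads; ++i) {
            workers.emplace_back([&, i] { lists[i]->Close(); });
        }
        for (auto& w : workers) w.join();

        // One submission for all threads' work.
        std::vector<ID3D12CommandList*> raw;
        for (auto& l : lists) raw.push_back(l.Get());
        queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
        return 0;
    }
    ```

    The key difference from DX11 is that command building is spread across application threads instead of being funneled through one driver-managed path, which is where the reduced CPU overhead comes from.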
     
  9. Assnfiks

    Assnfiks Guest

    Messages:
    5
    Likes Received:
    0
    GPU:
    GTX 980Ti Sli
    Thanx, that was informative.
    Forgot to mention, my CPU runs @ 4.3GHz; as for settings, I'd probably max everything out.
    Speaking of DX12, as far as I've seen from the benchmarks, it kind of sucks, which is disappointing...
     
  10. Carfax

    Carfax Ancient Guru

    Messages:
    3,971
    Likes Received:
    1,462
    GPU:
    Zotac 4090 Extreme
    That's because it's a brand new API, and it's going to take developers a while to master it. Every time a brand new API is released, we go through the same process.

    It took, I think, about two years after DX11 was released before we began to see games really take advantage of the API, and about four or five years until DX11 reached its pinnacle with games like The Division, The Witcher 3, etcetera.

    DX12 seems to be getting off to a faster start than DX11, but the learning curve is much steeper because developers now have to do things that were previously taken care of automatically by the API and/or driver.

    The best example of DX12 at the moment is probably Ashes of the Singularity. Rise of the Tomb Raider is improving, but it still has issues with memory management in DX12 mode from what I've experienced. But AotS seems like it's optimized very well, and shows large increases for DX12 vs DX11, which can be almost solely attributed to DX12's CPU optimizations.

    A 50% increase on the GTX 1080, and a whopping 104% increase on the Fury X compared to DX11!

    [Image: Ashes of the Singularity DX11 vs DX12 benchmark chart]
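    For anyone wondering where figures like that come from: a percentage gain is just (DX12 fps / DX11 fps - 1) * 100. A tiny sketch with made-up framerates (not the chart's actual numbers):

    ```cpp
    // Percent gain of DX12 over DX11 from raw framerates.
    // The FPS values below are hypothetical, not the benchmark's results.
    #include <cstdio>

    int main() {
        double dx11_fps = 40.0;                             // made-up DX11 result
        double dx12_fps = 60.0;                             // made-up DX12 result
        double gain = (dx12_fps / dx11_fps - 1.0) * 100.0;  // percent increase
        std::printf("DX12 gain over DX11: %.0f%%\n", gain); // prints "DX12 gain over DX11: 50%"
        return 0;
    }
    ```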
     
