Volta 16k

Discussion in 'Videocards - NVIDIA GeForce' started by DreadZilla101, Sep 19, 2017.

Am I jumping the gun or was this interesting?

  1. Jumping the gun

    10.0%
  2. Interesting

    50.0%
  3. You're just a goof lolz

    40.0%
  1. DreadZilla101

    DreadZilla101 Guest

    Messages:
    23
    Likes Received:
    0
    GPU:
    GTX 660 Ti/2048 MB
    I agree with Venturi. And I believe that when Nvidia is able to secure more console deals (the Nintendo Switch is the beginning of the end), they will completely stamp out AMD GPUs forever. AMD CPUs, though, are the one way they can survive if they refuse to go all out and use new technology on the GPU side of things.

    AMD could focus on enthusiasts first and make mainstream dual-socket CPUs and mainstream dual-socket motherboards (octo-socket would be crazily epic) popular, along with super 4-way SLI backed by the dual CPUs, but they probably won't take those risks because they know that Nvidia and Intel would just step in like ninjas and immediately, brutally kill them in that market.

    It's a tough time to be AMD. GPU-wise, cheaper or RAID0-ish VRAM might buy them some time to come up with something good. But honestly, I don't see how they can win.

    The only thing that keeps Nvidia from owning all parts of the tech market is their lack of a license to manufacture processors. If they had that we'd have more impressive technology than we have today, but at an insane cost. When Nvidia makes a deal with Intel it will all be over (probably the end of the world somehow too).

    I just realized that if Nvidia buys AMD, it would be a sign of the end for Intel.
     
    Last edited: Sep 26, 2017
  2. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    I'm eyeing Big Volta.

    I think it's safe to say it will be to the 1080 Ti what the 1080 Ti is to the 980 Ti when both are OC'ed. So a 2070 Ti ~60-70% faster than a 1080 Ti?

    I mean, if that thing doesn't drive next-gen CGI graphics, they can all shove it :D
    e.g. the Final Fantasy clip, Cyberpunk 2077, etc.



     
    DreadZilla101 likes this.
  3. DreadZilla101

    DreadZilla101 Guest

    Messages:
    23
    Likes Received:
    0
    GPU:
    GTX 660 Ti/2048 MB
    That would be cool, I love these tech demos. It will be epic when games fully reach tech demo quality.

    Edit: games are close to the level of the Witch demo though, like Hellblade.
     
  4. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,640
    Likes Received:
    1,143
    GPU:
    4090 FE H20
    Hellblade looks good art-wise, but technically it's nothing special.

    It has very poor textures across the board
     

  5. DreadZilla101

    DreadZilla101 Guest

    Messages:
    23
    Likes Received:
    0
    GPU:
    GTX 660 Ti/2048 MB
    They could be better, but I wouldn't call them poor textures.. I've played plenty of worse-looking games.
     
  6. venturi

    venturi Master Guru

    Messages:
    517
    Likes Received:
    342
    GPU:
    2x 4090 FE RTX
    I've done my best to eke out the most I could from Hellblade, and I have to agree, I don't see anything special

    I was able to make it sharper and bring out more texture resolution, but still nothing really exceptional


    The last game that was challenging to eke graphics out of, and a visually impressive experience, was two years ago with The Vanishing of Ethan Carter Redux - my first UE4 game that I ran in SLI

    I haven't seen much since

    I don't really see performance issues in new games that come down to raw processing power, but I do see issues in the game designs themselves manifesting as performance problems.



    The next-gen cards may improve things, but at real 4k, 5k, or 8k monitor resolution with max eye candy (not maxed AA), I have yet to see a single-card solution keep frames above 60 fps.

    Hence the need for SLI.

    I have an 8k monitor, and honestly all 4 cards tuned together can barely push enough to stay above 60 fps at 8k; it's borderline impossible. I'm constantly trying to find that magic config.


    So with the next gen, or the current one, only quad SLI will get that performance -- a single card won't be able to do it in the immediate future.

    It's only been a few years since I tried 4096x2160, and even that was a struggle with max eye candy.
    So the Nvidia card after Volta, that 2nd gen, may be able to do it.

    Of course, by then, we will also have monitors in excess of 8k ;)





    My current stack of projects tweaking perf/visuals and quad SLI at 5k and 8k is:

    Echo --> finally got it running well, but needs a few adjustments
    Mass Effect --> painful. May have to pass
    Senua --> got it smoothed out; still need to resolve some torch flashes. The character looks better, the environment is still lacking.
    Dying Light --> revisiting it after installing some mods

    And I'm tweaking quad SLI on The Witcher 3 again, as I installed some game mods that sent me back to the drawing board on the quad setup. It's starting to become more visually impressive.



    I also tried Win 10 and Server 2016, but the performance hit was too big to consider, so it's back to 2012 R2
     
    Last edited: Sep 27, 2017
    DreadZilla101 likes this.
  7. DreadZilla101

    DreadZilla101 Guest

    Messages:
    23
    Likes Received:
    0
    GPU:
    GTX 660 Ti/2048 MB
    Are you running out of VRAM? Like Linus in his 16k experiment (24+ GB seems right for 16k), ThirtyIR saw massive performance drops at 8k (16+ GB seems right for 8k) as a result of a VRAM bottleneck.

    The only current solution for 8k would probably be the Quadros (expensive, but your rig is already epic anyway).
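    For a rough sense of scale (render targets only; textures, geometry, and driver overhead dominate real VRAM use, so the totals above are far higher), the raw size of a single 32-bit buffer at each resolution is easy to compute:

    ```python
    def buffer_mib(width, height, bytes_per_pixel=4):
        """Raw size of one 32-bit render target in MiB."""
        return width * height * bytes_per_pixel / 2**20

    for name, w, h in [("4k", 3840, 2160), ("8k", 7680, 4320), ("16k", 15360, 8640)]:
        print(f"{name}: {buffer_mib(w, h):.0f} MiB")
    # 4k: 32 MiB, 8k: 127 MiB, 16k: 506 MiB -- and a deferred renderer keeps
    # several such buffers (G-buffer, depth, post FX), plus any MSAA multiplier.
    ```

    So framebuffers alone don't explain needing 16+ GB at 8k, which suggests the bottleneck is mostly the assets streamed at those resolutions rather than the render targets themselves.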
     
    Last edited: Sep 28, 2017
  8. venturi

    venturi Master Guru

    Messages:
    517
    Likes Received:
    342
    GPU:
    2x 4090 FE RTX
    I'm sure there are instances where the VRAM is insufficient. The other issue is that most games are simply NOT optimized to be pushed at that level yet.

    The most pleasurable experience so far has been around 4096x2160
     
    DreadZilla101 likes this.
  9. DreadZilla101

    DreadZilla101 Guest

    Messages:
    23
    Likes Received:
    0
    GPU:
    GTX 660 Ti/2048 MB
    4k is still ultra cool, and I bet you run supersampling at the highest level without being affected much either.

    I'm still at 1080p with a 660 Ti. 4k is next for me (when I have more time). Is 8k even worth it? I bet 8k resolution with 8k textures is stunning..
     
  10. wrathloki

    wrathloki Ancient Guru

    Messages:
    2,134
    Likes Received:
    318
    GPU:
    EVGA 3080
    I would be ecstatic if I could just get a single card that would run most things at 4k60. Can’t even fathom 16k yet.
     

  11. jbscotchman

    jbscotchman Guest

    Messages:
    5,871
    Likes Received:
    4,765
    GPU:
    MSI 1660 Ti Ventus
    I'm perfectly fine with 1080p. I don't have $10,000 to blow on Minecraft.
     
  12. venturi

    venturi Master Guru

    Messages:
    517
    Likes Received:
    342
    GPU:
    2x 4090 FE RTX
    I respect that.
    My problem is that I can't find a single-card solution that does well at 4k and 8k.

    I also use my personal rig for nuc / isotopes, and radiology, and then games


    So, if you give a mouse a cookie:
    ...to run high res and quad SLI you need a dual mobo; if you need a dual mobo... etc. etc.
     
  13. genbrien

    genbrien Guest

    Messages:
    174
    Likes Received:
    5
    GPU:
    Asus Strix GTX980
    Any news on Volta's mobile version?

    Might be time to upgrade from my 770m
     
  14. DreadZilla101

    DreadZilla101 Guest

    Messages:
    23
    Likes Received:
    0
    GPU:
    GTX 660 Ti/2048 MB
    (old, but the blue text has been corrected) The GTX 1080 managed 25.7 frames per second at high settings in 4k, a 24% (.23964497) decrease compared to the 1080 Ti. The 1080 is at 9 Tflops FP32 and the 1080 Ti is at 11.34, a difference of 2.34 Tflops. Ignoring all other relevant factors (like VRAM speed and quantity), if 2.34 Tflops equals a 24% performance difference in Dawn of War 3 (my chosen point of reference), then a 5 Tflop increase in FP32 would result in a 51% increase in average frames per second ([5/2.34]*.23964497=.512061901).

    All of this means that at high settings in 4k, a single non-overclocked 17 Tflop Volta GPU could raise average frames per second (in Dawn of War 3 at 4k, high settings) to 51 ([33.8*.512061901]+33.8=51.10769225).

    If Volta GPUs scale well, it would take quad SLI scaling at 95% to reach 196 frames per second (51+[51*.95*3]=196.35) in Dawn of War 3 at 4k, high settings. At 8k high settings, that could bring 4 non-overclocked high-end Volta GPUs (probably Titans) down to ~68 frames per second (196.35-[196.35*.65510204]=67.720714446).

    At 16k the quad SLI setup (with non-overclocked cards) would theoretically manage 23.3 frames per second (67.720714446-[67.720714446*.65510204]=23.3567362622) in Dawn of War 3 at high settings. (old, but the blue text has been corrected)

    (new) Edit: a few major corrections to my calculations. Originally I used 16.34 Tflops to represent a 17 Tflop GPU, i.e. an increase of 5 Tflops instead of 5.66. Updated, it would be ([5.66/2.34]*.23964497=.57965407273).

    Calculations for a single 17 Tflop GPU (Volta/Ampere), based on observed benchmark data from Dawn of War 3 in 4k at high settings:

    4k (33.8+[33.8*.57965407273]=53.3923076583), about 53.4 fps.

    8k (53.4-[53.4*.65510204]=18.417551064), about 18.4 fps.

    16k (18.4-[18.4*.65510204]=6.346122464), about 6.4 fps.

    Calculations for a single 17 Tflop GPU overclocked to 20 Tflops, in our same scenario.

    That's an increase of 8.66 Tflops over the 1080 Ti ([8.66/2.34]*.23964497=.88689121376).

    4k (33.8+[33.8*.88689121376]=63.7769230251), about 64 fps and an increase of about 10.4 fps versus 17 Tflops.

    8k (63.7769230251-[63.7769230251*.65510204]=21.9965306464), about 22 fps and an increase of about 3.6 fps versus 17 Tflops.

    16k (21.9965306464-[21.9965306464*.65510204]=7.58655854702), about 7.6 fps and an increase of 1.2 fps versus 17 Tflops.

    We see that at our current rate of progression from the 1080 Ti (earlier in the post) to a new unnamed GPU that might deliver 17 Tflops (a change of about 5.66 Tflops), even without overclocking, 2-way SLI will soon destroy 4k and 8k in demanding current-gen titles. But future insane resolutions with even more demanding games will require major technological developments.

    A 20 Tflop GPU in 4-way SLI could expect to manage 29 fps in a graphically demanding game at 16k (7.58655854702+[7.58655854702*.95*3]=29.208250406). GPUs could easily be capable of 16k at over 30 fps in 2-way SLI (15 fps within this year) in current-gen titles within 5 years (though I believe Nvidia has to pace itself so that other kinds of technology can catch up).
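    For anyone who wants to play with the numbers, the whole chain above fits in a few lines of Python. Every input here is an assumption carried over from this post, not measured data: fps scales linearly with FP32 Tflops, each resolution doubling costs ~65.5% of the fps, and SLI adds 95% of a card's performance per extra card.

    ```python
    BASE_FPS = 33.8                       # 1080 Ti, Dawn of War 3, 4k high
    BASE_TFLOPS = 11.34                   # 1080 Ti FP32 Tflops
    GAIN_PER_TFLOP = 0.23964497 / 2.34    # slope observed between 1080 and 1080 Ti
    RES_STEP_LOSS = 0.65510204            # fps lost per doubling (4k -> 8k -> 16k)
    SLI_SCALING = 0.95

    def fps_at_4k(tflops):
        """Linear Tflops extrapolation from the 1080 Ti baseline."""
        return BASE_FPS * (1 + (tflops - BASE_TFLOPS) * GAIN_PER_TFLOP)

    def step_resolution(fps, steps=1):
        """Apply the assumed per-doubling fps penalty `steps` times."""
        for _ in range(steps):
            fps *= 1 - RES_STEP_LOSS
        return fps

    def sli(fps, cards):
        """First card counts in full, each extra card at 95%."""
        return fps * (1 + SLI_SCALING * (cards - 1))

    print(round(fps_at_4k(17), 1))                              # 53.4 (4k, 17 Tflops)
    print(round(step_resolution(fps_at_4k(17), 2), 1))          # 6.4  (16k, 17 Tflops)
    print(round(sli(step_resolution(fps_at_4k(20), 2), 4), 1))  # 29.2 (16k, 4-way, 20 Tflops)
    ```

    The tiny differences from the figures in the post (e.g. 6.35 vs 6.346 at 16k) come from the post reusing rounded intermediates like 18.4 instead of 18.4149.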

    HDMI 2.1 can do uncompressed (UC) 4k at 120 Hz and UC 8k at 60 Hz. The next version of HDMI might do UC 8k at 120 Hz and 16k at 60 Hz (within 5 to 10 years). I believe that within 5 years an overclocked 2-way SLI setup will manage 30 frames per second at the 16k resolution in current-gen titles. But we know that games should continue to get more demanding.
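    As a sanity check on why 16k needs a future cable spec, the raw uncompressed pixel rate is just width x height x refresh x bits per pixel. This is a back-of-envelope sketch that ignores blanking intervals, audio, and link-encoding overhead:

    ```python
    def raw_gbps(width, height, hz, bits_per_pixel=24):
        """Uncompressed pixel data rate in Gbit/s (no blanking/encoding overhead)."""
        return width * height * hz * bits_per_pixel / 1e9

    print(round(raw_gbps(3840, 2160, 120), 1))   # 23.9  (4k120)
    print(round(raw_gbps(7680, 4320, 60), 1))    # 47.8  (8k60)
    print(round(raw_gbps(15360, 8640, 60), 1))   # 191.1 (16k60)
    ```

    8k60 at 24-bit color already sits essentially at HDMI 2.1's 48 Gbit/s nominal link rate, and 16k60 would need roughly four times that, which is why a new cable generation (or compression like DSC) has to come into play first.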

    To go beyond the graphical and technical demands of current-gen titles while also increasing resolution dramatically will require massive increases in GPU and CPU capabilities. A few ways technology could progress: official 4-way SLI support that scales well (based on Venturi's setup), mainstream dual-socket motherboards and CPUs that manage resources well (like Venturi's setup), and some advent that makes all PC components more affordable.

    If resolution increases faster than the level of stress a graphically demanding game puts on the GPU, then the rate of graphical and technological progression in games will stall (this has already happened and will likely continue to happen anyway).

    All in all, (I'm surprised) a 2-way SLI setup will destroy games at 16k within 5-10 years, as long as games don't get more demanding out of pace with the road maps given by GPU manufacturers (basically Nvidia at this point); just as it is about to tear through current-gen titles at 4k and 8k. And Venturi-style 4-way SLI will demolish 16k [30ish fps] within this year, though without being able to display 16k (no monitors)...

    My Rorschach-ish theory suggests it's a crafty setup... the relationship between screen manufacturers, data-cable companies, and content creators. Together they seem to be slowing down technology in order to limit the need for 4-way SLI... Or could it be a trend that Nvidia predicted without collusion? No, it is all a scheme. This way they can charge more and produce fewer GPUs when 2-way SLI becomes too powerful. (new)
     
    Last edited: Jan 14, 2018
