Hello everyone. This is my first thread, and the calculations are based more on conjecture than anything else, but there is a chance that everything I theorize is right, so I decided to make this thread. There is a lot of opinion involved; I hope everything checks out.

This post assumes that Nvidia follows a pattern where their first Tesla GPU delivers only a fraction of the capability of the GPUs later designed for more graphically demanding roles. The only example of the Volta architecture we have right now is the Tesla V100, clocking in at 15 Tflops FP32. The Pascal Tesla came in at 10.6 Tflops FP32, and the final release, the Titan Xp, hit 12.15 Tflops, a 12.8% step up (1.55/12.15 = .127572016, taken relative to the Titan Xp figure). So the projected maximum for Volta should be about 17 Tflops ([15 * .127572016] + 15 = 16.91358024, rounded to 17).

I theorize, based on the scaling of the 1080 Ti in Dawn of War 3 (a demanding game designed for PC), using benchmarks at High settings at both 4k and 1080p provided by notebookcheck.com, that quadrupling resolution causes a 66% (.65510204) performance hit: it dropped from 98 average frames per second at 1080p to 33.8 at 4k, and that is only at High settings, not even Ultra. The GTX 1080 managed 25.7 frames per second at High settings in 4k, a 24% (.23964497) decrease compared to the 1080 Ti. The 1080 sits at 9 Tflops FP32 and the 1080 Ti at 11.34, a difference of 2.34 Tflops. Ignoring all other relevant factors (like VRAM speed and quantity), if 2.34 Tflops equals a 24% performance difference in Dawn of War 3 (my chosen point of reference), then a 5 Tflop increase in FP32 should result in a 51% increase in average frames per second ([5/2.34] * .23964497 = .512061901). All of this means that a single, non-overclocked, 17 Tflop Volta GPU could raise the average in Dawn of War 3 at 4k, High settings, to about 51 frames per second ([33.8 * .512061901] + 33.8 = 51.10769225).
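For anyone who wants to check my arithmetic, here is the projection as a short script. It uses my original 5 Tflop step (corrected to 5.66 in the edit at the end of the post), and the whole linear Tflops-to-fps model is pure conjecture on my part, not anything Nvidia has published:

```python
# Rough sketch of the projection above: a purely linear Tflops-to-fps model
# that ignores VRAM, clocks, and everything else, as stated in the post.
PASCAL_TESLA = 10.6   # Tesla (Pascal), Tflops FP32
TITAN_XP = 12.15      # Titan Xp, Tflops FP32
VOLTA_TESLA = 15.0    # Tesla V100, Tflops FP32

# Pascal's Tesla-to-Titan step, taken relative to the Titan Xp figure.
step = (TITAN_XP - PASCAL_TESLA) / TITAN_XP        # ~0.1276
volta_max = VOLTA_TESLA * (1 + step)               # ~16.91, rounded to 17

# Dawn of War 3 anchors (notebookcheck.com, High settings, 4k).
FPS_1080TI = 33.8
FPS_1080 = 25.7
TFLOP_REF = 11.34 - 9.0                            # 1080 Ti minus 1080 = 2.34

def fps_boost(tflop_delta):
    """Fps gain assumed proportional to the Tflop gain over the 1080 Ti."""
    return (tflop_delta / TFLOP_REF) * ((FPS_1080TI - FPS_1080) / FPS_1080TI)

fps_4k = FPS_1080TI * (1 + fps_boost(5.0))         # the original 5 Tflop step
print(round(volta_max, 2), round(fps_4k, 1))       # 16.91 51.1
```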
If Volta GPUs scale well, it would take quad SLI scaling at 95% to reach 196 frames per second (51 + [51 * .95 * 3] = 196.35) in Dawn of War 3 at 4k, High settings. At 8k High settings that would bring four non-overclocked high-end Volta GPUs (probably Titans) down to ~68 frames per second (196.35 - [196.35 * .65510204] = 67.720714446). At 16k the quad SLI setup (still non-overclocked) would theoretically manage 23.3 frames per second (67.720714446 - [67.720714446 * .65510204] = 23.3567362622).

During the LinusTechTips experiment, four Quadro P5000s (each at 8.9 Tflops FP32) ran Rise of the Tomb Raider at 1 fps in 16k, and couldn't have been running over 3 fps in Crysis 3 at 16k (though they did manage an apparent average of 50 fps in Minecraft, just flying around slowly in creative mode). (I apologize for the implied curse word in the title.) Instead of SLI, Linus used Nvidia Quadro Sync, which he wrongly assumed would overcome the VRAM constraint found in SLI setups (usable VRAM is limited to the maximum of one card instead of the total of all four; hopefully Nvidia will fix this). While it worked fine for Half-Life 2 and Minecraft at 16k, the P5000's 16GB of VRAM was not enough for 16k in Rise of the Tomb Raider and other graphically intensive titles (at what appeared to be low settings); current Quadro P6000s carry 24GB of GDDR5X. Linus noted that since the P5000s were running at 66% utilization while 100% of the 16GB of VRAM was in use, he could have gained a possible 30+% in performance if the setup had more VRAM; the increase, he said, would have brought performance to "1.3" frames per second. This suggests that 24GB is the minimum amount of VRAM required when gaming at super-high resolutions. Linus's setup seemed to work better than traditional SLI (without Venturi), though the scaling of the Linus setup was not fully disclosed. There should be a Linus/Venturi collaboration build; we should all start trying to make that happen. That would be insanely awesome.
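The two rules I'm chaining here (quadrupling resolution costs ~65.5% of fps, and quad SLI adds three cards at 95% scaling each) can be written out like this; keep in mind the 95% scaling figure is my assumption, not a measured value:

```python
RES_HIT = 0.65510204   # fps lost each time resolution quadruples (DoW3 data)
SLI_SCALING = 0.95     # assumed per-extra-card scaling, not measured

def quad_sli(single_fps, extra_cards=3):
    """Total fps for a 4-way setup: one full card plus three scaled cards."""
    return single_fps * (1 + SLI_SCALING * extra_cards)

def quadruple_res(fps):
    """Apply the resolution-quadrupling penalty (e.g. 4k -> 8k)."""
    return fps * (1 - RES_HIT)

fps_4k = quad_sli(51.0)           # ~196.4 fps at 4k
fps_8k = quadruple_res(fps_4k)    # ~67.7 fps at 8k
fps_16k = quadruple_res(fps_8k)   # ~23.4 fps at 16k
```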
If the high-end Volta cards are to have at least 24GB of VRAM, Volta Quadro V6000s may be needed (the P6000 is almost equal to the Titan Xp: 12 Tflops for the P6000 versus 12.15 Tflops for the Titan Xp). They could run a graphically intensive game (like Dawn of War 3) in Venturi-style SLI far better than the original Linus build did (at a theoretical average of 19 frames per second). Also, I believe SLI will take to overclocking (water cooling would be needed) better than Quadro Sync, so we need a Quadro-Venturi-SLI vs. Quadro-Quadro Sync comparison. If we can learn from the past, the V6000 could be held back until late 2018, with an improved Titan Xv sometime in 2019. To conclude: we need a Venturi/Linus team-up; even future cards will struggle with 16k (I'm Captain Obvious here), and it is crazy how easy it is to overlook 8k and 4k once you know that real quad SLI exists. 16k should be the last resolution for a while, though, until some major technological breakthrough. At our current rate of progression, there will be setups running 16k at 60 fps by 2022.

Edit: A few major corrections to my calculations. Originally I used 16.34 Tflops to represent a 17 Tflop GPU, i.e. an increase of 5 Tflops over the 1080 Ti instead of 5.66 (17 - 11.34). Updated, that is [5.66/2.34] * .23964497 = .57965407273. Calculations for a single 17 Tflop GPU (Volta/Ampere), based on the observed Dawn of War 3 benchmarks at High settings:
4k: 33.8 + [33.8 * .57965407273] = 53.3923076583, about 53.4 fps.
8k: 53.4 - [53.4 * .65510204] = 18.417551064, about 18.4 fps.
16k: 18.4 - [18.4 * .65510204] = 6.346122464, about 6.4 fps.
Calculations for the same GPU overclocked from 17 to 20 Tflops, which is an 8.66 Tflop increase over the 1080 Ti ([8.66/2.34] * .23964497 = .88689121376):
4k: 33.8 + [33.8 * .88689121376] = 63.7769230251, about 64 fps, an increase of about 10.4 fps versus 17 Tflops.
8k: 63.7769230251 - [63.7769230251 * .65510204] = 21.9965306464, about 22 fps, an increase of about 3.6 fps versus 17 Tflops.
16k: 21.9965306464 - [21.9965306464 * .65510204] = 7.58655854702, about 7.6 fps, an increase of about 1.2 fps versus 17 Tflops.

So at our current rate of progression, from the 1080 Ti (earlier in the post) to a new unnamed GPU that might deliver 17 Tflops (a change of about 5.66 Tflops) without overclocking, 2-way SLI will soon destroy 4k and 8k in demanding current-gen titles. Future insane resolutions, with even more demanding games, will require major technological developments. A 20 Tflop GPU in 4-way SLI could expect to manage 29 fps in a graphically demanding game at 16k (7.58655854702 + [7.58655854702 * .95 * 3] = 29.208250406). GPUs could easily be capable of 16k at over 30 fps in 2-way SLI within 5 years (and should manage 15 fps within this year) in current-gen titles, though I believe Nvidia has to pace itself in order for other kinds of technology to catch up. HDMI 2.1 can do uncompressed (UC) 4k at 120Hz and UC 8k at 60Hz; the next version of HDMI might do UC 8k at 120Hz and 16k at 60Hz (within 5 to 10 years). I believe that within 5 years an overclocked 2-way SLI setup will manage 30 frames per second in current-gen titles at 16k. But we know games should continue to get more demanding. Going beyond the graphical and technical demands of current-gen titles while also increasing resolution dramatically will require massive increases in GPU and CPU capability. A few ways technology could progress: official 4-way SLI support that scales well (based on Venturi's setup), mainstream dual-socket motherboards and CPUs that manage resources well (again like Venturi's setup), and some advent that makes all PC components more affordable.
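Putting the corrected numbers from my edit into the same kind of sketch (5.66 Tflops for a stock 17 Tflop card, 8.66 for a 20 Tflop overclock, both measured against the 1080 Ti's 11.34), still under the same conjectural linear model:

```python
BASE_4K = 33.8             # 1080 Ti, DoW3, High settings, 4k
TFLOP_REF = 2.34           # 1080 Ti minus 1080, in Tflops FP32
FPS_GAIN_REF = 0.23964497  # 1080 Ti's fps lead over the 1080 at 4k
RES_HIT = 0.65510204       # penalty per resolution quadrupling

def project(tflop_delta):
    """4k/8k/16k fps for a card tflop_delta above the 1080 Ti."""
    fps_4k = BASE_4K * (1 + (tflop_delta / TFLOP_REF) * FPS_GAIN_REF)
    fps_8k = fps_4k * (1 - RES_HIT)
    return fps_4k, fps_8k, fps_8k * (1 - RES_HIT)

stock = project(17.0 - 11.34)           # ~(53.4, 18.4, 6.4)
oc = project(20.0 - 11.34)              # ~(63.8, 22.0, 7.6)
quad_sli_16k = oc[2] * (1 + 0.95 * 3)   # ~29.2 fps: 4-way SLI at 16k
```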
If resolution increases faster than the level of stress a graphically demanding game puts on the GPU, then the rate of graphical and technological progression in games will be held back (this has already happened and will likely continue to happen anyway). All in all, (I'm surprised) a 2-way SLI setup will destroy games at 16k within 5-10 years, provided games don't get more graphically demanding out of pace with the road maps given by GPU manufacturers (basically Nvidia at this point), just as they are about to tear through current-gen titles in 4k and 8k. Venturi-style 4-way SLI setups could demolish 16k (30-ish fps) within this year, without even being able to display 16k (no monitors exist for it)... My Rorschach-ish theory is that it's a crafty setup: the relationship between screen manufacturers, data cable companies, and content creators. Together they seem to be slowing down technology in order to limit the need for 4-way SLI... Or could it be a trend that Nvidia predicted without collusion? No, it is all a scheme. That way they can charge more and produce fewer GPUs when 2-way SLI becomes too powerful.