Editorial: GeForce RTX 2080 and 2080 Ti - An Overview Thus Far

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 17, 2018.

  1. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    5,242
    Likes Received:
    1,605
    GPU:
    RTX 3060 12GB
    I know.

    Using an offshoot of the math behind AO to get RTRT into gaming was JC's point.
     
  2. Evildead666

    Evildead666 Guest

    Messages:
    1,309
    Likes Received:
    277
    GPU:
    Vega64/EKWB/Noctua
    I would think AMD has a reply to this somewhere down the line.
    From the MS announcement, ALL current DX12-compliant GPUs will be DXR capable; they have just deliberately left the door open for hardware implementation/acceleration of all or parts of it in the future.

    " You may have noticed that DXR does not introduce a new GPU engine to go alongside DX12’s existing Graphics and Compute engines. This is intentional – DXR workloads can be run on either of DX12’s existing engines. The primary reason for this is that, fundamentally, DXR is a compute-like workload. It does not require complex state such as output merger blend modes or input assembler vertex layouts. A secondary reason, however, is that representing DXR as a compute-like workload is aligned to what we see as the future of graphics, namely that hardware will be increasingly general-purpose, and eventually most fixed-function units will be replaced by HLSL code. The design of the raytracing pipeline state exemplifies this shift through its name and design in the API. With DX12, the traditional approach would have been to create a new CreateRaytracingPipelineState method. Instead, we decided to go with a much more generic and flexible CreateStateObject method. It is designed to be adaptable so that in addition to Raytracing, it can eventually be used to create Graphics and Compute pipeline states, as well as any future pipeline designs."
    https://blogs.msdn.microsoft.com/directx/2018/03/19/announcing-microsoft-directx-raytracing/
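
    To picture what that generic CreateStateObject call looks like on the app side, here's a rough sketch (my own, not from the MS post). It assumes a Windows 10 RS4+ SDK, an already created ID3D12Device5, and a compiled DXIL shader library; dxilBytecode/dxilSize are placeholder names, and a real pipeline would also need hit-group and root-signature subobjects:

```cpp
// Minimal sketch: building a raytracing pipeline state through the generic
// CreateStateObject API instead of a dedicated CreateRaytracingPipelineState().
// Assumes a Windows 10 RS4+ SDK and an existing ID3D12Device5; error handling omitted.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

ComPtr<ID3D12StateObject> CreateRtPipeline(ID3D12Device5* device,
                                           const void* dxilBytecode, size_t dxilSize)
{
    // DXIL library subobject carries the ray generation / miss / hit shaders.
    D3D12_DXIL_LIBRARY_DESC lib = {};
    lib.DXILLibrary = { dxilBytecode, dxilSize };

    // Payload/attribute sizes and recursion depth are plain subobjects too.
    D3D12_RAYTRACING_SHADER_CONFIG shaderCfg = { 16 /*payload bytes*/, 8 /*attrib bytes*/ };
    D3D12_RAYTRACING_PIPELINE_CONFIG pipeCfg = { 1 /*max trace recursion*/ };

    D3D12_STATE_SUBOBJECT subobjects[] = {
        { D3D12_STATE_SUBOBJECT_TYPE_DXIL_LIBRARY,               &lib },
        { D3D12_STATE_SUBOBJECT_TYPE_RAYTRACING_SHADER_CONFIG,   &shaderCfg },
        { D3D12_STATE_SUBOBJECT_TYPE_RAYTRACING_PIPELINE_CONFIG, &pipeCfg },
    };

    // One generic entry point: a typed bag of subobjects, as described in the MS post.
    D3D12_STATE_OBJECT_DESC desc = {};
    desc.Type          = D3D12_STATE_OBJECT_TYPE_RAYTRACING_PIPELINE;
    desc.NumSubobjects = 3;
    desc.pSubobjects   = subobjects;

    ComPtr<ID3D12StateObject> pso;
    device->CreateStateObject(&desc, IID_PPV_ARGS(&pso));
    return pso;
}
```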

    I'm just waiting for AMD's next driver.
    One good thing is that this tech could well be in the next console refresh.

    edit: The announcement also states that, for games at least, we aren't going to see DXR used for complete visual rendering, but mostly for lighting techniques and other supplements to the scene that can conveniently be accelerated by rays (maybe 3D audio is back in play).
     
    Last edited: Aug 19, 2018
  3. Evildead666

    Evildead666 Guest

    Messages:
    1,309
    Likes Received:
    277
    GPU:
    Vega64/EKWB/Noctua
    As for the new cards, I think the core count increase with respect to the current cards is pretty small, and the difference in some games will be just as small.
    The memory bandwidth has gone up, which will help greatly in those games that can use it.
    1080 to 2080 doesn't seem like a good idea yet, nor does 1080 Ti to 2080 Ti, unless those Tensor/RT cores can be used meaningfully.
    As with all new tech, it's great for e-peen, but it's generally best to wait for v2 so they can iron out the kinks and keep the good bits.

    Can't wait for all the new Benchmarks this is going to bring though ;)
     
    Luc and tunejunky like this.
  4. tunejunky

    tunejunky Ancient Guru

    Messages:
    4,459
    Likes Received:
    3,078
    GPU:
    7900xtx/7900xt
    yes, indeed.
    imo(!), this is the clumsy aftermath of getting caught behind in the process-shrink, as i've said extensively.
    we all know Nvidia did a few things: 1) was late to crypto, 2) depended on crypto too long, 3) possesses a superior architecture (atm), and 4) became complacent.

    in the meantime, TSMC, in partnership with Qualcomm, Apple, and AMD, is making 7nm ICs right now.

    pity the Intel i5. not only has ryzen+ made it irrelevant, but Qualcomm's new 7nm ARM cpu will have all-day laptop battery life and easily run all the applications an i5 would be tasked with.

    And of course, in September the new 7nm iPhone

    and in 2019 Ryzen 2 and Navi

    so Nvidia and Intel were the odd men out and (mixed metaphor) caught behind the eight ball of process
     

  5. Robbo9999

    Robbo9999 Ancient Guru

    Messages:
    1,858
    Likes Received:
    442
    GPU:
    RTX 3080
    I generally think it's better to go for v1 of a new GPU architecture, e.g. getting the GTX 680 rather than the GTX 780 and then upgrading to the GTX 980 - in other words skipping the v2. But in this particular case, the architecture seems genuinely new with all the Tensor and Ray Tracing cores; that fundamental difference means it's all exceedingly new, untested, and therefore probably unoptimised, so for this architecture it would be better to go with v2 (if they ever do a v2, that is). So for this specific architecture I certainly agree with your point.
     
  6. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,465
    Likes Received:
    2,578
    GPU:
    ROG RTX 6090 Ultra
    Twitch livestream seems to be active already, with a countdown (currently at 21 hours, 30 mins, with over 2,400 people watching a clock count down o_O)

    https://www.twitch.tv/nvidia
     
    Robbo9999 likes this.
  7. sneipen

    sneipen Member Guru

    Messages:
    137
    Likes Received:
    16
    GPU:
    Nvidia 1080
    Got a regular 1080. Had to upgrade from an AMD 6970 because it was on the edge of flying out the window.. Not sure if I'm going to wait to see what AMD can offer. I'm guessing the RTX 2080/2080 Ti will cost more than all my other parts combined. I'm also not a fan of how Nvidia has been acting.. don't like to support companies that are shady... but my 4K monitor doesn't care about these things..
     
  8. Solfaur

    Solfaur Ancient Guru

    Messages:
    8,013
    Likes Received:
    1,533
    GPU:
    GB 3080Ti Gaming OC
    Nice summary, I'm definitely more curious now. Seems it's more than a refresh after all.
     
  9. Luc

    Luc Active Member

    Messages:
    94
    Likes Received:
    57
    GPU:
    RX 480 | Gt 710
    Sure, you're completely right (and a funny "risitas" quote, alanm), but I also understand that in the end everything depends on marketing and financials.

    Nvidia pushed launch prices up on Quadro cards when they introduced tensor cores: the P6000 was $1,000 more expensive than the M6000, but the GV100 was $3,300 more expensive than the GP100. The RTX 8000 will be $1,000 more expensive than the GV100.

    In gaming, between Maxwell and Pascal there was an increase of $50. But the Titan V (semi-pro) cost three times as much as the last-generation Titan Xp.

    The introduction of RTX and tensor cores will be expensive for many reasons, mostly explained here and there.

    Rumors talk about $1,000 for the RTX 2080 Ti, but Hilbert predicts it will be about $100 less, which is still $200 more expensive than a 1080 Ti. And it looks plausible.

    -

    But what really interests me is the possibility of offloading work from the older cores to the new ones, maybe in the future.

    Thanks, and excuse my bad, repetitive English.
     
    Last edited: Aug 20, 2018
  10. alanm

    alanm Ancient Guru

    Messages:
    12,274
    Likes Received:
    4,477
    GPU:
    RTX 4080
    Got this gnawing feeling that the 2080 Ti's joint release with the 2080 is meant to deflect attention from the lesser cards' (2080/2070) lower-than-expected gains when they're reviewed by tech sites. So no one will call Turing a fail while the 2080 Ti gets all the attention and praise. If the 2080/2070 had been released without the Ti, the Turing line might lose much of its excitement when we see the reviews. I mean, we all know we will not see Pascal-sized gains with the 2080/2070, only modest gains vs their predecessors. Just as AMD compared the Vega 64 with the Fury X at launch (+25%?), Nvidia will focus on 2080/2070 vs 1080/1070 with the Ti out of the picture to make it more appealing. Hope I'm wrong, but let's see how Nvidia's marketing magic deals with this.
     
    BangTail and Luc like this.

  11. BangTail

    BangTail Guest

    Messages:
    3,568
    Likes Received:
    1,099
    GPU:
    EVGA 2080 Ti XC
    I concur, and I suspect this is why there is no Titan this time round: they are going to try to drive all the gamers who would normally buy a Titan (because in the past there was no Ti until later on) to buy the Ti right away, and I suspect the Ti will be the only card that offers any real performance advantage over the current Ti.
     
  12. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    https://www.nvidia.com/en-us/titan/titan-v/

    Granted, it's not "Turing", but it's also not Pascal, since it's Volta. Who's to say whether there will be a Turing Titan, but realistically speaking, since there's a Volta, there's not much of a reason for a Turing Titan currently.

    My bet, though, is that the Titan for Turing will come out, and I'm not sure why you say there is no Titan this time around, since the Titan X (Pascal) came out a couple of months after the 1080, the Titan Xp came out a month after the 1080 Ti, and the Titan X (Maxwell) came out many months after the 980.

    To say there's no Titan this time around based on what we know or suspect is going to release first, when the previous Titans came out months after the initial release of the product, just doesn't make sense.
     
  13. The Goose

    The Goose Ancient Guru

    Messages:
    3,057
    Likes Received:
    375
    GPU:
    MSI Rtx3080 SuprimX
    £1,400 Ebuyer pre-order for an Asus 2080 Ti dual fan... £850 for an EVGA 2080 SC, so the 2070 is gonna be around £650-700. Not worth the upgrade imo, might just keep my eye out for a second 1080.
     
    Last edited: Aug 20, 2018
  14. BitBleed

    BitBleed Guest

    Messages:
    9
    Likes Received:
    2
    GPU:
    RX 580 Strix OC
    Great article, but you lost me at "competition really does not have an answer".
    Have you seen your own Vega 64 / Vega 56 / RX 580 benchmarks?
    They are more than capable in their own league: Vega 64 is on par with the 1080 and better in DX12/Vulkan, the 56 is better than the 1070 (that's why Nvidia had to release the 1070 Ti), and the RX 580 is better than the 1060 in almost every game.

    And FYI, the competition has had an answer ever since GCN was a thing; devs just couldn't utilize its full potential with DX11. No wonder AMD is called "fine wine":
    it gets better over time as devs use DX12/Vulkan's potential (hence DOOM), while Nvidia's older architectures suffer massively compared to AMD's over time.
    And don't try to start the price argument; we all know that's because people started buying up all the AMD cards when they realized they were great for mining. That's not AMD's fault,
    blame the miners for that.

    This article was informative indeed, but the writer seemed like an informed fanboy at best.
     
  15. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Well, while true, it requires you to omit power consumption (cost of ownership). Polaris is OK-ish in that respect, Vega not so much.

    And how all the current cards from both camps scale against the 20x0 (Ti) series is just a guess at this point. Those current cards may actually shine on a performance/price scale in current games. It remains to be seen.

    And while you are right that nV's older archs suffered from time to time, that was due to nV delivering optimizations for a particular title to the new arch only. It required a bit of an uprising from the community, and it got fixed on each occasion.
    AMD's per-title improvements are rarely great; they more often deliver tiny improvements at a global level, making all games run a bit better. But GCN is at the end of the road; I would not expect global 5%+ improvements at this point. Maybe a fix improving a particular title.
     

  16. BitBleed

    BitBleed Guest

    Messages:
    9
    Likes Received:
    2
    GPU:
    RX 580 Strix OC
    Power consumption is usually a non-issue on PC and the costs are negligible IMHO.
    The AMD cards won't hold up against the new RTX series, that's for sure.

    And about optimizations: AMD got a LOT better on the RX 480/580 over time, a lot more than 5%. Check some fine wine benchmarks on YouTube.
     
  17. BitBleed

    BitBleed Guest

    Messages:
    9
    Likes Received:
    2
    GPU:
    RX 580 Strix OC
    God damn, John Carmack makes me wet.
     
  18. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Truth is that power consumption can make a difference. Take two cards at the same purchase price and the same performance, one at 170 W, the other at 270 W. Every hour you play you draw 100 W more; 4 hours a day makes a 0.4 kWh daily difference. Played 300 days a year, over 3 years you are at a 360 kWh difference. At regular EU prices that's ~75 euros. For that addition one could have justified the purchase of a stronger GPU (at least back when the price difference between GPU lines was not $200-$300).
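
    A quick back-of-the-envelope version of that calculation; the ~0.21 EUR/kWh rate is an assumption (the post only says "regular EU prices"), the rest follows the numbers above:

```cpp
// Cost-of-ownership difference between a 170 W and a 270 W card over 3 years.
// Electricity price is an assumed ~0.21 EUR/kWh; hours/days follow the post above.
#include <cstdio>

int main() {
    const double extraWatts  = 270.0 - 170.0;  // 100 W more per gaming hour
    const double hoursPerDay = 4.0;
    const double daysPerYear = 300.0;
    const double years       = 3.0;
    const double eurPerKwh   = 0.21;           // assumed "regular EU" price

    const double extraKwh = extraWatts / 1000.0 * hoursPerDay * daysPerYear * years; // 360 kWh
    std::printf("Extra energy: %.0f kWh, extra cost: ~%.0f EUR\n",
                extraKwh, extraKwh * eurPerKwh);  // ~360 kWh, ~76 EUR
    return 0;
}
```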
     
  19. BitBleed

    BitBleed Guest

    Messages:
    9
    Likes Received:
    2
    GPU:
    RX 580 Strix OC
    But the difference between GPUs is a lot more than 75 euros, and 75 euros is not much over a whole year!
    But I agree, lower power consumption is always better. Just not a big deal for me personally, as long as the GPU isn't going to replace my frying pan.
     
    Fox2232 likes this.
  20. A M D BugBear

    A M D BugBear Ancient Guru

    Messages:
    4,424
    Likes Received:
    643
    GPU:
    4 GTX 970/4-Way Sli
    Also, a request to Hilbert before doing the full review: can you include some 8K testing please? Would love to see the differences.

    Either in 16:9 (4320p) or 16:10 (4800p).

    Thanks.
     
    fantaskarsef likes this.
