NVIDIA Starts Teasing GeForce RTX 30 “Ampere” Series With Social Media Viral

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 10, 2020.

  1. Supertribble

    Supertribble Master Guru

    Messages:
    978
    Likes Received:
    174
    GPU:
    Noctua 3070/3080 FE
    I missed the whole dedicated 3D accelerator phase entirely after getting soured on PC gaming, because I got a 486 about a week before the first Pentium landed; after that it was Pentium this, Pentium that.

    The first discrete GPU I bought was the FX 5200 (oh god). Things didn't start looking up until I got the 6600 GT, then I was hooked.
     
    Solfaur likes this.
  2. Mda400

    Mda400 Maha Guru

    Messages:
    1,090
    Likes Received:
    201
    GPU:
    4070Ti 3GHz/24GHz
    Do you think people will care about those shortcomings as long as it's less money for similar performance? Again, in times like these, many will look for the bargain.
     
  3. k0vasz

    k0vasz Active Member

    Messages:
    71
    Likes Received:
    26
    GPU:
    nV GTX1060 6G
    Well, the GeForce 256 had the revolutionary T&L (Transform and Lighting) engine, which could be compared to ray tracing - but that was a new thing in Turing, and it's very unlikely that they can present anything groundbreaking this time, so I don't think it's a good comparison on their part :)
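    For context, hardware T&L meant the card doing the per-vertex transform and lighting math the CPU used to handle. A minimal sketch of what that stage computes (plain Python/NumPy, purely illustrative - not actual GeForce 256 behavior):

```python
# Illustrative sketch of the per-vertex work a hardware T&L unit took over
# from the CPU: transform by the modelview-projection matrix, then simple
# per-vertex diffuse (Lambert) lighting. Not actual GeForce 256 microcode.
import numpy as np

def transform_and_light(vertices, normals, mvp, light_dir, base_color):
    """vertices: (N,3), normals: (N,3) unit vectors, mvp: (4,4) matrix."""
    # Transform: homogeneous coordinates, matrix multiply, perspective divide.
    homo = np.hstack([vertices, np.ones((len(vertices), 1))])   # (N,4)
    clip = homo @ mvp.T
    ndc = clip[:, :3] / clip[:, 3:4]
    # Lighting: clamped per-vertex diffuse term (Lambert's cosine law);
    # light_dir is the direction the light travels, hence the negation.
    intensity = np.clip(normals @ -light_dir, 0.0, 1.0)
    colors = intensity[:, None] * base_color
    return ndc, colors
```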
     
  4. NewTRUMP Order

    NewTRUMP Order Master Guru

    Messages:
    727
    Likes Received:
    314
    GPU:
    rtx 3080
    21 years and 21 days will be the length of time it will take to pay off your credit card bill for the 3080/90 Ti. BTW, I'm going Big Navi - I just refuse to pay more than my whole computer build costs for an Nvidia flagship GPU!!!
     

  5. asturur

    asturur Maha Guru

    Messages:
    1,374
    Likes Received:
    503
    GPU:
    Geforce Gtx 1080TI
    Since when are GPUs supposed to do something about gameplay? You mean PhysX?

    Where do you see a 1080 Ti at $400? Are you talking about cards used for mining?
     
  6. asturur

    asturur Maha Guru

    Messages:
    1,374
    Likes Received:
    503
    GPU:
    Geforce Gtx 1080TI
    yes... exactly how?
     
  7. Sixtyfps

    Sixtyfps Master Guru

    Messages:
    242
    Likes Received:
    43
    GPU:
    Tuff 4080 non OC
    Great new high-powered cards, but where are the games? PC gaming is dead on arrival with no new games.
     
  8. Cru_N_cher

    Cru_N_cher Guest

    Messages:
    775
    Likes Received:
    2
    GPU:
    MSI NX8800GT OC

    We are trying to get UHD stable, and Nvidia is at the forefront here; no 3rd party currently beats DLSS 3.0 with inferencing support from the Tensor Cores.
    This is psychovisual optimization, and it's a very heavy research field compared to all the simple algorithms you put on the table here, which Nvidia deprecated a long time ago in favor of working on DLSS ;)

    It will bring 4K gaming to the masses even on lower-cost tier cards, scaling up from 1080p/1440p without anyone noticing the difference if it's implemented correctly in the engine, and it will even improve the simplest reconstruction implementations in their visual IQ/performance result.
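    To make the reconstruction idea concrete, here is a rough conceptual sketch of a DLSS-style temporal upscaler, with a fixed blend standing in for the trained network - none of this is Nvidia's actual pipeline, and all names are hypothetical:

```python
# Conceptual sketch of temporal reconstruction (DLSS-style), NOT Nvidia's
# implementation: render at 1080p, reproject last frame's 4K output with the
# game's motion vectors, then merge the two. DLSS replaces the fixed blend
# below with a trained network running on the Tensor Cores.
import numpy as np

def warp(prev_frame, motion):
    """Reproject the previous high-res frame along per-pixel motion vectors
    (nearest-neighbour gather; real reconstruction filters far more carefully)."""
    h, w, _ = prev_frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    sy = np.clip((ys - motion[..., 1]).astype(int), 0, h - 1)
    sx = np.clip((xs - motion[..., 0]).astype(int), 0, w - 1)
    return prev_frame[sy, sx]

def reconstruct(lowres, motion, prev_highres, blend=0.125):
    """One 4K frame from a fresh 1080p sample plus temporally aligned history.
    Assumes the high-res size is an integer multiple of the low-res size."""
    scale = prev_highres.shape[0] // lowres.shape[0]   # e.g. 2160 / 1080 = 2
    upsampled = lowres.repeat(scale, axis=0).repeat(scale, axis=1)
    history = warp(prev_highres, motion)
    # Stand-in for the network: fixed exponential blend of new detail vs history.
    return blend * upsampled + (1.0 - blend) * history
```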

    It currently gives Nvidia a head start of roughly three hardware-shrink generations in silicon, and they fill that saved render overhead with the RT cores.

    It is the biggest Nvidia IQ/performance cheat ever, and it's being trained on their mega cluster - hardware training itself.

    We are seeing the beginning of ML compensating for hardware inefficiencies and problems. Physically improving silicon as fast as we used to isn't easy anymore because of the rising quantum problem; it's now easier to put a small genius team together that trains on a massive dataset to fix or improve a problem, and even the personnel costs get offset by the later cloud scaling and competitive efficiency.
    And Nvidia has additionally improved their internal compression around it with ML.

    And we don't yet have any AMD/Sony answer to this. AMD's approach is async hardware efficiency; the most efficient implementation of it visible for now is id Tech 7 running on it.

    This battle will be very different this time. DLSS 3.0 will be pushed to the front lines for Nvidia, and they see it as a revolution: being able to lower the overhead so drastically toward the goal of reaching 8 million pixels (4K) in real time, plus hybrid ray tracing, at a decent power output.

    And this is what Nvidia customers are going to pay for, and how Nvidia is going to scale price against the in-game visual experience across the GPU product line. Keep a close eye on Cyberpunk 2077's behavior in the reviews.

    DLSS 3.0 is the first step toward making you dependent on Nvidia's cloud infrastructure - the first step in their transition.

    AMD improved their GPUs significantly; the 5700 XT caught the 1080 and the 2070 Super at a lower price point. But DLSS 3.0 is another hurdle now, one that is much harder to clear without extensive training on really heavy clusters.

    And you will see Nvidia showcasing their killer-app implementations here: Control and Cyberpunk 2077 running decently at 4K on as low as the upcoming 2060 replacement.

    With the possibility, going forward, of upgrading any TAA-supporting game once the 3rd-party engine's TAA implementation is greenlit.

    In the end this might become a fight between Nvidia's and AMD's clusters and their training/inference efficiency.

    That is, if AMD surprises us with their own solution ready for launch.
     
    Last edited: Aug 11, 2020
  9. kings

    kings Member Guru

    Messages:
    158
    Likes Received:
    131
    GPU:
    GTX 980Ti / RX 580
  10. Caesar

    Caesar Ancient Guru

    Messages:
    1,561
    Likes Received:
    686
    GPU:
    RTX 4070 Gaming X
    That's on my birthday. :)
     
    angelgraves13 and OnnA like this.

  11. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    I don't see a real bargain, to be honest. The 2070S is a better card than the 5700 XT, and Nvidia always price-matches at similar performance levels. If there is a driver/feature disparity, the bargain is not there.
    A bargain is getting a card like the 2060 with DLSS 2.0 and running games with ray tracing OK.

    Turing was probably the most revolutionary thing out of Nvidia after the original GeForce and whichever one it was that first had pixel shaders (I believe it was the GeForce 3; not certain, it's been too long).
     
  12. Cru_N_cher

    Cru_N_cher Guest

    Messages:
    775
    Likes Received:
    2
    GPU:
    MSI NX8800GT OC
    Turing, though, could be seen either as a failure to deliver on their GPGPU promises or as a learning process - more fixed-function hardware in tandem would be needed, without redesigning everything from scratch in an undoable timeframe ;)

    Turing's async disadvantage was the power distribution; it was a pure beta design playing on cost efficiency. This is fixed now, and the final results of this hardware-side fix - when every core's work clashes together in one frame - will be interesting to see.

    This thing brought us DLSS 3.0:

    https://blogs.nvidia.com/blog/2020/07/29/mlperf-training-benchmark-records/

    And it won't surprise me if the DLSS 3.0 team gets a lot of presentation time for their achievements on Sep 1 ;)
     
    Last edited: Aug 11, 2020
  13. Mda400

    Mda400 Maha Guru

    Messages:
    1,090
    Likes Received:
    201
    GPU:
    4070Ti 3GHz/24GHz
    Right now you may not see a bargain, but I meant what may happen depending on the performance of RDNA2.

    Seeing as RDNA made a lot of efficiency improvements over Vega, I'd say it would be good to hold off and see what AMD counters with, unless there are features you must have now that wouldn't be available with AMD.
     
    Last edited: Aug 11, 2020
    Deleted member 213629 likes this.
  14. bouxesas

    bouxesas Guest

    Messages:
    1
    Likes Received:
    0
    GPU:
    GTX 1080 Ti
    No. This hints that the name of the cards will be 2180 and not 3080.
     
  15. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    I don't know what that ML record has to do with Nvidia managing to inject DLSS 2.0 in a generic manner. I really hope they manage it, but one thing is not like the other.

    RDNA seemed like a pipe cleaner and a way to recoup some design costs while the "real" architecture (i.e. RDNA 2.0) was still on the way. It literally feels like a draft someone decided to publish because the book wasn't ready on time.
     

  16. Ameubius

    Ameubius Guest

    Messages:
    7
    Likes Received:
    7
    GPU:
    MSI GTX 980 ti
    Looking forward to getting a 3080ti so I can utilize the HDMI 2.1 port on my LG 65" C9 OLED "monitor".

    120Hz @ 4K, heck yeah!

    Going to build a new wood/aluminium PC case for it; that will look very cool :)
     
    Solfaur likes this.
  17. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,251
    Likes Received:
    232
    GPU:
    EVGA GTX 1080@2,025
    Ha. I had a pair of Diamond Monster Voodoo 2s in SLI before going to Nvidia as well, and my one deviation has also been the 9700 Pro. Volt-modded the crap out of that thing.
     
  18. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,038
    Likes Received:
    7,379
    GPU:
    GTX 1080ti
    Considering there was "3080" on the real leaked cards, this is a nope.
     
  19. Cru_N_cher

    Cru_N_cher Guest

    Messages:
    775
    Likes Received:
    2
    GPU:
    MSI NX8800GT OC

    This is the thing: for AMD, this war will be about how fast they can counter DLSS 3.0 and infer it efficiently on RDNA 2, maybe with the help of Sony R&D - the two have worked much more closely together since the whole PlayStation 4 design and share R&D.
    Many of the things we see in RDNA 2 are a cooperative effort between AMD and Sony R&D, especially on the hardware side, and we are going to see more.
    And you wonder what Sony is up to in general here, after Cell.
    One thing is for sure: it has Jensen shaking in his shoes, er, leather jacket - and he doesn't much like that Samsung "forced" him, after the lost lawsuit, to buy Ampere production capacity either ;)

    But anyway, all this is super beneficial for us consumers. We are going to see a nice boost in overall 4K and VR performance, even at the more balanced energy-efficiency/performance/IQ sweet spot - more than the usual 10 FPS gain plus a possible extra 10 FPS from crazy overclocking, which is getting smaller and smaller as brute-force efficiency slowly dies at those resolutions and we see more and more compute workloads in engines :)
     
    Last edited: Aug 12, 2020
    Supertribble likes this.
  20. Cru_N_cher

    Cru_N_cher Guest

    Messages:
    775
    Likes Received:
    2
    GPU:
    MSI NX8800GT OC
    Also, I want to add something.

    We have also never been in a better situation of code-quality parity between AMD and Nvidia since AMD moved from VLIW to a scalar architecture. This is also why DLSS 3.0 is so important for Nvidia: since Nvidia lost the consoles completely, code quality/performance for AMD was forced to improve significantly, and we see these results improving daily on the consoles and being brought over to the PC - Nvidia predicted long ago that this would happen :)
    And they are upping their exclusivity R&D on it, but in the end, how are their 500 engineers going to survive versus millions of devs? Can ML really be the answer ;) ?
    ML is also not at all exclusive to Nvidia, so their dominance is dwindling faster now.
    They are at the absolute highest point currently, but we all know what follows after the PEAK ;)

    The catch-up to DLSS 3.0 will happen; it's just a matter of time. Training time, to be exact.
     
    Last edited: Aug 13, 2020

Share This Page