NVIDIA keynote and GeForce RTX 40 GPU / DLSS 3.0 announcements

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 20, 2022.

  1. barbacot

    barbacot Maha Guru

    Messages:
    1,005
    Likes Received:
    986
    GPU:
    MSI 4090 SuprimX
    I do admit that the 4080 16 GB variant (the true 4080) having a lower TBP than the last-gen 3080 - if only by a small margin - is a nice achievement on their part, given the DLSS performance increase.
    Other than that I don't use DLSS. I want to see the performance-increase figures over last gen without DLSS.
     
    Mineria likes this.
  2. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,209
    GPU:
    AD102/Navi21
    How is VRAM a gearbox?
    24 GB is a lot of VRAM, more than anyone needs.
     
  3. EngEd

    EngEd Member Guru

    Messages:
    138
    Likes Received:
    40
    GPU:
    Gigabyte RTX 3080
    VRAM has nothing to do with the pure performance of the card. What matters is how fast the RAM is, which resolution you are running, and of course how much texture load you are playing with. I have the RTX 3080 10GB and have never had a problem at 4K max settings in any game I tested. So 24GB is way more than enough...
     
  4. Fediuld

    Fediuld Master Guru

    Messages:
    773
    Likes Received:
    452
    GPU:
    AMD 5700XT AE
    Both 4080s have a serious issue with VRAM bandwidth (16GB: 735.7 GB/s, 12GB: 503.8 GB/s), as they have less than the 3080 10GB / 3080 Ti (760.3 GB/s), let alone the 3080 12GB at 912.4 GB/s.

    AMD supposedly had such an issue with the 6700XT and its 192-bit bus, versus the 256-bit bus used on the 5700XT; however, it used 96MB of L3 cache, which removed the potential choking of the card. There are benchmarks with both cards at the same clock speed, and there is only one game, Deus Ex, in which the 5700XT was faster than the 6700XT because of the memory bus. In all other games the 6700XT was faster, even if only by 3 fps.

    We haven't seen Nvidia adding any L3 cache to help the 4080s, though. If that were the case, it should have been pointed out at the announcement, like AMD did.
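    As a rough sanity check, the bandwidth figures quoted above follow directly from bus width times per-pin data rate. A minimal sketch - the bus widths and data rates below are the announced specs as I understand them, so treat them as assumptions:

```python
# Rough sanity check of the quoted memory bandwidth figures.
# Bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps).
def bandwidth_gbps(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

cards = {
    "RTX 3080 10GB": (320, 19.0),  # -> 760 GB/s
    "RTX 3080 12GB": (384, 19.0),  # -> 912 GB/s
    "RTX 4080 12GB": (192, 21.0),  # -> 504 GB/s
    "RTX 4080 16GB": (256, 23.0),  # -> 736 GB/s
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gbps(bus, rate):.0f} GB/s")
```

    The results land within a fraction of a GB/s of the posted numbers (the small differences come from the exact effective clocks), which shows the 4080s' deficit is purely down to their narrower buses.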

    Both 4080s are terrible products.

    Let's hope AMD delivers a killing blow this generation, like it did against Intel with chiplets - which led to 16-core CPUs for $700 instead of $1,800 for 10 cores (6950X).
     
    Undying likes this.

  5. Raider0001

    Raider0001 Master Guru

    Messages:
    522
    Likes Received:
    45
    GPU:
    RX 7900 XTX Nitro+
    I've played a couple of games now exceeding 10GB - for example Watch Dogs Legion: 11.7GB at 1440p ultra, and that is an almost two-year-old game. The way you're playing games now, it's pure luck that a true next-gen AAA game hasn't come out yet.
     
    Picolete likes this.
  6. Fediuld

    Fediuld Master Guru

    Messages:
    773
    Likes Received:
    452
    GPU:
    AMD 5700XT AE
    The issue is the 3080 10GB has 760.3 GB/s, the 4080 12GB has 503.8 GB/s, and the 4080 16GB 735.7 GB/s.

    The 3080 12GB has 912.4 GB/s.
     
  7. bballfreak6

    bballfreak6 Ancient Guru

    Messages:
    1,907
    Likes Received:
    470
    GPU:
    MSI RTX 4080
    Not strange marketing; it's done purposely to justify the stupid price increase of the true 3080 successor, which is the 16GB variant.
     
    Valken and Fediuld like this.
  8. EngEd

    EngEd Member Guru

    Messages:
    138
    Likes Received:
    40
    GPU:
    Gigabyte RTX 3080
    Well, WDL is the worst-optimized Ubisoft game in my opinion. I had huge trouble running the game smoothly on my PC before, even with the updated DLSS 2.1 - it's like it didn't care to use DLSS properly. That was with a 4790K and an RTX 3080 10GB; now I have a 12700K CPU and the same card. I haven't tested with this CPU yet, but I don't think the game is worth it.
     
  9. TheDigitalJedi

    TheDigitalJedi Ancient Guru

    Messages:
    4,024
    Likes Received:
    3,327
    GPU:
    2X ASUS TUF 4090 OC
    Wow. I'm late. Long workday fellas.

    To be honest, Gurus, are we really that shocked by the pricing? For the 4080s, every extra 2GB of VRAM is a hundred bucks. Many thought the 4090 was going to release at $2,000.00.

    So many are extremely angry, and I get it. Our beloved hobby is becoming too expensive. I have to be honest, though: I'm not shocked at all by the prices. To be frank, it's better than I expected. The main question I have for myself is, what is it that I'm paying for? Will this card pay for itself and make money for me like my previous cards? These cards are more than gaming cards and deliver a great deal of additional technology for speedier and more effective content creation.

    For the 16GB 4080 it seems that every extra 2GB of VRAM is $100.00. The 12GB 4080 is $200.00 higher than its predecessor. Maybe that one should be $799.00?

    Many of us welcomed Ampere even at the $699.00 price point for a 10GB 3080. Many paid more than that less than 6 months ago. The 12GB 3080 Ti launched at $1,199.00 and, again, many paid more than that less than 6 months ago. Ada is being met with hardly any love at all. We've had tons of discussions on price speculation, and I've read tons of posts with far worse price expectations. The 12GB and 16GB 4080 cards are claiming twice the performance of their predecessors. The 4090 is making the same claim. If the technology doesn't live up to expectations, I will understand. But if it does??? We will soon find out. As of now, I'm impressed with Ada Lovelace. The beloved Pascal 11GB 1080 Ti was $699.00 at launch in 2017. That was 5 years ago. Just sayin'.

    I can see AMD preparing themselves to take advantage of this. I hope their video cards can compete in graphics and Raytracing performance. That will be great for all of us especially if the pricing is more competitive.

    DLSS3 looks amazing. Did Jensen Huang state, in so many words, that the 4000 series will even work well with moderate CPUs because DLSS3 handles the bulk of the workload? "DLSS3 is able to render the game at a much higher framerate than the CPU is able to compute the game." The Cyberpunk performance is impressive, but I would love to see the in-game graphics settings. Are the in-game settings maxed with raytracing on Psycho? Is the resolution 4K? "Here is Cyberpunk 2077 shown in all new max raytracing mode, with SER and DLSS3." "All new" being the key words. The video looked very sharp, with framerates ranging from 92 to 98 fps. Is Jensen stating that DLSS3 with SER will give us max-setting visuals at these insane framerates thanks to this all-new rendering mode?

    Omniverse, A.I., Hopper/Grace Hopper and the robotics platforms have my interest. I came away from this video having learned a lot. There is a lot of technology in these products besides how smoothly we can play a game. I rendered videos with my 3090 and was floored by the speed. This is awesome for content creation. The 4090 is going to be even faster, with claims of double the gaming performance.

    Gurus, I'm impressed, and this is a purchase for me.
     
    AuerX and pharma like this.
  10. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,114
    Likes Received:
    2,612
    GPU:
    3080TI iChill Black
    They said the L2 cache got a big uplift too, but not as much as AMD with its Infinity Cache.
     

  11. Chert

    Chert Member Guru

    Messages:
    142
    Likes Received:
    44
    GPU:
    Gigabyte RTX 3060ti
    I would like to see the pure raster performance of these 4000-series cards without the bells and whistles of DLSS (3.0) and RT. That will show whether these new cards are indeed better than the previous generation.
     
    pegasus1 likes this.
  12. pegasus1

    pegasus1 Ancient Guru

    Messages:
    5,272
    Likes Received:
    3,685
    GPU:
    TUF 4090
    I did post a week or so ago that £700 gets you a fast current-gen card, a 6900XT for example. I'm wondering what performance £700 worth of 40-series gets you in comparison - maybe exactly the same, or even less.
     
  13. pegasus1

    pegasus1 Ancient Guru

    Messages:
    5,272
    Likes Received:
    3,685
    GPU:
    TUF 4090
    Try Eagle Dynamics' DCS, ha ha ha - that gobbles up RAM, VRAM and CPU cycles like a monster.
     
  14. kanenas

    kanenas Master Guru

    Messages:
    512
    Likes Received:
    385
    GPU:
    6900xt,7800xt.
  15. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    8,783
    Likes Received:
    10,903
    GPU:
    RX 6800 XT
    I have some doubts about DLSS3. Since it's interpolation, the frame rate will look higher, but input latency will match the original frame rate.
    So a game might be running at 120 fps with DLSS3, but have the input lag of a game running at 60 fps.
    Reflex will help mitigate this, but Reflex reduces input latency with or without DLSS3.
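    The arithmetic behind this point is simple frame-time math. A small sketch - the 60/120 fps figures are the illustrative numbers from the post, and the exact pipeline overhead of frame generation is not modeled here:

```python
# Frame-time arithmetic for interpolated frame generation (illustrative).
# Interpolation doubles the displayed frame rate, but input is only
# sampled on the rendered frames, so responsiveness tracks the base rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

base_fps = 60.0                 # frames actually rendered by the game
displayed_fps = base_fps * 2    # 120 fps shown with frame generation
print(f"{frame_time_ms(displayed_fps):.2f} ms between displayed frames")
print(f"{frame_time_ms(base_fps):.2f} ms between input samples")
```

    So the screen updates every ~8.3 ms, but the game still only reacts to input every ~16.7 ms - hence 120 fps smoothness with 60 fps responsiveness.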
     
    Picolete, Valken, Solfaur and 3 others like this.

  16. pegasus1

    pegasus1 Ancient Guru

    Messages:
    5,272
    Likes Received:
    3,685
    GPU:
    TUF 4090
    As with anything new and shiny, or any major announcement about anything, so many are busy looking at what the shiny new thing can do that they ignore what it can't: its deficiencies, and whatever the announcer is leaving out or trying to hide.
     
    Fediuld likes this.
  17. Fediuld

    Fediuld Master Guru

    Messages:
    773
    Likes Received:
    452
    GPU:
    AMD 5700XT AE
    Yeah. All the NV benchmarks showing FPS differences versus the previous cards are in DLSS 3.0 Performance mode (so 1080p rendering) with the frame filler activated. So it seems the total FPS number is higher because of the frame filler.
     
    Valken likes this.
  18. EngEd

    EngEd Member Guru

    Messages:
    138
    Likes Received:
    40
    GPU:
    Gigabyte RTX 3080
    Yup, take everything with a grain of salt, as they say. I don't believe anything until we actually have AIB/Founders reviews of these cards. Heck, I even think DLSS 3.0 is useless, as many games still lack support even for DLSS 2.x. What I want to see is pure native performance without DLSS and without RT, then with RT alone, because running RT alone hits performance hard. This of course should be much faster, but they did not show it...
     
  19. Pale

    Pale Member

    Messages:
    18
    Likes Received:
    10
    GPU:
    2 x GTX 1080 Ti
    I'm not really getting the Euro pricing. Previously, USD and EUR pricing was identical, which was already pushing it, as the exchange rate favored the EUR. Now the USD/EUR exchange rate is about 1:1, so they just raise the price by €350 to rake in more money on the European market?
     
    Fediuld, toyo and Cave Waverider like this.
  20. mackintosh

    mackintosh Maha Guru

    Messages:
    1,230
    Likes Received:
    1,145
    GPU:
    .
    They're still 1:1. By law, European prices must include applicable VAT.
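    For illustration, here is how the VAT-inclusive math works out at a 1:1 exchange rate - the 19% rate below is Germany's standard VAT, used purely as an example, and any remaining gap to a regional list price would come from the exchange-rate buffer the vendor applies:

```python
# Illustrative: US list prices exclude sales tax, EU prices include VAT.
usd_msrp = 1199.00          # e.g. RTX 4080 16GB US MSRP (excl. tax)
eur_per_usd = 1.0           # assumed ~1:1 exchange rate
vat = 0.19                  # e.g. German standard VAT rate
eur_price = usd_msrp * eur_per_usd * (1 + vat)
print(f"{eur_price:.2f} EUR incl. VAT")  # 1426.81 EUR incl. VAT
```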
     

Share This Page