Rumor: NVIDIA's GeForce RTX 3000 video cards launch delayed to September

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 30, 2020.

  1. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    You have missed the point, and I hope that was not intentional. This is not about the freaking TFLOPs of some particular chip in the Xbox/PS; it could have been 2 or 50.
    It is about the transistor count and the clock at which a given performance is achieved. Secondly, have you seen the size of the Xbox's new chip?
    Compare it to the transistor density of RDNA1 GPUs on 7nm and Zen 2 on 7nm. Its density seems to be quite a bit higher.

    If we ignore that the transistor budget per CU likely went up from RDNA1 to RDNA2 (due to the DXR implementation and 4/8-bit operations) and just take RDNA1, a GPU with 56 CUs would occupy the entire area of the Xbox's chip on its own... leaving the 80mm^2 Zen 2 part floating out in the open, together with all the extras that are usually not in a GPU but that MS requested for I/O.
    One has to stop and take a look at transistor density. You have an 8C/16T Zen 2, which is roughly 4B transistors, plus a GPU that would have around 14.5B transistors if it were RDNA1. That's without counting the I/O die that Zen 2 keeps separate.

    At minimum, you would be looking at 18.5B transistors in 360mm^2... ~51M/mm^2 (ignoring all the I/O and any additional transistors for improvements).
    Zen 2... 48M/mm^2.
    RDNA1... 41M/mm^2.
    Realistically, with all transistors counted, you are looking at either more transistors in that area, or AMD making changes that result in fewer transistors per CU while still delivering all those new features.
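
    Quick back-of-envelope in Python with those same round numbers (the ~360mm^2 area, 56 CUs, and Navi 10's public 10.3B/40CU figures; none of these are official die measurements):

        # The post's own round numbers; not official measurements.
        NAVI10_XTORS, NAVI10_CUS = 10.3e9, 40  # RX 5700 XT: 10.3B transistors, 40 CUs
        ZEN2_XTORS = 4.0e9                     # "roughly 4B" for the 8C/16T Zen 2 part
        XSX_AREA_MM2 = 360                     # approximate Series X die area
        XSX_CUS = 56                           # CU count assumed in this post

        gpu_xtors = XSX_CUS * NAVI10_XTORS / NAVI10_CUS  # ~14.4B at RDNA1 cost per CU
        total = gpu_xtors + ZEN2_XTORS                   # ~18.4B, I/O still not counted
        print(total / XSX_AREA_MM2 / 1e6)                # ~51 M/mm^2 implied
        # Compare the ~48 M/mm^2 (Zen 2) and ~41 M/mm^2 (RDNA1) figures above.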

    Or you can take the other route around it: take the given 15.3B transistors (which is likely just the GPU) and assume it is the entire chip, CPU included. Then remove the CPU part and the I/O that comes with it. What transistor budget would remain for a GPU with 56 CUs? How does it compare to RDNA1? That would not tell the story of any IPC improvement, but it would say that AMD can deliver around 56 CUs at the same transistor expense as the RX 5700 XT.
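
    A minimal sketch of that reverse route; note the 5B allowance for CPU plus I/O is my own placeholder, not a figure from anywhere:

        TOTAL_XTORS = 15.3e9    # official Series X transistor count
        CPU_IO_XTORS = 5.0e9    # placeholder: ~4B Zen 2 cores + ~1B for I/O
        gpu_budget = TOTAL_XTORS - CPU_IO_XTORS      # ~10.3B left for the GPU
        per_cu = gpu_budget / 56                     # ~184M transistors per CU
        print(per_cu / (10.3e9 / 40))                # ~0.71x Navi 10's per-CU share
        # With that placeholder, 56 CUs fit in roughly the RX 5700 XT's whole
        # 10.3B budget, which is exactly the point above.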

    Now take a moment to extrapolate: a modest 15% IPC improvement, a known reachable (and stable) clock 10% higher, and suddenly a 35% higher CU count at the same transistor budget.

    From my point of view, the Xbox's chip either uses at minimum 25% higher transistor density than an RDNA1 GPU, or its CPU and GPU delivered all of the included improvements while drastically reducing the transistor budget.

    As you can see, I took a very mild route with everything. I assumed only a 15% IPC improvement, while AMD has said RDNA2 is to deliver roughly the same improvement over RDNA1 as RDNA1 had over GCN. I assumed a mere 10% higher sustainable, stable clock over RDNA1. And I went with the "denser manufacturing process" explanation instead of taking the official transistor count at face value and concluding that AMD made really scary improvements and stripped out a lot of transistors in the process.

    So, what would it look like if we took the route that is bad from nVidia's point of view? 25% higher performance per CU at the same clock than RDNA1. 15~20% smaller CUs. A 15% higher clock. And it is not yet known whether AMD will use EUV for desktop GPUs.
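
    Compounding both scenarios multiplicatively (the usual back-of-envelope; real gains rarely stack this cleanly):

        mild = 1.15 * 1.10 * 1.35  # +15% IPC, +10% clock, +35% CUs -> ~1.71x
        bad  = 1.25 * 1.15 * 1.35  # +25% per-CU perf, +15% clock   -> ~1.94x
        print(f"mild: {mild:.2f}x, bad for nVidia: {bad:.2f}x")
        # 15~20% smaller CUs would raise the CU count, and thus the second
        # figure, even further at the same transistor budget.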

    Are you getting the pattern? I am describing the best-case scenario for nVidia, where AMD has already used everything available. You kind of feel it is scary and want to disprove it, but disproving it only means the underlying technology is that much better.
    I am pretty sure nVidia has had their hands on sample consoles and knows what to expect on the desktop. And I am pretty sure that with RDNA2, people will start looking at AMD's GPUs the same way they are looking at their CPUs now.
    - - - -
    On the other hand, who cares if nVidia delays their GPUs by half a year? This time it does not matter; people will have an excellent alternative.
     
  2. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    A 2yr wait isn't a problem at all. However, I think you're underestimating the financial burden that pushed Sony to do it in the first place. By the time the PS5 comes out, the world is going to be a different place than it was in the PS4 generation. Sony is going to need to change their business strategy; the writing is already on the wall. You're going to be surprised by what happens.
     
  3. Fediuld

    Fediuld Master Guru

    Messages:
    773
    Likes Received:
    452
    GPU:
    AMD 5700XT AE
    Have you forgotten what a PS4 exclusive is? Only Horizon was a PS4 exclusive; the rest were released on Windows the same day.
     
  4. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,872
    Likes Received:
    446
    GPU:
    RTX3080ti Founders
    This isn't YouTube-land. Do some research before posting.
     

  5. RavenMaster

    RavenMaster Maha Guru

    Messages:
    1,356
    Likes Received:
    250
    GPU:
    1x RTX 3080 FE
    I just wanna connect my C9 OLED to a graphics card that has an HDMI 2.1 socket so I can finally switch my settings to 4K 120Hz in beautiful 4:4:4 HDR.

    Club3D and Realtek were supposed to be releasing a DisplayPort 1.4 to HDMI 2.1 adapter, but they keep pushing back the release date. At this rate Nvidia will get there first and there will be no need for the adapter.
     
  6. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    The only reason I could see for NVidia delaying a launch would be if the launch was planned months in advance... In that case, the launch would have had to have been planned before COVID-19 started infecting people in China. I doubt NVidia plans product launches quite that far ahead, so I don't see a launch delay at all. And if we look back at the RTX launch, it was what, August/September? That time frame would probably work out best for a product launch anyway, with all of the "back to school" sales in the US.

    There's no reason for NVidia to get their hands on samples of the new Xbox or PS, since they're not involved in the development of either console or of any software that will run on them.
     
  7. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    The same way people with no special resources for obtaining samples get their hands on unreleased GPUs/CPUs/consoles?
    I am pretty sure that in a case like this the following applies: "Where there's a will, there's a way."
     
  8. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,449
    Likes Received:
    2,542
    GPU:
    TUF 6800XT OC
    I keep looking at prices for the RTX 2060 Super, RX 5700 XT, and RTX 2070 Super... but unfortunately none of them are enticing enough to replace my GTX 1080.

    Both the 2060 S and 5700 XT are "somewhat" cheaper, but they barely beat the GTX 1080 by a few percent in most older games; only really new games with engines designed for the new tech show 20%+ better performance.

    The 2070 S is indeed faster in all areas, so it is an actual upgrade, but the cheapest here is 520 Euro, with the large majority of models over 550. It feels like a lot of money for not that much extra performance (best-case scenario 40%).

    ---
    All that while the "-80" class GeForce cards are way out of my "acceptable" price range...

    So I'm really hoping the new gen comes this year with better performance/$, because my "old" 1080 is starting to get physically tired (noisy VRMs and fans that aren't so good anymore). If it breaks completely before the 3000 series or new Navi, I'll have to buy one of the overpriced cards that exist now... :-(
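
    For what it's worth, the upgrade math as a throwaway script; the performance ratios are the rough figures from this post, and the 5700 XT price is just a placeholder guess:

        cards = {
            "RTX 2070 Super": (520, 1.40),  # cheapest price seen, best-case perf
            "RX 5700 XT":     (400, 1.05),  # price is a placeholder guess
        }
        for name, (eur, perf) in cards.items():
            gain = (perf - 1.0) * 100       # extra % over a GTX 1080
            print(f"{name}: {gain:.0f}% more for {eur} EUR "
                  f"({eur / gain:.0f} EUR per extra %)")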
     
    Exodite likes this.
