Crytek employee says PlayStation 5 will win, Xbox Series X has bottlenecks

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Apr 7, 2020.

  1. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    15% difference, to you, is 5 years?

Oh, now I understand: you need a tinfoil hat. Everything makes sense now.
     
  2. Asgardi

    Asgardi Guest

    Messages:
    248
    Likes Received:
    14
    GPU:
    MSI GTX 980 Ti OC
XX doesn't need that kind of SSD because of its hypervisor design. They can just save the VM state of a game and restore it in seconds. No need to launch the game, no need to load a saved game. If you were in the middle of shooting when you quit, your bullets will still be flying when you return. Just like on the current X1, except that for now they can only save the state of one or two apps and one game. But they've said that in the future they can save the state of multiple games. During gameplay, that SSD gives you nothing anyway.
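A toy illustration of that suspend/resume idea (a minimal Python sketch; all class and function names here are hypothetical, not any console API): instead of relaunching the game and reloading a save, the system serializes the entire running state and later restores that exact moment, bullets included.

```python
import pickle

class GameState:
    """Stand-in for a running game's full in-memory state."""
    def __init__(self):
        self.level = "hangar"
        # bullets mid-flight at the moment of suspension
        self.bullets_in_flight = [(10.0, 2.5), (11.2, 2.6)]

def suspend(state):
    """Snapshot everything, like a hypervisor freezing a VM's memory."""
    return pickle.dumps(state)

def resume(snapshot):
    """Restore the exact frozen moment; no startup, no save-loading."""
    return pickle.loads(snapshot)

state = GameState()
restored = resume(suspend(state))
print(restored.level, restored.bullets_in_flight)
```

The point of the analogy: resuming is a bulk memory restore, so its cost is bounded by storage bandwidth, not by the game's own loading logic.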

It remains to be seen how things go performance-wise, but the only thing he actually said is that PS5 is nicer for a developer. Which is weird, because once again developers are forced to learn a new programming model, while for XX it continues to be the same DirectX stuff. The same code and previous-gen games will just run; on PS5 they simply don't. And he didn't even say that if you fully utilize both systems you would get more out of PS5. Which would be a weird thing to say, as both are basically the same AMD stuff, with XX having more metal inside after all.

And remember, developers have exactly the same kinds of personal preferences as any of us. Some developer enthusiastic about XX will surely dig out more performance from it than someone who is not. Just like some developers enthusiastic about Lumias got Windows 95 and Windows 10 running on them...
     
  3. Asgardi

    Asgardi Guest

    Messages:
    248
    Likes Received:
    14
    GPU:
    MSI GTX 980 Ti OC
    Yep. Above are the guys who basically get their money from Sony. PS4 was the best thing ever after how difficult and expensive it was to develop for PS3. And now PS5 is the greatest leap ever after how difficult and expensive it was to code for PS4 and PS3... Right. And have fun making those PS4 games compatible again.

    Below are the guys simply speculating with information coming from the opposite side. Though there are a couple of (I think unmentioned) points which support those claims:
    - Some game release dates are being pushed back, even though right now, with everybody at home, it would make sense to rush for release as quickly as possible. So is there even a console to release for?
    - Phil just stated that even if this is the case, nothing will stop XX from releasing on time. This would actually be a huge win for them: releasing well before the holidays with the more powerful system while your competitor slips to next year.
    - Internally at Sony, the coronavirus would be a perfect thing to blame for missing the holiday launch, even if it has nothing to do with the console's heating issues. So they might more easily be considering it as a viable option.

    Though this is just speculation based on nothing :) After all, PS is too big a part of Sony to be screwed up. But you can only be proven right after making your guess public.
     
    Loobyluggs likes this.
  4. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Neither console will use GCN, so neither inherits its limitations. If AMD's claim is true that RDNA2 provides a similar jump in optimization over RDNA1 as RDNA1 did over GCN, then that "raw power" at marginally lower clocks will win.
    Or do you think that having a 15% higher sustainable clock will matter over XSX's higher raw power?
    Do you think that once raytracing is used, it will benefit from clock and not from the actual HW available to be used?
    Like it or not, XSX is an over 40% more robust GPU, and its smallest advantage is ROP count, at just 25% more. But you know that ROPs are not even going to be the decisive factor.
    Raytracing will be done via shaders+TMUs. And data will hopefully be in caches as needed, since that's one area AMD focused on a lot with RDNA.
    And if not? Who has the higher chance of delivering data as fast as possible: XSX with 560 GB/s for game code/resources, or PS5 with 448 GB/s shared between games and system?
    - - - -

    And talking about clock advantage: while PS5's boost looks nice on paper, its base guaranteed clock will be just 2 GHz, which is just under 10% more than XSX's clock.
    So what kind of architectural bottleneck, alleviated by a 10% clock-speed advantage, will outweigh a good 30% higher total raw power? (Mind that this is not GCN. Even there, it would be like comparing an HD 7970 at a 10% higher clock to an R9 290X. The HD 7970 never won.)
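As a sanity check on those percentages, here is a back-of-envelope Python sketch (the helper name is mine; CU counts and clocks are the publicly quoted figures) using the usual formula: FP32 TFLOPs = CUs × 64 lanes × 2 ops per FMA × clock.

```python
# Rough FP32 throughput estimate for an RDNA-style GPU:
# each CU has 64 shader lanes, and an FMA counts as 2 ops per clock.
def tflops_fp32(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

xsx = tflops_fp32(52, 1.825)       # locked clock
ps5_boost = tflops_fp32(36, 2.23)  # variable boost clock
ps5_base = tflops_fp32(36, 2.0)    # the ~2 GHz figure discussed above

print(f"XSX {xsx:.2f} TF vs PS5 {ps5_boost:.2f} TF boost / {ps5_base:.2f} TF at 2 GHz")
print(f"XSX advantage at PS5 base clock: {100 * (xsx / ps5_base - 1):.0f}%")
```

Against the boost figure the gap is about 18%; against the 2 GHz figure it is about 32%, which is where the "30% higher total raw power" above comes from.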
     

  5. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    5,221
    Likes Received:
    1,590
    GPU:
    RTX 3060 12GB
    ORLY?


    Care to retract your statement? Good lord...Sony is balls deep in electronics manufacturing.

    I got love for you, you keep people on their toes for realies, but c'mon, that was too easy to refute.
     
    Last edited: Apr 11, 2020
  6. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,016
    Likes Received:
    7,355
    GPU:
    GTX 1080ti
    Sustainable clocks will always trump adding more of something.
     
  7. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    No.
    I could have worded it better, but these days they pay other people to make far too many of their products compared to the 90s.

    It was you who started the whole "If they pay other people to make stuff then..." argument.

    Sustainable will be the key.
    I was also thinking, and I don't have much experience with modern GPUs and even less with AMD hardware, but won't higher clock speeds mean higher temps and increased noise?
     
    Last edited: Apr 9, 2020
  8. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Not in GPU technology, and you know it. GPUs are not based on some stupidly poor threading. Or have you forgotten what SIMD stands for?
     
  9. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,016
    Likes Received:
    7,355
    GPU:
    GTX 1080ti
    We aren't talking about 1:1 GPU technology though; there are customizations in each that alter the performance projections, and the base toolsets actually matter a lot.

    Xbox hasn't gone half as far in describing what gives their chips a further edge.

    Sure, later toolsets might give Xbox the edge, but by then we'll have hardware refreshes for each to further remove redundant hardware and add anything the other needed.
    It wasn't until the GameCube was out that developers had the N64 figured out well enough to give us aggressively optimized stuff, like the later games from Factor 5.

    On the memory segmentation, DF is WRONG (not surprising, they almost always are about hardware).
    You can't rely on the automatic distribution of game data in a singular memory pool that consists of two separate speeds of memory.

    And there are significant latency problems to factor in that Sony has avoided by going with one singular pool of memory.

    The Xbox Series X will allow for significant performance, with sufficient hardware specific optimization, much like DX12 can do, but often doesn't.

    The console that offers the most straightforward development will perform the best on average.
     
  10. Turanis

    Turanis Guest

    Messages:
    1,779
    Likes Received:
    489
    GPU:
    Gigabyte RX500
    The success of a console depends totally on what good games it has.
    PlayStation was the first console for many back in the day, and it can always stay on top of the gaming consoles.

    PS5 vs Xbox Series X: if there will be a Last of Us 2 or 3, a Bloodborne 2 or 3, or a Horizon Zero Dawn 2, then the Xbox doesn't have very good chances in the market.
     

  11. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    No it does not, unless you can point to the exact part of the GPU that is bottlenecked, that can improve only through clock on RDNA1, and that for some reason was not improved for RDNA2.
    Sony made a bet on AMD failing to improve the GPU. But that bet was wrong, because AMD already improved drastically with RDNA1. Even if both consoles were based on RDNA1 without any additional improvements, that tiny clock difference would not offset the benefit of having a brutally more powerful GPU all around.

    You can dance around PS5 all day, but it will not outperform XSX. XSX has over 30% higher shader performance for a start, and that closely correlates with raytracing performance.
    If you think some ancient part of the pipeline will be the bottleneck, you are in for an unpleasant surprise. Because a game that runs at 60fps on XSX with raytracing, shaders, and high resolution will run at 45fps on PS5 at the same image quality.

    Much lower shader power and lower ROP performance can be alleviated by lower resolution. But even 1080p will not let PS5 win at the same image quality. You may be talking to 720p console gamers, but I doubt you'll manage to find more than 10 of them. And there, it will be about equal.
    - - - -
    Look at the freaking RTX cards with DXR enabled. Do you think some part of Nvidia's architecture that could increase performance only through higher clock matters once you start DXR? No, it does not. Because fps drops a lot compared to DXR off, and the part that benefits only from clock is now quite underutilized compared to the other parts of the GPU that handle the parallel brute-forcing.
    - - - -
    And as far as storage goes, Sony's presentation revolved mainly around storage, because that was about all they could make look good.
    MS simply said they have 1TB of internal storage at 2.4 GB/s raw and around 4.8 GB/s for compressed data, and that they have a hardware decompression block. They simply felt no need to boast about hardware-level compression.
    Then they said they have expansion capability via another 1TB "card" that behaves exactly the same way as the internal storage.
    Will anyone cry if they RAID it?
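Those quoted figures work out as follows (a trivial Python sketch; the helper name is mine, and the ~2:1 average compression ratio is what the 2.4 vs 4.8 GB/s pair implies):

```python
# Effective read bandwidth = raw SSD bandwidth x average compression
# ratio that the hardware decompression block can keep up with.
def effective_gbps(raw_gbps: float, compression_ratio: float) -> float:
    return raw_gbps * compression_ratio

xsx_raw = 2.4                                   # GB/s, quoted raw throughput
xsx_compressed = effective_gbps(xsx_raw, 2.0)   # ~2:1 typical game data
print(f"{xsx_raw} GB/s raw -> {xsx_compressed} GB/s effective")
```

The same formula applies to any such SSD pipeline: the better the data compresses, the higher the effective delivery rate, without the drive itself getting any faster.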
     
    Last edited: Apr 9, 2020
  12. Ricepudding

    Ricepudding Master Guru

    Messages:
    869
    Likes Received:
    276
    GPU:
    RTX 4090
    I disagree, lowest common denominator isn't a thing anymore. This generation has shown that quite a bit, otherwise the games would all be made for the OG Xbox One, which many games struggle to run on, barely hitting 900p (often lower) and struggling to reach 30fps. I think they are just made for the most common denominator, or even the easiest-to-run denominator. Hence why the base PS4 sometimes runs better than even the Xbox One X (maybe less clarity in resolution, but often more stable fps).

    Consoles may limit graphics here and there, but that's more on developers not wanting to make games like Crysis anymore that push the boat out. I get what you are saying, but PC still often gets better graphics, and we even got ray tracing and Nvidia GameWorks and so on. We still get amazing graphics features, and that hasn't stopped just because of consoles. If that were the case, we would only be getting ray tracing after the consoles came out.

    And a 10.28 TFLOP GPU is far from crap, or are you saying an RTX 2080 is crap? Cause this technically has more TFLOPs, not that TFLOPs are much to go by. At the end of the day, consoles are often about their exclusives and don't have too much of a bearing on PC gaming as a whole, especially if we are going to be getting mid-gen consoles from now on.

    Also, as an end note, were Sony and MS supposed to put a 20 TFLOP GPU in their consoles or something? Cause for once these consoles are getting pretty high performance.
     
  13. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    10.28 TFLOPs on PS5 is peak-clock performance. They guarantee only 2 GHz, so expect something closer to 9.2 TFLOPs in shader/raytracing-heavy games. In other words, it will sustain the boost clock only in light games which don't need much GPU anyway.
    And they have "around" 20 TFLOPs only in FP16, and only 18.4 of that at base clock for PS5, while XSX has a sustainable 24.3 TFLOPs of FP16.
     
    Neo Cyrus likes this.
  14. nick0323

    nick0323 Maha Guru

    Messages:
    1,032
    Likes Received:
    77
    GPU:
    Asus DUAL RTX2060S
    Wow this thread. This forum is so...
    AMD vs Intel
    AMD vs Nvidia
    Microsoft bashing but still uses Windows and now...

    Playstation vs Xbox
     
    HonoredShadow likes this.
  15. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,016
    Likes Received:
    7,355
    GPU:
    GTX 1080ti
    XSX is NOT sustainable.

    Microsoft hasn't even spoken about how their clocks work, and I don't trust DF to know enough about hardware to believe the clocks are locked just because Microsoft said so.
     

  16. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Because they clearly stated they use fixed clock speeds for both CPU and GPU. Again I'll repeat myself: "You must have missed it or forgotten."
    The clock is locked, therefore performance will be sustained.
     
  17. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,016
    Likes Received:
    7,355
    GPU:
    GTX 1080ti
    No.

    That doesn't even work on a PC with a fixed overclock; performance will change with the type of processing being performed.

    Enjoy the variable performance of your lazy Unity ports on the XSX though, the PS5 will run them better just because the console is designed to do the work for those lazy cash-grab devs.
     
  18. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    It does not. This is not Intel, which overheats at certain workloads and has to downclock.
    They have a fixed 1.825 GHz on the GPU. What changes is power draw, depending on load and type of workload.

    They stated they use a fixed clock, not a fixed TDP limit. You're mixing apples with oranges, and assuming that because AMD's desktop GPUs use a power limit, the console will do the same.

    AMD did what MS asked of them, just as they did what Sony asked of them.
    With your way of thinking, you must be shocked by the fact that PS5 can lock its CU count and clock down to the exact count/clock Sony requested for each compatibility mode, because RDNA on PC can't really do that.
     
  19. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    Idk lol.. lots of definitive statements being made about two unreleased consoles with an unreleased architecture.

    I'm sure Sony went for a smaller chip with higher clocks because they wanted to save money, not because they bet on AMD failing to improve the GPU. I also believe Microsoft wanted predictable performance with its console, so even if the clocks vary slightly, I'm sure the "static" clocks make it easier to predict. I imagine most devs will use the 2.5GB of slower memory for CPU-related tasks and keep the graphics within the fast 10GB. How that's going to affect performance compared to devs on Sony's side using the full 16 is another story.
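A hypothetical sketch of that split-pool placement policy in Python (the pool sizes are the publicly quoted XSX figures; the function, its name, and the spill-over rules are my own illustrative assumptions, not anything Microsoft has documented): GPU-visible resources prefer the fast 10 GB, other data falls back to the slower pool.

```python
# XSX memory: 10 GB GPU-optimal (560 GB/s) + 6 GB standard (336 GB/s),
# with roughly 2.5 GB of the standard pool reserved for the OS.
FAST_POOL_GB = 10.0
SLOW_POOL_GB = 6.0
OS_RESERVED_GB = 2.5

def place_allocation(kind: str, size_gb: float,
                     used_fast: float = 0.0, used_slow: float = 0.0):
    """Pick a pool for a new allocation: GPU data prefers the fast pool,
    CPU/game-logic data prefers the slow pool, spilling over as a fallback."""
    if kind == "gpu" and used_fast + size_gb <= FAST_POOL_GB:
        return "fast"
    if used_slow + size_gb <= SLOW_POOL_GB - OS_RESERVED_GB:
        return "slow"
    if used_fast + size_gb <= FAST_POOL_GB:
        return "fast"
    return None  # would not fit in either pool
```

The design question the post raises is exactly this: a developer (or the OS allocator) has to make this placement decision explicitly on XSX, whereas on PS5's single uniform pool it never arises.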

    But yeah, idk Fox - in the past few threads you've been making really specific statements about RDNA2 with no evidence. In one post, which I didn't respond to in the other thread, you were arguing about the density of the SoC, but you didn't even know how many transistors the SoC has: in your post it ranged from "a minimum of 18B" based on your estimate to the 15.3B that's reported, and you're making assumptions about both. You're talking about how impressive the density improvement is for RDNA2 based on an SoC with a ranging transistor count; meanwhile, Renoir is 20% denser than the numbers you were coming up with, and that has Vega. In that same thread you were saying Nvidia was surprised by AMD's performance density, and in this thread you're saying Sony just doesn't understand AMD's GPU. I find it hard to believe that these companies, which employ engineers who have worked on this stuff their whole lives and have full access to the process/architecture/performance, are in a worse position to gauge it than you.
     
    Last edited: Apr 9, 2020
    yasamoka likes this.
  20. Ghosty

    Ghosty Ancient Guru

    Messages:
    7,962
    Likes Received:
    1,177
    GPU:
    RTX 3050
    There might be some truth in this. Watching the Mark Cerny tech talk gives you an idea how clever the guy is with software and coding. He even built his own custom audio solution to compete with AMD's.
     
