ZEN2 8c/16t with NAVI GPU Spotted - AMD Console chip?

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 24, 2019.

  1. tsunami231

    tsunami231 Ancient Guru

    Messages:
    10,831
    Likes Received:
    646
    GPU:
    EVGA 1070Ti Black
    Call me skeptical, but both Sony and MS said their CPUs were woefully underpowered and that they won't make that mistake again, yet to me it looks like it's gonna happen again. I get that most consoles try to stay under 300 watts, and for the most part around 200 watts, but these low clocks still worry me. There is only so much work that can be offloaded to the GPU to make up for the CPU's lack of power, unless there is a huge difference between the Jaguar chip and Zen 2 at the same clocks.

    In any game heavy on physics, performance suffered greatly due to the CPU being bad; particle effects too. FPS generally plummeted in those games.
     
  2. Ricardo

    Ricardo Member Guru

    Messages:
    150
    Likes Received:
    93
    GPU:
    1050Ti 4GB
    Console makers learned their lesson over the years not to overspec their machines. If you look at the historical sales of most consoles and handhelds, the vast majority of the best sellers in their respective generations were the lower-spec'd ones - Game Boy, DS, 3DS, PSOne, PS2, Wii, PS4 (was cheaper at launch). So there's little reason to invest in cutting-edge hardware, since that doesn't translate into more sales, and selling hardware at a loss is Business 101 bad practice.

    Also, at this point, there's very little standing in the way of making games run on lower-spec'd machines - just look at some Switch ports.

    So, I believe that consoles won't be using high spec hardware ever again. Maybe console makers will make some "premium" SKUs like the PS4 pro or XB1X as an option, but that's as far as they will go, IMHO.
     
    schmidtbag likes this.
  3. tsunami231

    tsunami231 Ancient Guru

    Messages:
    10,831
    Likes Received:
    646
    GPU:
    EVGA 1070Ti Black
    Herein lies the problem: they both said their systems will be 4K and 8K capable, with ray tracing... Even higher-end GPUs have issues with that stuff. Most $300 GPUs can't even do 4K with acceptable fps, and $800 GPUs still struggle to maintain 60 fps at 4K, so forget it if ray tracing is added, and forget it some more if they really think 8K is possible for anything but movies.
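    Some quick napkin math on why (a rough sketch; GPU cost doesn't scale perfectly linearly with pixel count, but it's close for fill-rate-bound games):

    ```python
    # Rough pixel-count comparison between common render resolutions.
    resolutions = {
        "1080p": (1920, 1080),
        "1440p": (2560, 1440),
        "4K":    (3840, 2160),
        "8K":    (7680, 4320),
    }
    base = 1920 * 1080
    for name, (w, h) in resolutions.items():
        print(f"{name}: {w * h:,} px ({w * h / base:.1f}x the pixels of 1080p)")
    # 4K is 4.0x and 8K is 16.0x the pixel count of 1080p, which is why
    # a GPU that barely holds 60 fps at 4K has no chance at 8K.
    ```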


    I'm still waiting for consoles to do 1080p @ 60 as a requirement, because sub-1080p is unacceptable and 30 fps is only acceptable in slow-paced, slow-moving games. The only time 30 fps is acceptable in a fast-paced game is when there are ZERO frame-pacing issues, which is still a problem, and even then 60 fps is preferable.
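    Frame pacing is easy to check if you can log frame times. A minimal sketch with made-up timestamps (the 5 ms tolerance is arbitrary):

    ```python
    # A "30 fps" game is only smooth if every frame lands close to the
    # 33.3 ms target; the average fps can look fine while individual
    # frames stutter. Timestamps below are hypothetical.
    TARGET_MS = 1000 / 30          # ~33.3 ms per frame at 30 fps
    TOLERANCE_MS = 5.0             # arbitrary threshold for a "paced" frame

    timestamps_ms = [0.0, 33.4, 66.7, 116.9, 133.3, 166.8]
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

    for i, d in enumerate(deltas, start=1):
        if abs(d - TARGET_MS) > TOLERANCE_MS:
            print(f"frame {i}: {d:.1f} ms (off target by {d - TARGET_MS:+.1f} ms)")
    # The average here is still ~33.4 ms ("30 fps"), yet frames 3 and 4
    # are flagged: a 50 ms hitch followed by a 16 ms runt frame.
    ```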

    There's a huge difference in how, say, Nioh runs in performance mode, which is for the most part 60 fps at 720p (or lower or higher), with huge drops when particle effects are used, versus Nioh in movie mode at 1080p but locked to 30 fps, still with the same huge drops. Even though Nioh looks like crap in performance mode, it plays 1000% better at 60 fps, and anyone who plays at 30 and then 60 fps will say the same thing unless they're blind.

    The CPU has to be strong enough to at least keep up with the GPU it's coupled with, and not be underpowered like it was with the PS4/Xbox One; even with the Pro/X, the CPU was still way underpowered.
     
  4. Venix

    Venix Maha Guru

    Messages:
    1,375
    Likes Received:
    510
    GPU:
    Palit 1060 6gb
    To be fair, the XB1 also struggles to sustain 1080p/30; hell, often it's 900p or 720p or somewhere odd in between.
     

  5. Ricardo

    Ricardo Member Guru

    Messages:
    150
    Likes Received:
    93
    GPU:
    1050Ti 4GB
    Them saying that the consoles will be "4K" and "8K" capable is simply marketing fluff. Games don't have to be rendered at those resolutions for the console makers to say they are supported. The same logic applies to ray tracing - the fact that they have some amount of it is enough for bragging rights, but the amount actually implemented might be even smaller than what we currently have in PC games.

    Most likely we'll see games running at around 1440p or under with ray tracing, or near 4K@30fps without ray tracing. Anything beyond that is a pipe dream, or the games will sacrifice graphical fidelity/complexity for it. I mean, 60 fps isn't anything new; you just have to take your target hardware into consideration before designing your graphics pipeline. But most console players prefer image quality over FPS, so...
    Yeah. It depends on a lot of things.
     
  6. Kaspar Kople

    Kaspar Kople New Member

    Messages:
    1
    Likes Received:
    0
    GPU:
    Matrox
    1.8GHz is most likely the idle clock. Consoles have fixed performance, so it should run at 3.2GHz locked in games.
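    You can watch the same idle-vs-load behaviour on a PC. A rough sketch, assuming a Linux box that exposes cpufreq through sysfs:

    ```python
    # Sample the current CPU clock at idle and under load, analogous to
    # the 1.8 GHz idle / 3.2 GHz game clock theory above.
    # Assumes Linux with the cpufreq sysfs interface available.
    PATH = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq"

    def current_mhz() -> float:
        with open(PATH) as f:
            return int(f.read()) / 1000  # the file reports kHz

    print(f"idle-ish: {current_mhz():.0f} MHz")
    _ = sum(i * i for i in range(5_000_000))  # crude busy loop to force a boost
    print(f"loaded:   {current_mhz():.0f} MHz")
    ```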
     
  7. ObscureangelPT

    ObscureangelPT Master Guru

    Messages:
    545
    Likes Received:
    64
    GPU:
    Zotac GTX 1650 Supe
    What people forget is that consoles have very low-level APIs, which can work miracles.
    Get a Jaguar-core-based CPU on PC; even at higher clocks it struggles to match that same CPU in the consoles.

    Despite that, I won't be paying 360€ for an R7 3700, it's nuts; I will get an R5 3600 anyway.
    But I absolutely know that CPU-bound games will be a huge issue once developers take advantage of the "Ryzen of the consoles"; imagine how much power they can unlock with that CPU on a low-level API.

    I'm honestly worried!
     
  8. tsunami231

    tsunami231 Ancient Guru

    Messages:
    10,831
    Likes Received:
    646
    GPU:
    EVGA 1070Ti Black
    Oh I get that, but like I said, they're still struggling with 1080p 60fps. Until that becomes a mandatory bar all games MUST achieve, I couldn't care less about 4K or higher; if 1080p @ 30 fps is unacceptable to me, so is 4K @ 30. And most console-only players are oblivious to the difference between 720p and 1080p, and between 30 fps and 60 fps.

    PC GPUs aren't ready for 4K@60fps either, at least not the GPUs that would be used in a console.

    I am amazed by what the PS4 and Xbox One did considering how lacking the CPU was, and it definitely shows in games that are heavy on physics and particle effects. If they had not been able to offload the majority of the work onto the GPU, which was much more powerful, things could have been much worse. A lot fewer games ran at 1080p on the Xbox One than on the PS4; in most cases the Xbox One did 900p where the PS4 did 1080p. Most console people would never have noticed, because they don't know any better.

    There is a huge difference between 720p and 1080p, and between 30fps and 60fps, just like scaling 900p up to 1080p is different from actually rendering at 1080p.
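    That 900p-to-1080p case is a non-integer 1.2x stretch, which is exactly why it blurs: every output pixel is interpolated from a blend of source pixels instead of mapping 1:1. A toy sketch using Pillow (the input filename is hypothetical):

    ```python
    # Why 900p upscaled to 1080p looks soft: 1600x900 -> 1920x1080 is a
    # non-integer 1.2x stretch, so output pixels are interpolated.
    # Assumes Pillow is installed; "frame_900p.png" is a hypothetical file.
    from PIL import Image

    src = Image.open("frame_900p.png")            # 1600 x 900 render
    assert src.size == (1600, 900)

    scaled = src.resize((1920, 1080), Image.BILINEAR)  # soft 1.2x upscale
    scaled.save("frame_1080p_upscaled.png")

    # 1600*900 = 1.44 MP vs 1920*1080 = 2.07 MP: the console only renders
    # ~69% of the pixels a native 1080p frame would have.
    ```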
     
    Last edited: Jul 24, 2019
  9. fry178

    fry178 Ancient Guru

    Messages:
    1,602
    Likes Received:
    235
    GPU:
    2080S WaterForceWB
    @fantaskarseff
    So you're surprised that a console running in limited space (vs a computer), that has to stay within a general power/cooling limit of 300-400W, doesn't outperform a PC.
    So basically expecting a 2.5L 5-cyl to have the same output as a 2.5L V12 that revs twice as high...


    And anything related to 4K/RT: if you are able to implement it, even if it's just one game, it's not a lie.
    Not once did I hear Sony talking about games running at 4K (render resolution), so as long as it's 4K out, it's 4K out.
    And upscaling works pretty well, or people wouldn't have been willing to spend $10 a ticket to watch a 3D movie (upscaled to 4K) when there was no (native) 4K content/signal.

    And like many others, low input lag/high refresh rates aren't the goal for 60-80%+ of console buyers, and I hope it stays that way. If you need more than 60Hz, buy a PC.
    I want to sit down and play on a TV with 1-4 people,
    and I don't want it to look like crap just so I don't see a stutter every now and then (Ace Combat 7 @ 4K).
    So far companies like to sell products that favor the largest groups.
    Or do you expect me to believe you buy your groceries in an 800hp+ 2-door super sports car because it handles/performs 2-3 times better than the avg $30k car...
     
  10. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    9,535
    Likes Received:
    453
    GPU:
    GTX 1080 Ti @ 2GHz
    Don't underestimate their ability to intentionally butcher PC ports. Same x86 code? Naw, let's frack it as hard as theoretically possible and make it single-threaded, even though the console versions, all of them, are 8-threaded for the 8-core Jaguar chips.
     
    Ricardo and airbud7 like this.

  11. airbud7

    airbud7 Ancient Guru

    Messages:
    7,835
    Likes Received:
    4,739
    GPU:
    pny gtx 1060 xlr8
    A single core from a 3800X or 9900K will blow away 2-3 cores from a low-power console.

    The sheer power of it.
     
    Evildead666 likes this.
  12. tsunami231

    tsunami231 Ancient Guru

    Messages:
    10,831
    Likes Received:
    646
    GPU:
    EVGA 1070Ti Black
    You would think, now that it's all running on actual PC-architecture CPUs, that it would be theoretically impossible to frack up a port, or at the very least that the AMD systems would be fine, seeing as it's all AMD hardware based on their PC hardware now. But it still happens, and the blame is solely on the developers. Most PC ports have a "make it work, forget about getting it to work correctly" mentality.

    The PC port of Nioh is one of the worst ports, from a performance standpoint, that I actually own. The only thing I know of that was worse was Batman: Arkham Knight on PC, which was so bad it got pulled from Steam a few times.

    I'm sure the CPU is better than what the X/Pro has, but how high is that bar really? The PS3's Cell CPU had more raw power than the PS4's CPU, mind you not by much (and that's putting aside how hard Cell was to program for; when done right it had more raw power). I hope this time it's not barely more powerful, or slightly less powerful. 1.5x to 2x the performance at minimum would be nice, so the GPU isn't sitting there waiting for data from the CPU. It would be great for physics and particle effects too, so they don't just castrate the framerate.

    I just think in this day and age 1080p@60 (stable) should be the bare minimum anything runs at, because it should no longer be an issue. We have 4K and now 8K being pushed, and they can't even give us 1080p@60fps. The fact that fixed-resolution display tech makes anything other than its native res look blurry just adds to the frustration.

    Which is why I'm still running a 1080p monitor on my PC.
     
    Last edited: Jul 25, 2019
  13. airbud7

    airbud7 Ancient Guru

    Messages:
    7,835
    Likes Received:
    4,739
    GPU:
    pny gtx 1060 xlr8
    Sorry if OT... but I have an old 2600K/GTX 1060 and an Xbox One, and my PC blows it away. I have wireless controllers for both and the same games on both... there is no competition against my PC; it simply looks better and is faster @1080p.
     
  14. Venix

    Venix Maha Guru

    Messages:
    1,375
    Likes Received:
    510
    GPU:
    Palit 1060 6gb
    People claim that you must buy a 3700X because of the next consoles... I disagree; you can totally go with a 3600. We can all safely assume it will be enough for 4 years; games will not suddenly go "8 cores or go home". So buying a 6-core today with a decent-VRM B450 motherboard like the Tomahawk is more than enough, and if worst comes to worst... you can always buy a used 3900 in 4 years. How much will that be by then? 100? 200? USD?

    All I said becomes null if you actually do more tasks that need more cores.
     
    HandR, airbud7 and Evildead666 like this.
  15. JamesSneed

    JamesSneed Maha Guru

    Messages:
    1,052
    Likes Received:
    430
    GPU:
    GTX 1070
    @Denial @schmidtbag You guys had an interesting discussion. Considering this is gaming, I suspect the 3.2GHz boost of the console chip comes into play a lot, since the main thread will likely be running at boost pretty much always. It's hard to figure this out with games because, a large percentage of the time, the main thread dictates a lot of the performance we measure as fps, as the other threads are carrying lighter workloads like, say, physics. Since the 3.2GHz boost is 24% less than the 3600's boost of 4.2GHz, I suspect in gaming we will see performance closer to the 24% difference of the boost clocks than the 50% difference the base clocks imply. I think a low base clock but a decently high boost is a very intelligent design for something like a console that is this thermally limited. AMD has essentially introduced a virtual big.LITTLE design which I highly suspect will have more gaming performance than a design with higher base clocks but a lower boost clock to stay within the same TDP.
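    A quick sanity check on those percentages:

    ```python
    # Sanity-check the clock gaps quoted above (rumoured console chip vs. 3600).
    console_base, console_boost = 1.8, 3.2   # GHz, reported console clocks
    desktop_base, desktop_boost = 3.6, 4.2   # GHz, Ryzen 5 3600 spec

    base_gap = 1 - console_base / desktop_base      # 0.50 -> 50% lower base
    boost_gap = 1 - console_boost / desktop_boost   # ~0.24 -> ~24% lower boost

    print(f"base clock gap:  {base_gap:.0%}")
    print(f"boost clock gap: {boost_gap:.0%}")
    # If the lightly-threaded main thread sits at boost most of the time,
    # the ~24% figure is the one that should show up in fps.
    ```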
     

  16. waltc3

    waltc3 Maha Guru

    Messages:
    1,145
    Likes Received:
    355
    GPU:
    AMD 50th Ann 5700XT
    After being selected for the first Xbox contract, apparently in preference to ATi, nVidia was so brainy it sued Microsoft thereafter--one of nVidia's sincere "Thank You" strategies, I guess--whereupon Microsoft promptly paid nVidia the disputed sum (or part of it) in a settlement, and thereafter completely and permanently severed its business relations with nVidia and all things Xbox, immediately shuttling ATi in for the win--which should have been Microsoft's choice in the first place, imo. A classic case of nV's "cutting off your nose to spite your face" or "biting the hand that feeds you" corporate strategies, etc. I don't recall exactly what nVidia did to so fan the ire of Microsoft in that relationship, but the stench of it apparently remains inside Microsoft to this day. However, I do have many plausible--all too plausible--ideas about nVidia's conduct during that time--oh, brother, do I ever!

    Like other people have mentioned in this thread, I also don't care for the bogus "ray tracing" marketing drivel*, or the notion that right now "8k" is useful for more than 1-frame-at-a-time viewings.**

    *If you're curious about ray tracing, fire up programs like Blender, Cinebench, or Lightwave to discover what it is, and why not even nVidia's $1400 2080 Ti can come close to using any of these programs to ray trace @ 60 frames per second *gag*, not even one frame per second, or thereabouts. These programs can ray trace; the 2080 Ti cannot. Write a script controlling an RTX GPU to use Blender to ray-trace a frame, and, uh, ring me up when you break the 1 frame-per-second barrier... :D (If you ever got that fast it would be nigh miraculous.)
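    For the curious, timing that is trivial from Blender's bundled Python. A minimal sketch, assuming a .blend scene is already loaded and a GPU render device is configured in Blender's preferences:

    ```python
    # Time a single Cycles (path-traced) frame from inside Blender.
    # Run with:  blender scene.blend --background --python this_script.py
    import time
    import bpy

    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'
    scene.cycles.device = 'GPU'

    start = time.perf_counter()
    bpy.ops.render.render(write_still=True)   # render one frame to disk
    elapsed = time.perf_counter() - start

    print(f"one frame took {elapsed:.1f} s ({1 / elapsed:.3f} fps)")
    ```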

    Nothing is wrong with the fact that it can't ray trace, of course. No rasterizing 3D GPU made today can ray trace; rasterizers are made specifically to simulate visible ray-tracing results in an incredibly small fraction of the time actual ray tracing requires. What RTX is doing is generating, via 100% rasterization, scene lighting precalculated to resemble what the lighting in a ray-traced scene might look like. All the rest of it is pure marketing exaggeration and hyperbole. Anyway, it gets very tiresome every time you have to say "No it isn't" when the marketing department fibs and says "Yes it is--uh, but not really--but, yeah, it is, sort of..." etc. ad infinitum. Borrrrring-g-g-g-g...

    **Slap a couple of 5700XTs/2080 Tis in D3D12 multi-GPU mode and suddenly, I believe, even "8K" gaming becomes possible. Might not be very fast gaming, however! But in all seriousness *cough* I think we are still a few years shy of routine 8K gaming. How many years is anybody's guess. Will 8K even prove desirable, from a number of standpoints? That is the basic question, imo.
     
    Backstabak likes this.
  17. Denial

    Denial Ancient Guru

    Messages:
    13,111
    Likes Received:
    2,577
    GPU:
    EVGA 1080Ti
    So it can ray trace or it can't? The first sentence says it cannot, the second says it can.

    You have this weird obsession with repeating this, and no matter how many times you do it, it doesn't make it real.

    Like, I get that you hate Nvidia, and I understand that Nvidia's RTX launch has been less than ideal, but can we not just completely fabricate things about it?
     
    Last edited: Jul 25, 2019
  18. Backstabak

    Backstabak Master Guru

    Messages:
    654
    Likes Received:
    261
    GPU:
    Gigabyte Rx 5700xt
    Yeah, I work in research in fiber optics. We recently started using Zemax for ray tracing. If you use something like sequential mode, where the order of surfaces is known, it's quite easy to get results quickly. However, in non-sequential mode it can take a few minutes to get a single result, especially if you consider polarization, scattering, and ray splitting, because you end up with significantly more rays than you started with.
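    To see how fast splitting blows up, here's a toy count (simplified; real tracers cull low-energy rays):

    ```python
    # Toy model of non-sequential ray splitting: if every surface hit
    # spawns a reflected and a refracted child ray, ray counts grow as
    # 2^n with trace depth. Real tracers prune low-energy rays, but this
    # shows why non-sequential mode gets expensive so fast.
    initial_rays = 1_000_000        # e.g. roughly one ray per pixel
    for depth in range(0, 9):
        print(f"depth {depth}: {initial_rays * 2**depth:,} rays")
    # Depth 8 already means 256x the starting ray count.
    ```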

    I also think there is a huge problem of material characterization: to get accurate results you'd need to at least know the refractive index of each material and actually account for surface roughness.

    However, realistic lighting is an incredibly important step towards more believable graphics; it's just that PCs are not powerful enough for it today. Maybe there will be some revolutionary algorithm, or we'll get there through brute force and gradual improvements in HW, but right now all the RTX features are just a gimmick, at least from my point of view.

    8K is complete fantasy on these consoles; what I think it means is that they support that resolution for e.g. video playback.
     
    schmidtbag likes this.
  19. Astyanax

    Astyanax Ancient Guru

    Messages:
    8,052
    Likes Received:
    2,679
    GPU:
    GTX 1080ti
    The actual case notes for Nvidia vs Microsoft in 2002 indicate Microsoft was reneging on the negotiated price following a price cut to the Xbox in the European market.

    Microsoft (not Nvidia) took the case to arbitration, trying to force Nvidia to alter the contract, and Nvidia throughout the case continued to produce the chips at a loss as ordered by the court. Nvidia won the case, and Microsoft was made to pay the original amount as well as back pay for the chips produced while the order was in place.

    I don't really know why people get that case so wrong; Microsoft were the ones in the wrong. Nvidia wasn't after more money, it was Microsoft trying to pay less.
     
    Last edited: Jul 26, 2019
