AMD Ryzen 7000 processor announcements (preview)

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 30, 2022.

  1. Truder

    Truder Ancient Guru

    Messages:
    2,118
    Likes Received:
    1,099
    GPU:
    RX 6700XT Nitro+
    Seen it here https://wccftech.com/amd-ryzen-7000...c-230w-ryzen-5-7600x-up-to-90c-at-120w-rumor/

    While true, not stating the ambient temperature does make this leak's credibility questionable; we can only really take it as an estimate at best.

    If the leaker tested this sample at room temperature, between 18-21°C, with a known-good cooler, then it does illustrate my predictions and the comparison to prior AMD Zen 2/Zen 3 thermal behaviour regarding the difficulty of cooling these high-density chips. Conversely, if this was tested in a warm and humid environment, then these leaks are just useless.
     
    pegasus1 likes this.
  2. tty8k

    tty8k Master Guru

    Messages:
    964
    Likes Received:
    388
    GPU:
    3070 / 6900xt
    I have no confirmation on ambient, but it should be between 20-30°C.

    This, however, is worrying regarding cooling:

    [image: leaked temperature/power chart for the Ryzen 7000 lineup]

    Which makes the 7900X the "easiest" chip to cool in the launch lineup.
     
    pegasus1 likes this.
  3. suty455

    suty455 Master Guru

    Messages:
    568
    Likes Received:
    244
    GPU:
    Nvidia 3090
    Pssst, let's see what the real-world tests show, not rumours. Whatever it does, I will bet next year's wages that Intel's chips are hotter.
     
    Ivrogne and pegasus1 like this.
  4. tty8k

    tty8k Master Guru

    Messages:
    964
    Likes Received:
    388
    GPU:
    3070 / 6900xt
    Now it's rather clear why Apple developed their own efficient chips from scratch.
    AMD and Intel are just stacking engine on top of engine on a car to try to make it faster, so to speak.
     
    pegasus1 likes this.

  5. Ivrogne

    Ivrogne Member Guru

    Messages:
    134
    Likes Received:
    118
    GPU:
    It's a GPU, I guess
    Not really, they did it to maximize profit and fully control their hardware.
    Efficient? Well, it's an ARM design; of course they're efficient. It takes very little power to run an ARM chip.
     
  6. suty455

    suty455 Master Guru

    Messages:
    568
    Likes Received:
    244
    GPU:
    Nvidia 3090
    Why do folks insist on comparing ARM chips to x86? The software in Apple's case is a walled garden; you cannot step outside what they design, so no matter what you do they will design the whole system to show themselves in the best light. This is especially true when you look at threaded workloads: ARM chips work better on single threads, period, whereas x86 works far better across simultaneous threads. This is where the software side comes in; Apple's software is designed to take advantage of the architecture, whilst neither Intel nor AMD design the software. I'm guessing that if they did, it would look a whole lot better in the same specific scenarios.
    The truth is that Apple is a tiny part of the computer market and, frankly, about as relevant to the majority of home or office users as Linux. It's bespoke, and you either have to be a programmer (Linux) or have a specific personal need to lock yourself out of 80%-plus of the world's software (Apple). Why do you think Apple purposely chased and designed for graphic designers? Simply because it's a lucrative niche market where they can focus the software design.
    x86 is simply better for everyday use. Sure, ARM is better in terms of power consumption, but to achieve the same performance it needs more cores, and servers and users are normally licensed by the core; that's where the cost comes in and where the balance and parity between the two platforms meet. I can imagine a time, say on 3nm, where x86 and ARM have virtually the same power envelope.
    Apple's CPU is not "better"; it's designed for a specific niche userbase with proprietary software. How can you compare the two?
     
    Ivrogne and pegasus1 like this.
  7. tty8k

    tty8k Master Guru

    Messages:
    964
    Likes Received:
    388
    GPU:
    3070 / 6900xt
    Well, I do agree with you generally speaking; I just think they're not that niche anymore.
    It's not restricted to a few native design applications anymore; that's what it was 6+ years ago, and ironically they were running Intel CPUs back then. The niche segment has changed a lot lately, even for the home user, since Microsoft made almost their entire software suite run on Mac.
    For example, I know a few companies myself where every department runs on MacBooks, from digital/creative to dev and even marketing/administration.
    Imo the only big gap between Windows and Mac for the end user is gaming.

    I'm comparing them because they recently moved away from x86 chips for the reason we fail to admit today: more efficiency and less waste.
    And I particularly don't like the way x86 is going, chasing performance by adding 2% power consumption for less than a 1% gain, at greater cost to produce.
    For example, my MacBook Pro with the M1 Pro runs WoW Shadowlands in 4K at 80 fps at about 35 W (entire system, measured at the wall); my 9900K/3070 system in the exact same scenario at 80 fps draws about 280 W. The monitor is not included, but it's worth mentioning that the Mac still drives a 16" screen within that consumption, and it's virtually just a keyboard and a screen, not a huge heavy box (did I mention it also has a battery?).
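
    For scale, a back-of-envelope fps-per-watt comparison of those two readings (a rough sketch using the wall figures quoted above; whole-system numbers, so not a strict chip-to-chip comparison):

    [CODE]
# Rough fps-per-watt comparison of the two wall readings quoted above.
# Both are whole-system measurements, so this is only a ballpark illustration.
fps = 80
mac_watts, pc_watts = 35, 280      # watts at the wall, as reported above
mac_eff = fps / mac_watts          # ~2.3 fps per watt
pc_eff = fps / pc_watts            # ~0.29 fps per watt
print(f"M1 Pro laptop: {mac_eff:.2f} fps/W, 9900K/3070 desktop: {pc_eff:.2f} fps/W "
      f"(~{mac_eff / pc_eff:.0f}x difference)")
    [/CODE]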

    I still enjoy the PC because it's customizable, but frankly, when using both systems I can see a big gap in technological advancement; the PC is like a dinosaur that keeps getting bigger, heavier and more power hungry, not to mention it costs a lot lately.
     
    Last edited: Sep 2, 2022
    suty455 likes this.
  8. tty8k

    tty8k Master Guru

    Messages:
    964
    Likes Received:
    388
    GPU:
    3070 / 6900xt
    Now, back on topic: it seems the voltage in most of the leaked results is higher than it should be; maybe that's why AMD is having BIOS delays/issues.
    An undervolted 7600X dropped from 93°C/122 W to 56°C/68 W while holding a 5 GHz clock.

    In theory this should still be higher above 5 GHz, but at least it's not that bad.
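
    For context, dynamic CPU power scales roughly with voltage squared times frequency, which is why a modest undervolt moves the package power so much. A minimal sketch of that relation, assuming illustrative voltages (the 1.35 V and 1.05 V below are placeholders, not values from the leak):

    [CODE]
# Back-of-envelope check of why undervolting cuts package power so sharply.
# Dynamic power scales roughly as P ~ C * V^2 * f; the voltages here are assumed
# example values, not measured figures from the leaked 7600X result.
def scaled_power(power_ref, v_ref, v_new, f_ref, f_new):
    """Scale a reference power figure by the classic V^2 * f relation."""
    return power_ref * (v_new / v_ref) ** 2 * (f_new / f_ref)

# Assume the stock run sat around 1.35 V and the undervolt around 1.05 V, both at 5 GHz.
estimate = scaled_power(power_ref=122, v_ref=1.35, v_new=1.05, f_ref=5.0, f_new=5.0)
print(f"Estimated undervolted package power: {estimate:.0f} W")
# Prints ~74 W; leakage current also falls with voltage, which helps explain the reported 68 W.
    [/CODE]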
     
    Last edited: Sep 2, 2022
    pegasus1 likes this.
  9. shady28

    shady28 Member

    Messages:
    33
    Likes Received:
    8
    GPU:
    GTX 2060
    The 7600X benchmarks that showed up on Geekbench and are being used for comparisons were run at 5.4 GHz.

    So lop 7.4% (the delta between 5.4 and 5.0 GHz) off those scores and you'll wind up with roughly 12600K performance.
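
    As a quick illustration of that clock-for-clock rescaling, assuming performance scales linearly with frequency (the score below is a hypothetical placeholder, not an actual leaked result):

    [CODE]
# Rescale a leaked score from the boost clock it ran at to the expected stock clock.
# The score value is a hypothetical placeholder for illustration only.
leaked_score = 2200                 # hypothetical Geekbench single-core score at 5.4 GHz
f_leak, f_stock = 5.4, 5.0          # GHz
delta = 1 - f_stock / f_leak        # ~7.4% difference between the two clocks
scaled = leaked_score * (f_stock / f_leak)
print(f"Clock delta: {delta * 100:.1f}%  ->  scaled score: {scaled:.0f}")
    [/CODE]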
     
  10. pegasus1

    pegasus1 Ancient Guru

    Messages:
    2,197
    Likes Received:
    993
    GPU:
    ROG 6900XT@2.6Ghz
    The sole aim of a headline is to entice somebody to read the story. That might mean buying a paper, as we did in olden times, or clicking a link as the kids do now.
    Either and both of those generate income.
    Does the headline have to be accurate? Hell no, and neither does the story or article.
    It's all about the clicks; remember that, people.
    Trust nobody, believe nothing.
     
    bobnewels likes this.

  11. bobnewels

    bobnewels Master Guru

    Messages:
    825
    Likes Received:
    539
    GPU:
    RTX 3080
    Wow I agree 100%
     
  12. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    6,575
    Likes Received:
    3,762
    GPU:
    RTX 3060 Ti
    There's a 280 W Zen 4?
     
    suty455 likes this.
  13. suty455

    suty455 Master Guru

    Messages:
    568
    Likes Received:
    244
    GPU:
    Nvidia 3090
    Exactly!
     
  14. suty455

    suty455 Master Guru

    Messages:
    568
    Likes Received:
    244
    GPU:
    Nvidia 3090
    Shadowlands is hardly a taxing game, though; try something like Doom Eternal or Cyberpunk. My point is that the ARM architecture, and specifically in Apple's case the software, is cherry-picked to show the best possible results, whilst the PC deals with everything you can throw at it and has to be able to run both a turn-based game and a AAA title out of the same envelope.
    All work requires energy, which is usually converted to heat or movement etc. Modern PC chips and SoCs from all sectors are getting better at sipping energy rather than gobbling it up; look at Ryzen 7000 in comparison to the previous gen. If AMD's results are to be believed, it's a huge advance, as were the previous gens since Ryzen launched. Intel has its big.LITTLE-style solution, which simply is not as good consumption-wise overall, but I think in a couple of years, no matter the architecture, we will see parity as the competing structures and designs converge in CPUs. Graphics is a whole other ballgame, however, and it will be interesting to see the solutions they all come up with.
     
  15. nizzen

    nizzen Ancient Guru

    Messages:
    2,240
    Likes Received:
    977
    GPU:
    3x3090/3060ti/2080t
    A bit of overclocking on the 7950X and 350 W is reality :)

    Can't wait for a direct-die tool :D
     

  16. tty8k

    tty8k Master Guru

    Messages:
    964
    Likes Received:
    388
    GPU:
    3070 / 6900xt
    I will need a lot of convincing or beer to upgrade my system at this point lol.
    This is Forza Horizon 5 at native 4K, high/ultra settings, on a 9900K at 4.8 GHz / 6900 XT at 2.3 GHz.
    The CPU is still not a bottleneck there.

    [image: Forza Horizon 5 built-in benchmark results screenshot]
     
    Maddness likes this.
  17. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,374
    Likes Received:
    471
    GPU:
    6800 XT

    Tbh, you could build an SoC as big as Apple's on x86 and have it be roughly as performant at roughly the same wattage.

    Apple is utilizing a really wide design and a big package for their whole thing. That is the main reason they went solo: they can control everything.

    AMD and Intel cannot just start selling SoCs like that to everyone; it doesn't really work that well.

    The custom ones AMD made for the Xbox Series X and PS5, using quite old tech at this point on 7nm, draw 50-150 W in gaming. With 5nm it would be even better.

    Anyway, the point being that Apple does things really differently and they gain from it. The most basic M1 has 16 billion transistors, whilst the M1 Pro has 33 billion and the M1 Max 57 billion. Compare that to Navi 21 alone at 26.8 billion, whilst the CPUs Intel and AMD produce are relatively low in transistor count, with one CCD having around 4 billion.

    Anyway, having that wide design is an advantage for them; it allows them to run more instructions per cycle.
     
  18. Aura89

    Aura89 Ancient Guru

    Messages:
    8,346
    Likes Received:
    1,432
    GPU:
    -
    Honest question, since I am unfamiliar with the game, though I have looked up other people's benchmarks and it's making me wonder: is it not?

    The CPU render line is all over the place while the simulation line isn't, and I don't know the difference between them; like I said, I don't know the game or the benchmark, but it's the only screenshot I've seen with the CPU render all over the place like that.

    And looking at it further, the GPU Limited percentage should be near 100% to indicate no CPU bottleneck, from what I've been able to find about what that percentage means (it seems backwards to me, but every benchmark I've seen bears it out: the worse the CPU, the lower the GPU Limited percentage).

    Everything I can find shows your CPU is bottlenecking Forza Horizon 5, unless all the information I can find is just wrong; again, I'm not familiar with the benchmark or the game. Though it does not appear to be a "massive" bottleneck.
     
    Last edited: Sep 6, 2022
