AMD Epyc 7763 CPUs break Cinebench world record, crush Intel Xeon in performance

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Apr 19, 2021.

  1. Kaarme

    Kaarme Ancient Guru

    Messages:
    3,516
    Likes Received:
    2,361
    GPU:
    Nvidia 4070 FE
    They can, it'll just take a bit more time. In the meantime they're still making as much money as ever, so business-wise, let alone existence-wise, they have nothing to worry about. They kicked out the technology-hater CEO, so things ought to be rolling for them again in the near future, once research starts to bear fruit. Of course it also depends on how well they get their 7nm process working, but I wouldn't expect a catastrophe there. They had giant problems with 10nm, but as we all know, a failure can teach more than ten successes.
     
  2. anticupidon

    anticupidon Ancient Guru

    Messages:
    7,898
    Likes Received:
    4,149
    GPU:
    Polaris/Vega/Navi
    What about the security?
    Anything to say about proven security and vulnerability mitigations?
     
    beedoo, carnivore and schmidtbag like this.
  3. illrigger

    illrigger Master Guru

    Messages:
    340
    Likes Received:
    120
    GPU:
    Gigabyte RTX 3080
    The market segment that actually makes the money for these companies is enterprise customers.

    The costs in a datacenter are, as has been said, driven by thermals more than anything else - the highest cost of running a DC is cooling it, followed by redundancy, power, and rent on the space. You buy new servers every 5 years (longer, if you don't care about warranties), but you have to cool and power them 24/7, UPS systems have to be maintained with battery swaps every 2-3 years, plus you pay people to maintain the facility and the servers in it.

    As a result, most companies who operate their own servers lease space in hosted datacenters to alleviate complexity. The average cost of operating a Tier IV datacenter (99.995% uptime) is around $25,000 a month per square foot, or around $750,000 per rack (datacenters treat a rack as 30 sq ft, to account for cooling space and the doors opening). And yeah, this is *still* considerably cheaper than AWS/Azure/Google/etc. for the same compute power - cloud hosting is super expensive for high-data-throughput applications. Therefore getting your compute into as small a physical area as possible is key to making money on it - density is king.

    To put that in perspective, a fully kitted-out Dell R7525 server with the chips in the article costs around $75,000, and you can typically fit 16-18 of them in a single 42U rack, with typical server lifespans being 5 years. Server costs are pretty small compared to the cost of running them.
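
    The back-of-envelope math behind that density argument can be sketched out. A minimal sketch, using only the figures quoted above ($25,000/sq ft/month, 30 sq ft per rack, 16 servers at roughly $75,000 each, 5-year lifespan) - these are the post's estimates, not authoritative industry data:

    ```python
    # Rough rack economics over a 5-year server lifespan.
    # All figures are the poster's estimates, not authoritative data.

    rack_cost_per_month = 25_000 * 30   # $/sq ft/month * 30 sq ft per rack
    servers_per_rack = 16               # low end of the 16-18 range quoted
    server_price = 75_000               # fully kitted Dell R7525, per the post
    lifespan_months = 5 * 12

    hardware_total = servers_per_rack * server_price        # one-time: 1,200,000
    hosting_total = rack_cost_per_month * lifespan_months   # recurring: 45,000,000

    # Hosting dwarfs hardware: 37.5x over the lifespan under these assumptions,
    # which is why density (compute per sq ft) matters more than server price.
    print(hosting_total / hardware_total)
    ```

    Under these numbers, the rack's hosting bill is dozens of times the price of the servers inside it, which is the point the post is making.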
     
  4. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
    Tell that to some of the largest corporations and organizations in the world that bought them anyway.
     

  5. anticupidon

    anticupidon Ancient Guru

    Messages:
    7,898
    Likes Received:
    4,149
    GPU:
    Polaris/Vega/Navi
    @tty8k
    Reliability, as in CPU platform?
    Well, server platform manufacturers have a standard, as in reliability and hardware parts chosen.
    As in proven reliability through the years? OK, sounds logical, but also sometimes a new platform should and would be implemented.
    Nobody can see the future, but everyone can see the past: higher electricity bills, more heat, and vulnerabilities.
     
  6. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    Um... 10 watts more TDP, even if in the real world it really does use 10 watts more, for better performance, means datacenters can get more done for cheaper. Yes, the total "package" costs more to run, but when it comes to power use and running costs, datacenters care about how much money they have to spend to get whatever project they have going done.

    AKA, if something used 50% more power than a competitor, but got projects done three times as fast, they'd be spending less money to get a project done even with a higher TDP.

    And then there's simply the fact that time is money.
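
    That tradeoff is easy to put in numbers. An illustrative sketch with made-up wattage and runtime (not figures from the article), just to show the 50%-more-power, 3x-faster case:

    ```python
    # Energy needed to finish one fixed project, comparing a baseline CPU
    # to one drawing 50% more power but finishing three times as fast.
    # Wattage and hours are made-up illustrative values.

    baseline_power_w = 200.0
    baseline_hours = 30.0

    fast_power_w = baseline_power_w * 1.5   # 50% more power draw
    fast_hours = baseline_hours / 3         # three times as fast

    baseline_energy = baseline_power_w * baseline_hours  # 6000 Wh per project
    fast_energy = fast_power_w * fast_hours              # 3000 Wh per project

    # The "hungrier" chip uses half the energy per finished project.
    print(fast_energy / baseline_energy)
    ```

    Energy billed is power multiplied by time, so a big enough speedup more than cancels a higher TDP - before even counting the value of finishing sooner.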
     
  7. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    Both Intel and AMD are proven more than enough for server environments, so businesses will take whatever CPU does the jobs it gets thrown at it in the most desirable way.
    Most companies are used to having both CPUs in their server racks anyway, since Intel does one thing better than AMD and vice versa - whatever that is in pure performance, cost effectiveness, or being best at something specific. There's not much place for fanboyism in that market.
     
  8. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    You can't really look at it like that; regular consumers are not the same as the server market, where AMD has always had a much larger role than what ended up in our PCs.
     
  9. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    You're saying that AMD has traditionally always been stronger in server market than in desktop?

     
  10. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,665
    Likes Received:
    597
    GPU:
    RTX3090 GB GamingOC
    AMD is back, baby. Well, maybe if they had slightly better GPUs - but for CPUs they are finally ahead on all fronts.

    Once we get an APU with a GPU as powerful as a 3060 Ti, I bet it'll sell like hot cakes as people on a budget go with APUs instead of forking out for both.

    I don't know what the score is atm between Intel and AMD (desktop CPUs). I mean, Intel used to have a massive advantage over AMD, but I doubt that now.

    Maybe Steam could give us a better idea of the recent CPU gains and losses.
     
    Last edited: Apr 20, 2021

  11. anticupidon

    anticupidon Ancient Guru

    Messages:
    7,898
    Likes Received:
    4,149
    GPU:
    Polaris/Vega/Navi
    There was a time when Opterons were highly regarded in the server/data centre space.
    Now they're back. What I would like to see is more ARM solutions developed and deployed.
    Old habits die hard.
     
  12. EspHack

    EspHack Ancient Guru

    Messages:
    2,799
    Likes Received:
    188
    GPU:
    ATI/HD5770/1GB
    This whole situation leaves me wondering if there could be a mechanism to prevent companies from going stagnant and tyrannical after securing a market. But if so, wouldn't that make it exponentially harder for a competitor like AMD to ever catch up? There's no easy solution to this.

    I hope this doesn't mean another 10 years of $499 8-cores from AMD this time around.
     
  13. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    AMD is used a lot in the racks I have seen, mainly in blade units, the bigger storage and db servers mainly use Intel, so they do mix.
    I agree regarding the Cinebench benchmark though. :)
     
  14. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,665
    Likes Received:
    597
    GPU:
    RTX3090 GB GamingOC
    I very much doubt AMD will carry on pumping out 8 cores for very long. All along the Zen range we have seen core-count increases for the masses.

    They could make the 5900X's 12 cores mainstream if they wanted, but more likely newer gens will have 2 x 8 cores, so 16 cores/32 threads will be the norm 2 or 3 years from now.
     