
Rumor: AMD Seeds Board Partners with Ryzen 3000 Samples - Runs at 4.5 GHz and Shows 15% Extra IPC

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Apr 29, 2019.

  1. BReal85

    BReal85 Master Guru

    Messages:
    325
    Likes Received:
    100
    GPU:
    ASUS 270X DC2 TOP
Wow, so you will have maybe 5+% more FPS in FHD games with a 2080 Ti. So basically 0.0001% of gamers. BRILLIANT!
     
  2. DeskStar

    DeskStar Master Guru

    Messages:
    586
    Likes Received:
    53
    GPU:
    4 eVGA GTX TITAN SC
Now, is this with a heavy OC, or are we talking about typical "theoretical" boost performance?

    This will be the true deciding factor as to whether or not I want to build a new system. That and memory support. I want silly speeds and a minimum of quad channel support.

Looking good so far, AMD... let us all keep hoping.
     
  3. Yogi

    Yogi Master Guru

    Messages:
    228
    Likes Received:
    55
    GPU:
    Sapphire R9 290X Vapour X
    Google translate of the source HH linked to seems to indicate that it's a 15% improvement "overall" compared to Zen+.

I think people are overreacting with theories of a 15% IPC improvement on top of clock improvements. Just because the engineering sample clocks at 4.5 GHz doesn't mean a whole lot; the silicon lottery swings both ways, even with production improvements over time, improvements to drivers, etc.
Retail chips could perform exactly the same as this hypothetical ES.

    Edit: Zen+ not Zen2
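
Since per-clock gains and frequency gains multiply, the two readings of that 15% figure come out differently. A rough sketch of the arithmetic, assuming an illustrative ~4.3 GHz Zen+ boost clock against the rumored 4.5 GHz (all figures hypothetical, not AMD's numbers):

```python
# Rough model: performance scales with IPC x clock.
# Illustrative numbers only -- not official figures.

zen_plus_clock = 4.3   # GHz, assumed Zen+ boost for comparison
zen2_clock = 4.5       # GHz, rumored engineering-sample boost

def overall_gain(ipc_gain, old_clock, new_clock):
    """Combined speedup from an IPC gain plus a clock change."""
    return (1 + ipc_gain) * (new_clock / old_clock) - 1

# Reading 1: the 15% "overall" figure already includes the clock bump,
# so the implied IPC-only gain is smaller.
ipc_only = (1 + 0.15) / (zen2_clock / zen_plus_clock) - 1
print(f"implied IPC-only gain: {ipc_only:.1%}")

# Reading 2: 15% is pure IPC, stacked on top of the clock bump.
print(f"combined gain: {overall_gain(0.15, zen_plus_clock, zen2_clock):.1%}")
```

With these assumed clocks the first reading implies roughly a 10% IPC gain, while the second compounds to roughly 20% overall, which is why the distinction matters.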
     
  4. screwtech02

    screwtech02 Member Guru

    Messages:
    178
    Likes Received:
    20
    GPU:
    R9 390 8gb x3
Sooo, they will "actually" run 32 GB of DDR4-3200 at its rated speeds now???
     

  5. chispy

    chispy Ancient Guru

    Messages:
    8,758
    Likes Received:
    891
    GPU:
    RTX 2080Ti - RX 580
Well, I have been running 32 GB of DDR4-3300 CL14 Hynix RAM on a Zen+ 2700X for some time without a single problem, so I would say they won't have any problems running 3200 MHz and beyond on Zen 2 ;).

     
    Aura89 and user1 like this.
  6. Aura89

    Aura89 Ancient Guru

    Messages:
    7,641
    Likes Received:
    900
    GPU:
    -
Doubtful. I mean, sure, if that's the case, but nothing in this article says anything about the 2700k; it wouldn't relate to Zen 2, etc.

If he meant the 2700k, that is extremely confusing.

They already do (Zen/Zen+).
     
    K.S. and chispy like this.
  7. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,739
    Likes Received:
    2,199
    GPU:
    5700XT+AW@240Hz
Above you have 2x 16 GB. I run 4x 8 GB at 3333 MHz, and if I ran just 2 modules, they would do 3600 MHz (but that would be just 16 GB).
     
    Aura89 and chispy like this.
  8. skline00

    skline00 New Member

    Messages:
    1
    Likes Received:
    0
I've had a 2700X, custom water-cooled, on an ASUS CH6H (X370) motherboard, and I now have a 9900K on an ASRock Z390 Taichi with a Kraken X72 AIO cooler.

Both CPUs use my G.Skill DDR4-3200 Flare X CL14 RAM (2x8 GB).

Both have limited headroom to OC. The 2700X runs all 8 cores at 4 GHz, while the 9900K does 4.7 GHz on all cores.

The 2700X can hit 4.3 GHz on 2 cores, while the 9900K hits 5 GHz on at least 2 cores.

AIDA memory figures are very close when running the RAM at its XMP 3200 spec.

AMD closed most of the gap, but the 9900K IS still faster; for the price, it should be.

Zen 2 is a much different CPU design, so I think it (8c/16t) will probably equal, if not exceed, the 9900K. The question will be the price.
     
  9. Maddness

    Maddness Master Guru

    Messages:
    948
    Likes Received:
    214
    GPU:
    EVGA RTX 2080Ti FTW
    This looks pretty sweet if true. I have been dying to build a new PC. Zen 2 is at the top of my list. Take my monies AMD
     
  10. nizzen

    nizzen Master Guru

    Messages:
    728
    Likes Received:
    131
    GPU:
    3x2080ti/5700x/1060
Think about a Threadripper with 35 ns memory latency; that would be epyc :D

The only drawback for AMD now is the high memory latency in games, unless you are GPU-bound. My Threadripper 1950X is at about 67 ns, and my 9900K is at 37 ns. The 1950X is in my F@H crunching computer, so latency is not important there at all. Can't wait for Threadripper gen 3.
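
The latency figures quoted here come from tools like AIDA64, but the underlying technique is pointer chasing: a chain of dependent loads through a randomly permuted buffer, so each access must wait for the previous one to finish. A minimal sketch in Python (interpreter overhead dominates, so treat the results as relative only, not as the ~37-67 ns hardware figures above):

```python
# Pointer-chasing sketch: estimate average time per dependent access.
# CPython overhead swamps true DRAM latency; the point is that the
# large, cache-unfriendly buffer is measurably slower per hop.
import random
import time

def chase_latency_ns(n_slots, n_hops=1_000_000):
    """Average ns per hop following a random single-cycle permutation."""
    perm = list(range(n_slots))
    random.shuffle(perm)
    # Link the shuffled slots into one cycle so every load depends
    # on the result of the previous one (no parallelism possible).
    next_idx = [0] * n_slots
    for a, b in zip(perm, perm[1:] + perm[:1]):
        next_idx[a] = b
    i = 0
    start = time.perf_counter_ns()
    for _ in range(n_hops):
        i = next_idx[i]
    return (time.perf_counter_ns() - start) / n_hops

small = chase_latency_ns(1_000)       # buffer fits in cache
large = chase_latency_ns(4_000_000)   # buffer spills past typical L3
print(f"cache-resident: {small:.0f} ns/hop, DRAM-sized: {large:.0f} ns/hop")
```

A native-code version of the same loop (e.g. in C, with the buffer sized well past L3) is essentially what memory-latency benchmarks measure.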
     

  11. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    9,232
    Likes Received:
    314
    GPU:
    GTX 1080 Ti @ 2GHz
What's with this nonsense I'm seeing about 15% IPC putting AMD on par with Intel, or still preferring Intel even if AMD's ahead? AMD is already pretty close to Intel in real-world IPC as it is, right now, with Zen+. 15% over Zen+ would put Intel at the bottom of a garbage can.

Was being bent over and violated since 2006 not enough for you guys? You should be creaming your pants over the thought of AMD possibly taking the performance crown back from those shitters at Intel.
Maybe Intel can bribe giant companies like Dell to exclusively sell their inferior products at higher prices than what AMD's stuff would be sold at. That should do the trick. Then, when they lose the resulting lawsuit two decades later, the fine will be a tiny fraction of what the scheme gained them, not to mention the extremely dangerous and weak position it'll put AMD in. Then Intel can crap on their customers with bloated monopoly prices and 1-3% real-world IPC gains per generation, essentially the same crap rebranded for all eternity. Then, if any product is accidentally too good, they can stop soldering the IHS to cripple OC potential, and follow it up by having a shill write a BS article spewing fake news about how Intel had to stop soldering or it would cause micro cracks that are actually dangerous. Morons will believe it. Someone should let Intel know about this diabolical plan.

Think of where the world would be technologically if that hadn't actually happened.

Companies will keep using highly illegal scumfuck tactics like that forever, so long as there isn't some hardcore prison time given as a penalty; the chump change they pay in lawsuits is always calculated ahead of time as a cost of doing business, and sometimes they'd even accept the prison time if the gain were insane enough. I want to see Intel stockholders and CEOs behind bars for 30+ years after being fined ONE HUNDRED PERCENT of what they attained illegally. There is no justice until that happens. A broke minimum-wage worker can end up in jail for petty theft of something worth $1, but these scum-sucking pigs who manipulate the economies of the world, and in this case technologically cripple the world, serve zero time for billions of dollars attained through illegal methods.
     
    Last edited: May 1, 2019
    carnivore, K.S., Caesar and 2 others like this.
  12. Denial

    Denial Ancient Guru

    Messages:
    12,343
    Likes Received:
    1,529
    GPU:
    EVGA 1080Ti
Microcracking is 100% a real thing; whether or not Intel could figure out a better way around it is another story, but it's a well-researched issue, and multiple companies/industries have published papers on it.

I'm not sure we would be much further along, tbh. You have to realize most of the breakthroughs in computing don't come from the companies themselves but from academia. For example, AMD's Zen architecture, aside from MCM (which has been around since long before Zen), doesn't really do anything too different from what Intel does under the hood; they basically took all the known advancements and shoved them in there. That's why manufacturing was such a big deal for these companies: everything else architecturally is pretty well known and similar between companies. It's also why you see the same advancements being added by different vendors in tandem; for example, Nvidia and AMD in GPUs often come out with similar features/technologies around the same time, because they are both pulling ideas from a common source (research at academic institutions). I don't really expect to see massive gains in general IPC from either company after this. It's going to be incremental for some time, outside of specific instructions like AVX.

    The rest of your post I agree with. Intel has historically been a shitty company and the punishment for their anti-competitive behavior wasn't even close to offsetting the advantage they got from doing it. I can totally see them doing it again. What's a few billion for 80%+ marketshare of the computing industry?
     
    Last edited: May 1, 2019
  13. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    9,232
    Likes Received:
    314
    GPU:
    GTX 1080 Ti @ 2GHz
We all know it's a real thing, but threatening enough for Intel to avoid soldering the IHS altogether, on everything, starting from the 3000 series... coincidentally right as they started losing market share to AMD? Hell no. I may not be a materials engineer, but I know enough to see Intel BSing.

As far as how much further ahead we'd be, who knows. But if Intel had had to compete with ANYTHING, they'd have been forced to innovate so many years ago that we can only imagine what new things would be around. It may not have been mind-blowing, but the raw performance we'd have at any given price would definitely be ahead of the "here's 4 cores at about the same frequency for the next 12 years, ±2% IPC" paradigm that Intel shoved up our asses with zero shame. Hell, maybe there'd even be some more research into that elusive reverse SMT. We're going to hit the multithreading wall eventually, and we're already near the physical limits of silicon.

The mid- and high-end CPU market was a literal monopoly for so many years that it really was a worst-case scenario as far as technological advancement went. Even toward the low end, AMD was rarely an option if OCing was taken into consideration. I think we're quick to forget what an abomination Bulldozer was. To this day, it seems to me that whoever was pushing Bulldozer was either trying to sabotage the company or completely incompetent.
     
  14. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,739
    Likes Received:
    2,199
    GPU:
    5700XT+AW@240Hz
Well, what AMD and Intel do differently is the way they put it all together: how they handle data transfers, how they handle cache accesses, and similar things.
And then there are the known, shared design parts. Even there, AMD and Intel surely differ on quite a few, because each has its own way of processing the same source data into the same results more efficiently.
     
  15. D3M1G0D

    D3M1G0D Ancient Guru

    Messages:
    1,942
    Likes Received:
    1,242
    GPU:
    2 x GeForce 1080 Ti
    Yup. As far as I know, Intel still solders their Xeons. If microcracking was a serious issue, I would have expected them to have stopped using solder with their premium products - instead, it's only their consumer products where they use paste. This fact alone should tell you that it's a false flag, and that Intel is most likely using paste because it's cheaper, not because solder is dangerous.
     

  16. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,356
    Likes Received:
    836
    GPU:
    EVGA 1080ti SC
    I hope Zen2 wins all around this time. Tired of Intel and their ridiculous prices. I was stoked at first when I heard my motherboard would receive a BIOS update that would allow the 9900k then I saw the price.
     
  17. Denial

    Denial Ancient Guru

    Messages:
    12,343
    Likes Received:
    1,529
    GPU:
    EVGA 1080Ti
Multiple Xeons are not soldered, and there can be a ton of different reasons why some are and some aren't. I highly doubt Intel is going to retool its fab lines for TIM and re-validate the entire line just to save several cents on TIM vs. solder. They are likely doing it because it saves them money on some number of failed processors due to microcracking. I'm sure they ran some internal study where they said "by switching to TIM our temps go up 10 C, which cuts the lifespan by 2 years, outside the warranty period, but we save 0.5% of processors within the warranty period" and pulled the trigger on it.

Yeah, but that's just a side effect of the overall design. It's not like AMD is doing anything where Intel goes "wow, how are they doing that?!" It's all well understood; it's just different because the MCM approach is different. A future Intel MCM design will function similarly at a high level.

I'm not saying these companies implement things identically, but the stuff they do implement is well understood and documented at a high level long before it's in an actual product. At a low level they may do things slightly differently because it benefits their overall design goal; for example, AMD, which values security over performance, puts a bunch of extra checks in its speculative execution (the Spectre stuff).
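
The internal cost study imagined above can be put into toy numbers. A minimal sketch; every figure below is invented for illustration, since Intel's real shipment volumes, failure rates, and RMA costs are not public:

```python
# Toy model of the solder-vs-paste trade-off described above.
# Every number is a hypothetical placeholder, not Intel data.
units = 200_000_000      # hypothetical annual CPU shipments
solder_premium = 0.20    # hypothetical extra $/unit for a soldered IHS
crack_rate = 0.005       # hypothetical in-warranty microcrack failure rate
rma_cost = 250           # hypothetical $ per warranty replacement

material_savings = units * solder_premium    # paste is cheaper per unit
rma_savings = units * crack_rate * rma_cost  # fewer cracked chips RMA'd

print(f"material savings from paste: ${material_savings / 1e6:.0f}M")
print(f"RMA savings from fewer cracks: ${rma_savings / 1e6:.0f}M")
print(f"total case for paste: ${(material_savings + rma_savings) / 1e6:.0f}M")
```

Even with a tiny per-unit premium and a sub-1% failure rate, the volumes involved turn both terms into eight- and nine-figure sums, which is the shape of the argument Denial is making.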
     
    Last edited: May 1, 2019
    Fox2232 likes this.
  18. Dazz

    Dazz Master Guru

    Messages:
    815
    Likes Received:
    78
    GPU:
    ASUS STRIX RTX 2080
Micro cracking, lol. Show me an 8-year-old 2500K/2600K, even one that's been overclocked to over 4 GHz, that has encountered micro cracking. The reality is that by the time micro cracking occurs, those chips are long gone, because people have moved on to something better. If an 8-year-old overclocked 2600K isn't encountering it, then clearly it's not an issue for 99% of users.
     
  19. Denial

    Denial Ancient Guru

    Messages:
    12,343
    Likes Received:
    1,529
    GPU:
    EVGA 1080Ti
Does the 2500K/2600K experience a relatively high rate of voiding? Was the thermal density of that process/design as prone to failure as Ivy Bridge's? How many users of a failed 2500K/2600K would even know their processor failed due to microcracking without an electron microscope? How many batches of processors fail Q/A validation and end up in consumers' hands to microcrack in the first place? Why did Xilinx, Intel, and others publish research papers showing this is an issue? Why are there companies selling million-dollar inspection and sorting systems designed to Q/A these chips for voids if this isn't an issue?

What sounds more far-fetched:

"Intel wants to cut down on chip defects based on research done by multiple companies and internal studies, so they retooled away from solder"

or

"Intel wants to stop 0.01% of their users from getting slightly better performance out of their processor, even though 75% of that 0.01% are going to delid anyway, and regardless they're going to spend millions on retooling/revalidating to prevent it and make up multiple research papers about it that just happen to coincide with other companies in the industry"?


I don't even understand why the latter is still an argument; it's tinfoil-hat levels of dumb.

Also, 1% of the processors Intel ships (if it doesn't affect 99% of them) is millions of processors.
     
    Last edited: May 1, 2019
    Undying and yasamoka like this.
  20. Dazz

    Dazz Master Guru

    Messages:
    815
    Likes Received:
    78
    GPU:
    ASUS STRIX RTX 2080
I can't get numbers on Intel's RMA process, as they don't appear to publish them, but from checking around, it's far more common for chips to be DOA than to fail later in life. My mate's Athlon X2 2800+ is 15 years old, used as a workstation 24/7, and still going strong; pretty sure that would have cracked by now, but thermally there has been no perceivable impact. Then again, AMD has a thermal threshold of 70 C while Intel allows up to 105 C, which will no doubt have a big impact on the life of the solder. Personally, I think it's more about cost savings than anything else; after all, why solder all the CPUs when only 0.5% overclock? That 20-cent saving across millions of processors adds up. Have to keep the shareholders happy, and cutting corners always wins.

Also, you make a good point: why are all Xeons soldered? They are built for reliability and endurance, and they see significant thermal cycling, since server/workstation workloads can vary quite a lot.

Looking into this, however, I see people saying that if chips are not soldered correctly, the CPU could be DOA, an issue that moving to thermal compound should remove; but then I would have expected AMD to follow suit, yet they have not. Granted, the APUs use thermal compound, but they are low-cost processors, so one would expect the cost-effective approach of not soldering them.
     
    Last edited: May 1, 2019
