AMD CEO: Zen processors available at the end of 2016

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 20, 2016.

  1. Kaarme

    Kaarme Ancient Guru

    Messages:
    3,518
    Likes Received:
    2,361
    GPU:
    Nvidia 4070 FE
    It's not going to be a free market if there's only one option to choose from.
    Sure, my numbers might have been exaggerations, but it was only a few weeks ago that Guru3D posted the news about Intel's very, very healthy profits despite the general difficulties in the market.

    AMD isn't dead yet, so obviously it has not come to pass, duh. I hope it won't be dead in my lifetime, either, unless some quantum chips make it totally obsolete (and hopefully Intel as well).
     
  2. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    VIA made the mistake of buying Cyrix, which was already a dead company at the time of purchase. VIA never had any hope of competing with Intel or AMD. Cyrix was already so far behind it was laughable. To make things worse, Cyrix was almost completely unknown since OEMs wouldn't touch them. Their chips were hot, power hungry and slow. There was nothing VIA could have done to save them.
     
  3. Chillin

    Chillin Ancient Guru

    Messages:
    6,814
    Likes Received:
    1
    GPU:
    -
    Again, you are talking about stuff you know nothing about and making things up to try to sound intelligent.

    I quoted TDP numbers, which are higher than the SDP numbers on the processors where SDP is available (like the Y-series Haswell) and which aren't on the graphs. Some of the Intel CPUs there use cTDP; for example, the m3-6Y30 can be configured down to 3.8 W or up to 7 W, but that depends on the manufacturer.

    Regardless, the i5-5200U quoted on that graph as the 15 W CPU only has a cTDP-down of 7.5 W and no cTDP-up, so it's left at its 15 W TDP, which negates any point you were trying to make.

    It's amazing the excuses and nonsense you come up with to try to defend AMD; it's literally a repeat of what you tried at the Fury launch, and it's borderline trolling in my opinion. Seriously, they are a for-profit company, not a humanitarian organization; let them sink. Hell, let Nvidia and Intel sink for all I care, I just want the best overall products, and AMD doesn't have them. Either put up or shut up.

    Notice the fact sheet for your CPU, which only has SDP quoted, vs. all the rest, which have TDP and (if applicable) cTDP limits:
    http://ark.intel.com/products/85474/Intel-Atom-x5-Z8500-Processor-2M-Cache-up-to-2_24-GHz
    http://ark.intel.com/products/88198/Intel-Core-m3-6Y30-Processor-4M-Cache-up-to-2_20-GHz
    http://ark.intel.com/products/85212/Intel-Core-i5-5200U-Processor-3M-Cache-up-to-2_70-GHz
    http://products.amd.com/en-us/searc...Laptops/FX-8800P-with-Radeon™-R7-Graphics/123
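    For anyone not following the alphabet soup, here is a rough sketch (Python) of the power envelopes being argued over, using only the values quoted in this thread and on the ark pages above; None simply means that figure isn't published for the part.
    Code:
# Power figures as quoted in this thread / on the linked ark pages.
# None means the value is not published for that part.
chips = {
    "Atom x5-Z8500": {"SDP": 2.0,  "TDP": None, "cTDP_down": None, "cTDP_up": None},
    "Core m3-6Y30":  {"SDP": None, "TDP": 4.5,  "cTDP_down": 3.8,  "cTDP_up": 7.0},
    "Core i5-5200U": {"SDP": None, "TDP": 15.0, "cTDP_down": 7.5,  "cTDP_up": None},
}

for name, p in chips.items():
    # The envelope an OEM can actually configure: cTDP-down up to cTDP-up,
    # falling back to the nominal TDP (or SDP) when no cTDP is published.
    low = p["cTDP_down"] or p["TDP"] or p["SDP"]
    high = p["cTDP_up"] or p["TDP"] or p["SDP"]
    print(f"{name:14s} configurable range: {low}-{high} W "
          f"(SDP {p['SDP']}, TDP {p['TDP']})")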
     
    Last edited: Feb 9, 2016
  4. Chillin

    Chillin Ancient Guru

    Messages:
    6,814
    Likes Received:
    1
    GPU:
    -
    Of course it makes profits, as it should; AMD should try that sometime.

    In all seriousness, Intel hasn't raised the prices on its processors in nearly a decade when accounting for inflation, and has in fact lowered them for the same performance slots.
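    For what it's worth, the "adjusted for inflation" part is easy to check yourself; here is a minimal sketch (Python) of the calculation, with placeholder prices and an assumed inflation factor rather than real MSRPs, so swap in actual numbers before drawing conclusions.
    Code:
# Illustrative only: hypothetical launch prices and an assumed cumulative
# inflation factor, just to show how "flat when adjusted for inflation"
# is computed. Replace with real MSRPs and CPI data to test the claim.
old_price_nominal = 300.0    # hypothetical mainstream quad-core MSRP, ~2007
new_price_nominal = 350.0    # hypothetical 2016 part in the same slot
inflation_factor = 1.17      # assumed cumulative CPI change, 2007 -> 2016

old_price_2016_dollars = old_price_nominal * inflation_factor
print(f"old part in 2016 dollars: ${old_price_2016_dollars:.0f}")
print(f"new part:                 ${new_price_nominal:.0f}")
# If the second figure is at or below the first, the real price has not risen.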
     

  5. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    You do not get it, do you? I specifically wrote SDP and TDP where each belongs. The fact is that while TDP may be closer to reality, neither of the values Intel provides represents full power consumption under full load. In the case of SDP, the figure can be off by up to five times; in the case of TDP, maybe up to double (or more).

    I am not questioning that they perform better than choked-down 15 W AMD A10/FX chips. I question their real-world power consumption, because while AMD is under the microscope in that category, not many people question Intel's on-paper values.

    And when you look at those fact sheets, please note that the 2 W SDP low-clocked Atom (a transistor count low enough that it costs $25) eats more than double the energy (in the real world) of the paper TDP of the m3-6Y30 (4.5 W TDP, $281).

    Read this. Can you guess where that performance and price come from? From a magnitude higher transistor count, which eats more energy at the same clocks. Yet the TDP of the m3-6Y30 is less than half the real-world consumption of that $25 micro chip.

    Question things; do not just swallow what's served to you.
     
  6. Chillin

    Chillin Ancient Guru

    Messages:
    6,814
    Likes Received:
    1
    GPU:
    -
    Oh wow, the GTX 980 beats the GTX 780 despite having several billion fewer transistors and using less power on the same TSMC 28 nm process; how can that be? (Read in a sarcastic voice.)
    http://www.anandtech.com/bench/product/1493?vs=1442

    Guess what: different architectures and design priorities (not to mention the time separating the development of the two). But hey, it's easier to believe in conspiracies. Then again, you believe in the insanity that we didn't land on the moon, so I guess I shouldn't be surprised.
     
  7. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    I gave you a chance to think; instead you bring up this irrelevant thing.

    For your understanding: http://www.notebookcheck.net/Microsoft-Surface-Pro-4-Core-m3-Tablet-Review.153843.0.html
    That's the Surface Pro 4 with your much-touted newest and greatest m3-6Y30.
    Intel lists it as:
    TDP 4.5 W
    Configurable TDP-up 7 W
    Configurable TDP-down 3.8 W

    The Surface Pro with this chip has a wonderfully large battery with a capacity of 38.2 Wh. At idle with minimal brightness, the device lasted 13 h 11 min.
    That's a 2.9 W average power draw for all the circuits inside.

    In the burn test, the battery lasted 2 h 18 min. That's a 16.6 W average power draw for all the circuits inside.

    Now, the main difference between those two tests is screen brightness. I happen to know that my 10" IPS tablet with a very good screen eats 1.1 W with the screen at its lowest (backlight off) and 2.5 W at maximum brightness. That's 1.4 W for a 10" IPS screen.
    The Surface Pro 4's screen is not that much bigger (12.3"), but let's say it is much less efficient and eats 3 W more from backlight off to full brightness.
    (This is actually the missing link between their long battery-life test and the burn test.)

    We have:
    Idle 2.9 W at lowest brightness => 13 h 11 min
    Idle + presumed hungry screen: 2.9 W + 3 W = 5.9 W => 6 h 28 min of idle life
    Burn test + presumed hungry screen (reversed): 16.6 W total power draw, minus 3 W for the screen and 2.9 W for the remaining electronics plus the idle CPU state. That still leaves 10.7 W drawn by the workload (above idle CPU usage).

    If we are lucky, the m3-6Y30 idles at around 1 W, making the CPU eat 11.7 W in total (optimistic). But we were too kind.
    The Surface Pro 4 has nearly the same internal composition as most Atom tablets, and those eat around 1.1 W at idle, of which the Atom takes 0.6 W, leaving 0.5 W for the surrounding electronics. The electronics in the Surface Pro 4 may account for 1 W of that 2.9 W idle, putting the m3-6Y30's idle at 1.9 W. MS is not using some cheap, hungry screen, so it is not a 3 W difference from backlight off to full brightness; it is more like 1.8 W.
    In the pessimistic scenario, the m3-6Y30 eats 13.8 W under full load (averaged over time). Running it with an unrestrained TDP via ThrottleStop would probably just lead to overheating and thermal throttling anyway, as it is passively cooled.
    Neither of those values is anywhere near the proclaimed 4.5 W or 7 W (the believed maximum TDP).
    But maybe you have some other magical explanation for 16.6 W of power consumption under load that scales with the workload yet does not come from the CPU itself. If it were not dynamic, it would be present at all times, even at idle, making battery life abysmal.
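    For anyone who wants to check the arithmetic, here is a minimal sketch (Python) that reproduces the numbers above from the figures quoted in this post; only the 38.2 Wh capacity and the two runtimes are measured values, the screen and surrounding-electronics splits are my own assumptions as described.
    Code:
# Battery arithmetic from this post. Only the 38.2 Wh capacity and the two
# runtimes are measured figures; the screen and electronics splits are the
# assumptions described above.
BATTERY_WH = 38.2                   # Surface Pro 4 battery capacity (Wh)

def avg_power(hours, minutes):
    """Average draw in watts over one full battery discharge."""
    return BATTERY_WH / (hours + minutes / 60)

idle_total = avg_power(13, 11)      # ~2.9 W, idle at minimum brightness
burn_total = avg_power(2, 18)       # ~16.6 W, burn test at full brightness

# Optimistic split: screen adds 3 W at full brightness, the rest of the
# device still draws its 2.9 W idle figure, and the CPU idles near 1 W.
cpu_opt = burn_total - 3.0 - idle_total + 1.0   # ~11.7 W

# Pessimistic split: a more efficient screen (1.8 W delta) and only 1 W
# for the surrounding electronics leaves more on the CPU's plate.
cpu_pes = burn_total - 1.8 - 1.0                # ~13.8 W

print(f"idle average:      {idle_total:.1f} W")
print(f"burn-test average: {burn_total:.1f} W")
print(f"CPU under load:    {cpu_opt:.1f} W (optimistic) to {cpu_pes:.1f} W (pessimistic)")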
     
    Last edited: Feb 9, 2016
  8. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,140
    Likes Received:
    1,743
    GPU:
    GTX 1080 Ti
    Hey guys.

    Calm down.
     
  9. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    http://www.anandtech.com/show/2807/2

    Intel's processors can definitely exceed their TDP value, and they do so fairly often.

    Anyway, Anandtech's recent article on Carrizo does not paint a good picture at all of AMD's relationship with OEMs. Even if Zen is really good, they are going to face a massive uphill battle getting it into decent product lines. Zen is basically going to have to beat consumer expectations; if it comes out and simply matches what Intel has, they are still screwed.
     
  10. zer0_c0ol

    zer0_c0ol Ancient Guru

    Messages:
    2,976
    Likes Received:
    0
    GPU:
    FuryX cf
    My God, did the OEMs ever kill Carrizo.
     

  11. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Consider that the best imaginable configuration for a high-end Carrizo does not exist: a 12.5" netbook without a dGPU but with a big battery in its place, preferably user-configurable across the entire TDP range.

    I will repeat myself: "The best thing AMD can do is make AMD-branded devices themselves to set the bar."
     
  12. Humanoid_1

    Humanoid_1 Guest

    Messages:
    959
    Likes Received:
    66
    GPU:
    MSI RTX 2080 X Trio
    "Let them sink", "put up or shut up..."

    A company cannot "put up" a great product if it does not exist.

    The more players in the game the better the chance one of them will produce something excellent beyond expectation :)
     
  13. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    That's the sad thing here: they already did put one up. It did not work; maybe the last APUs with boosted clocks, like the FX-9830P (if that's real), will do better.
    But in the meantime, some retrospective.
    On the left, the Surface Pro 4 with the best of the line, Intel's 14 nm i7-6650U in 25 W mode, as confirmed by notebookcheck's battery test. On the right, the 28 nm FX-8800P in 35 W mode.
    While AMD's chip is 34% slower in single-threaded workloads, it matches Intel's performance once the entire chip is in use.
    Yes, from a graphics standpoint the i7-6650U with 48 EUs is about 30% better for gaming than the FX-8800P. But when I look at price, the i7-6650U costs $400 while the FX-8800P is more like $120. An entire FX-8800P device can cost what Intel's chip costs alone, as those mobile APUs were meant for $400-700 devices.

    The only thing that makes the APUs inferior to Intel's chips is 28 nm vs 14 nm.

    Why are there no good APU alternatives? Not because the APUs are bad, not at all. It's because they are perceived as the cheap, worse alternative, and OEMs would have to price devices accordingly.
    But once let out of that box, they could do real harm to the high-cost market, and no OEM wants that.
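    To put rough numbers on that, here is a quick sketch (Python) using only the figures quoted in this post: the approximate prices, the 34% single-thread gap, and the roughly 30% graphics gap. Illustrative only, not benchmark data.
    Code:
# Rough price/performance comparison from the figures quoted in this post.
# Prices are approximate; performance is normalized to the i7-6650U.
chips = {
    # name:      (approx. price $, single-thread, multi-thread, graphics)
    "i7-6650U": (400, 1.00, 1.00, 1.00),
    "FX-8800P": (120, 0.66, 1.00, 0.77),  # 34% slower ST, matching MT, ~30% behind in graphics
}

for name, (price, st, mt, gfx) in chips.items():
    print(f"{name}: {mt / price * 1000:.1f} multi-thread perf per $1000, "
          f"{gfx / price * 1000:.1f} graphics perf per $1000")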
     
  14. BLEH!

    BLEH! Ancient Guru

    Messages:
    6,408
    Likes Received:
    423
    GPU:
    Sapphire Fury
    AMD got royally screwed by GF mucking up the 20 nm node, otherwise they'd be in a better place.
     
