
AMD Security Announcement on Fallout, RIDL and ZombieLoad Attack

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 15, 2019.

  1. D3M1G0D

    D3M1G0D Ancient Guru

    Messages:
    1,929
    Likes Received:
    1,237
    GPU:
    2 x GeForce 1080 Ti
    Nah, they'll praise AMD for bringing back competition and then wait for Intel / Nvidia to lower prices so they can upgrade cheaper.

    Neither. ARM is by far the most numerous and popular CPU architecture.
     
    ZXRaziel, Kaarme, Venix and 3 others like this.
  2. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    4,414
    Likes Received:
    1,347
    GPU:
    HIS R9 290
    I myself have a laptop with an i3-4100U. For many years it kept up just fine with my daily workloads, but ever since pretty much all of the mitigations were applied in 2019, I've been noticing performance dips, and my CPU usage when watching YouTube videos has gone up dramatically. It's still usable, but I think I'm ready for an upgrade now.
    The only reason I never mentioned any of this earlier is that I don't like using personal anecdotes as evidence to back up a point. Haha, so the only reason I'm saying it now is to say I feel your pain.
     
  3. Aura89

    Aura89 Ancient Guru

    Messages:
    7,626
    Likes Received:
    897
    GPU:
    -
    ARM

    Many more devices than Intel and AMD, and so much more potentially juicy, valuable data, given what people do with their phones, tablets, and various other gadgets.
     
    airbud7 and Alessio1989 like this.
  4. Alessio1989

    Alessio1989 Maha Guru

    Messages:
    1,377
    Likes Received:
    231
    GPU:
    .
    ARM looks like a clusterfuck, but I can bet my balls they share 99.9% of security holes (speaking of the same generation). OEMs and builders usually customize only minor things around the socket; they do not touch the CPU implementation at all. Moreover, on ARM devices things are usually surrounded by a crap OS, usually in an outdated version and usually with tons of useless OEM customization that adds further security holes. And it's on those devices that most user-sensitive data is stored.
     

  5. jwb1

    jwb1 Master Guru

    Messages:
    463
    Likes Received:
    48
    GPU:
    MSI GTX 1080 Ti
    airbud7 likes this.
  6. Alessio1989

    Alessio1989 Maha Guru

    Messages:
    1,377
    Likes Received:
    231
    GPU:
    .
  7. Aura89

    Aura89 Ancient Guru

    Messages:
    7,626
    Likes Received:
    897
    GPU:
    -
    There were no facts in what er557 said, just more nonsense without facts.

    "Minimal performance impact" can't be a fact, as, again, you have to prove something for it to be a fact. He can hope, but that's no fact.

    "With AMD you get lower per-core performance" is the direct opposite of facts. Now, is this true some of the time? Sure, but it's not universal, and it depends on the processors on each side as well as what you're doing with it, such as gaming. So yup, still no facts here.

    "Low efficiency interconnect": define "efficiency", as I see no facts here again. Yes, AMD's Infinity Fabric has its issues, but it also has very strong points. None of it really could be labeled "efficiency" unless you're talking about wattage used, and Intel chips at best match AMD for efficiency, so nope, no facts here either.

    "Lower performance in games": again, not universally, and depending on your GPU, your game settings, and resolution. Now, if he had said "lower performance in the games I play at the settings I play", then sure, there would be some facts, but he didn't. No facts here.

    "Lower performance in productivity software": huh? I mean, was this one just a joke? Do I even have to explain why this is not a fact? No, really, I shouldn't have to, so I won't; it's that obvious.

    The problem with many of you Intel fanboys is that you decide that if Intel wins in ONE situation, it therefore wins in all situations.

    Then you go around saying how great Intel is and how AMD can't compete; these are universal statements, not directed statements, and can only be received as such.

    You want to say that Intel's i9-9900K is better in most games with an RTX 2080 Ti, low settings and 1080p than a 2700X with the same settings? Do it! You'll be correct! Finally, for once, you'll be correct! But no, you just say "Intel is best at gaming" with no regard for what game, what resolution, what settings, what processors, and what graphics card is being used.

    Then you proceed to dig yourselves even more of a hole by stating things like the above "lower performance in productivity software", which literally has no bearing. What are you comparing, a Core i9-9980XE to an Athlon 200GE?
     
    Last edited: May 16, 2019
    ZXRaziel, Darkest and Fox2232 like this.
  8. chispy

    chispy Ancient Guru

    Messages:
    8,758
    Likes Received:
    891
    GPU:
    RTX 2080Ti - RX 580
    Glad to hear I'm safe on my 24/7 AMD Ryzen 2700X PC.
     
  9. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,736
    Likes Received:
    2,196
    GPU:
    5700XT+AW@240Hz
    Technically speaking, in most cases where we are aware of vulnerabilities, they were found by security research labs, whose modus operandi is not to cause harm.
    Therefore they test and think about possible vulnerabilities in chips regardless of the number of devices where a found vulnerability could do harm.

    As a result, market share does not really matter.
     
    Aura89 and airbud7 like this.
  10. Astyanax

    Astyanax Ancient Guru

    Messages:
    3,265
    Likes Received:
    843
    GPU:
    GTX 1080ti
    Intel skipped validation procedures and lost a lot of talent in its validation lab thanks to certain senior Intel employees whining that they were falling behind ARM.
     

  11. xrodney

    xrodney Master Guru

    Messages:
    326
    Likes Received:
    46
    GPU:
    Aorus 1080ti xtreme
    At the RTX launch the 2080 Ti was hard to get and its real price was $1,200-$1,400, while at the same time you could get a GTX 1080 Ti for $540-$700. Take the higher prices and there is your $700 price difference; no one cares about MSRP when the street price is higher due to low availability.

    As for yield and cost per chip: a 754 mm² die should have at least 40% yield, which makes 26 perfect and 38 damaged dies per wafer. That's if Nvidia were using only full dies for the final product, which is not the case here. Instead they disable part of the chip. This helps with yields, as 80% of the chip is non-critical for defects, and even in the remaining part there are certain redundancies to help salvage chips in case of a defect.
    If we assume that 80% of those defective dies are usable with part of the chip disabled, you get an additional 30 usable dies, which brings the overall yield to 87.5%.
    Cost per wafer should be between $6,000 and $8,000, which puts the cost per chip in the $120-$150 range (versus $60-$80 for the 1080 Ti).
    The difference in cost between GDDR5X and GDDR6 should be in the $30 range, so we are looking at $90-$100 higher costs per GPU.

    Yes, there are R&D costs, but those exist for every chip and generation, so RTX is no different from past generations.

    As for the comparison to Intel/AMD CPUs, it's not exactly the same: CPUs are hit much more critically by defects, there are fewer redundancies for defects, and there is also a higher impact on silicon quality, which affects clocks.
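    The salvage arithmetic in that post can be checked with a short sketch. The gross die count of 64 candidates per wafer is an assumption implied by the quoted 26 + 38 split; all other inputs are the figures from the post.

    ```python
    # Rough die-cost sketch for a 754 mm^2 GPU die on a 300 mm wafer,
    # using the numbers quoted in the post above.
    candidates = 64            # gross dies per wafer (assumed, = 26 + 38)
    perfect_yield = 0.40       # fraction of dies with zero defects
    salvage_rate = 0.80        # fraction of defective dies usable cut-down

    perfect = round(candidates * perfect_yield)   # 26 fully working dies
    defective = candidates - perfect              # 38 dies with defects
    salvaged = int(defective * salvage_rate)      # 30 usable cut-down dies
    usable = perfect + salvaged                   # 56 sellable dies

    overall_yield = usable / candidates           # 0.875 -> 87.5%

    for wafer_cost in (6000, 8000):
        # 56 sellable dies puts per-die cost near the quoted $120-$150 band
        print(f"wafer ${wafer_cost}: ~${wafer_cost / usable:.0f} per usable die")
    ```

    Note the per-die cost this yields ($107-$143) sits at the low edge of the post's $120-$150 range; the gap depends on the assumed gross die count.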
     
  12. HWgeek

    HWgeek Master Guru

    Messages:
    439
    Likes Received:
    314
    GPU:
    Gigabyte 6200 Turbo Force @500/600 8x1p
    I don't think so. If AMD is going to have SMT4, it makes perfect sense to limit the TR parts to SMT2 only.
    That way the EPYC parts will stay the best-performing parts.
    Moreover, you are paying double the price per core on the 2990WX versus the Ryzen 2700 ($1,700 vs 4 × $220), so why would AMD lose all this extra money?
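    The "double the price per core" claim above works out roughly as stated; a quick check using the quoted prices (32-core 2990WX at $1,700 vs four 8-core Ryzen 2700s at $220 each):

    ```python
    # Per-core price comparison using the figures quoted above.
    tr_per_core = 1700 / 32            # 2990WX: ~$53.1 per core
    ryzen_per_core = (4 * 220) / 32    # four Ryzen 2700s: ~$27.5 per core

    print(f"Threadripper 2990WX: ${tr_per_core:.2f}/core")
    print(f"4x Ryzen 2700:       ${ryzen_per_core:.2f}/core")
    print(f"ratio: {tr_per_core / ryzen_per_core:.2f}x")  # ~1.93x, roughly double
    ```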
     
  13. Astyanax

    Astyanax Ancient Guru

    Messages:
    3,265
    Likes Received:
    843
    GPU:
    GTX 1080ti
    Ryzen getting SMT4 isn't a thing.

    Ryzen's per-core IPC is on par with Intel's; they are behind where clocks count.
     
    Last edited: May 16, 2019
  14. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    14,675
    Likes Received:
    1,249
    GPU:
    RTX 2070FE
    Not sure what I would call it, but if we're referring to 4 threads per core, Intel and IBM both have iterations of this.
     
    ZXRaziel likes this.
  15. Astyanax

    Astyanax Ancient Guru

    Messages:
    3,265
    Likes Received:
    843
    GPU:
    GTX 1080ti
    On CPUs that are designed for throughput at the expense of latency, not on high-performance server and consumer chips.

    I think it's called coarse-grained multithreading, btw.
     

  16. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    14,675
    Likes Received:
    1,249
    GPU:
    RTX 2070FE
    Xeon Phi, yes, but IBM only makes PPC-based CPUs for high-throughput server performance. In any case, the technology does exist.
     
  17. Astyanax

    Astyanax Ancient Guru

    Messages:
    3,265
    Likes Received:
    843
    GPU:
    GTX 1080ti
    ZXRaziel likes this.
  18. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    14,675
    Likes Received:
    1,249
    GPU:
    RTX 2070FE
    Again though, the technology exists. Purpose, in this case, was not the argument.
     
  19. Astyanax

    Astyanax Ancient Guru

    Messages:
    3,265
    Likes Received:
    843
    GPU:
    GTX 1080ti
    Sorry, I meant in the case of Ryzen (or any x86 consumer chip, really); I have clarified in the original post.
     
    vbetts likes this.
  20. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    14,675
    Likes Received:
    1,249
    GPU:
    RTX 2070FE
    You're fine buddy. :)

    I know there were rumors about the AMD chip inside the PS5 featuring a sort of SMT4, or however you want to call it, but I would take anything like that with a grain of salt until we hear officially.
     
