AMD Security Announcement on Fallout, RIDL and ZombieLoad Attack

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 15, 2019.

  1. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    I don't know how many times this needs to be said.

    If you're going to complain about NVIDIA's pricing, then you're complaining about MSRP. If you're going to complain about non-MSRP pricing, then you're complaining about retailers and third-party manufacturers.

    This is a fact. If you argue it, that's a you problem.

    Where's your proof of that? "The same" implies the same cost of R&D, so where's your proof?

    The only information I can find points to up to 10 years of development/research and millions of hours of research and development.

    Are you saying they all take that much? If so, NVIDIA must be hemorrhaging money, because that would not be sustainable.
     
    Last edited: May 16, 2019
  2. xrodney

    xrodney Master Guru

    Messages:
    368
    Likes Received:
    68
    GPU:
    Sapphire 7900 XTX
    Sorry, but if you set an MSRP and then don't deliver a sufficient number of cards, the price goes up and it's your fault; that's how the market works, so the fault is still on NVIDIA here.

    We can all see the yearly figures for R&D spending, though nobody except NVIDIA itself knows the exact details. Even if you say you spent 10 years developing something, that says nothing, because it gives no information about the time and resources spent; you could have put it on paper 10 years ago and come back to it a month ago with one person involved, and that statement would still be true.
    If you look at the yearly R&D expenses there is no sudden peak; they just grow slightly year over year as NVIDIA starts investing in other areas.

    If you look at the architecture from Maxwell to Pascal to Volta and finally Turing, there are no major changes to the GPU architecture; only Volta added tensor cores, which NVIDIA more or less reuses with small changes on Turing for the RTX functions.
    Turing wasn't in the long-term plans, so there was no long development; it's just something NVIDIA put together quickly, and since Volta runs RTX just as well as Turing does, tell me where Turing is hiding your ten years of expensive development.
    It's simply not there.

    NVIDIA is reporting 11.72 billion in revenue with a gross margin of 61.2%, of which 6.25 billion comes from the gaming business, and it spent only 647 million on R&D, with only a very small part of that related to gaming.

    So no, there was no reason for the Turing price hike except to push your already insanely high margins even higher, which more than deserves to be called price gouging.

    I may not know the exact amount NVIDIA spent on Turing, but we can estimate the real costs, and NVIDIA's financial report speaks more than clearly.
     
  3. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
    Again, those are retailer and 3rd-party manufacturer issues.

    Their problem, not NVIDIA's. Nor have I seen RTX 2080 Ti stock running low, so your idea that there isn't a sufficient number of cards really has no bearing at all. Maybe where YOU live, possibly, but that's not a global issue, and it has nothing to do with 3rd-party manufacturers and retailers increasing the price. That is on them.

    As for the rest of your post, it's pure nonsense based on literally nothing. You have zero facts in what you state; you just claim to be right, claim that NVIDIA had no reason to increase the price, because you as an individual want that to be true. We don't know the facts other than what NVIDIA has told us. You can choose not to believe NVIDIA, as you are doing, or you can get off your high horse, come back down to reality, and understand that businesses exist to make money, and how much they decide to charge for their products to make that money is their business, not ours. We can choose not to pay it, and if that helps the future, great! But that's no guarantee.

    You have zero clue how much R&D went into the RTX features, so you have zero grounds for your nonsense statements.

    Per NVIDIA, the only source we have:

    https://www.nvidia.com/en-us/geforce/news/geforce-gtx-ray-tracing-coming-soon/

    "It required millions of hours of research and development, focusing on everything from GPU hardware and software, to updated APIs and game engines, to development tools and denoisers."

    And the only other information:

    https://nvidianews.nvidia.com/news/...l-time-ray-tracing-to-gamers-with-geforce-rtx

    So yes, you can continue to believe that it's just something they "quickly put together" while disregarding the actual information we have to go on, and continue to be a conspiracy theorist, just because you can. Because apparently, according to you, something that has been so difficult to achieve, real-time ray tracing in games, is actually extremely easy to make possible and was "quickly put together".

    Either way, this is off-topic for the thread at hand, and with mentalities such as yours, I have no desire to continue this with you. You're one of those people who chooses to believe whatever they want with no regard for the facts or information we have, basing everything on personal experience that isn't relevant to anything on the topic.
     
    Last edited: May 16, 2019
  4. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,751
    Likes Received:
    1,868
    GPU:
    EVGA 1070Ti Black
    Has Intel made an announcement about all this yet, or is it more head-in-the-sand? My next system, if I build a new one, is going to be AMD, plain and simple, and I've been on Intel for as long as I can remember. I'm getting tired of this crap: every 6 months something else is outed that will reduce CPU performance just to plug it. And so far most of this stuff doesn't affect AMD (AMD using a newer architecture? Intel using a decades-old architecture?). I'm not really sure, and don't really care; the majority of people probably aren't even protected from the initial flaws, which required MS to push all the fixes through their updates, and to this day, even in Windows 10, I find pending updates more than anything else.

    I know the old PC my dad is using will never be fully patched; it'll be lucky to be even partially patched against the initial flaws, and I guarantee there are pending updates on that PC too.
     

  5. ladcrooks

    ladcrooks Guest

    Messages:
    369
    Likes Received:
    66
    GPU:
    DECIDING
    I laughed at this - nice to be 100%, then x3; well, you're mentally a winner :D

    With AMD you get lower per-core performance, a lower-efficiency interconnect, and lower performance in games and productivity software. :D

    You made me laugh as well, for the wrong reason - AMD are so, so, so behind in what? Even if I could afford their most expensive CPU, my logic tells me it's another 5 - 15 fps, and in other apps in the real world you would hardly notice. I bet most gamers are still on 1080p, so why do you need 180... fps?

    Now I will add the word lower! LOWER prices.
     
    Aura89 likes this.
  6. D3Master

    D3Master Member

    Messages:
    21
    Likes Received:
    4
    GPU:
    SLI Gigabyte GTX 1070 G1
    Go ahead and enjoy your poorly optimized games in 720p with Intel.

    For me, I'm experiencing the opposite with Threadripper in applications, and I max out 3440x1440 in games.
     
  7. ZXRaziel

    ZXRaziel Master Guru

    Messages:
    425
    Likes Received:
    134
    GPU:
    Nvidia
    I am not a fanboy, but I can clearly see one where there is one. Have a good day.
     
  8. ZXRaziel

    ZXRaziel Master Guru

    Messages:
    425
    Likes Received:
    134
    GPU:
    Nvidia
    Yes, that's right; it seems that the small performance advantage comes at a price ;-)
     
  9. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    The only mental gymnastics around here are on your side.
    The entire bold part rests on the term AMD and on the final admission that you shifted the comparison to a bizarre place by stating that you intentionally compare a cheaper solution with a considerably more expensive one.

    Edit: Lol, man, put quotes in quotes, or at least make sure they are somehow properly linked to the person who wrote them.
     
    Last edited: May 18, 2019
