NVIDIA Sells Two SKUs of Each Turing GPU (a Normal and an OC Model)

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 17, 2018.

  1. ht_addict

    ht_addict Active Member

    Messages:
    76
    Likes Received:
    23
    GPU:
    Asus Vega64(CF)
    I've always had AMD for CPUs and GPUs. Never had an issue with them playing games (high to ultra settings), running benchmarks, etc. Sure, they never gave the highest FPS or the highest scores, but they worked, and they saved me a few $$$ that I could use on other components for my setup. People have got to stop looking at the graphs and listening to all the hype. We really need to stop pumping out new cards yearly with minimal performance increases and new features, and instead focus on optimizing software to utilize what we have now.
     
  2. fry178

    fry178 Ancient Guru

    Messages:
    2,078
    Likes Received:
    379
    GPU:
    Aorus 2080S WB
    @Yogi
    you forgot the time frame you want to cover.
    There is a difference between needing something to use the PC/play games on for a couple of months before getting something bigger,
    and being willing to spend more on a bigger chip because I want to keep it for years/future game releases, even if it's not the best ratio...
     
  3. Koniakki

    Koniakki Guest

    Messages:
    2,843
    Likes Received:
    452
    GPU:
    ZOTAC GTX 1080Ti FE
    Does this only apply to AIB partners' selection/preference? Or does it also apply to the cards sold directly on Nvidia's website?

    If so, how do customers know which one they will get if they pre-order directly from Nvidia? Nvm, the 2080 Ti Founders Edition already has a factory OC applied, which indicates the TU102-300-A SKU.

    And if they are only selling the TU102-300-A (OC SKU) on the Nvidia website, what will happen to all the remaining non-OC TU102-300 chips, since no one is opting for them?

    Or do the AIBs indeed opt for the non-OC TU102-300, but only for the blower-style/cheapest launch version from each brand?

    Last but not least, wasn't the RTX 2080 NDA supposed to be lifted today (Sept 17th)? Just checked, and it seems it's been pushed back to September 19th.
     
    Last edited: Sep 17, 2018
  4. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,640
    Likes Received:
    1,143
    GPU:
    4090 FE H20
    People want AMD to produce better products in every way.

    AMD's cards over the last few generations have usually been marginally slower, while running hotter and being significantly more power hungry.
    Selling the RX Vega 64 for more than a 1080 Ti (which is significantly faster) is not a good product for the price.

    Unfortunately, miners did jack up the prices, but AMD cards just haven't been comparable to the equivalent NV cards as a whole.

    The RX 580 is a bit faster than the equivalent 1060 but uses more than double the wattage to do so.
    That's not a competitive architecture.

    A similar comparison holds for Vega vs the 1070/1080.

    The extreme power gap between AMD and NV shows that AMD is forced to run higher voltages and clock speeds to reach similar performance.
    Pascal and Kepler are clocked low out of the box with a huge clock ceiling.

    That's like Chevy tuning their V8 to the max, sacrificing efficiency/MPG, to get performance comparable to another manufacturer's.

    So the statement "I wish AMD was more competitive" is a good one.
    If they could produce all-around better cards than NV, NV would have to put much more effort into their designs while not raising their prices.

    $1,200 for a 2080 Ti is possible because AMD is not competitive.

    If the Vega 64 were $500 and performed better than the 1080 Ti, do you think the 2080 Ti would be $1,200?
    Absolutely not.

    And lastly, your experience is not everyone's experience.
    That's your opinion.

    I can tell you that the Vega 64 is definitely not fast enough for me, and my experience would not be fantastic either (even my 2,100 MHz 1080 Ti is not fast enough).

    I play at 165 Hz 1440p and find <100 FPS jarring.
     
    fry178 and Robbo9999 like this.

  5. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    Yeah, Nvidia dominates in the mainstream and budget markets as well, with the GTX 1060 vastly outselling the RX 480/580. In reality, these gamers would be far better off with an AMD GPU + FreeSync monitor, but Nvidia's brand is extremely strong.
     
    carnivore likes this.
  6. fry178

    fry178 Ancient Guru

    Messages:
    2,078
    Likes Received:
    379
    GPU:
    Aorus 2080S WB
    Not if they don't need FreeSync/G-Sync.
    The 1060 6GB is fast enough to run most games with decent settings at 60 FPS with regular V-Sync.
    And it's not like a lot of those buying 1060s have G-Sync monitors,
    so removing the FreeSync "advantage" from the equation means you need a 580 (rather than a 480) to be on the 1060's level.
     
  7. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    The thing is that it is not. I would get the Vega cards over their counterparts, but not over the 2070.
    Hence AMD has nothing at this level of performance and up.
    It matters because it's yet another tickbox that pushes final prices even higher than the supposed MSRP. Instead of yields and the average OC of later-produced chips stabilizing into the standard, you can now sell those chips at a premium over your supposed MSRP, because there is no one out there to stop you.

    AMD matters only until there's a cheap offer on a Vega 56 now. Even a Vega 64 at MSRP is not worth it over the 2070.
     
  8. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,665
    Likes Received:
    597
    GPU:
    RTX3090 GB GamingOC
    So basically 70%+ ASIC quality is now being turned into a money grab, and it will probably always be this way from now on. Nvidia wants more for the higher-ASIC cards.
     
  9. Yogi

    Yogi Master Guru

    Messages:
    354
    Likes Received:
    153
    GPU:
    Sapphire RX 6800
    That's still determined by a prospective buyer's budget, and again by perf/€.
    If I were to give advice, it would be:
    1) What resolution do you want to use?
    2) What's your budget?
    3) If the budget exceeds the minimum tier of GPU for that resolution, would you be better off spending on an adaptive-sync monitor, a higher-tier GPU, or something else like a larger SSD, etc.?
     
  10. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Well, maybe AMD would be more competitive if people did not persuade everyone around them (including themselves) that they should not be buying AMD's products.
    Especially when they use false statements to do it.

    So, you think that the RX 580 uses more than double the wattage of a GTX 1060? I guess 2 × 120 < 185 now. And in reality that RX 580 usually draws 140~160 W depending on the game. Performance per watt is almost the same for those two cards.
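    A back-of-the-envelope check of that claim, taking the official board-power figures at face value (a sketch in Python; the 120 W and 185 W TDPs are the assumed inputs, not measurements):

        # Sanity-check the "more than double the wattage" claim using the
        # official TDP figures cited above (assumed values, not measurements).
        gtx_1060_tdp = 120  # watts
        rx_580_tdp = 185    # watts

        ratio = rx_580_tdp / gtx_1060_tdp
        print(f"RX 580 / GTX 1060: {ratio:.2f}x")  # ~1.54x, well short of 2x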

    So, good job. Persuade more people. Then you can blame AMD some more for nVidia's pricing. Maybe blame AMD for Intel's pricing... Sorry, I forgot you already did.
    Now imagine what would happen if AMD matched Ryzen exactly to Intel's pricing. Both would be selling at those lovely Intel prices.
    Now imagine that AMD releases Navi next year and prices it to match nVidia's new prices. Who will you blame? What will the statement be? "Evil AMD is not reducing prices"?
    Surprise, surprise... AMD is not here to make nVidia's GPUs more affordable. That's not their purpose at all.
     
    carnivore likes this.

  11. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,640
    Likes Received:
    1,143
    GPU:
    4090 FE H20
    Here you go again with your nonsense, taking the meaning far beyond the points made.

    And yes, the RX 580 uses nearly 2x the wattage.

    Don't like facts?

    [images: third-party power-consumption charts]

    Whatever you want to believe, it's a fact that NV is significantly more power efficient out of the box. Granted, the card above is a non-reference card, but the power difference between it and stock is what, 30 watts? Still a huge difference from the 1060's power consumption.

    Edit: Guru3D shows a stock RX 580 using 191 watts, just so I'm not picking and choosing.
    That's nearly a 60% increase in power consumption.

    Which is pretty bad.
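    For reference, the arithmetic behind that "nearly 60%" (assuming the comparison is Guru3D's measured 191 W against the 1060's 120 W TDP; just a sketch of the math, not any reviewer's methodology):

        # "Nearly 60%": Guru3D's 191 W measurement for a stock RX 580 versus
        # the GTX 1060's 120 W TDP (figures as cited in this thread).
        rx_580_measured = 191  # watts
        gtx_1060_tdp = 120     # watts

        increase = (rx_580_measured / gtx_1060_tdp - 1) * 100
        print(f"{increase:.0f}% more power")  # ~59%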
     
    Last edited: Sep 17, 2018
    BangTail likes this.
  12. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,140
    Likes Received:
    1,743
    GPU:
    GTX 1080 Ti
    Vega is not bad at all. What really killed Vega is availability, which was due to low yields and also miners snatching up the cards as soon as they were available. Because Pascal had higher yields and more market time before Vega came out, it's easier to find a 1080 at MSRP or lower than a Vega at MSRP or lower. As for the reference design, the 1080 sells for $550 straight from Nvidia. I don't think AMD sells their own branded Vega 64? Just board partners do?
     
    PrMinisterGR and Maddness like this.
  13. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Seeing those graphs. Lovely. Facts:
    - The Fury X is hard-capped at 300 W. It required a vBIOS edit to allow it to eat up to 360 W.
    - I have access to two RX 580s, by a twist of fate... Yes, the exact Sapphire RX 580 Nitro+ 8GB mentioned is one of them. It does not even go to 180 W without moving the power slider in MSI AB. And at the times it does, the workload is taking it to ~60 FPS or less.
    - And guess why the RX 570/580 were favorite mining cards. Do you remember Ethereum? 23 MHash/s on a GTX 1060, 28 MHash/s on an RX 580. You should have advised those guys better; maybe they would not have bought as many of those RX 570/580s.
    - Quite a few of those wattages in your TPU images are quite ridiculous, and I mean that Hilbert's calculated values are actually much more accurate (and both of those RX 580s are special OC editions).
     
  14. Yogi

    Yogi Master Guru

    Messages:
    354
    Likes Received:
    153
    GPU:
    Sapphire RX 6800
    Going off average US electric rates, that's about a $2.50 difference for a solid week of running FurMark, or ten bucks a month running the cards 24/7.
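    A rough reproduction of those figures, assuming a ~120 W load difference between the cards and a ~$0.12/kWh average US rate (both assumptions for illustration, not numbers from the thread):

        # Electricity-cost estimate: assumed ~120 W load difference at an
        # assumed average US residential rate of $0.12/kWh (2018 ballpark).
        delta_kw = 0.120  # kW, assumed load-power difference
        rate = 0.12       # $/kWh, assumed average US rate

        per_week = delta_kw * 24 * 7 * rate    # ~$2.42 for a week of FurMark
        per_month = delta_kw * 24 * 30 * rate  # ~$10.37 running 24/7 for a month
        print(f"week: ${per_week:.2f}, month: ${per_month:.2f}")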
     
  15. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
    That's an extreme example. The Nitro cards are highly overclocked and not representative of most 580s. For instance, Guru3D's review of MSI's RX 580 Gaming X shows 191 watts (vs 134 watts for the 1060).

    https://www.guru3d.com/articles_pages/msi_radeon_rx_580_gaming_x_review,5.html

    Pascal is much more power-efficient overall, but saying that it's 2X is overblown.

    To be fair, many of those miners were modding and undervolting their GPUs. You can also reduce power consumption greatly on a 1060, although hashing efficiency per watt still favored Polaris. Also, DaggerHashimoto simply ran better on AMD hardware (Nvidia GPUs are better for other algorithms).
     

  16. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,640
    Likes Received:
    1,143
    GPU:
    4090 FE H20
    Well, considering I found several reviews with just two minutes of searching that show the Fury X drawing over 300 W, that proves your statement incorrect.

    [image: power-consumption chart showing the Fury X above 300 W]

    Another 100 W+ difference between an OC'd 1060 and an OC'd 580.

    Tom's Hardware also shows 224 watts under load for the Nitro 580.

    Are you saying every reviewer out there is wrong?
    I mean, the data proves you wrong several times over.

    Lastly, you bring in more things that have absolutely zero relevance to the post.
    How exactly is Ethereum mining performance relevant to the previous statements?

    It's not.

    And by the way, TechPowerUp doesn't use calculations.
    It uses hardware to measure the power consumption of the DC input directly, which gets rid of human error. Mind you, that equipment costs several thousand dollars.
     
    BangTail likes this.
  17. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,140
    Likes Received:
    1,743
    GPU:
    GTX 1080 Ti
    Enough of this conversation. Only warning.
     
  18. tunejunky

    tunejunky Ancient Guru

    Messages:
    4,451
    Likes Received:
    3,071
    GPU:
    7900xtx/7900xt
    As schmitty and vbetts said... this reeks of Intel marketing.

    There has been a lot of engineering cross-pollination; I guess some marketing people have moved over as well...

    Honestly, this is a major fail that should never have seen the light of day.

    And it is very true that AIB manufacturers have very thin margins that are being squeezed thinner. In the past, the lower-binned chips would have been a different model if the variance in headroom and performance was this high.
    Again, greed and sloppy marketing for a very short-lived product staring obsolescence in the face (from its own company at the very least, not counting AMD and Intel).
     
  19. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
    That's assuming the absolute worst-case scenario, where there isn't enough AIB competition. I don't see that happening any time soon.
     
  20. fry178

    fry178 Ancient Guru

    Messages:
    2,078
    Likes Received:
    379
    GPU:
    Aorus 2080S WB
    @Yogi
    And I still prefer to buy the more efficient unit, as the power has to be generated somewhere;
    the less my PC needs, the less greenhouse effect (while doing the same thing with it)...

    Same reason I would have preferred a Golf GTI over a Pontiac Firebird at the same HP (1.6L vs 7L),
    not that I was old enough to drive ;)
     
