GeForce RTX 3080 CTD Issues likely due to POSCAP and MLCC configuration

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 26, 2020.

  1. AuerX

    AuerX Master Guru

    Messages:
    253
    Likes Received:
    122
    GPU:
    EVGA RTX3080
    Leaks can and will hurt a company. You don't want your competition to know what's up.

    We're also seeing leaks become a moneymaker for YouTubers and other "tech journalists", and that seems to be the culture we're in.

    We end up with paranoid companies because of leakers that gamers reward with clicks.
     
  2. Mufflore

    Mufflore Ancient Guru

    Messages:
    12,975
    Likes Received:
    1,430
    GPU:
    Aorus 3090 Xtreme
    They won't change their method, only try to perfect it.
     
  3. kapu

    kapu Ancient Guru

    Messages:
    4,794
    Likes Received:
    443
    GPU:
    Radeon 6800


    Check out this video, quite interesting :)
     
  4. Astyanax

    Astyanax Ancient Guru

    Messages:
    11,365
    Likes Received:
    4,263
    GPU:
    GTX 1080ti
    How to do everything wrong when modifying a circuit board.
     

  5. DannyD

    DannyD Ancient Guru

    Messages:
    3,019
    Likes Received:
    2,020
    GPU:
    EVGA 2060
    The 'RT cores' are tied to the regular cores, just like on the 20 series, so if an overclocked 3060 Ti has exactly the same overall game performance as a 2080 Ti, it'll also match it in RT-enabled games.
    The only reason the gen-2 RT cores are fast is that the GPU cores they're tied to are fast.
    It's not like an Intel CPU, where the processor and the integrated GPU are separate parts; the GPU cores and RT cores aren't separate.
     
  6. The Goose

    The Goose Ancient Guru

    Messages:
    2,810
    Likes Received:
    246
    GPU:
    MSIrtx3070 gaming X
    Let's hope the AIBs have learned a lesson in time for the 3070 launch.
     
    DannyD likes this.
  7. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    6,083
    Likes Received:
    2,420
    GPU:
    HIS R9 290
    As a Linux user, I can confirm this is definitely true. Nvidia is weirdly anal about how much they reveal/share about anything at all, as though whatever they reveal is going to ruin the company.
    People have reverse-engineered their drivers, and you can inspect their architecture with an electron microscope. For unreleased products, there isn't a way for anyone (including competitors) to mount a retaliatory response in a timely manner anyway; if Nvidia's goal was to keep people from forming driver impressions before the product launched, not only did they fail at that, they shot themselves in the foot by supplying an unstable driver.

    Whatever they want to hide, it can be revealed with enough effort. One of the primary reasons I don't buy Nvidia anymore is because of their lack of cooperation with anyone who doesn't want to do things their way. AMD might not impress but at least they don't annoy.
     
  8. StarvinMarvinDK

    StarvinMarvinDK Maha Guru

    Messages:
    1,362
    Likes Received:
    82
    GPU:
    GB Gaming OC 3080
    Well, since the majority of the problem seems to have been solved by the new driver, I'd say the only thing that can really screw up the 3070 launch is there not being enough cards to meet customer demand.
     
  9. kapu

    kapu Ancient Guru

    Messages:
    4,794
    Likes Received:
    443
    GPU:
    Radeon 6800
    What did he do wrong? And if so, why haven't you done better? :p
     
  10. Mineria

    Mineria Ancient Guru

    Messages:
    4,766
    Likes Received:
    349
    GPU:
    Asus RTX 3080 Ti
    Their worst problem is supply and demand, together with AMD releasing their cards soon.
     

  11. AuerX

    AuerX Master Guru

    Messages:
    253
    Likes Received:
    122
    GPU:
    EVGA RTX3080
    Fixd
     
  12. Gomez Addams

    Gomez Addams Member Guru

    Messages:
    192
    Likes Received:
    101
    GPU:
    RTX 3090
    Yes, that is fairly obvious, but it does not excuse giving developers an inadequate piece of software to test with. If problems of this severity were not detected then it is inadequate by definition.
     
  13. alanm

    alanm Ancient Guru

    Messages:
    10,492
    Likes Received:
    2,604
    GPU:
    Asus 2080 Dual OC
    Of course it doesn't excuse them. Their paranoia about leaks, and the resulting withholding of proper testing support from their partners, is a ringing indictment of the way they introduce products to the market. It's backfired spectacularly. Not sure where you saw 'excusing them' implied anywhere in this. :D
     
  14. AuerX

    AuerX Master Guru

    Messages:
    253
    Likes Received:
    122
    GPU:
    EVGA RTX3080
    How much of that paranoia is justified? And is it really fair to use the word paranoia as it implies delusion?

    A lot of this has to do with us living in a reality where leaks can cost a company a lot of money, and leakers stand to make a lot of money.
     
  15. alanm

    alanm Ancient Guru

    Messages:
    10,492
    Likes Received:
    2,604
    GPU:
    Asus 2080 Dual OC
    Sure, but it's a balance between a rational assessment of what your competition can do with the info vs. what your product's quality and stability (or lack thereof) can suffer as a result of clamping down on proper testing. THAT, imo, has hurt Nvidia and their rep far more than any threat from their competition. AMD would ultimately know exactly how Ampere performs on launch day anyway. So what if they knew a week or two earlier due to leaked benches? All they would have gained is two extra weeks to factor what they know about Ampere into their own product response, which is too little time to do anything substantial beyond BIOS tweaks (like what they did with the 5600 XT).

    So yes, when the quest for absolute secrecy results in botched product launches, a hurt reputation, and upset customers, it has gone beyond rational thinking into paranoia.
     
    Herem, Maddness and carnivore like this.

  16. brogadget

    brogadget Active Member

    Messages:
    94
    Likes Received:
    23
    GPU:
    2xR9 280x 3GB
    Yep, we don't know exactly what happened with the driver. Fact is, there *was* an issue, and they changed something, maybe by lowering another so-called "card limit" (power target or something else). A 3x 8-pin to 12-pin adapter would probably break any default specification anyway.
     
  17. brogadget

    brogadget Active Member

    Messages:
    94
    Likes Received:
    23
    GPU:
    2xR9 280x 3GB
    And this is why early birds become beta testers these days. Really sad...
     
  18. isidore

    isidore Ancient Guru

    Messages:
    6,276
    Likes Received:
    58
    GPU:
    RTX 2080TI GamingOC
    This, I was waiting for someone to do this, thx.
    Isn't he the one who created that special water block for AMD Ryzen, for better cooling?
     
    HARDRESET likes this.
  19. Astyanax

    Astyanax Ancient Guru

    Messages:
    11,365
    Likes Received:
    4,263
    GPU:
    GTX 1080ti
    1. I would have used flux, and 2. I wouldn't destroy something I paid money for.
     
  20. kapu

    kapu Ancient Guru

    Messages:
    4,794
    Likes Received:
    443
    GPU:
    Radeon 6800
    It's not broken if it works, and it's far from destroyed. Also, he got paid for it. I don't see the problem.
     
    HARDRESET and StarvinMarvinDK like this.
