I had that error, and it was definitely the power supply. It only went away when I switched to an 850W single-rail unit. Don't forget to run two separate PCIe power cables to the card, as it will choke on a single splitter cable.
I got my 2080 yesterday and have had one crash so far similar to this (driver stopped responding), but no other issues yet. I really hope it isn't a PSU requirement, as even the GPU I bought only recommends a 650W unit. I already had a 650W and researched before buying my 2080; everything I found said 650W should be more than enough. I'm currently running the GPU off one PSU cable with a Y-split on it, so if it happens again I may plug in a second dedicated GPU power cable instead of the single Y cable to see if that helps, but I still don't think the issue is a raw lack of power needing a beefier PSU. Even Hilbert recommends 600W for the 2080 and 650W for the 2080 Ti (https://www.guru3d.com/articles-pages/geforce-rtx-2080-ti-founders-review,9.html). Of course with overclocking and beefier systems he suggests a higher-wattage PSU, but I still think 650W should be enough for a single 2080 with an OC'd 8700K.
^ Don't use any splitters or converters. Those can restrict current a bit, or make it "dirtier". It could be a driver control issue, or a BIOS issue. Or the BIOS is fine and Nvidia needs to add your BIOS to its driver database. I've seen them collect 980 Ti BIOSes, mostly (or only) for custom GPUs, for fan and voltage info. Is F5 your latest BIOS? If not, update it as well: https://www.gigabyte.com/Graphics-Card/GV-N2080GAMING-OC-8GC#support-dl-bios

I would first report a bug to the Nvidia driver team (@ManuelG) or open a ticket on the main Nvidia site, then open a ticket with Gigabyte.

BTW: the fan shuts off below 60°C by default; that's why it wasn't spinning when you checked it. At 60°C it should normally start accelerating toward the temperature target, but in your case it jumps straight to 100% when the GPU hits 60°C.
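To make the expected behaviour concrete, here's a rough sketch of a normal semi-passive fan curve: off below ~60°C, then a gradual ramp toward the temperature target (the target and minimum duty values here are illustrative assumptions, not Gigabyte's actual firmware numbers). The card described above instead jumps straight to 100% at 60°C, which is why it looks wrong.

```python
# Illustrative semi-passive fan curve. FAN_OFF_BELOW_C matches the 60 C
# cutoff mentioned above; TEMP_TARGET_C and MIN_DUTY are assumptions.
FAN_OFF_BELOW_C = 60
TEMP_TARGET_C = 83      # typical Turing default target (assumption)
MIN_DUTY = 30           # % duty once the fan kicks in (assumption)

def fan_duty(temp_c):
    """Fan duty cycle (%) for a given GPU temperature in Celsius."""
    if temp_c < FAN_OFF_BELOW_C:
        return 0                      # semi-passive: fan fully stopped
    if temp_c >= TEMP_TARGET_C:
        return 100
    # linear ramp between the kick-in temperature and the target
    span = TEMP_TARGET_C - FAN_OFF_BELOW_C
    return round(MIN_DUTY + (100 - MIN_DUTY) * (temp_c - FAN_OFF_BELOW_C) / span)

print(fan_duty(50))   # 0   -- fan stopped, as observed when idle
print(fan_duty(60))   # 30  -- normal curve ramps gently, not straight to 100
print(fan_duty(83))   # 100
```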
Something quickly found: https://www.tomshardware.com/reviews/graphics-card-power-supply-balance,3979-6.html

"Looking at the interface to the PSU itself is even more interesting. Everything terminates at a single six-pin connector. Even though the power supply's specifications state that this connector is rated at 20A (240W), and the PSU does shut off, just as it should, beyond that number, a pair of eight-pin connectors on the other end suggests 300W. Simply put, this isn't possible due to the multi-rail PSU's OCP, making the cable configuration all bark and no bite. Unfortunately, the kinds of problems shown here reappear in many variations with other PSUs and their cables. Usually, combined cables like these should, at the very least, have a solid eight-pin connector on the PSU side. A good example is depicted below. The manufacturer decided to go with all-black cables for aesthetics. This doesn't matter to you or me, but the manufacturer incurs higher failure rates because the cables are harder to distinguish and identify correctly. The ideal would actually be to just have an individual cable for each PCIe power connector. That'd eliminate all of the problems posed by too-high currents and fast load fluctuations in one fell swoop. AWG 16 cables with a pair of eight-pin connectors and at least eight-pole PSU connectors fall in the same category of supremely usable."

Also, the max power for one of those cables is 150W (8-pin) + 75W (PCIe slot) = 225W, and GPU-Z often reads my card's power draw above that while gaming.
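The arithmetic above can be sketched as a quick budget check, using the standard connector ratings (75W from the slot, 75W per 6-pin, 150W per 8-pin). The key point is that a pigtail/splitter is still one cable from the PSU, so its two plugs share a single cable's rating:

```python
# Rated power budget for a graphics card, per standard PCIe connector limits.
SLOT_W = 75
SIX_PIN_W = 75
EIGHT_PIN_W = 150

def rated_budget(six_pin=0, eight_pin=0):
    """Rated power (watts): motherboard slot plus separate PSU cables."""
    return SLOT_W + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

# One dedicated 8-pin cable pigtailed into two plugs still only counts
# as one 8-pin cable's worth of rated capacity:
print(rated_budget(eight_pin=1))   # 225 -- the figure quoted above
# Two separate 8-pin cables from the PSU:
print(rated_budget(eight_pin=2))   # 375
```

This is why a card that spikes past 225W can misbehave on a single pigtail but run fine on two dedicated cables.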
^ That's what I was using, a pigtail. Has anyone messed with the auto-overclock feature on these cards yet? I don't see the option in Afterburner, but the Gigabyte software that came with my card has it. Auto-anything always makes me a bit nervous...
Yeah, I leave mine in OC mode (1800MHz boost), but if you watch the clocks in GPU-Z it's actually boosting up to 1920MHz. That seems to be fairly common from the reviews and threads I've read.
Has anyone seen a review of the EVGA GeForce RTX 2080 FTW3 anywhere? I took a bit of a look and can't find any. If anyone has seen one, please lmk. Cheers
Nah, 144Hz is the keyword there. I had a similar setup and went from a 1070 to a 2080 so I can stay above 100fps or use ultra settings in games where I otherwise couldn't. Playing at or around 144Hz with everything on ultra is so worth it, whereas before, games like AC:O would dip down to the 50s even with settings adjusted. I guess "waste of money" depends on what you want your gaming experience to be and is very subjective. I personally think 4K is a waste of money and that gaming should be focused on 1440p instead of pushing 4K everywhere, but that's just me... I prefer 120+ fps/Hz over a resolution bump, and 1440p is typically pretty golden even with just FXAA.

EDIT: Boy, PC gaming has become spoiled. I remember back when everyone was just aiming for 60fps in Crysis, lol. Good time to be alive for gaming, I guess.

EDIT 2: Just want to make it clear that I don't think 4K is stupid or useless; I just think gaming should be pushing toward 1440p instead. Especially consoles... they can barely sustain 60fps in most games.
I would just like to point out that I called the naming scheme way before the name was announced. I'm smart.
Nvidia divides its Turing GPUs into A and non-A classes. If you're lucky, you'll find an A-GPU under the cooler of a lower-priced GeForce RTX, which opens up a lot of headroom. PC Games Hardware shows what lucky tuning enthusiasts can get out of a GeForce RTX 2080. While owners of a factory-overclocked Turing graphics card are guaranteed an A-chip, anyone with a cheaper card at reference clocks should check the device ID using GPU-Z. Remember, it's at your own risk... http://www.pcgameshardware.de/Gefor.../Nvidia-BIOS-Mod-Powerlimit-erhoehen-1268042/
.... Yeah, read that too. Or you check first with teardown reviews or, where possible, the TechPowerUp database to see what chip's underneath the cooler.
Interesting. I just checked and my 2080's device ID is "1E87", which according to that article is "good". I assume this is similar to the silicon lottery we all play with CPUs? Some chips are just 'better' than others when it comes to OC/voltages... similar thing going on here?
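For anyone else doing the GPU-Z check, here's a small lookup sketch. The ID table is what the PCGH article and TechPowerUp listings report for the 2080 and 2080 Ti; treat it as an assumption and verify against those sources for your card.

```python
# Device-ID table for Turing A vs non-A chips, per the linked PCGH article
# and TechPowerUp listings (assumption -- verify before flashing anything).
TURING_IDS = {
    "1E82": ("RTX 2080", "non-A (TU104-400)"),
    "1E87": ("RTX 2080", "A (TU104-400A)"),
    "1E04": ("RTX 2080 Ti", "non-A (TU102-300)"),
    "1E07": ("RTX 2080 Ti", "A (TU102-300A)"),
}

def classify(device_id):
    """Return (card, variant) for a hex device ID string, or None if unknown."""
    return TURING_IDS.get(device_id.strip().upper())

print(classify("1e87"))  # ('RTX 2080', 'A (TU104-400A)') -- the 'good' chip
```

Same idea as the CPU silicon lottery, except here the binning is explicit: the device ID tells you which bin you got.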
From what I saw, A models are just blower-style GPUs.

Edit: ah, they removed the separation @geizhals. They had TU104-A and normal TU104 listed, from about five different makers I think. It was the same with the 2080 Ti, although Zotac made it clear with an 'A' at the end: https://geizhals.eu/zotac-gaming-ge...-t20800a-10p-a1870909.html?show_description=1