Discussion in 'Videocards - NVIDIA GeForce' started by Netherwind, Sep 2, 2020.
I'm not sure anymore; I heard there's a 20 GB version coming very soon, so I might hold out for a bit.
I'd say your 2080Ti will do a stellar job of handling Cyberpunk 2077.
Welp, I have a backorder in for a Strix 3080 with Memory Express here in Canada; hopefully it doesn't take two months till I see the card.
Hopefully sooner. My retailer has confirmed on stream that they will receive shipments almost daily from various vendors in different quantities.
Yeah, I think so too. Maybe I'll wait for the 3080 Ti/Super with 20GB VRAM, if and when it comes. I just need 4K/60.
Have a 2070 Super that I spent too much to get, in the midst of the AMD Radeon drivers going full Hindenburg on my modeling work here.
It does not have enough VRAM at 8 GB.
Stupid-proofing hidden in spoiler.
Not just ALLOCATED, USED @#$%ING VRAM. Period. DO. NOT. ARGUE. NOT. ENOUGH. VRAM.
Hitching, small freezes, modeling software slowing a 1-2 second op down to a 3-5 minute crawl, the taskbar wigging out, Windows chastising me about thrashing when less than 1 GB of VRAM remained free, and all that.
With 8 GB not being enough on my barely-half-finished city mod (while modeling, anyhow), I would implore folks to think twice before ordering a card with 'only' 8~10 GB right now. The 3090 looks beautiful with its 24 GB, if MASSIVELY overkill on price... that will outlast the card's core power during its usable life by leaps and bounds, but 8~10 GB is something we already got with 1070s and 1080s.
1x DDS compressed texture (and its required layers) at 8K can EASILY take up to 300 MB, sometimes more - always more when some layers include translucency.
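As a back-of-envelope check on that 300 MB figure, here's a minimal sketch. The bytes-per-texel rates are the standard DDS block-compression (BC) rates; the four-layer material setup is a hypothetical example, not taken from the post.

```python
# Rough VRAM estimate for one 8K material stored in DDS block-compressed
# formats. BC1/BC4 store 0.5 bytes per texel; BC3/BC5/BC7 store 1 byte
# per texel. A full mip chain adds roughly one third on top of the base level.

def texture_bytes(width, height, bytes_per_texel, mips=True):
    """Size of one texture layer, optionally including its mip chain."""
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mips else base

# Hypothetical four-layer 8K material (names and format choices are
# illustrative assumptions).
layers = {
    "albedo (BC7)":       1.0,
    "normal (BC5)":       1.0,
    "roughness (BC4)":    0.5,
    "translucency (BC3)": 1.0,
}

total = sum(texture_bytes(8192, 8192, bpt) for bpt in layers.values())
print(f"{total / 2**20:.0f} MiB")  # ~299 MiB for this four-layer 8K material
```

Even with best-case block compression, one fully-mipped 8K material lands right around the 300 MB the post describes; translucency or extra layers push it higher.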
A fancy tunnel model (mind you, one third of the tunnel!) can easily top 2 million triangles, especially if foliage models are included rather than seeded automatically at the engine level with easily-batch-rendered and sometimes partially or wholly 2D plants. That puts just the main visible model itself in the 300~500 MB range with some model formats (others may be more or less, but the engine can also compress these prior to or at run-time for a nice savings).
So basically, a few nice concrete textures WITHOUT LAYERING VARIATIONS (which take up additional room) take up almost 1 GB... and the model, let's say that compresses to 120 MB in VRAM total (for all detail levels combined), and you have over 1 GB used just for a stupid tunnel. Not the road texture, not the reflectors, the warning stripes on barriers, the barrier textures, the emergency escape signs/doors etc, not the plants hanging from the ceiling vents which open to the outside at / above each entrance... nope, none of that. Just the concrete superstructure textures. All things considered, after 1~2 GB total in just TEXTURES, you start to have most of your boxes checked... throw in the model and you're sitting there at 2~2.5 GB of VRAM used.
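Tallying that budget explicitly, with hypothetical per-asset sizes chosen inside the ranges the post quotes:

```python
# Illustrative VRAM budget for the tunnel example; each figure is an
# assumption in the ballpark the post describes, not a measured value.
budget_mib = {
    "concrete superstructure textures (few 8K sets)": 950,
    "tunnel model, all LODs, compressed":             120,
    "road / barriers / signage / misc textures":     1100,
}

total_gib = sum(budget_mib.values()) / 1024
print(f"{total_gib:.1f} GiB")  # ~2.1 GiB, squarely in the 2-2.5 GB range quoted
```

One scene feature eats a quarter of an 8 GB card before gameplay assets, the OS, or RT working space enter the picture.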
Should have gotten a Radeon VII with 16 GB and used the pro drivers... I didn't know I could use pro drivers for it, or I'd have had one already and not be here moaning about it.
So basically now my 2070 Super is a paperweight, destined for the dust bin by the end of the year, as I've had to get a Quadro card to continue my work.
Case in point: don't be me. Let me take the fall for y'all and warn you that nope, 8 GB is not enough. It might be just fine in a majority of games you own right now, but there's already a handful of titles that ask for more. Add to this the working space RT needs, the OS's VRAM overhead, and anything else you might do, like adding more resources by modding the game of choice, and you can see where this goes, fast.
I've been babbling on about this since the summer when I started running up against the 8gb VRAM wall pretty hard, warning folks that 8gb is alright for 2020, but the year's almost over.
Figures above for space assume best-case use of DDS compression on power-of-two, square textures. I won't even bring up 16K textures yet.
Figures assume models will be compressed into proprietary binary formats custom to your game engine / use case before or at run-time, to additionally save on VRAM use.
Of course it takes LESS VRAM to just run the game without the development tools alongside it, but gaming itself is never far behind what's being developed, for reasons which should be readily obvious.
Make sure you can get correct info from your game's/company's SDK or through an NVIDIA, Intel, or AMD (or DirectX) SDK; otherwise you'll get your ears rattled off by noobs/kids thinking they know better than you, all because one of the tech tubers out there brought up 'allocation' in a video and now the childhood stars of tomorrow (cough, cough) won't stop yammering about it.
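For actual used (not merely allocated) VRAM on NVIDIA cards, one real source is `nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader`. A minimal sketch of parsing one line of that output; the sample string below is illustrative, not a capture from any poster's machine:

```python
# Parse one line of nvidia-smi CSV memory output. The sample line is a
# hardcoded example; in practice you'd read it from the command's stdout.
sample = "7821 MiB, 8192 MiB"  # memory.used, memory.total

used_mib, total_mib = (int(field.split()[0]) for field in sample.split(","))
free_mib = total_mib - used_mib
print(f"used {used_mib} MiB, free {free_mib} MiB of {total_mib} MiB")
```

With this hypothetical reading, an 8 GB card sits under 400 MiB free, which is exactly the thrashing territory described above.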
--Spend your money wisely, folks.
Oh, and in other news, yes, it's hysterical seeing the prices of 3080s on eBay, when the sellers, half of whom don't even have them in hand yet, are so hoping they'll get them reasonably on time.
'Probably' the 3090... a massive upgrade for me at the tail end of this year into early next. Everything from the monitor (4096x2160, 10-bit) to Zen 3 (Threadripper Pro) to the GPU.
Big upgrades for me...it's been more than 7 years
Yeah, there are some very nice upgrade paths this year and early next. Me, myself, I'm in desperate need of a GPU upgrade.
I have canceled an MSI Gaming X Trio card and re-ordered an ASUS ROG Strix RTX 3080 Gaming OC.
Waiting for the card to be delivered.
Good choice, but sadly you might be waiting until Nov to get it now.
So, I've had my MSI 3080 Gaming X Trio for an evening now and here's a quick review.
The card's size and weight are pretty much the same as my MSI 2080Ti Gaming X Trio, although the 3080 is slightly shorter and a tiny bit thicker, yet it generally runs a bit cooler despite having smaller fans than the 2080Ti (the 2080Ti has one small and two big fans, while the 3080 seems to have three smaller ones). Yes, the RGB isn't great, but it isn't bad either.
Back to the fans: just like the 2080Ti, the sound profile is really pleasant and not intrusive at all when you wear headphones (except at 100%, of course). As you all know, the fan profile is very conservative, and the fans only start spinning somewhat fast at 70C and above, which is a difference compared to the ASUS TUF. Still, when spinning the fans at 60% and running the RDR2 bench, the card stayed cool (around 64C in my case), and I assume it wasn't louder than the TUF.
The backplate is indeed plastic, and the card does sag noticeably, but not more than my old 2080Ti. I've not tried the bracket, but I imagine it won't help. I do have an anti-sag thing that came with my Phanteks D500A case, but it doesn't do anything.
Performance-wise, I personally think that even though other coolers could potentially be better, it comes down to chip quality and, surprisingly enough, resolution. When running Firestrike Extreme, the frequency is all over the place and has difficulty settling. Running a better fan profile does seem to help avoid those low numbers, and personally I think I got a pretty nice chip since the frequency mostly stays north of 1950 MHz. I'm unfortunately CPU-bound at 3440x1440, which shows when I'm running the MSI AB OSD in the background: I notice increased CPU load in all games I've tried so far, even games like WoW that hardly had any CPU load before.
The card truly shines at 4K as many reviewers already said and I can finally play RDR2 at 4K & locked 60fps with optimized settings. CPU usage goes down at that resolution and GPU clocks are much more stable. I was looking at 1980MHz at stock settings which stayed that way with a slightly more aggressive fan profile.
When it comes to overclocking, I added +50 to the core at 60% fan speed just to see what happened. I completed both runs, and during that time I saw the core reach 2070 MHz for short moments, which was cool. I don't think OCing helps much in the games I play anyway.
Coil whine is a slight problem for me though - not too satisfied with that. On the other hand, I've never had a card that doesn't have coil whine.
All in all, I'd say it was worth it, especially if I manage to sell my old card. I also had a gift voucher from earlier, so the card only really cost me about €200, which of course is nice. We'll see what the future holds (RDNA2, 20GB cards, 3080Ti?), but for now I'm happy.
I didn't plan on proper benchmarking but did a small run anyway:
Firestrike Extreme, 2080Ti
  Stock: 16435
  Undervolt: 17160
  1995 MHz OC, fan 60%: 17544
RDR2, 2080Ti (3440x1440)
  Stock: 79
  Undervolt: 82
  1995 MHz OC, fan 60%: 82
Firestrike Extreme, 3080
  Stock: 21119
  +50 core, fan 60%: 21708
RDR2, 3080 (3440x1440)
  +50 core, fan 60%: 113
I'm in the 8% as it stands, waiting on 3070s. Are they doing the Super thing again? Oops, change that to 11%. Or how about something in between a 3070 and a 3070 Super, haha. Just saw there will be Super versions.
Probably throwing myself on the fire here, but considering how NVIDIA behaves and their past... just speculating.
Has NVIDIA "rebranded" the Ti models (considering chip quality, how much is cut vs the full die) into the 90 series, or am I the only one who feels that? It's not a Titan; it's a "Titan-performing" card that, looking at history, has the chip quality you used to get for paying the extra cash. It feels like things went downhill when the Titan brand was created, given the competition and what we consumers went for. Every new generation you end up paying more for less. For example, is the Ampere Ti more cut down than older Ti cards like Pascal or Turing?
I don't know the numbers for a full-blown Ampere chip, but it feels like the 90 series' purpose is a downgrade, or us paying more for less. It's not a Titan, not a Ti; it costs what you'd expect an Ampere Titan to cost looking at Turing, and the chip doesn't seem bad vs a full die.
Waiting for 20gb or TI or AMD. 3080 vanilla does not impress me much compared to what I've been using.
Did you try the low latency setting on On or Ultra in those games with increased CPU load, or when CPU bottlenecked?
I'm now using only a 4-core/4-thread CPU, thanks to my clumsiness and water damage to my old 4770K; if I had acted ASAP I could have saved at least the CPU. Anyway, Destiny 2 now of course has much higher CPU usage and a few CPU bottleneck spots.
Once I set the latency setting from Off to On, I got ~10% lower CPU usage and I stopped getting those low-GPU-usage drops, or they became very, very rare. Worth a try.
I was literally set on getting an RTX 3090, but after seeing the news regarding an RTX 3080 with 20GB, I think I'd rather get the 20GB version if it's priced under £1k. For now, let's wait on RTX 3090 reviews with production/productivity benchmarks.
10GB in my case won't cut it; 20GB will be just enough for me, although 24GB would be perfect.
If you do any rendering or use the GPU for modelling etc, then I would suggest waiting on a 16GB or 20GB version of the RTX cards, or AMD Big Navi. I used a GTX 1080 for a while and wouldn't use one again because of the VRAM; 8GB is just not enough.
Hope this helps
No, I've actually never tried that setting, but I will, thanks! I didn't even know what it did until now.