Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Feb 7, 2019.
AMD has the card with the most VRAM, so this lie starts up again.
Do you think Nvidia's marketing lied when they promoted the 1080 Ti and Titan cards?
The RVII loses to the 2080 in FFXV at 4K in both average fps and 99th percentile, and even then it's like 40fps on average. I also don't believe it actually uses that much RAM, or needs to; the textures in that game are average at best, and it's probably just one of those games that caches everything until VRAM is filled. I also like how you state RT/DLSS will only be used in 10 per 1000 games, but that same logic somehow doesn't apply to the number of games that will exceed 8GB of VRAM while still offering enjoyable framerates.
FFXV is a bad benchmark, full of flaws and very Nvidia-biased.
Not sure why many fail to understand "uses" more VRAM vs "needs" more VRAM in whatever scenario or game is applicable.
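To illustrate the difference (a toy sketch, not how any real engine actually works): a game streaming textures through an LRU cache will happily keep old textures resident until the whole VRAM budget is full, so a monitoring tool reports near-full "usage" even though the per-frame working set it actually "needs" is far smaller.

```python
# Toy model: a texture cache with least-recently-used eviction.
# All names and numbers here are made up for illustration.
from collections import OrderedDict

class TextureCache:
    """Simulated VRAM texture cache with LRU eviction."""
    def __init__(self, capacity_mb):
        self.capacity_mb = capacity_mb
        self.resident = OrderedDict()  # texture id -> size in MB

    def touch(self, tex_id, size_mb):
        # Referencing a resident texture just marks it most-recently-used.
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)
            return
        self.resident[tex_id] = size_mb
        # Old textures are only evicted once the budget is exceeded.
        while sum(self.resident.values()) > self.capacity_mb:
            self.resident.popitem(last=False)

    def used_mb(self):
        return sum(self.resident.values())

# Each frame only references 30 x 100MB = 3GB of textures (the "need")...
cache = TextureCache(capacity_mb=11_000)  # ...on a hypothetical 11GB card.
for frame in range(100):
    for tex_id in range(frame, frame + 30):  # slowly shifting working set
        cache.touch(tex_id, 100)             # 100 MB per texture

print(cache.used_mb())  # -> 11000: the full budget, ~3.7x the real need
```

Give the same toy cache a 6GB budget and it evicts sooner but the game still runs fine, which is exactly why "VRAM used" numbers on a big card say little about what a smaller card can handle.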
True, but the point still stands re VRAM usage vs need.
I agree but I'm also not the one using it as an example of 16GB of HBM being necessary.
Ah, OK, I didn't read that. Well, IMO 16GB won't be a thing for a long time. Even the RE2 remake is fine with just 6GB of VRAM at something like 1620p.
Same with the newer CoDs, and I saw a few more that could "use" 10-12GB of VRAM, yet all ran fine on my 6GB 980 Ti.
With the "fluffy bisons" or without them? They gimped my fps from 75 to 28. I played FFXV with hi-res textures and HBCC enabled, and the card was using 11.5GB of VRAM in cities and 10.8GB out in the open.
How do I know that so few games will use RT? The slow pace of adoption in previously announced titles means that coding for RT (so as not to kill fps) is very time-consuming = expensive. It will probably stay that way until consoles adopt RT as a standard.
Games are increasingly using more VRAM. This time next year, 8GB will be on the lower side. When you buy a $700 card, you want it to last more than a year, right?
8GB of VRAM a year from now will be on the low side? Dude, lay off whatever AMD is feeding you.
Here's a tip: don't live by the numbers you see in MSI Afterburner. For example, the AMD logo video in The Division 2's startup sequence is apparently using 5.1GB of VRAM, according to MSIA. Damn, must upgrade to a 128GB GPU if a silly 2-second logo takes up 5GB, right?
Yep, if game devs had the freedom, they could use even more than 16GB of VRAM for even more beautiful game worlds. They can't do so when 90%+ of the market is limited to 4GB/6GB cards.
Or go all the way with a 2080 Ti, or take a 2060/2070.
NV just nerfed the 2080, so its days will be short and you'll upgrade sooner.
8GB is great, but not at RTX 2080 price level.
The 2070 is not powerful enough for proper RTX at the resolutions someone paying this kind of money for a GPU would likely play at (2K 144Hz or 4K), so it shouldn't be counted. It's like having an F1 engine in a Mazda 6. Or 16GB of HBM2 instead of 8GB on a card not powerful enough to use it... Nobody would pay for an F1 engine inside an unmodified Mazda 6. It doesn't make any sense.
It remains to be seen whether DLSS will be useful and not blur the image too much. I have my doubts personally, and I'll wait for more games to support it before considering it a feature worth paying for. TAA is awful; I'd rather poke my eyes out and immolate myself than use it. DLSS will not only need to perform better (which it no doubt will) but also not blur the image as much. IMO no AA >>> TAA, especially at 4K. It remains to be seen whether DLSS falls into the same category.
This card might (does) look worse than the 2070 and 2080, but it's like choosing between a polished turd and a normal turd. It's still a turd. Right now the only card worth paying for in the high-end segment is the 2080 Ti (unless you've got an oldie like a 5xx/6xx/7xx).
Omg what is this forum?
Different opinions, man. No need to insult.
I'm not insulting anyone. What I do find insulting is people acting like we suddenly need 16GB GPUs because AMD released one. Hilarious.
You just told me to lay off the stuff AMD is feeding me. That's just silly.
I just said 8GB will not be enough in the distant future, not that 16GB is needed. There are also 11GB cards.
And what I find insulting is NV taking the two-year-old 1080 Ti, replacing it with a same-performance card at a higher price with the RAM cut down to only 8GB, and people thinking that's OK and there's no problem.
So the 8GB isn't the problem by itself; it's that the card offers two-year-old GPU performance with its memory cut down, at a higher price.
Save up for a 2080 Ti, or take a 2060/2070 and save the money for a real upgrade next year. IMO the 2080 is irrelevant.
Honestly, 16GB is overkill, but 8GB looks very borderline for proper 4K. Graphics haven't progressed much over the last 5-10 years. When I was young, you'd load a 10-year-old game and it looked extremely dated. Now you load a good-looking 10-year-old game and it looks just fine (even compared to ray tracing).
Maybe we don't need more than 8GB because cards don't come with more than 8GB and devs have to cut corners. I mean, I can't really remember the last time a game wowed me graphically. Most of the time the textures look bland and low-res to me on my 2K monitor.
Well, in the distant future we'll be 3-4 GPU generations from now (read: 3-4 years at least), and then 16GB will be standard, I'm sure. But we'll also have cards that are 200%+ faster than current ones.
For 4K, I'd agree 8GB is a touch too close. Still, 8GB will be enough even at 4K for a year or two. 4K is still a niche market, and not because we don't have GPUs with enough VRAM; it's because we're only just getting GPUs that can do 4K60, and those cost over $1000.
By the time most people can run 4K60 comfortably, we'll surely have GPUs with more VRAM.
I'm on 1440p, btw; 8GB is plenty for me for the next year or two, and then I'll upgrade anyway.