Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Oct 1, 2020.
I think 8GB will be fine for 1080p for the next 3 years. There is no evidence pointing to a different scenario.
And lastly, the left one is 4K and the right one is 1080p, both zoomed in as if you walked right up to an object in-game. The screenshots were taken on a 1080p monitor.
Again... why are you putting your face up against a wall and calling that gaming? For the few fractions of a second you're doing that at any given time, it's not like you're seeing anything interesting anyway.
Anyway, here are my sources:
Remember, we're talking about 1080p here. It's hard to find pre-recorded examples that only lower texture detail, though I suppose having all detail levels lowered only further proves my point.
First of all, the differences there are minimal. Second, zooming into it is not the same thing as walking near it. And even if it were the same thing, who the hell puts their face that close to the monitor to inspect the tiniest differences in detail like that? View those images at a normal zoom and with your face at a normal distance and you'll barely be able to tell the difference. I suppose you might if you pause the game and stare at that one spot for a long time, but if you're that picky, I don't see how you're really enjoying the game.
Play a game like a normal person and that difference in detail is practically insignificant.
You don't know what you are talking about. 8GB of vRAM is enough for most gamers who don't care about enabling all of the eye candy. You don't need all of the eye candy to make a game look good. Developers of games know more than you because they test their games to see which cards are required for min and recommended specs and what you need to get those results. That applies to any resolution that you want to play on.
Those images look like crap because you are not supposed to be zoomed in so close that you can see the pixels. You can't tell the difference between the two images.
I hope you are right, but 6GB is the line for now at 1080p, and RDR2 is an almost 1-year-old game.
Watch it after 17:45
Firstly, high presets versus ultra presets don't always change texture resolution; the texture options can be the same across the top presets.
Second, I took screenshots from your sample. If you can't see the differences between these images during gameplay you are maybe 70 years old, but the HIGH preset is blurry as hell compared to ULTRA.
Last, the differences are clearer in real life than in compressed YouTube videos.
True, but in modern AAA games, textures are usually lowered because they're taxing on VRAM, and if you're facing performance issues on a cheaper GPU, it's probably because you don't have enough VRAM.
RDR2 is the worst example of the ones I provided, but it's not "blurry as hell". Certain objects (like the ones you pointed out) are noticeably worse, but the important details aren't. Besides, the textures aren't even good on ultra, and they seem poorly optimized considering going from high to low didn't change all that much.
Actually I'd argue they're less clear in real life since you don't really have the opportunity to be looking at minor background details. Take for example the average racing game, where some of the environment details are atrociously bad, but because you're zooming by them so quickly and need to focus on what's in front of you, the level of detail is perfectly adequate for your peripheral vision.
That's all fine and dandy, BUT - name one game without memory leak issues that needs more than 8GB of VRAM and won't run on an 8GB card at 1080p or even 1440p. Of course, bigger is better and faster is faster; an 11GB card will run a couple of FPS faster thanks to the extra VRAM, but both 8GB and 11GB cards will run the game far above 60 and even 140 FPS with ultra textures (depending on the game). The argument is pointless.
The last game I played with a memory leak problem was GTA4. Memory leaks are hard to track these days because modern software and drivers deliberately buffer data, but they otherwise seem pretty rare. Your memory usage appears to just keep going up, but the buffered memory is allowed to be overwritten. That's why you can play certain games where you're "using" 100% of your VRAM without any sudden drastic performance loss: it isn't actually full.
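To illustrate the buffering point with a toy sketch (hypothetical names, not how any particular driver or engine actually works): resource data is typically kept resident as a cache and evicted on demand, so VRAM that is reported as "used" can simply be overwritten when something new needs the space:

```python
from collections import OrderedDict

class TextureCache:
    """Toy model of a VRAM texture cache: uploads stay resident until the
    budget is exceeded, then the least-recently-used entries are evicted
    (overwritten) to make room -- no crash, no sudden cliff."""

    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # texture name -> size in MB

    def used_mb(self):
        return sum(self.resident.values())

    def request(self, name, size_mb):
        if name in self.resident:              # cache hit: mark recently used
            self.resident.move_to_end(name)
            return "hit"
        while self.resident and self.used_mb() + size_mb > self.budget_mb:
            self.resident.popitem(last=False)  # evict the LRU texture
        self.resident[name] = size_mb
        return "miss"

cache = TextureCache(budget_mb=100)
for i in range(5):
    cache.request(f"tex{i}", 30)    # 150 MB requested against a 100 MB budget
print(cache.used_mb())              # -> 90, stays within budget
print(list(cache.resident))         # -> ['tex2', 'tex3', 'tex4'], oldest evicted
```

The monitoring tool would report the cache as nearly full the whole time, even though any of the resident textures could be discarded the moment the space is needed.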
Correct, there is a difference between VRAM allocation and actual VRAM usage. I saw a 2080 Ti allocate all available VRAM to a game that, on the other hand, ran just fine on an RTX 2060 at the same texture resolution settings. That's 11GB vs 6GB, almost double the physical VRAM.
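The allocation-vs-usage split can be sketched like this (a hypothetical model, not a real driver API): an engine may reserve a big pool up front and suballocate inside it, so monitoring tools report the whole reservation as "used" even when only a fraction holds live data:

```python
class VramPool:
    """Toy model: the app reserves a large slab up front (this is what
    monitoring tools report as VRAM usage), but only a fraction of the
    slab ever holds data the current frame actually needs."""

    def __init__(self, reserve_mb):
        self.reserved_mb = reserve_mb   # reported as "usage" by tools
        self.live_mb = 0                # what the game really needs

    def upload(self, size_mb):
        # suballocate inside the existing reservation;
        # the externally visible number does not change
        self.live_mb += size_mb

pool = VramPool(reserve_mb=11_000)      # e.g. grab most of an 11GB card
pool.upload(3_500)                      # the game only needs ~3.5GB of it
print(pool.reserved_mb, pool.live_mb)   # -> 11000 3500
```

On the 11GB card the pool simply grows to fill what's available, while the same game fits its live data comfortably inside 6GB.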
Like with the page file and system RAM, the computer will keep running the software even if it runs out of the primarily utilized resource: it swaps to the next fastest available memory, and the swapping causes performance degradation, but not by much. I recommend checking out good ol' GTX 1060 3GB vs 1060 6GB in 2020 comparisons. The 3GB version has a ~10% slower GPU, so it is slower anyway, and it loses very little additional performance from the halved frame buffer.
The gist of it is that you will run out of GPU computational power sooner than physical VRAM before you'll need/want to upgrade. It has always been like that and always will be.
P.S. Take all benchmarks available on the internets with a grain of salt, the same goes for monitoring data.
Yeah, and when a GPU runs out of VRAM, the application it was running can crash. JayzTwoCents did a video where he benchmarked workstation applications rendering a super high-end video clip; he benched a 3080 and a 3090, and in some of the benchmarks the 3080 crashed due to running out of VRAM.
Not sure why everybody all of a sudden thinks 8GB became obsolete overnight. I guess all those 2070/2080 and Radeon 5700 owners might as well throw their cards away.
It will happen when more games start supporting ray tracing. It increases VRAM usage a lot.
I think ray tracing puts way more stress on the cores themselves than anything else. The 3090 with its massive 24GB still only averages 10% better than its 10GB little brother in RT.
I feel bad for the people who keep deluding themselves into thinking 6GB, 8GB and 10GB aren't enough for the next 4+ years.
History continues to show how wrong you are, but you keep insisting that GPU makers add more cost to their GPUs while also complaining they are too expensive.
Some of y'all need a reality check and even then....probably wouldn't help....