That makes no sense. I was using a 1 GB VRAM card years ago on a machine with 16 GB of RAM, and it worked perfectly fine.
I just went looking for a used 1080 Ti OC; it's probably going to be about $700 USED. Maybe they'll come down. Sure, $1300 is a lot more, but it's NEW, performs better than the 1080 Ti, and comes with a fresh warranty. If I could get a 1080 Ti Strix OC for about $475 I'd go that route, but I'm tired of waiting for 1080 prices to come down, so I'll probably go with the 2080 Ti. If there's a huge glut of them somewhere, they sure are releasing them slowly.
Even if you choose the 2080 Ti, I'd wait it out. There appears to be a bad batch of cards that die (the serial numbers have been tracked), and NVIDIA is still shipping out replacements for failed cards with serial numbers from within that same bad batch!
Something to keep in mind: while some games like to fill up your VRAM with textures, they don't actually need all that space to run smoothly; some games just allocate absurdly large buffers. I believe two of those are Shadow of Mordor and Rise of the Tomb Raider, if my memory isn't failing me.
I could play Rise of the Tomb Raider at Ultra on a GTX 780 with 3 GB of VRAM. No stutter, but it ate a lot of system RAM. That said, you'll be fine for now; by the time 6 GB won't cut it, the GPU itself will be too slow anyway. I'd say a year, easy, no problem. I think Cyberpunk 2077 is what will force me to upgrade; even a 1080 Ti won't be fast enough at 1080p, I'm pretty sure of that.
No, you peeps spewed a lot of BS, but that didn't make it true. Games don't just "fill all available VRAM"; they use a certain amount of resources at given graphics settings. If your GPU doesn't have enough VRAM, it will use system RAM as "shared VRAM" instead. In low-bandwidth games this might not cause stuttering, but in high-bandwidth games it sure as hell will. As an example of the above, you can see how much more system RAM the GPUs without enough VRAM use, because the GPU is offloading to RAM. Sure, you can run the game at max settings on a GPU without sufficient VRAM, but more often than not you'll get a lot of stuttering due to the VRAM swapping. See the video where Digital Foundry tests how the 1060 3GB's lack of VRAM affects Rise of the Tomb Raider.
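To make the mechanism concrete, here's a minimal sketch (not any real driver or engine, and all sizes are made-up illustrations) of the overflow behavior described above: assets are placed in VRAM until it's full, and the rest spills into system RAM as "shared video memory", where every access has to cross the PCIe bus.

```python
# Toy model of VRAM overflow: fill dedicated VRAM first, spill the rest
# into "shared" system memory. Sizes in MB are hypothetical examples.

def place_assets(asset_sizes_mb, vram_capacity_mb):
    """Greedily fill VRAM; anything that doesn't fit lands in shared system RAM."""
    in_vram, in_shared = [], []
    used = 0
    for size in asset_sizes_mb:
        if used + size <= vram_capacity_mb:
            in_vram.append(size)
            used += size
        else:
            in_shared.append(size)  # must cross the PCIe bus when accessed
    return sum(in_vram), sum(in_shared)

# A 3 GB card loading ~4.5 GB of texture data: roughly 1.5 GB spills over.
textures = [512] * 9  # nine hypothetical 512 MB texture pools = 4608 MB
vram_used, shared_used = place_assets(textures, vram_capacity_mb=3072)
print(vram_used, shared_used)  # -> 3072 1536
```

Whether that 1.5 GB of spilled data causes stutter then depends on how often the game actually touches it per frame, which is exactly the "low bandwidth" vs "high bandwidth" distinction above.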
Yes, but if you have fast system RAM it won't. The GTX 780 used up to 12 GB of RAM; the 980 Ti now uses 7-8 GB max. The game runs the same, only faster, obviously. If I lowered the texture setting on the GTX 780 it used 8 GB of RAM and ran faster (5-7 fps), but it ran just as smoothly either way.
It clearly still did... if you weren't limited by your VRAM, then using the highest texture setting should have close to zero performance impact. You experienced exactly what Digital Foundry finds in their video with Rise of the Tomb Raider on 3 GB cards. And I should know; I had 780 Tis myself and played Rise of the Tomb Raider on them.
What I was trying to say was that 3 GB didn't stutter, even at Ultra; the only downside was that it slowed down a bit. I remember HL2 on an 8800 with 320 MB, and then a 570/580 GTX with up to 1.5 GB by Titanfall: there I literally saw 1-2 second pauses if I turned around, terrible stutters. I also ran a test on the GTX 780 with Assassin's Creed Unity; only 8x MSAA bottlenecked the 3 GB of VRAM, but funnily enough it still didn't stutter, it just ran at 10 fps (4:30+). I think other things bottleneck the 1060 more, like low VRAM speed and low ROP count.
The 1060 3GB does in fact manage to somehow mitigate stutters in some games, but the examples here in Rise of the Tomb Raider are spot on; I get the stutters in exactly the same places. Since my GPU core is at 2012 MHz and my VRAM at 4552 MHz, the stutters resolve faster and/or happen less. I'm playing Forza Horizon 4 at the moment, and that definitely does stutter when swapping RAM if I use anything higher than the medium shader setting. I need a new GPU, so hopefully next year; it's a good job I don't mind lowering a few settings here and there.
Yeah, games that use little bandwidth, so that the PCIe bus doesn't become a bottleneck, can over-commit VRAM without stuttering... but yeah, ideally you don't want VRAM swapping to happen in the first place. In Far Cry 5 with the HD texture pack at 4K I get heavy VRAM swapping; I can see it using about 2 GB of "shared video memory", which leads to HEAVY stuttering :/
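A rough back-of-envelope shows why spilled memory hurts so much: PCIe 3.0 x16 moves roughly 15.75 GB/s, while a 1080 Ti's GDDR5X delivers about 484 GB/s on-card. The "2 GB touched per frame" figure below is a deliberately pessimistic assumption, not a measurement from the game.

```python
# Ballpark figures: PCIe 3.0 is ~0.985 GB/s per lane after 128b/130b
# encoding, so x16 gives roughly 15.75 GB/s; the GTX 1080 Ti's VRAM
# bandwidth is about 484 GB/s.
PCIE3_X16_GBPS = 15.75
VRAM_GBPS = 484.0

# Worst case: a frame touches all ~2 GB of spilled "shared video memory".
spilled_gb = 2.0
pcie_ms = spilled_gb / PCIE3_X16_GBPS * 1000  # transfer time over the bus
vram_ms = spilled_gb / VRAM_GBPS * 1000       # same traffic out of VRAM

print(f"{pcie_ms:.0f} ms over PCIe")   # -> 127 ms: several dropped frames at 60 fps
print(f"{vram_ms:.1f} ms from VRAM")   # -> 4.1 ms: well under a 16.7 ms frame budget
```

Real frames touch far less than the whole spilled set, which is why the result is stutter on the frames that do fault in new data rather than a constant 8 fps.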
It depends on settings and hardware; CFX and SLI are a different beast. From my experience, going from my R9 290X CFX system to my current 1080 Ti system is night and day. The 1080 Ti and 16 GB of RAM are a good match (7-8 GB used), whereas the 290X CFX used 16 GB or more. [screenshots: 290X CFX, 1080 Ti]
If you're going to call BS on a well-established fact, you might as well get evidence to back up your assertion. Your evidence fails miserably.

So here we have graphs showing how much RAM and VRAM Rise of the Tomb Raider uses with different cards, establishing a painfully obvious fact: GPU memory transfers occur through system memory. If a card does not have enough VRAM to fit *everything*, those assets stay in system memory to be loaded later. Incredible find, truly. Where are the graphs needed to accompany these and show the lower-VRAM cards like the Fury X actually stuttering due to less VRAM? Right, they don't exist. Assertion with no evidence provided whatsoever.

Sure, some games will stutter if you go too low on VRAM, rather than outright stall and start swapping. Others will not ("low bandwidth games"). These games, believe it or not, simply load all that they can into the card's VRAM, since it's available. That is absolutely unrelated to whether the game needs those assets right now at the bandwidth that only VRAM can offer.

A 3 GB card in a game that officially requires 4 GB+ of VRAM to enable its highest texture setting? Right, the game does choke on 3 GB of VRAM, and it doesn't exactly hide it.

Find better evidence if you want to run contrary to well-established fact. By your own measuring stick, every card under 11 GB is now worthless. 1070, 1070 Ti, 1080, 2070, 2080, all junk, eh? Let's all buy 1080 Tis and 2080 Tis to play our damn games smoothly. No, not really.

From Guru3D's RX 470 (4GB) review: https://www.guru3d.com/articles_pages/asus_radeon_rx_470_strix_gaming_4gb_review,26.html

The RX 480 there is the 8GB variant; its frametime line is pulled from this review: https://www.guru3d.com/articles_pages/msi_radeon_rx_480_gaming_x_review,26.html

So much for 4 GB being a limiting factor in this very VRAM-hungry game.
Hilbert only takes an outline of the frametimes from the first 30 seconds of the benchmark; if you look at the Digital Foundry video, the VRAM swapping in the benchmark starts after 32 seconds, and thus doesn't appear in Hilbert's frametime test. Here is Digital Foundry comparing the 4 GB and 8 GB versions of the 480. If you go to 2:13, they test Hitman and Rise of the Tomb Raider, and you can see the 4 GB card having jumps in frametimes while the 8 GB card's frametimes are completely stable. I agree the frametime jumps aren't nearly as pronounced with 4 GB of VRAM as with 3 GB, but they are still there at 4 GB, while they are completely gone at 8 GB.
290X CFX at 2560x1440 with 32 GB vs the 1080 Ti at 4K with only 16 GB: the 1080 Ti is way more efficient for my system, even at the higher resolution.
I think you're wrong in your assumption here. As they said, they've kept Cyberpunk 2077 playable on normal GPUs throughout development (the gameplay demo they showed ran on an 8700K and a single 1080 Ti), which means a lot of optimization is already in. I'd actually expect Cyberpunk 2077 to be much better optimized than, for example, the latest Ubisoft games.
Well, I hope so; I was just being cautious given how it looks... although it did seem a little "downgraded" compared to the early tech demo leak: the characters are a bit less detailed, and so is the overall scene. Still looks great, though.