Discussion in 'Videocards - NVIDIA GeForce' started by cucaulay malkin, Jan 3, 2022.
Aaand.. ignore list extended by another fanboy barking at the moon.
Aw...I guess someone can't handle not being as smart as he thinks he is
Yeah, what do I know after playing Fortnite for hundreds of hours on different hardware and seeing the immediate result after swapping a card.
You really got issues, but I'm not your shrink. So, bye.
Anyways, from TPU latest review, 6700XT vs 3060Ti vs 3070 in avg and minimum FPS
Things you're arguing about aren't mutually exclusive.
Did you really swap to AMD, or am I hearing it wrong?
Weird that Nvidia's crappy DX12 driver is only affecting his 3060 specifically and not faster Nvidia GPUs. Throwing a hissy fit when shown contradicting evidence doesn't make his anecdotal evidence any more valid.
And any Fortnite competitive player would use DX11 for both Nvidia/AMD anyways, I guess he doesn't know that.
Use DX12 and get 33% lower FPS, how about that LOL
Seems like it from the chart, but I have no experience with Fortnite, and HUB isn't really a great source either. I haven't seen any DX12-specific frametime issues on my 3060 Ti in any of the games I play, so I don't think the DX12 driver is garbage. Not saying it's problem-free, and AMD's might indeed be better; they were working on DX12 before Nvidia.
I do have Witcher 3 Complete though, and on the 3060 Ti I get silky smooth frametimes, even in Novigrad (the whole of it) with RTGI + RTAO + RT reflections enabled (DLSS2b + Reflex, ~48-52 fps). No idea why a 3070/3060 Ti produces bad frametimes on his setup, but it's certainly not everyone's experience. It's not VRAM; ~6-7 GB is my reading with RT on. Must be a case of an aging six-core CPU in a ray-traced game. A 10700F + 4133 RAM has no frametime issues; GPU usage is almost 100% in Novigrad, even with ray tracing.
This is what I get: frametimes vary between 18-22 ms, but FPS is unlocked too, so I might be getting a few fps of difference here and there. No idea how to make the OSD show a frametime line; if you tell me, I'll upload that too.
Got curious after some things ran smoother with Intel Arc, and the 6700 XT hasn't yet shown any screw-ups that would drive me crazy (unlike the Intel card).
DLSS is missed in Hogwarts Legacy, but Fortnite is more important to me and there TSR is a viable alternative.
DX11 mode in UE5 disables nanite and RT.
Nanite is very heavy, as it adds a lot of polygons.
I don't know if he was using RT in DX12, but that could be another factor.
No giant difference with Nanite/Lumen on or off in that regard. If anything, only texture settings matter for this. D3D11 is dead; it was worse than 12 when I last tried it, even years ago. Just don't bother, who would even mention it...
Slow-down stutter is 100% reproducible by just moving from some parts of the map to some distant others. It's mostly clean on Radeon with shaders cached, especially with Anti-Lag, whereas it's kinda shitty on GeForce with Reflex or in-game limiter (and not great without it either). Everyone who isn't incompetent or doesn't want to lie can easily capture it on video + frame time graph.
I don't play Fortnite, so I don't know how big the differences are.
But in UE5, DX11 and DX12 don't render the same quality settings, and that's why HU found that difference.
Yeah, no Nanite and Lumen with D3D11. But D3D12 renderer also generally has been running better for years.
If you want to have stable >120fps (not counting the stutter slow-downs), you can forget about Nanite + Lumen on everything but a 4090 anyway. Nanite without Lumen seems to be fast on the 6700 XT, but assets look worse than legacy ones without using Lumen, so I got both disabled. Instead, I target constant 135fps with in-game limiter, which works with 1440p TSR high 70% and everything else maxed.
Yeah, for competitive shooters, ultra settings aren't the best option. So disabling Nanite, and especially hardware Lumen, is recommended.
BTW, I think that in DX11 it still uses software Lumen with SDFs instead of RT. Maybe someone who plays the game can confirm.
You can't enable Nanite option with D3D11, which is a requirement for activating Lumen.
With D3D12, there is a separate option to enable HW RT with Lumen. It's generally highly recommendable; light scattering through foliage looks great, and general lighting/AO is much improved, depending on the scene's content:
It's more expensive on RDNA2 than on Intel Arc/Nvidia, but not more than 20%. Often less.
Lumen still has some leaking issues in buildings, but it's not too harsh to say other games look last-gen next to it (to put it mildly). RT in Witcher 3 is so lulz in comparison...
Lmao, capping FPS and wondering why a GPU that is 40% faster than the other is also running smoother, what a genius.
Let's say I get a 1% low of 200 FPS and I cap the max FPS to 200: I'll get perfect frametimes, while slower GPUs run for their dear lives just to reach 200 FPS. Not too hard to understand.
Still, it's a massive handicap to cap FPS in a competitive shooter, because we want the client-side image to be as close to the server-side state as possible.
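The capping argument above can be sketched with a bit of arithmetic (hypothetical frametime numbers, purely for illustration): a 200 fps limiter paces every frame to a 5 ms budget, so a GPU whose 1% low is already 200 fps renders every frame inside that budget and the graph goes flat, while a slower GPU keeps missing the budget and its spikes survive the cap.

```python
# Toy illustration (hypothetical numbers): why an FPS cap flattens
# frametimes only on a GPU fast enough to stay inside the cap's budget.

def capped_frametimes(raw_ms, cap_fps):
    """Apply a frame limiter: no frame is presented sooner than the budget."""
    budget = 1000.0 / cap_fps          # e.g. 200 fps cap -> 5 ms budget
    return [max(t, budget) for t in raw_ms]

# Fast GPU: 1% low ~200 fps, so raw frametimes are all <= 5 ms.
fast = [3.8, 4.2, 5.0, 4.5, 4.9]
# Slow GPU: frequently misses the 5 ms budget.
slow = [4.8, 6.5, 5.2, 7.1, 5.9]

cap = 200
print(capped_frametimes(fast, cap))   # every frame paced to exactly 5.0 ms -> flat graph
print(capped_frametimes(slow, cap))   # spikes above 5 ms survive the cap -> still stutters
```

The limiter only delays frames that finish early; it can't speed up the ones that finish late, which is the whole point of the disagreement above.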
Un-hiding your post for amusement reasons: Sure, just cap fps at 1% low value. This will turn a 4090 into a 6700 XT, but hey, it's Krizby "logic".
And no, the 1% low value of the HU benchmark scene doesn't reflect the issue, which shows exactly how pointless such benchmarks can be. Same goes for VRAM pressure issues that often only show up after playing for a longer period of time. But preach more to people about how much VRAM they need, with the 24 GB card installed in your system...
Capping FPS in a competitive shooter is enough proof that some people shouldn't be taken seriously; I guess that doesn't even count as anecdotal evidence.
Now, can we go back to 3060 Ti vs 6700 XT?
So I can finally play Doom Eternal with ray tracing, and it runs better than a 3070 with DLSS (1 fps due to VRAM issues) while the 6700 XT is running native. If only it had FSR2, it would be a slaughter.
RE4 Remake maxed out, 8 GB textures + ray tracing high + FSR2 (no hair): VRAM usage jumps to 10.5 GB as soon as you enter the village.