Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 16, 2020.
Great job H.H. as always
Is UV working via MSI AB?
Is it similar to Turing (Ctrl+F)?
Yup, it's working (0.800 V at 1800 MHz easy, shaved off 60 W).
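A rough sanity check on why an undervolt saves that much: core dynamic power scales roughly with V²·f, and the clock (~1800 MHz) is held constant, so only the voltage term changes. The 0.95 V "stock" figure below is an assumed typical boost voltage, not a number from this thread:

```python
# Sketch: dynamic power ~ V^2 * f; clock is unchanged, so the saving
# comes entirely from the voltage drop.
v_stock = 0.95   # assumed stock boost voltage (V), not from the thread
v_uv = 0.80      # undervolt from the post (V)

saving = 1 - (v_uv / v_stock) ** 2
print(f"~{saving:.0%} lower core dynamic power")  # → ~29% lower core dynamic power
```

That ballpark lines up with the reported ~60 W off a ~320 W card; static leakage and memory power don't scale the same way, so the real saving is somewhat smaller than the pure V² estimate.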
2x my Vega it is, monster of a GPU.
Shhh, it's a secret.
You won't find those published anywhere; he's in contact with those in the know.
335 → 337 with Aero themes enabled!
Up to 3x faster than my GTX 1070. Right now, this card looks awesome.
'it just works'
Hmm, impressive, but I don't like the power draw, and (most importantly) the mention of coil whine. I still remember that from my 970, which magically stopped after a while.
So in most games it's a 70-90% performance bump at 1440p/4K compared to my 1080 Ti. That's pretty awesome, but in some titles it's barely above a 2080 Ti? What's up with that, engine/driver/architecture incompatibilities? Seems very weird.
Anyway, I'll go for aftermarket cards (since I literally cannot buy FEs here), most likely MSI again. However, I'll have to make up my mind if I go for a 3080 now, or wait for a higher vram but also likely significantly more expensive variant in the future... or do the insane thing and buy a 3090 and keep it for several years.
Noise levels seem pretty jank.
Looks like I have to find a partner board that fits my Arctic Accelero Xtreme 3.
Wish there was an 8K test compared to the 2080 Ti; you could see bigger performance gaps between the two, raw pixel power.
I noticed in the test results that even at 4K in some games the gap from the 2080 Ti was not too big, which is the only reason I asked to put more stress on the GPU to widen the gap some.
Or try a game with heavy-duty AA injection; you can really start to see the performance difference then.
Hopefully when the RTX 3090 testing is done, we can see 8K test results.
But yes, comparing performance with the previous generation (the 2080, not the Super version), it's almost double the performance, so I would say: pretty dang good.
If anyone is interested, take a look at this, kinda off topic:
I'm on a 7700K @ 5 GHz and honestly, it does a great job. But I've had it for a while, so I'm itching to upgrade. But purely for gaming, I'm struggling to pull the trigger. Other than gaming, I watch YouTube and... that's about it. Not a fan of running Cinebench all day.
Yeah, the coil whine had me concerned, but I noticed some reviews did not have any. So hopefully it's just a bad batch.
My current 1070 has pretty bad coil whine as well; it's more like a buzz than a squeal, though still annoying.
I don't get why they can't sort those things out in 2020.
It could very well be a few things. Drivers are the main culprit here, no doubt. These cards will improve with time; that's pretty standard. Just watching the GamersNexus review, they couldn't even get F1 2019 to run, a game they always include in their testing. So drivers still have some way to go.
I will go with the 3080 myself, 100%. I never buy the top dog, be it a Ti or a Titan; they're always bad value for money. My 2080 cost me just over 50% of the cost of the 2080 Ti and it's some 30% slower, and that is good value to me. The 3080 is nicely priced, as low as £649 in the UK, and it almost doubles the performance of my 2080 at 1440p. Division 2, 110 fps to 190 fps, is just insane.
489 W <1 ms transients
Why are people so upset about the performance? This isn't the 3080 Ti or Super, and there's also the 3090. The performance scales roughly in line with the specs of a 2080 Ti, while being a lot cheaper. Despite how power-hungry it is, the efficiency is actually pretty good. DLSS only makes things better.
The 1080p results aren't super impressive, but that's probably because the CPU is a bottleneck.
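To put that efficiency point in rough numbers: the Division 2 figures mentioned earlier in the thread (110 → 190 fps over a 2080) can be compared against the rated board powers. The 320 W and 215 W TDP values below are assumed spec-sheet numbers, not measurements from this review:

```python
# Ballpark perf-per-watt, 3080 vs 2080, using one title's fps from the
# thread and assumed rated board powers (not measured draw).
perf_ratio = 190 / 110     # Division 2 fps uplift quoted in the thread
power_ratio = 320 / 215    # assumed rated TDPs: 3080 vs 2080 (W)

ppw_gain = perf_ratio / power_ratio
print(f"~{ppw_gain:.2f}x perf-per-watt")  # → ~1.16x perf-per-watt
```

So even with the higher absolute draw, performance rises faster than power in that case, which is what "power-hungry but still more efficient" means here.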
Not Nvidia's fault MS's engine is poorly optimized.
Yeah, kinda. Nobody forced you to buy one. It's hard enough to get 240+ FPS at 1080p with maxed-out settings on modern games with a 2080 Ti, and you expected a 3080 to achieve that at 1440p?
Lower your detail settings or lower your expectations.
How was that a mistake?
The Crysis remaster gets reviewed in a few days.
I wanna see load temperatures with a custom fan curve at 100%; the default curve seems conservative.
I honestly think that given your displays it won’t make that much of a difference at this point to upgrade the CPU while the GPU will still make a decent difference.
Same here, I'll put headphones on and ramp the fans. Those Red Dead numbers look good.
It's not wrong to base the whole card off RDR2 FPS, is it? lol. That game for me is like my personal bar of what I want the GPU to do.
I'd really like to see some 3440x1440 benchmarks too.