Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Oct 8, 2020.
You're right, I didn't take into account that in 4k the GPU is the bottleneck, so the CPU not being the same doesn't make much of a difference here.
Yeah if it was 1080p we could argue about it.
Don't forget coil whine on reference designs...
RT performance is not very important for me right now. I understand that in the future RT is going to be central in graphics, but I think it's going to take some time, so I'm not worried about Nvidia or AMD having better or worse RT performance.
The only thing I care about even less than RT is DLSS or similar.
What i want is great performance at decent prices.
I don't know where people are getting "10% slower than 3080". According to the Digital Foundry, OC3D, and Hexus 3080 reviews, BL3 averages 60 fps at 4K. So in this game it's a tie. For Gears 5, Guru3D scores 76, Hardware Unboxed 76, and Hexus 77 fps; only Digital Foundry shows 81 fps, but out of all the reviewers they're the only one that used a 3600MHz memory configuration. So it's about 4% behind in Gears 5.
Of course it may be more than 10-15% slower than the 3080 in independent reviews, but if AMD's numbers are anything to go by, it's not.
TPU has BL3 at 70fps, Gears at 84.
You're never going to know until you see it tested by the same reviewer. The location tested, the settings (including AF, AMD!), even the silicon lottery (the same card SKU can literally swing ±5% depending on the binning), etc., all make a difference.
Interesting... pretty much *double* my 5700XT @ 4K... sweet. No funky, goofy-looking power plugs hanging off the RX-6000, either! To be honest, I don't know which RX-6000 series card they used--I'll assume it was their top dog, but I don't know that for sure. I'm not really one for 3D-card aesthetics, because the card goes in my case and my case is under my desk, out of sight--but that darned FE 3080/3090 thing is uuuuugly compared to the RX-6000 photos that have been shown to date. I hope the AIB RTX-3000s look a lot better--I'd bet they will... JHH looks to have left the FEs in the oven too long... So much for looks--they really aren't something I look for, not a big point of interest--but I did see the difference as pretty much in-my-face obvious!
Indications are that I'll be going with RX-6000 first--Zen 3 will come later. Assuming AMD will have enough in the channel for them to be available this year! However, I refuse to pay a penny more than MSRP for either! I'm not that impatient! Arrrrrghghghgh... all this waiting! Like the Navy--"Hurry up and wait!"
I hope you're right. I'm leaning toward the same way of thinking. I'm thinking of getting the most budget-oriented 4K GPU without caring about RT, and upgrade when RT becomes more mainstream. Since I've ditched Windows, I'm not sure I'll be getting much of an RT experience any time soon anyway (not really something I looked into). The tricky part is, I'm guessing the 3060 Ti and 6700 are going to be the cheapest 4K-ready GPUs. Considering how much Nvidia has been lowering their prices, I highly doubt either will be under $400. That's pretty much the most I'm willing to spend on a single component (I'm cheap haha). I could afford something better, I just don't play games enough (let alone AAA titles) to warrant that kind of money.
Did you even look at TPU's benchmark page, or are you just repeating whatever you heard?
Not only did TPU use DX11 in BL3, they also didn't use the in-game benchmark like AMD and other reviewers did, so TPU's benches aren't even apples to apples with other reviewers'.
The same might hold true for Gears 5--that they ditched the in-game benchmark and ran their own test scenes, unlike other reviewers.
Not sure what having more RAM does if it doesn't help performance. I'd take a 10% faster card with less memory any day of the week; useless numbers are, well, useless.
Does having 10-15% more performance than the 3080 make the 3090 an 8K card? Not really. Then what does? More VRAM. And many even justify the high price of the 3090 because it's got a "whopping 24GB of VRAM".
Oh, so now we're going to 8K to justify stupidly high VRAM?
There are many articles showing that even at 8K you wouldn't use much more than 10GB, IF you use more at all; for example, Wolfenstein: Youngblood only uses 11GB, and just because memory is allocated doesn't mean it's actually helping. If you want to find a reviewer that compares a 3090 vs a 3080 at 8K, go right ahead.
Take that back, have some results:
Falls right into the 10-15% you claim isn't why the card is an "8K" card....
The only game I've seen so far that seems to care about more than 10GB is Battlefield V, at 8K, a resolution almost no one is playing at. And it's not to say that more RAM won't matter at some point, but it realistically doesn't matter now, or likely in the next few years.
Why ask me? Tell Nvidia marketing and the fans who claim the 3090 for 8K is justified because of its VRAM, not just the sheer performance uplift. And if you think 30fps makes the 3090 an 8K card, then the RX 580 is also a 4K card, because you'll surely find it delivering "30fps" in some games.
Not even sure what you are arguing about anymore.
10GB is enough, even for 8K currently, as what's stopping 8K from really working well isn't the VRAM but the performance of the GPU.
I never claimed more RAM helped the 3090; it doesn't, except in specific scenarios at 8K, even though the games are still relatively unplayable there.
So again, not really sure what you are arguing, as you opened with a statement implying 10GB wasn't enough because the 3090 has 24GB and is more than 10-15% faster at 8K, which isn't true 99% of the time.
Not bad...black screens per second framerate. Much more than the 5700
Most review sites do not use the Badass preset for BL3 like AMD did for their testing.
It was also the built-in benchmark, so the TPU numbers are out; not sure if others are using custom runs too, or the in-game bench that was shown.
I am not sure; maybe my ears are not sensitive to high-frequency sounds.
For some reason I never noticed coil whine on the Vega 64 (reference design with a Sapphire label) that I owned before, or the Radeon VII that I still own, also with a Sapphire label.
Both cards were fitted with custom water cooler from EK.
Even with insensitive ears, the stuff would be pretty audible.
I had a Sapphire 290X reference design with quite bad coil whine (I paired it with a dual-fan MSI 290X which was completely silent in that regard), and a Sapphire HD6990 reference before that, which had it too (though not as bad).
It's good to hear that the Vega and VII don't have the issue! Fingers crossed for the 6900XT