Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 18, 2020.
I want to see who saves the most once Nvidia gets an auto-undervolt feature.
I'm more interested in the temperature changes that undervolting may bring.
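For anyone who wants to measure that themselves, here is a minimal Python sketch using the NVML bindings (the nvidia-ml-py package) that logs temperature, power draw, and core clock once a second; run it under load at stock and again after undervolting, then compare. Device index 0 and the 60-second window are just assumptions for a single-GPU box:

```python
import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the GPU of interest is device 0

# Sample for 60 seconds; run once at stock and once undervolted, then compare.
for _ in range(60):
    temp_c = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
    power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0  # NVML reports milliwatts
    core_mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
    print(f"{temp_c:3d} C  {power_w:6.1f} W  {core_mhz:4d} MHz")
    time.sleep(1)

pynvml.nvmlShutdown()
```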
Let's not forget that, for the time being, Nvidia is stuck with a worse node simply because they got greedy about it in a shortsighted way.
We don't really know that, or how and why NV went with both Samsung and TSMC.
We know that TSMC's 7nm capacity is stretched. It's not like AMD isn't suffering from capacity issues either.
Besides, it looked damn smart to use 12nm on Turing while AMD struggled with 7nm teething problems, didn't have (couldn't build?) a big die, and was unable to use 7nm to leapfrog Turing's 12nm.
NV going with Samsung to save a few bucks, or TSMC punishing NV (for what? staying on TSMC's 12nm?) - none of that makes much sense, tbh.
Would you really rather Nvidia be on TSMC 7nm, cluttering up what is already in short supply thanks to AMD: Ryzen 3000/4000, Ryzen 5000, RX 5000 GPUs, RX 6000 GPUs, Xbox Series S/X, PS5, and the various other products TSMC has booked for 7nm?
If you'd like the shortages for both AMD and Nvidia to be that much worse, by all means, be upset that Nvidia went with Samsung.
Well, it's not horrible. It's just worse than TSMC 7nm. It's better than 12nm for sure, though.
Good performance from AMD. Let's hope this trend continues. The only minor downside is that AMD dropped an additional display output and replaced it with a USB-C connector, which could cause problems for users who run three DP monitors, since they would need an adapter for the third display. Other than that, AMD has done a pretty good job here. Let's hope AMD can keep up the recent good work on their drivers, since in the past it took them something like six months to fix major issues with their cards.
Some cards have 3xDP+HDMI.
And those that have USB-C can send display data out over it (DisplayPort alt mode).
Get a load of this bad boy:
That's insane indeed. Can't wait to see a 6900 XT at 2.7 GHz.
The choice of games tested is awful, though.
Same here, really interested in the 6900 XT.
Granted, the RT performance is not on par with Nvidia's, but everyone has to be aware that AMD is relatively new to ray tracing. Not only that, the new Big Navi GPUs don't even have dedicated Tensor/RT cores for that matter; it's all done through the CUs (Compute Units).
Showing roughly 2080 Ti performance in RT is actually very good, considering they're still new to it and don't have dedicated RT cores. So yes, very impressive in the RT segment. Give it a while and AMD will catch up rather quickly.
Would be interesting to see, but it's not going to happen:
Ampere GPUs vs. the AMD 6xxx series in RT performance with Nvidia's RT cores disabled, running straight through the CUDA cores themselves. Now we'd be talking an even match in the RT department. Again, not going to happen anyway.
Rather interesting and kinda off topic, take a look at this link:
Here are the BF V results on our site; scroll down to the RT section:
In some of the games benchmarked, the RTX 3090 showed almost three times the performance of a 1080 Ti, but again, they're totally different GPUs. Still, you can use the links above as a reference, even though it doesn't mean much. Absolutely interesting stuff here, folks.
I see great potential in Big Navi, but yes, the 6900 XT is the one we've all been waiting for. Can't wait to see results for that bad boy.
EDIT: NM, AMD has a thing called Ray Accelerators within the CUs (Compute Units), my bad. Still, it's their first generation of RT, so it's going to take some time to improve, which they will.
That wouldn't be a core vs core comparison. You'd need to turn RT processing off on both to actually compare, which isn't very exciting.
Question, since I've never owned an RTX 2xxx/3xxx series card:
Is there a way to shut off the RT cores themselves and process real-time RT through the CUDA cores?
Never really asked till now.
Because if it's possible, I would love to see test results; I don't think there are any.
I would've thought any RT benchmark that doesn't support DXR would work.
I want that Red Devil, so I went to buy it, lol. It was more expensive than all three RTX 3080 cards I had. There were none in stock, and they're not even listing the cards on websites.
Well, in 2000 AMD beat Intel with the world's first 1 GHz Athlon CPU.
Now, in 2020 or 2021, AMD can beat Nvidia with the world's first 3 GHz Radeon GPU. Amazing tech.
I think they were among the first to introduce PCI-E as well, ATI and some other company if I'm not mistaken. They were at the forefront in those days.
Agreed, Very Sick Speeds or VSP!!
EDIT ON PCI-E:
I do remember, a LONG time ago on the ATI site, an introduction to PCI-E together with another company; it's been too long to remember which.
AMD has really good HW engineering, and they need to beef up their SW team, which has done an amazing job despite the obvious difference in budgets. That Red Devil is way awesome.
I want to see the 6900 XT Red Devil, and for it to actually be available for people who want to buy one!
Just noticed the more expensive 6800 XTs have a full backplate. The cheaper ones have an open section... so you need to pay an extra £25-50 for a full backplate. Seems a bit cheap, but with everything I've seen with the inflated pricing, nothing comes as a surprise.
What next, black screens and driver issues? Grabs popcorn and rum, sits back and waits, haha.
P.S. The 6800 actually looks like an interesting GPU based on some benchmarks, especially if you get one that overclocks well.
And on a side note, gonna see how my ageing rig handles RDR2. Gonna post a few pics, which could be interesting or make me cry!
I have a Sapphire 6800, and my clocks are totally different from those in the review. My card normally boosts and holds a clock of around 2320 MHz under full load (games or FurMark, it doesn't matter), so how come it is 2100 in the review? In the overclocking section you mention only a 2304 MHz overclock, which is below my NORMAL highest sustained clock.
My sample boosts to 2560 MHz, which is the maximum clock Radeon Software allows (it says 2600 MHz, but in practice it's 2530-2560).
Can someone explain such a difference? Wouldn't benchmark results be quite different with my card?
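For what it's worth, sustained clocks depend heavily on the board power limit, case airflow, and the silicon lottery, so an individual sample landing a couple hundred MHz away from a review card isn't unusual. If you're on Linux and want to log what your card actually holds mid-game, here is a rough Python sketch reading the amdgpu sysfs interface; the card0 path is an assumption, and the index may differ on your system:

```python
import time
from pathlib import Path

# amdgpu exposes the shader clock states via sysfs; the currently
# active state in this file is marked with a trailing '*'.
SCLK = Path("/sys/class/drm/card0/device/pp_dpm_sclk")  # card index may differ

def current_sclk():
    for line in SCLK.read_text().splitlines():
        if line.strip().endswith("*"):
            # A line looks like "1: 2320Mhz *"; keep just the clock value.
            return line.split(":", 1)[1].replace("*", "").strip()
    return "unknown"

# Sample once a second for two minutes while the game runs.
for _ in range(120):
    print(current_sclk())
    time.sleep(1)
```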