Discussion in 'Videocards - AMD Radeon' started by OnnA, Oct 29, 2020.
582fps mesh shaders
Yeah the screen is nuts lol. Still pushing this Red Devil. It's an amazing card when pushed correctly.
I'm curious if the frequency tab is pre-binning for the GPUs. Mine lists 2577MHz, and I can't set anything higher in Radeon settings and keep it stable.
I think the auto overclock feature is a good indication.
Mine shows 1509MHz there, and I can tune it to a 2675MHz setting, which gives a real frequency in the range of 2620MHz.
So is your GFX maximum frequency listed in More Power Tool 1509MHz, or is your GPU clock listed in GPU-Z 1509MHz?
I meant that if you apply auto overclock in Radeon Software, it automatically offers 2509MHz. So this is an indication that it is possible to tune it even higher.
I currently have a 2675MHz@985mV setting in Radeon Software, with MPT set to 270W and 310A TDC, plus a 15% power limit in Radeon, which means total GPU power can go to 361W compared to 300W at default.
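For anyone following the arithmetic: the MPT value is the GPU-only limit, the Radeon slider scales it, and the rest of the board (VRAM, fans, VRM losses) adds roughly 50W on a reference card. A minimal sketch, where that ~50W overhead is an assumption rather than a measured value:

```python
# Rough estimate of total board power from an MPT power limit and the
# Radeon power-limit slider. The ~50W non-GPU overhead is an assumed
# reference-card ballpark, not a measurement.

def estimated_board_power(mpt_gpu_watts: float,
                          slider_percent: float,
                          board_overhead_watts: float = 50.0) -> float:
    """GPU power limit scaled by the slider, plus non-GPU board draw."""
    gpu_max = mpt_gpu_watts * (1 + slider_percent / 100)
    return gpu_max + board_overhead_watts

# 270W in MPT with the +15% slider lands near the 361W quoted above.
print(estimated_board_power(270, 15))  # ~360.5W
```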
My actual frequency in HWINFO with my current overclock setting is 2600MHz-2630MHz.
Time Spy score of 19800 and Port Royal 10120.
This is the setting for daily gaming. It is running wonderfully.
The actual performance uplift compared to my default setting is 7% in AC Valhalla and also 7% in Borderlands on DX12.
I pasted my results at TPU here:
Note: I originally tried 2685MHz@980mV, but the Port Royal stability test produced 98.3%. Dropping it to 2675MHz@985mV increased stability to 99.7%.
10MHz less brings no significant performance drop but better stability, for a setting meant for daily use.
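For reference, 3DMark's stress tests compute frame-rate stability as the worst loop's score over the best loop's score, with 97% as the pass threshold. A minimal sketch of that metric (the loop scores below are made-up illustrative numbers, not results from this thread):

```python
# 3DMark-style frame-rate stability: worst loop score divided by best
# loop score, as a percentage. The 97% pass threshold matches the
# stress tests; values like 98.3% vs 99.7% above are this ratio.

def frame_rate_stability(loop_scores: list[float]) -> float:
    return 100.0 * min(loop_scores) / max(loop_scores)

def passes(loop_scores: list[float], threshold: float = 97.0) -> bool:
    return frame_rate_stability(loop_scores) >= threshold

# Example with invented loop scores from a hypothetical 20-loop run:
loops = [10120, 10105, 10090, 10112, 10098]
print(round(frame_rate_stability(loops), 1))  # 99.7
```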
Just hit #38 all-time in Fire Strike Ultra with my current hardware settings. Time Spy is a finicky pain in the ass. I did, however, increase my CPU score by about 600 points by modifying PBO limits in the BIOS. I'm pretty happy. My MPT settings are linked below as well. 480 + 50, btw, for total board draw on the 3x8-pin Red Devil. I've seen it draw 457 watts on the GPU core alone in Time Spy, for a total board draw of around 507 watts. Right at the limit.
Fixed my high junction temps issue. Slapped an EK-Quantum Reaction AIO RX 6800/6900 on it.
This is about as hot as it gets now. 2600 stable seems to be as far as it will go without using MPT. I will start to play with that once I get a case that can fit the AIO; it's actually bigger than 240mm due to the pump and res being attached at the bottom.
Also managed to break a 20k graphics score in 3DMark and 10k in Port Royal.
Well boys, I've done it. With the new Radeon driver, I don't know wtf AMD did, but they upped the secret sauce. Not ONE hard crash in Cyberpunk 2077, AND I was able to crank the GPU to the absolute limit. Hitting 58 to 60 FPS on full ULTRA settings at 5120x1440. Seeing clocks spike as high as 2791MHz!!! 2820MHz OC in Wattman.
AMD just... cripes, what a card. Screenshot for proof, and my overclock settings in Wattman with my 480W (505W board power draw) MPT bios. Btw, this OC isn't exactly stable in traditional benchmarks, but in games it's fine lol. I'll need to try an undervolt to get it to work in benchmarks.
EDIT: The 2790MHz OC is stable in all my games now, including COD Black Ops. Testing benchmarks now...
To test stability, use the Port Royal stress test and the Fire Strike Extreme stress test.
I can set my 6800XT to reach 2.7GHz in benchmarks and games with MPT at 290W and 350A TDC, but it's not really stable when checked with the Port Royal stress test; I saw artifacts during the run.
The real 100% stable setting for me is 2675MHz@985mV and VRAM 2100MHz (MPT 275W, TDC 310A, and 15% power limit).
This gives up to 2640MHz in games and is rock stable.
The game performance increase is around 7% compared to the default setting.
I gave up on anything higher, as I wanted a setting that is very stable; otherwise I got occasional game crashes, which were annoying.
I have the reference card, not 3x8-pin like yours.
I purchased the MSI RX 6900XT reference and had some weird issues with games crashing when I raised the power limit to 15% or even just changed the fan curve, but there wouldn't be an issue with the default fan curve. So it seemed that modifying either of those would increase power usage and crash the system at times, and I have an EVGA Supernova G2 1000W. Raising the fan curve would make the GPU stick at its boost clock more often, triggering it. I had two separate 8-pin PCIe cables connected to the PSU in Port 1 and Port 2, and found out that when connecting two cables it had to be Port 1 and Port 4, or Port 2 and Port 5, as Ports 1, 2, 3 share similar lanes and Ports 4, 5 share a different lane, even though this is a single-rail PSU. Made no sense to me, but I moved the secondary power cable to Port 4 and voila, stable as a rock lol. It still uses less power than my Vega 64 Liquid Cooling Edition and barely any more than the Radeon VII while being twice as fast or more, which is just ridiculous!
And the best thing: I only paid $100, as I sold my Radeon VII anniversary on eBay for $1,400 and used the same money to buy the RX 6900XT for $1,400 on the marketplace. Happy camper lol
As I currently don't have much time to game, I've been making the GPU pay itself back at 150W... But anyway, I've been enjoying Total War: Three Kingdoms quite a bit lately! The card is everything I hoped for and more. The OC/UV capabilities are all there; getting more out of the card while using less power is simply amazing.
If anyone is looking for a silent card, I can recommend the MSI 6800XT Gaming X Trio; it's really silent and the temps are really good.
Fans are around 1500rpm most of the time. I was going to put it on a waterblock, but I don't think I will now.
It doesn't have the best BIOS for overclocking, but apart from that it's a good card.
Pretty sure none of the 6000 series GPUs have a particularly great stock BIOS, due to the limitations AMD seems to be pushing. There is some room for tweaking, although Wattman itself isn't the best or the clearest about what the changes actually affect, so More Power Tool plus careful monitoring and testing stands a better chance, though the voltage locks and clock speed cap still hinder any fancy increases.
But at least undervolting still works, and if the Navi20 lineup shares the same value limitations, there's a fairly high chance the 6800 XT could clock higher than the 6900 XT, giving clock speed an edge over compute units in most scenarios, possibly changing a bit at higher resolutions or in specific tasks.
It's kinda close to that point already though so not much really changed with these restrictions or so it feels.
The 6800 would still be lower due to missing that chunk of hardware too, although even so, the 10-15% difference from one card to the next at reference performance is still quite low.
(I was initially expecting around a 15-25% difference from clock speed and hardware combined.)
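The napkin math behind those percentages: peak shader throughput scales roughly with compute units times clock. A sketch using the public CU counts and reference boost clocks; real game scaling is flatter than this linear model, so treat it as an upper bound:

```python
# Back-of-envelope throughput comparison: peak FP32 scales with
# CUs x clock. CU counts and reference boost clocks are public specs;
# actual game performance does not scale this linearly.

CARDS = {                 # (compute units, reference boost clock MHz)
    "RX 6800":    (60, 2105),
    "RX 6800 XT": (72, 2250),
    "RX 6900 XT": (80, 2250),
}

def relative_throughput(a: str, b: str) -> float:
    """Theoretical peak of card a relative to card b (1.0 = equal)."""
    (cu_a, clk_a), (cu_b, clk_b) = CARDS[a], CARDS[b]
    return (cu_a * clk_a) / (cu_b * clk_b)

# 6900 XT vs 6800 XT is ~11% on paper -- in the 10-15% range noted above.
print(round(relative_throughput("RX 6900 XT", "RX 6800 XT"), 3))  # 1.111
```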
Anyone here able to test Black Ops Cold War multiplayer for me?
I seem to be getting an issue in this game where GPU clocks will drop as low as 400MHz and usage is all over the place. The frame rate spikes and the game stutters A LOT. Sometimes it's fine for a while, but after about 5 minutes of a match the game will start hitching.
Warzone is the complete opposite: everything maxed out (except RT) @ 1440p, and I am getting upwards of 230fps, mostly hovering around the 150fps mark; GPU clocks are stable and hit 2.4GHz, and usage is nearly always 99% with zero hitching.
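If you want to pin down when the clocks collapse, logging a session and scanning the log afterwards helps. A minimal sketch, assuming a CSV export with hypothetical column names gpu_clock_mhz and gpu_usage_pct; real HWiNFO/Adrenalin exports name their columns differently, so adjust to match your log:

```python
# Scan a logged gaming session for core-clock dips like the 400MHz
# drops described above. Column names are assumptions; rename to match
# whatever your monitoring tool exports.

import csv

def find_clock_dips(path: str, floor_mhz: float = 1000.0) -> list[int]:
    """Return sample indices where the core clock fell below floor_mhz
    while the GPU was still being asked for work (usage > 50%)."""
    dips = []
    with open(path, newline="") as f:
        for i, row in enumerate(csv.DictReader(f)):
            clock = float(row["gpu_clock_mhz"])
            usage = float(row["gpu_usage_pct"])
            if clock < floor_mhz and usage > 50:
                dips.append(i)
    return dips
```

Low clock at low usage is just idling, so the usage filter keeps menu screens and loading pauses out of the results.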
Strange one. I've tried a few things, but the issue still persists in Cold War. I'm using the latest drivers at the time of writing this post.
My GPU temps are perfectly fine and within spec. I am not overclocking the card, just raised the power limit to +15% and created a custom fan curve. Max GPU temps are 75C core, 85C junction hot spot; it idles around 27C core and 32C junction.
Does AMD have a setting in the drivers, or anywhere else, that forces high performance, the way Nvidia has three power settings: "optimal", "adaptive", "prefer max performance"?
Cheers, sorry if this isn't really the place to ask I didn't want to start a whole new thread for something that could be fixed easily?
@CPC_RedDawn Hi, so far we've only tried the trial version, but no, we haven't seen an issue like this; it only crashes randomly (at alt+tab, for example).
The hitching occurs in Fortnite for me, especially when panning the camera fast in any direction; fps goes from 300 to 40.
Strange one then, I too see some random crashes when alt tabbing also.
Will do some more testing trying out other things.
This happens to me also. I haven't tried Fortnite, but it happens in Cold War A LOT, and in a few others too. It gets terrible in Cold War, to the point the game becomes unplayable.
Could you test fortnite again and monitor your clockspeeds and usage on the GPU when these stutters happen?
Yeah, the core is bouncing all over the place; sometimes it's reading nearly 0.
Mate, you are so right about these cards. I've had one for about a month now and I am astounded at how good they are; they're superb. I paid £1300 for mine, which is £300 over MSRP, but I don't care, it's amazing. Got a totally new build with a 5800X CPU, a ROG STRIX X570-F Gaming MOBO, and 32GB of 3600MHz RAM = happy gamer