AMD last week released the Radeon Adrenalin 21.8.1 drivers. With the new drivers installed, the RX 6600 XT consumed very little power when watching YouTube videos and driving two monitors. ... Latest AMD Radeon Driver Reduces Power Consumption of RX 6000 Cards with YouTube
Any Radeon GPU? Because my 1070 reports 16-18 W and 405-810 MHz memory clocks, and so did my 1080 Ti and 2070 Super.
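For what it's worth, you can check what your card actually reports at idle. A minimal sketch, assuming a single NVIDIA GPU with nvidia-smi on the PATH (AMD users would read the same figures off the Radeon Software metrics overlay instead):

```python
# Minimal sketch: query nvidia-smi for the reported power draw and memory
# clock. Assumes one NVIDIA GPU and nvidia-smi available on the PATH.
import subprocess

def gpu_idle_stats():
    out = subprocess.check_output(
        [
            "nvidia-smi",
            "--query-gpu=power.draw,clocks.mem",
            "--format=csv,noheader,nounits",
        ],
        text=True,
    )
    power_w, mem_mhz = (field.strip() for field in out.strip().split(","))
    return float(power_w), int(mem_mhz)

if __name__ == "__main__":
    power, mem = gpu_idle_stats()
    print(f"Reported draw: {power:.1f} W, memory clock: {mem} MHz")
```

If the memory clock sits pinned at its maximum on the desktop, that's the multi-monitor downclocking issue people are describing.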
Waste of energy. Having more than one display already adds to power consumption and heat production; I don't need the card putting out an extra ~20 W 24/7 on top of that. If you compare a 3080 and a 6800, 320 W typical vs. 250 W typical, the 6800 will draw less power and heat up the room significantly less in one hour of gaming. But over 1 h of gaming plus 4 h of desktop use, the total draw and heat output end up about the same if the 6800 doesn't downclock with multiple monitors: the extra idle watts over four hours eat up the 70 W saved during the gaming hour. What an absolute waste of the engineering they did for RDNA2 gaming efficiency.
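Rough math behind that claim, using the post's numbers. The 30 W shared desktop idle figure and the ~20 W multi-monitor penalty are assumptions for illustration, not measurements:

```python
# Back-of-the-envelope energy comparison using the numbers from the post.
# Assumes the 6800 burns an extra ~20 W at desktop idle when it fails to
# downclock with multiple monitors; all figures are rough "typical" values.
GAMING_HOURS = 1
DESKTOP_HOURS = 4

RTX_3080_GAMING_W = 320
RX_6800_GAMING_W = 250
IDLE_BASE_W = 30              # hypothetical shared desktop idle draw
MULTI_MONITOR_PENALTY_W = 20  # extra idle draw when memory stays at full clock

wh_3080 = RTX_3080_GAMING_W * GAMING_HOURS + IDLE_BASE_W * DESKTOP_HOURS
wh_6800 = (RX_6800_GAMING_W * GAMING_HOURS
           + (IDLE_BASE_W + MULTI_MONITOR_PENALTY_W) * DESKTOP_HOURS)

print(f"3080: {wh_3080} Wh, 6800: {wh_6800} Wh")
# 3080: 440 Wh, 6800: 450 Wh -- the idle penalty roughly cancels the gaming savings.
```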
Tbh, if I were going 80"+ on a TV, I'd pretty much want 8K at this point. My brother-in-law has a 4K projector (VW715ES) that he puts up on a 120" screen, and while the scale is impressive, the detail definitely starts to degrade. I'm sure the number of people it affects is like 12, but if you want 8K60 it's kind of frustrating that AMD supports it over HDMI while their decoder simply can't handle it.
Hey, no fair, you are thinking rationally here. I've seen quite a few posts on this forum where size/distance supposedly doesn't matter and 8K is desirable even on a 32" PC monitor.
If only a 4K signal were just a 4K signal. A bad 4K signal can easily be worse than 1080p, and a good 4K signal can be better than a bad 8K one. I replaced my media-center APU because the new codec was too demanding: before, I could run 1080p no problem, but with the new codecs I could only manage 720p, dropping all the way down to 480p if the 720p stream was 60 Hz only. Codec and compression make a big difference.
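If you want to see why a clip suddenly got too demanding, checking which codec it actually uses is the first step, since an APU that decodes H.264 at 1080p fine may choke on the same resolution in HEVC or AV1. A minimal sketch, assuming ffprobe (from FFmpeg) is installed:

```python
# Minimal sketch: ask ffprobe for the codec, resolution, and frame rate of
# the first video stream in a file. Assumes ffprobe is on the PATH.
import subprocess
import sys

def video_codec_info(path: str) -> str:
    return subprocess.check_output(
        [
            "ffprobe", "-v", "error",
            "-select_streams", "v:0",
            "-show_entries", "stream=codec_name,width,height,avg_frame_rate",
            "-of", "csv=p=0",
            path,
        ],
        text=True,
    ).strip()

if __name__ == "__main__":
    # prints e.g. "hevc,3840,2160,60/1" -- codec, resolution, frame rate
    print(video_codec_info(sys.argv[1]))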
Cry more, ROFL. My GTX 970s, etc. did the same as my R9 290s did with triples. Seriously, you act like the VRAM clocks being at max adds 90 W at idle. You're a clown.