Heh, how do you even encode and decode on the HW level at the same time? Is it an Adobe thing? Granted, I can't test this through Adobe, because f*ck their subscription, so I have no access to a HW HEVC encoder there. Anyway, when I try such a reencode, I hit a heavy CPU limit. And even if I had the power to decode without bottlenecking on the CPU while reencoding, it would've been like 20-25 FPS at best. Granted, it's reencoding into HEVC 10-bit on the quality preset, which is quite a heavy load.

P.S. Still only about 15W of power for this reencode... on the GPU, I mean))). The CPU takes 106W to decode this monstrosity.

P.P.S. And now I know that MPC-HC is quite messy between HW AV1 and HW HEVC 10-bit decode. Some methods support HW decode of 10-bit HEVC but not AV1, and some vice versa...

P.P.P.S. Seems like RDNA2 overflows decoder load onto the 3D renderer graph: 4K@60 HEVC shows 100% decode + 20-22% 3D load (at very low clocks, though). Total GPU power consumption is about 19W for that.
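For the "decode and encode on HW at the same time" part: outside of Adobe, ffmpeg can do a full-hardware transcode where decoded frames stay in GPU memory and feed the HW encoder directly, with no round-trip through system RAM. A hedged sketch below using VAAPI on Linux (the flags are real ffmpeg options, but whether `hevc_vaapi` and AV1 decode are available depends on your GPU and ffmpeg build; the file names and `-qp` value are just placeholders). It's printed as a dry run here so it's safe to paste; drop the `echo` to actually run it.

```shell
# Hedged sketch: full-hardware transcode (e.g. AV1 in -> HEVC 10-bit out).
# -hwaccel_output_format vaapi keeps decoded frames on the GPU, so the
# encoder reads them without copying through system RAM (hence low CPU use).
# scale_vaapi=format=p010 converts to a 10-bit surface for the encoder.
cmd="ffmpeg -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 \
 -hwaccel_output_format vaapi -i input.mkv \
 -vf scale_vaapi=format=p010 -c:v hevc_vaapi -qp 24 output.mkv"

# Dry run: just show the command. Remove this echo (and run "$cmd" via eval
# or paste it directly) on a machine with a VAAPI-capable GPU and driver.
echo "$cmd"
```

On Windows the equivalent path would be `-hwaccel d3d11va` feeding an AMF/NVENC/QSV encoder (e.g. `hevc_amf` on AMD cards), again assuming the build was compiled with that encoder.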