I didn't think much about the GPU encoder prior to getting a Quest 2 VR headset. It uses the GPU encoder, and I've run into too many encoding issues in general, and gotten too many vague answers as to why, that all seemingly point to AMD.

- Oculus doesn't support HEVC encoding in any situation with any AMD GPU today (they do HEVC for Oculus Air Link on NVIDIA) (see a post below for a registry mod)
- Varjo Aero doesn't support any AMD GPU (they don't specify why directly, but I heard somewhere it was encoder-related)
- Virtual Desktop has display driver crashes on Polaris on any driver above 20.10.1 (Oct 2020)
- AMD broke Oculus Link support entirely for 5-6 driver releases starting with 20.11.2
- Virtual Desktop claims HEVC last worked on 5000/6000 GPUs with driver 21.10.2 (there are no known-issues in the driver notes about this)
- AMD broke Oculus Link with the May preview driver, in all 3 22.5.2 drivers, and also 22.6.1, without any known-issue notes, and there have been floods of reports about this lately in various communities (it also happens with other VR headsets)

Notably, the VD issues have been reported to AMD, and the main VD developer has (reportedly) also discussed the issues with AMD. AMD is well aware that their driver on their supported Polaris platform has been broken with a popular piece of PCVR software since 2020.

There are two issues here: AMD doesn't seem to put effort into their encoder stack beyond AMD Link, and they don't care about widespread breakage of support for the most popular and affordable headset, further cementing the rumor that AMD sucks for VR. AMD is already known for not being the best at GPU encoding, and that reputation is spreading to the VR community as well (check some NV vs AMD Quest 2 threads).

This is more of a rant, as I don't expect a forum post to light a fire under AMD to address this, but I would like to try to understand what AMD is trying to do here.
Pushing raw data over display cables isn't the future for VR headsets, and GPU encoding is seemingly here to stay for a while. The Quest 2 is cheap and excels at PCVR. PCVR needs good GPUs. AMD should be dedicating a large chunk of resources to making this experience as great as possible, instead of seemingly just making their stack good enough for lossy 1080p Twitch streaming.

This first became an issue for me when I got a Quest 2 with an RX 580. I was coming from a Rift CV1 with the same GPU. The Quest 2 was a muddy compressed mess, and I had to figure out why. I believe I was the first person to make a point of this publicly with the Quest 2.

I later went from an RTX 3060 to a 6600 XT. The overall Quest 2 experience was notably better with the 3060 (higher encode resolution at 90 and 120Hz), even though the 6600 XT has better raw performance. This will only get worse with higher refresh rate and resolution headsets, and AMD should be sweating if Cambria keeps the same Link encode/decode process.

Overall, I'd like AMD to improve their encoder stack, or to come out and say why others are using it wrong, and to not break official Oculus Link every few drivers. It can't be hard for AMD to attach a Quest 2 to those test machines they have.

___________________________________

To avoid confusion:
- Most of this thread talks about Oculus Link, as in Oculus's official PCVR software
- VD is Virtual Desktop, the most popular PCVR software, and a 3rd-party alternative to Oculus Link
- ALVR and ReLive VR are also 3rd-party alternatives to Oculus Link
- AMD Link (the streaming stuff built into Adrenalin drivers) is only mentioned in this thread once and isn't relevant for VR
- AVC = H.264 and HEVC = H.265; Oculus references AVC/HEVC while others use H.264/H.265

Here's a short-but-lengthy explanation of how rendering works for VR headsets, which also explains why the GPU encoder is important.

Games are rendered at whatever resolution and quality your GPU can handle.
Nothing particularly special here; you need GPU power to process graphical effects and a 3D world at high resolutions like 3616x1840 at 90Hz. The final rendered frame at this point is the highest quality it will ever be.

For traditional VR headsets, this frame goes through minor processing (warp/distortion) and is sent 1:1 to the headset's screen.

In the case of the Quest 2, the final rendered frame instead goes through a compression pass using the GPU's encoder. So your GPU takes that 3616x1840 frame and compresses it to something like 1984x1120. This compressed frame is then sent to the VR headset to be decompressed and displayed on its screen. So the final image shown on the VR headset in this case is a direct result of how high quality the compressed frame out of the PC's GPU encoder is.

The GPU encoder has to keep up with your VR headset's refresh rate, and that has priority, so on encoder-limited GPUs, the higher the refresh rate, the lower the resolution has to be for the encoder to keep up.

___________________________________

To further explain this with numbers (and largely because I like this topic):

With a RX 580, a Quest 2 headset at a 1.0x rendering resolution is: 3616 x 1840 @ 72Hz

A RX 580 can handle these encode resolutions at these refresh rates:
- 1984 x 1120 @ 72Hz
- 1824 x 960 @ 80Hz
- 1664 x 960 @ 90Hz
- 1376 x 800 @ 120Hz

A Rift CV1 headset for reference is: 2160 x 1200 @ 90Hz

So in this case, games are being rendered at 3616 x 1840 and then displayed on the Quest 2 at 1984 x 1120, which is lower than the CV1's 2160x1200 (on top of being lower Hz). The GPU encoder is severely bottlenecking image quality on the Quest 2, even though the GPU can handle the higher rendering resolution and is wasting time rendering at it.
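To make the bottleneck concrete, here's a quick back-of-the-envelope script. The resolutions and refresh rates are the RX 580 numbers from above; the pixels-per-second framing is my own way of looking at it, not anything Oculus documents. Notice that all four encode modes land in the same ~130-160 Mpx/s band, which is what a fixed encoder throughput limit looks like, while the render resolution needs roughly 3x that:

```python
# Back-of-the-envelope: compare render pixel throughput vs what the
# RX 580 encode modes above actually deliver. Numbers are from this
# post; the Mpx/s framing is my own interpretation.

render = (3616, 1840, 72)  # Quest 2 at 1.0x render resolution

encode_modes = [
    (1984, 1120, 72),
    (1824, 960, 80),
    (1664, 960, 90),
    (1376, 800, 120),
]

def mpix_per_sec(w, h, hz):
    """Megapixels per second for a given mode."""
    return w * h * hz / 1e6

print(f"render: {mpix_per_sec(*render):.0f} Mpx/s")
for w, h, hz in encode_modes:
    print(f"encode {w}x{h}@{hz}: {mpix_per_sec(w, h, hz):.0f} Mpx/s")
# render is ~479 Mpx/s; every encode mode sits around 130-160 Mpx/s
```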
To not be bottlenecked by this, your GPU encoder has to handle the same resolution and refresh rate as closely as possible. Realistically, the Quest 2 with PCVR needs 120Hz, or at least 90Hz, to lower the overall Link overhead latency (the Quest 2 in best conditions is about 25ms overall, where I've seen a CV1 at 15ms; the Quest 2 at 72Hz is about 45-60ms overall). GPU overhead is also something to consider, as high GPU use has higher priority than GPU encoding (if you max out GPU usage, the encoder won't be able to keep up).

That's mostly the worst case though, as Polaris is the lowest-end GPU family Oculus supports. However, there are no AMD GPUs currently (even with RDNA2) that can handle the highest encode resolutions with AVC. For further reference, Oculus usually notes the encode resolution with just the width (from the Debug Tool):

- Default max encode width over Oculus Link: 3664
- Max encode width for a RX 580 (AVC): 1984 (72Hz)
- Default width for a RTX 3060 (AVC): 3664 (it can handle over 4000 at 90Hz)
- Default width for a 6600 XT: 3664 (72Hz; only gets lower with higher refresh rates)

The situation may be different with HEVC, but as stated above, that's not an option with AMD currently. I maintain a list of resolutions and other Quest 2 info on my wiki. I recently added RX 6600 XT and RTX 3060 numbers.

___________________________________

I'm also interested in seeing how different GPUs perform with encoding. I don't know if I still have my old reports from a 580, but there's software here that will benchmark GPU encoding at different resolutions and refresh rates: https://alax.info/blog/software (notably AmaEncode and NvcEncode).

AmaEncode H.264 image gallery

With a 6600 XT and H.264, encode latency seems lowest at 1440p at 144 FPS, at around 6ms. Going to 1080p lowers it to about 4ms. Going to 4K spikes it to about 10ms.
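The "has to keep up" requirement above can be sketched as a frame-budget check: per-frame encode latency has to fit inside the refresh interval. This is a rough model of my own, not how Oculus actually schedules work, and the latency figures are the approximate 6600 XT H.264 numbers I quoted above:

```python
# Rough frame-budget model: the encoder keeps up only if per-frame
# encode latency fits inside the refresh interval. Latencies below are
# the approximate 6600 XT H.264 figures from this post; the model
# itself is a simplification (assumption, not Oculus's real scheduler).

def frame_budget_ms(hz):
    """Time available per frame at a given refresh rate."""
    return 1000.0 / hz

encode_latency_ms = {  # approximate 6600 XT H.264 numbers
    "1920x1080": 4.0,
    "2560x1440": 6.0,
    "3840x2160": 10.0,
}

for res, lat in encode_latency_ms.items():
    for hz in (72, 90, 120):
        verdict = "ok" if lat < frame_budget_ms(hz) else "too slow"
        print(f"{res} @ {hz}Hz: {lat:.0f}ms encode vs "
              f"{frame_budget_ms(hz):.1f}ms budget -> {verdict}")
```

By this model, ~10ms 4K encoding just barely fits the ~11.1ms budget at 90Hz but blows past the ~8.3ms budget at 120Hz, which lines up with why higher refresh rates force lower encode resolutions.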
If you want lower latency with Oculus, it seems it would be a good idea to either use 120Hz, or to force a lower encode resolution when using a lower refresh rate.

NvcEncode results from a RTX 3060 (latency in ms):

AVC Gallery: https://imgur.com/a/bC6zcny
Spoiler: AVC
640x360@144    0.85
640x360@260    0.47
1280x720@60    1.00
1280x720@120   1.01
1920x1080@60   2.01
1920x1080@72   2.01
1920x1080@90   2.01
1920x1080@120  2.01
1920x1080@144  2.02
2560x1440@60   3.19
2560x1440@72   3.33
2560x1440@90   3.10
2560x1440@120  3.50
2560x1440@144  3.53
3840x2160@30   7.01
3840x2160@60   7.24
3840x2160@72   7.01
3840x2160@90   7.17
3840x2160@120  7.01
3840x2160@144  6.12

HEVC Gallery: https://imgur.com/a/4blNAyW
Spoiler: HEVC
640x360@144    0.30
640x360@260    0.07
1280x720@60    1.00
1280x720@120   1.01
1920x1080@60   3.00
1920x1080@72   3.00
1920x1080@90   3.00
1920x1080@120  3.00
1920x1080@144  3.00
2560x1440@60   2.42
2560x1440@72   2.63
2560x1440@90   2.51
2560x1440@120  2.60
2560x1440@144  2.25
3840x2160@30   5.00
3840x2160@60   5.01
3840x2160@72   5.01
3840x2160@90   5.01
3840x2160@120  5.07
3840x2160@144  5.15

With the 3060, latency goes from about 3ms to 7ms at any 4K resolution with AVC, and from about 2.5ms to 5ms at any 4K resolution with HEVC.
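If anyone wants to crunch these dumps themselves, the raw `WxH@Hz latency` lines parse easily. A minimal sketch (the three sample lines are pulled from the AVC table above; the regex and tuple layout are just my own choices):

```python
# Sketch: parse "WxH@Hz latency" benchmark lines (as dumped above) into
# (width, height, hz, ms) tuples, e.g. to compare modes or codecs.
import re

raw = """
1920x1080@60 2.01
2560x1440@60 3.19
3840x2160@60 7.24
"""  # sample lines from the AVC table above

pattern = re.compile(r"(\d+)x(\d+)@(\d+)\s+([\d.]+)")
results = [(int(w), int(h), int(hz), float(ms))
           for w, h, hz, ms in pattern.findall(raw)]

for w, h, hz, ms in results:
    budget = 1000.0 / hz  # frame time available at this refresh rate
    print(f"{w}x{h}@{hz}: {ms:.2f}ms encode, {budget:.1f}ms frame budget")
```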