Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 6, 2020.
Of course it is.
It's not; the standards consortium hasn't assigned VVC a number yet.
The creators call it VVC/H.266, so you can safely assume it exists.
The creators don't get to decide on the number.
Ok man good talk.
Is every movie going to look like a cartoon now?
Meanwhile, everyone except Netflix, Amazon Prime, and the Apple store is still on H.264, and YouTube's H.265 support is still experimental, despite Google having developed its own equivalent format (VP9).
The 2080 Ti can barely record H.265 4K60 with its onboard chip (NVENC), and what comes out has artefacts and massive framerate drops.
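For context, this is roughly the kind of NVENC encode being described; a minimal sketch, assuming a recent ffmpeg build with hevc_nvenc support and an RTX-class GPU, with a placeholder 4K60 capture file and illustrative settings:

```python
# Sketch: hardware H.265 encode via NVIDIA's NVENC, using ffmpeg's
# hevc_nvenc encoder. Assumes a recent ffmpeg build with NVENC support;
# "capture.mkv" is a placeholder 4K60 source.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "capture.mkv",
    "-c:v", "hevc_nvenc",
    "-preset", "p5",   # slower NVENC presets trade speed for quality
    "-b:v", "40M",     # 4K60 needs a high bitrate to keep artefacts down
    "out_4k60.mkv",
], check=True)
```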
What GPU decodes H.266 in hardware? Will a future RTX 3080 Ti support H.266?
We don't know.
As DICE would put it, "we don't have the technology."
Bitrate doesn't mean quality. There are already the "feature" profiles and the "performance" tiers and levels (the latter two defining maximum bitrates) for both AVC and HEVC. The industry already has to qualify for specific profiles, tiers, and levels depending on the product being sold. Other profiles, tiers, and levels are optional (e.g. 10-bit on AVC, which no known hardware on earth can decode with hardware acceleration, despite tons of stupid people re-encoding anime to 10-bit AVC for futile reasons while refusing to use HEVC, where 10-bit hardware decoding is part of the standard and where 10-bit encoding offers better compression for animated video).
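To make that 10-bit comparison concrete, here is a rough sketch of the two encodes in ffmpeg terms, assuming a build with libx264 (10-bit capable) and libx265 on PATH; the filenames and CRF values are illustrative only:

```python
# Sketch: 10-bit AVC (Hi10P, software-decode only) versus 10-bit HEVC
# (Main 10, hardware-decodable). Assumes ffmpeg with libx264 and libx265;
# "input.mkv" is a placeholder source file.
import subprocess

SRC = "input.mkv"  # hypothetical source

# 10-bit AVC (High 10 profile): no known hardware decoder for this profile.
subprocess.run([
    "ffmpeg", "-i", SRC,
    "-c:v", "libx264", "-profile:v", "high10",
    "-pix_fmt", "yuv420p10le", "-crf", "18",
    "avc_10bit.mkv",
], check=True)

# 10-bit HEVC (Main 10 profile): hardware decode is part of the standard.
subprocess.run([
    "ffmpeg", "-i", SRC,
    "-c:v", "libx265", "-profile:v", "main10",
    "-pix_fmt", "yuv420p10le", "-crf", "20",
    "hevc_10bit.mkv",
], check=True)
```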
The issue with platforms like YouTube is that you upload already-compressed (and probably badly compressed) video, and the servers then compress it again (and again for every new codec they add).
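A quick sketch of that generation loss, assuming ffmpeg with libx264 on PATH; "upload.mp4" is a placeholder for an already-compressed upload:

```python
# Sketch: generation loss from repeated lossy re-encodes, the effect
# described above. Each pass re-quantizes the previous pass's artefacts.
import subprocess

src = "upload.mp4"  # hypothetical already-compressed upload
for gen in range(1, 4):
    dst = f"gen{gen}.mp4"
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", "libx264", "-crf", "23", dst],
        check=True,
    )
    src = dst

# Comparing gen3.mp4 against upload.mp4 (e.g. with ffmpeg's ssim filter)
# shows the cumulative loss.
```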
Probably the next-next-gen, like the 40 series RTX, but realistically you won't need it until much later, maybe another 5 years.
Quantizer? Constant-quality variable-bitrate encoding modes?
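For anyone unfamiliar with those terms, a rough sketch of the two rate-control modes in ffmpeg/libx264 terms, with placeholder filenames:

```python
# Sketch: constant quantizer versus constant quality (CRF) rate control.
# Assumes ffmpeg with libx264 on PATH; "input.mkv" is a placeholder.
import subprocess

# Constant quantizer: every frame uses QP 23; the bitrate floats freely.
subprocess.run([
    "ffmpeg", "-i", "input.mkv",
    "-c:v", "libx264", "-qp", "23",
    "out_cqp.mkv",
], check=True)

# Constant quality (CRF): the encoder varies the bitrate to hold
# perceptual quality roughly constant across the video.
subprocess.run([
    "ffmpeg", "-i", "input.mkv",
    "-c:v", "libx264", "-crf", "23",
    "out_crf.mkv",
], check=True)
```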
You can safely assume that a decoder can be built leveraging current AMD and Nvidia hardware.
If, however, you mean that general-purpose hardware should decode it at 1% CPU and 1% GPU utilization, then probably not.
I'm interested to see how they came up with "the same quality." Objectively the image has to be worse; it's just that the test group they showed it to rated the image quality the same as H.265's. Could be cool though, if the claim is true, once we have hardware to actually use it in a decade or so.
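For reference, the objective side of such a comparison is typically measured with a metric like SSIM; a minimal sketch, assuming ffmpeg on PATH and placeholder filenames:

```python
# Sketch: the objective counterpart to a subjective test-group rating,
# comparing an encode against its reference with ffmpeg's ssim filter.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "encoded.mkv",    # distorted encode under test
    "-i", "reference.mkv",  # pristine reference
    "-lavfi", "[0:v][1:v]ssim",  # prints per-frame and average SSIM to the log
    "-f", "null", "-",      # decode and compare only; write no output file
], check=True)
```

An SSIM of 1.0 means the frames are identical; subjective "same quality" verdicts routinely come in well below that.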
Compressed video (and audio) looks acceptable, until or unless you have a chance to view the original direct-from-camera high-bitrate version. Then it is plainly obvious what the re-compression to a smaller, highly compressed copy did to it.
Wake me up when smooth YouTube 480p streaming over a 64 kbps connection is possible.
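For a sense of scale, a back-of-the-envelope calculation, assuming an 854x480 raster at 30 fps:

```python
# Back-of-the-envelope: how little data 64 kbps leaves per pixel at 480p,
# assuming 854x480 @ 30 fps (a common 16:9 "480p" raster).
width, height, fps = 854, 480, 30
bitrate = 64_000  # bits per second

pixels_per_second = width * height * fps      # ~12.3 million pixels/s
bits_per_pixel = bitrate / pixels_per_second  # ~0.0052 bpp

print(f"{bits_per_pixel:.4f} bits per pixel")
# Typical streaming encodes sit somewhere around 0.05-0.1 bpp, so 64 kbps
# at 480p30 is roughly an order of magnitude short, before counting audio.
```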