The purpose of a standard, from a marketing perspective, is to tell customers what a device can or cannot do in just a few words. It’s much faster to read “This device supports DirectX 12” than it would be to scan a table of DirectX 12 requirements to make certain a GPU was compatible with the API.
But standards only work if they are designed properly. Based on what we now know about HDMI 2.1, it isn’t.
In theory, HDMI 2.1 should be a substantial upgrade over HDMI 2.0, with a maximum 10K resolution at 120Hz, Display Stream Compression 1.2a support, a latency-reducing and supposedly power-saving mode known as Quick Frame Transport, and an Auto Low Latency Mode (ALLM). A larger list of HDMI 2.1 features is below:
In the past, a chart like this would have been taken as proof that any device claiming “HDMI 2.1” support would, in fact, support these capabilities. That’s no longer true. According to the standard’s developers, all of the features listed above are optional. The only thing a device needs to do to claim HDMI 2.1 support is to offer HDMI 2.0 features.
HDMI 2.1 isn’t just an extension of existing HDMI standards; it’s a wholesale replacement. HDMI 2.0 has been retired and is no longer being licensed. According to TFTCentral, which broke the story, new devices should no longer claim to support HDMI 2.0, all features of HDMI 2.0 are a subset of HDMI 2.1, and all of the features associated with HDMI 2.1 are optional. Devices claiming HDMI 2.1 support are supposed to list which of its features they offer, but that requires the customer to read a great deal of fine print.
The reason TFTCentral went digging into this question in the first place is that Xiaomi is offering a “Fast LCD Monitor 24.5″ 240Hz Version” that advertises itself as HDMI 2.1, while the fine print reads: “Due to the subdivision of HDMI certification standards, HDMI 2.1 is divided into TMDS (the bandwidth is equivalent to the original HDMI 2.0 and FRL protocols). The HDMI 2.1 interface of this product supports the TMDS protocol, the maximum supported resolution is 1920×1080, and the maximum refresh rate is 240Hz.”
When HDMI 2.1 was announced back in 2017, there was no mention of HDMI 2.1 completely replacing HDMI 2.0. HDMI 1.4 continued to exist as a standard well after HDMI 2.0 was announced and shipped. Most organizations do not sunset previous standards so aggressively.
The only people who benefit from allowing all HDMI 2.0 devices to be sold as HDMI 2.1 devices are the hardware manufacturers who can pretend their products deliver meaningful improvements without actually shipping anything new. The USB-IF’s chart above is absolute garbage, but it at least attempts to provide some kind of roadmap for understanding USB performance based on standard support.
Given this situation, we regret to inform you that there is essentially no value or meaning one can assign to a claim of “HDMI 2.1 support” absent a great deal of additional information. Also, some manufacturers are still using HDMI 2.0 branding, which makes the appearance of “HDMI 2.1” all the more confusing.
Users who wish to minimize confusion may wish to look for DisplayPort 2.0 devices where applicable. While features like Adaptive Sync are also optional in DP2.0, the authors of that specification thankfully neglected to make the entire standard opt-in. We can’t speak to why HDMI 2.1 was written so poorly and with so little care for the end-user experience, but no device that claims to support “HDMI 2.1” can be trusted without a specific list of which features you are buying. HDMI 2.1 devices should be assumed to be HDMI 2.0 devices in disguise unless the manufacturer provides specific information to the contrary.
I like going to cable manufacturer Cable Matters for information, since they make the cables for the various outputs.
Here Cable Matters explains the difference between HDMI 2.0 and 2.1: https://www.cablematters.com/blog/HDMI/hdmi-2-1-vs-2-0
So if a device says it is HDMI 2.1, look at the transmission rate in the specs to see what it can support. If it is 48 Gbps, then it is an HDMI 2.1 device, at least in speed.
HDMI 2.1 is a landmark revolution of the standard, rather than the incremental evolutions of the past few generations. Where HDMI 2.0 built upon HDMI 1.4 by improving color spectrum support, increasing transmission and data rates by over 50 percent, and doubling the support for audio channels, HDMI 2.1 turned the standard on its head.
HDMI 2.1 supports a maximum transmission bit rate of 48 Gbps, compared with HDMI 2.0's mere 18 Gbps. Max effective data rate tells a similar story, with HDMI 2.1 supporting up to 42.6 Gbps, where HDMI 2.0 manages just 14.4 Gbps. All that additional bandwidth opens up the HDMI standard to higher resolutions and refresh rates than it ever had before, making it a true competitor for the high-end DisplayPort standard.
HDMI 2.1 offers native support for 4K resolution at high refresh rates like 120Hz, and even up to 144Hz. It also enjoys better support for 5K resolution at up to 60Hz (where HDMI 2.0 needed compression technology to make that viable) and 8K at a 30Hz refresh rate. Previous generations of HDMI cables weren't capable of compression, but with the new, visually lossless DSC 1.2 now supported by the new HDMI standard, even higher resolutions and refresh rates are possible.
With DSC 1.2 enabled, HDMI connections can handle up to 10K resolution at up to 120Hz. The same goes for 8K and 5K resolution, with 4K resolution playable at up to 240Hz when compression is used.
These are major advancements of the HDMI standard and put HDMI 2.1 vastly ahead of HDMI 2.0 in terms of its raw power and capabilities. HDMI 2.1 vs. 2.0 is a blowout when it comes to performance.
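As a rough sanity check on the figures above, the bandwidth a video mode needs can be estimated as width × height × refresh rate × bits per pixel. The sketch below is my own illustration, not from the article; it ignores blanking intervals and audio (so real HDMI requirements run somewhat higher) and compares against the 14.4 Gbps and 42.6 Gbps effective data rates quoted above.

```python
# Rough estimate of uncompressed video bandwidth for a display mode.
# Ignores blanking intervals and audio, so real-world requirements
# are somewhat higher than these figures.

def required_gbps(width, height, refresh_hz, bits_per_color=8, channels=3):
    """Active-pixel data rate in Gbit/s for an uncompressed RGB signal."""
    bits_per_pixel = bits_per_color * channels
    return width * height * refresh_hz * bits_per_pixel / 1e9

HDMI_20_EFFECTIVE = 14.4  # Gbps, effective data rate quoted above
HDMI_21_EFFECTIVE = 42.6  # Gbps, effective data rate quoted above

for name, w, h, hz in [("1080p @ 240Hz", 1920, 1080, 240),
                       ("4K @ 120Hz",    3840, 2160, 120),
                       ("8K @ 30Hz",     7680, 4320, 30)]:
    need = required_gbps(w, h, hz)
    print(f"{name}: ~{need:.1f} Gbps  "
          f"fits HDMI 2.0: {need <= HDMI_20_EFFECTIVE}  "
          f"fits HDMI 2.1: {need <= HDMI_21_EFFECTIVE}")
```

Even this crude estimate shows why 1080p at 240Hz (~11.9 Gbps) fits inside an HDMI 2.0-class link, while 4K at 120Hz (~23.9 Gbps) does not, which is exactly the Xiaomi monitor situation described earlier.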
"So if a device says it is HDMI 2.1, look at the transmission rate in the specs to see what it can support. If it is 48 Gbps, then it is an HDMI 2.1 device, at least in speed."
The transmission rate alone says nothing about what features are supported beyond resolution and refresh rate. And what about an FRL device that supports 32 Gbps (like the PS5)? By your definition, the PS5 is not an HDMI 2.1 device because it doesn't support up to 48 Gbps. So what is it then? Is it HDMI 2.0? Well, no; HDMI 2.0 utilized TMDS-type signaling and was limited to 18 Gbps.
Anything that utilizes the Fixed Rate Link (FRL) signaling type is effectively HDMI 2.1. But does it have to support the maximum bandwidth of 48 Gbps? That is where the problem comes in, and doing something similar to Freesync/Premium/Pro, where different features are supported at different tiers, is probably worthwhile. Like I said earlier, if you require that all features are met, then virtually no displays are HDMI 2.1, due to the lack of DSC over HDMI. But they aren't 2.0 either. Labeling everything plain "HDMI" is what we most want to avoid; otherwise it'll just be the Freesync problem all over again.
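The 32 Gbps figure isn't arbitrary: the HDMI 2.1 spec defines FRL as a set of lane/rate combinations, and only the top one reaches the headline 48 Gbps. A quick sketch (the lane/rate pairs are the FRL modes from the spec; the code itself is just my illustration):

```python
# FRL (Fixed Rate Link) modes defined by the HDMI 2.1 spec,
# as (lanes, Gbps per lane). Only the top mode (FRL6) reaches the
# headline 48 Gbps; a 32 Gbps device like the PS5 is running a lower
# FRL mode, yet it is still FRL signaling, not HDMI 2.0-style TMDS.
FRL_MODES = [(3, 3), (3, 6), (4, 6), (4, 8), (4, 10), (4, 12)]

totals = [lanes * rate for lanes, rate in FRL_MODES]
print(totals)  # [9, 18, 24, 32, 40, 48]
```

In other words, "supports FRL" and "supports 48 Gbps" are different claims, which is why judging HDMI 2.1 status by the 48 Gbps number alone misclassifies devices like the PS5.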
What I should have said was: if a device supports transmission rates greater than HDMI 2.0's 18 Gbps, up to HDMI 2.1's maximum of 48 Gbps, then it can be considered to have HDMI 2.1 aspects.
I am going solely by the HDMI 2.1 standard, which states that HDMI 2.1 supports a maximum transmission bit rate of 48 Gbps, compared with HDMI 2.0's mere 18 Gbps.
So yes, the PS5 would be considered to have HDMI 2.1 if it supports transmission rates higher than 18 Gbps.
Also, a device saying it supports HDMI 2.1 doesn't necessarily mean it will support all the features of HDMI 2.1, only some of them.
That is why a user needs to look at the specs to determine which features are supported when a device says it supports HDMI 2.1.
"That is why a user needs to look at the specs to determine which features are supported when a device says it supports HDMI 2.1."
Which is absolutely what we want to avoid, because that will just be Freesync all over again. It was up to the user to determine what the Freesync range was, whether HDR and LFC were supported, and whether gamma remained stable across the range. Leaving it in users' hands did not go well and led to a lot of disappointment. HDMI seems hell-bent on repeating the mistakes of the past, so I guess it will be up to a third party to produce the HDMI version of GSync-compatible.
This was already true anyway. When you look at HDMI 2.1 GPUs, the full feature set is supported; however, how many HDMI 2.1 displays support DSC? Virtually none of them. But it is a slippery slope to go down.
AMD should know better than anyone, as this is strikingly similar to what occurred with VESA adaptive sync and Freesync. Freesync was supposed to be an extension of the standard with certified performance criteria, but it effectively became just a synonym for it, with no tested performance metrics.
Freesync displays had no minimum range for VRR, optional LFC support, no criteria for gamma performance in the VRR range, optional HDR support, etc. Users were forced to glean for themselves what was and was not actually supported, which led to a host of user issues, most specifically brightness flickering.
That was a huge thread on these forums that went on for years, caused by monitor manufacturers not qualifying that display gamma was consistent over the Freesync range, and by AMD not requiring them to do it.
It wasn't until Nvidia added support for VESA adaptive sync with GSync-compatible that these displays were held to any kind of standard. AMD has since added the Premium and Premium Pro tiers to help differentiate what is supported.
So it would be in consumers' interest to add some easy marketing language that lets them determine which features are supported in HDMI and which are not.