Hoping I can get some insight and help here, since I literally have nothing left to try at the moment.
I have a 6800 XT all set up and calibrated for PC gaming on my LG OLED CX, but I'm running into some issues that I can't seem to fix. I've seen several people with other models say they have no issues doing 4K/120Hz/10-bit/4:4:4, but the only way I can get those options to apply on my CX (which is set to PC mode) is to downgrade the resolution to something like 1280x1024. When I set the resolution to 4K, it defaults to 29.970Hz, and Windows settings only lets me go up to 60Hz. At 60Hz, color depth and pixel format are stuck at 8 bpc and YCbCr 4:2:0 in the Radeon menu. HDR won't function at 60Hz either; it just toggles right back off. I'm also having an issue where HDMI Ultra Deep Color gives only a black screen until it's toggled off. Any insight into what's going on?

(Also, I have an Xbox Series X that has no issue doing all the bells and whistles this display can afford. And to rule out any funny cable business, I've used a Zeskit UHD 48Gbps cable and a Club 3D certified cable, and for posterity I even plugged in the Xbox's cable [which, again, works fine for that device], all to no avail.)
I don't know of any HDMI connection that can do what you want from a PC. I ran into this issue myself with both AMD and Nvidia. The only way I found to get 4K 10-bit HDR @ 60Hz was with a specific DisplayPort-to-HDMI adapter from Club 3D. AMD works for me with the standard driver. With Nvidia you have to use the Studio driver if you want the setting on the desktop, though it works in games with the Game Ready driver.
This is the one I use:
Club 3D CAC-1080 DisplayPort 1.4 to HDMI 2.0B HDR Adapter Supports 4096X2160@60Hz High Dynamic Range
I'm just asking for a little more insight, as I'm really confused by a lot of conflicting info I've been getting: does the 6800 XT not output over HDMI 2.1? One of the reasons I got this display was compatibility with the next-gen cards being able to push that. I was under the assumption the card could at least do 4K at 120Hz, but I can't even achieve that. I also thought DisplayPort was basically the next best thing if you didn't have HDMI 2.1 ports. (Again, I'm newer to OLED displays, HDR, and such, and my info could be completely off base.)
Totally just trying to understand, as several people seem to be having issues across the board. I have noticed that a lot more people running Nvidia cards don't seem to be having as many issues with these same displays, and they seem to be getting the specs that I want. It's not only confusing but also frustrating that I can't get this thing to do even the basics I assumed it was capable of right out of the gate with the correct hardware.
These are user-to-user forums; the spec page on the AMD site that I see is the same one you see. I wish I had those answers for you. I know, for instance, that in the past a vendor could claim compatibility with an HDMI spec without necessarily supporting all of the bandwidth or features of that spec. HDMI in Windows is also limited by what the Windows subsystem can do; because of the DRM in HDMI, it isn't as simple as it is on a console or streaming device. I'm not saying that is the problem, as this card is new to all of us, and nothing I have seen shows anything more than a simple spec number as to what its capabilities are.
You could put the question to AMD support, and maybe they can answer: https://www.amd.com/en/support/contact-email-form
There are also several great people on the AMD threads on Reddit, as well as on the Red Team side of the community forum here.
I already explained how I got the features you want with my AMD card. DisplayPort has supported this for a long time, and it was the way to go for me.
Thanks very much for the response! I guess I'll give those guys a shout too and see if I can't glean more info, and in the meantime I'll go ahead and give DisplayPort a shot!
Hello,
I do not have a 6800 XT, but I do have an RTX 3000-series card hooked up to an LG GX OLED TV. I can confirm that if the HDMI input is set to PC mode, I can select 10-bit 120Hz for both 4K and 1440p resolutions. These display properly on the TV as long as deep color is also turned on for that input.
So the display can do what you want over HDMI; the issue must be somewhere in the AMD card settings. Do you have "HDMI Link Assurance" turned on in the Adrenalin drivers? That can cause the system to run at a lower HDMI speed and could cause the issues you are reporting.
Your statement here made me think of something very important.
Depending on which port on the TV you hook up to, and how, there may be different compliance standards, and there may be settings in the TV menu that need to be changed for the TV to report the right EDID information to the PC.
I think all the HDMI ports on the LG BX, CX, and GX support HDMI 2.1, so it shouldn't matter which port is being used as the input.
I know on my TV, though, I have to configure in the settings what is hooked to each port.
Right. The OP said that he had set the input to PC mode, which is what allows the TV to receive the pure RGB signal along with 10-bit color and higher refresh rates.
I don't have that model, and all these TVs seem to require different settings under cryptic menus with different names. It can be daunting to find the right ones. I went through hell getting that setting working on my Samsung before I figured it out. Even once I had the TV set right and had the right adapter, it still didn't work; I had to use the Nvidia Studio driver as well, since the Game Ready one doesn't do 10-bit on the desktop.
@ajlueke Hi! Thank you so much for the response! Unfortunately I do not have that setting enabled, so it looks like it isn't that. As far as my armchair research has gleaned, while there are some issues, a lot of RTX users aren't having too much trouble pushing what I want with this same TV. In fact, some more recent sleuthing points to the major culprit: as you stated, for the TV to unlock the video options I'm targeting, the Ultra Deep Color setting HAS to be enabled. It just refuses to work at ANY resolution or setting, merely gives me a black screen until I toggle it off. It seems to work just fine for my Xbox Series X, though, even with the same cord. Not sure what to glean from that other than it's just not playing nice with my card.
UPDATE: So, some weird but somewhat helpful happenings as of late. Following in the footsteps and advice of @pokester, I was shopping with the missus, saw a DisplayPort-to-HDMI adapter, and picked it up. Mind you, I know better than to trust some random generic-brand thing to do anything notable, but it did lead to some weird... success?

Using the adapter, my display sprang up at 1920x1080, 60Hz max, surprising absolutely no one. The real surprise was that I could in fact put the display into Ultra Deep Color mode and enable HDR. Up until now, nothing allowed UDC to work without giving a blank screen and dropping signal. I switched back to HDMI without actually toggling anything off, and surprisingly the display held the settings, even on the regular HDMI port, though still locked at 1920x1080 @ 60Hz. I tried to mess with the color settings in the Radeon menu and broke my stride when I was met with another blank screen and had to reset it all.

This time, however, I tried just plugging in the HDMI and applying the settings the DisplayPort adapter had given me. Mind you, I've done this before, but for whatever reason it totally worked this time: UDC enables, 120Hz unlocks, 4:4:4, the works. I don't know what's different; I've done this before to no avail, hence my posting here, and now all of a sudden it plays nice... well, that's not accurate. It is INCREDIBLY unstable. Messing with any settings related to UDC can cause a signal loss, and I have to start back at square one. Enabling HDR is consistent, but turning it OFF is a crapshoot and can also cause signal loss.

I can't really say the issue is solved, and I haven't tried playing games like this yet, as I'm still trying to figure out just HOW unstable it is. But as of right now, just doing general stuff and not touching any of the settings, it's holding! I have at least somewhat gained access to those desired settings; it's just that the why or how of it makes no **bleep** sense.
With that adapter, the bandwidth may have been limited to 1080p. With the right adapter in the DisplayPort you might do better. The one I have and linked to is only rated for 4K 60Hz with HDR. They now have one that does 120Hz, but the reviews on it are a bit mixed. Not surprising, though, as from a PC this is really a bleeding-edge connection. When I got mine, it was the only one on the market that did this and actually worked.
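For a rough sense of why that matters, here's a quick back-of-the-envelope bandwidth calculation. This is only a sketch: it uses a flat ~10% blanking overhead instead of exact CTA-861 timings, and the link capacities in the comments are approximate payload rates after line coding.

```python
# Rough uncompressed video bandwidth estimate (approximation only:
# assumes ~10% blanking overhead rather than exact CTA-861 timings).

def required_gbps(h, v, hz, bpc, subsampling="4:4:4", overhead=1.10):
    """Approximate bits per second on the wire for an uncompressed mode."""
    bits_per_pixel = {
        "4:4:4": 3.0 * bpc,   # RGB or full-chroma YCbCr
        "4:2:2": 2.0 * bpc,
        "4:2:0": 1.5 * bpc,
    }[subsampling]
    return h * v * hz * bits_per_pixel * overhead / 1e9

for name, mode in [
    ("4K60  10-bit 4:2:0", (3840, 2160, 60, 10, "4:2:0")),
    ("4K60  10-bit 4:4:4", (3840, 2160, 60, 10, "4:4:4")),
    ("4K120 10-bit 4:4:4", (3840, 2160, 120, 10, "4:4:4")),
]:
    print(f"{name}: ~{required_gbps(*mode):.1f} Gbit/s")

# Approximate usable payload after line coding:
#   HDMI 2.0 TMDS (18 Gbit/s raw)   ~14.4 Gbit/s -> 4K60 HDR only with chroma subsampling
#   DP 1.4 HBR3   (32.4 Gbit/s raw) ~25.9 Gbit/s -> 4K120 10-bit doesn't fit uncompressed
#   HDMI 2.1 FRL  (48 Gbit/s raw)   ~42.7 Gbit/s -> 4K120 10-bit 4:4:4 fits
```

That lines up with the CAC-1080 topping out at 4K60 HDR, and with why 4K120 10-bit 4:4:4 really does need the full HDMI 2.1 link working end to end.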
Your card, though, does have a higher HDMI spec, and maybe it really should do this. Talk to AMD support. Talk to your TV maker's support department too; maybe they have advice on a settings change or a TV firmware update.
I am going to mention an AMD guy from the pro side. I would bet he knows exactly what this port should support and could maybe help tell you what it can do. I hope he will try to help you. @fsadough I know this is not a pro product, but it is an AMD flagship product, and there are no specs anywhere saying what the HDMI port on this card can and should do connection-wise to a TV. Thanks!
The RX 6800 XT has an HDMI 2.1 output; with the right HDMI cable, the proper HDMI input selected on the OLED TV, and the associated settings, HDR should be available.
Did you get this working yet? I'm struggling with the same problem right now, and it's super aggravating. I bought this GPU/TV combo specifically to play at 4K 120Hz.
Oh, another thing I just thought of: you actually have to turn on 10-bit color support in the AMD drivers for the card to send the appropriate signal. I think 10-bit color is disabled by default.
It may be that this is itself a driver issue. I know that with older AMD GPUs, 10-bit color was disabled over HDMI at any resolution, despite HDMI 2.0 supporting 10-bit up to 4K 60Hz. It was turned off at the driver level and only supported over DisplayPort. It could be that the driver is still behaving as if the port were HDMI 2.0 and not correctly recognizing its HDMI 2.1 status.
That is actually not true. Once the EDID handshake takes place and a 10-bit display has been detected by the GPU, 10-bit will be enabled automatically. Obviously, if this is not happening, it could be a driver bug or an EDID bug on the display device's side. Other factors could be a bad cable or a bad adapter (if one is used).
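If anyone wants to verify what the TV is actually advertising during that handshake, below is a minimal sketch that checks a dumped EDID for the HDMI Forum Vendor-Specific Data Block, which carries the HDMI 2.x capabilities. It assumes you've already saved the raw EDID to a binary file with a separate tool (e.g. Monitor Asset Manager on Windows or read-edid on Linux). The byte offsets follow the CTA-861 / HF-VSDB layout as I understand it, and note that some displays report these capabilities in the related HF-SCDB instead, which this sketch doesn't handle; cross-check anything important with edid-decode.

```python
# Check a dumped EDID for the HDMI Forum VSDB (HDMI 2.x capability block).
# Sketch only: byte offsets per CTA-861 / HF-VSDB as I understand them, and
# displays using the HF-SCDB variant are not handled here.

import sys

def find_hf_vsdb(edid: bytes):
    """Scan CTA-861 extension blocks for a Vendor-Specific Data Block
    carrying the HDMI Forum OUI (C4-5D-D8, stored little-endian)."""
    for start in range(128, len(edid), 128):      # extensions follow the 128-byte base block
        block = edid[start:start + 128]
        if len(block) < 128 or block[0] != 0x02:  # 0x02 = CTA-861 extension tag
            continue
        end = block[2]                            # offset where the data block collection ends
        i = 4
        while i < end:
            tag, length = block[i] >> 5, block[i] & 0x1F
            payload = block[i + 1:i + 1 + length]
            if tag == 3 and payload[:3] == bytes([0xD8, 0x5D, 0xC4]):
                return payload
            i += 1 + length
    return None

edid = open(sys.argv[1], "rb").read()
vsdb = find_hf_vsdb(edid)
if vsdb is None or len(vsdb) < 7:
    print("No HDMI Forum VSDB found: this input only advertises HDMI 1.4-level features.")
else:
    print(f"Max TMDS character rate: {vsdb[4] * 5} MHz")  # stored in 5 MHz units
    print(f"Max FRL rate code: {(vsdb[6] >> 4) & 0x0F} (0 = TMDS only, 6 = 48 Gbit/s)")
```

Comparing dumps taken with Ultra Deep Color on and off can also show whether the black screen coincides with the input advertising a different capability set.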
OK, I have had the same issue and found a fix. Yes, you need a high-speed HDMI cable; they sell the 48Gbps ones at Best Buy, but they are expensive. To get the display to show something other than a black screen with the Ultra Deep Color setting on (which you need enabled to get the proper color and refresh settings), you need to have another display connected to your computer at the same time. I cannot explain why it works, but it does. I ran into the same thing when I connected my 6800 to my LG CX.
You can either use a monitor with a DisplayPort connection, or use an app called spacedesk to wirelessly use your phone as a second monitor. As soon as you have another monitor connected, the black screen on the CX in Ultra Deep Color mode goes away. I assume it is a bug in the current driver.
So I'm a bit unsure if that previous comment fully answers the question.
Do you mean that you can now get full color sampling/HDR + 120Hz using the HDMI 2.1 cable? Or does it still not reach 120Hz?
I find this bug really annoying, because I bought an Alienware R10 with a 6800 XT, and it looks like I may not get 120Hz and the full color range out of my C9 if that is true.
I had issues too and thought my GPU was toast; I finally worked out it was my TV (LG 65" B8, 2020 model). The frame rate was a total mess and I couldn't fix it at all. I switched to an older TV and it's working fine; I think it was a conflict with LG's VRR or something. I could get 120Hz at 4K no problem, it's just that my games were unplayable.
One thing to take note of is the type of VRR activated in the menus. If I turn on "Instant Game Response" on my LG GX (2020) TV and leave FreeSync off, the display will use the VRR built into the HDMI 2.1 spec. This does not seem to work correctly, as I get a flashing, unstable screen.
Turning on "freesync premium" as well switches the display to AMD's VRR over HDMI, and that works without issue, even at 4K120Hz 10-bit color.