
hardcoregames_
Big Boss

Re: When will AMD add HDMI Forum VRR support?

goodplay wrote:

Win10 1903 VRR, make sure to read all Dev. blogs.

That was in 1809 too, but most did not notice because they do not have a compatible monitor like my new 2019 LG panel. My panel has FreeSync, and even NVIDIA supports it now.

bigyundol
Adept I

Re: When will AMD add HDMI Forum VRR support?

After ten years, I want to buy a new TV too, preferably an LG OLED65C9 this time.
So it would be nice to know whether the new Radeon RX 5700 XT will support HDMI Forum VRR the way the Xbox One X does, or whether AMD will give LG a kick in the pants to add FreeSync support soon, as Samsung has done since last year.

sprungnickel
Adept I

Re: When will AMD add HDMI Forum VRR support?

Well, NVIDIA has come through with an update that enables "G-Sync Compatible" support for the C9/E9 LG OLED TVs. New firmware from LG, released 10/23 for North American sets, is downloadable today. There is a beta driver, 440.52 via NVIDIA GeForce Experience, that enables VRR over HDMI on RTX 20-series and GTX 1660 cards: 40-60 Hz at 4K and 40-120 Hz at 1440p.

Now, AMD: will your cards work with a "G-Sync Compatible" TV over HDMI? Previously, VRR over HDMI has worked with LG monitors on AMD GPUs, and DisplayPort-sourced AMD FreeSync works with "G-Sync Compatible" monitors. The question remains: will AMD enable FreeSync over HDMI to a "G-Sync Compatible" TV such as the LG OLED65C9PUA? The ball is in your court, AMD. I really fancy one of those RX 5700 XT cards at 1440p 120 fps for my C9!
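For the numbers above, the VRR window matters because the panel can only match frame rates that fall inside it. Here is a minimal Python sketch of that check; the ranges are just the figures quoted above, used as assumptions rather than an official specification:

```python
# Illustrative only: does a frame rate fall inside the panel's VRR window?
# Ranges below are the figures quoted above for the C9 over HDMI (assumed).
VRR_RANGES = {
    "3840x2160": (40, 60),    # 4K: 40-60 Hz
    "2560x1440": (40, 120),   # 1440p: 40-120 Hz
}

def in_vrr_window(mode: str, fps: float) -> bool:
    """True if the panel can refresh 1:1 with this frame rate in this mode."""
    low, high = VRR_RANGES[mode]
    return low <= fps <= high

for mode, fps in [("3840x2160", 55), ("3840x2160", 90), ("2560x1440", 90)]:
    verdict = "matched 1:1" if in_vrr_window(mode, fps) else "outside window"
    print(f"{fps} fps at {mode}: {verdict}")
```

Below the lower bound, FreeSync GPUs can repeat frames (LFC) to stay inside the window; whether that carries over HDMI to these TVs is exactly the open question in this thread.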

tygeezy
Adept I

Re: When will AMD add HDMI Forum VRR support?

Yep, I ended up buying a Vega 56 so that I could get FreeSync out of my 2018 QLED TV. I am interested in getting a C9, though, and would have to change my graphics card yet again to get variable refresh rate... just absurd. HDMI 2.1 VRR is essentially just FreeSync. I actually reached out and put this question to someone who works in the display division:

"I am glad you enjoy adaptive sync technologies like FreeSync, and we at AMD look forward to further improvements to FreeSync in the future. Alas I cannot speak to any of that here.

 

As to the differences between FreeSync and HDMI VRR, unfortunately I cannot go into details on either of them. But generally FreeSync can be thought of as a system level specification that covers many GPU and the display device aspects of the whole system, while HDMI VRR is more focused on the cable level protocol for transport of adaptive sync video frames.

 

While this may not answer your question, we will work within AMD to provide more clarity in this area in the future. You are not the first to ask for this."
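To make "transport of adaptive sync video frames" a bit more concrete, here is a toy model, not anything from AMD or the HDMI Forum: a fixed-refresh display can only show a finished frame at the next scheduled refresh, while a VRR display can refresh when the frame arrives, as long as the interval stays inside the panel's supported window.

```python
# Toy model of fixed-refresh vs variable-refresh presentation timing.
# Purely illustrative; timings and behaviour are simplified assumptions.

def fixed_refresh(frame_done_ms, refresh_hz=60):
    """Each finished frame waits for the next scheduled refresh (vblank)."""
    period = 1000.0 / refresh_hz
    return [round((int(t // period) + 1) * period, 1) for t in frame_done_ms]

def variable_refresh(frame_done_ms, window_hz=(40, 120)):
    """Each frame is shown when ready, with the refresh interval clamped to
    what the panel supports (a real panel repeats frames below the range)."""
    min_iv, max_iv = 1000.0 / window_hz[1], 1000.0 / window_hz[0]
    shown, last = [], None
    for t in frame_done_ms:
        last = t if last is None else last + min(max(t - last, min_iv), max_iv)
        shown.append(round(last, 1))
    return shown

frames = [14.0, 31.0, 52.0, 60.0]   # irregular frame completion times (ms)
print("fixed 60 Hz :", fixed_refresh(frames))      # quantised to 16.7 ms steps
print("VRR 40-120  :", variable_refresh(frames))   # tracks the frame times
```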

eccentric
Adept III

Re: When will AMD add HDMI Forum VRR support?

It was my understanding that FreeSync is basically a hardware frame buffer that does what V-Sync is supposed to do, slightly better, and usually lets you disable V-Sync. I think VRR just means that, rather than buffering, the monitor simply shows the frames it is given, so instead of always outputting a constant 60 Hz it adjusts its refresh rate to better match the frame rate being fed into it, which helps eliminate lag spikes. Basically, FreeSync and VRR do the same thing in different ways, and the main thing you will notice from either is lower input lag at the display. Using Enhanced Sync can give an almost FreeSync-like experience on any monitor, assuming you keep the FPS considerably higher than your panel's maximum refresh rate. According to rtings.com testing, enabling VRR on TVs with an Xbox One X dropped input lag at 1080p from about 10 ms to around 6 ms. FreeSync can also greatly lower input lag, so having FreeSync means you basically don't require VRR, maybe? I'm not too sure. I do wonder if they can both be used at once; I believe VRR is built upon FreeSync's free and open standard.

I also have a 2018 Samsung QLED. It's really a 2017 model that was sold through 2018, but in mid-2018 it got a 4K 120 Hz update. I've since bought a bunch of HDMI 2.1 cables. I saw claims of 4K @ 120 Hz on the RX 5700 XT product pages, rushed out and bought one, only to read that this applies only to its DisplayPort 1.4 outputs and that it still has the older HDMI 2.0b ports. My Samsung TV, being a TV, only has HDMI inputs and no DisplayPort 1.4 ports that I can see, and I sure as hell can't find a DP 1.4 to HDMI 2.1 cable, as I'm pretty sure they don't exist. Will AMD be updating the 5700 XT to support 4K 120 Hz over HDMI, I wonder? I hope they will; I can't afford to buy a new card anytime soon.
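On the 4K 120 Hz over HDMI question, the limit is raw link bandwidth rather than something a driver update could add. A rough back-of-the-envelope estimate follows; the figures are my own assumptions (8-bit RGB, roughly 7% reduced-blanking overhead, commonly cited usable link rates), and real modes use standardised timings, while 10-bit HDR would need chroma subsampling or DSC compression:

```python
# Rough, uncompressed bandwidth estimate for a video mode (illustrative).
# Link data rates below are commonly cited usable figures after line coding.

def required_gbps(width, height, hz, bits_per_channel=8, blanking=1.07):
    """Approximate uncompressed bandwidth in Gbit/s for an RGB mode,
    assuming ~7% reduced-blanking overhead."""
    return width * height * hz * blanking * 3 * bits_per_channel / 1e9

LINKS = {
    "HDMI 2.0b (18 Gbps link, 8b/10b)": 14.4,
    "DisplayPort 1.4 (HBR3, 8b/10b)":   25.9,
    "HDMI 2.1 (48 Gbps FRL, 16b/18b)":  42.7,
}

need = required_gbps(3840, 2160, 120)
print(f"4K 120 Hz 8-bit RGB needs roughly {need:.1f} Gbit/s uncompressed")
for name, usable in LINKS.items():
    print(f"  {name}: ~{usable} Gbit/s usable ->",
          "fits" if need <= usable else "does not fit")
```

Which lines up with the product-page fine print: the card's DisplayPort 1.4 outputs can just about carry 4K 120 Hz, its HDMI 2.0b ports cannot, and no passive cable can turn one into the other.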

therg
Adept II

Re: When will AMD add HDMI Forum VRR support?

For many years I backed NVIDIA with my purchases, but due to HairWorks, the enforced GeForce Experience login, G-Sync, and no FreeSync support, I decided to purchase a Vega 56 and support team red. I just purchased an LG OLED B9, and it looks like I am going back to green. I wish we could just use two HDMI cables.

eccentric
Adept III

Re: When will AMD add HDMI Forum VRR support?

Hey therg, I tried replying by email but it didn't update the thread as far as I could see, so I'm pasting it all here.

You can cheaply buy a DisplayPort to HDMI 2.0b adapter, which will support 4K 60 Hz and HDR; I have one running from my PC to my audio receiver, and a regular HDMI 2.1 cable going straight to my TV. I have an RX 5700 XT, and it has DisplayPort 1.4, which supports 4K 120 Hz or 8K 60 Hz. My TV supports 4K 120 Hz, but sadly it has no DisplayPort input! You will probably need to buy an "active" DisplayPort to HDMI adapter, as the passive ones might not work. Either way, trying both is cheaper than a new GPU, for certain. But I have a Samsung QLED TV.

You bought the wrong panel type to use with a computer, really, as OLED isn't meant for static content and isn't meant for brightly lit, sunny rooms with windows and proper lighting, like a room you'd use a computer in. OLED is for dark home-theatre rooms and playing back Blu-ray discs, basically, in a nutshell. Samsung and a few other TV makers have quantum-dot panels, which are essentially the evolution of the computer LED/LCD panel: fewer screen layers for a brighter, cleaner image, and specially coated backlighting for whites closer to true white, not the yellowish or bluish whites that LEDs generally produce. Samsung TVs have always had low input lag, were among the first to support 4K 120 Hz, have PC input modes, and offered 200 Hz 1080p a couple of years ago. In other countries many TV brands besides Samsung also sell quantum-dot models, or just use a regular LED LCD panel on a cheaper model for PC use instead of OLED, because OLED suffers from burn-in (often called "ghosting", since an OLED panel is different tech from plasma) and also from colour bleed and the colour blue fading over time, never to return. Samsung panels have a 10-year guarantee against ghosting/burn-in and colour bleed. Nice of them.

I forgot to mention: OLEDs are good for dim rooms because the organic material can't be made very bright. By default they ship in energy-saver/eco mode, and if you crank the brightness or colour up from the factory settings they will quickly burn out and die. You've never seen a jellyfish or algae as bright as the sun, have you? Organic LED comes from jellyfish-style bioluminescence, so it can't be very bright. Sunlight is rated at about 10,000 nits of brightness; some Samsung TVs you can buy reach 4,000 nits, and most 4K Blu-ray discs are mastered at 1,000 nits. Most OLED TVs have struggled for years to reach 1,000 nits, while quantum-dot panels are basically as bright as the LED backlights placed behind them. So they're ideal for use in places called houses, which have "lights" and "windows" where illumination exists in some form or other.

If you want to get the absolute most out of what your OLED is capable of, its strongest features being deep blacks and better viewing angles, you should try viewing videos of black cats filmed at night while seated at a near right angle in a dark room. That is exactly what OLED TVs make possible, and why they were revolutionary and "the best" for viewing movies when they were invented years ago and cost fortunes. But they've long since been surpassed by newer technology that has all the benefits of OLED and zero of its awful organic drawbacks: MicroLED. It's the same idea but better; I'm not sure if it's called MLED or what, but it's MicroLED, and the first MicroLED panels I saw revealed at tech expos were from Samsung.

therg
Adept II

Re: When will AMD add HDMI Forum VRR support?

Wow, you really typed a wall. OLED is the correct TV; if you think any other BS LCD marketing comes close, you are wrong. Burn-in won't be a problem, as I won't show desktop icons, I'll set the desktop background to black, put the start menu/taskbar on auto-hide, and set a screen saver/turn the display off after a few minutes of inactivity, etc. OLED blows LCD out of the water when it comes to response times. You sound like you work for Samsung, really, lol.

hardcoregames_
Big Boss

Re: When will AMD add HDMI Forum VRR support?

eccentric wrote:

You can cheaply buy a DisplayPort to HDMI 2.0b adapter, which will support 4K 60 Hz and HDR [...]

I have a DisplayPort to HDMI cable intended for 1080p displays; the conversion logic is hidden in the connector.

My LG panel is so bright that I used the Windows calibration tool to configure it for my studio light levels.

The HDMI and DisplayPort standards have been updated to handle 8K panels properly.

eccentric
Adept III

Re: When will AMD add HDMI Forum VRR support?

And what will you do about video-game health bars or MMO action/skill icons? You get burn-in that's very visible and noticeable after just 8 to 11 hours of gameplay, as shown in Samsung TV commercials where an MMO gamer plays a continuous session that long to exaggeratedly illustrate the point, because who can game for 8 hours in a row, or leave their web browser open for 8 hours in a row, with an OLED TV, am I right?

What will you do if you watch TV and it has a channel/network logo in the corner, or the news with a scrolling banner, for a few hours?

What will you do if you use a "web browser", which often has the "address bar" or "tabs" on screen for extended periods?