
General Discussions

jasonamd
Adept II

When will AMD add HDMI Forum VRR support?

Now that there is an HDMI 2.1 OLED (LG C9/E9) available, I thought AMD would add their promised HDMI Forum VRR support like Xbox One X has.

Source: https://www.overclock3d.net/news/gpu_displays/hdmi_2_1_vrr_support_will_come_to_amd_radeon_rx_gpus_w...

39 Replies

Hey, any update on this? I'd love to use VRR on my new TV. I know HDMI 2.1 isn't necessary because the Xbox One X can do it.

You do need HDMI 2.1 to use VRR. The Xbox One X will be able to get HDMI 2.1 features via a firmware update so it can use VRR in the future, according to this article from last year: The Xbox One X is the first device to support HDMI 2.1 - NotebookCheck.net News

It seems that, at least in the Linux AMDGPU driver, it is included, as per this article:

But according to this Reddit thread, no AMD or Nvidia GPUs have come out with HDMI 2.1, which is the only type of connection to get VRR: https://www.reddit.com/r/Amd/comments/blr5v7/amd_vrr_support/

I don't know how accurate those comments are about needing HDMI 2.1 for VRR to be enabled.

jdrobinson314
Challenger

If the Linux driver got it, then there's a pretty good chance it's in the latest update for Windows.

VRR = FreeSync, but FreeSync != HDMI VRR.

Basically, VRR over HDMI is FreeSync to begin with, but as far as HDMI is concerned it needs to be addressed specifically, I guess.

If you have a VRR TV, then it might be worth a look in the driver panel for a FreeSync toggle.

jasonamd
Adept II

Yeah, just so we are on the same page: VRR includes both FreeSync and HDMI Forum VRR. HDMI Forum VRR is a standard feature of HDMI 2.1 devices, but it is an optional feature that can be added to HDMI 2.0 devices (as was done with the Xbox One X).

Only if the device had some HDMI 2.1 features incorporated when it was manufactured. The Xbox One X was built with some basic HDMI 2.1 electronics for future use.

Thus, with a firmware update, it is able to use those HDMI 2.1 features to enable VRR.
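
To picture the distinction being made in these posts, here is a rough sketch. The capability flags are assumptions pieced together from claims in this thread, not an official compatibility list; the point is only that FreeSync and HDMI Forum VRR are separate flavours under the "VRR" umbrella, and a device can carry one, both, or neither regardless of whether its port is HDMI 2.0 or 2.1.

```python
# Illustrative sketch only: assumed capabilities drawn from this thread, not a spec.

DEVICES = {
    "Xbox One X":         {"hdmi": "2.0", "freesync": True,  "hdmi_forum_vrr": True},   # VRR added via firmware
    "LG C9/E9":           {"hdmi": "2.1", "freesync": False, "hdmi_forum_vrr": True},
    "Pre-RX 6000 Radeon": {"hdmi": "2.0", "freesync": True,  "hdmi_forum_vrr": False},
}

def vrr_flavours(name):
    """List which variable-refresh technologies a device is assumed to expose."""
    caps = DEVICES[name]
    flavours = [("FreeSync", caps["freesync"]), ("HDMI Forum VRR", caps["hdmi_forum_vrr"])]
    return [tech for tech, present in flavours if present]

for name in DEVICES:
    print(f"{name} (HDMI {DEVICES[name]['hdmi']}): {vrr_flavours(name) or ['no VRR']}")
```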

DisplayPort has supported variable refresh rate displays, such as my 4K LG panel, just fine.

I have to use DisplayPort, as HDMI is bandwidth-starved in comparison.
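
A back-of-the-envelope bandwidth comparison makes that point concrete. These are nominal payload rates (HDMI 2.0 ~14.4 Gbit/s, DP 1.4 HBR3 ~25.92 Gbit/s, HDMI 2.1 FRL ~42.6 Gbit/s) with an assumed ~7% blanking overhead, so treat the output as rough estimates rather than spec quotes.

```python
# Rough estimate of uncompressed RGB/4:4:4 video bandwidth vs. link payload capacity.

LINKS_GBPS = {"HDMI 2.0": 14.4, "DisplayPort 1.4": 25.92, "HDMI 2.1": 42.6}

def video_gbps(width, height, hz, bits_per_channel=8, blanking_overhead=1.07):
    """Approximate uncompressed RGB/4:4:4 bandwidth in Gbit/s."""
    return width * height * hz * 3 * bits_per_channel * blanking_overhead / 1e9

for label, hz in (("4K60", 60), ("4K120", 120)):
    need = video_gbps(3840, 2160, hz)
    fits = [link for link, cap in LINKS_GBPS.items() if need <= cap]
    print(f"{label} 8-bit needs ~{need:.1f} Gbit/s -> fits on: {', '.join(fits)}")
```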

stella37
Journeyman III

AMD revealed that they are planning to add HDMI 2.1 VRR (variable refresh rate) support in an upcoming release of Radeon Software Adrenalin; it's the standard that will bring variable refresh rates to future televisions.

Win10 1903 VRR, make sure to read all Dev. blogs.

My PC- Ryzen 5 5600x, B550 aorus pro ac, Hyper 212 black, 2 x 16gb F4-3600c16dgtzn kit, NM790 2TB, Nitro+RX6900XT, RM850, Win.10 Pro., LC27G55T.

goodplay wrote:

Win10 1903 VRR, make sure to read all Dev. blogs.

That was in 1809 too, but most did not notice because they do not have a compatible monitor like my new 2019-model LG panel. My panel has FreeSync, and even Nvidia supports it now.

bigyundol
Adept I

After ten years, I want to buy a new TV too, preferably an LG OLED65C9 this time.
So it would be nice to know whether the new Radeon 5700 XT will support HDMI Forum VRR like the Xbox One X does.
Or whether AMD will give LG a kick to add FreeSync support soon, like Samsung has done since last year.

sprungnickel
Adept I

Well, Nvidia has come through with an update that enables "G-Sync Compatible" support for the LG C9/E9 OLED TVs. New firmware from LG for North American sets, released on 10/23, is downloadable today. There is a beta driver (440.52, via GeForce Experience) that enables VRR over HDMI on RTX 20-series and GTX 1660 cards: 40-60 Hz at 4K, and 40-120 Hz at 1440p.

Now, AMD: will your cards work with a "G-Sync Compatible" TV over HDMI? Previously, VRR over HDMI has worked with LG monitors on AMD GPUs, and DisplayPort-sourced AMD FreeSync works with "G-Sync Compatible" monitors. The question remains: will AMD enable FreeSync over HDMI to a "G-Sync Compatible" TV such as the LG OLED65C9PUA? The ball is in your court, AMD. I really fancy one of those RX 5700 XTs at 1440p 120 fps for my C9!

Yep, I ended up buying a Vega 56 so that I could get FreeSync out of my 2018 QLED TV. I'm interested in getting a C9 though, and I'd have to change my graphics card yet again to get variable refresh rate... Just absurd. HDMI 2.1 VRR is essentially just FreeSync. I actually reached out and asked this question of someone who works in the display division:

"I am glad you enjoy adaptive sync technologies like FreeSync, and we at AMD look forward to further improvements to FreeSync in the future. Alas I cannot speak to any of that here.

 

As to the differences between FreeSync and HDMI VRR, unfortunately I cannot go into details on either of them. But generally, FreeSync can be thought of as a system-level specification that covers many GPU and display-device aspects of the whole system, while HDMI VRR is more focused on the cable-level protocol for transporting adaptive sync video frames.

 

While this may not answer your question, we will work within AMD to provide more clarity in this area in the future. You are not the first to ask for this."

Anonymous
Not applicable

It was my understanding that FreeSync is just a hardware frame buffer that does what vsync is supposed to do, slightly better, and usually lets you disable vsync. I think VRR just means that, rather than buffering and so on, the monitor simply shows the frames it's been given; so instead of always outputting a constant 60 Hz, it can adjust its refresh to better match the frame rate being fed into it, which helps eliminate lag spikes. Basically FreeSync and VRR do the same thing in different ways, and the only real thing you'll notice about them is that they lower input lag to the display. Using Enhanced Sync can give an almost FreeSync-like experience on any monitor, assuming you keep the FPS considerably higher than your display panel's maximum refresh rate. According to rtings.com testing, TVs with an Xbox One X and VRR enabled dropped input lag at 1080p from around 10 ms to 6 ms or so. FreeSync can also greatly lower input lag, so having FreeSync means you basically don't require VRR, maybe? I'm not too sure. I do wonder if they can both be used at once; I believe VRR is literally built upon FreeSync's free and open standard.

I also have a 2018 Samsung QLED (really a 2017 model, but it was sold all year in 2018), and mid-2018 it got a 4K 120 Hz update. I've since bought a bunch of HDMI 2.1 cables. I saw claims of 4K @ 120 Hz on 5700 XT product pages and rushed out and bought one, only to read that it's only over DisplayPort 1.4; it still has the older HDMI 2.0b ports. My Samsung TV, being a TV, only has HDMI inputs and no DisplayPort 1.4 input that I can see, and I sure as hell can't find a DP 1.4 to HDMI 2.1 cable or whatever, as I'm pretty sure they don't exist. Will AMD be updating their 5700 XTs to support 4K 120 Hz over HDMI, I wonder? I hope they will; I can't afford to buy a new card anytime soon.
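
A minimal sketch of the idea described above, contrasting a fixed 60 Hz scanout with a display that refreshes as soon as a frame is ready. The frame-completion times and the 48-120 Hz window are invented illustration values, not measurements of any particular panel or driver behaviour.

```python
# Illustration only, not driver code: when does each frame actually reach the screen?

FIXED_HZ = 60.0
VRR_MIN_HZ, VRR_MAX_HZ = 48.0, 120.0  # assumed panel range; behaviour below the floor
                                      # (frame doubling / LFC) is not modelled here

def present_fixed(render_done_ms):
    """The frame waits for the next fixed 60 Hz scanout tick."""
    period = 1000.0 / FIXED_HZ
    return (int(render_done_ms // period) + 1) * period

def present_vrr(render_done_ms, prev_scanout_ms):
    """The display starts a refresh as soon as the frame is ready,
    no sooner than the panel's maximum refresh rate allows."""
    earliest = prev_scanout_ms + 1000.0 / VRR_MAX_HZ
    return max(render_done_ms, earliest)

frame_done = [13.0, 31.0, 45.0, 70.0]  # ms at which the GPU finishes each frame
prev = 0.0
for t in frame_done:
    fixed, vrr = present_fixed(t), present_vrr(t, prev)
    print(f"frame ready {t:5.1f} ms -> fixed 60 Hz shows it at {fixed:5.1f} ms, VRR at {vrr:5.1f} ms")
    prev = vrr
```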

therg
Adept II

For many years I backed Nvidia with my purchases, but due to HairWorks, the GeForce Experience enforced login, G-Sync, and no FreeSync support, I decided to purchase a Vega 56 and support team red. I just purchased an LG OLED B9, and it looks like I am going back to green. I wish we could just use 2x HDMI cables.

Anonymous
Not applicable

Hey Therg, I tried replying in my email but it didn't update the thread that I could see so I'm pasting it all here.

You can cheaply buy a DisplayPort to HDMI 2.0b adapter, which will support 4K 60 Hz and HDR. I have one on my PC going to my audio receiver, and a regular HDMI 2.1 cable straight to my TV. I have an RX 5700 XT, and its DisplayPort 1.4 supports 4K 120 Hz or 8K 60 Hz; my TV supports 4K 120 Hz, but sadly it has no DisplayPort input! You will probably need to buy an "active" DisplayPort to HDMI adapter, as the passive ones might not work. Either way, trying both is cheaper than a new GPU, for certain. But I have a Samsung QLED TV.

You bought the wrong TV panel technology to use with a computer, really, as OLED isn't meant for static content and isn't meant for brightly lit, sunny rooms with windows and proper lighting like a room you'd use a computer in. OLEDs are for dark cinema rooms and playing back Blu-ray discs, basically, in a nutshell. Samsung and a few other TV makers have quantum-dot panels, which are literally the evolution of the computer LED/LCD panel: they use fewer screen layers for a brighter, cleaner image, and have specially coated backlighting for closer-to-proper whites, not the yellowish or blueish whites that LEDs produce in general. Samsung TVs have always had lower input lag, were first to support 4K 120 Hz and other things as far as I could tell, have PC input modes, and were doing 200 Hz at 1080p a couple of years ago. In other countries, many TV brands besides Samsung also offer quantum-dot models, or just use a regular LED/LCD panel on a cheaper model for PC use instead of OLED, because OLED suffers from burn-in (though it's called "ghosting", as an OLED panel is different tech to plasma), and OLED also suffers from colour bleed and the colour blue fading away over time, never to return. Samsung panels have a 10-year guarantee against ghosting/burn-in and colour bleed. Nice of them.

I forgot to mention: OLEDs are good for dim rooms because the organic material can't be made very bright. By default it's set to an energy-saver/eco mode, and if you crank the brightness or colour up from the factory settings it will quickly burn out and die. You've never seen a jellyfish or algae as bright as the sun, have you? Organic LED is made from jellyfish-style bioluminescent technology, so it can't be very bright. The sun/daylight rating is 10,000 nits of brightness; some Samsung TVs you can buy are 4,000 nits, and most 4K Blu-ray discs are mastered at 1,000 nits. Most OLED TVs have been struggling to reach 1,000 nits for years, while quantum-dot panels are literally as bright as the LED backlights placed behind them. So they're ideal for use in places called houses, which have "lights" and "windows" where illumination exists in some form or other.

If you want to get the absolute most out of what your OLED is capable of, and its strongest features of deep blacks and better viewing angles, you should try viewing videos of black cats filmed at night while seated at a near right angle to the screen in a dark room. That is exactly what OLED TVs make possible, and why they were revolutionary and "the best" for viewing movies when they were invented years ago and cost fortunes. But they've long since been superseded by newer technology that has all the benefits of OLED and "zero" of its awful organic drawbacks: that's right, MicroLED. It's the same but better; not sure if it's called MLED or what, but it's MicroLED, and the first MicroLED panels I saw revealed at tech expos in the news were from Samsung.

eccentric wrote:

Hey Therg, I tried replying in my email but it didn't update the thread that I could see so I'm pasting it all here. [...]

I have a DisplayPort to HDMI cable intended for 1080p displays; the logic is hidden in the connector.

My LG panel is so bright that I used the Windows calibration to configure it for studio light levels.

HDMI and DisplayPort standards have been updated to handle 8K panels properly.

therg
Adept II

Wow, you really typed a wall. OLED is the correct TV; if you think any other BS LCD marketing comes close, you are wrong. Burn-in won't be a problem, as I won't run desktop icons, I'll set the desktop background to black, put the start menu/taskbar on auto-hide, set a screen saver/turn off after a few minutes of inactivity, etc. OLED blows LCD OUT OF THE WATER when it comes to refresh times. You sound like you work for Samsung, really, lol.

Anonymous
Not applicable

And what will you do about video game health bars, or MMO action/skill icons? You get burn-in that's very visible and noticeable after just 8 to 11 hours of gameplay, as shown in Samsung TV commercials with an MMO gamer playing for that long in a continuous session to exaggeratedly illustrate the point. Because who the hell can game for 8 hours in a row, or leave their web browser open for 8 hours in a row, with an OLED TV, am I right?

What will you do if you watch TV and it has a channel/network logo in the corner, or the news with a scrolling banner for a few hours?

What will you do if you use a "web browser" that often has the "address bar" or "tabs" on screen for extended periods?

Anonymous
Not applicable

Also, I forgot to mention that Samsung TV panels reproduce more of the visible light spectrum, so they give far more lifelike images: they have a wider colour gamut (more of the Rec. 2020 colour space reproduced) and around four times the brightness, so they're closer to representing daylight as we see it, while OLED is just a little better at representing night scenes with low illumination. Since shadows tend to be lit by sunlight and are rarely jet black, they're usually displayed just fine on other panel tech. The blacks on a quantum-dot panel are still 0.000-something percent; it's just that OLED panels have far more zeros on the black level, which isn't commonly seen in movies unless you have a movie about Vantablack, literally the blackest pigment we can paint things with in existence. For TV and movies, the camera operator and director carefully select the lighting and angles to illuminate the subject they're filming, or they shoot in daytime or in brightly lit buildings like malls, offices and coffee shops. It's very rare that you'll see a video game where you explore in total darkness, even in dark caves, because people spent years creating game engines that handle global illumination and shadows well, and honestly, if there's no light in a dark cave in a video game you can't see where you're going; it's not like you can feel your way around the way a real dark cave is navigated by touch. So video games always have some sort of artificial illumination so we can see what's being displayed, entirely avoiding pitch blackness in caves.

Actually, speaking of realistic lighting in dark caves, Nvidia's showcasing of RTX lighting and shadows in a game like Metro was absolutely hilarious. You see, RTX cards can't ray trace in real time very well at all; in movies they usually have millions of rays and high numbers of bounces, but in their hyped-up RTX title Metro Exodus there is essentially a single ray-traced light source in the entire game: the sun. And it barely has any rays or bounces, though it's adjustable. It's a joke, because in a game where you fight mutant monsters underground in dark caves all day, literally the only time you are seeing RTX is when you're seeing light from the sun. It's not that they didn't want every light in the game to be ray traced; the cards just can't physically do that and keep any sort of playable FPS. I laughed so hard when heaps of RTX demos showed off a game about dark tunnels and caves as a "lighting technology demo". You could argue it may be doing some better reflections or something, but all of that is possible on AMD hardware just the same, without the fancy billions-of-dollars investor-fraud gimmick, as visibly demonstrated by the CryEngine Neon Noir ray-tracing demo.

Also, about your mention of OLED "refresh times" blowing Samsung out of the water: Samsung TVs have the lowest input latency. Many TVs and panels claim something like 800 Hz, but the processing CPU and the HDMI or DisplayPort cabling bandwidth can only handle 60 or 120 Hz, and their image-processing chips can only do motion smoothing at maybe 200 Hz. So currently those "refresh times" don't matter at all, because there's no hardware on earth capable of using them. Yes, an OLED can be tested in a lab to have very high refresh rates, but graphics cards and other computer chips can't capture or output at those rates, because even RAM timings and CPU cycles are only so fast inside machines.


Unless you are going to talk about FreeSync/VRR support for an AMD Vega 56 on the LG B9, stop talking, as it is not relevant to me. I have a B9 on order, and a Vega 56. I don't watch news, so I won't have burn-in issues; LG moves the pixels around, and even on my LCD panel I already don't keep a web browser in a static position. You are wasting your time, Samsung employee. Did you forget to mention the LG OLED's 0.2 ms grey-to-grey response time? And 7 ms input lag?

Anonymous
Not applicable

Please look up both the best Samsung display and the best OLED display on RTINGS.com and compare, as they have some of the most detailed TV panel testing and values I've encountered, unless you know of a better display review site you could recommend?

Anonymous
Not applicable

Thought I should link you to the commercials in question, which actually use LG panels, so yeah.

Samsung - QLED vs OLED - YouTube. I saw this on the Samsung Australia YouTube channel a couple of years back, in English. I have no idea why I only see it on Samsung Brasil or Samsung Chile now; where did all their older commercials disappear to?

Here is a burn-in checker for your TV.

TV burn-in checker l Samsung - YouTube. The first advert is for the previous models of Samsung and LG TVs, but this latest commercial about the OLED burn-in checker was posted very recently, in 2019. Not to mention input lag and other factors; Samsung has always been top of the pack for latency and such. So yeah, I wish people bothered to read up on the available display technologies and chose the right one for their use case, instead of hearing about the best home-cinema displays and trying to use them for PC and gaming. You can hardly find any OLED monitors for sale, and there's a very good reason for that: they're overpriced and they burn in terribly, ESPECIALLY if you disable the power-saving/eco-mode settings.

You're on drugs if you think a commercial means anything at all. I wish people actually knew what tech they are talking about and picked the right OLED TV, as it's the best tech available. 0.2 ms grey-to-grey: do you even read your own reviews? No LCD in the world is that fast, and input lag on the 2019 LG OLED is lower than 7 ms. My panel cost me 1700 AUD; how much is your beloved QLED TV? Notice how they use a Q to look like an O, even though the tech is nowhere near as good. Samsung stopped making OLED TVs because LG had a far cheaper manufacturing process and smashed them on price.

Anonymous
Not applicable

I'm not certain, but I believe Samsung makes better-quality 8K panels than their 4K panels. Also, this year's QLED 60R, 70R, 80R and 90R aren't anywhere close to as good as the previous 2017 and 2018 models, the Q7FN and Q9FN, in many aspects, especially colour gamut width and most specs across the board. So you may want to look into something like the Q900R rather than the 90R, at least to compare, or the previous Q9FN; they've tried to drastically cut cost, like customers wanted, but it could have led to lower quality. I'm almost certain that if you go up in price a lot, Samsung has better TVs, which I've heard are MicroLED; I've even heard rumours of one that may be for sale by now, costing around a million dollars, called "The Window" or something, as opposed to their famous "The Wall" series.

However, the "colour calibration" guy does admit that native mode looks better for many colours, but I believe it also depends on the source material. If you use YCbCr 4:2:0 or 4:4:4 input material, the Samsung TV looks amazingly better, as YCbCr is the colour space of all film and photography and the format all Blu-ray discs are stored in, and since these are HDR TVs you'd want to be playing back HDR content. However, running my PC desktop in YCbCr makes deep magentas and navy colours a touch saturated and the greens more vibrant, as it's converting from PC RGB to the proper TV format. The guy is probably testing while hooked up to a PC or Blu-ray player set to the RGB colour space; I'm not sure what the proper input source is for professional calibration, but I'd be using TV-format YCbCr 4:4:4 or 4:2:0 and playing back some high-quality 10-bit HDR source like Samsung: Travel With My Pet HDR UHD 4K Demo | 4K Media or Sony: Whale in Tonga HDR UHD 4K Demo | 4K Media. Interestingly, I previously owned an Nvidia GTX 1060, and changing to YCbCr 4:4:4 from full RGB made little to no difference; enabling 10-bit colour on the Nvidia card looked lacklustre and less impressive colour-wise compared with AMD's 10-bit colour on the same Samsung TV. There was a very noticeable difference in image quality, especially when enabling GPU scaling on the AMD card, whereas on the GTX 1060 turning GPU scaling on made it look slightly worse or pretty much the same.

However, believe it or not, using a recent AMD graphics card and changing to YCbCr 4:4:4 makes 8K high-quality photographic desktop wallpapers of landscapes and such become stunningly lifelike and beautiful. Conversion to a different colour space may make some things slightly skewed, as he mentions, but it's still far more lifelike overall, as it's the colour space the content was intended for. Set the TV's input type to "game console" for colourful PC desktop use with the image-enhancement options, or to "Blu-ray player" for high-quality output but increased latency and an assumed 24 fps playback. You can then play those files back using an excellent media player like Daum PotPlayer with the HEVC extensions decoder from the Windows 10 app store, and, depending on what you are playing back (maybe Netflix in the Windows 10 Netflix app, for higher-quality 4K, HDR and Atmos support), you could go into AMD's display settings and, under colour, nudge the saturation slider to better fill out people's flesh tones and make the image more vibrant and punchy.

All of this is using Windows 10 in SDR mode with the TV's native colour-space mode. Even gaming in titles that support HDR10 actually looks less good than using the native colour space with the game console or Blu-ray input mode, adjusting all the image-quality settings for best appearance, and then possibly bumping the saturation slider by around 20%. Some Netflix movies are terribly washed out, though, and may need heavier saturation to achieve vibrant, HDR-movie-like images. I'm not trained in calibrating TVs, but with a Samsung TV you can crank the colour from the default 25/50 all the way up to 50 and saturate the heck out of it if needed, which isn't really necessary, but it's nice that you can. Most calibrators would recommend a backlight setting of 6 or maybe 8 for Samsung TVs and using "movie mode", because they think it matches movies better or some such, but I just set the TV to "natural" mode with a backlight of around 16 to 30; I've often got it set to 25. Maybe some accuracy vanishes and light bleed becomes stronger, but the image becomes far more lifelike. You MUST set the colour space to Auto in HDR mode instead of Native, though, or things look washed out. I believed colour space should be set to Auto to be as the director intended, or is that not how HDR works?

TLDR: I'm 90% certain that TV calibrators aim for image-test-pattern reproduction values, not "how lifelike they can make it look". They fiddle with the numbers in the TV's settings until their calibration software shows the best figures, but then the TV ends up set to a fraction of its default values, which isn't necessarily reflective of what it can look like when adjusted to how you want it, rather than worrying about moiré, uniformity and saturation levels being "as close as possible to a film-reel movie recorded in the 1970s".
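
For reference, the RGB-to-YCbCr conversion being debated above is just a weighted-sum formula. A sketch using the standard BT.709 coefficients is below; which exact matrix and range a given GPU or TV applies is an assumption here, and nothing in this snippet is AMD-specific.

```python
# Rough sketch: full-range RGB -> limited-range YCbCr with BT.709 luma weights,
# the kind of conversion involved when switching output from "RGB Full" to "YCbCr 4:4:4".

def rgb_to_ycbcr_bt709(r, g, b):
    """r, g, b in 0..255 (full range) -> (Y, Cb, Cr) in video levels (16..235 / 16..240)."""
    rn, gn, bn = r / 255.0, g / 255.0, b / 255.0
    y  = 0.2126 * rn + 0.7152 * gn + 0.0722 * bn          # luma
    cb = (bn - y) / 1.8556                                 # blue-difference chroma
    cr = (rn - y) / 1.5748                                 # red-difference chroma
    return (16 + 219 * y, 128 + 224 * cb, 128 + 224 * cr)  # scale to video levels

print(rgb_to_ycbcr_bt709(255, 255, 255))  # white -> (235.0, 128.0, 128.0)
print(rgb_to_ycbcr_bt709(255, 0, 0))      # red   -> approx (62.6, 102.3, 240.0)
```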

Anonymous
Not applicable

The LG C9 does support FreeSync, unless it's not HDMI 2.1 compatible, and the Vega 56 supports FreeSync/VRR; even the RX 580 I previously owned supports FreeSync Ultimate with my 2017 Samsung TV.

If you hooked your Vega 56 DIRECTLY into the LG C9 TV via HDMI or DisplayPort and your AMD Adrenalin software still said FreeSync is not supported, it's because you didn't navigate your TV's menu options until you found the game mode setting, enabled game mode, and then enabled FreeSync. You MUST enable game mode and FreeSync on the display to use FreeSync at all. On my TV it's found under "external devices" or something like that in the menu options, not in the picture settings or anywhere like that. Then you must turn it on in the AMD Adrenalin software.

After that, you should add a game profile in Adrenalin for each game you wish to play and set its FreeSync setting to On. Rather than trying to cap FPS, use AMD's Chill feature: when gaming at 1080p and getting 200+ fps, you can set your Chill range to maybe 70 min / 144 max, and your GPU will save power, your fan noise will drop, and your input latency can be reduced by up to 64%. It's maybe the best way to keep frame rates within your display's FreeSync range without setting a hard cap at the display's maximum refresh rate, since capping FPS with FRTC or in-game engine limits can cause stuttering if there are any conflicts. Enable Enhanced Sync; it removes the display cap limit. Just ensure your FPS exceeds the display's refresh rate considerably, so if you are gaming at 120 Hz you want an average of 160 fps or better, so it never drops below the monitor's maximum refresh rate. When you use Chill with a minimum target of 70 (or whatever floats your boat), it only drops lower when you aren't moving the mouse or pressing keys, with a maximum of 120, I guess; it should then greatly reduce latency, save power, run quieter, and give a much better gaming experience.
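
A small sketch of the rule of thumb above, written as a hypothetical helper (not part of AMD's Adrenalin software or any real API) that checks proposed Chill limits against an assumed FreeSync range.

```python
# Hypothetical sanity check for a Chill min/max against an assumed FreeSync window.

def check_chill_settings(chill_min, chill_max, freesync_min_hz, freesync_max_hz):
    """Return a list of warnings for a proposed Chill configuration."""
    warnings = []
    if chill_min > chill_max:
        warnings.append("Chill min must not exceed Chill max.")
    if chill_min < freesync_min_hz:
        warnings.append(f"Chill min {chill_min} is below the FreeSync floor "
                        f"({freesync_min_hz} Hz); frame doubling/LFC would have to kick in.")
    if chill_max > freesync_max_hz:
        warnings.append(f"Chill max {chill_max} exceeds the panel's {freesync_max_hz} Hz; "
                        f"frames above that fall outside the VRR window.")
    return warnings

# Example: the 70/144 suggestion from the post, against an assumed 48-120 Hz panel.
for warning in check_chill_settings(70, 144, 48, 120):
    print(warning)
```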

FreeSync and HDMI 2.1 VRR are different, mate.

Anonymous
Not applicable

FreeSync didn't exist in TVs until VRR was implemented in them, as far as I can see, because FreeSync was implemented in the HDMI 2.1 standard, and the HDMI 2.1 standard doesn't list FreeSync separately; it's part of the VRR standard to lower latency, and the variable refresh rate of the display is just there to better match the low-latency output of FreeSync. With FreeSync-supported game titles and FreeSync displays you can sometimes disable vsync, ideally when your GPU is outputting within the target FPS range of the FreeSync display. You may feel that limiting your FPS to targets below the monitor's maximum refresh rate would reduce responsiveness, but it actually improves it while saving power. So you see, variable refresh rate is just a better way of presenting FreeSync; the true magic behind VRR and how it works is FreeSync.

Anonymous
Not applicable

It's great that LG and other TV makers, though only two years or so behind Samsung, have caught up in terms of 7 ms input lag, 4K 120 Hz, and VRR, which includes FreeSync support. It's also great that their extremely cheap-to-produce OLED panels are finally coming down in price. You see, LG OLED panels aren't as amazingly good as you think; they're just dirt cheap, and here's the proof. Companies like B.O.E. (see: Sony) produce their own OLED panels, but LG's dirt-cheap manufacturing method and cost cutting meant that almost every big-brand TV manufacturer in the world doesn't bother making their own panels, because they can buy LG OLED panels in bulk much cheaper than their own factories can churn them out. Even if they did make better-quality OLED displays, the cost would be much higher, closer to what they cost when first hitting the market, and OLEDs can burn out in just a couple of years, with image quality degrading as the colour blue fades and colour bleed develops over time.

So instead everyone uses the cheaper LG panels, in an effort to turn OLED displays into more of a disposable product, since very few people were willing to pay hundreds to have their OLED panels professionally recalibrated every 6 to 12 months to squeeze more brightness and image quality out of a $20,000+ display as its organic materials were used up, the way the manufacturer intended and kept insisting on and the manuals said to do. People instead went "hmm, it looks a bit dimmer, or the colour looks a little off, I'll just crank it up a notch", and bam, their TV suddenly stopped working within a few weeks. Now every OLED panel has something like 50 different burn-in protection settings you can enable, every single one of which takes away from the brightness or the image quality. And people buy an expensive OLED panel with maybe 900 nits, or one that has finally reached the 1,000-nit milestone, and then disable everything, get rid of their desktop wallpapers, set everything to black, do whatever they can to NOT use their display as they normally would, and then cut the brightness and vibrancy from stunning and beautiful all the way down to 200 nits so the panel might last a few years and the expense can be justified. Why even bother buying a good OLED panel if you will run it at a fifth of what it's capable of and not use your game consoles and PCs as you normally would? You can see what I'm talking about in this video here:

One Year Update - OLED Burn-in Test – RTINGS.com - YouTube 

If you actually listen to what's coming out of the guy's mouth, he says "set to 200 nits". Heaps of mobile phones have 500-700-nit displays nowadays, and especially if you have a Samsung AMOLED display, you're better off watching everything on your phone than on a 900-1,000-nit OLED that cost thousands but is only being run at 200 nits, with all the settings turned off and super-eco low-power mode enabled; it's so dim that if there's a window with daylight nearby you won't be able to make out what's on the TV at all. It will be like trying to use an older smartphone's TFT/LCD display in direct sunlight: they had brightnesses of maybe 300-400 nits and you couldn't really read anything off the screen because it all seemed too dim. Samsung used to make OLED panels themselves and still makes some great ones; they've used Active Matrix OLED panels in all their flagship phones since around the Galaxy S3. LG couldn't afford to, or figure out how to, make an OLED mobile-phone panel until the Pixel 3, I believe, which is pretty recent and the first phone they ever made with an OLED panel; the Pixel 3 had many display issues, rapidly developed ghosting within a few hours of use, and had problems with image quality and uniformity. A few months after the Pixel 3's release, LG engineers managed to tweak and fine-tune that same mobile OLED panel to help minimise its terrible ghosting and image-quality issues, and ended up using it in the LG V30+. So the company that's supposed to be the world's best at OLED struggled for close to a decade to do what Samsung had been doing, and then couldn't get anywhere near the quality of the Samsung S8+ display in the slightest. I've read that millions of OLED-display phones were actually being sold back in 2003, but LG couldn't get it quite right around 2018, the first time they even attempted it, and they did so with P-OLED (a plastic screen instead of a glass screen, because yes, it saves costs). Users complain about the quality of LGD's pOLED used in the Google's Pixel 3 | OLED-Info

There are actually a bunch of other sites that complained about the Pixel 3's panel quality and image retention. Many assumed the issue was a bad batch or review samples, or that it could be fixed via software updates, only to later learn that no, that's just the quality of LG's OLED panels. But people still went ahead and used them in the Pixel 3 anyway. Why? Because they're cheap, cheap, cheap.

I see it as people buying drag-racing cars because they heard they're the fastest, then trying to drive to the supermarket with them and lying to themselves: "it's OK as long as I bulldoze all the houses between me and the grocery store so my usual driving route better suits my vehicle's design, and I put a massive limiter on the engine so it's slower than even the slowest cheap car, because you don't want the engine wearing out or the tyres wearing down in just a few weeks." I don't care what a great price you got on specialised hardware that is usually very expensive, or how happy you are to drive it slower than a dirt-cheap small car from the 90s, because you're using "the fastest vehicle at the slowest possible settings for uses it wasn't built for". Everybody's happy, but I'm just confused as heck. I don't even know why VRR/FreeSync was allowed in OLED displays, as gaming and desktop use are literally not what OLED displays are intended for, and to even attempt it means running them at lower output than some very cheap panels on the market; forget Samsung, a Hisense or TCL probably craps all over an OLED configured to a fifth of its true ability. The only reason it's in there is that it's part of the HDMI 2.1 standard, so they can't avoid including it, is my guess.

Many people think that companies like Samsung, with their QLED, are trying to make it sound like OLED. "Quantum dot" sounds nothing like the word "organic" to me, and quantum dots predate organic LED displays by decades; they've existed since the 70s in laboratories, though the original technology was named something else, and "quantum dot" is what it's called now that it has been repurposed as a screen technology. It's just a nano-coating applied so that light from LEDs shines through at specific wavelengths to produce accurate colours every time; since the coating isn't organic, it doesn't dry up or wear out. I never said OLED panels weren't good: they're great for home-cinema use, the best in a dark room with lots of people crowding in front of the display and needing to sit off to the side yet still see the full picture clearly. The excellent blacks of OLED displays make everything appear richer and more contrasty, they do have higher refresh rates, and they are very good at being a TV, which is why review sites love them and say how great they are. But those people are absolutely not talking about computer use; they mean playing back 4K discs and high-quality video content. For other content, or "computer graphics" that use saturated colours, you really should use a computer display. A quantum-dot display is literally a big monitor that's brighter and clearer with more accurate colours; that's all it is at the end of the day. So I just have to suggest, or rather insist, that people use a computer monitor for a computer and graphics-card use case, and I can't figure out why people can't understand, or even bother to learn, what the differences are.

Just because a drag-racing car is "fastest and best" doesn't mean everyone on earth should own one and use it for everyday driving. In the same way, an OLED panel being "best, for certain use cases" doesn't mean every TV maker on earth and every home user must have one. Unless you are going to buy both a quantum-dot panel and an OLED panel, one for movies and one for PC use; then knock yourself out, go ahead, fine by me. Samsung is just one of the companies that make quantum-dot panels; there are many brands, such as Vizio and Hisense, and oh, who cares, it's not like people ever use a TV with a computer, right?


I don't think anyone will read your walls of rambling, sorry.

renderman
Adept I

I bought a Radeon RX 5700 XT believing I would be able to use it with an LG OLED TV using FreeSync/VRR; it appears, however, that this is not possible. AMD has been notably quiet about this. Since AMD helped develop the VRR standard and has already implemented it in their chips for the Xbox consoles, AMD should have all the knowledge they need to implement it on their graphics cards as well.

AMD, and RTG in particular: please give us an update on the status of this. Is it even still being worked on?

Anonymous
Not applicable

Hahah, your post is obsolete, bro; AMD has supported VRR for a couple of years now.

My 2017-market-release Samsung Q7FN QLED TV received an update in early 2018 that made it 4K 120 Hz, added VRR support, and worked with the Xbox One X. If you look at the AMD control panel I've had for my previous RX 580 8 GB card and my new 5700 XT 8 GB card, you will see it says Variable Refresh Rate / FreeSync enabled!

https://i.imgur.com/plE6uoe.jpg 

When you mouse over the question-mark tooltip under FreeSync, next to where it says "Variable Refresh Rate", it says "Provides smooth responsive gameplay by updating the display as frames become available - requires freesync compatible display". FreeSync and VRR are really the same thing, just lower latency, but VRR might be a touch better, as you can probably disable vsync in most or hopefully all cases.

You do know you've got to grab your TV's remote control, press the menu button, find game mode and turn it on, and you probably need an HDMI 2.1-capable cable; then, in game mode, look for FreeSync Ultimate and turn it on. I found mine under the external-devices section of my TV's menu options. They also have the black-levels and "UHD Color" options in there, which are SUPER SUPER IMPORTANT, so yeah, uh, maybe learn how to use a TV?

I'm pretty certain that variable refresh rate has been working perfectly well for me, as all it does is lower latency and smooth out gaming the way FreeSync does. Walking around in games like Doom under Vulkan, I get some pretty good low latency at 1440p 120 Hz (my 5700 XT only outputs 1440p 120 Hz from its HDMI 2.0b port, though it does 4K 120 Hz via DisplayPort; I don't have any DisplayPort inputs on my TV, I'm not sure whether the GPU is DP++, and I couldn't find a passive cable to order to try it out). Anyway, check out my low-latency 1440p 120 Hz Doom (2016) in Vulkan, with the performance-metrics overlay set to high. Not sure if my spamming the screenshot-capture button on Steam slowed things down, though. I went into some full-on demon-filled areas with large open spaces to strain the GPU, with ultra settings all enabled but AA, depth of field and motion blur disabled, anisotropic filtering increased to 16x, FreeSync on, surface format optimization on, texture quality set to high, everything else including vsync disabled in the Adrenalin software, tessellation forced off globally, and anisotropic filtering set to off globally.

https://i.imgur.com/dvG4Jj9.jpg 

https://i.imgur.com/xfepkuO.jpg 

Me spamming the screenshot button made big spikes in frame time and increased latency, as did exploding groups of 3 to 5 monsters at once, and the FPS dipped considerably as I spammed screenshots. Each time I pressed F12 for a Steam screenshot, the latency shot up to around 200 ms and went red.

https://i.imgur.com/6qMVTSG.jpg 

https://i.imgur.com/1NRrazw.jpg 

https://i.imgur.com/UIlcUwd.jpg 

All those other monitors showing off their specs are quoting 1080p figures, not 1440p. My TV games at 4K 120 Hz with VRR and should be around 8 to 10 ms of latency, and I'm pretty sure that's in game mode with colour and image enhancements. Not too bad. My TV's input source is currently set to "game console", because the "PC" setting doesn't let me adjust colour, the contrast enhancer, or any of the nice settings my TV offers to make things look better. It makes things look amazingly good, and it still supports FreeSync mode, as I think it's meant for the Xbox One X. I've been gaming with VRR for ages without paying much attention, maybe for the last year or two. It's all in the display latency: other TVs have something like 16 ms of latency, and at 1440p or 4K the latency gets really high for all GPUs and TVs, but AMD is best and Samsung TVs are some of the best for low latency and gaming. Importantly, Samsung QLED TVs have a 10-year guarantee against burn-in and colour bleed, because they aren't organic matter like OLED, which sort of dries up or dies off; it's just some super-bright LEDs coated to produce an exact colour, like putting a coloured lens over an LED light, so you can't really get burn-in on a quantum-dot display the way you used to on plasma. Plasma is superheated gas; guess what burns in on plasma? That's right, it's actually burning. Well, OLED is like a jellyfish glowing: try to make a jellyfish glow as bright as the sun and it's not going to live very long at all. That's why OLED panels aren't good for gaming/PC use; they're meant to play back Blu-ray discs in dark rooms with rows of seating. QLED TVs are gaming powerhouses.

Anyway, enjoy your VRR. Aren't you glad you bought AMD? It's why Vulkan is so awesome. Borderlands 3 under DirectX 12 gets me a few ms higher average CPU and GPU frame time when I enable its performance metrics, and terrible FPS, like 108 fps, if I run the benchmark with a mix of ultra, high and medium settings, which I found costs only 2-4 fps compared with all-medium settings but looks worlds better. You could try enabling the performance-metrics overlay in your games, or in the Adrenalin settings, and configure everything so you get lower latency and VRR, and your games will be great.


Nope, what you are using on that Samsung TV is FreeSync. There are many TVs that support HDMI Forum VRR but not FreeSync. If AMD would just support HDMI Forum VRR, we could use that with our TVs. It seems like you are confusing the two.

Anonymous
Not applicable

Then why did AMD put out a press release about all their 480s and 580s supporting VRR around December, maybe two years ago? And why does it say VARIABLE REFRESH RATE under the word FreeSync? And why does the question-mark tooltip mention variable refresh rate and say that it is only possible with a FreeSync compatible display? Also, what about my latency with vsync off? That is not possible without VRR.

Also, it's a fact that my TV supports VRR; it has supported 4K 120 Hz via HDMI since early 2018. Check with Samsung.

It's the Q7FN QLED, on sale from mid-2017 through 2018. Or you could check the TV review website rtings.com, which reviewed my TV's VRR functionality and latency in a full, comprehensive review, though maybe they didn't test it in game-console mode or game mode.


You are confusing two different things here.

FreeSync is a variable refresh rate (VRR) technology, like G-Sync or Adaptive-Sync. But in this case we are talking about "HDMI Forum VRR" (also just called VRR for short), which is NOT FreeSync, Adaptive-Sync or G-Sync.

AMD promised VRR support two years ago, and for a long time nothing happened. Now the RX 6000 series supports HDMI Forum VRR. It actually works over HDMI 2.0, but not over HDMI 2.1 as promised.

So your Samsung TV may support HDMI Forum VRR (and FreeSync in addition), but AMD GPUs older than the RX 6000 series do not.

 

 


I got an RX 6400, and VRR doesn't work over either HDMI or DisplayPort-to-HDMI; neither does 10-bit.


You have to use Adrenalin 23.2.1 or newer; HDMI VRR is fixed in newer drivers. Also make sure that it is enabled on your TV/screen.

DisplayPort to HDMI will never work, because those adapters are active converters and don't support FreeSync or HDMI VRR. Passive cables using DP++ are HDMI 1.4 and don't support it either.
