I'm not certain, but I believe Samsung makes better quality 8K panels than their 4K panels. Also, this year's QLED 60R, 70R, 80R and 90R aren't anywhere near as good as the previous 2017 and 2018 models, the Q7FN and Q9FN, in many respects, especially colour gamut coverage and most specs across the board. So you may want to look into something like the 900R rather than the 90R, at least to compare, or the previous Q9FN. They've tried to drastically slash costs, maybe like customers wanted, but it could have led to lower quality. I'm almost certain that if you go up a lot in price Samsung has better TVs, which I heard are MicroLED. I heard rumours some time ago of one, maybe for sale by now, that costs something like a million dollars and is called "The Window" or something, as opposed to their famous "The Wall" series. The "colour calibration" guy does admit that native mode looks better for many colours, but I believe it also comes down to the source material. If you use YCbCr 4:2:0 material or 4:4:4 input material the Samsung TV looks amazingly better, as YCbCr is the colour space of film and photography and the format all Blu-ray discs are stored in, and since these are HDR TVs you'd want to be playing back HDR content.
However, running my PC desktop in YCbCr makes the deep magentas and navy colours a touch saturated and the greens more vibrant, since it's converting from PC RGB to the proper TV format. The guy is probably testing while hooked up to a PC or Blu-ray player set to the RGB colour space. I'm not sure what the proper input source material is for professional calibration, but I'd be using the TV formatting, YCbCr 4:4:4 or 4:2:0, and playing back a high-quality 10-bit HDR file source like "Samsung: Travel With My Pet HDR UHD 4K Demo" or "Sony: Whale in Tonga HDR UHD 4K Demo" from 4K Media. Interestingly, I previously owned an Nvidia GTX 1060, and changing from full RGB to YCbCr 4:4:4 made little to no difference. Enabling 10-bit colour on the Nvidia card also looked lacklustre and less impressive colour-wise compared to AMD's 10-bit colour on the same Samsung TV; there was a very noticeable difference in image quality, especially when enabling GPU scaling on the AMD card, whereas on the Nvidia GTX 1060 turning GPU scaling on made it look slightly worse or pretty much the same.
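To make the RGB-versus-YCbCr point concrete, here's a minimal Python sketch (my own illustration, not anything from AMD's or Samsung's software) of the standard BT.709 conversion from full-range PC RGB into limited-range "video" YCbCr. Note how white and black no longer sit at 255 and 0, which is one reason levels and colours look off when the PC and the TV disagree about which format is being sent.

```python
# Sketch: BT.709 RGB -> limited-range YCbCr, to illustrate why feeding
# full-range PC RGB into a TV expecting "video" levels shifts colours.
# Coefficients are the standard BT.709 luma weights.

def rgb_to_ycbcr_bt709(r, g, b):
    """Convert full-range 8-bit RGB (0-255) to limited-range YCbCr.

    Y lands in 16-235 and Cb/Cr in 16-240, the "video" range TVs expect.
    """
    # Normalise to 0.0-1.0
    rn, gn, bn = r / 255.0, g / 255.0, b / 255.0
    # BT.709 luma and colour-difference terms
    y = 0.2126 * rn + 0.7152 * gn + 0.0722 * bn
    cb = (bn - y) / 1.8556
    cr = (rn - y) / 1.5748
    # Quantise into the limited ("video") range
    y_q = round(16 + 219 * y)
    cb_q = round(128 + 224 * cb)
    cr_q = round(128 + 224 * cr)
    return y_q, cb_q, cr_q

print(rgb_to_ycbcr_bt709(255, 255, 255))  # white -> (235, 128, 128), not 255
print(rgb_to_ycbcr_bt709(0, 0, 0))        # black -> (16, 128, 128), not 0
```

If either end then treats those 16-235 values as if they were full range (or vice versa), blacks look grey or shadows get crushed, which matches the "washed out" complaints in this thread.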
However, believe it or not, using a recent AMD graphics card and changing to YCbCr 4:4:4 makes 8K high-quality photographic desktop wallpapers of landscapes and such look stunningly lifelike and beautiful. Conversion to the correct colour space may skew some things slightly, as he mentions, but it's still far more lifelike overall, since it's the intended colour space for the use case. Set the TV's input type to "Game Console" for colourful PC desktop use with the image enhancement options, or to "Blu-ray Player" for high-quality output at the cost of increased latency and an intended 24 fps playback. You can then play those files back with an excellent media player like Daum PotPlayer plus the HEVC extensions decoder from the Windows 10 app store. Depending on the content you're playing, say Netflix through the Windows 10 store's Netflix app for higher-quality playback with 4K, HDR and Atmos audio support, you can go into AMD's display settings and, under Colour, nudge the saturation slider slightly to better fill out people's flesh tones and make the images more vibrant and popping. All of this is with Windows 10 in SDR mode and the TV in its native colour space mode. Even gaming in titles that support HDR10 with HDR enabled actually looks worse than using the native colour space and Game Console or Blu-ray mode, adjusting all the image quality settings for best appearance, and then maybe bumping the saturation slider by around 20%. Some Netflix movies are terribly washed out, though, and may need heavier saturation to achieve vibrant, HDR-movie-like images. I'm not trained in calibrating TVs, but on a Samsung TV you can crank the colour from the default 25/50 all the way up to 50 and saturate the heck out of it if needed, which isn't really necessary, but it's nice that you can.
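The saturation-slider tweak can be sketched in the same spirit. This hypothetical `bump_saturation` helper (my own toy using Python's stdlib `colorsys`, not AMD's actual driver code) scales a pixel's saturation by a factor while keeping hue and lightness fixed, roughly what a +20% slider does:

```python
import colorsys

def bump_saturation(r, g, b, factor=1.2):
    """Scale an 8-bit RGB pixel's saturation, keeping hue and lightness."""
    # Work in HLS, where saturation is an explicit channel
    h, l, s = colorsys.rgb_to_hls(r / 255.0, g / 255.0, b / 255.0)
    s = min(1.0, s * factor)  # clamp so we never exceed full saturation
    r2, g2, b2 = colorsys.hls_to_rgb(h, l, s)
    return round(r2 * 255), round(g2 * 255), round(b2 * 255)

# A washed-out flesh tone becomes a touch more vivid:
print(bump_saturation(210, 170, 150, 1.2))
```

The clamp is the design point: a 20% bump enriches muted tones, but already-saturated colours just hit the ceiling, which is why a modest bump looks "popping" while cranking it far further mostly clips.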
Most calibrators would recommend a backlight setting of 6 or maybe 8 for Samsung TVs and using "Movie" mode because they think it matches movies better or some such, but I just set the TV to "Natural" mode with a backlight of around 16 to 30; I've often got it set to 25. Maybe some slight accuracy vanishes and light bleed gets stronger, but the image becomes way more lifelike. You MUST set the colour space to Auto in HDR mode instead of Native, though, or things look washed out. I believed the colour space should be set to Auto to be as the director intended, or is that not how HDR works?
TL;DR: I'm 90% certain that TV calibrators go for image test pattern reproduction values, not "how lifelike they can make it look". They just fiddle with the numbers in the TV's settings until their calibration software shows the best scores, but then the TV may be set to a fraction of its default values, which isn't necessarily reflective of what the TV can look like when adjusted to look how you want it to, rather than worrying about moiré, uniformity and saturation levels being "as close as possible to a film reel movie recorded in the 1970s".
I bought a Radeon RX 5700 XT believing I would be able to use it with an LG OLED TV using FreeSync/VRR; it appears, however, that this is not possible. AMD has been notably quiet about this. Since AMD helped develop the VRR standard and has already implemented it in their chips for Xbox consoles, AMD should have all the knowledge they need to implement this on their graphics cards as well.
AMD, and RTG in particular: please give us an update on the status of this. Is it even still being worked on?
Haha, your post is obsolete bro, AMD has supported VRR for a couple of years now.
My Samsung Q7FN QLED TV, a 2017 market release, received an update in early 2018 that enabled 4K 120 Hz and VRR support, and it worked with the Xbox One X. If you look in the AMD control panel, which I've had for my previous RX 580 8GB card and my new 5700 XT 8GB card, you will see it says Variable Refresh Rate FreeSync enabled!
When you mouse over the question mark tooltip under FreeSync, next to where it says "Variable Refresh Rate", it says "Provides smooth responsive gameplay by updating the display as frames become available - requires freesync compatible display". FreeSync and VRR are really the same kind of thing, just lower latency, though VRR might be a touch better since you can probably disable vsync in most, hopefully all, cases.
You do know you've got to grab your TV's remote, press the menu button, find Game Mode and turn it on, have an HDMI 2.1 capable cable probably, and then in Game Mode look for FreeSync Ultimate and turn it on? I found mine under the External Devices section of my TV's menu. The black levels and "UHD Color" options are in there too, which are SUPER important, so yeah, uh, maybe learn how to use a TV?
I'm pretty certain that variable refresh rate has been working perfectly well for me, as all it does is lower latency and smooth out gaming the way FreeSync does. Walking around in games like Doom under Vulkan I get some pretty good low latency at 1440p 120 Hz (my 5700 XT only outputs 1440p 120 Hz over its HDMI 2.0b port, though it does 4K 120 Hz via DisplayPort; I don't have any DisplayPort inputs on my TV, I'm not sure if the GPU is DP++, and I couldn't find a passive cable to order to try it out). Anyway, check my mad low latency 1440p 120 Hz Doom 2016 in Vulkan with performance metrics set to high. Not sure if my spamming the screenshot button on Steam was slowing things down, though. I went into some full-on demon-filled areas with large open spaces to strain the GPU, with ultra settings all enabled but AA, depth of field and motion blur disabled, anisotropic filtering increased to 16x, FreeSync on, surface format optimization on, texture quality set to high, and everything else, including vsync, disabled; in the Adrenalin software, tessellation is forced off globally and anisotropic filtering is off globally.
Spamming the screenshot button caused mad spikes in performance and increased latency, as did exploding groups of three to five monsters all at once, and the FPS dipped considerably while I spammed screenshots. Each time I pressed F12 for a Steam screenshot, the latency shot up to something like 200 ms and went red.
All those other monitors showing off their specs are quoting 1080p, not 1440p. My TV games at 4K 120 Hz with VRR and should be around 8 to 10 ms latency, and I'm pretty sure that's in game mode with the colour and image enhancements on. Not too bad. My TV's input source is currently set to Game Console, because PC mode doesn't let me adjust colour or the contrast enhancer or any of the nice image options my TV offers. It makes things look amazingly good and still supports FreeSync mode, as I think it's meant for the Xbox One X console. I've been gaming on VRR for ages without paying too much attention, maybe the last year or two; it shows up in the display latency times. Other TVs get something like 16 ms latency. At 1440p or 4K the latency gets really high for all GPUs and TVs, but AMD is best and Samsung TVs are among the best for low latency and gaming. Importantly, Samsung QLED TVs carry a 10-year no-burn-in, no-colour-bleed guarantee, because the pixels aren't organic matter like OLED, which sort of dries up or dies off; they're just super-bright LEDs coated to emit an exact colour, like putting a coloured lens over an LED light, so you can't really get burn-in on a quantum dot display the way you used to on plasma. Plasma is superheated gas; guess what burns in on plasma? That's right, it's actually burning. OLED, meanwhile, is like a jellyfish glowing: try to make a jellyfish glow as bright as the sun and it's not going to live very long at all. That's why OLED panels aren't good for gaming/PC use; they're meant to play back Blu-ray discs in dark rooms with rows of seating. QLED TVs are gaming powerhouses.
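As a sanity check on those latency numbers: at 120 Hz a new frame arrives every 1000/120 ≈ 8.3 ms, so quoted figures of 8-10 ms sit right at the refresh-rate floor, with any TV processing added on top. A tiny sketch of that arithmetic (illustrative only, not a measurement tool):

```python
# Sketch: frame time vs refresh rate -- the theoretical floor for the
# "latency" numbers quoted for gaming displays. Real input lag adds the
# TV's processing time on top of this.

def frame_time_ms(refresh_hz):
    """Time between display refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz} Hz -> {frame_time_ms(hz):.1f} ms per frame")
```

So a 16 ms figure roughly matches one 60 Hz frame, while 8-10 ms at 120 Hz means the display is adding very little delay of its own.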
Anyway, enjoy your VRR. Aren't you glad you bought AMD? It's why Vulkan is so awesome. Borderlands 3 under DirectX 12 gets me a few ms higher latency on average CPU and GPU frametime when I enable its performance metrics, and terrible FPS, what, 108 FPS, if I run the benchmark with a mix of ultra, high and medium settings, which I found costs about 2-4 fps versus all-medium settings but looks worlds better. You could try enabling performance metrics in your games, or the performance metrics overlay in the Adrenalin settings, and configure everything so you get lower latency and VRR, and your games will be great.
Nope, what you are using on that Samsung TV is FreeSync. There are many TVs that support HDMI Forum VRR but not FreeSync. If AMD would just support HDMI Forum VRR, we could use that with our TVs. It seems like you are confusing the two.
Then why did AMD release a press release about all their 480s and 580s supporting VRR around December, maybe two years ago? And why does it say VARIABLE REFRESH RATE under the word FreeSync? And why does the question mark tooltip mention variable refresh rate and say it is only possible with a FreeSync compatible display? Also, how about my latency with vsync off? That's not possible without VRR.
Also, it's a fact my TV supports VRR; it has supported 4K 120 Hz via HDMI since early 2018. Check with Samsung.
It's the Q7FN QLED, on sale from mid 2017 to 2018. Or you could check the TV review website rtings.com for their testing of my TV's VRR functionality and latency and a full comprehensive review, though maybe they didn't review it in Game Console mode or game mode.
You are confusing two different things here.
FreeSync is a variable refresh rate (VRR) technology, like G-Sync or Adaptive-Sync. But in this case we are talking about "HDMI Forum VRR" (also called VRR for short), which is NOT FreeSync, Adaptive-Sync or G-Sync.
AMD promised VRR support two years ago, and for a long time nothing happened. Now the RX 6000 series supports HDMI Forum VRR. It actually works over HDMI 2.0, but not over HDMI 2.1 as promised.
So your Samsung TV may support HDMI Forum VRR (and FreeSync in addition), but AMD GPUs older than the RX 6000 series do not.