
eccentric
Adept III

Re: When will AMD add HDMI Forum VRR support?

Also, I forgot to mention that Samsung TV panels reproduce more of the visible light spectrum, so their images look far more lifelike: they cover a wider colour gamut (more of the Rec. 2020 colour space) and hit roughly four times the brightness, which puts them closer to daylight as we actually see it, while OLED is only a little better at night scenes with very low illumination. Since shadows tend to be lit by sunlight and are rarely jet black, they're usually displayed just fine on other panel tech. The blacks on a quantum dot panel are still down in the 0.000-something range, with several decimal places of zeros in the black level; OLED panels just have far more zeros, which is rarely seen in movies unless someone makes a film about Vantablack, literally the blackest pigment we can paint things with. For TV and movies, the camera operator and director carefully pick the lighting and angles to illuminate their subject, or shoot in daytime or in brightly lit interiors like shopping malls, offices and coffee shops.

It's also very rare to see a video game where you explore in total darkness, even in dark caves, because people spent years building game engines that handle global illumination and shadows well, and to be totally honest, if there's no light in a dark cave in a video game you can't see where you're going. You can't feel your way around the way a real cave is navigated by touch, so video games ALWAYS add some sort of artificial illumination so we can always see what's being displayed, and pitch blackness in caves is avoided entirely.
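To make the black-level point concrete, here is a tiny worked example in Python of how contrast ratio falls out of peak brightness and black level. All of the nit values are made-up illustrative figures, not measurements of any specific panel.

```python
# Illustrative contrast-ratio arithmetic: contrast = peak luminance / black level.
# All values below are made-up round numbers, not measurements of any real panel.

panels = {
    "bright quantum-dot LCD": {"peak_nits": 1500.0, "black_nits": 0.05},
    "OLED":                   {"peak_nits": 800.0,  "black_nits": 0.0005},
}

for name, p in panels.items():
    contrast = p["peak_nits"] / p["black_nits"]
    print(f"{name}: {p['peak_nits']:.0f} nits peak / {p['black_nits']} nits black "
          f"-> ~{contrast:,.0f}:1 contrast")

# The OLED's extra zeros in the black level buy a huge contrast number, while the
# LCD's advantage shows up in the peak-brightness term instead.
```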

Actually, speaking of realistic lighting in dark caves, Nvidia's showcasing of RTX lighting and shadows in a game like Metro was absolutely hilarious. RTX cards can't ray trace in real time very well at all; movies usually use millions of rays and high numbers of bounces, but the hyped-up RTX title Metro Exodus effectively has only a single ray-traced light source in the entire game: the sun. And it gets barely any rays or bounces, even though it's adjustable. It's a joke, because in a game where you fight mutant monsters underground in dark caves all day, the only time you're seeing RTX is when you're seeing light from the sun. It's not that they didn't want every light in the game to be a ray-traced source; the cards simply can't do that and keep any sort of playable FPS. I laughed so hard when heaps of RTX demos showed off a game about dark tunnels and caves as a 'lighting technology demo'. You could argue it may be doing some better reflections or something, but all of that is possible on AMD hardware just the same, without the fancy billions-of-dollars investor-fraud gimmick, as visibly demonstrated by the CryEngine Neon Noir ray tracing demo.

As for your mention of OLED 'refresh times' blowing Samsung out of the water: Samsung TVs have the lowest input latency. Many TVs and panels are rated at something like 800 Hz, but the processing CPU and the HDMI or DisplayPort cabling bandwidth can only handle 60 or 120 Hz, and their image-processing chips can only do motion smoothing at maybe 200 Hz. So right now those 'refresh times' don't matter at all, because there's no hardware on earth capable of using them. Yes, an OLED can be lab-tested to very high refresh rates, but graphics cards and computer chips can't capture or output at those rates, because even RAM timings and CPU cycles are only so fast inside a machine.
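To put some rough numbers on that bandwidth point, here is a minimal back-of-the-envelope sketch in Python. The 800 Hz figure and the usable link rates are approximate, illustrative assumptions rather than exact spec values, and it ignores blanking intervals and Display Stream Compression.

```python
# Rough check: how much uncompressed bandwidth a resolution/refresh combination needs,
# versus what common cable standards can roughly carry. Link rates below are
# approximate usable payload figures, not exact spec values.

def video_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed 8-bit RGB payload in Gbit/s, ignoring blanking and DSC."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

links_gbps = {
    "HDMI 2.0": 14.4,         # ~18 Gbps raw, roughly 14.4 Gbps usable after encoding overhead
    "DisplayPort 1.4": 25.9,  # ~32.4 Gbps raw HBR3
    "HDMI 2.1 FRL": 42.0,     # ~48 Gbps raw
}

for hz in (60, 120, 240, 800):
    need = video_gbps(3840, 2160, hz)
    fits = [name for name, cap in links_gbps.items() if need <= cap] or ["none of these"]
    print(f"4K @ {hz:>3} Hz: ~{need:6.1f} Gbps uncompressed -> fits on: {', '.join(fits)}")
```

With those assumed figures, 4K at 60 or 120 Hz is roughly where today's cables top out, and an 800 Hz panel would need several times more bandwidth than any of them carries, which is the point about unusable 'refresh times' above.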

eccentric
Adept III

Re: When will AMD add HDMI Forum VRR support?

Thought I should link you to the commercials in question, which actually use LG panels, so yeah.

Samsung - QLED vs OLED - YouTube. I saw this on the Samsung Australia YouTube channel a couple of years back, in ENGLISH. I have no idea why I only see it on Samsung Brasil or Samsung Chile now; where did all their older commercials disappear to?

Here is a burn-in checker for your TV.

TV burn-in checker l Samsung - YouTube. The first advert is for the previous generation of Samsung and LG TVs, but this latest commercial about the OLED burn-in checker was posted quite recently, in 2019. Not to mention input lag and other factors: Samsung's always been top of the pack with latency and such. I wish people bothered to read up on the available display technologies and chose the right one for their use case, instead of hearing about the best home-cinema displays and trying to use them for PC and gaming. You can hardly find any OLED monitors for sale, and there's a very good reason for that: they're overpriced and they burn in terribly, ESPECIALLY if you disable the power saving/eco mode settings.

therg
Adept II

Re: When will AMD add HDMI Forum VRR support?

Unless you are going to talk about FreeSync/VRR support on an AMD Vega 56 with the LG B9, stop talking, as it is not relevant to me. I have a B9 on order, and a Vega 56. I don't watch news channels, so I won't have burn-in issues; LG moves the pixels around, and I already don't keep a web browser in a static position on my LCD panel anyway. You are wasting your time, Samsung employee. Did you forget to mention the LG OLED's 0.2 ms grey-to-grey response time? And 7 ms input lag?

eccentric
Adept III

Re: When will AMD add HDMI Forum VRR support?

Please look up both the best Samsung display and the best OLED display on RTINGS.com and compare them; they have some of the most detailed TV panel testing and measurements I've encountered, unless you know of a better display review site you could recommend to me?

eccentric
Adept III

Re: When will AMD add HDMI Forum VRR support?

The LG C9 does support FreeSync, unless it's not HDMI 2.1 compatible, and the Vega 56 supports FreeSync/VRR, because even the RX 580 I previously owned supports FreeSync Ultimate with my 2017 Samsung TV.

If you hooked your Vega 56 **DIRECTLY** into the LG C9 via HDMI or DisplayPort and your AMD Adrenalin software still said FreeSync was not supported, it's because you didn't navigate your TV's menu options until you found the game mode setting, then enabled game mode and FreeSync. You MUST enable game mode and FreeSync on the display to use FreeSync at all. On my TV it's found under external devices or something like that in the menu, not in the picture settings or anywhere like that.

Then you must turn it on in the AMD Adrenalin software. After that, add a game profile in Adrenalin for each game you wish to play and set FreeSync to ON. Rather than trying to cap FPS, use the AMD Chill feature: when gaming at 1080p and getting 200+ fps you can set Chill to maybe 70 min / 144 max, and your GPU will save power, your fan noise will drop, and your input latency can be reduced by up to 64%. It's maybe the best way to ensure your frame rate stays within the display's FreeSync range without setting a hard cap at the display's maximum refresh rate, since capping FPS with FRTC or in-game engine limits can cause stuttering if anything conflicts.

Enable Enhanced Sync too; it removes the display cap limit. Just make sure your FPS exceeds your display's refresh rate considerably, so if you are gaming at 120 Hz you want an average of 160 fps or better so it never drops below the monitor's maximum refresh rate. When you use Chill to set a minimum target of 70 fps, or whatever floats your boat, it only drops lower when you aren't moving the mouse or pressing keys, and with a max of maybe 120 it should greatly reduce latency, save power, run quieter, and be a much better gaming experience.
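If you want to sanity-check your own numbers, here is a tiny Python sketch of the frame-time arithmetic behind picking a Chill range that stays inside the display's FreeSync window. The 48-144 Hz window is an assumed example, and the 70/144 Chill values are just the figures mentioned above, not a universal recommendation.

```python
# Sketch of the frame-time arithmetic behind a Radeon Chill min/max range.
# The 48-144 Hz FreeSync window is an assumed example; the 70/144 Chill values
# are the ones mentioned in the post. Swap in your own display's reported range.

def frame_time_ms(fps):
    """Milliseconds spent per frame at a given frame rate."""
    return 1000.0 / fps

vrr_window_hz = (48, 144)   # assumed FreeSync range reported by the display
chill_fps = (70, 144)       # Chill min/max set in the Adrenalin game profile
uncapped_fps = 200          # what the GPU might otherwise push at 1080p

inside = vrr_window_hz[0] <= chill_fps[0] and chill_fps[1] <= vrr_window_hz[1]
print(f"Chill {chill_fps[0]}-{chill_fps[1]} fps stays inside the "
      f"{vrr_window_hz[0]}-{vrr_window_hz[1]} Hz FreeSync window: {inside}")

for fps in (uncapped_fps, chill_fps[1], chill_fps[0]):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.2f} ms per frame")

# Running uncapped at 200 fps produces frames the display can't sync to;
# keeping the rate between 70 and 144 fps means every finished frame can be
# scanned out immediately, which is where the smoothness and latency win comes from.
```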

therg
Adept II

Re: When will AMD add HDMI Forum VRR support?

FreeSync and HDMI 2.1 VRR are different things, mate.

eccentric
Adept III

Re: When will AMD add HDMI Forum VRR support?

FreeSync didn't exist in TVs until VRR was implemented in them, as far as I can see, because FreeSync was brought in via the HDMI 2.1 standard, and HDMI 2.1 doesn't list FreeSync separately; it's part of the VRR standard to lower latency, and the variable refresh rate of the display is just there to better match the low-latency output of FreeSync. With FreeSync-supported titles and FreeSync displays you can sometimes disable VSync, ideally when your GPU is outputting within the target FPS range of the FreeSync display. You may feel that limiting your FPS to targets below the monitor's maximum refresh rate would decrease responsiveness, but it actually increases it while saving power. So the variable refresh rate is just a better way of showing FreeSync; the true magic behind VRR and how it works is FreeSync.

therg
Adept II

Re: When will AMD add HDMI Forum VRR support?

You're on drugs if you think a commercial means anything at all. I wish people actually knew what tech they are talking about and picked the right OLED TV, as it's the best tech available. 0.2 ms grey-to-grey: do you even read your own reviews? NO LCD in the world is that fast, and input lag on the 2019 LG OLED is lower than 7 ms. My panel cost me 1700 AUD; how much is your beloved QLED TV? Notice how they use a Q to look like an O even though the tech is nowhere near as good. Samsung stopped making OLED TVs because LG had a way cheaper manufacturing process and smashed them on price.

eccentric
Adept III

Re: When will AMD add HDMI Forum VRR support?

It's great that LG and other TV makers, though only two years or so behind Samsung, have caught up in terms of 7 ms input lag, 4K 120 Hz and VRR, which includes FreeSync support. It's also great that their extremely cheap-to-produce OLED panels are finally coming down in price. You see, LG OLED panels aren't as amazingly good as you think; they're just dirt cheap, and here's the proof. Companies like B.O.E (see: Sony) produce their own OLED panels, but LG's dirt-cheap manufacturing method and cost cutting means almost every big-brand TV manufacturer in the world doesn't bother making their own panels, because they can buy LG OLED panels in bulk much cheaper than their own factories can churn them out. Even if they did make better-quality OLED displays, the cost would be much higher, closer to what OLEDs cost when they first hit the market, and OLEDs can burn out in just a couple of years; the image quality degrades as the blue fades, and you get colour bleed over time. So instead everyone uses the cheaper LG panels, turning OLED displays into more of a disposable product, because very few people were willing to pay hundreds to have their OLED panel professionally recalibrated every six months to a year to squeeze more brightness and image quality out of a $20,000+ display as its organic materials were used up, the way the manufacturer intended, kept insisting on, and the manuals said to do. People instead went, "hmm, looks a bit dimmer, the colour looks a little off, I'll just crank it up a notch," and bam, their TV suddenly stopped working within a few weeks.

Now every OLED panel has something like 50 different burn-in protection settings you can enable, and every single one of them takes away from the brightness or the image quality. People buy an expensive OLED panel that has maybe 900 nits, or has finally reached the 1,000 nit milestone, then disable everything, get rid of their desktop wallpapers, set everything to black, and do whatever they can to NOT USE THEIR DISPLAY as they normally would, then turn the brightness and vibrancy down from stunning and beautiful all the way to 200 nits, all in the hope of stretching the panel to a few years so the expense can be justified. Why even bother buying a good OLED panel if you will run it at a fifth of what it's capable of and not use your game consoles and PCs as you normally would? You can see what I'm talking about in this video here:

One Year Update - OLED Burn-in Test – RTINGS.com - YouTube 

If you actually listen to what comes out of the guy's mouth when he says "set to 200 nits": heaps of mobile phones have 500-700 nit displays nowadays, and especially if you have a Samsung AMOLED display, you're better off watching everything on your phone than on a 900 to 1,000 nit OLED that cost thousands but is only being run at 200 nits with all the settings turned off and the super-eco low-power energy-saver mode enabled. It ends up so dim that if there's a window letting daylight into the room, you won't be able to make out what's on the TV at all; it will be like trying to use an older smartphone's TFT/LCD display in direct sunlight, back when they had maybe 300-400 nits of brightness and you couldn't really read anything off the screen because it all seemed too dim.

Samsung used to make OLED panels themselves and still makes some great ones; they've used active-matrix OLED panels in all their flagship phones since around the Galaxy S3. LG couldn't afford to, or figure out how to, create an OLED mobile phone panel until the Pixel 3, I believe, which is pretty recent, and that first phone with their OLED panel had many display issues: it rapidly developed ghosting within a few hours of use and had problems with image quality and uniformity. A few months after the Pixel 3's release, LG engineers managed to tweak and fine-tune that same mobile OLED panel to minimize its terrible ghosting and image quality issues, and ended up using it in the LG V30+. So the company that's supposed to be the world's best in OLED struggled for close to a decade to do what Samsung had been doing, and still couldn't get anywhere near the quality of the Samsung S8+ display. I've read that there were millions of OLED-display phones being sold back in 2003, yet LG couldn't get it quite right around 2018, the first time they even attempted it, and they did so with pOLED (a plastic substrate instead of glass, because yes, it saves costs). Users complain about the quality of LGD's pOLED used in the Google's Pixel 3 | OLED-Info

There are actually a bunch of other sites that complained about Pixel 3 panel quality and image retention. Many assumed the issue was a bad batch, or review samples, or something that could be fixed via software updates, only to later learn that, nope, that's just the quality of LG's OLED panels. But people still went ahead and used them in the Pixel 3 anyway. Why? Because they're cheap, cheap, cheap.

I see it as people buying drag-racing cars because they heard they're the fastest, then trying to drive to the supermarket with them and lying to themselves: it's OK as long as I bulldoze all the houses between me and the grocery store so my usual driving route better fits my vehicle's design, and I put a massive limiter on the engine so it's slower than even the slowest, worst cheap cars, because you don't want the engine wearing out or the tyres going bald in just a few weeks. I don't care what a great price you got on specialized hardware that is usually very expensive, or how happy you are to drive it slower than a dirt-cheap small car from the 90s, because you're using "the fastest vehicle at the slowest possible settings for uses it wasn't built for". Everybody's happy, but I'm just confused as heck. I don't even know why VRR/FreeSync was allowed into OLED displays, as gaming and desktop use are literally not what OLED displays are intended for, and to even attempt it means making them worse than even some of the very cheap display panels on the market; forget Samsung's, even Hisense's and TCL's probably crap all over an OLED configured to run at a fifth of its true ability. The only reason it's in there is that it's part of the HDMI 2.1 standard, so they can't avoid putting it in, is my guess.

Many people think that companies like Samsung, with their QLED, are trying to make it sound like OLED. "Quantum dot" sounds nothing like the word "organic" to me. And quantum dots predate organic LED displays by decades; they've existed since something like the 70s in plenty of laboratories, though the original technology was named something else, and "quantum dot" is what it was called when it was repurposed into a screen technology. It's just a nano-coating applied so that light from LEDs shines through at specific wavelengths to produce accurate colours every time, and since the coating isn't organic, it doesn't dry up or wear out.

I never said OLED panels weren't good. They're great for home cinema use: the best in a dark room with lots of people crowding in front of the display, some sitting off to the side but still needing to see the full picture clearly. The excellent blacks of OLED make everything appear richer and more contrasty, they do have higher refresh rates, and they're very good at being a TV, which is why review sites love them and say how great they are. But those people are absolutely not talking about computer use; they mean playing back 4K discs and high-quality video content. For other content, or "computer graphics" with saturated colours, you really should use a "computer display". A quantum dot display is literally a big monitor that's brighter and clearer and has more accurate colours; that's literally all it is at the end of the day. So I just have to suggest, or rather insist, that people use a computer monitor for a computer-and-graphics-card use case, and I can't figure out why people can't understand, or even bother to learn, what the differences are.

Just because a drag-racing car is "fastest and best" doesn't mean everyone on earth should own one and use it for their everyday driving. In the same way, an OLED panel being "best, for certain use cases" doesn't mean every TV maker on earth and every home user must have one. If you are going to buy both a quantum dot panel and an OLED panel, one for movies and one for PC use, then knock yourself out; go ahead, fine by me. Samsung are just one of the companies that make quantum dot panels; there are many brands such as Vizio and Hisense, and oh, who cares, it's not like people ever use a TV with a computer, right?
