Howdy.
I use Radeon ReLive pretty frequently and it's a feature I love.
Recently I was trying to capture replays from Red Dead Redemption, which runs in HDR, and as old posts on here show, the colors you get are either really washed out or overly dark.
I get HDR isn't a really common thing yet; I literally only have two games that support it. I also understand that the cards cannot encode 10-bit color, which is fine.
What would be nice is a semi-accurate remapping of the colors so that it doesn't look so trashy. Trying to adjust it after it's recorded just looks plain ol' bad.
Does anyone have tips/tricks besides using the Windows DVR tool? For some reason it's able to capture SDR stills and video from an HDR game. Why can't ReLive do the same? You'd think it would be easier when it's happening on the card, lol.
Same issue in 2020 with the latest Radeon Software: in CS:GO the HDR capture is so blown out it's just yellow glare and black everywhere else. OBS seems to capture it fine?!
Hi,
Nearly all video, film, and TV is recorded and broadcast in TV/film colourspaces, not the RGB that PCs use.
That colourspace is called YCbCr. HDMI bandwidth is limited (and the link also carries multichannel audio — Blu-rays can have many channels, and Dolby Atmos layers even more object channels on top, a bit like Creative's EAX did in the 90s — though audio is actually only a small slice of the bandwidth). Since the link can't always fit the full signal, essentially every Blu-ray on earth stores and transmits the film with a form of colour compression called "chroma subsampling": colour detail is kept at a lower resolution than brightness detail. In practice it makes deep navy blues and magentas look blurry and fuzzy, and the crisp edges of tiny coloured text harder to read.
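To make the chroma-subsampling idea concrete, here's a minimal toy sketch (my own helper, not anything from a real codec): in 4:2:0, every 2x2 block of pixels shares a single Cb and a single Cr sample, so a sharp colour edge inside a block collapses to an average while the brightness channel stays full resolution.

```python
# Toy 4:2:0 chroma subsampling: luma (Y) keeps full resolution, but each
# 2x2 block of pixels shares one Cb and one Cr sample. This is why fine
# colour detail (thin magenta text, navy edges) blurs while brightness
# detail stays sharp.

def subsample_420(chroma):
    """Average each 2x2 block of a chroma plane (list of lists of ints)."""
    h, w = len(chroma), len(chroma[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block = [chroma[y][x], chroma[y][x + 1],
                     chroma[y + 1][x], chroma[y + 1][x + 1]]
            row.append(sum(block) // 4)
        out.append(row)
    return out

# A hard colour edge running through a 2x2 block collapses to one value:
plane = [[16, 240],
         [16, 240]]
print(subsample_420(plane))  # [[128]]
```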
All game consoles use 4:2:0 chroma-subsampled output by default and are built around it, and TV-style video also uses "limited range" levels: black is code 16 rather than 0, so if the display interprets the levels wrongly, the blacks and the whole picture look washed out. Your TV usually detects a Blu-ray player, or the player outputs the correct black levels; if your TV or monitor has HDMI it likely has a black-level setting, often set to Auto, and that tends to work.

PC graphics is more precise and uses full-range RGB, since it grew up around work, publishing, and Photoshop rather than a century of film reels. The YCbCr colourspace is a bit like how printing uses inks, which are different from a TV's colours, so printers use the CMYK colourspace; TV, film, and game consoles tend to use YCbCr. The level of chroma compression varies with available bandwidth and resolution: at 720p most monitors can easily do full 4:4:4, but at 4K many older displays and devices were only built for 4K30 at full-quality 4:4:4.
You can see in your AMD display settings in the drivers that there is "RGB limited" (16-235 for R, G, and B) and "RGB full" (0-255) on an 8-bit display. HDR is 10-bit, which is 1024 levels each of R, G, and B, and makes about a billion colours versus 8-bit's 16 million.
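As a quick sanity check on those numbers (a standalone sketch; `limited_to_full` is my own helper name, not a driver setting), limited-range black (16) maps to full-range 0 and limited-range white (235) to 255 — this mismatch is exactly what a "washed out" picture looks like when levels are interpreted wrongly — and you can count the colours each bit depth allows:

```python
# 1) Expanding limited-range video levels (16-235) to full range (0-255).
#    If a display shows limited-range content without this expansion,
#    black ends up as dark grey: the classic washed-out look.
def limited_to_full(v):
    """Map a limited-range 8-bit code (16..235) to full range (0..255)."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

print(limited_to_full(16))   # 0   (video black -> PC black)
print(limited_to_full(235))  # 255 (video white -> PC white)

# 2) Colour counts per bit depth: 8 bits = 256 levels per channel,
#    10 bits = 1024 levels per channel.
print(256 ** 3)   # 16777216   (~16.7 million colours at 8-bit)
print(1024 ** 3)  # 1073741824 (~1.07 billion colours at 10-bit)
```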
Unfortunately the vast majority of software and apps don't support 10-bit or HDR, and playing HDR content back in them looks washed out. Even the ones that do, like recent Windows 10 builds, require a number of settings to be toggled on before they work. I have a very recent 2021 Samsung Neo QLED TV, and there's a button I can hold on the remote that shows the signal info and tells me whether FreeSync Premium Pro is active and whether the signal is HDR or not (holding the play/pause button shows the game mode menu with a few more options). Similarly, recording or capturing with different software and apps can make footage look washed out after the fact.

However, some players that do play back correctly require the D3D11 renderer, which also lets you use hardware HEVC decoding. That matters: reviewers like Linus Tech Tips benchmarked the 5700 XT with buggy software x265 decoding and an ancient media renderer from the Windows 98 era, no hardware decoding at all, and then claimed the new RDNA architecture has issues or decodes with blocky errors.
Even the cheapest Android phones and tablets, or bargain-bin TVs with some cheap SoC inside, can hardware-decode Netflix's 4K HEVC streams just fine. Yet a dedicated graphics card costing hundreds of dollars, built for graphics professionals, with HBM or GDDR6 memory, drawing hundreds of watts while your phone charges at 30 W for an hour every three or four days and still decodes those streams? Even YouTube's VP9 codec (which has slightly higher decode requirements, and which lost out when Hollywood picked HEVC as the industry standard) decodes fine on those devices, but a state-of-the-art 5700 XT supposedly couldn't? You have no idea how good it looks when you decode with hardware and output via hardware on AMD correctly.
My advice: maybe try putting the file on a USB stick and plugging that into your TV directly. Or try the latest versions of PotPlayer, though recent builds have a bug: selecting the D3D11 renderer doesn't actually switch to it. The playback info still shows the D3D9 renderer, and for me it looks full of errors, with a green bar along the bottom, blurry, smudgy, and washed out. Trust me, you will know when it's working.
The alternative would be to build a player yourself that uses the D3D11 renderer and hardware output path. That may not be as many lines of code as you'd imagine, since it's mostly calling Windows functions built into the OS, put there to make life easier for developers; it's part of why there are so many apps on Windows.
Or use a video editing app to convert the colourspace and/or tone-map the 10-bit HDR grade down to 8-bit SDR.
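As a rough illustration of what such a regrade does (a hedged sketch only, with my own helper names; real tools such as ffmpeg's tonemap filter also undo the HDR transfer curve and remap the BT.2020 gamut to BT.709), the core step is compressing the HDR brightness range into the SDR range before quantising to 8-bit:

```python
# Minimal HDR -> SDR tone-mapping sketch: a simple Reinhard curve on
# linear light, then a rough display gamma and 8-bit quantisation.
# This only shows the range-compression idea, not a full colour pipeline.

def reinhard(x):
    """Compress linear HDR luminance (0..inf) into 0..1."""
    return x / (1.0 + x)

def to_8bit(linear):
    """Apply a rough 1/2.2 display gamma and quantise to 8-bit."""
    return round((linear ** (1 / 2.2)) * 255)

# Relative luminance, where 1.0 = SDR reference white; brighter HDR
# input still gets brighter output, but increasingly compressed instead
# of clipping to white.
for nits_rel in (0.0, 1.0, 4.0):
    print(to_8bit(reinhard(nits_rel)))
```

Doing nothing (treating the HDR values as if they were SDR) is what produces the blown-out or washed-out captures described above; the curve here is what a "convert to SDR" step in an editor is doing under the hood, in spirit.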
To get playback working in older versions of PotPlayer: once the D3D11 renderer was selected in the video settings, with 10-bit output and hardware output enabled and the HW button toggled, you went into Filters. The video codecs tab there has, at the bottom, a video decoding page that lets you enable VLD bitstreaming for everything and has checkboxes for DXVA. You maybe needed to turn off FFDShow, enable DXVA, and toggle "prefer D3D11 renderer" a few times to find the combination that worked. Now every build seems to say D3D11 renderer in the config but plays back as D3D9, which is awful; you've no idea how old that renderer stack is. Ideally you'd use a modern programming language like Rust, Dart, or C#, code against Vulkan, and make a player that outputs more directly to the hardware and only ever uses the latest supported output methods, and even add support for direct passthrough or bitstreaming of video and audio to the GPU for applicable formats, textures, and photos.