General Discussions


Is Nvidia still 10 years behind AMD, or 5? RTX 2000 series cards vs. Xbox/PS4 RX 480 as reference?

For the better part of a decade, AMD cards have had good, high-quality GPU scaling, while Nvidia's was often bested by even some of the very cheap TV panels or monitors: its scaling was generally lousy, and often you couldn't see a visible difference with it on or off, or it was just a bit worse. They finally started to notice their cards were sorely lacking and tried to correct their slacking, but cheaped out on it and faked it all with software.

If you buy a very cheap Blu-ray player, media player, TV, phone or tablet, it has an upscaling block in its video hardware, and expensive 4K Blu-ray players have quality upscaling chips with proud writing on the box saying "4K upscaling" to make your regular Blu-rays look a bit better and closer to how they would at 4K. When you press play on a 720p video on your phone's 4K or QHD screen, it gets upscaled to fill the screen; tablets and TVs will likewise play your 720p video across the full 1080p or 4K panel using upscaling. For video feeds they often use high-quality upscaling, like Blu-ray players scaling to 4K, which takes a lot of time to process the image and improve picture quality without blurring it, but since video playback is usually only 24 or 30 fps, or 60 tops, they don't need to do HQ image scaling much faster than that.

AMD GPUs can upscale to whatever your display's resolution is, like 4K or maybe 8K, in high quality at 60 Hz with rather good results, and they've recently added integer scaling by popular demand, since some people love to play old 8-bit NES or Sega games on their modern 4K and 8K displays with their PC, so crisping up those analog-TV-era graphics into clean, sharp pixels is a win-win. Upscaling chips aren't too complicated or expensive; after all, they're in almost every device under the sun.
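To make the integer scaling idea concrete, here's a rough sketch in Python (just the concept, not AMD's actual driver code): each source pixel is repeated into an N x N block, so for example a 1080p frame lands on a 4K grid by duplicating every pixel 2x2, with zero blur.

```python
import numpy as np

def integer_scale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbour integer scaling: repeat each pixel factor x factor times.
    frame is an (H, W, 3) RGB array; the result is (H*factor, W*factor, 3)."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# Example: a 1920x1080 frame scaled 2x becomes 3840x2160 (4K) with perfectly sharp pixels.
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_4k = integer_scale(frame_1080p, 2)
assert frame_4k.shape == (2160, 3840, 3)
```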

The PlayStation 4, the Xbox One and almost all other consoles actually run many game titles at 30 fps, like The Witcher 3 (an Nvidia-sponsored title, not optimized for AMD) or some more graphically intense titles, perhaps Bloodborne, and will use some fancy interleaving or other techniques to double the output and have it feel more fluid at 60 Hz. 30 turns into 60 easily enough. However, both consoles, released a good number of years ago, have modern AMD GPUs inside them; the same ones, in fact.
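As a toy sketch of the frame-doubling idea (not the consoles' actual output pipeline), presenting a 30 fps game on a 60 Hz display can be as simple as scanning out every rendered frame twice:

```python
def present_30fps_on_60hz(rendered_frames):
    """Naive frame doubling: each 30 fps frame is shown for two refresh
    intervals, so the 60 Hz display always has something to scan out."""
    for frame in rendered_frames:
        yield frame  # first 60 Hz refresh
        yield frame  # second 60 Hz refresh, same image again

# 5 rendered frames become 10 displayed refreshes.
assert len(list(present_30fps_on_60hz(range(5)))) == 10
```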

Even the Pro and X versions of the PlayStation and Xbox consoles in fact have essentially the same AMD RX 480 graphics hardware in them, though perhaps with slightly higher clocks or memory sizes, and they're guaranteed by Microsoft and Sony to game at 4K 60 Hz. Sony used a special upscaling method in their AMD RX 480-class GPUs years ago, when the PlayStation 4 Pro was being developed and brought to market, called "checkerboard rendering". This lets 1080p become 4K with no real loss of image quality compared to a true 4K image, and it's easily doable and affordable on an RX 480 card, unlike Nvidia's lies that claim it "requires an AI supercomputer and for you to be connected to the internet always to optimize it for all your games". Sony just went "yep, our AMD RX 480s can do it with checkerboard rendering". They could probably have done it anyway with the ordinary RX 480/580 and 5000-series GPU upscaling method or whatever, but that may have slightly increased latency or not worked well with the software frame buffering they use to double 30 Hz to 60 Hz with interleaving for 30 Hz titles. So they probably had to come up with checkerboard rendering, and Microsoft had to design a different console mainboard, memory layout and GPU setup to allow it to work with HDMI 2.1 and FreeSync (VRR is FreeSync in its most up-to-date form).

When Nvidia announced FreeSync support, as they had to in order to support VRR on TVs, since everyone was demanding it from them and not being able to game on TVs would reveal to the world how awful Nvidia are, they obviously said they supported FreeSync, then lied through their teeth about the software frame buffer, frame doubling, interleaving or "smooth motion" method or whatever they use to falsify their frame rates, which is what's preventing FreeSync, a basically decade-old free and open monitor standard from AMD, from working.
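For anyone curious how checkerboard rendering works in principle, here's a toy Python sketch (absolutely not Sony's implementation, which also reuses the previous frame and motion vectors): shade only half the pixels each frame in a checkerboard pattern, then fill the missing pixels from their rendered neighbours, so per-frame shading cost is roughly half while the output grid stays full resolution.

```python
import numpy as np

def checkerboard_reconstruct(rendered: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Toy checkerboard reconstruction. Pixels where mask is True were shaded
    this frame; the rest are filled with the average of their left/right
    neighbours, which in a checkerboard pattern are always shaded pixels."""
    out = rendered.astype(np.float32)
    h, w = mask.shape
    for y in range(h):
        for x in range(w):
            if not mask[y, x]:
                left = rendered[y, max(x - 1, 0)].astype(np.float32)
                right = rendered[y, min(x + 1, w - 1)].astype(np.float32)
                out[y, x] = (left + right) / 2
    return out.astype(rendered.dtype)

# Checkerboard mask: half the pixels shaded each frame.
h, w = 8, 8
mask = (np.add.outer(np.arange(h), np.arange(w)) % 2) == 0
frame = np.random.randint(0, 256, (h, w, 3), dtype=np.uint8)
print(checkerboard_reconstruct(frame, mask).shape)  # (8, 8, 3)
```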

AMD's graphics drivers do specifically say when interleaving is in use to turn 30 Hz into 60 Hz, since for any sort of modern display you need a minimum of 60 Hz for a proper desktop or gaming experience. Also, Nvidia's graphics settings maybe aren't exactly adhering to VESA standards: when you select 60 Hz, is it 59.94 Hz, or is it exactly 60.000 Hz? Is it just a number? Does it really do 60? What's up with that? Nvidia's cards don't mention interleaving on lower settings like 30 Hz, but it must be there or you'd definitely notice. And Nvidia don't let you disable vsync: there's vsync, or software-controlled vsync mode switching called "adaptive sync". AMD cards let you disable vsync and use FreeSync/VRR instead, which is far, far better and lowers latency.
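For reference on the 59.94 vs 60 question: the odd-looking rate comes from the old NTSC timing convention, where a "60 Hz" mode actually runs at 60000/1001 Hz.

```python
# NTSC-derived "60 Hz" modes actually tick at 60000/1001 Hz.
ntsc_60 = 60000 / 1001
print(f"{ntsc_60:.3f} Hz")          # 59.940 Hz
print(f"{1000 / ntsc_60:.3f} ms")   # ~16.683 ms per refresh, vs 16.667 ms at exactly 60 Hz
```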

Are Nvidia just not smart enough to figure out how to skip software vsync and output frames from the card straight to the monitor, letting the monitor hardware do everything and lowering latency? Literally all they have to do is nothing and let the monitor do it all. When Nvidia announced that their cards, a steaming pile of garbage, only worked OK with FreeSync on 15 monitors out of thousands, I felt something was very unusual; it may be a sign that Nvidia's false frame rates and benchmarks throughout the decades of the company's existence and business model are more false than usual. So, years after the Xbox One X and PS4 Pro were upscaling from 1080p to 4K with no fuss, and even a regular PS4 or Xbox One can play a 720p video file and upscale it to your 4K or 8K TV, Nvidia finally figured out some method: say, noticing that dividing 4K by 1080p gets you about 4x the number of pixels and just multiplying every pixel on screen by four, or using checkerboard rendering or some other software method. They avoided hardware upscaling and lied, saying "ours is better, but it needs the internet and AI to do it", when they could have just used a very, very inexpensive upscaling chip or whatever method the RX 480, phones, tablets, TVs, media player boxes or cable set-top boxes are using. Then there'd be no need for the internet, or "AI", or seeing everything in your computer and so on.
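The pixel-count arithmetic being gestured at there, for reference:

```python
# 4K (3840x2160) has exactly 4x the pixels of 1080p (1920x1080),
# which is why a clean 2x-per-axis upscale maps 1080p onto a 4K grid.
pixels_4k = 3840 * 2160      # 8,294,400
pixels_1080p = 1920 * 1080   # 2,073,600
print(pixels_4k / pixels_1080p)  # 4.0
```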

Is anybody able to figure out how Nvidia has been getting away with all this DLSS 2.0 nonsense? Also, don't get me started on their RTX; ugh, it's really marketing rubbish. They're faking it all with software for everything instead of using a cheap chip, then charging far, far more for it, and people eat it up. Nvidia's marketing department are good at selling stuff. So, are Nvidia primitive compared to AMD, or what?
