Dear Vegan, I get the feeling I'm missing your point.
I, and probably most people, already expect that HDMI 2.0 will require a change in the circuitry. Whether that's easy or hard to do, it still takes some skill to do it yourself, or AMD (and/or its board partners) would have to do revisions of the card (and maybe they will, in the case of an R9 380X). I don't know whether it requires changes to the ICs or just a BIOS update; my technical expertise doesn't reach that far.
Why is it time to 'crack open' Photoshop only when HDMI 2.0 panels become available? And why wouldn't DP 1.2/1.3 suffice?
And third, it's obvious a single R9 290X would struggle to hold a steady 60 fps in Crysis 3 at 4K. Even a GTX 980 is probably not entirely up to the task.
So what are you really trying to say, Vegan? What's the hidden message between the lines? I'm not trying to attack you here, I'm just really curious!
I would like more resolution for Photoshop, mostly for graphic arts reasons; that would even make 8K desirable.
I saw BF3 on 4 GTX Titans that could not hold 60 fps.
I have DisplayPort on my panel now; I simply prefer to use HDMI so I can use $5 switch boxes instead of $150 splitters.
Red makes video cameras with resolutions higher than 4K, and they have cards that work at those resolutions with no GPU to boost performance.
Originally posted by: Vegan I would like more resolution for Photoshop, mostly for graphic arts reasons; that would even make 8K desirable.
For resolutions beyond 4K/5K, I guess DX12's framebuffer stacking capabilities would be a great addition. I'm curious when the first screens with a native 8K resolution will arrive, and what DP and/or HDMI version would be required to drive them.
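As a rough sanity check on that last question, here's a back-of-envelope comparison of the uncompressed bandwidth 4K and 8K need at 60 Hz against approximate effective link rates; the ~10% blanking overhead and the per-link figures are my own assumptions, not anything from this thread.

# Back-of-envelope link bandwidth check. Assumptions (mine): 24-bit RGB,
# 60 Hz, ~10% blanking overhead, and approximate effective data rates
# after 8b/10b encoding for each link.
def video_gbps(width, height, hz, bpp=24, blanking=1.10):
    """Approximate video bandwidth in Gbit/s, including blanking."""
    return width * height * hz * bpp * blanking / 1e9

links_gbps = {            # effective data rate, approximate
    "HDMI 1.4":  8.16,    # 10.2 Gbps total
    "HDMI 2.0": 14.40,    # 18.0 Gbps total
    "DP 1.2":   17.28,    # 21.6 Gbps total (HBR2)
    "DP 1.3":   25.92,    # 32.4 Gbps total (HBR3)
}

for name, (w, h) in {"4K": (3840, 2160), "8K": (7680, 4320)}.items():
    need = video_gbps(w, h, 60)
    fits = [link for link, cap in links_gbps.items() if cap >= need]
    print(f"{name}@60Hz needs ~{need:.1f} Gbps -> fits on: {fits or 'none'}")

If those figures are roughly right, 4K@60Hz (~13 Gbps) is exactly why HDMI 2.0 matters over 1.4, and a native 8K@60Hz panel (~53 Gbps) would exceed even DP 1.3 on a single cable, so it would need compression or multiple links.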
Originally posted by: Vegan I saw BF3 on 4 GTX Titans that could not hold 60 fps.
Titans are overrated and overpriced. The first Titan was even beaten by Nvidia's own 780 Ti, and by a single R9 290X. But apparently they're just like Apple's expensive smartwatch toys: people buy them anyway.
Originally posted by: Vegan I have DisplayPort on my panel now; I simply prefer to use HDMI so I can use $5 switch boxes instead of $150 splitters.
Ah, so that's what you were getting at with HDMI 2.0. I didn't know those splitters were that expensive.
Originally posted by: Vegan Red makes video cameras with resolutions higher than 4K, and they have cards that work at those resolutions with no GPU to boost performance.
I'm curious whether that's even useful. For photo cameras there's not much reason to go beyond roughly 10 megapixels, because it's harder to get a sharp image at those resolutions. Correct me if I'm wrong, but my understanding is that at resolutions beyond about 10 MP, anomalies in the lens, in the surroundings (like dust particles), motion blur, etc. become more apparent, so you end up with high-resolution blur. Something similar might apply to film cameras.
But maybe there are highly advanced cameras with some very expensive magic tricks that make sharp 16K pictures possible.
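For what it's worth, there's a physical basis for that intuition: diffraction. A minimal sketch, assuming a full-frame 36x24 mm sensor, green light (~0.55 um), an f/8 aperture, and the textbook Airy-disk approximation (blur diameter ≈ 2.44 × wavelength × f-number); none of these numbers come from any particular camera.

import math

# Sketch of the diffraction argument. Assumptions (mine): full-frame
# 36x24 mm sensor, green light, and the standard Airy-disk approximation.
SENSOR_W_MM, SENSOR_H_MM = 36.0, 24.0
WAVELENGTH_UM = 0.55

def pixel_pitch_um(megapixels):
    """Side length of one (square) pixel in micrometres."""
    area_um2 = SENSOR_W_MM * SENSOR_H_MM * 1e6   # mm^2 -> um^2
    return math.sqrt(area_um2 / (megapixels * 1e6))

def airy_diameter_um(f_number):
    """Diameter of the diffraction blur spot in micrometres."""
    return 2.44 * WAVELENGTH_UM * f_number

print(f"Blur spot at f/8: ~{airy_diameter_um(8):.1f} um")
for mp in (10, 16, 36, 50):
    print(f"{mp:>2} MP -> pixel pitch ~{pixel_pitch_um(mp):.1f} um")

At about 10 MP the pixel pitch (~9.3 um) already matches the f/8 blur spot (~10.7 um), so past that point extra pixels mostly record a sharper picture of the same blur. A larger sensor or a wider aperture raises that ceiling, which is presumably part of how high-end cinema gear gets away with it.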
My current camera is 16 megapixels, and there are now models with even higher-resolution CCD sensors.
It appears that camera resolutions are going to keep increasing to even more extreme levels.
This is driven more by print requirements than by screen capability (rough numbers below).
Check out Red.
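Rough numbers, assuming the common 300 DPI target for quality photo prints; the 4928x3264 layout is just one typical 16 MP 3:2 example, not any specific camera.

# Print-size math under an assumed 300 DPI quality target.
def print_size_inches(width_px, height_px, dpi=300):
    return width_px / dpi, height_px / dpi

w_in, h_in = print_size_inches(4928, 3264)   # ~16 MP, 3:2
print(f"16 MP -> about {w_in:.1f} x {h_in:.1f} inches at 300 DPI")

# Going the other way: a 24x36 inch poster at 300 DPI
print(f"24x36 inch poster at 300 DPI: {24*300} x {36*300} px "
      f"= {24*300*36*300/1e6:.0f} MP")

So a 16 MP sensor tops out around a 16x11 inch print at full quality, and poster sizes need an order of magnitude more pixels, which fits the claim that print, not screens, keeps pushing sensor resolution.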