Hmm, that makes no sense, since the HD7990 is supported by the current 16.8.3 drivers...
Have you tried Display Driver Uninstaller (DDU) to nuke your old drivers before installing newer ones? I used to run into bad driver issues upgrading my Nvidia GTX 680 SLI setup; sometimes I'd notice in Device Manager that one GPU's driver never updated, leaving the two GPUs on different driver versions lol.
* Tell it to reboot into safe mode
* Once in safe mode? Select "Clean and restart"
* Once in normal boot? Install driver of choice
I would also try running at stock clocks at first, just to be safe... tbh though? I don't think Doom supports Vulkan's Multi-Display Adapter (so far anyway), so it would be running in single-GPU mode... so until MDA gets added to Doom 2016's Vulkan path? You might be better off in OpenGL mode. Of course I can't test this, as I only have a single R9 390, but I'm happy with my Vsync lock at 60fps since that's all my monitor can do anyway.
Oh, another thing to watch out for in your settings on Doom 2016? Asynchronous Compute is currently only supported with Anti-Aliasing set to OFF or TSSAA! Here's proof of that: Tiago Sousa on Twitter: "https://t.co/LGFzQZTkoO DOOM GL vs Vulkan on an awesome AMD 480. Heads up benchmarkers, use TSS…
Just to clarify, Vulkan and DirectX 12 do NOT have SLI or Crossfire; there is no proprietary multi-GPU support. Both Vulkan and DirectX 12 support Multi-Display Adapter (MDA), but Linked-Display Adapter (LDA) is only supported in DirectX 12.
MDA allows mixing different GPUs with varying success, plus the potential to share video memory. Currently Ashes of the Singularity supports MDA but not shared video memory: it uses Alternate-Frame Rendering (AFR), so video memory is cloned on each GPU. Shared video memory requires split-frame rendering (as seen with AMD's Mantle in Civilization: Beyond Earth).
LDA does not allow mixing different GPUs and communicates between the two cards over CF or SLI bridges (or over PCIe on newer AMD GPUs). It's very similar to classic CF and SLI, yet again requires no proprietary code, so if a DirectX 12 game supports LDA? In theory it should scale and have the exact same compatibility on both AMD and Nvidia GPUs. An example of LDA is 3DMark's Time Spy DirectX 12 benchmark.
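To make the AFR-vs-shared-memory point above concrete, here's a toy sketch (plain Python, not real D3D12/Vulkan API code; the VRAM sizes are made-up example numbers) of why cloning resources limits you to the smallest card while shared memory can, in theory, pool capacity:

```python
# Toy model of multi-GPU video memory under the two modes described above.
# NOT real graphics API code; purely an illustration of cloning vs. pooling.

def effective_vram_afr(gpu_vram_gb):
    """AFR: every resource is cloned on each GPU, so the usable pool
    is limited by the smallest card, not the sum of all cards."""
    return min(gpu_vram_gb)

def effective_vram_shared(gpu_vram_gb):
    """Shared memory with split-frame rendering: each GPU can hold
    different resources, so capacities can (in theory) add up."""
    return sum(gpu_vram_gb)

# Hypothetical mixed MDA setup: an 8 GB card plus a 4 GB card
setup = [8, 4]
print(effective_vram_afr(setup))     # 4  -> AFR clones, smallest card wins
print(effective_vram_shared(setup))  # 12 -> shared memory can pool
```

This is also why AFR games like Ashes report the VRAM of a single card, not the combined total.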
... I know, information overload lol but I wanted to spread some knowledge. Good luck!
Hey! That was a load of info! We need more people like you; while I was reading I could only confirm what you wrote.
Especially the part about DX12 supporting MDA, but it being up to the developer to actually support it.
I thought about the long run, and having drivers with Vulkan and DX12 support is going to be a must. Anyway, I've spent the last 4 hours messing with DDU (which I already knew; it has been a life saver more than once) and with various driver versions (15.11.1, then upgrade, DDU, 16.7.3, downgrade... and so on), and I've managed to install 16.7.3 with that awful Crimson interface (I'll explain why later). The 16.8.3 managed to work too, but all the HUDs were messed up or missing in BF3, BF4 and Black Ops 2; they're fine with 16.7.3.
So technically I'm OK with DX12 and Vulkan?
Radeon Software Version
Radeon Settings Version
Driver Packaging Version
The only thing I'm missing right now are the HOTKEYS in CCC. Running a multi-display setup forces my GPU to 501MHz in 3D mode, idling at 55 °C; with the hotkeys I could switch between a forced 250MHz Core / 150MHz Mem profile and the full-power 1100MHz Core / 1550MHz Mem one. The voltage is almost fine: in 2D it idles at 0.850 V and spikes to 1.210 V (the card tries for 1100MHz even when it's fixed at 250MHz, usually while watching YouTube), but it's fine enough; even at 1.210 V it idles at 38 °C.
So, recapping, I had 2 profiles:
2D: 250MHz Core - 150MHz Mem - 0.850 V (spikes to 1.210 V, thinking it's at 1100MHz)
3D: 1100MHz Core - 1550MHz Mem - 1.210 V
If hotkeys are not an option anymore, I can use MSI Afterburner, which forces the voltage too; the problem with it is that I can only go as low as 550MHz Core and 750MHz Mem...
OT: I have to repaste the GPU almost every month. Temps went from a max of 75 °C to 90+ °C; when I pop the heatsink off, half of the die is clean with no thermal paste on it, like it evaporated. I thought MX-4 was good...
I'm sorry for any errors in my poor English; I'm 17 and I live in Italy, where English is not that common. Feel free to correct me.