I have a laptop (an HP ProBook 450, model H0V97EA). It comes with an Intel® Core™ i5-3230M processor (2.60 GHz, dual-core, 3rd generation, x64), an integrated video card (Intel HD Graphics 4000), plus a dedicated GPU (AMD Radeon HD 8600/8700).
I only noticed recently that I have two different cards, and I am having a hard time getting the dedicated card to work. I originally used the drivers that Windows (I'm on Windows 8.1) automatically finds and installs at startup. Since then I've tried manually finding the latest version of the driver, then the AMD Driver Autodetect Utility, and finally the Crimson Edition graphics driver installer. I used Display Driver Uninstaller to remove the old drivers before each install.
Right now the AMD driver version is 15.300.1025.0; Radeon Crimson reports software version 15.11. The Intel video card driver version is 10.18.10.4276.
I used the Catalyst Control Center to set the laptop to high performance, and I have also set specific applications to run on high performance. Yet the laptop still seems to be using the integrated Intel card. Graphics load just fine, but the machine soon overheats and starts stuttering and lagging. I'm trying to run Fallout 3, and at the launcher screen where you can choose which card to use, I'm only given the Intel option. Yet when I open the switchable graphics application monitor, it says the game is running on high performance.
If I disable the Intel HD Graphics 4000 driver in Device Manager, the game says that I only have the "Windows default driver" and won't play.
I have checked in the BIOS. The only relevant option I can see is a checkbox for switchable graphics, which is already ticked.
Does anybody have an idea what to do?
Is it possible my dedicated GPU has somehow been damaged, even though it's detected and has a driver installed?