dgolden

OpenCL on both GPUs of a HD4870x2 card under Linux

The important thing when doing OpenCL on both GPUs of a HD4870x2, at least at the time of writing, is NOT to have the usual internal CrossFire chain enabled, or even defined.

This is just the situation at the time of writing: I guess AMD devs are busily working on making OpenCL work in the presence of CrossFire.

I could be plain wrong on various points; I'm just doing this brain dump now in the vague hope it's helpful to others.  I'm hoping to get a later-gen card ASAP, and then I guess I'll stop caring about the old beast. Maybe similar info still applies to the newer dual-GPU AMD cards, I don't know.

In amd-app-sdk 2.4 / fglrx 11.5, apparently still only one GPU of this by-now-old (and power-hungry) dual-GPU card is "beta"-supported for OpenCL use - at least, as far as I can make out from the compatibility table footnotes.

But let's say you want to use both your GPUs (call them GPU0 and GPU1) for OpenCL anyway, out of sheer bloody-mindedness.

If you try it naively, having seen both GPUs apparently show up as separate OpenCL devices in "clinfo", you'll probably find things seem to work okay for a bit, but no faster than on one GPU; then the kernel you have running on GPU1 will start spouting gibberish and everything will go horribly wrong.
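
(Incidentally, you don't strictly need clinfo for the "two devices" check. Here's a minimal C sketch of my own - not AMD sample code, and untested as written - using only the standard OpenCL 1.x enumeration calls; build against the APP SDK with something like "gcc list_gpus.c -lOpenCL". Once things are set up as described below, it should report two GPU devices.)

/* list_gpus.c - list the GPU devices the OpenCL runtime exposes,
 * roughly what clinfo shows. */
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platform;
    cl_device_id devices[8];
    cl_uint ndev = 0;
    char name[256];

    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS) {
        fprintf(stderr, "no OpenCL platform found\n");
        return 1;
    }
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 8, devices, &ndev) != CL_SUCCESS) {
        fprintf(stderr, "no GPU devices found\n");
        return 1;
    }
    printf("%u GPU device(s)\n", ndev);
    for (cl_uint i = 0; i < ndev; i++) {
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
        printf("  GPU%u: %s\n", i, name);
    }
    return 0;
}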

If the gibberish-on-GPU1 thing has happened to you, you've likely got CrossFire enabled, which is a reasonable enough default configuration for graphical uses of such a dual-GPU card! Check whether a CrossFire chain is defined and enabled with "aticonfig --lsch".

"Avoid CrossFire" has been sort of common knowledge in AMD OpenCL land, but unlike two single GPU cards, it may not be all that obvious how to get a HD4870x2 to ...not be CrossFire-y...  If you try to use the relevant "aticonfig --cfd" options to apparently remove it, for example, it'll tell you to restart X11, and when you do, it'll just spring back into existence again.

So, the trick is to have an "/etc/X11/xorg.conf" that lists both your GPUs as separate Devices with different BusIDs (check "lspci"), each with its own X11 Screen, and with both Screens active in the active ServerLayout. "aticonfig --initial -f --adapter=all" MAY do that, but you may be better off hand-constructing it, or at least inspecting and hand-tuning whatever it constructs.

On my two-output HD4870x2 card at least, it seems GPU0 drives all the real outputs (DFP1/DFP2, say).  GPU1 just appears to see a single sort of fake CRT1 that takes on the dimensions of the first output on GPU0. But you must configure a separate X11 Screen on it anyway; if you don't, it'll apparently get auto-added to the CrossFire chain. You'll want something vaguely like the following example (please bear in mind this is not a complete xorg.conf; I'm assuming you're familiar enough with it to extrapolate the rest). Note that xrandr multihead is still usable on Screen0, for multiple connected LCD panels.  Screen1 exists but is invisible (unless you mess about with x11vnc, i.e. a VNC server exporting an existing X11 screen):


Section "ServerLayout"
    Identifier "Default Layout"
    Screen 0 "Screen0" 0 0
    Screen 1 "Screen1" RightOf "Screen0"
    InputDevice "Keyboard0"
    InputDevice "Mouse0"
    Option "Xinerama" "off"
EndSection

Section "Monitor"
     Identifier "0-CRT1"
     Option "VendorName" "ATI Proprietary Driver"
     Option "ModelName" "Generic Autodetecting Monitor"
     Option "DPMS" "true"
     Option "Disable" "true"
EndSection

Section "Monitor"
     Identifier "0-CRT2"
     Option "VendorName" "ATI Proprietary Driver"
     Option "ModelName" "Generic Autodetecting Monitor"
     Option "DPMS" "true"
     Option "Disable" "true"
EndSection

Section "Monitor"
    Identifier "0-DFP1"
    Option "VendorName" "ATI Proprietary Driver"
    Option "ModelName" "Generic Autodetecting Monitor"
    Option "DPMS" "true"
    Option "PreferredMode" "1024x768"
    Option "Position" "0 0"
    Option "Disable" "false"
EndSection

Section "Monitor"
    Identifier "0-DFP2"
    Option "VendorName" "ATI Proprietary Driver"
    Option "ModelName" "Generic Autodetecting Monitor"
    Option "DPMS" "true"
    Option "PreferredMode" "1024x768"
    Option "Position" "1024 0"
    Option "Disable" "false"
EndSection

Section "Monitor"
    Identifier "1-CRT1"
    Option "VendorName" "ATI Proprietary Driver"
    Option "ModelName" "Generic Autodetecting Monitor"
    Option "DPMS" "true"
    Option "Disable" "true" # this doesn't stick...
EndSection

Section "Device"
   Identifier "Card0"
   Driver "fglrx"
   Option "Monitor-CRT1" "0-CRT1"
   Option "Monitor-CRT2" "0-CRT2"
   Option "Monitor-DFP1" "0-DFP1"
   Option "Monitor-DFP2" "0-DFP2"
   BusID "PCI:7:0:0"
EndSection

Section "Device"
   Identifier "Card1"
   Driver "fglrx"
   Option "Monitor-CRT1" "1-CRT1"
   BusID "PCI:8:0:0"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device "Card0"
    DefaultDepth 24
    SubSection "Display"
       Virtual 2048 1024
       Depth 24
    EndSubSection
EndSection

Section "Screen"
    Identifier "Screen1"
    Device "Card1"
    DefaultDepth 24
    SubSection "Display"
       Virtual 2048 1024
       Depth 24
    EndSubSection
EndSection



You'll know your de-crossfiring has worked when you can get something like this:

$ aticonfig --lsch
No CrossFire chains defined

$ aticonfig --lscc

Master adapter:  0. 07:00.0 ATI Radeon HD 4870 X2
    Candidates:  none
Master adapter:  1. 08:00.0 ATI Radeon HD 4870 X2
    Candidates:  none

Once you get that magic CrossFire-chain-less diagnostic output, you can successfully use both GPU devices at once with OpenCL. (Make sure to set COMPUTE=:0, not :0.0 or :0.1, as you'll likely have DISPLAY=:0.0.)
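
For completeness, here's roughly what "using both GPU devices at once" looks like on the host side: one context spanning both devices, with a separate command queue per GPU so you can enqueue kernels on each independently. Again this is just my own sketch against the plain OpenCL 1.x API (error handling mostly omitted), not anything blessed by AMD; run the resulting program with COMPUTE=:0 set in the environment as above.

/* use_both.c - sketch only: one context spanning both GPUs,
 * one command queue per GPU. */
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platform;
    cl_device_id dev[2];
    cl_uint ndev = 0;
    cl_int err;

    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 2, dev, &ndev);
    if (ndev < 2) {
        fprintf(stderr, "only %u GPU visible - CrossFire chain still around?\n", ndev);
        return 1;
    }

    /* one context shared by both devices... */
    cl_context ctx = clCreateContext(NULL, ndev, dev, NULL, NULL, &err);

    /* ...but a separate in-order queue per GPU, so each runs its own kernels */
    cl_command_queue q0 = clCreateCommandQueue(ctx, dev[0], 0, &err);
    cl_command_queue q1 = clCreateCommandQueue(ctx, dev[1], 0, &err);

    /* build your program, then clEnqueueNDRangeKernel(q0, ...) and
     * clEnqueueNDRangeKernel(q1, ...) with each GPU's share of the work */

    clReleaseCommandQueue(q1);
    clReleaseCommandQueue(q0);
    clReleaseContext(ctx);
    return 0;
}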

Strangely, the clock speed doesn't seem to ramp up automatically no matter how much OpenCL work you throw at it, so you may want to use the "amdcccle" GUI option "3D->More Settings->Force Maximum Performance Clocks" (you'll have to use the screen selector widget in the upper right to do that for both X11 screens, i.e. both GPUs, independently).  And/or maybe you could try fiddling with aticonfig's overdrive options, but on your own head be it - be careful.  I've stupidly managed to let my card get pretty hot on occasion, which is not wise - be sure to run "aticonfig --adapter=all --od-gettemperature" from time to time.

 

Note the above is totally NOT the xorg.conf you want for OpenGL 3D gfx with a HD4870x2!!!

To use your two GPUs in CrossFire mode for Linux OpenGL software like quake4, etc.*, you want just one Device section, with GPU0 listed, or at least the second Device _not_ associated with any active X11 Screen section in an active ServerLayout section. That should wind up giving you:

$ aticonfig --lsch

CrossFire chain for adapter 0, status: enabled
  0. 07:00.0 ATI Radeon HD 4870 X2
  1. 08:00.0 ATI Radeon HD 4870 X2

(Chances are the CrossFire chain will also be auto-defined and auto-enabled, but if not, try the relevant aticonfig options to recreate it.)

* Apparently only a very limited selection of fullscreen Linux/X11 OpenGL apps are officially tested and enabled to work in CrossFire mode. However, I have found that some (presumably thoroughly AMD-unsupported) messing around with "/etc/ati/atiogl.xml" can be used to try CrossFire rendering for other OpenGL apps.  This may or may not be a win depending on the individual app/game, presumably as per the CrossFire writeup on the Radeon developer docs site. (Try e.g. logging the output of a loop calling "aticonfig --adapter=all --od-getclocks" to see whether a given fullscreen OpenGL program is hitting both of your GPUs.  Even if it is, that doesn't mean it's benefiting from doing so.)
