Originally I'd posted about this in Drivers & Software and requested it be moved here...
but after pondering and learning a few things, I'd actually like that post removed please. 🙂
To rewrite that entire post more clearly: I have a Strix (R9 390) I'd just gotten set up, as well as a stronger Nvidia GPU whose output is now disabled by the Strix (both drivers are installed, and my XFCE genmon script still displays the Nvidia GPU's temperature and fan speed via nvidia-smi).
(credit to @elstaci for helping me out, putting up with me, and recommending I post here)
I'm currently limited to HDMI ports: my secondary monitor uses an HDMI-to-VGA adapter on my Nvidia card, while my primary monitor uses HDMI on my Strix...
but the highest resolution my VGA monitor supports is 1280x1024, so I need to upscale like I've done in this image of an older 3200x1200 Nvidia setup here:
https://ipfs.io/ipfs/QmacXrdR2RFbj2bv34qVsN1FzmvB9Lh5TsVBj9TnByCq2N
^ modern Nvidia drivers break upscaling under XFCE, which is why the root window doesn't fully extend; legacy drivers still work properly though...
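For the upscaling itself, a minimal sketch of how I'd compute the `xrandr --scale` factors (the output name `HDMI-1` and the 1600x1200 virtual size are just placeholders for illustration; check your actual output names with plain `xrandr`):

```shell
# Sketch: compute --scale factors so the 1280x1024 VGA panel shows a larger
# virtual desktop via xrandr upscaling.
scale_factors() {
    # args: native_w native_h virtual_w virtual_h
    awk -v nw="$1" -v nh="$2" -v vw="$3" -v vh="$4" \
        'BEGIN { printf "%.4fx%.4f\n", vw / nw, vh / nh }'
}

# Example: treat the 1280x1024 monitor as a 1600x1200 desktop (placeholder
# output name -- substitute your real one from `xrandr`):
#   xrandr --output HDMI-1 --scale "$(scale_factors 1280 1024 1600 1200)"
scale_factors 1280 1024 1600 1200
```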
... anyway (sorry about my autism), what I'd like to do right now is structure my xorg.conf for that second monitor on my Nvidia GPU...
I know it's possible to do; I just don't know how, let alone how to properly structure an xorg.conf file (no guides found (unless I'm overlooking something), only developer reference lists like this: https://www.computerhope.com/unix/xorg.htm )... 😛
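In case it helps anyone answering: here's my rough understanding of the xorg.conf skeleton for this, pieced together from the man page. All the identifiers and the BusID below are placeholders, and "AmdScreen" assumes a matching Screen section already exists for the Strix — I'm not certain this is complete:

```
Section "Device"
    Identifier "NvidiaCard"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"        # placeholder; get the real bus ID from `lspci`
EndSection

Section "Monitor"
    Identifier "VGAMonitor"
EndSection

Section "Screen"
    Identifier "NvidiaScreen"
    Device     "NvidiaCard"
    Monitor    "VGAMonitor"
    SubSection "Display"
        Modes "1280x1024"
    EndSubSection
EndSection

Section "ServerLayout"
    Identifier "Layout0"
    Screen 0 "AmdScreen"          # placeholder for the existing Strix screen
    Screen 1 "NvidiaScreen" RightOf "AmdScreen"
EndSection
```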
Eventually I want to move that secondary VGA monitor to my Strix once I get a DP-to-VGA adapter, and somehow configure application profiles through nvidia-settings so the Nvidia GPU does the rendering while the Strix only displays the result (no info found on this either, though people say it's possible).
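From what I've read, this might not need nvidia-settings profiles at all — Nvidia's driver documents "PRIME render offload" environment variables for exactly this render-on-Nvidia, display-on-other-GPU setup. A minimal per-app sketch (assuming a driver/Xorg combo new enough to support render offload):

```shell
# Sketch: run a single app's rendering on the Nvidia GPU while another card
# drives the display, using Nvidia's documented PRIME render-offload vars.
prime_run() {
    __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia "$@"
}

# e.g.: prime_run glxinfo | grep "OpenGL renderer"
# should report the Nvidia GPU if offload is working
```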
I have a use case for the weaker Strix as a primary display driver, though that use case won't come until later...
may as well go through the headaches now and get everything set up for then 😉
also, one last thing...
if there's a more direct way to read status from both my GPUs than a frontend like nvidia-smi, I'd love to use that in my genmon script 😉
(this includes controlling fan speed and potentially GPU performance levels)
all my genmon script does is display GPU status at the left of my top panel, updated four times a second
(eventually I want to build a Python graph so it hogs far fewer resources)
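For the AMD side at least, I believe the amdgpu kernel driver exposes status directly through the standard hwmon sysfs interface, no frontend needed — something like this sketch (the hwmonN index varies per boot, hence the search by name):

```shell
# Sketch: read amdgpu temperature and fan speed straight from sysfs via the
# kernel's hwmon interface, instead of going through a frontend tool.
amdgpu_status() {
    for hw in /sys/class/hwmon/hwmon*; do
        [ -r "$hw/name" ] || continue
        [ "$(cat "$hw/name")" = "amdgpu" ] || continue
        temp=$(cat "$hw/temp1_input" 2>/dev/null)   # millidegrees Celsius
        fan=$(cat "$hw/fan1_input" 2>/dev/null)     # RPM
        echo "${temp:+$((temp / 1000))C} ${fan:+${fan}rpm}"
        return 0
    done
    echo "amdgpu hwmon not found"
}

amdgpu_status
```

Fan control reportedly lives in the same hwmon directory (switching `pwm1_enable` to manual mode and writing 0-255 to `pwm1`, as root) — I haven't tried that part myself yet, so treat it as hearsay until confirmed.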
Thanks for everything! 🙂