
General Discussions

Tcll
Adept I

Information needed for manually building a custom xorg.conf

Originally I'd posted about this in Drivers & Software and requested it be moved here...
but after pondering and learning a few things, I'd actually like that post removed please. 🙂

As a better way to rewrite that entire post: I have a Strix (R9 390) I just got set up, as well as a stronger Nvidia GPU whose output is now disabled by the Strix. Both drivers are installed, and my XFCE genmon script still displays temp and fan speed for the Nvidia GPU via nvidia-smi.
(credit to @elstaci for helping me out (easing discomforts) and dealing with me, as well as recommending me to post here)

I'm currently limited to HDMI ports: my secondary monitor uses an HDMI-to-VGA adapter on my Nvidia card, while my primary monitor uses HDMI on my Strix...
but the highest my VGA monitor will go is 1280x1024, so I need to upscale like I did in this image of an older 3200x1200 setup with Nvidia:
https://ipfs.io/ipfs/QmacXrdR2RFbj2bv34qVsN1FzmvB9Lh5TsVBj9TnByCq2N
^ modern Nvidia drivers break upscaling with XFCE, which is why the root window doesn't fully extend there; legacy drivers still work properly though...

... anyways (sorry about my autism), what I'd like to do right now is structure my xorg.conf for that second monitor on my Nvidia GPU...
I know it's possible; I just don't know how, let alone how to properly structure an xorg.conf file in the first place (no actual guide found, unless I'm overlooking something: only option reference lists like this: https://www.computerhope.com/unix/xorg.htm )... 😛
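for reference, here's roughly the skeleton I've pieced together from those option lists; the BusID values, driver names, and identifiers are just placeholders for my setup (the real PCI addresses come from lspci), so treat this as a sketch of the structure rather than a known-good config:

Section "Device"
    Identifier "StrixCard"
    Driver     "amdgpu"        # could be "radeon" instead, depending on which kernel driver the R9 390 is on
    BusID      "PCI:1:0:0"     # placeholder; note lspci prints hex but BusID wants decimal
EndSection

Section "Device"
    Identifier "NvidiaCard"
    Driver     "nvidia"
    BusID      "PCI:2:0:0"     # placeholder
EndSection

Section "Monitor"
    Identifier "VGAMonitor"
    Option     "PreferredMode" "1280x1024"
EndSection

Section "Screen"
    Identifier "StrixScreen"
    Device     "StrixCard"
EndSection

Section "Screen"
    Identifier "NvidiaScreen"
    Device     "NvidiaCard"
    Monitor    "VGAMonitor"
EndSection

Section "ServerLayout"
    Identifier "DualGPU"
    Screen 0 "StrixScreen"
    Screen 1 "NvidiaScreen" RightOf "StrixScreen"
EndSection

as far as I can tell, this gives two separate X screens (one per GPU) rather than one merged desktop; merging them into one is Xinerama/PRIME territory, which is its own headache...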

Eventually I want to move that secondary VGA monitor to my Strix once I get a DP-to-VGA adapter, and then somehow configure application profiles through nvidia-settings so apps render on the Nvidia GPU while the Strix only displays the result (no info found on this either, though people say it's possible).
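from what I've read, the first sanity check for that idea is just asking X what render providers it can see (stock xrandr, nothing driver-specific):

xrandr --listproviders

if both the Strix and the Nvidia card show up as providers, then offloading between them should at least be plumbed in...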

I have a use case for the weaker Strix as a primary display driver, though that use case won't come until later...
may as well go through the headaches now and get everything set up for then 😉

also, one last thing...
if there's a way to get more direct status for both my GPUs than a frontend like nvidia-smi, I'd love to use that in my genmon script (sketch after this paragraph) 😉
(this includes controlling fan speed and potentially GPU performance levels)
all my genmon script does is display GPU status at the left of my top panel, updated 4 times a second
(eventually I want to build a Python graph to be far less of a resource hog)
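sketch of what I mean by "more direct": the amdgpu kernel driver exposes raw sensors through sysfs, so the script could read files instead of shelling out to a frontend... the card number and hwmon index vary per machine, so these paths are placeholders:

# temperature (millidegrees C) and fan speed (RPM) from the amdgpu hwmon interface
cat /sys/class/drm/card0/device/hwmon/hwmon*/temp1_input
cat /sys/class/drm/card0/device/hwmon/hwmon*/fan1_input

# fan control (needs root): pwm1_enable 1 = manual, 2 = automatic; pwm1 is 0-255
echo 1 > /sys/class/drm/card0/device/hwmon/hwmon*/pwm1_enable
echo 128 > /sys/class/drm/card0/device/hwmon/hwmon*/pwm1

# performance level: auto, low, high, or manual
cat /sys/class/drm/card0/device/power_dpm_force_performance_level

no idea if there's anything this clean on the Nvidia side; nvidia-smi may still be the practical option there...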

Thanks for everything! 🙂

Tcll
Adept I

alright, I've solved the major issue on my end and bought myself 2 new 27" monitors
got ripped off by HP: I paid more expecting more value than advertised, but sadly only got 1080p_60 u.u
it's a downgrade from 1200p_75, but whatever... at least the color is accurate... though the DPI is window-screen quality
it's exactly the monitor I was trying to avoid buying 😄
thankfully it's only temporary =3=

anyways, I now have both monitors running off the Strix while my Nvidia card sits idle...
how can I downscale 1440p to 1080p like I could with Nvidia??

the main problem I have with these monitors, aside from the garbage 60Hz, is that everything is now so huge and in-your-face
not sure why a 27" panel can't do 1440p natively, but I know xorg can absolutely downscale (sketch below)...
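the downscale trick on the Nvidia side was, as far as I can tell, just xrandr's transform scaling, which should be driver-agnostic... something like this renders a 2560x1440 desktop and squeezes it onto the native 1080p mode (the output name is a placeholder; the real one comes from xrandr -q):

# 2560/1920 = 1440/1080 = 1.3333
xrandr --output HDMI-A-0 --mode 1920x1080 --scale 1.3333x1.3333

haven't confirmed this on amdgpu myself yet, so consider it a sketch...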

if I can get a dual 1440p setup going, I should be set...
I just need to figure out how to make games render on the Nvidia card, passing the result over PCIe to the Strix to display

would anyone have any references for both??
thanks 🙂

oh hey, I found this for offloading processes to my nvidia GPU:
https://wiki.archlinux.org/index.php/PRIME#PRIME_GPU_offloading
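the usage on that page looks simple enough... with the open-source stack it's the DRI_PRIME variable, and with the proprietary nvidia driver (435.xx and newer) it's the __NV_PRIME_RENDER_OFFLOAD variables instead; which one applies here depends on how the drivers end up configured, so here are both:

# open-source drivers: render this program on the secondary GPU
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"

# proprietary nvidia driver: render offload to the nvidia card
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep vendor

if the renderer string comes back as the Nvidia card, then the render-on-Nvidia, display-on-Strix path works...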

though my friend tells me he and a friend of his experienced low performance with this...
but if it works well for me, then it solves that issue here...

hopefully it works as I expect it to:

CPU (game)
|
x16
|
nvidia (render game)
|
x8
|
strix (merge render result with desktop)
|
display