Gentoo Forums

[SOLVED] Configuring two monitors sharing an NVIDIA GPU in X

TheMasterBuilder
n00b

Joined: 04 Nov 2020
Posts: 5

PostPosted: Wed Jun 01, 2022 12:43 am    Post subject: [SOLVED] Configuring two monitors sharing an NVIDIA GPU in X

Hi,

I have two physical monitors connected to my NVIDIA GeForce GTX 1070 GPU (they form one virtual screen in X), one via HDMI and the other via DisplayPort. The monitors are different sizes with different resolutions. X detects the HDMI one (which has the lower resolution) as the primary monitor and sets the DPI (globally, annoyingly) to the value appropriate for it. I then have an xrandr command in my .xinitrc that scales the DisplayPort monitor by a crazy ratio I calculated once; it looks a little rough but not blurry, so I can live with it. However, I want the DisplayPort monitor to be my main monitor, and I would like X to detect it as the primary monitor so that I don't need to calculate and set the DPI manually.
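
(The crazy ratio is just the two panels' DPI ratio. As a made-up example, a 27" 4K panel next to a 24" 1080p panel works out to 3840 px / ~23.5" ≈ 163 DPI versus 1920 px / ~20.9" ≈ 92 DPI, so the low-DPI side needs a scale of roughly 163/92 ≈ 1.77. My monitors' numbers differ, but that's the kind of calculation I mean.)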

Any help is greatly appreciated.


Last edited by TheMasterBuilder on Sun Jun 05, 2022 7:12 pm; edited 1 time in total
NeddySeagoon
Administrator

Joined: 05 Jul 2003
Posts: 54234
Location: 56N 3W

PostPosted: Sun Jun 05, 2022 10:57 am    Post subject:

TheMasterBuilder,

Try https://wiki.gentoo.org/wiki/Xorg/Multiple_monitors and ask questions.
_________________
Regards,

NeddySeagoon

Computer users fall into two groups:-
those that do backups
those that have never had a hard drive fail.
TheMasterBuilder
n00b

Joined: 04 Nov 2020
Posts: 5

PostPosted: Sun Jun 05, 2022 12:15 pm    Post subject:

Well, the problem is that I don't know what the problem is or what questions to ask. I had already read the article you linked, and I think the section relevant to my problem is https://wiki.gentoo.org/wiki/Xorg/Multiple_monitors#Different_resolutions. However, it provides absolutely no information about how the problem was diagnosed or how the solution was devised, other than that "there is nothing user friendly about the process", so it's not particularly useful.

I did manage to get somewhat far on my own by starting the X server with a DPI of 160 and then issuing the following command from .xinitrc:

Code:
xrandr --fb 6720x3780 \
       --output DP-0   --pos 2880x0 --mode 3840x2160 --scale 1   --primary \
       --output HDMI-0 --pos 0x0    --mode 1920x1080 --scale 1.5


After also changing the Xft DPI everything seems to be in order; however, everything looks a little fuzzy/grainy. Afterwards I tried unplugging the HDMI monitor and letting X detect and configure everything automatically, as if the DisplayPort monitor were the only monitor, and the problem still persists.
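
For reference, this is how I'm changing the Xft DPI (the 160 is just the value I'm currently trying):

Code:
echo "Xft.dpi: 160" | xrdb -merge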

Does this seem like a problem with X or the GPU driver?
Back to top
View user's profile Send private message
NeddySeagoon
Administrator

Joined: 05 Jul 2003
Posts: 54234
Location: 56N 3W

PostPosted: Sun Jun 05, 2022 12:46 pm    Post subject:

TheMasterBuilder,

Once upon a time, Xorg (XFree86 as it was then) did not have any autodetection at all.
Every install required the user to write an xorg.conf. Occasionally, users would destroy a CRT display by getting it wrong, too.
There were assorted tools, like
Code:
X -configure
to do some of the guessing, but that has long been broken too.

The automation does a reasonable job of setting up a single display for you as long as you can tolerate the default HID settings.

Anything more complex requires some xorg.conf fragments to describe what you need.
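
For example, a fragment along these lines (untested here, and assuming your card really names the output DP-0 as your xrandr line suggests; the Identifier strings are arbitrary) should make the DisplayPort output the primary monitor at startup:

Code:
# e.g. /etc/X11/xorg.conf.d/10-monitors.conf (file name is just a suggestion)
Section "Monitor"
    Identifier "DP0-monitor"
    Option     "Primary" "true"
EndSection

Section "Device"
    Identifier "nvidia-card"
    Driver     "nvidia"
    Option     "Monitor-DP-0" "DP0-monitor"
EndSection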

When you run a flat panel display at anything other than its native resolution (or an integer fraction of it), something somewhere has to do image scaling.
This results in one pixel in the graphics card not mapping to a whole number of screen pixels.
Both displays and graphics cards can do it, but it always looks a mess. That's a feature of discrete pixels.
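
Take your HDMI output as a worked example: --scale 1.5 on a 1920x1080 panel paints a 2880x1620 region of the framebuffer onto 1920x1080 physical pixels, i.e. 1.5 framebuffer pixels per physical pixel in each direction. Every other pixel boundary lands mid-pixel and has to be interpolated, which shows up as fuzziness.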

I agree that it's difficult to pose questions about things you don't completely understand, and documentation is mostly useless too.
Man pages and the like tell you how to use something, not how to solve a particular problem.

Your xrandr command is not wrong. You can write the same thing into an xorg.conf fragment if you like, along the lines of the one above.
Either way, it cannot fix the pixel mapping feature.
_________________
Regards,

NeddySeagoon

Computer users fall into two groups:-
those that do backups
those that have never had a hard drive fail.
TheMasterBuilder
n00b

Joined: 04 Nov 2020
Posts: 5

PostPosted: Sun Jun 05, 2022 1:01 pm    Post subject:

NeddySeagoon wrote:
When you run a flat panel display at anything other than its native resolution (or integer fractions), something somewhere has to do image scaling.
This results in one pixel in the graphics card not mapping to a whole number of screen pixels.


I recognize this problem. However, I am running the 4K monitor at its native resolution (I tried both actual size and scaled to 200%, which is what I would want to use anyway) and setting the correct DPI. So it shouldn't need to do any (fractional) scaling for the DP monitor, right?

I think maybe I'm not setting the Xft DPI correctly. I tried a bunch of different values to see if I could find the right one by trial and error (as I understand it, it's not the same as the monitor DPI), but I haven't been successful yet. Almost everything on my system is rendered by Pango, which AFAIK uses this value. I'm not sure how to determine the correct value to put here; I haven't tried fractional values yet, though, so it's possible that steps of one were not fine enough granularity.
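
In case it helps anyone else, this is how I'm now estimating the value: xrandr reports each output's physical size in millimetres, so the real DPI is horizontal pixels divided by (width in mm / 25.4). For example (the millimetre figures here are made up):

Code:
$ xrandr | grep " connected"
DP-0 connected primary 3840x2160+2880+0 (normal left inverted right x axis y axis) 597mm x 336mm

which would give 3840 / (597 / 25.4) ≈ 163 DPI.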

EDIT: Okay, I think I found the problem. X is detecting the wrong resolution based on the monitor's EDID and calculating a DPI from that, even though the nvidia-auto-select mode actually has the correct resolution and refresh rate. So that's why automatically configuring the monitor doesn't work and the DPI needs to be set manually.
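
For anyone who finds this later: you can check what DPI the driver computed with

Code:
grep -i dpi /var/log/Xorg.0.log

and if it's wrong, the nvidia driver's Option "UseEdidDpi" "false" combined with Option "DPI" "163 x 163" (my value; substitute your own) in the Device section overrides the EDID-derived figure.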

EDIT2: I'm going to consider this issue closed, since I managed to settle on an acceptable configuration.