Currently I have two LCD monitors, one connected via DVI and one via analog. For some reason when I configured dual head, it decided that the analog was screen0 and the DVI was screen1, and I haven't been able to change this. Any way to swap these?
It's a known problem with the nvidia drivers. When using twinview, if there is an analog device and a DVI device connected, the driver makes the analog the primary screen no matter what.
It worked for me, though I have 2 VGAs and the first one is connected via a DVI2VGA plug converter...
the map is not the territory.
Military plans are at best 50% reliable; if more, war would only have winners.
Athlon64 X2 4200+, Gigabyte s969 NForce3 Mainboard, 1GB Ram, NVidia 6600 GT 128MB graphic card, Gentoo 2005.1 for X86_64
CaptainDangeax wrote: I have 2 VGAs and the first one is connected via a DVI2VGA plug converter.
That is the critical difference. When you have one digital/DVI monitor and one analog CRT or LCD monitor, the driver always makes the analog monitor the primary. Someone from nvidia admits (in the forum post I linked) that there is no way to set a "primary monitor" in this case. If you have two analog or two digital monitors, then I think you will not see this problem.
See: nvidia driver overrides my "ConnectedMonitor" "DFP-0, CRT-1"
(**) NVIDIA(0): Option "ConnectedMonitor" "DFP-0, CRT-1"
(**) NVIDIA(0): Option "TwinView" "on"
(**) NVIDIA(0): Option "TwinViewOrientation" "RightOf"
(**) NVIDIA(0): Option "SecondMonitorHorizSync" "30 - 96.0"
(**) NVIDIA(0): Option "SecondMonitorVertRefresh" "50 - 75"
(**) NVIDIA(0): Option "MetaModes" "1024x768,1024x768;1024x768,NULL"
(**) NVIDIA(0): ConnectedMonitor string: "DFP-0, CRT-1"
(**) NVIDIA(0): TwinView enabled
(--) NVIDIA(0): Linear framebuffer at 0xD8000000
(--) NVIDIA(0): MMIO registers at 0xF4000000
(II) NVIDIA(0): NVIDIA GPU detected as: GeForce 6600 GT
(--) NVIDIA(0): VideoBIOS: 05.43.02.23.04
(--) NVIDIA(0): Interlaced video modes are supported on this GPU
(II) NVIDIA(0): Detected PCI Express Link width: 16X
(--) NVIDIA(0): VideoRAM: 131072 kBytes
(II) NVIDIA(0): Using ConnectedMonitor string "CRT-1, DFP-0"
Since the last time I wrote, I changed my first monitor (the one connected to the DVI port via a DVI->VGA adapter) for a Philips 200W6 DVI-D, and I kept the second standard VGA CRT (LG55V). This config does NOT use Nvidia's twinview capability, but Xorg's own dual-screen (separate X screens) capability. BTW, the two screens have separate desktops, and you can't drag a window from one to the other. I tried Nvidia's twinview, but I don't like that newly opened windows get cut between the 2 screens.
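For anyone wanting to reproduce the separate-X-screens setup described above, a minimal xorg.conf sketch might look like this (the Identifier names and the BusID are assumptions; the key point is two Device sections sharing one card, distinguished by the Screen number, bound into one ServerLayout):

```
# Hypothetical fragment: two independent X screens on a single nvidia card.
Section "Device"
    Identifier "nvidia-dvi"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"    # assumption: adjust to your card's bus ID
    Screen     0
EndSection

Section "Device"
    Identifier "nvidia-vga"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"
    Screen     1
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "nvidia-dvi"
EndSection

Section "Screen"
    Identifier "Screen1"
    Device     "nvidia-vga"
EndSection

Section "ServerLayout"
    Identifier "dual"
    Screen 0 "Screen0"
    Screen 1 "Screen1" RightOf "Screen0"
EndSection
```

As the post says, this gives two separate desktops (:0.0 and :0.1), so windows cannot be dragged between them.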
Yes, if you don't use twinview then that will work. That is expected, because you have two independent desktops. But, as you said, you can't move windows between them.
I have little hope that nvidia will fix the twinview problem. I think I am just going to get two DVI monitors instead and then it won't matter.
Hi again
I recently upgraded the NVidia package from 81.74 to 87.something. Now I'm not able to get dual display working anymore. I can have the VGA as first and the DVI as second, but, guess what? That's not what I want: I want the DVI first. So I went to an NVIDIA TwinView setup, with the DVI on the left at 1680x1050 and the VGA on the right at 1400x1050. I cannot attach my new xorg.conf for the moment; the gentoo box is in use to test Vista (arggh!)
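A TwinView configuration matching those resolutions might look like the sketch below (a hedged guess based on the options visible in the log excerpt earlier in the thread; the sync/refresh ranges are assumptions for the CRT, so check your monitor's specs). Note that whether the ConnectedMonitor order actually makes the DFP primary is exactly the driver bug being discussed:

```
# Hypothetical Screen section for TwinView: 1680x1050 DFP + 1400x1050 CRT.
Section "Screen"
    Identifier "Screen0"
    Device     "nvidia0"
    Option "TwinView" "on"
    Option "TwinViewOrientation" "RightOf"        # CRT to the right of the DFP
    Option "ConnectedMonitor" "DFP-0, CRT-1"
    Option "SecondMonitorHorizSync"   "30-96"     # assumption: CRT's range
    Option "SecondMonitorVertRefresh" "50-75"     # assumption: CRT's range
    Option "MetaModes" "1680x1050,1400x1050; 1680x1050,NULL"
EndSection
```

The second MetaMode ("1680x1050,NULL") gives a fallback with only the DFP active.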