bender86 Guru
Joined: 18 Mar 2005 Posts: 484
Posted: Sat Sep 02, 2017 9:45 am Post subject: NVIDIA black screen with both Optimus and NVIDIA-only config |
Hello,
I am trying to set up my XPS 9560 (4K version) to use the discrete NVIDIA GPU, without success.
I followed this guide:
https://wiki.gentoo.org/wiki/NVIDIA/Optimus
But when I run startx I get a black screen. I can still press CTRL+ALT+F1 to switch to another console, or CTRL+ALT+BACKSPACE. As far as I understand, twm starts, but I don't get any output. I have
Code: |
xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto
twm
|
in my .xinitrc file.
My /etc/X11/xorg.conf.d/10-nvidia.conf is
Code: |
Section "Module"
Load "modesetting"
EndSection
Section "ServerLayout"
Identifier "layout"
Screen 0 "nvidia"
Inactive "intel"
EndSection
Section "Device"
Identifier "nvidia"
Driver "nvidia"
BusID "PCI:1:0:0"
#Option "UseEdidDpi" "False"
#Option "DPI" "282 x 282"
EndSection
Section "Screen"
Identifier "nvidia"
Device "nvidia"
Option "AllowEmptyInitialConfiguration"
EndSection
Section "Device"
Identifier "intel"
Driver "modesetting"
#BusID "PCI:0:2:0"
#Option "AccelMethod" "sna"
EndSection
Section "Screen"
Identifier "intel"
Device "intel"
EndSection
|
The only error I have in Xorg.0.log is related to glamor/GL; the glamor USE flag is enabled.
If I put xrandr --listproviders > file.txt in .xinitrc I get the expected output:
Code: | Providers: number : 2
Provider 0: id: 0x1fa cap: 0x1, Source Output crtcs: 0 outputs: 0 associated providers: 0 name:NVIDIA-0
Provider 1: id: 0x47 cap: 0x2, Sink Output crtcs: 3 outputs: 5 associated providers: 0 name:modesetting
|
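As a quick sanity check on output like the above, the provider count can be pulled out of a saved file. This is just a convenience sketch; the sample data is the listing from the post, and the file name file.txt is the one from the command above:

```shell
# Saved output of `xrandr --listproviders` (sample data from the post above).
cat > file.txt <<'EOF'
Providers: number : 2
Provider 0: id: 0x1fa cap: 0x1, Source Output crtcs: 0 outputs: 0 associated providers: 0 name:NVIDIA-0
Provider 1: id: 0x47 cap: 0x2, Sink Output crtcs: 3 outputs: 5 associated providers: 0 name:modesetting
EOF
# Count "Provider N:" lines; 2 means both GPUs are visible to Xorg.
grep -c '^Provider ' file.txt   # prints 2
```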
Since it looks like the problem is the copy between the integrated Intel GPU and discrete NVIDIA GPU, I tried to run an NVIDIA-only setup, with this xorg.conf generated by nvidia-xconfig:
Code: | Section "ServerLayout"
Identifier "Layout0"
Screen 0 "Screen0"
InputDevice "Keyboard0" "CoreKeyboard"
InputDevice "Mouse0" "CorePointer"
EndSection
Section "Files"
EndSection
Section "InputDevice"
# generated from data in "/etc/conf.d/gpm"
Identifier "Mouse0"
Driver "mouse"
Option "Protocol"
Option "Device" "/dev/input/mice"
Option "Emulate3Buttons" "no"
Option "ZAxisMapping" "4 5"
EndSection
Section "InputDevice"
# generated from default
Identifier "Keyboard0"
Driver "kbd"
EndSection
Section "Monitor"
Identifier "Monitor0"
VendorName "Unknown"
ModelName "Unknown"
HorizSync 28.0 - 33.0
VertRefresh 43.0 - 72.0
Option "DPMS"
EndSection
Section "Device"
Identifier "Device0"
Driver "nvidia"
VendorName "NVIDIA Corporation"
BusID "PCI:1:0:0" # This is the only change I did
EndSection
Section "Screen"
Identifier "Screen0"
Device "Device0"
Monitor "Monitor0"
DefaultDepth 24
SubSection "Display"
Depth 24
EndSubSection
EndSection
|
But the outcome is always a black screen.
Relevant versions:
Code: | [ebuild R ] sys-kernel/gentoo-sources-4.12.5:4.12.5::gentoo USE="-build -experimental -symlink" 96,989 KiB
[binary R ] x11-base/xorg-server-1.19.3:0/1.19.3::gentoo USE="glamor ipv6 kdrive suid systemd udev xorg -debug -dmx -doc (-libressl) -minimal (-selinux) -static-libs -tslib -unwind -wayland -xcsecurity -xephyr -xnest -xvfb" 0 KiB
[binary R ] x11-drivers/nvidia-drivers-384.59-r1:0/384::gentoo USE="X acpi compat driver gtk3 kms multilib tools uvm -pax_kernel -static-libs -wayland" ABI_X86="32 (64) (-x32)" 0 KiB
|
Relevant kernel command line:
Code: |
acpi_rev_override=5 # Plus non-relevant root and init parameters
|
Kernel configuration should comply with the guide. Let me know if I should provide some more information about anything.
I found some information online (like this), but most people have trouble setting up Optimus to get energy saving; it seems a plain NVIDIA-only setup should just work. Moreover, I ran into a number of kernel parameters that might or might not be obsolete (modeset=1, acpi_rev_override=1, acpi_rev_override=5, acpi_os=!, pcie_port_pm=off and many more), and I couldn't figure out what they do.
I made some attempts with bumblebee, and it looked like it was working, but once the NVIDIA GPU was turned off there was no way to turn it back on. On Windows everything works.
Any ideas? I want to be able to use both the discrete card and the integrated one.
Last edited by bender86 on Sat Sep 02, 2017 3:23 pm; edited 1 time in total |
krinn Watchman
Joined: 02 May 2003 Posts: 7470
Posted: Sat Sep 02, 2017 10:24 am Post subject: |
You have no monitor defined in your configuration. From what I know, the monitor is plugged into and handled by the Intel card, and the nvidia is blind to its presence.
So without any monitor defined, you fall back to auto-detection, which fails on many monitors.
In your nvidia-xconfig-generated file there is one monitor defined; it is attached to Screen0, which uses Device0 (the nvidia card).
Even if you don't want (or need — and you might need it, because of a poor monitor EDID) to use a custom EDID, I think this link provides a much better example of building a valid Optimus configuration:
https://wiki.gentoo.org/wiki/NVIDIA/Optimus/EDID_Xorg.conf_Example |
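For illustration, the kind of configuration the linked page builds looks roughly like the sketch below. The output name DFP-0 and the EDID file path are assumptions for this example and must match the actual panel:

```
Section "Monitor"
    Identifier "Monitor0"
EndSection

Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"
    # Both options below are assumptions: the internal panel's output name
    # (here DFP-0) and a previously dumped EDID file for that output.
    Option     "ConnectedMonitor" "DFP-0"
    Option     "CustomEDID" "DFP-0:/etc/X11/edid.bin"
EndSection

Section "Screen"
    Identifier "nvidia"
    Device     "nvidia"
    Monitor    "Monitor0"
    Option     "AllowEmptyInitialConfiguration"
EndSection
```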
bender86 Guru
Joined: 18 Mar 2005 Posts: 484
Posted: Sat Sep 02, 2017 3:22 pm Post subject: |
I tried to add a Monitor section, but nothing changed. I even tried to use the file you linked (commenting out the EDID lines), and nothing changed: still a black screen and no errors.
About the EDID, I'm not sure what that is, but I tried to follow the instructions here; when I run get-edid I get this back:
Code: | > get-edid > a.dat
This is read-edid version 3.0.2. Prepare for some fun.
Attempting to use i2c interface
Looks like no busses have an EDID. Sorry!
Attempting to use the classical VBE interface
Performing real mode VBE call
Interrupt 0x10 ax=0x4f00 bx=0x0 cx=0x0
Function supported
Call successful
VBE version 300
VBE string at 0x11100 "Intel(R) SKL/KBL Mobile/Desktop Graphics Chipset Accelerated VGA BIOS"
VBE/DDC service about to be called
Report DDC capabilities
Performing real mode VBE call
Interrupt 0x10 ax=0x4f15 bx=0x0 cx=0x0
Function supported
Call successful
Monitor and video card combination does not support DDC1 transfers
Monitor and video card combination supports DDC2 transfers
0 seconds per 128 byte EDID block transfer
Screen is not blanked during DDC transfer
Reading next EDID block
VBE/DDC service about to be called
Read EDID
Performing real mode VBE call
Interrupt 0x10 ax=0x4f15 bx=0x1 cx=0x0
Function supported
Call failed
The EDID data should not be trusted as the VBE call failed
Looks like VBE was successful. Have a good day.
|
The last two lines contradict each other, but I trust the first one more, since the a.dat file contains 128 NULL bytes.
I also tried to add the following command to .xinitrc:
Code: | xrandr --listmonitors > xrandrmonitors |
and in every configuration involving the NVIDIA card, the xrandrmonitors file lists no monitors.
When I use only the Intel card it contains
Code: | Monitors: 1
0: +*eDP1 3840/350x2160/190+0+0 eDP1
|
So it does look like there are no configured monitors. |
ct85711 Veteran
Joined: 27 Sep 2005 Posts: 1791
Posted: Sat Sep 02, 2017 4:37 pm Post subject: |
This is a common issue with Optimus setups, most specifically on notebooks: the Intel graphics is the actual display driver, while the NVIDIA card does ONLY 3D rendering, which is passed to the Intel card to display; the NVIDIA card has NO actual display connections. This is why you need the Intel graphics card in addition to the NVIDIA card, or you can go with the Intel card alone.
The idea of Optimus setups is that the built-in Intel graphics uses less power, and the other graphics card (being neutered) is around to handle the heavy rendering (3D graphics and such) for the Intel graphics card. |
bender86 Guru
Joined: 18 Mar 2005 Posts: 484
Posted: Sat Sep 02, 2017 5:32 pm Post subject: |
ct85711 wrote: | This is a common issue with Optimus setups, most specifically on notebooks: the Intel graphics is the actual display driver, while the NVIDIA card does ONLY 3D rendering, which is passed to the Intel card to display; the NVIDIA card has NO actual display connections. This is why you need the Intel graphics card in addition to the NVIDIA card, or you can go with the Intel card alone.
The idea of Optimus setups is that the built-in Intel graphics uses less power, and the other graphics card (being neutered) is around to handle the heavy rendering (3D graphics and such) for the Intel graphics card. |
Ok, that would be perfectly acceptable for me. The problem is that I get 0 monitors even when using both, as per the Optimus instructions, and the screen stays black. So far I've only been able to use the Intel card alone. |
ct85711 Veteran
Joined: 27 Sep 2005 Posts: 1791
Posted: Sat Sep 02, 2017 6:13 pm Post subject: |
Ok, have you checked dmesg and /var/log/Xorg.0.log to see if anything there could help narrow down the issue? (When in doubt, put them on a pastebin and we can check ourselves.) |
bender86 Guru
Joined: 18 Mar 2005 Posts: 484
Posted: Sat Sep 02, 2017 7:12 pm Post subject: |
dmesg
Xorg.log for NVIDIA+Intel
Xorg.log for Intel alone
There are a couple of lines in Xorg.0.log that look strange (but I don't know if they really are issues):
Code: |
[ 114.101] (II) NVIDIA(0): Setting mode "NULL"
[ 114.104] (==) NVIDIA(0): Disabling shared memory pixmaps
|
Code: |
[ 114.037] (EE) modeset(G0): eglGetDisplay() failed
[ 114.037] (EE) modeset(G0): glamor initialization failed
|
The glamor USE flag is enabled for xorg-server.
Code: |
[ 114.474] randr: falling back to unsynchronized pixmap sharing
|
This is the last line during X startup, so I guess it corresponds to when I call xrandr in my .xinitrc, and it probably depends on the notice above.
In dmesg I can see that the NVIDIA GPU is allocated and freed within the same millisecond (second-to-last line), while I would expect it to be kept on at least as long as X is running (I waited about 20 seconds before hitting CTRL+ALT+BACKSPACE). |
ct85711 Veteran
Joined: 27 Sep 2005 Posts: 1791
Posted: Sat Sep 02, 2017 9:42 pm Post subject: |
Ok, going through the logs: with the Nvidia on, it does see the monitor and retrieves information about it like it should, so that is good! With Intel only, I never saw it attempt to retrieve the monitor information, and it also loaded the nvidia driver too...
So, on the nvidia-and-intel side, xorg is seeing everything as it should, so we are good on that part.
Now, have you tried running the xrandr commands yourself before starting Xorg? I am thinking those commands may not be getting executed beforehand (they may need to go in a different file). After you run both xrandr commands, you can check that it lists 2 providers: Code: | xrandr --listproviders |
bender86 Guru
Joined: 18 Mar 2005 Posts: 484
Posted: Sun Sep 03, 2017 7:02 am Post subject: |
ct85711 wrote: | Now, have you tried running the xrandr commands yourself before starting Xorg? I am thinking those commands may not be getting executed beforehand (they may need to go in a different file). After you run both xrandr commands, you can check that it lists 2 providers: Code: | xrandr --listproviders |
|
If I run xrandr, with whatever options, before starting X, all I get is "Can't open display".
I read somewhere that the commands might go in ~/.xsessionrc instead of ~/.xinitrc, but it made no difference for me. |
NeddySeagoon Administrator
Joined: 05 Jul 2003 Posts: 54208 Location: 56N 3W
Posted: Sun Sep 03, 2017 9:19 am Post subject: |
bender86,
An Optimus Graphics system is a graphics system and a half, not two complete graphics systems.
The Intel graphics works as expected. The nVidia graphics has no connection to the display, which saves an expensive, power-hungry video output switch.
Instead the two graphics engines share a pixel buffer. They can both draw in the shared pixel buffer but only the Intel can transfer the pixel buffer to the display.
A good first step is to make the Intel graphics system work alone, using the modesetting driver. That's the driver it will use later, when the nVidia chip is doing the drawing.
Until this works, the nvidia won't work either.
The Code: | xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto | commands tell Xorg to do the drawing with the nVidia graphics chip.
The commands need to be run during the Xorg startup process.
Where they go depends on how you start Xorg. For startx, ~/.xinitrc is correct. _________________ Regards,
NeddySeagoon
Computer users fall into two groups:-
those that do backups
those that have never had a hard drive fail. |
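Putting that advice together, a minimal ~/.xinitrc for startx would be a sketch like the one below; the log file name is made up here, purely for debugging:

```
# ~/.xinitrc -- run the provider setup inside the Xorg session, before the WM.
xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto
# Optional sanity check; the log file name is arbitrary.
xrandr --listproviders > "$HOME/xrandr-providers.log"
exec twm
```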
bender86 Guru
Joined: 18 Mar 2005 Posts: 484
Posted: Tue Sep 05, 2017 6:22 am Post subject: |
NeddySeagoon wrote: | bender86,
An Optimus Graphics system is a graphics system and a half, not two complete graphics systems.
The Intel graphics works as expected. The nVidia graphics has no connection to the display, which saves an expensive, power-hungry video output switch.
Instead the two graphics engines share a pixel buffer. They can both draw in the shared pixel buffer but only the Intel can transfer the pixel buffer to the display.
A good first step is to make the Intel graphics system work alone, using the modesetting driver. That's the driver it will use later, when the nVidia chip is doing the drawing.
Until this works, the nvidia won't work either. |
The Intel GPU worked fine from the start, with no configuration. I am currently using the laptop with it, but I want to be able to use the NVIDIA GPU as well.
NeddySeagoon wrote: | The Code: | xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto | commands tell Xorg to do the drawing with the nVidia graphics chip.
The commands need to be run during the Xorg startup process.
Where they go depends on how you start Xorg. For startx, ~/.xinitrc is correct. |
Yes, I'm using startx. No point in adding gdm's complexity to all this :D |
Fitap Guru
Joined: 13 Mar 2011 Posts: 437 Location: Rosario, Argentina
Posted: Sat Dec 09, 2017 11:11 pm Post subject: |
Hello,
My notebook has two built-in GPUs, one Intel and one nVidia.
I have exactly the same issue as you, bender86:
Code: |
[ 6.757] (EE) modeset(G0): eglGetDisplay() failed
[ 6.758] (EE) modeset(G0): glamor initialization failed
|
The Intel card works fine, but when I want to use the nVidia card it doesn't work.
Did you manage to resolve it?
Regards. |
Fitap Guru
Joined: 13 Mar 2011 Posts: 437 Location: Rosario, Argentina
Posted: Sun Dec 10, 2017 9:20 pm Post subject: |
Fitap wrote: | Hello,
My notebook has two built-in GPUs, one Intel and one nVidia.
I have exactly the same issue as you, bender86:
Code: |
[ 6.757] (EE) modeset(G0): eglGetDisplay() failed
[ 6.758] (EE) modeset(G0): glamor initialization failed
|
The Intel card works fine, but when I want to use the nVidia card it doesn't work.
Did you manage to resolve it?
Regards.
|
Ok, I'm quoting myself because I did resolve this issue.
I use SDDM as display manager, and I had misconfigured the xrandr commands; I set them up as the wiki says: https://wiki.gentoo.org/wiki/NVIDIA/Optimus#Simple_Desktop_Display_Manager_.28SDDM.29
Cheers |
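For reference, the wiki section linked above places the provider commands in SDDM's Xsetup hook rather than ~/.xinitrc (which a display manager never reads). A sketch, assuming SDDM's usual default script location:

```
# /usr/share/sddm/scripts/Xsetup -- executed by SDDM before the greeter starts
xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto
```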
bender86 Guru
Joined: 18 Mar 2005 Posts: 484
Posted: Tue Dec 19, 2017 7:55 pm Post subject: |
Fitap wrote: | Hello,
My notebook has two built-in GPUs, one Intel and one nVidia.
I have exactly the same issue as you, bender86:
Code: |
[ 6.757] (EE) modeset(G0): eglGetDisplay() failed
[ 6.758] (EE) modeset(G0): glamor initialization failed
|
The Intel card works fine, but when I want to use the nVidia card it doesn't work.
Did you manage to resolve it?
Regards. |
Unfortunately I can't help you. I gave up on this issue; I use Windows when I need the GPU. Perhaps I will try to fix it again when I have more time. |
Fitap Guru
Joined: 13 Mar 2011 Posts: 437 Location: Rosario, Argentina
Posted: Sun Dec 24, 2017 2:13 am Post subject: |
bender86 wrote: | Fitap wrote: | Hello,
My notebook has two built-in GPUs, one Intel and one nVidia.
I have exactly the same issue as you, bender86:
Code: |
[ 6.757] (EE) modeset(G0): eglGetDisplay() failed
[ 6.758] (EE) modeset(G0): glamor initialization failed
|
The Intel card works fine, but when I want to use the nVidia card it doesn't work.
Did you manage to resolve it?
Regards. |
Unfortunately I can't help you. I gave up on this issue; I use Windows when I need the GPU. Perhaps I will try to fix it again when I have more time. |
Please read my self-quote in the post above.
Maybe that's your workaround too; keep it in mind.
Regards. |