grooveman Veteran
Joined: 24 Feb 2003 Posts: 1217
Posted: Sat Oct 12, 2013 9:11 pm Post subject: [SOLVED] Optimus making me a pessimus... |
Hello again,
I have been beating my head against this wall for some time now, and I know I'm missing something here. I have read more than a dozen articles on Optimus and bumblebee, but I cannot find a good, definitive source on the subject. I think some of the articles are too old, some are just not applicable, and some are just not well written -- or maybe I'm being stupid... not sure. I'm hoping someone here can set me straight.
The most well written thread I've seen on the subject thus far has been this one: https://forums.gentoo.org/viewtopic-t-959568-postdays-0-postorder-asc-highlight-optimus-start-0.html
Since it seems like people have had some luck with it, it is the one I've been trying to base my new approach on.
However, I cannot even get X to start when I use the xorg.conf suggested there. X will run without any xorg.conf -- but without acceleration. When I try to run glxgears, I get this error:
Code: | Xlib: extension "GLX" missing on display ":0.0".
Error: couldn't get an RGB, Double-buffered visual |
If I try to execute using optirun, I get the error "Optimus X server is not running". I don't even know if I'm supposed to use optirun...
I have set my opengl to nvidia.
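(For clarity, by "set my opengl to nvidia" I mean the usual eselect switch -- a sketch, assuming the standard opengl eselect module:)

```shell
# Show the available OpenGL implementations and which one is active
eselect opengl list
# Point the GL libraries at the nvidia implementation (run as root)
eselect opengl set nvidia
```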
My xorg.conf:
Code: | Section "ServerLayout"
Identifier "layout"
Screen 0 "nvidia"
Inactive "intel"
EndSection
Section "Device"
Identifier "nvidia"
Driver "nvidia"
BusID "PCI:1:0:0"
EndSection
Section "Screen"
Identifier "nvidia"
Device "nvidia"
# Uncomment this line if your computer has no display devices connected to
# the NVIDIA GPU. Leave it commented if you have display devices
# connected to the NVIDIA GPU that you would like to use.
#Option "UseDisplayDevice" "none"
EndSection
Section "Device"
Identifier "intel"
Driver "modesetting"
BusID "PCI:0:02:0"
EndSection
Section "Screen"
Identifier "intel"
Device "intel"
EndSection |
lspci | grep -i vga
Code: | 00:02.0 VGA compatible controller: Intel Corporation 3rd Gen Core processor Graphics Controller (rev 09)
01:00.0 VGA compatible controller: NVIDIA Corporation GF114M [GeForce GTX 675M] (rev a1) |
Essential package versions installed:
Code: | [ebuild R ] x11-proto/inputproto-2.3 ABI_X86="(64) (-32) (-x32)" 0 kB
[ebuild R ] x11-apps/xrandr-1.4.1 0 kB
[ebuild R ] x11-base/xorg-server-1.14.3:0/1.14.3 USE="kdrive nptl suid udev xorg -dmx -doc -ipv6 -minimal (-selinux) -static-libs -tslib -xnest -xvfb" 0 kB
[ebuild R ~] x11-drivers/nvidia-drivers-331.13 USE="X acpi (multilib) tools -pax_kernel" 0 kB
[ebuild R ] x11-drivers/xf86-input-evdev-2.8.1 0 kB
[ebuild R ] x11-drivers/xf86-input-synaptics-1.7.1 0 kB
[ebuild R ] x11-drivers/xf86-video-modesetting-0.8.0 0 kB |
uname -a
Code: | Linux MSILT-lin 3.10.7-gentoo-r1 #1 SMP Fri Oct 11 10:22:34 UTC 2013 x86_64 Intel(R) Core(TM) i7-3630QM CPU @ 2.40GHz GenuineIntel GNU/Linux |
My laptop is a GT70 ONDUS. Bumblebee is installed and running, and bbswitch is installed and working (afaik).
The errors from Xorg.0.log:
Code: |
[ 16252.695]
X.Org X Server 1.14.3
Release Date: 2013-09-12
[ 16252.695] X Protocol Version 11, Revision 0
[ 16252.695] Build Operating System: Linux 3.10.7-gentoo x86_64 Gentoo
[ 16252.695] Current Operating System: Linux chrisMSILT-lin 3.10.7-gentoo-r1 #1 SMP Fri Oct 11 10:22:34 UTC 2013 x86_64
[ 16252.695] Kernel command line: root=/dev/sda7
[ 16252.695] Build Date: 10 October 2013 09:37:29PM
[ 16252.695]
[ 16252.695] Current version of pixman: 0.30.2
[ 16252.695] Before reporting problems, check http://wiki.x.org
to make sure that you have the latest version.
[ 16252.695] Markers: (--) probed, (**) from config file, (==) default setting,
(++) from command line, (!!) notice, (II) informational,
(WW) warning, (EE) error, (NI) not implemented, (??) unknown.
[ 16252.696] (==) Log file: "/var/log/Xorg.0.log", Time: Sat Oct 12 20:55:05 2013
[ 16252.696] (==) Using config file: "/etc/X11/xorg.conf"
[ 16252.696] (==) Using system config directory "/usr/share/X11/xorg.conf.d"
[ 16252.696] (==) ServerLayout "layout"
[ 16252.696] (**) |-->Screen "nvidia" (0)
[ 16252.696] (**) | |-->Monitor "<default monitor>"
[ 16252.696] (**) | |-->Device "nvidia"
[ 16252.696] (==) No monitor specified for screen "nvidia".
Using a default monitor configuration.
[ 16252.696] (**) |-->Inactive Device "intel"
[ 16252.696] (==) Automatically adding devices
[ 16252.696] (==) Automatically enabling devices
[ 16252.696] (==) Automatically adding GPU devices
[ 16252.696] (WW) The directory "/usr/share/fonts/TTF/" does not exist.
[ 16252.696] Entry deleted from font path.
[ 16252.696] (WW) The directory "/usr/share/fonts/OTF/" does not exist.
[ 16252.696] Entry deleted from font path.
[ 16252.696] (WW) The directory "/usr/share/fonts/Type1/" does not exist.
[ 16252.696] Entry deleted from font path.
[ 16252.696] (WW) `fonts.dir' not found (or not valid) in "/usr/share/fonts/100dpi/".
[ 16252.696] Entry deleted from font path.
[ 16252.696] (Run 'mkfontdir' on "/usr/share/fonts/100dpi/").
[ 16252.696] (WW) `fonts.dir' not found (or not valid) in "/usr/share/fonts/75dpi/".
[ 16252.696] Entry deleted from font path.
[ 16252.696] (Run 'mkfontdir' on "/usr/share/fonts/75dpi/").
[ 16252.696] (==) FontPath set to:
/usr/share/fonts/misc/
[ 16252.696] (==) ModulePath set to "/usr/lib64/xorg/modules"
[ 16252.696] (II) The server relies on udev to provide the list of input devices.
If no devices become available, reconfigure udev or disable AutoAddDevices.
[ 16252.696] (II) Loader magic: 0x809bc0
[ 16252.696] (II) Module ABI versions:
[ 16252.696] X.Org ANSI C Emulation: 0.4
[ 16252.696] X.Org Video Driver: 14.1
[ 16252.696] X.Org XInput driver : 19.1
[ 16252.696] X.Org Server Extension : 7.0
[ 16252.696] (II) xfree86: Adding drm device (/dev/dri/card0)
[ 16252.697] (--) PCI:*(0:0:2:0) 8086:0166:1462:10cb rev 9, Mem @ 0xf6400000/4194304, 0xd0000000/268435456, I/O @ 0x0000f000/64
[ 16252.697] Initializing built-in extension Generic Event Extension
[ 16252.697] Initializing built-in extension SHAPE
[ 16252.697] Initializing built-in extension MIT-SHM
[ 16252.697] Initializing built-in extension XInputExtension
[ 16252.697] Initializing built-in extension XTEST
[ 16252.697] Initializing built-in extension BIG-REQUESTS
[ 16252.698] Initializing built-in extension SYNC
[ 16252.698] Initializing built-in extension XKEYBOARD
[ 16252.698] Initializing built-in extension XC-MISC
[ 16252.698] Initializing built-in extension XINERAMA
[ 16252.698] Initializing built-in extension XFIXES
[ 16252.698] Initializing built-in extension RENDER
[ 16252.698] Initializing built-in extension RANDR
[ 16252.698] Initializing built-in extension COMPOSITE
[ 16252.698] Initializing built-in extension DAMAGE
[ 16252.698] Initializing built-in extension MIT-SCREEN-SAVER
[ 16252.698] Initializing built-in extension DOUBLE-BUFFER
[ 16252.698] Initializing built-in extension RECORD
[ 16252.698] Initializing built-in extension DPMS
[ 16252.698] Initializing built-in extension X-Resource
[ 16252.698] Initializing built-in extension XVideo
[ 16252.698] Initializing built-in extension XVideo-MotionCompensation
[ 16252.698] Initializing built-in extension XFree86-VidModeExtension
[ 16252.698] Initializing built-in extension XFree86-DGA
[ 16252.698] Initializing built-in extension XFree86-DRI
[ 16252.698] Initializing built-in extension DRI2
[ 16252.698] (II) LoadModule: "glx"
[ 16252.698] (II) Loading /usr/lib64/xorg/modules/extensions/libglx.so
[ 16252.706] (II) Module glx: vendor="NVIDIA Corporation"
[ 16252.707] compiled for 4.0.2, module version = 1.0.0
[ 16252.707] Module class: X.Org Server Extension
[ 16252.707] (II) NVIDIA GLX Module 331.13 Sun Sep 29 21:08:45 PDT 2013
[ 16252.707] Loading extension GLX
[ 16252.707] (II) LoadModule: "nvidia"
[ 16252.707] (II) Loading /usr/lib64/xorg/modules/drivers/nvidia_drv.so
[ 16252.707] (II) Module nvidia: vendor="NVIDIA Corporation"
[ 16252.707] compiled for 4.0.2, module version = 1.0.0
[ 16252.707] Module class: X.Org Video Driver
[ 16252.707] (II) LoadModule: "modesetting"
[ 16252.707] (II) Loading /usr/lib64/xorg/modules/drivers/modesetting_drv.so
[ 16252.707] (II) Module modesetting: vendor="X.Org Foundation"
[ 16252.707] compiled for 1.14.3, module version = 0.8.0
[ 16252.707] Module class: X.Org Video Driver
[ 16252.707] ABI class: X.Org Video Driver, version 14.1
[ 16252.707] (II) NVIDIA dlloader X Driver 331.13 Sun Sep 29 20:48:50 PDT 2013
[ 16252.707] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
[ 16252.707] (II) modesetting: Driver for Modesetting Kernel Drivers: kms
[ 16252.707] (++) using VT number 8
[ 16252.714] (II) modesetting(0): using drv /dev/dri/card0
[ 16252.714] (EE) Screen 0 deleted because of no matching config section.
[ 16252.714] (II) UnloadModule: "modesetting"
[ 16252.714] (EE) Device(s) detected, but none match those in the config file.
[ 16252.714] (EE)
Fatal server error:
[ 16252.714] (EE) no screens found(EE)
[ 16252.714] (EE)
Please consult the The X.Org Foundation support
at http://wiki.x.org
for help.
[ 16252.714] (EE) Please also check the log file at "/var/log/Xorg.0.log" for additional information.
[ 16252.714] (EE)
[ 16252.727] (EE) Server terminated with error (1). Closing log file. |
And... I just noticed I'm getting about 3 of these a second in my /var/log/messages:
Code: | nvidia 0000:01:00.0: irq 47 for MSI/MSI-X |
I appreciate any help. Thank you!
G _________________ To look without without looking within is like looking without without looking at all.
Last edited by grooveman on Thu Nov 28, 2013 2:33 am; edited 2 times in total |
eyoung100 Veteran
Joined: 23 Jan 2004 Posts: 1428
Posted: Mon Oct 14, 2013 4:57 pm Post subject: |
Here is the Error:
Code: | [ 16252.714] (II) modesetting(0): using drv /dev/dri/card0
[ 16252.714] (EE) Screen 0 deleted because of no matching config section.
[ 16252.714] (II) UnloadModule: "modesetting"
[ 16252.714] (EE) Device(s) detected, but none match those in the config file.
[ 16252.714] (EE)
[ 16252.696] (**) |-->Screen "nvidia" (0)
[ 16252.696] (**) | |-->Monitor "<default monitor>"
[ 16252.696] (**) | |-->Device "nvidia"
[ 16252.696] (==) No monitor specified for screen "nvidia".
Using a default monitor configuration.
[ 16252.696] (**) |-->Inactive Device "intel" |
Fill in <Default Monitor> with a real name... _________________ The Birth and Growth of Science is the Death and Atrophy of Art -- Unknown
Registered Linux User #363735
Adopt a Post | Strip Comments| Emerge Wrapper |
grooveman Veteran
Joined: 24 Feb 2003 Posts: 1217
Posted: Tue Oct 15, 2013 12:29 am Post subject: |
Thanks.
I figured out what that was about: the fix was uncommenting the Option "UseDisplayDevice" "none" line in my xorg.conf. I apologize -- I had been through about 6,000 permutations of xorg.conf options, and that one happened to be commented out when I finally decided to post.
However, setting UseDisplayDevice to "none" just gets me to the black screen that I have heard other people talk about. X seems to start without any complaints, but the entire screen is just black -- even with the xrandr commands in my .xinitrc, and with either GDM or startx.
I haven't come across anything that fixes the black screen... I have tried as root and as non-root, I have tried older nvidia-drivers packages... bumblebee enabled, bumblebee disabled (still not sure if I'm supposed to be using it or not...).
So I guess this is a little progress... Anyone know what I'm missing here?
Thanks!
G _________________ To look without without looking within is like looking without without looking at all. |
ayvango Tux's lil' helper
Joined: 08 Feb 2012 Posts: 118
Posted: Wed Oct 16, 2013 2:55 am Post subject: |
read /usr/share/docs/nvidia-drivers-<your version>/html/randr14.html
There you will see the two xrandr commands that need to be added to your .xinitrc script. They chain the nvidia card's output (initially set to nowhere, hence the black screen) to the intel card's input.
Adding those commands to an init script is not actually required, though. You can enter them manually from a virtual console, exporting the proper DISPLAY variable to interact with the running X server. |
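For example (a sketch -- the provider names below are the usual ones from the nvidia randr14 document, but check xrandr --listproviders for the actual names on your system):

```shell
# From a virtual console, talk to the X server already running on display :0
export DISPLAY=:0
# List the providers first; the names vary between driver versions
xrandr --listproviders
# Use the nvidia card as the render source for the intel (modesetting) output
xrandr --setprovideroutputsource modesetting NVIDIA-0
# Let xrandr enable the outputs with their preferred modes
xrandr --auto
```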
grooveman Veteran
Joined: 24 Feb 2003 Posts: 1217
Posted: Wed Oct 16, 2013 10:41 am Post subject: |
ayvango wrote: | read /usr/share/docs/nvidia-drivers-<your version>/html/randr14.html
There you will see the two xrandr commands that need to be added to your .xinitrc script. They chain the nvidia card's output (initially set to nowhere, hence the black screen) to the intel card's input.
Adding those commands to an init script is not actually required, though. You can enter them manually from a virtual console, exporting the proper DISPLAY variable to interact with the running X server. |
Thanks, but I already have those in my .xinitrc:
Code: | xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto |
There must be something else... _________________ To look without without looking within is like looking without without looking at all. |
grooveman Veteran
Joined: 24 Feb 2003 Posts: 1217
Posted: Thu Oct 24, 2013 8:21 pm Post subject: |
Bump? _________________ To look without without looking within is like looking without without looking at all. |
Atmmac Tux's lil' helper
Joined: 17 Oct 2013 Posts: 130 Location: Watertown, MA
Posted: Fri Nov 08, 2013 7:28 pm Post subject: |
Can we get an update on this? I too am in the same boat. No one has any answers about Optimus on 3.10 -- do we even need an xorg.conf with this? On my Debian install, no xorg.conf was needed; you would run optirun or primusrun and you were golden. Here in Gentoo I have been in limbo for weeks and am extremely frustrated. I have been posting everywhere, and no one can give me an answer. bbswitch with bumblebee seems to make sense to me and worked great on Debian, so why no dice in Gentoo? I always get the following with optirun:
[ 4415.955994] bbswitch: enabling discrete graphics
[ 4416.364499] pci 0000:01:00.0: power state changed by ACPI to D0
[ 4416.398870] nvidia: module license 'NVIDIA' taints kernel.
[ 4416.398871] Disabling lock debugging due to kernel taint
[ 4416.406076] vgaarb: device changed decodes: PCI:0000:01:00.0,olddecodes=io+mem,decodes=none:owns=none
[ 4416.406286] [drm] Initialized nvidia-drm 0.0.0 20130102 for 0000:01:00.0 on minor 1
[ 4416.406293] NVRM: loading NVIDIA UNIX x86_64 Kernel Module 331.20 Wed Oct 30 17:43:35 PDT 2013
[ 4416.415277] nvidia 0000:01:00.0: irq 46 for MSI/MSI-X
[ 4429.443311] NVRM: GPU at 0000:01:00.0 has fallen off the bus.
[ 4429.443317] NVRM: os_pci_init_handle: invalid context!
[ 4429.443318] NVRM: os_pci_init_handle: invalid context!
[ 4429.443322] NVRM: GPU at 0000:01:00.0 has fallen off the bus.
[ 4429.443323] NVRM: os_pci_init_handle: invalid context!
[ 4429.443324] NVRM: os_pci_init_handle: invalid context!
[ 4429.462716] NVRM: RmInitAdapter failed! (0x25:0x28:1157)
[ 4429.462722] NVRM: rm_init_adapter failed for device bearing minor number 0
[ 4429.462739] NVRM: nvidia_frontend_open: minor 0, module->open() failed, error -5
[ 4429.462966] NVRM: request_irq() failed (-22)
[ 4429.462968] NVRM: nvidia_frontend_open: minor 0, module->open() failed, error -22
If I try to use the xorg.conf, switch the card with eselect opengl set, and set the xrandr commands in .xinitrc after disabling xdm and going straight in with startx, I get the same error as the OP. Can someone help us? I will even create an entire wiki on how to get it set up if we could just get some assistance.
Thanks,
-Andy |
grooveman Veteran
Joined: 24 Feb 2003 Posts: 1217
Posted: Wed Nov 27, 2013 9:49 pm Post subject: |
Ok, I got this. As I suspected, my confusion resulted from the lack of an explicitly stated, authoritative document on the subject. After meeting with several failures, the debris from each attempted method began getting in the way.
My problem stemmed from the fact that I had a shell script in /usr/local/bin called "optirun". The Gentoo bumblebee package installs its own version of optirun, which is a binary and lives in /usr/run. The shell-script version was the result of following this howto -- which was very well written, and may have worked at one point in the evolution of bumblebee, but does not work now. Stay away, lest you risk becoming as confused as I was.
So, in the interest of clarity, I will share what I have garnered, because I know a straight answer would have saved me a LOT of grief. Bear in mind, I'm using bumblebee here; I believe there are other methods, but I'm not using them (yet...).
-You do NOT need to run two simultaneous sessions of X
-You DO need to emerge bumblebee
-You DO need to add yourself to the bumblebee group
-You do NOT need an .xinitrc
-You do NOT need an xorg.conf in /etc/X11 (at least I didn't, and I suspect you won't; if you do need one, it will be very simple)
-It does NOT seem to matter whether I have my opengl set to nvidia or to xorg-x11; ymmv -- reserve it as a troubleshooting step if you are having problems
-You do NOT NOT NOT need an /etc/init.d/optimus script (another thing from that same article that was messing me up)
-You DO need bbswitch, and you should have that module load at boot
-Using bumblebee is incompatible with this post; stay away from it (unless you want to bail on bumblebee entirely).
-Do add ABI_X86="64 32" to your make.conf if you want to run wine or 32 bit games.
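Putting the checklist above together, the whole setup is roughly this (a sketch -- "youruser" is a placeholder, and I'm assuming OpenRC; package names may differ on your tree):

```shell
# Install bumblebee and bbswitch (run as root)
emerge --ask bumblebee bbswitch
# Add your user to the bumblebee group ("youruser" is a placeholder; re-login afterwards)
gpasswd -a youruser bumblebee
# OpenRC: load bbswitch at boot (add it to modules="..." in /etc/conf.d/modules),
# then enable and start the bumblebee daemon
rc-update add bumblebee default
rc-service bumblebee start
# Test: this should power up the discrete GPU and run glxgears on it
optirun glxgears -info
```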
So, basically, I would just follow what is stated here, and you should be fine. It is really simple -- when you know which path to follow!
Hope it is helpful.
G _________________ To look without without looking within is like looking without without looking at all. |
dweezil-n0xad Apprentice
Joined: 30 Oct 2006 Posts: 156 Location: Ostend, Belgium
Posted: Tue Dec 03, 2013 2:31 pm Post subject: |
grooveman wrote: | -It does NOT seem to matter if I have my opengl set to nvidia, or to xorg-x11, ymmv, reserve it for a trouble-shooting step if you are having problems | I had to set my opengl to xorg-x11, otherwise the glx extension would not be loaded in xorg and steam would complain (Xlib: extension "GLX" missing on display ":0")
I run the games in steam with "primusrun %command%" and it works great _________________ i7-4790K | 16GB DDR3 | GTX 970 | 500GB SSD
ASUS N56VV | i7-3630QM | 12GB DDR3 | GT 750M | 256GB SSD |
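(For anyone wanting to replicate that primus setup, a sketch -- assuming layman and the bumblebee overlay; overlay and package names may have changed since:)

```shell
# Add the bumblebee overlay and install primus (run as root)
layman -a bumblebee
emerge --ask primus
# Then, in Steam, set the game's launch options to:
#   primusrun %command%
```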
grooveman Veteran
Joined: 24 Feb 2003 Posts: 1217
Posted: Wed Dec 04, 2013 3:07 am Post subject: |
I haven't bumped into that...
I notice you are using primusrun, not optirun. I have optirun on my system, not primusrun. How did you wind up with that? _________________ To look without without looking within is like looking without without looking at all. |
dweezil-n0xad Apprentice
Joined: 30 Oct 2006 Posts: 156 Location: Ostend, Belgium
Posted: Wed Dec 04, 2013 9:01 am Post subject: |
I followed this article on steam support about running Steam on the Intel GPU and only the graphically intensive games on the nvidia GPU: https://support.steampowered.com/kb_article.php?ref=6316-GJKC-7437
I installed primus from the bumblebee overlay. _________________ i7-4790K | 16GB DDR3 | GTX 970 | 500GB SSD
ASUS N56VV | i7-3630QM | 12GB DDR3 | GT 750M | 256GB SSD |