Gentoo Forums
xorg-server-1.17, nvidia-drivers, kernel-3.18+ [2xSOLVED]
SemmZemm
n00b


Joined: 05 Jul 2013
Posts: 62
Location: France, Russia

PostPosted: Fri Feb 13, 2015 3:17 pm    Post subject: xorg-server-1.17, nvidia-drivers, kernel-3.18+ [2xSOLVED]

Hi, I have a laptop with an NVIDIA Optimus card, and I used to configure it as described in the nvidia-drivers documentation (with XRandR).
Everything worked fine, here is my old configuration:

Code:
Section "ServerLayout"
   Identifier "layout"
   Screen 0 "nvidia"
   Inactive "intel"
EndSection

Section "Device"
   Identifier "nvidia"
   Driver "nvidia"
   BusID "PCI:1:0:0"
EndSection

Section "Screen"
   Identifier "nvidia"
   Device "nvidia"
#   Option "UseDisplayDevice" "none"
   Option "AllowEmptyInitialConfiguration"
EndSection

Section "Device"
   Identifier "intel"
   Driver "modesetting"
   BusID "PCI:0:2:0"
EndSection

Section "Screen"
   Identifier "intel"
   Device "intel"
EndSection

Section "InputClass"
   Identifier "touchpad"
   Driver "synaptics"
   MatchIsTouchpad "on"
      Option "TapButton1" "1"
      Option "TapButton2" "2"
      Option "TapButton3" "3"
      Option "VertEdgeScroll" "on"
      Option "VertTwoFingerScroll" "on"
      Option "HorizEdgeScroll" "on"
      Option "HorizTwoFingerScroll" "on"
      Option "CircularScrolling" "on"
      Option "CircScrollTrigger" "2"
      Option "EmulateTwoFingerMinZ" "40"
      Option "EmulateTwoFingerMinW" "8"
      Option "CoastingSpeed" "0"
EndSection


Everything worked fine until the recent update to xorg-server-1.17, for which I had to manually unmerge xf86-video-modesetting.

After that my X didn't start. Xorg.0.log:
Code:
[  4556.379]
X.Org X Server 1.17.1
Release Date: 2015-02-10
[  4556.384] X Protocol Version 11, Revision 0
[  4556.386] Build Operating System: Linux 3.17.4-gentoo x86_64 Gentoo
[  4556.387] Current Operating System: Linux szldlc 3.17.4-gentoo #2 SMP PREEMPT Thu Jan 22 15:14:07 CET 2015 x86_64
[  4556.387] Kernel command line: root=/dev/sdb2 video=uvesafb:mtrr:4,ywrap,1920x1080-32@100 acpi_osi=Linux acpi_backlight=vendor
[  4556.391] Build Date: 13 February 2015  03:24:53PM
[  4556.392] 
[  4556.394] Current version of pixman: 0.32.6
[  4556.397]    Before reporting problems, check http://wiki.x.org
   to make sure that you have the latest version.
[  4556.397] Markers: (--) probed, (**) from config file, (==) default setting,
   (++) from command line, (!!) notice, (II) informational,
   (WW) warning, (EE) error, (NI) not implemented, (??) unknown.
[  4556.404] (==) Log file: "/var/log/Xorg.0.log", Time: Fri Feb 13 17:05:09 2015
[  4556.405] (==) Using config file: "/etc/X11/xorg.conf"
[  4556.407] (==) Using config directory: "/etc/X11/xorg.conf.d"
[  4556.408] (==) Using system config directory "/usr/share/X11/xorg.conf.d"
[  4556.408] (==) ServerLayout "layout"
[  4556.408] (**) |-->Screen "nvidia" (0)
[  4556.408] (**) |   |-->Monitor "<default monitor>"
[  4556.409] (**) |   |-->Device "nvidia"
[  4556.409] (==) No monitor specified for screen "nvidia".
   Using a default monitor configuration.
[  4556.409] (**) |-->Inactive Device "intel"
[  4556.409] (==) Automatically adding devices
[  4556.409] (==) Automatically enabling devices
[  4556.409] (==) Automatically adding GPU devices
[  4556.409] (==) FontPath set to:
   /usr/share/fonts/misc/,
   /usr/share/fonts/TTF/,
   /usr/share/fonts/OTF/,
   /usr/share/fonts/Type1/,
   /usr/share/fonts/100dpi/,
   /usr/share/fonts/75dpi/
[  4556.409] (**) ModulePath set to "/usr/lib64/opengl/nvidia,/usr/lib64/xorg/modules,/usr/lib32/xorg/modules"
[  4556.409] (II) The server relies on udev to provide the list of input devices.
   If no devices become available, reconfigure udev or disable AutoAddDevices.
[  4556.409] (II) Loader magic: 0x800c40
[  4556.409] (II) Module ABI versions:
[  4556.409]    X.Org ANSI C Emulation: 0.4
[  4556.409]    X.Org Video Driver: 19.0
[  4556.409]    X.Org XInput driver : 21.0
[  4556.409]    X.Org Server Extension : 9.0
[  4556.409] (II) xfree86: Adding drm device (/dev/dri/card1)
[  4556.409] (II) xfree86: Adding drm device (/dev/dri/card0)
[  4556.410] (--) PCI:*(0:0:2:0) 8086:0166:1558:6500 rev 9, Mem @ 0xf7400000/4194304, 0xd0000000/268435456, I/O @ 0x0000f000/64
[  4556.410] (--) PCI: (0:1:0:0) 10de:1292:1558:6500 rev 161, Mem @ 0xf6000000/16777216, 0xe0000000/268435456, 0xf0000000/33554432, I/O @ 0x0000e000/128, BIOS @ 0x????????/524288
[  4556.410] (II) LoadModule: "glx"
[  4556.410] (II) Loading /usr/lib64/opengl/nvidia/extensions/libglx.so
[  4556.418] (II) Module glx: vendor="NVIDIA Corporation"
[  4556.418]    compiled for 4.0.2, module version = 1.0.0
[  4556.418]    Module class: X.Org Server Extension
[  4556.418] (II) NVIDIA GLX Module  346.35  Sat Jan 10 20:53:39 PST 2015
[  4556.418] (II) LoadModule: "nvidia"
[  4556.418] (II) Loading /usr/lib64/xorg/modules/drivers/nvidia_drv.so
[  4556.419] (II) Module nvidia: vendor="NVIDIA Corporation"
[  4556.419]    compiled for 4.0.2, module version = 1.0.0
[  4556.419]    Module class: X.Org Video Driver
[  4556.419] (II) LoadModule: "modesetting"
[  4556.419] (II) Loading /usr/lib64/xorg/modules/drivers/modesetting_drv.so
[  4556.419] (II) Module modesetting: vendor="X.Org Foundation"
[  4556.419]    compiled for 1.17.1, module version = 1.17.1
[  4556.419]    Module class: X.Org Video Driver
[  4556.419]    ABI class: X.Org Video Driver, version 19.0
[  4556.419] (II) NVIDIA dlloader X Driver  346.35  Sat Jan 10 20:32:18 PST 2015
[  4556.419] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
[  4556.419] (II) modesetting: Driver for Modesetting Kernel Drivers: kms
[  4556.419] (--) using VT number 7

[  4556.424] (II) Loading sub module "fb"
[  4556.424] (II) LoadModule: "fb"
[  4556.424] (II) Loading /usr/lib64/xorg/modules/libfb.so
[  4556.424] (II) Module fb: vendor="X.Org Foundation"
[  4556.424]    compiled for 1.17.1, module version = 1.0.0
[  4556.424]    ABI class: X.Org ANSI C Emulation, version 0.4
[  4556.424] (II) Loading sub module "wfb"
[  4556.424] (II) LoadModule: "wfb"
[  4556.424] (II) Loading /usr/lib64/xorg/modules/libwfb.so
[  4556.424] (II) Module wfb: vendor="X.Org Foundation"
[  4556.424]    compiled for 1.17.1, module version = 1.0.0
[  4556.424]    ABI class: X.Org ANSI C Emulation, version 0.4
[  4556.424] (II) Loading sub module "ramdac"
[  4556.424] (II) LoadModule: "ramdac"
[  4556.424] (II) Module "ramdac" already built-in
[  4556.425] (II) modeset(1): using drv /dev/dri/card0
[  4556.425] (II) modeset(G0): using drv /dev/dri/card0
[  4556.425] (EE) Screen 1 deleted because of no matching config section.
[  4556.425] (II) UnloadModule: "modesetting"
[  4556.425] (II) NVIDIA(0): Creating default Display subsection in Screen section
   "nvidia" for depth/fbbpp 24/32
[  4556.425] (==) NVIDIA(0): Depth 24, (==) framebuffer bpp 32
[  4556.425] (==) NVIDIA(0): RGB weight 888
[  4556.425] (==) NVIDIA(0): Default visual is TrueColor
[  4556.425] (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
[  4556.425] (**) NVIDIA(0): Option "AllowEmptyInitialConfiguration"
[  4556.425] (**) NVIDIA(0): Enabling 2D acceleration
[  4556.525] (II) NVIDIA(GPU-0): Found DRM driver nvidia-drm (20130102)
[  4556.526] (II) NVIDIA(0): NVIDIA GPU GeForce GT 740M (GK208) at PCI:1:0:0 (GPU-0)
[  4556.526] (--) NVIDIA(0): Memory: 1048576 kBytes
[  4556.526] (--) NVIDIA(0): VideoBIOS: 80.28.22.00.31
[  4556.526] (II) NVIDIA(0): Detected PCI Express Link width: 8X
[  4556.526] (--) NVIDIA(0): Valid display device(s) on GeForce GT 740M at PCI:1:0:0
[  4556.526] (--) NVIDIA(0):     none
[  4556.526] (II) NVIDIA(0): Validated MetaModes:
[  4556.526] (II) NVIDIA(0):     "NULL"
[  4556.526] (II) NVIDIA(0): Virtual screen size determined to be 640 x 480
[  4556.526] (WW) NVIDIA(0): Unable to get display device for DPI computation.
[  4556.526] (==) NVIDIA(0): DPI set to (75, 75); computed from built-in default
[  4556.526] (==) modeset(G0): Depth 24, (==) framebuffer bpp 32
[  4556.526] (==) modeset(G0): RGB weight 888
[  4556.526] (==) modeset(G0): Default visual is TrueColor
[  4556.526] (II) Loading sub module "glamoregl"
[  4556.526] (II) LoadModule: "glamoregl"
[  4556.526] (II) Loading /usr/lib64/xorg/modules/libglamoregl.so
[  4556.528] (II) Module glamoregl: vendor="X.Org Foundation"
[  4556.528]    compiled for 1.17.1, module version = 1.0.0
[  4556.528]    ABI class: X.Org ANSI C Emulation, version 0.4
[  4556.528] (II) glamor: OpenGL accelerated X.org driver based.
[  4556.552] (EE)
[  4556.552] (EE) Backtrace:
[  4556.552] (EE) 0: /usr/bin/X (xorg_backtrace+0x48) [0x57fa38]
[  4556.552] (EE) 1: /usr/bin/X (0x400000+0x183969) [0x583969]
[  4556.552] (EE) 2: /lib64/libc.so.6 (0x7f07e41b5000+0x34ec0) [0x7f07e41e9ec0]
[  4556.552] (EE) 3: /usr/lib64/libX11.so.6 (_XSend+0x1b) [0x7f07dc2add0b]
[  4556.552] (EE) 4: /usr/lib64/libX11.so.6 (_XFlush+0x15) [0x7f07dc2ae1b5]
[  4556.552] (EE) 5: /usr/lib64/libX11.so.6 (_XGetRequest+0x55) [0x7f07dc2b0bc5]
[  4556.552] (EE) 6: /usr/lib64/libX11.so.6 (XQueryExtension+0x3d) [0x7f07dc2a4add]
[  4556.552] (EE) 7: /usr/lib64/libX11.so.6 (XInitExtension+0x22) [0x7f07dc299202]
[  4556.552] (EE) 8: /usr/lib64/libXext.so.6 (XextAddDisplay+0x4f) [0x7f07dc067d3f]
[  4556.552] (EE) 9: /usr/lib64/libnvidia-glsi.so.346.35 (0x7f07dc5a7000+0x63017) [0x7f07dc60a017]
[  4556.552] (EE) 10: /usr/lib64/libnvidia-glsi.so.346.35 (0x7f07dc5a7000+0x4484) [0x7f07dc5ab484]
[  4556.552] (EE) 11: /usr/lib64/opengl/nvidia/lib/libEGL.so.1 (0x7f07dca39000+0x2381e) [0x7f07dca5c81e]
[  4556.552] (EE) 12: /usr/lib64/opengl/nvidia/lib/libEGL.so.1 (0x7f07dca39000+0x2417a) [0x7f07dca5d17a]
[  4556.552] (EE) 13: /usr/lib64/opengl/nvidia/lib/libEGL.so.1 (0x7f07dca39000+0x2c946) [0x7f07dca65946]
[  4556.552] (EE) 14: /usr/lib64/xorg/modules/libglamoregl.so (glamor_egl_init+0x89) [0x7f07de271399]
[  4556.552] (EE) 15: /usr/lib64/xorg/modules/drivers/modesetting_drv.so (0x7f07de8fd000+0x6949) [0x7f07de903949]
[  4556.552] (EE) 16: /usr/bin/X (InitOutput+0xbd1) [0x477771]
[  4556.552] (EE) 17: /usr/bin/X (0x400000+0x3a91b) [0x43a91b]
[  4556.553] (EE) 18: /lib64/libc.so.6 (__libc_start_main+0xf5) [0x7f07e41d6ad5]
[  4556.553] (EE) 19: /usr/bin/X (0x400000+0x2620e) [0x42620e]
[  4556.553] (EE)
[  4556.553] (EE) Segmentation fault at address 0x0
[  4556.553] (EE)
Fatal server error:
[  4556.553] (EE) Caught signal 11 (Segmentation fault). Server aborting
[  4556.553] (EE)
[  4556.553] (EE)
Please consult the The X.Org Foundation support
    at http://wiki.x.org
 for help.
[  4556.553] (EE) Please also check the log file at "/var/log/Xorg.0.log" for additional information.
[  4556.553] (EE)
[  4556.561] (EE) Server terminated with error (1). Closing log file.


I'm not very experienced with Xorg configuration, so I just ran X -configure. Now X starts, but without GLX support. How can I keep my old GLX support with the new xorg-server?

Thanks in advance


Last edited by SemmZemm on Mon Feb 23, 2015 2:07 am; edited 3 times in total
mikfire
n00b

Joined: 13 Feb 2015
Posts: 5

PostPosted: Fri Feb 13, 2015 10:32 pm    Post subject: Re: xorg-server-1.17, optimus and glamor => Segmentation

The modesetting driver has now been absorbed into the xorg-server core. I had the same problem you are having. I will tell you what I did, but I make no claim that every one of these steps is required.

First, I added the dmx and glamor USE flags for building xorg-server. From the very little I was able to find, I think those two flags enable the internal modesetting functions. I can at least verify that I had a new modesetting_drv.so in /usr/lib64/xorg/modules/drivers after that. It may have been there before.
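Spelled out as commands, the rebuild might look like this (a sketch — the package.use path and flag names follow the usual Portage conventions; adjust to your own setup):

```shell
# Enable the glamor and dmx USE flags for xorg-server only
echo "x11-base/xorg-server glamor dmx" >> /etc/portage/package.use/xorg-server

# Rebuild xorg-server with the new flags
emerge --ask --oneshot x11-base/xorg-server

# Check that the built-in modesetting driver module is present afterwards
ls -l /usr/lib64/xorg/modules/drivers/modesetting_drv.so
```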

Second, I made three changes to my xorg.conf file. Yours is fairly similar to mine, so I hope this works. And this should go without saying: make a backup copy before you start mucking around.

Change your server layout to look like
Code:
Section "ServerLayout"
        Identifier "layout"
        Screen 0 "nvidia"
        Inactive "inteld"
EndSection

Change your intel device to look like
Code:
Section "Device"
        Identifier "inteld"
        Driver  "modesetting"
        BusID   "PCI:0:2:0"
EndSection

Change your intel screen to look like
Code:
Section "Screen"
        Identifier "intel"
        Device "inteld"
EndSection


Save and try the changes.

I hope it works. If it does, don't ask me why. If it doesn't, I don't think I can offer much more help.

Mik

https://xkcd.com/963/
SemmZemm
n00b

Joined: 05 Jul 2013
Posts: 62
Location: France, Russia

PostPosted: Sat Feb 14, 2015 12:50 am    Post subject:

Thank you for your help, but it didn't work.

I already had glamor enabled, so I added dmx and rebuilt xorg-server (with its dependencies).
By the way, I'm sure modesetting_drv.so was already there before.

As for the changes in xorg.conf: in theory they shouldn't change anything, but to be sure I renamed the identifiers anyway. It didn't help; I see the same result.

Any other suggestions?
mikfire
n00b

Joined: 13 Feb 2015
Posts: 5

PostPosted: Sat Feb 14, 2015 3:31 am    Post subject:

Yes, actually.

Try eselect opengl list and then eselect opengl set xorg-x11, and see if that works.

I realized that my tests had left the xorg-x11 OpenGL implementation as the system OpenGL library. Once I set it back to nvidia, I get a very similar core dump to yours.

Reverting to nvidia-343.36 didn't help. I will keep digging at it.
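For anyone following along, the full sequence is (eselect module and target names as on a stock Gentoo install):

```shell
# List the installed OpenGL implementations; the active one is marked
eselect opengl list

# Switch the system OpenGL library to the X.org/Mesa implementation
eselect opengl set xorg-x11

# ...and back to the proprietary one later, if desired
eselect opengl set nvidia
```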
SemmZemm
n00b

Joined: 05 Jul 2013
Posts: 62
Location: France, Russia

PostPosted: Sat Feb 14, 2015 4:27 am    Post subject:

Thank you so much! Now it works. I don't remember the previous value for glxgears, but now it's about 900 FPS.
I find it quite reasonable for my GT740M.

Once again, thank you a lot.

SOLVED.

By the way, the only thing I actually needed was to select the correct OpenGL implementation.
It works with the glamor and dmx flags disabled, and I didn't have to modify my xorg.conf.

The only thing I'd still like to verify is that applications really use my NVIDIA card for computations, and not the integrated Intel one.
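One way to check which GPU is actually rendering (assuming glxinfo from x11-apps/mesa-progs is installed — a sketch; exact output strings vary by driver):

```shell
# On a working Optimus/xrandr setup this should name the NVIDIA card,
# not the Intel IGP
glxinfo | grep -i "opengl renderer"

# The proprietary driver also ships nvidia-smi, which lists processes
# currently running on the NVIDIA GPU
nvidia-smi
```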


Last edited by SemmZemm on Sat Feb 14, 2015 5:08 am; edited 1 time in total
mikfire
n00b

Joined: 13 Feb 2015
Posts: 5

PostPosted: Sat Feb 14, 2015 5:06 am    Post subject:

SemmZemm wrote:
Thank you so much! Now it works. I don't remember the previous value for glxgears, but now it's about 900 FPS.
I find it quite reasonable for my GT740M.

Once again, thank you a lot.

SOLVED.


I wouldn't say solved. You are using the xorg-x11 OpenGL implementation, not the NVIDIA one. This is a workaround.
SemmZemm
n00b

Joined: 05 Jul 2013
Posts: 62
Location: France, Russia

PostPosted: Sat Feb 14, 2015 5:09 am    Post subject:

I already edited my previous message; I was wrong. But it's still a better workaround than the one I had before.
SemmZemm
n00b

Joined: 05 Jul 2013
Posts: 62
Location: France, Russia

PostPosted: Mon Feb 16, 2015 9:31 am    Post subject:

I also asked on the NVIDIA board, in case someone would like to follow along:
https://devtalk.nvidia.com/default/topic/811657/linux/xorg-server-1-17-on-optimus-laptop-doesn-t-start-when-nvidia-opengl-implementation-is-selected/
SemmZemm
n00b

Joined: 05 Jul 2013
Posts: 62
Location: France, Russia

PostPosted: Wed Feb 18, 2015 12:00 pm    Post subject:

up
SemmZemm
n00b

Joined: 05 Jul 2013
Posts: 62
Location: France, Russia

PostPosted: Wed Feb 18, 2015 6:29 pm    Post subject:

The workaround posted here:

https://devtalk.nvidia.com/default/topic/811657/linux/xorg-server-1-17-on-optimus-laptop-doesn-t-start-when-nvidia-opengl-implementation-is-selected/

worked for me.

Disable glamor for the modesetting driver - in the section of the config file that configures the modesetting driver, add
Code:
Option "AccelMethod" "none"


For the moment I'm not completely sure how much it degrades performance, but it works for me.
Could someone explain at what point it is worse than using glamor?
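For clarity, the option goes into the Device section for the modesetting driver — a sketch based on the config posted earlier in this thread:

```
Section "Device"
   Identifier "intel"
   Driver "modesetting"
   BusID "PCI:0:2:0"
   Option "AccelMethod" "none"
EndSection
```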
Barbieken
n00b

Joined: 22 Mar 2014
Posts: 62

PostPosted: Thu Feb 19, 2015 9:09 am    Post subject:

This method with "AccelMethod" "none" does not work for me with kernel 3.19 / xorg-server 1.17.1 / nvidia-drivers-346.35.
nvidia.conf:
Code:

Section "ServerLayout"
    Identifier "layout"
    Screen 0 "nvidia"
    Inactive "intel"
EndSection


Section "Device"
    Identifier "nvidia"
    Driver "nvidia"
    BusID "PCI:1:0:0"
    Option "RegistryDwords" "EnableBrightnessControl=1"
#    Option "RegistryDwords" "PerfLevelSrc=0x3333"
EndSection

Section "Screen"
    Identifier "nvidia"
    Device "nvidia"
    # Uncomment this line if your computer has no display devices connected to
    # the NVIDIA GPU.  Leave it commented if you have display devices
    # connected to the NVIDIA GPU that you would like to use.
    #Option "UseDisplayDevice" "none"
     Option "AllowEmptyInitialConfiguration"
EndSection

Section "Device"
    Identifier "intel"
    Driver "modesetting"
    BusID "PCI:0:2:0"
    Option "AccelMethod" "none"
EndSection

Section "Screen"
    Identifier "intel"
    Device "intel"
EndSection


With the default drivers I get a black screen immediately, without even a cursor. With the following two patches for nvidia-drivers (see: https://devtalk.nvidia.com/default/topic/796559/kernel-3-18-warning-no-drm_driver-set_busid-implementation-provided-by-nvidia_frontend_exit_modu/) I can get to the KDE login screen, but after login it immediately segfaults and returns to the login screen again.

Code:

--- kernel/nv-drm.c	2015-01-11 07:30:46.000000000 +0300
+++ kernel/nv-drm.c.2	2015-02-18 22:51:57.215196554 +0300
@@ -128,6 +128,10 @@
     .gem_prime_vmap = nv_gem_prime_vmap,
     .gem_prime_vunmap = nv_gem_prime_vunmap,
 
+#if LINUX_VERSION_CODE >= KERNEL_VERSION(3, 18, 0)
+    .set_busid = drm_pci_set_busid,
+#endif
+
     .name = "nvidia-drm",
     .desc = "NVIDIA DRM driver",
     .date = "20130102",






--- kernel/nv-linux.h	2015-01-11 07:30:46.000000000 +0300
+++ kernel/nv-linux.h.2	2015-02-18 22:57:03.879194292 +0300
@@ -2000,7 +2000,7 @@
 #if defined(NV_FILE_HAS_INODE)
 #define NV_FILE_INODE(file) (file)->f_inode
 #else
-#define NV_FILE_INODE(file) (file)->f_dentry->d_inode
+#define NV_FILE_INODE(file) (file)->f_path.dentry->d_inode
 #endif
 
 /* Stub out UVM in multi-RM builds */






/var/log/kdm.log content:
Code:

(EE)
(EE) Backtrace:
(EE) 0: /usr/bin/X (xorg_backtrace+0x49) [0x57f329]
(EE) 1: /usr/bin/X (0x400000+0x183199) [0x583199]
(EE) 2: /lib64/libc.so.6 (0x7f73fc468000+0x34ed0) [0x7f73fc49ced0]
(EE) 3: /lib64/libc.so.6 (strlen+0x2a) [0x7f73fc4e95ca]
(EE) 4: /lib64/libc.so.6 (__strdup+0xe) [0x7f73fc4e930e]
(EE) 5: /usr/lib64/xorg/modules/drivers/nvidia_drv.so (0x7f73f6bd2000+0x90a9c) [0x7f73f6c62a9c]
(EE) 6: /usr/lib64/xorg/modules/drivers/nvidia_drv.so (0x7f73f6bd2000+0x5d6292) [0x7f73f71a8292]
(EE) 7: /usr/bin/X (0x400000+0x36a07) [0x436a07]
(EE) 8: /usr/bin/X (0x400000+0x3a95b) [0x43a95b]
(EE) 9: /lib64/libc.so.6 (__libc_start_main+0xf5) [0x7f73fc489aa5]
(EE) 10: /usr/bin/X (0x400000+0x261ae) [0x4261ae]
(EE)
(EE) Segmentation fault at address 0x0
(EE)
Fatal server error:
(EE) Caught signal 11 (Segmentation fault). Server aborting
(EE)
(EE)
Please consult the The X.Org Foundation support
    at http://wiki.x.org
 for help.
(EE) Please also check the log file at "/var/log/Xorg.0.log" for additional information.
(EE)
(EE) Server terminated with error (1). Closing log file.
klauncher(3302) kdemain: No DBUS session-bus found. Check if you have started the DBUS server.
kdeinit4: Communication error with launcher. Exiting!
kdmgreet(3296)/kdecore (K*TimeZone*): KSystemTimeZones: ktimezoned initialize() D-Bus call failed:  "Not connected to D-Bus server"

kdmgreet(3296)/kdecore (K*TimeZone*): No time zone information obtained from ktimezoned.



/usr/share/config/kdm/Xsetup:
Code:

#! /bin/sh
# Xsetup - run as root before the login dialog appears

#xconsole -geometry 480x130-0-0 -notify -verbose -fn fixed -exitOnFail -file /dev/xconsole &
xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto
xrandr --dpi 144



After I removed the nvidia.conf file from /etc/X11/xorg.conf.d/ KDE works without problems, but it's not a solution, as I think nvidia and opengl do not work.
Ant P.
Watchman

Joined: 18 Apr 2009
Posts: 5592

PostPosted: Thu Feb 19, 2015 5:58 pm    Post subject:

SemmZemm wrote:
Thank you so much! Now it works. I don't remember the previous value for glxgears, but now it's about 900 FPS.
I find it quite reasonable for my GT740M.

Just for reference: if everything is set up correctly, you should be seeing either a five-digit number there or your monitor's refresh rate.
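To explain the two cases: glxgears is normally capped at the monitor's refresh rate by vsync; to see the uncapped figure, vsync can be disabled for a single run (environment variable names as documented for the respective drivers):

```shell
# Proprietary NVIDIA driver: disable sync-to-vblank for this run
__GL_SYNC_TO_VBLANK=0 glxgears

# Mesa drivers (e.g. when the xorg-x11 OpenGL implementation is active)
vblank_mode=0 glxgears
```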
SemmZemm
n00b

Joined: 05 Jul 2013
Posts: 62
Location: France, Russia

PostPosted: Sun Feb 22, 2015 6:04 am    Post subject:

Ant P. wrote:
SemmZemm wrote:
Thank you so much! Now it works. I don't remember the previous value for glxgears, but now it's about 900 FPS.
I find it quite reasonable for my GT740M.

Just for reference, you should be seeing either a five digit number or your monitor's refresh rate there if everything's set up correctly.

Yes, I always understood that; now it's correct.

Barbieken, that's rather a problem between the kernel and nvidia-drivers, but thank you. I have the same problem with kernels >3.17.* and I'll try your solution.
SemmZemm
n00b

Joined: 05 Jul 2013
Posts: 62
Location: France, Russia

PostPosted: Mon Feb 23, 2015 2:06 am    Post subject:

Barbieken, thank you. I tested your solution and now I can use nvidia-drivers with kernel 3.19.
The only problem I had is that I couldn't copy-paste your patch cleanly: there is some stray punctuation, and maybe some problems with the whitespace.
Rather than untangle it, I made my own diffs, so here are the patches for people experiencing the same problem:
http://paste.ubuntu.com/10365368/
http://paste.ubuntu.com/10365376/
weidong
n00b

Joined: 01 Mar 2015
Posts: 4

PostPosted: Sun Mar 01, 2015 3:03 am    Post subject:

Barbieken wrote:

[...]
After I removed the nvidia.conf file from /etc/X11/xorg.conf.d/ KDE works without problems, but it's not a solution, as I think nvidia and opengl do not work.



I have the same problem (an Optimus laptop).

I think it is an NVIDIA driver problem: a broken KScreen configuration causes the X server to stop.

Remove /etc/X11/xorg.conf, start KDE, then go to System Settings -> Startup and Shutdown -> Service Manager -> Startup Services -> KScreen 2; stop it and disable it at startup.

Reboot and re-enable the NVIDIA driver.

But then the multi-screen configuration cannot be saved.
SemmZemm
n00b

Joined: 05 Jul 2013
Posts: 62
Location: France, Russia

PostPosted: Tue Mar 03, 2015 8:48 am    Post subject:

I use multiple screens with XFCE and it works fine: I can reboot, I can even leave my second screen disconnected, and after restarting X with the second monitor connected it restores the parameters (except for a small problem with the panel settings, but that's clearly XFCE's problem). So I don't think it's an nvidia-drivers problem.
weidong
n00b

Joined: 01 Mar 2015
Posts: 4

PostPosted: Thu Mar 12, 2015 2:21 am    Post subject:

I updated the NVIDIA drivers to 346.47 and there is no problem (Intel HD 4600 + GTX 860M, KDE 5, xorg-server-1.16.4).
Barbieken
n00b

Joined: 22 Mar 2014
Posts: 62

PostPosted: Sat Mar 14, 2015 12:01 pm    Post subject:

I confirm: nvidia-drivers-346.47 works fine with kernel 3.19 / xorg-server 1.17.1 with Optimus, without any patches. Lenovo ThinkPad T540p, IPS 3K display, NVIDIA 730M / Intel HD Graphics, KDE 4.
Atmmac
Tux's lil' helper

Joined: 17 Oct 2013
Posts: 130
Location: Watertown, MA

PostPosted: Tue Mar 17, 2015 7:59 pm    Post subject:

I just upgraded to 3.19.1 and enabled the "AccelMethod" "none" option mentioned above. I also just migrated off the old bumblebee/bbswitch configuration with primus. Performance seems good, but it was definitely a hassle to switch over.