Gentoo Forums
How can I tell if NVidia Optimus is working properly?
Gentoo Forums Forum Index » Desktop Environments
mr-simon
Guru


Joined: 22 Nov 2002
Posts: 367
Location: Leamington Spa, Warks, UK

Posted: Fri Aug 10, 2018 1:45 pm    Post subject: How can I tell if NVidia Optimus is working properly?

I'm setting up a new laptop that has hybrid Intel (integrated) and NVidia (discrete) graphics.

I *think* I've got it working properly but the battery life on the laptop is not really what I'd like. It's possible that it's only using the discrete graphics card, and not the integrated one. Or, it's possible that the battery life on my laptop just isn't very good.

glxinfo says that NVidia is doing the rendering, but maybe it should do that because it's 3d, rather than 2d?

Is there a way to tell which adapter is in use for what, so I can tell whether it's working?
_________________
"Pokey, are you drunk on love?"
"Yes. Also whiskey. But mostly love... and whiskey."
fedeliallalinea
Administrator


Joined: 08 Mar 2003
Posts: 30837
Location: here

Posted: Fri Aug 10, 2018 1:56 pm    Post subject: Re: How can I tell if NVidia Optimus is working properly?

mr-simon wrote:
glxinfo says that NVidia is doing the rendering, but maybe it should do that because it's 3d, rather than 2d?

This means that rendering is done by the NVIDIA card.
Intel rendering:
$ glxinfo | grep OpenGL
OpenGL vendor string: Intel Open Source Technology Center
OpenGL renderer string: Mesa DRI Intel(R) Skylake Halo GT2
OpenGL core profile version string: 3.3 (Core Profile) Mesa 11.0.6
OpenGL core profile shading language version string: 3.30
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 3.0 Mesa 11.0.6
OpenGL shading language version string: 1.30


NVIDIA rendering:
$ optirun glxinfo | grep OpenGL
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce GTX 960M/PCIe/SSE2
OpenGL core profile version string: 4.4.0 NVIDIA 361.28
OpenGL core profile shading language version string: 4.40 NVIDIA via Cg compiler
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 4.5.0 NVIDIA 361.28
OpenGL shading language version string: 4.50 NVIDIA
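That vendor-string check can be wrapped in a small helper so you can test it from a script (a sketch only; `check_vendor` is a made-up name, not part of any tool):

```shell
# Sketch: classify which GPU is rendering from glxinfo's vendor line.
check_vendor() {
  case "$1" in
    *NVIDIA*) echo nvidia ;;
    *Intel*)  echo intel ;;
    *)        echo unknown ;;
  esac
}

# On a live system you would feed it the real output:
#   check_vendor "$(glxinfo | grep 'OpenGL vendor string')"
```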

_________________
Questions are guaranteed in life; Answers aren't.
mr-simon
Guru


Joined: 22 Nov 2002
Posts: 367
Location: Leamington Spa, Warks, UK

Posted: Fri Aug 10, 2018 2:41 pm

Thanks. I'm actually trying to use native (automatic) Optimus with the Nvidia driver without Bumblebee, as described here: https://wiki.gentoo.org/wiki/NVIDIA/Optimus - I don't have optirun.

This is with `xrandr --auto`.

That's exactly what I'm not sure about: how can I tell what the automatic mode is doing?
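One starting point is `xrandr --listproviders`, which shows which GPUs X can see. A tiny helper to pull the provider count out of that output might look like this (a sketch; `count_providers` is a made-up name, and the exact output format can vary between xrandr versions):

```shell
# Sketch: extract the provider count from `xrandr --listproviders` output.
count_providers() {
  printf '%s\n' "$1" | sed -n 's/^Providers: number : \([0-9]*\).*/\1/p'
}

# On a live system:
#   count_providers "$(xrandr --listproviders)"
# Two providers (e.g. modesetting plus NVIDIA-0) means X sees both GPUs.
```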
_________________
"Pokey, are you drunk on love?"
"Yes. Also whiskey. But mostly love... and whiskey."
hhfeuer
Apprentice


Joined: 28 Jul 2005
Posts: 185

Posted: Fri Aug 10, 2018 9:17 pm

I think there's a misunderstanding about what 'PRIME' is.
There are two types of PRIME: output and render offload. Offload is the mechanism used on Windows: the desktop runs on the iGPU and specific applications are set to run on the dGPU. Output means that everything is rendered on the dGPU and then displayed through the iGPU.
Nouveau supports both types; the proprietary driver supports only the 'output' type.
This means there's no automatic switching: you're using the NVIDIA GPU the whole time.
Switching to the iGPU for powersave involves
- Switching xorg.conf (depending on whether you're using an xorg.conf or an OutputClass config snippet)
- Switching OpenGL from nvidia to mesa (eselect opengl)
- Stopping the Xserver
- Unloading nvidia modules (modprobe -r nvidia)
- Turning off the dGPU (bbswitch)
- Starting the xserver
Depending on your init system, this is easily implemented as a script and service; a search for prime and gentoo should come up with something usable.
A general piece of advice for power saving on notebooks is to use kernel 4.17. I get the same power consumption with it that earlier kernels only gave me after a lot of fiddling with powertop.
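The switching steps above could be sketched as an OpenRC-flavoured script roughly like this (a sketch only: the xorg.conf filenames are made up, and the display-manager service name depends on your setup; don't run it verbatim):

```shell
#!/bin/sh
# Sketch of the dGPU-to-iGPU switch sequence described above.
# Filenames and the xdm service name are illustrative assumptions.

# 1. Swap in an Intel-only xorg.conf (hypothetical filenames)
cp /etc/X11/xorg.conf.intel /etc/X11/xorg.conf

# 2. Point OpenGL at Mesa instead of the NVIDIA libraries
eselect opengl set xorg-x11

# 3. Stop the X server
rc-service xdm stop

# 4. Unload the NVIDIA kernel modules
modprobe -r nvidia_drm nvidia_modeset nvidia

# 5. Power off the dGPU via bbswitch
echo OFF > /proc/acpi/bbswitch

# 6. Start the X server again
rc-service xdm start
```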

xrandr --auto just detects connected monitors and sets their modes automatically.
Bumblebee tries to mimic offload using the proprietary driver, at the expense of GPU performance.
mr-simon
Guru


Joined: 22 Nov 2002
Posts: 367
Location: Leamington Spa, Warks, UK

Posted: Sat Aug 11, 2018 1:04 pm

hhfeuer wrote:
I think there's a misunderstanding about what 'PRIME' is.
There are two types of PRIME: output and render offload. Offload is the mechanism used on Windows: the desktop runs on the iGPU and specific applications are set to run on the dGPU. Output means that everything is rendered on the dGPU and then displayed through the iGPU.
Nouveau supports both types; the proprietary driver supports only the 'output' type.
This means there's no automatic switching: you're using the NVIDIA GPU the whole time.

Thanks for the clarification... That's not what the wiki says (or at least, it's not clear):
Quote:
Laptops with Nvidia graphics cards using Nvidia Optimus can be configured to render scenes on the discrete Nvidia GPU (Graphics Processing Unit) card using x11-drivers/nvidia-drivers and copy the rendered scenes to the Intel GPU using XRandR.

But our friends at Arch seem to have written this up a bit more clearly:
Quote:
NVIDIA Optimus is a technology that allows an Intel integrated GPU and discrete NVIDIA GPU to be built into and accessed by a laptop. Getting Optimus graphics to work on Arch Linux requires a few somewhat complicated steps, explained below. There are several methods available:
  • disabling one of the devices in BIOS, which may result in improved battery life if the NVIDIA device is disabled, but may not be available with all BIOSes and does not allow GPU switching
  • using the official Optimus support included with the proprietary NVIDIA driver, which offers the best NVIDIA performance but does not allow GPU switching and can be more buggy than the open-source driver
  • using the PRIME functionality of the open-source nouveau driver, which allows GPU switching and powersaving but offers poor performance compared to the proprietary NVIDIA driver and may cause issues with sleep and hibernate
  • using the third-party Bumblebee program to implement Optimus-like functionality, which offers GPU switching and powersaving but requires extra configuration
  • using nvidia-xrun. Utility to run separate X with discrete nvidia graphics with full performance

So, yes you're right. Looks like using Bumblebee would be the appropriate next step.

My last laptop was a macbook which switched transparently depending on workload, which was really nice. I was hoping that this would achieve something similar, but I guess that's not possible. Shame.
_________________
"Pokey, are you drunk on love?"
"Yes. Also whiskey. But mostly love... and whiskey."
hhfeuer
Apprentice


Joined: 28 Jul 2005
Posts: 185

Posted: Sat Aug 11, 2018 1:55 pm

Yes, the Gentoo wiki entry is a bit chaotic, but at least it now contains a lot of specific config info, whereas previously it was more like a stub article with some outdated info. For a long time the Arch article was the only comprehensive one. There's also a lot of bogus info about PRIME offload, PRIME output, and reverse PRIME floating around the web, so it's hard to get the facts right.
AFAIK, OS X only switches depending on power source (AC/battery), not on workload, but that might have changed in recent versions.
mr-simon
Guru


Joined: 22 Nov 2002
Posts: 367
Location: Leamington Spa, Warks, UK

Posted: Sat Aug 11, 2018 2:34 pm

On OS X, it definitely cares about workload when dynamic switching is enabled (the default), and has done since I started using it about 3 years ago. I ran a GPU information utility that told me every time it was using the discrete GPU. When I opened a heavy-duty browser app tab, or a game, it would give me a notification that the GPU had switched.

The mechanism was completely transparent, and the only way you could tell what was happening was to run a third-party program to tell you the state.
_________________
"Pokey, are you drunk on love?"
"Yes. Also whiskey. But mostly love... and whiskey."