Bluefin: Making your NVIDIA dGPU primary for external monitors
Because otherwise you will get choppy graphics
I’m currently borrowing a laptop that I’m booting from an NVMe enclosure with all my Linux shenanigans, in particular my Bluefin installation with everything on it.
The laptop comes with an NVIDIA GPU and integrated Intel HD Graphics 630 (Gen9), a setup you would know as “Hybrid Graphics”. Both of them work well until I plug an external monitor into the HDMI port.
For some reason, the monitor gets really “choppy” and “laggy”, as if it were running at a low FPS. The cursor is often unresponsive, and even browsing the Internet feels like being on a remote desktop. Well, there is a reason, and it’s the HDMI port.
NVIDIA is the gatekeeper
After surfing the Internet about why this happens, I found that very few posts actually explain what the problem is and how to fix it. Eventually I landed on two Reddit comments that gave insight into why this performance problem exists on a secondary monitor:
When you connect a secondary screen, the iGPU renders both screens. Because the HDMI port is wired to the NVIDIA dGPU, the latter has to read the framebuffer from the iGPU, which AFAIK resides in RAM. Not only does the dGPU have to go through the CPU to read that RAM, it also has to push that data out to the HDMI port. This is slow by miles.
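If you want to see this wiring on your own machine, the DRM subsystem exposes it in sysfs; a quick, read-only peek (the card numbering varies per laptop) should show which GPU each physical connector hangs off of:
# Connector nodes are prefixed with their parent card, e.g. card1-HDMI-A-1
ls /sys/class/drm/
# Map each cardN back to its PCI device to tell the iGPU from the dGPU
ls -l /sys/class/drm/card*/device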
There are three solutions to avoid this:
- Get back to Windows 11 (unfeasible)
- Connect the monitor to a port connected to the CPU/iGPU, like a USB4 or Thunderbolt port (this model doesn’t have any)
- Set the primary display adapter to the dGPU (only choice).
The following content has been edited to accommodate a better fix for this problem
Wayland, GNOME and displays
I’m using Bluefin, a Fedora Silverblue “spin” that is distributed as an OCI image. In other words, the OS is immutable (read-only), and you cannot change most of the core configuration. Normal people wouldn’t anyway.
Eventually, after testing a lot of solutions to no avail, I found out that adding some environment variables would fix it; but the GNOME Mutter developers pointed me in an even better direction.
First, we need the Vendor ID and Device ID of our GPU, which in this case is the NVIDIA one. We can find them by listing all PCI devices and filtering by the graphics card’s name.
lspci -nn | grep NVIDIA
01:00.0 3D controller [0302]: NVIDIA Corporation ... [10de:1234] (rev a1)
01:00.1 Audio device [0403]: NVIDIA Corporation ... [10de:00ff] (rev a1)
We can find both enclosed in brackets at the end of the device name. In this example, the Vendor ID for NVIDIA is 10de and the Device ID is 1234.
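As a side note, if you’d rather have the shell fish the pair out for you, something like this should work too (a small sketch assuming GNU grep, since it uses -P):
# Prints vendor:device pairs such as 10de:1234, one per NVIDIA PCI function
lspci -nn | grep -i nvidia | grep -oP '\[\K[0-9a-f]{4}:[0-9a-f]{4}(?=\])'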
Once we have them, let’s create a udev rule file with superuser permissions. For the sake of brevity, we can use the nano editor to quickly create the file in one go.
sudo nano /etc/udev/rules.d/61-mutter-preferred-primary-gpu.rules
Once done, we can copy and paste the following udev rule, where VENDOR_ID and DEVICE_ID are the Vendor ID and Device ID you picked up before, respectively.
SUBSYSTEM=="drm", ENV{DEVTYPE}=="drm_minor", ENV{DEVNAME}=="/dev/dri/card[0-9]", SUBSYSTEMS=="pci", ATTRS{vendor}=="0xVENDOR_ID", ATTRS{device}=="0xDEVICE_ID", TAG+="mutter-device-preferred-primary"
So, with our example values filled in, that part would look like this:
ATTRS{vendor}=="0x10de", ATTRS{device}=="0x1234"
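Putting it all together with the example IDs from above (yours will almost certainly differ), the complete line in the file would read:
SUBSYSTEM=="drm", ENV{DEVTYPE}=="drm_minor", ENV{DEVNAME}=="/dev/dri/card[0-9]", SUBSYSTEMS=="pci", ATTRS{vendor}=="0x10de", ATTRS{device}=="0x1234", TAG+="mutter-device-preferred-primary"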
Once you’re done, I would just restart the system using systemctl reboot. I didn’t have any success reloading the udev rules and closing the session and starting it again with the following:
udevadm control --reload-rules && udevadm trigger
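Either way, once you are back in the session you can check that the tag actually landed on the device node; the card number below is just a guess, adjust it to whichever /dev/dri/cardN belongs to the NVIDIA GPU:
# The tag should show up in the TAGS / CURRENT_TAGS lines
udevadm info --name=/dev/dri/card1 | grep -i tags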
Help! My apps don’t open!
If your applications are not opening, it may be because GNOME 46 is resorting to the Vulkan renderer on Wayland, which becomes the default in GNOME 47. Until NVIDIA fixes this in their drivers, it can be worked around temporarily with this environment variable:
echo "GSK_RENDERER=ngl" > ~/.config/environment.d/gsk.conf
This ensures GNOME uses OpenGL to render the desktop apps. If that doesn’t fix it, then to the forums you go.
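Two small practical notes on that command, in case it trips you up: the environment.d directory may not exist yet, and the variable only takes effect when the systemd user session starts again. A quick sketch:
# Create the directory first if the echo above complained
mkdir -p ~/.config/environment.d
# After logging out and back in, confirm the variable reached your session
systemctl --user show-environment | grep GSK_RENDERER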
Caveats
There are two caveats. The first is that the dGPU will always be running; in other words, your laptop will not last as long on battery as it would using only the iGPU.
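If you are curious whether the dGPU really is being kept awake, reasonably recent kernels expose each PCI device’s power state through sysfs (again, the card numbering is machine-specific):
# D0 means fully powered on; D3cold means the GPU is suspended
cat /sys/class/drm/card*/device/power_state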
The second caveat, and the more important one for people who move between places, is that changing the configuration requires a full reboot, probably because it works at the kernel level. At least, reloading the udev rules and logging out and back in didn’t work for me.
Who knows, maybe in the future someone will make a way to change the primary graphics driver “on the go”, with a graphical switch or, better yet, by listening to AC power events, but I assume that would mean a major rework of the rendering stack to avoid disrupting the user session.
For now, before unplugging your laptop, comment out the rule and restart the system so your battery doesn’t fly off like a cat in a bathtub.
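One possible shortcut (untested here, but udev only loads files ending in .rules): renaming the file should be enough to disable the rule before that reboot, and renaming it back re-enables it.
# Disable the rule (the iGPU becomes primary again after the next reboot)
sudo mv /etc/udev/rules.d/61-mutter-preferred-primary-gpu.rules{,.disabled}
# Re-enable it when you are back at a desk with the external monitor
sudo mv /etc/udev/rules.d/61-mutter-preferred-primary-gpu.rules{.disabled,}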