Hi all,

I’m in the market for a new, big desktop-replacement gaming laptop, and looking around, the options are almost exclusively Nvidia-powered.

I was wondering about the state of their new open-source driver. Can I run a plain vanilla kernel with only open-source / upstream packages and drivers and expect a good experience? How are battery life and performance? Do DRI PRIME and Vulkan-based GPU selection “just work”?

The only alternative new for my market is a device with an Intel Arc A730M, which I currently think is going to be the one I end up buying.

Edit 19/11: Thanks for all the feedback everyone! Since the reactions were quite mixed - “it works perfectly for me” vs “it’s an unmaintainable mess that breaks all the time” - I’m going to err on the side of caution and look elsewhere. I found a used laptop with an AMD Radeon RX 6700M, which I’m going to check out in the coming days. If not, I’ve also found Alienware sells their m16 laptop with an RX 7600M XT, which might be a good buy for me (I currently still rock an Alienware 17R1 from 2013 with an MXM card from a decommissioned industrial computer in it).

  • redxef@feddit.de

    Nouveau is stable and runs, but don’t expect the best performance. The official NVIDIA driver is unstable and lacks proper Wayland support, but has decent performance. I’d go with anything but an NVIDIA GPU.

  • h3ndrik@feddit.de

    I don’t think so. I can’t find any good information about those new ‘open-source’ kernel modules in any of the Linux wikis, just news articles from 2022. Something isn’t right there. It’s either a marketing stunt and nothing actually changed, or something else is going on. I would dig deeper if I were you.

    Concerning NVidia’s history: Don’t rely on them making user-friendly decisions. Especially when it comes to Linux. The usual drivers work. They have some hiccups, you’re going to have some annoying issues with things like Wayland, and if something major changes in the kernel you have to wait for NVidia, but they’ll eventually fix it. It’s not open source and you have to live with what they give you. It mostly works though, and performance is great. I’d say this is the same with the newer ‘open-source’ drivers that just shift things into (proprietary) userspace and firmware.

    The true open-source alternative is the ‘Nouveau’ drivers. For newer graphics cards, expect them to get only a fraction of the performance out of your GPU, with half the features not yet implemented, including power management. So your game will run at 10 fps with the fans on max while it empties your battery in 20 minutes.

    On my laptop, Nouveau only became an alternative after several years, once development caught up and it reached comparable performance and battery life to the proprietary drivers. But you might be replacing the laptop at that point. Waiting for NVidia or the open-source drivers to catch up hasn’t been worth it for me in the past. I did that twice, and every time I ended up living with the proprietary drivers instead.

    So my advice is: Be comfortable using the proprietary drivers if you want to buy NVidia.

    Intel Arc got really bad performance reviews. It’s not worth spending lots of money on them. But fortunately they’re cheap because the gamers don’t buy them (for that reason). I live with the iGPU that’s part of my CPU. It’s alright since I don’t play modern games anyways.

    But you missed AMD. There are some laptops available with the Ryzen 7040 series and it seems to be a fast CPU. They also made the integrated graphics way faster than before, albeit probably still not on the level for proper gaming. But I bet there are desktop replacements out there that combine it with an AMD GPU.

    • wim@lemmy.sdf.orgOP

      Thanks, that’s what I was thinking as well.

      I didn’t miss AMD. The dedicated GPUs just aren’t available new in my wider area, unless they’re put into the mediocre plastic shell of a budget laptop, and the integrated GPUs don’t work for my use case.

      I just sold an AMD laptop (with an RX 6800S) because I wanted a bigger screen. I don’t need top-tier performance; most of the games I play are fine on mainstream gaming hardware. The software experience was perfect, but I didn’t use the laptop very often because it was 14" and uncomfortable to use on the couch because of the screen hinge design.

      I already have a perfectly fine 2021 ThinkPad X1 Nano that does everything I want from a portable computer, and I noticed I just never had a reason to use the gaming laptop unless I was gaming. I just want something with a bigger screen and a better GPU that will only move between our living room table and the storage rack, with the occasional car trip. If the 18" Alienware with the RX 7900M were for sale here (for a reasonable price) I would buy that, but that is not going to happen.

  • russjr08@outpost.zeuslink.net

    As someone who just had to shell out the money to do a lateral move from an Nvidia 2080 to an RX 6700 XT - don’t go with Nvidia if you want to have a good time.

    • wim@lemmy.sdf.orgOP

      That’s what I got from my past experiences as well, but I haven’t owned anything Nvidia since the Pascal (GTX 10x0) era so I wanted to check if anything got better with their open source efforts.

      • interceder270@lemmy.world

        As someone with an Nvidia GPU (3060 mobile) and no issues, I’d say this is mostly FUD from AMD fanboys.

        Experience > theory, every time. Especially the theory of strangers on internet forums. :)

        • russjr08@outpost.zeuslink.net

          Yeah, I wish it had just been theory; I wouldn’t blatantly say something like my original comment if it weren’t based on experience. I’ve written numerous comments on my experience with Nvidia + Linux [+ Wayland] - such as this comment, primarily the second, third, and fourth paragraphs. Sadly I don’t think it’s possible to “relative”-link direct comments, so I’ve just linked my instance instead.

          Since you mentioned it’s a mobile GPU, I’m not sure if perhaps you also have an integrated GPU that is drawing your regular desktop. My friend doesn’t have nearly the same number of issues that I have with Wayland, because he’s able to drive his desktop with his iGPU and uses GPU passthrough to play games through a Windows VM - the 5600X that I have doesn’t include integrated graphics, so this was not possible for me.

          Either way, if it works for you then fantastic. It certainly didn’t work for me, and definitely not for a lack of trying.

  • umbrella@lemmy.ml

    DON’T get an nvidia GPU.

    It works, but it’s a nightmare sometimes. The drivers are still bad. Don’t.

    -nvidia user

  • buckykat [none/use name]@hexbear.net

    Desktop replacement gaming laptops are a mistake. You can buy a normal laptop and the parts to build a gaming desktop for the same price; the laptop will be much more practical to carry around, while the desktop will perform better and last longer.

    • rufus@discuss.tchncs.de

      And you will be able to upgrade a desktop computer. You could at some point swap the GPU or buy another stick of RAM for $60, whereas most things are soldered in laptops nowadays. Oftentimes they even solder the RAM to move it closer to the CPU and make the laptop a bit cheaper, since it then requires fewer mechanical brackets/parts.

      Also, a laptop will almost never reach the same performance, because it’s more difficult to get all the heat out and it’ll throttle to a lower clock rate once the heat builds up in that small form factor.

      But it can be worth it if you need one device that can both do gaming and be carried around. Desktop replacements are quite popular. But they come with exactly those downsides. And it may or may not be cheaper than buying an ultrabook plus a PC tailored to gaming. It’s always a compromise, though.

    • sovietknuckles [they/them]@hexbear.net

      But desktop builds won’t use less electricity. I use a desktop replacement gaming laptop at home, without taking it anywhere, because it consumes less power.

    • wim@lemmy.sdf.orgOP

      I mean that’s fine if that’s your opinion. But while they may be a mistake for you, I’ve found them to be a great compromise and enjoyed several of them for the past 10 years.

      I have a normal laptop, a ThinkPad X1 Nano, which I love. I also have a desktop with an RX 6800, but I can only use that in my office, a cramped space which has poor Internet and is in an inconvenient spot in our house.

      I’m looking for something that I can keep in the living room and set up on our living room table to play some games with friends. I’ve had that desktop for almost 3 years, and yet most of my gaming in that time has been on a 2013 Alienware laptop with an upgraded MXM graphics card.

      Different solutions for different people.

  • grinceur@programming.dev

    It’s not there yet, but there is hope a few years from now: a Vulkan driver is in the works, and the NVIDIA-signed firmware should allow power management for newer GPUs, but it’s not ready yet…

  • Yerbouti@lemmy.ml

    Linux noob here. Why do people refuse to use the proprietary driver? I haven’t had any serious issue with my 2080 Ti on Nobara. I can game and edit videos with better performance than in Windows on the same PC.

    • wim@lemmy.sdf.orgOP

      I have had so many issues with Nvidia drivers, especially on laptops with Optimus. Black screens after booting, random breakage when updating, having to fuck around with OpenGL libraries all the time when you have integrated Intel graphics and Nvidia graphics on the same system. It’s just a pain for me on laptops.

      Wouldn’t be such a big issue on a desktop, but I’ve had a work-provided workstation with an Nvidia card, and 99% of the time, if something broke on that machine, it was because Nvidia wasn’t compatible with some updated kernel or libraries.

      Intel and AMD have both provided us with a painless driver experience that just works out of the box all the time and is integrated in all the open source things (mainly the Linux kernel and the Mesa libraries for OpenGL & Vulkan). With Nvidia, you need to throw all that out and use their proprietary blobs for OpenGL and Vulkan.
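
      For what it’s worth, a quick way to see which stack is actually answering (a rough sketch, assuming the mesa-utils and vulkan-tools packages are installed):

          glxinfo | grep "OpenGL renderer"   # which OpenGL driver/GPU is answering
          vulkaninfo --summary               # lists Vulkan devices and their drivers (Mesa's radv/anv vs. Nvidia's proprietary ICD)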

      Also, I just think Nvidia is a scumbag company, trying to force single-vendor proprietary solutions on the market by abusing their dominant position (pushing CUDA while refusing to implement any new OpenCL version for over a decade, so that software vendors couldn’t just pick a competitive open alternative, is one example; the original G-Sync is another). I prefer not to give them any money if I can help it.

      • interceder270@lemmy.world

        I’ve had all those issues back in like 2014.

        Nvidia Optimus has come a long way on Linux. Manjaro and Mint have utilities to enable it out of the box.

        THAT SAID

        We still have to prefix every program we want to run on the Nvidia GPU with prime-run. I’m not sure if mobile AMD users have to do the same, but this is legitimately annoying as hell this many years later and would actually be a good reason to pick AMD over Nvidia.
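
        (For reference, prime-run is essentially a tiny wrapper around Nvidia’s render-offload environment variables, so a rough equivalent on distros without the script looks something like this; “some-game” is just a placeholder:)

            # roughly what prime-run does under the hood (NVIDIA PRIME render offload)
            __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep vendor
            __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia __VK_LAYER_NV_optimus=NVIDIA_only some-game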

        • Sentau@feddit.de

          “I’m not sure if mobile AMD users have to do the same”

          No, we don’t. Mesa and the kernel automatically decide to use the dGPU for intensive tasks. It is only on rare occasions that I have to use DRI_PRIME=1 to force the use of the dGPU. It has been months since I last did it.
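
          A rough sketch of what that looks like in practice (glxinfo here is just an easy way to check which GPU answers):

              glxinfo | grep "OpenGL renderer"               # usually the iGPU by default
              DRI_PRIME=1 glxinfo | grep "OpenGL renderer"   # forces the dGPU for this one program
              DRI_PRIME=1 %command%                          # e.g. as a Steam launch option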

          • interceder270@lemmy.world

            Thanks. I’ve been curious about that.

            Gonna start sharing it as another reason why I would choose AMD over Nvidia, in addition to the drivers being open source.

    • lemmyvore@feddit.nl

      I’ve been using Linux for over 20 years and I don’t get it either. I don’t know why a vocal minority gets so fixated on it. It’s not like Nvidia is the only manufacturer with proprietary drivers. As long as the drivers work and are easy to install, I don’t see a problem.

      I’ve used ATI/AMD cards equally over the years and I’ve always ended up having more problems overall with them than with Nvidia cards & drivers. If I were inclined to generalize I could say that open source drivers are apparently lower quality, right? 🙂

      But that would be just as silly as the other way around. I don’t think that open or closed drivers, in itself, automatically says anything about quality.

      If closed source drivers really were a problem then Nvidia wouldn’t be used by 80% of Linux gamers.

        • lemmyvore@feddit.nl

          That’s because starting with April 2022 they mixed the AMD APU from the Steam Deck in with the PC stats. If you go back to March 2022 it’s different.

          • donio@lemmy.world

            Yep, I realized that as soon as I posted and tried to ninja-delete but too late :)

            If I sum up the numbers from March 2022 it’s 26% AMD and 38% NVIDIA.

    • PseudoSpock@lemmy.dbzer0.com

      A few reasons:

      • There is a strong desire to see if there is secret sauce in the driver that makes their cards so darn performant. Could it be applied to other video drivers?
      • To audit for vulnerabilities and fix them.
      • To allow the driver to use some kernel internals that the kernel developers keep trying to wall proprietary drivers off from.
      • Ideology
      • The community might be able to hack it to work better with Wayland, since the Wayland team has no interest in extending any kind of support to proprietary drivers driving GPUs… despite X11 having worked just fine forever. … see Ideology.
    • yum13241@lemm.ee

      Oftentimes it doesn’t install, or they insist on using free software (read: free as in free speech).

    • Pantherina@feddit.de

      It’s a proprietary driver, which could be an insane security and privacy risk. It’s a modification to your kernel, which is normal on Windows, but not on Linux. It basically weakens Linux’s security model.
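
      You can actually see the kernel flag this itself: loading a non-GPL module “taints” the kernel. A minimal check (just a sketch) looks like:

          lsmod | grep nvidia            # is the proprietary module loaded?
          cat /proc/sys/kernel/tainted   # non-zero once a proprietary module has tainted the kernel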

    • Swiggles@lemmy.blahaj.zone

      Try playing games like Cyberpunk. I dare you :)

      You are lucky if you can play without a crash for even one minute with that card. I am not exaggerating. Something is seriously messed up with the 20XX series.

      Also, Wayland, which is becoming more and more important, is still a mess for Nvidia cards overall.

      • Yerbouti@lemmy.ml

        Weird. I’ve tried about 12 games and they all work perfectly. Only in one case did I have to switch to an X session. Wayland is super responsive, with only some small visual glitches from time to time. DaVinci Resolve Studio edits and renders videos super fast.

    • EddyBot@feddit.de

      Using an external (“out of tree”) kernel driver comes with caveats you need to take care of. Typically most Linux distros handle this completely transparently, but certain use cases will be more complicated, especially if you install packages from outside your distro’s repository, like a newer kernel version or an older VirtualBox version.
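
      For example, most distros rebuild the Nvidia module through DKMS on every kernel update; a quick sanity check after an update (assuming the driver is packaged via DKMS) is something like:

          dkms status            # shows which out-of-tree modules are built for which kernels
          sudo dkms autoinstall  # builds/installs any missing modules for the running kernel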

  • chunkyhairball@lemmy.ml

    I REGRET buying an nvidia adapter when I had the opportunity to buy an AMD/Radeon adapter.

    During the pandemic, I purchased a GeForce GTX 1650. It’s an older, Turing-based card, so you’d think the driver support would be pretty mature, right? It has been NOTHING but problems.

    On nouveau, it’s stable, but 3D acceleration just doesn’t work right. Under the nvidia open source driver, it corrupts the screen after boot and locks up entirely seconds later. Under the proprietary driver, it freezes on boot a good amount of the time.

    Now, once I get it booted, it’s solid as a rock. I’ve gotta crank the engine over five or six times every time I DO boot, though. If I had it to do over again, I’d definitely have stuck with AMD.

    • Audacity9961@feddit.ch

      Even then it is 200 series and up. 100 and back through to 900 will still not just work at this stage.

  • Strit@lemmy.linuxuserspace.show

    As far as I know, only the kernel module was open sourced, and in doing that Nvidia moved a lot of stuff from the driver to the firmware/userspace part of their stack instead. So you would still need those, which are not open.
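
    If I remember right, you can tell which variant you’re running from the module’s license string (assuming the driver is installed), something like:

        modinfo nvidia | grep -i license   # "Dual MIT/GPL" for the open kernel modules, "NVIDIA" for the classic proprietary one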

    • wim@lemmy.sdf.orgOP

      What makes you say Intel sucks? The A730M should be somewhere between an RTX 3060 and 3070 but with 12GB of VRAM. From my experience with Intel iGPUs, the software experience is very nice, so I just expect the same thing but with faster performance.

      I tried an A730M laptop last year when they were new, and the drivers worked fine; everything worked out of the box. The only issue was that performance was not stable and power usage was high, but I’m assuming performance will only have improved with 12 months of driver engineering from Intel.

        • worldofgeese@lemmy.world

          The Intel discrete cards are fantastic value for money. There’s plenty of folks on the internet who can attest to this. Intel’s support story in general (so not just graphics cards) on Linux has been nothing less than sterling. If you’re using any Linux kernel you can expect Intel stuff to just work. It’s been this way for at least a decade.

            • LeFantome@programming.dev

              When is the last time you tried Intel hardware and with what software? I ask because your links do not really tell the same story as your post.

              The first link says that Mesa got “more Intel optimizations”. That sounds like a good thing. It basically says the same thing about AMD and NVIDIA. The only GPU “crash” that was addressed was for AMD which is widely regarded as the best option for Linux. I would not read that article and come away with any concerns about Intel.

              The second link says that kernel 6.2 added “full Intel support”. We are now on kernel 6.7. I use a rolling release and have a much newer kernel than 6.2. A brief Google leads me to believe that 6.5 ships with both Ubuntu 23.10 and Fedora 39.

              I have not used these cards myself so I do not know, but others have said the experience is decent now. The OP does not seem that demanding. If it’s OK now and actively improving, he may be quite happy. It sounds better than Nouveau for sure. Is it really as bad as you say?