So I’ve been trying to install the proprietary Nvidia drivers on my homelab so I can get my fine-ass art generated using Automatic1111 & Stable Diffusion. I installed the Nvidia 510 server drivers and everything seemed fine; then, when I rebooted, nothing. WTF, Nvidia, why you gotta break X? Why is X even needed in a server driver? What’s your problem, Nvidia!

  • fx_@feddit.de

    Nvidia doesn’t hate Linux; it just doesn’t care, and the Linux community hates Nvidia.

    • Vilian@lemmy.ca

      AMD didn’t care a few years ago either, but their drivers are open, so the community could fix things even when the company didn’t care (AMD cares a lot more now, so it’s better). Nvidia is closed-source crap, and it doesn’t give a fuck either.

    • HurlingDurling@lemm.ee

      And they can’t collect all that sweet, sweet tracking data they get from Windows users.

  • GenderNeutralBro@lemmy.sdf.org

    Linux is their bread and butter when it comes to servers and machine learning, but that’s a specialized environment and they don’t really care about general desktop use on arbitrary distros. They care about big businesses with big support contracts. Nobody’s running Wayland on their supercomputer clusters.

    I cannot wait until architecture-agnostic ML libraries are dominant and I can kiss CUDA goodbye for good. I swear, 90% of my tech problems over the past 5 years have boiled down to “Nvidia sucks”. I’ve changed distros three times hoping it would make things easier, and it never really does; it just creates exciting new problems to play whack-a-mole with. I currently have Ubuntu LTS working, and I’m hoping I never need to breathe on it again.

    That said, there’s honestly some grass-is-greener syndrome going on here, because you know what sucks almost as much as using Nvidia on Linux? Using Nvidia on Windows.
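
    For what it’s worth, the architecture-agnostic style I’m wishing for already half-exists in PyTorch. A minimal sketch (assuming a recent torch build; the ROCm builds reuse the `cuda` device name, so one code path covers both vendors):

    ```python
    # Device-agnostic PyTorch sketch: the same code runs on NVIDIA (CUDA),
    # AMD (ROCm, which also reports as "cuda"), Apple silicon (MPS), or CPU.
    import torch

    def pick_device() -> torch.device:
        if torch.cuda.is_available():          # NVIDIA CUDA or AMD ROCm builds
            return torch.device("cuda")
        if torch.backends.mps.is_available():  # Apple silicon
            return torch.device("mps")
        return torch.device("cpu")

    x = torch.randn(4, 4, device=pick_device())
    print(x.device)  # no vendor-specific branches needed in the model code
    ```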

    • lightstream@lemmy.ml

      > I cannot wait until architecture-agnostic ML libraries are dominant and I can kiss CUDA goodbye for good

      I really hope this happens. After being on Nvidia for over a decade (a 960 for 5 years and similar midrange cards before that), I finally went AMD at the end of last year. Then of course AI burst onto the scene this year, and I’ve not yet managed to get Stable Diffusion running, to the point that it’s made me wonder if I made a bad choice.

      • Ádám@discuss.tchncs.de

        It’s possible to run Stable Diffusion on AMD cards; it’s just a bit more tedious and a lot slower. I managed to get it working on my RX 6700 under Arch Linux just fine. Now that I’m on Fedora it doesn’t really want to work for some reason, but I’m sure that can be fixed as well; I just haven’t spent enough time on it.
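
        A quick sanity check I’d use here (a sketch, assuming the ROCm build of PyTorch is installed; exact package names vary by distro): if this prints the card name, Automatic1111 should be able to see the GPU too.

        ```python
        # Sketch: verify a ROCm PyTorch build can see the AMD card before
        # pointing Stable Diffusion at it.
        import torch

        if torch.cuda.is_available():  # ROCm builds expose the GPU via the torch.cuda API
            print("GPU detected:", torch.cuda.get_device_name(0))
        else:
            print("No GPU visible; check that the ROCm runtime matches the torch build")
        ```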

    • ProtonBadger@kbin.social

      Yeah, they don’t hate Linux, they just have their own priorities. That said, I’m running Nvidia+Wayland happily. On the desktop side they have done a lot more Wayland work this year, the upcoming driver fixes a bunch of things, and my distro handled driver installation and updates, so I never have to think about it.

    • Sparking@lemm.ee

      It just makes no sense to me, though: how is it sustainable for Nvidia not to have great Linux kernel support? Like, let the kernel maintainers do their job and reap the benefits. I’m guessing that Nvidia sees enterprise support contracts as an essential revenue stream, but eventually even enterprises are going to go with hardware that Linus isn’t giving the finger to, right? Am I crazy?

  • xrun_detected@programming.dev

    Nvidia has always been hostile to open source, as far back as I can remember.

    Back when Nvidia bought 3dfx, they took down the source code for the open 3dfx drivers within days, if not on the same day. I remember because I had just gotten myself a sweet Voodoo 5 some weeks before that, and the great Linux support was the reason I chose it… of course the driver code survived elsewhere, but it told me all I needed to know about that company.

    Also: Linus’s rant wasn’t just a fun stunt; it was necessary to get Nvidia to properly cooperate with the open source community if they want to keep making money running Linux on their hardware.

  • sealneaward@lemmy.ml

    Takes about 8 hours to set up properly. But once you do get your Nvidia card working with Linux, you just never update your OS and cry yourself to sleep every night.

  • WasPentalive@beehaw.org

    Nvidia does not ‘hate’ Linux; Nvidia simply never thinks about Linux. They need to keep secrets so people can’t buy the cheap card and, with a little programming, turn it into the expensive card.

      • WasPentalive@beehaw.org

        Of course you do. Nvidia wants you to buy the expensive card instead. Since they are almost the same card in some instances, the only difference is knowing that you can change values in certain registers to make cheapcard act like expensivecard. I personally use Intel graphics and won’t have Nvidia.

    • michel@lemmy.ml

      This. I bet the experience is better if you use it on an enterprise distro they have precompiled drivers for.

      With the boom in AI, their focus is increasingly on the data center market, so it’s a small miracle (thanks to Red Hat and others prodding them) that they even have an open driver right now for newer cards (tellingly, it’s in a better state for computational use than for rendering pixels on the screen).

  • Sparking@lemm.ee

    What I don’t get is how Nvidia stock is exploding when using their hardware for AI is a nightmare on Linux. How are companies doing this? Are they just offering enterprise support to insiders or something?

    • Crayphish@sh.itjust.works

      For what it’s worth, NVIDIA’s failings on Linux tend to be mostly in the desktop experience. As a compute device driven by CUDA and not responsible for the display buffer, the cards work plenty well. Enterprises will not be running GUIs or DEs on the machines that do the AI work, if at all.

      • Aasikki@lemmy.ml

        Even the old 1060 in my TrueNAS SCALE server has worked absolutely flawlessly with my Jellyfin server.

      • Boo@lemmy.blahaj.zone

        I had a bunch of issues with my GTX 1080 before I switched to an AMD RX 5700 XT. I love it, but I recently put the 1080 back in use as a headless game-streaming server for my brother. It’s been working really well, handling both rendering and encoding at 1080p without issue, so I guess I’ve arrived at the same conclusion: they don’t really care about desktop usage, but once you’re not directly interacting with a display server on an Nvidia GPU, it’s fine.

      • Diplomjodler@feddit.de

        They don’t give a fuck about consumers these days, and with Linux being just a tiny fraction of the userbase, they give even less of a fuck.

    • cybersandwich@lemmy.world

      Nvidia is a breeze on Linux vs. AMD. CUDA is the only compute stack that’s meaningfully supported across both Windows and Linux. I fought with my 6900 XT for so long trying to get ROCm working that I eventually bought a used 1080 Ti just to do the AI/ML stuff I wanted to do. I threw that into a server and had everything up and running in literally 10 minutes (and 5 of those were making Proxmox pass the GPU through to the VM).

      People want to bitch about Nvidia, but their entire ecosystem is better than AMD’s. The documentation is better and the tooling is better. On paper AMD is competitive, but in practice Nvidia has so much more going for it, especially if you are doing any sort of AI/ML.

      There are some benefits to AMD on Linux; it’s the reason I replaced my 3070 Ti with a 6900 XT. But that experience taught me two things: 1. AMD isn’t as good on Linux as people give it credit for, and 2. Nvidia isn’t as bad on Linux as people make out. You just trade one set of issues for another. E.g. you lose NVENC and can’t use AMF unless you run the AMDGPU-PRO driver rather than the open source one, and if you run the pro driver you immediately lose half the benefits of the open source driver, which is probably why you switched to AMD on Linux to begin with. So if you game, you can’t stream with a decent encoder; you have to play with settings and throw CPU horsepower at it.

      But hey, my DE doesn’t stutter and I don’t have to do kludgy workarounds to get some games to play.
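
      For the passthrough bit, here’s a small sketch for confirming from inside the guest that the GPU actually arrived; it only reads Linux sysfs, so no vendor tools are assumed:

      ```python
      # Sketch: list PCI display controllers from inside a VM to confirm a
      # passed-through GPU shows up (reads sysfs only, no extra tooling).
      from pathlib import Path

      for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
          pci_class = (dev / "class").read_text().strip()
          if pci_class.startswith("0x03"):  # PCI class 0x03xxxx = display controller
              vendor = (dev / "vendor").read_text().strip()
              print(dev.name, vendor)  # 0x10de = NVIDIA, 0x1002 = AMD
      ```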

  • scorpiosrevenge@lemmy.ml

    Switched to high-powered AMD GPUs years ago… no regrets. Awesome graphics, better support, and usually a better price point.

  • planish@sh.itjust.works

    They love to publish drivers that worked with like 1 release of X 5 years ago when the card came out and never update them.

    Except when they update them and it breaks X.

  • mub@lemmy.ml

    I’m on the cusp of jumping to Arch. Before I do, I’m replacing my RTX 3080 with an RX 6800 XT. They are close enough in performance and identically priced on eBay.

    I’ve done a bunch of testing and found great support for all my hardware except my Razer Ripsaw HDMI capture device, which I can replace with something supported. It is just the Nvidia bullshit holding me back.

    • /home/pineapplelover@lemm.ee

      When I built my PC, I made sure to get AMD because of the Linux community’s outcry against Nvidia. Thank goodness I got a 6800 XT. I haven’t had any problems with it; it worked straight out of the box.

    • brakenium@lemm.ee

      While I’m on AMD now, I had no issues with Nvidia on Arch using X before I switched earlier this year. One just installs the nvidia or nvidia-dkms package. My main reasons to switch were that my 1060 6GB was getting old, AMD had a better price, and, since I’ll keep this card as long as my last one, I wanted to be certain Wayland support was good, even though I don’t use it right now.
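
      After installing nvidia or nvidia-dkms, the first thing worth checking is which kernel module actually loaded; a minimal sketch that just reads /proc/modules instead of shelling out to lsmod:

      ```python
      # Sketch: check whether the proprietary nvidia module (vs. nouveau or
      # amdgpu) is loaded, e.g. after installing nvidia/nvidia-dkms on Arch.
      from pathlib import Path

      loaded = {line.split()[0] for line in Path("/proc/modules").read_text().splitlines()}
      for module in ("nvidia", "nouveau", "amdgpu"):
          print(f"{module}: {'loaded' if module in loaded else 'not loaded'}")
      ```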

      • Delta_44@lemmy.world

        I’m using X, but X has been shitty for ages, and Wayland is gaining neat features that I personally don’t give a damn about (see: the tearing protocol).

  • ngp@lemmy.sdf.org

    I’m hoping the recent explosion of AI/ML stuff will create more incentives for them to have proper support for desktop Linux, but I’m not counting on it.

    • AggressivelyPassive@feddit.de

      Those are different drivers, or rather different parts of the driver.

      CUDA has been a staple in HPC for years now, and the situation hasn’t exactly improved.

      • ngp@lemmy.sdf.org

        I mean the number of people using beefy Linux workstations with desktop environments is likely to increase because of it; I’m not referring to the datacenter market they’re already entrenched in.

    • Dudewitbow@lemmy.ml

      They don’t see a reason to; their biggest buyers are at the enterprise level and will pay for the extra support.

      Common ML/AI stacks already work on AMD, albeit not optimally (TensorFlow, PyTorch), and some projects already work on AMD too (e.g. Stable Diffusion). Users are far better off creating a more generic branch for projects that supports CPU-based acceleration (via both Intel’s and AMD’s inclusion of AI accelerators in their products) than hoping that Nvidia, of all companies, will mess with its bottom line to give Linux proper support.

  • danielton@outpost.zeuslink.net

    I call them “novideo” because the Nvidia GPU in a PC someone gave me was the bane of my existence on Linux. I ended up buying a Radeon for it because I got so tired of having no video after security updates. Nvidia seems to hate everybody except Windows for some reason. Even Apple ditched them long before they ditched Intel.

    And yet, it seems like the majority of Linux users have Nvidia anyway.

    • Rassilonian Legate@mstdn.social

      @danielton
      @Mr_Esoteric
      > And yet, it seems like the majority of Linux users have Nvidia anyway.

      Probably because it’s more popular among Windows users, so when most people switch to Linux from Windows, they use the hardware they already had, which more often than not includes an Nvidia GPU.

    • 1984@lemmy.today

      > Nvidia seems to hate everybody except Windows for some reason.

      It’s called money. Microsoft and all these big tech companies have lots of agreements with each other to support certain choices and ignore others. This is also why Lenovo has a very limited selection of AMD processors, and when they do offer one, it’s in a model with other serious flaws.

      • danielton@outpost.zeuslink.net

        But it’s still stupid, especially if it’s about money, since Nvidia wants to sell a lot of chips to the Android market. And with Linux users being dumb enough to keep buying Nvidia products and using their mediocre proprietary drivers, nothing will ever change.
