“A republic, if you can keep it.”

  • drwankingstein@lemmy.dbzer0.com · 11 hours ago

    For AI, a lot of what Mozilla is doing is kinda… meh. llamafile is maybe useful, but mostly the only really neat thing relevant to Mozilla is WebGPU local AI stuff, which Chromium has better support for anyway atm lol.

    Been with Firefox since it was SeaMonkey, and was donating regularly until around Baker. Mozilla has had its ups and downs throughout, for sure, but lately it’s just been downs.

    • jarfil@beehaw.org · 9 hours ago

      Yeah, I don’t think I like llamafile; reusing some weights between models, with smaller updates, sounds like a better idea.

      What I’d like to see is unified WebNN support across CPU, GPU, and NPU: WebNN Overview
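
      Rough sketch of what that unified targeting looks like in the current WebNN draft. WebNN is still a draft spec, so the deviceType hint, MLGraphBuilder, and compute() are my reading of it today and may differ in shipped builds:

      ```ts
      // Sketch only: WebNN is a draft spec; exact names (deviceType, compute)
      // may change between spec revisions and browser implementations.
      const ml = (navigator as any).ml;

      // Ask for an NPU-backed context; 'cpu' and 'gpu' are the other hints.
      const context = await ml.createContext({ deviceType: 'npu' });
      const builder = new MLGraphBuilder(context);

      // Trivial graph: c = a + b over 2x2 float32 tensors.
      const desc = { dataType: 'float32', dimensions: [2, 2] };
      const a = builder.input('a', desc);
      const b = builder.input('b', desc);
      const graph = await builder.build({ c: builder.add(a, b) });

      // The same graph definition runs on whichever backend the context chose.
      const result = await context.compute(
        graph,
        { a: new Float32Array([1, 2, 3, 4]), b: new Float32Array([5, 6, 7, 8]) },
        { c: new Float32Array(4) },
      );
      console.log(result.outputs.c);
      ```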

      (Not to pull rank, but my mail profile can be tracked to Netscape Navigator, across multiple OSs 😁)

      • drwankingstein@lemmy.dbzer0.com · 5 hours ago

        WebNN is neat. I’m not super excited about it, since the last time I looked into it it definitely had some issues, though those seem to be addressed now. ONNX isn’t a terrible interface to work with either and has good platform support, so that won’t be an issue. I do prefer wgpu solutions, however. While they don’t work with NPUs (for obvious reasons :D), they’re pretty much a “write once, run anywhere” solution since wgpu targets Metal, DX12, and Vulkan. (It only recently got fp16 support though, so most things are still rough.)
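
        For what it’s worth, the cross-backend part shows up right at adapter setup. Small sketch of opting into the optional 'shader-f16' feature through the browser WebGPU API; whether a given adapter actually exposes it is exactly the rough part I mean:

        ```ts
        // Sketch: request a WebGPU device and opt into fp16 shaders if the
        // adapter (backed by Vulkan, Metal, or D3D12 depending on platform)
        // supports them.
        const adapter = await navigator.gpu.requestAdapter();
        if (!adapter) throw new Error('WebGPU not available');

        const wantsF16 = adapter.features.has('shader-f16');
        const device = await adapter.requestDevice({
          requiredFeatures: wantsF16 ? ['shader-f16'] : [],
        });

        // WGSL shaders can then use 'enable f16;' when wantsF16 is true;
        // otherwise fall back to f32 math.
        console.log('fp16 shaders available:', wantsF16);
        ```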

        But for higher-perf needs I can see WebNN being a lot more useful.

        > (Not to pull rank, but my mail profile can be tracked to Netscape Navigator, across multiple OSs 😁)

        Just means older than I am ;D