• Mukigachar@alien.topB · 1 year ago

    I’m not very knowledgeable in this realm, so somebody clue me in. I always thought of JAX as targeted towards more “bespoke” stuff. Is there any advantage to using it in a high-level way instead of Torch or TF? Anything in terms of performance, ecosystem, etc.?

    • Relevant-Yak-9657@alien.topB · 1 year ago

      It saves time when creating neural nets. If you want some of JAX’s speed without spending hours writing your own training methods, Keras (which can run on a JAX backend) can be a good addition. Other than that, Flax is probably the best option for full tinkering freedom while still having an abstraction above a NumPy-like API.
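
      For a concrete picture, here is a rough sketch of both high-level routes (layer sizes and shapes are made up for illustration): Keras 3 running on the JAX backend, and a small Flax (linen) module.

      ```python
      # Route 1: Keras 3 on the JAX backend (must be set before importing keras).
      import os
      os.environ["KERAS_BACKEND"] = "jax"
      import keras

      model = keras.Sequential([
          keras.layers.Dense(64, activation="relu"),
          keras.layers.Dense(10),
      ])
      model.compile(
          optimizer="adam",
          loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
      )

      # Route 2: Flax (linen) for more control, still above raw jax.numpy.
      import jax
      import jax.numpy as jnp
      import flax.linen as nn

      class MLP(nn.Module):
          @nn.compact
          def __call__(self, x):
              x = nn.relu(nn.Dense(64)(x))
              return nn.Dense(10)(x)

      params = MLP().init(jax.random.PRNGKey(0), jnp.ones((1, 784)))
      ```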

    • narex456@alien.topB · 1 year ago

      The biggest ‘advantage’ I can see is that, since Google is deprecating TF soon, JAX is the only Googly deep learning lib left. It fills a niche, insofar as that is a definable niche. I’m sticking with PyTorch for now.

      No clue about things like speed/efficiency, which may be a factor.

      • pm_me_your_smth@alien.topB · 1 year ago

        since Google is deprecating tf soon

        Do you have a source? IMO TF is too big to deprecate soon. They did stop support for Windows, but nobody abandons an enormous project suddenly.

        • Relevant-Yak-9657@alien.topB · 1 year ago

          Yeah, from what I see, despite the mess TensorFlow might be, it is still getting updated frequently and has been improving lately. Not sure why they would deprecate it anytime soon.

        • narex456@alien.topB · 1 year ago

          TL;DR: No, they are not officially planning to deprecate TF. Yes, they are still actively developing it. No, that doesn’t fill me with much confidence coming from Google, especially while they are also developing JAX.


          Just searched this again and, kudos, I can’t find anything but official Google statements that they are continuing to support TF for the foreseeable future. For a while people were doom-saying so confidently that Google was completely dropping TF for JAX that I kinda just took it on blind faith.


          All that said: TF really could get deprecated soon. Despite Google’s insistence that this won’t happen, they are known for deprecating strong projects with bright futures with little or no warning. Do not take the size of TensorFlow as evidence that the Goog is going to stand by it, especially when they are actively developing a competing product in the same niche.

          FWIW, it is also the current fad in tech to make high-level decisions abruptly, without proper warning to engineers. It really means almost nothing that a company’s engineers are enthusiastically continuing to support a product.

          TF is just not on solid ground.

          • Relevant-Yak-9657@alien.topB · 1 year ago

            Also, JAX is not officially a Google product, but rather a research project. So on paper, TensorFlow is Google’s official framework for deep learning.

                • narex456@alien.topB · 1 year ago

                  Official Google products that got dropped anyway:

                  Noop (Language)

                  AngularJS (Framework)

                  The latter was quite popular as a JavaScript web framework. There may be more examples; I’m not an expert at hating Google.

                  • Relevant-Yak-9657@alien.topB · 1 year ago

                    But saying that it dropped AngularJS is like saying that Google dropped TensorFlow. They just rebooted it, like TensorFlow, right? Thanks for Noop though. Had no idea it existed lol.

              • Relevant-Yak-9657@alien.topB · 1 year ago

                Actually, another perspective is that TensorFlow’s deployment story is something JAX doesn’t have (not that I know of), and cutting it would be idiotic for Google, since they would be eliminating their own tooling in the middle of an ongoing AI revolution. TensorFlow is their current tool, and if they are going to abandon it they will need a strong replacement for its deployability, which should guarantee it a few more years (since the JAX team doesn’t seem very focused on deployment). IIRC, JAX deploys via TensorFlow right now (see the sketch below).
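
                For context, the jax2tf export path looks roughly like this; a minimal sketch, with the model function and shapes invented for illustration.

                ```python
                import tensorflow as tf
                import jax.numpy as jnp
                from jax.experimental import jax2tf

                # Made-up JAX prediction function, closed over its params below.
                def predict(params, x):
                    return jnp.dot(x, params["w"]) + params["b"]

                params = {"w": jnp.ones((4, 2)), "b": jnp.zeros(2)}

                # jax2tf.convert wraps the JAX function so TensorFlow can trace it.
                tf_predict = tf.function(
                    jax2tf.convert(lambda x: predict(params, x)),
                    input_signature=[tf.TensorSpec([None, 4], tf.float32)],
                    autograph=False,
                )

                module = tf.Module()
                module.predict = tf_predict
                # The SavedModel can then go through the usual TF serving/TFLite tooling.
                tf.saved_model.save(module, "/tmp/jax_savedmodel")
                ```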

          • Relevant-Yak-9657@alien.topB · 1 year ago

            That could be a valid concern. Personally, I’m not too worried, since this is just speculation. Besides, the field is diverse enough that most people would benefit from learning multiple frameworks.

      • CampAny9995@alien.topB · 1 year ago

        My experience is that JAX is much lower level and doesn’t come with batteries included, so you have to pick your own optimization library and module abstraction (sketch below). But I also find it makes way more sense than PyTorch (‘requires_grad’?), and JAX’s autograd algorithm is substantially better thought out and more robust than PyTorch’s (my background was in compilers and autograd before moving into deep learning during postdocs, so I have dug into that side of things). Plus the support for TPUs makes life a bit easier compared to competing for instances on AWS.
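
        To illustrate the “pick your own pieces” point, here is a minimal sketch (model, shapes, and hyperparameters made up) using jax.grad with Optax as the separately chosen optimizer library:

        ```python
        import jax
        import jax.numpy as jnp
        import optax  # optimizer library chosen separately; not bundled with JAX

        # Made-up linear model with a squared-error loss.
        def loss_fn(params, x, y):
            pred = x @ params["w"] + params["b"]
            return jnp.mean((pred - y) ** 2)

        params = {"w": jnp.zeros((3, 1)), "b": jnp.zeros(1)}
        optimizer = optax.adam(1e-3)
        opt_state = optimizer.init(params)

        @jax.jit
        def train_step(params, opt_state, x, y):
            # No per-tensor requires_grad flags: jax.grad transforms the loss
            # function into a new function returning d(loss)/d(params).
            grads = jax.grad(loss_fn)(params, x, y)
            updates, opt_state = optimizer.update(grads, opt_state)
            return optax.apply_updates(params, updates), opt_state
        ```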

        • Due-Wall-915@alien.topB · 1 year ago

          It’s a drop-in replacement for NumPy. It doesn’t get sexier than that. I use it for my research on PDE solvers and deep learning, and being able to just write NumPy code with automatic differentiation on top is very useful (sketch below). Previously I was looking at autodiff frameworks like Tapenade, but that’s not required anymore.
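
          A toy sketch of that workflow (the equation and grid are invented for illustration): write the residual in plain NumPy style, then get its gradient for free.

          ```python
          import jax
          import jax.numpy as jnp  # near drop-in for numpy

          # Discrete residual for u'' + u = 0 on a uniform grid, written as plain NumPy.
          def residual(u, dx):
              d2u = (u[:-2] - 2.0 * u[1:-1] + u[2:]) / dx**2
              return jnp.sum((d2u + u[1:-1]) ** 2)

          u0 = jnp.sin(jnp.linspace(0.0, jnp.pi, 64))
          # Gradient w.r.t. the grid values, with no hand-written adjoint code.
          grad_u = jax.grad(residual)(u0, jnp.pi / 63)
          ```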

    • ranchdaddo@alien.topB · 1 year ago

      This is mainly for people who only learned Keras and don’t want to learn anything else.