I am looking for an easy way to turn my local Jupyter notebooks into a deployable model without having to write a lot of code or configuration. Thank you.

  • qalis@alien.topB
    10 months ago

    You shouldn’t do that, for multiple reasons (I can elaborate if needed). No matter what you train, your model is essentially a binary file, a set of weights. Once you write it to disk, typically with built-in serialization (e.g. pickle for Scikit-learn, or the .pth format for PyTorch), there are lots of frameworks to deploy it.
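    A minimal sketch of that serialization step, using a plain dict as a hypothetical stand-in for a trained estimator (for PyTorch you would use `torch.save(model.state_dict(), "model.pth")` instead):

    ```python
    import pickle

    # Hypothetical stand-in for a trained model: any Python object
    # holding the learned weights serializes the same way.
    model = {"weights": [0.1, -0.4, 2.3], "bias": 0.7}

    # Write the trained model to disk...
    with open("model.pkl", "wb") as f:
        pickle.dump(model, f)

    # ...and load it back in the serving process.
    with open("model.pkl", "rb") as f:
        restored = pickle.load(f)

    print(restored["weights"])
    ```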

    The easiest to use and the most generic one is BentoML, which will package the code into a Docker image and automatically deploy with REST and gRPC endpoints. It has a lot of integrations, and is probably the most popular option. There are also more specialized solutions, e.g. TorchServe.

    However, if you care about inference speed, you should also compile or optimize your model for the target architecture before packaging it for the API and target runtime, e.g. with ONNX, Apache TVM, Treelite or NVIDIA TensorRT.

    • Medium_Ad_3555@alien.topOPB
      10 months ago

      I agree that ONNX would be the right solution if you need to serve 100M inference requests. However, my code is not for that case; most likely it will serve only about 100K requests and will then either be thrown away or completely re-engineered for the next iteration of requirements. Additionally, it’s not just about the binary model file: there is pre-processing involved, data needs to be pulled from an internal API, inference needs to be run, and finally the results need to be post-processed.
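      That pipeline shape (fetch, pre-process, infer, post-process) can be sketched as a handful of plain functions, which also makes each step easy to swap out in the next iteration. All names and the dummy model here are hypothetical placeholders, not anything from the actual codebase:

      ```python
      def fetch_data(record_id):
          # Hypothetical stand-in for a call to the internal API.
          return {"id": record_id, "raw": [1.0, 2.0, 3.0]}

      def preprocess(payload):
          # Example pre-processing: simple scaling.
          return [x / 10.0 for x in payload["raw"]]

      def infer(features):
          # Dummy "model": a sum standing in for a real predict() call.
          return sum(features)

      def postprocess(score):
          # Shape the result for the caller.
          return {"score": round(score, 4)}

      def predict(record_id):
          return postprocess(infer(preprocess(fetch_data(record_id))))

      print(predict(42))
      ```

      Keeping the steps separate like this means the eventual API wrapper only needs to call `predict`.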

      I know how to convert it to FastAPI, but I was curious whether there is any solution where I can parameterize and serve an inference cell’s code with low effort.

    • ThisIsBartRick@alien.topB
      10 months ago

      This is the right answer. The question screams novice. And his comment “You are entitled to your own opinion” when faced with advice shows that he’s not willing to learn.

      • Medium_Ad_3555@alien.topOPB
        10 months ago

        You can label it as a wish in your hologram of reality; I’m not seeking a pro badge or someone’s approval.

  • MangoReady901@alien.topB
    10 months ago

    Use nbimporter and wrap the code you want to use in a function. Pick your favorite API framework (Sanic, Flask, FastAPI). Import your function from your notebook and wrap it in a simple GET/POST handler. Then take the output of your function and return it as some kind of JSON object.

    I can set up a quick demo if any of this went over your head. I think this is a great way to develop robust, testable code that’s easy to debug.
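    The wrapping pattern looks roughly the same regardless of which framework you pick. In this sketch, `predict` is a hypothetical function you would normally import from the notebook via nbimporter, and the framework-specific route decorator is left out so the example stays framework-neutral:

    ```python
    import json

    # Hypothetical function you would normally pull in with:
    #   import nbimporter
    #   from my_notebook import predict
    def predict(x):
        return {"input": x, "prediction": x * 2}

    def handle_request(raw_body):
        # Parse the request body, call the notebook function,
        # and return the result as a JSON string.
        params = json.loads(raw_body)
        result = predict(params["x"])
        return json.dumps(result)

    print(handle_request('{"x": 21}'))
    ```

    With Flask or FastAPI you would attach `handle_request`’s logic to a route and let the framework handle the JSON parsing and response for you.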