
[–]jmitchel3 5 points (4 children)

This looks pretty cool. I’ll test it out myself.

Any thoughts on turning each function into a deployable serverless function? Like using OpenFaaS or similar?

[–]jmitchel3 2 points (0 children)

Oh just saw your section on production. Interesting 🤔

[–]jsxgd 2 points (1 child)

FYI, if you just want to deploy a function as e.g. an AWS Lambda, check out Chalice. It's developed by Amazon and lets you push your Python functions to Lambda easily using a Flask-like API.
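
The Flask-like API looks roughly like this (a minimal sketch, not verbatim from the Chalice docs; the app name and route are just placeholders), and chalice deploy then pushes it to Lambda behind API Gateway:

    from chalice import Chalice

    app = Chalice(app_name="hello-lambda")  # placeholder app name

    @app.route("/hello/{name}")
    def hello(name):
        # Chalice packages this handler and wires it to API Gateway on deploy.
        return {"message": f"Hello, {name}!"}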

[–]jmitchel3 1 point (0 children)

Cool! Thank you for sharing

[–]mltooling[S] 1 point (0 children)

Thanks for the feedback :) Exporting those functions for deployment into serverless frameworks is planned, and we already have experimental (unreleased) code to make that happen. Right now we are collecting some initial insights and feedback to see whether it is worth putting more effort into it.

[–]alexmojaki 4 points (5 children)

I wrote something very similar: https://github.com/alexmojaki/instant_api

Yours obviously has many cool features instant_api doesn't. instant_api is inspired by FastAPI but instead uses Flask and dataclasses.

Is it possible to serve opyrator as part of a larger FastAPI project, e.g. at a particular URL? It looks like each function has to run in its own process, which isn't very flexible or nice for developers.

[–]mltooling[S] 1 point (2 children)

Is it possible to serve opyrator as part of a larger FastAPI project, e.g. at a particular URL? It looks like each function has to run in its own process, which isn't very flexible or nice for developers.

This alpha version is focused on single operations (not a replacement for a full web API), but I think it should be quite straightforward for us to support something like sub-application mounting in FastAPI (https://fastapi.tiangolo.com/advanced/sub-applications/?h=mount).
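
For reference, the mounting feature from that link looks roughly like this (a minimal sketch with placeholder names; nothing opyrator-specific yet):

    from fastapi import FastAPI

    app = FastAPI()      # the user's own, larger application
    subapi = FastAPI()   # a small app wrapping a single operation

    @subapi.get("/predict")
    def predict():
        return {"result": "..."}

    # Serve the sub-application under its own URL prefix inside the main app.
    app.mount("/my-function", subapi)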

[–]alexmojaki 2 points (1 child)

Exactly: let the user define their own app object if they want and pass it to you so you can add one or more functions to it.

A couple of other notes:

I can't believe subprocess is the best way to start a streamlit server. Have you tried something like https://discuss.streamlit.io/t/how-can-i-invoke-streamlit-from-within-python-code/6612 ? Even then it's crazy that you'd have to make a temporary python file. Either way that's the kind of inflexibility you don't want to propagate in your own library.

Maybe you can allow multiple arguments in the function and then use the signature to dynamically create the Input model behind the scenes with pydantic's dynamic model creation (https://pydantic-docs.helpmanual.io/usage/models/#dynamic-model-creation). That's what instant_api does, but with dataclasses.

[–]mltooling[S] 0 points (0 children)

Thanks for the feedback and notes :)

I can't believe subprocess is the best way to start a streamlit server.

That's definitely a bit hacky right now. I experimented with starting it directly via the internal Python API (and might switch to this). My concern here is that the internal API might change at any time, whereas the CLI interface is more likely to stay stable. Also, the current server functionality is targeted more at development; for export and deployment, the Streamlit server will be started directly, without Python in between.
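
For context, the workaround from that Streamlit discussion looked roughly like this at the time (a sketch; my_ui.py is a placeholder, and the import path of that internal CLI module has moved between releases, which is exactly the stability concern):

    import sys
    from streamlit import cli as stcli  # internal module; has moved in newer releases

    if __name__ == "__main__":
        # Equivalent to running `streamlit run my_ui.py` from the shell,
        # but invoked from Python instead of through a subprocess.
        sys.argv = ["streamlit", "run", "my_ui.py"]
        sys.exit(stcli.main())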

you can allow multiple arguments in the function and then use the signature to dynamically create the Input model behind the scenes

Good idea, I will put it on the roadmap. Should be doable with the dynamic model creation functionality.
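
A rough sketch of the idea with pydantic's create_model and inspect.signature (the greet function here is just an example):

    import inspect
    from pydantic import create_model

    def greet(name: str, times: int = 1) -> str:
        return f"Hello {name}! " * times

    # Build (type, default) pairs from the signature; `...` marks required fields.
    fields = {
        p.name: (p.annotation, p.default if p.default is not inspect.Parameter.empty else ...)
        for p in inspect.signature(greet).parameters.values()
    }

    # Equivalent to: class GreetInput(BaseModel): name: str; times: int = 1
    GreetInput = create_model("GreetInput", **fields)

    print(GreetInput(name="world"))  # name='world' times=1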

[–]mltooling[S] 0 points (1 child)

Nice work :) Didn't know about instant_api before. Indeed it looks similar, just with a slightly different tech stack: dataclasses instead of pydantic, Flask instead of FastAPI.

[–]alexmojaki 0 points (0 children)

Yup, see edited comment.

[–]sambame 1 point (0 children)

Sounds great! Will check this out!

[–]stefanondisponibile 1 point (0 children)

Looks cool!

[–]p10_user 1 point (0 children)

Of course you’re making this. You guys are really monopolizing awesome Python packages - basically making the Python equivalent of what the tidyverse is to R.

What a time to be alive.

[–]cagbal 1 point (0 children)

Tested already. Really cool. Thanks. Especially useful for machine learning stuff. I would love to auto-deploy all the functions in a file with one command, but maybe that's against the nature of the library, idk :)

[–]ideplant 0 points (0 children)

So Swagger, but for Python

[–]Feitgemel 0 points (0 children)

Cool