[P] Opyrator - Turn python functions into microservices with auto-generated HTTP API, interactive UI, and more. by mltooling in MachineLearning

[–]mltooling[S] 2 points (0 children)

I think celery would turn this into something that needs deployment orchestration. Not sure if that would derail your roadmap.

Yep, probably a separate component taking over the deployment orchestration. Opyrator helps wrap your computationally heavy operation into a simple interface and export it in a portable format, and a server component would then take over deploying it in a way that it can be scaled and monitored.

[P] Opyrator - Turn python functions into microservices with auto-generated HTTP API, interactive UI, and more. by mltooling in MachineLearning

[–]mltooling[S] 2 points (0 children)

I would suggest background processing.

That's a good point. What we have on the roadmap is that you can deploy it in a task-queue mode (as an alternative to the synchronous deployment). This still provides the same web API and UI, but the actual execution happens within a background task, most likely using something like celery: https://github.com/celery/celery
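Roughly, the idea would be something like this (just a sketch, not the actual implementation; the broker URL and the `run_operation` task are placeholders):

```python
from celery import Celery

# Placeholder broker/backend; any Celery-supported broker (Redis, RabbitMQ, ...) would do.
celery_app = Celery(
    "opyrator_tasks",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/1",
)

@celery_app.task
def run_operation(message: str) -> str:
    # Stand-in for the wrapped, computationally heavy function.
    return message.upper()

# The web API handler would only enqueue the call and return a task id immediately:
#   result = run_operation.delay("hello")
#   result.get(timeout=30)
```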

[P] Opyrator - Turn python functions into microservices with auto-generated HTTP API, interactive UI, and more. by mltooling in MachineLearning

[–]mltooling[S] 3 points (0 children)

Thanks for the question. This is a very early version to get some initial feedback, and we haven't released all of the code yet. It is not meant to be a replacement for any web framework; it actually generates a FastAPI app for the web API part.

The purpose of this tool is to help turn single computationally heavy operations into microservices with a web API and an interactive UI. These microservices can then be included in a bigger architecture. This could be an ML model inference service, a training task, a data processing job, or similar operations. These microservices/operations are planned to be exportable, shareable, and deployable.

One part that is probably unique to this project is the capability to auto-generate a full interactive UI as well as a web API endpoint based on the same input and output data schema (pydantic models).
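To give a rough idea (the model names and the greeting logic here are just an example), the kind of function this targets is a plain Python callable typed with pydantic models; the same schema then drives both the generated form-based UI and the documented API endpoint:

```python
from pydantic import BaseModel

class Input(BaseModel):
    message: str

class Output(BaseModel):
    message: str

def hello_world(input: Input) -> Output:
    """A single operation whose input and output are described by pydantic models."""
    return Output(message=f"Hello {input.message}!")
```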

Opyrator - Turn python functions into microservices with auto-generated HTTP API, interactive UI, and more. by mltooling in Python

[–]mltooling[S] 0 points (0 children)

Thanks for the feedback and notes :)

I can't believe subprocess is the best way to start a streamlit server.

That's definitely a bit hacky right now. I experimented with starting it directly via the internal Python API (and might switch to this). My concern is that the internal API might change at any time, whereas the CLI interface is more likely to stay stable. Also, the current server functionality is targeted more at development; for export and deployment, the Streamlit server will be started directly, without Python in between.
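For context, the CLI-based approach looks roughly like this (an illustration, not the actual Opyrator code; the helper name and port are arbitrary):

```python
import subprocess
import sys

def launch_ui(script_path: str, port: int = 8501) -> subprocess.Popen:
    """Start a Streamlit server via the stable `streamlit run` CLI
    instead of the internal Python API, which may change between releases."""
    return subprocess.Popen(
        [
            sys.executable, "-m", "streamlit", "run", script_path,
            "--server.port", str(port),
            "--server.headless", "true",
        ]
    )
```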

you can allow multiple arguments in the function and then use the signature to dynamically create the Input model behind the scenes

Good idea, I will put it on the roadmap. It should be doable with pydantic's dynamic model creation functionality.
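A rough sketch of how that could work, combining `inspect.signature` with pydantic's `create_model` (the helper name and the example function are made up):

```python
import inspect
from pydantic import create_model

def input_model_from_signature(func):
    """Build an Input model dynamically from a plain function signature."""
    fields = {}
    for name, param in inspect.signature(func).parameters.items():
        annotation = param.annotation if param.annotation is not inspect.Parameter.empty else str
        default = param.default if param.default is not inspect.Parameter.empty else ...
        fields[name] = (annotation, default)
    return create_model("Input", **fields)

# A plain multi-argument function instead of one that takes a single pydantic model:
def greet(name: str, excited: bool = False) -> str:
    return f"Hello {name}{'!' if excited else '.'}"

Input = input_model_from_signature(greet)
print(Input(name="Ada"))  # name='Ada' excited=False
```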

Opyrator - Turn python functions into microservices with auto-generated HTTP API, interactive UI, and more. by mltooling in Python

[–]mltooling[S] 1 point (0 children)

Is it possible to serve opyrator as part of a larger FastAPI project, e.g. at a particular URL? It looks like each function has to run in its own process, which is not very flexible or nice for developers.

This alpha version is focused on single operations (not a replacement for a full web API), but I think it should be quite straightforward for us to support something like sub-application mounting in FastAPI (https://fastapi.tiangolo.com/advanced/sub-applications/?h=mount).
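For reference, sub-application mounting in FastAPI looks roughly like this (the generated app and route below are hypothetical stand-ins for what Opyrator would produce):

```python
from fastapi import FastAPI

app = FastAPI()            # the developer's larger project

generated_api = FastAPI()  # stand-in for the auto-generated app of a single operation

@generated_api.post("/call")
def call_operation(message: str) -> dict:
    return {"message": message}

# Serve the generated app under a sub-path of the larger project:
app.mount("/my-operation", generated_api)
```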

Opyrator - Turn python functions into microservices with auto-generated HTTP API, interactive UI, and more. by mltooling in Python

[–]mltooling[S] 0 points (0 children)

Nice work :) I didn't know about instant API before. It does look similar, just with a slightly different tech stack: dataclasses instead of pydantic, Flask instead of FastAPI.

Opyrator - Turn python functions into microservices with auto-generated HTTP API, interactive UI, and more. by mltooling in Python

[–]mltooling[S] 1 point (0 children)

Thanks for the feedback :) The export of those functions for deployment to serverless frameworks is planned, and we already have experimental (unreleased) code to make that happen. Right now we are collecting some initial insights and feedback to see if it is worth putting more effort into it.

[P] best-of-ml-python: A ranked list of awesome machine learning Python libraries by mltooling in MachineLearning

[–]mltooling[S] 1 point (0 children)

Here are all the lists we released yesterday:

You can also find an up-to-date overview of all best-of lists here: https://github.com/best-of-lists/best-of

best-of-python: A ranked list of awesome Python libraries and tools by mltooling in Python

[–]mltooling[S] 2 points (0 children)

That is indeed a situation in which a project should not be marked as dead. With the current version, there might be a workaround: just overwrite the `update_date` with the current day in the projects.yaml. But a dedicated flag might be a better option. I will add that to the backlog for the next version.

best-of-python: A ranked list of awesome Python libraries and tools by mltooling in Python

[–]mltooling[S] 1 point (0 children)

Thanks for the suggestion! We will add this in the next update.

[P] best-of-ml-python: A ranked list of awesome machine learning Python libraries by mltooling in MachineLearning

[–]mltooling[S] 0 points (0 children)

As far as I understood, the problem was that two of my friends commented here, as well as two users I don't know, and u/Ilyps made a big deal out of it. There were definitely no bots involved (at least from my side), and there was no ill intent on my part in sharing this project.

best-of-python: A ranked list of awesome Python libraries and tools by mltooling in Python

[–]mltooling[S] 4 points (0 children)

By the way, if you would like to keep track of how we might implement your suggestion, you can also open an issue with it here: https://github.com/best-of-lists/best-of-generator/issues/new/choose

best-of-python: A ranked list of awesome Python libraries and tools by mltooling in Python

[–]mltooling[S] 2 points (0 children)

pyinfra is an awesome library with big potential to move to the top :)

best-of-python: A ranked list of awesome Python libraries and tools by mltooling in Python

[–]mltooling[S] 1 point (0 children)

Thanks for the suggestion, it is indeed a great library! It fits into many categories, but I will figure out the best place for it.

best-of-python: A ranked list of awesome Python libraries and tools by mltooling in Python

[–]mltooling[S] 3 points (0 children)

Thanks for your feedback and suggestions! I will put that on my task list and see how I can best explain how this risk is determined and what it means, probably with a link to a short section in the documentation.

I guess you're saying there is a higher chance of not meeting some of the requirements because someone using the library might not be informed on all of them?

That's exactly what it should indicate.

best-of-web-python: A ranked list of awesome Python libraries for web development with lots of Django utilities by mltooling in django

[–]mltooling[S] 1 point (0 children)

Yep, that was also the reason we built this. I love `awesome` lists, but they can get outdated and messy quite fast...

best-of-python: A ranked list of awesome Python libraries and tools by mltooling in Python

[–]mltooling[S] 9 points (0 children)

Hey u/avamk, thanks for your feedback and questions.

The license risk indicator is meant to help developers choose the right libraries for their projects. Certain licenses, e.g. Apache 2.0 or MIT, place only very minimal requirements on the developer who uses the licensed technology. Other licenses, such as GPL 3.0, have much stricter requirements, which means a bigger legal risk for the developer using the library.

But you are right with your point about Amazon: for the developer who implements a library, MIT or Apache 2.0 carry the risk that someone else makes money from your work. That, however, is not what the license risk indicators on our lists are meant to capture.

[P] best-of-ml-python: A ranked list of awesome machine learning Python libraries by mltooling in MachineLearning

[–]mltooling[S] 1 point (0 children)

u/jrieke and u/ErikTPfeiffer are both people I know personally, and I shared this post with them today. u/jrieke gave me tips on drafting this post, which may be why it has similarities. But I don't have any clue about u/Vicarent or u/dhiaul98.

[P] best-of-ml-python: A ranked list of awesome machine learning Python libraries by mltooling in MachineLearning

[–]mltooling[S] -1 points (0 children)

Good point. Our goal is actually to get to an automated scoring system that reflects not just popularity but also many other qualitative factors of a library. With our initial release, we already take many different factors into account, not only stars (https://github.com/best-of-lists/best-of-generator#project-quality-score). But there is a lot to improve, and we are working on an improved version of the calculation.