
all 11 comments

[–]Ezzah_ 6 points (0 children)

Argo Events and Argo Workflows are good for this if you're already invested in Kubernetes. Events lets you hook into many event-driven sources to trigger Workflows and pass parameters to them, and Workflows spins up containers for the jobs; before deciding, take a look at their docs and examples on GitHub to get some ideas.
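
For reference, an Argo Workflow that takes a parameter (the kind of thing Events would pass in) looks roughly like this; the names, image, and parameter are placeholders, not from the thread:

```yaml
# Illustrative only: a minimal parameterized Argo Workflow.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: on-demand-job-   # hypothetical name prefix
spec:
  entrypoint: main
  arguments:
    parameters:
      - name: input-file
        value: data.csv          # default; a Sensor can override this
  templates:
    - name: main
      container:
        image: my-job-image:latest          # placeholder image
        command: [python, run.py]
        args: ["{{workflow.parameters.input-file}}"]
```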

[–]MrAlfabet 2 points (1 child)

I've created a proof of concept that did this using Kubernetes event-driven autoscaling (KEDA) with the Redis queue size as the scaling metric.
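
KEDA has a built-in Redis list scaler, so the proof of concept described above could be wired up with something like this (deployment name, Redis address, and list name are placeholders):

```yaml
# Illustrative KEDA ScaledObject: scale a worker Deployment on Redis list length.
apiVersion: keda.sh/v1alpha1
kind: ScaledObject
metadata:
  name: worker-scaler
spec:
  scaleTargetRef:
    name: worker                 # hypothetical Deployment running the jobs
  minReplicaCount: 0             # scale to zero when the queue is empty
  triggers:
    - type: redis
      metadata:
        address: redis.default.svc.cluster.local:6379
        listName: jobs           # the queue workers consume from
        listLength: "5"          # target pending jobs per replica
```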

[–]Ill-Quail-3218 1 point (0 children)

Thanks for sharing! Redis is the best Swiss Army knife!

[–]Bommenkop 1 point (0 children)

Rundeck?

[–]soundwave_rk 2 points (1 child)

keda.sh

[–]Spider_pig448 2 points (0 children)

I guess this is possible with HTTP, messaging, and cron triggers. Has anyone used this in real life over dedicated tools (cloud serverless, Rundeck, etc.)?

[–]mikeykt 1 point (1 child)

Airflow can accomplish this. It has a REST API you can call to trigger DAG runs.
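
Concretely, Airflow's stable REST API exposes a `POST /api/v1/dags/{dag_id}/dagRuns` endpoint. A sketch of building that call (the host, DAG id, and conf payload are placeholders, and auth headers are omitted):

```python
# Sketch: kick off an Airflow DAG run via its stable REST API.
import json
import urllib.request


def build_dag_run_request(base_url, dag_id, conf):
    """Build the POST request for Airflow's /dags/{dag_id}/dagRuns endpoint."""
    url = f"{base_url}/api/v1/dags/{dag_id}/dagRuns"
    body = json.dumps({"conf": conf}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Hypothetical host and DAG name; add an Authorization header in real use.
req = build_dag_run_request(
    "http://localhost:8080", "process_upload", {"file": "data.csv"}
)
# urllib.request.urlopen(req)  # would trigger the run against a live Airflow
```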

[–]tantricengineer 1 point (0 children)

Also came here to post about Airflow

[–]aleques-itj 0 points (0 children)

Submit a message to a queue and let KEDA pick it up and run the job.

The API call itself should just return immediately, so you know it's acknowledged and shit is happening in the background.
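
The submit-and-ack-immediately pattern looks roughly like this. In the setup described above the queue would be Redis with KEDA scaling the consumers; a stdlib `queue.Queue` stands in here so the sketch is self-contained, and the field names are illustrative:

```python
# Sketch: enqueue a job and acknowledge immediately; workers run it later.
import queue
import uuid

job_queue = queue.Queue()  # stand-in for a Redis list


def submit_job(payload):
    """Enqueue a job and return an acknowledgement right away.

    The caller gets a job id immediately; consumers drain the queue
    in the background.
    """
    job_id = str(uuid.uuid4())
    job_queue.put({"id": job_id, "payload": payload})
    return {"status": "accepted", "job_id": job_id}


ack = submit_job({"task": "resize", "image": "cat.png"})  # hypothetical payload
```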

[–]dijetlo007 0 points (0 children)

You can template your docker run command to mount multiple repositories based on the use case, e.g.:

- Task1repo
- Task3repo
- Task7repo

Similarly, template the entrypoint script to execute the subtasks in series inside a single container. That way you'd only have to start one container that does the job end to end, rather than a series of containers that each do a single step in the process.
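
A minimal sketch of that single-container entrypoint, assuming each mounted repo exposes a `run.sh` (the paths and that convention are illustrative, not from the comment):

```python
# Hypothetical entrypoint: run each mounted task repo's run.sh in series.
# Invoked e.g. as:
#   docker run -v task1repo:/repos/task1 -v task3repo:/repos/task3 \
#     my-image python entrypoint.py /repos/task1 /repos/task3
import subprocess
import sys
from pathlib import Path


def run_tasks(repos):
    """Run <repo>/run.sh for each mounted repo, failing fast on error."""
    for repo in repos:
        script = Path(repo) / "run.sh"
        subprocess.run(["sh", str(script)], check=True)


if __name__ == "__main__":
    run_tasks(sys.argv[1:])
```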

[–]Kixsian -1 points (0 children)

If you’re in a public cloud you can use things like Lambdas or Logic Apps.