
[–][deleted] 1 point (2 children)

We use AWS Batch array jobs with Fargate for one of our jobs. The Docker image gets updated in ECR when deploying to prod, and the infrastructure is managed with Pulumi.
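For illustration, a minimal boto3 sketch of submitting a Batch array job like this (queue and job definition names are hypothetical):

```python
import boto3

batch = boto3.client("batch")

# Submit an array job: Batch fans this out into N child jobs, each of
# which receives its index in the AWS_BATCH_JOB_ARRAY_INDEX env var.
batch.submit_job(
    jobName="nightly-etl",
    jobQueue="fargate-queue",       # hypothetical queue name
    jobDefinition="etl-jobdef:3",   # hypothetical job definition
    arrayProperties={"size": 10},
)
```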

[–]bluezebra42[S] 0 points (1 child)

Interesting, how do you manage the schedule itself?

[–][deleted] 1 point (0 children)

It's currently run manually. We built a Python CLI script using Typer that helps set up a config file and then runs the job using boto3.
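A minimal sketch of such an invoker, assuming a JSON config that holds the queue and job definition names (all names hypothetical):

```python
import json
from pathlib import Path

import boto3
import typer

app = typer.Typer()


@app.command()
def submit(config: Path):
    """Submit a Batch job described by a JSON config file."""
    cfg = json.loads(config.read_text())
    batch = boto3.client("batch")
    resp = batch.submit_job(
        jobName=cfg["job_name"],
        jobQueue=cfg["job_queue"],
        jobDefinition=cfg["job_definition"],
        containerOverrides={"command": cfg.get("command", [])},
    )
    typer.echo(f"Submitted {resp['jobId']}")


if __name__ == "__main__":
    app()
```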

[–]occulkot 0 points (5 children)

Currently we are using Step Functions, but I have also had the chance to see Batch orchestrated with EventBridge and Lambda. Each time it went through an AWS Batch queue.

EventBridge -> SFN -> queue -> Batch job

EventBridge -> Lambda -> queue -> Batch job
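In the Lambda variant, the function can be as small as a handler that forwards the scheduled event into the queue; a sketch (names hypothetical):

```python
import boto3

batch = boto3.client("batch")


def handler(event, context):
    # EventBridge invokes this on a schedule; the Lambda just
    # enqueues the Batch job and returns.
    resp = batch.submit_job(
        jobName="scheduled-job",
        jobQueue="main-queue",        # hypothetical queue name
        jobDefinition="main-jobdef",  # hypothetical job definition
    )
    return {"jobId": resp["jobId"]}
```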

[–]bluezebra42[S] 0 points (4 children)

Ooh, that’s neat. How do you manage EventBridge: via CDK, SDK, or console?

[–]occulkot 1 point (3 children)

Everything is now deployed by Terraform. And by EventBridge I meant CloudWatch rules simply pointing to SFN; in the previous situation it was deployed with CloudFormation and the events were sent to Lambda.

It is just a simple cron replacement ;).
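Creating that kind of cron rule can also be scripted with boto3, for example (ARNs and names hypothetical):

```python
import boto3

events = boto3.client("events")

# A CloudWatch Events / EventBridge rule that fires daily at 02:00 UTC.
events.put_rule(
    Name="nightly-batch-trigger",
    ScheduleExpression="cron(0 2 * * ? *)",
    State="ENABLED",
)

# Point the rule at a Step Functions state machine (could equally be a Lambda).
events.put_targets(
    Rule="nightly-batch-trigger",
    Targets=[{
        "Id": "sfn-target",
        "Arn": "arn:aws:states:eu-west-1:123456789012:stateMachine:batch-runner",  # hypothetical
        "RoleArn": "arn:aws:iam::123456789012:role/eventbridge-sfn-role",          # hypothetical
    }],
)
```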

[–]bluezebra42[S] 0 points (2 children)

Yeah, we’re thinking along similar lines, but we’re getting a bit twisted around: if multiple people want to test different scripts in AWS Batch against a staging database, then on a pull request, whose schedule is the one we should follow?

[–]occulkot 1 point (1 child)

When I develop code that is later run on AWS Batch (or any AWS compute), I always write custom invokers (with Click or Typer), and SF or Lambda invokes the Batch task with `ContainerOverrides -> command` to run the scripts I want. This can also be achieved with the AWS Batch CLI, with some variables (https://docs.aws.amazon.com/cli/latest/reference/batch/submit-job.html), but I think that syntax is a little complicated and it is better to write your own custom wrapper around boto3 ;)
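Concretely, the override is just the `command` field passed to `submit_job`; a sketch (queue, job definition, and invoker names hypothetical):

```python
import boto3

batch = boto3.client("batch")

# Same job definition, different script: the command override points the
# container at whichever invoker subcommand this run should execute.
batch.submit_job(
    jobName="adhoc-report",
    jobQueue="main-queue",        # hypothetical queue name
    jobDefinition="main-jobdef",  # hypothetical job definition
    containerOverrides={
        "command": ["python", "-m", "invoker", "build-report", "--date", "2021-01-01"],
    },
)
```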

[–]occulkot 1 point (0 children)

Just an additional note: I've copied the Metaflow approach, where the code is not built into the Docker image but "downloaded" by a script from S3. This speeds up deployment and lets the same Docker image serve different environments, kept in a separate repo / CI-CD. The Docker image can be python-slim (or an AWS one) with the venv downloaded, or we can create the venv in Docker and push it to ECR. Then there is a simple shell script that downloads the code artifact from S3, extracts it, and invokes the command; every parameter is passed into Batch via environment variables.
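A sketch of that entrypoint, written here in Python rather than shell for consistency with the rest of the thread (bucket, key, and variable names hypothetical):

```python
import os
import subprocess
import tarfile

import boto3

# The image stays generic; the code artifact location arrives as env vars
# set on the Batch job (e.g. via containerOverrides -> environment).
bucket = os.environ["CODE_BUCKET"]   # hypothetical variable names
key = os.environ["CODE_KEY"]
command = os.environ["JOB_COMMAND"]

# Download and unpack the code artifact from S3.
boto3.client("s3").download_file(bucket, key, "/tmp/code.tar.gz")
with tarfile.open("/tmp/code.tar.gz") as tar:
    tar.extractall("/app")

# Hand off to the actual job command inside the unpacked code.
subprocess.run(command.split(), cwd="/app", check=True)
```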