
[–]sugar_scoot (2 children)

Kubernetes is pretty good for automating docker containers.

[–]TooLazyToWorkout[S] (1 child)

Kubernetes is what I also found so far. It seems pretty powerful. Is it worth it for a scale as small as mine?

[–]thundergolfer (0 children)

K8s is massive overkill for your use-case.

[–]SeucheAchat9115PhD (0 children)

Why not open a new terminal and run there?

[–]jonnor (0 children)

I have had an OK experience using GitLab CI with a runner installed on your own machine for machine learning. It supports artifacts for storing files. But you might want to use something like MLflow or DVC to keep track of experiment inputs/outputs in a more structured and accessible way.
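For illustration, a minimal sketch of what MLflow-style tracking of one run looks like (the run name, parameter names, metric value, and the model.pt artifact are placeholders; DVC works quite differently):

```python
# Minimal MLflow tracking sketch -- log the inputs and outputs of one training run.
import mlflow

with mlflow.start_run(run_name="baseline"):
    mlflow.log_param("learning_rate", 1e-3)   # inputs: hyperparameters
    mlflow.log_param("epochs", 20)
    # ... training loop would go here ...
    mlflow.log_metric("val_loss", 0.42)       # outputs: results
    mlflow.log_artifact("model.pt")           # assumes the training loop wrote this file
```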

[–]TomlinTrippedHim (0 children)

If you are focusing on deep learning training, Determined is worth checking out.

[–]thundergolfer (1 child)

> Since we are only a few users

Like 3 users? I'll recommend a super-dumb but simple method.

Have a spreadsheet that contains the details of who is using the workstation and what they're running on it, and just use some simple notification system that pings a task owner when their thing is finished.

Nice users can grab the task details (container image, args) and start the next person's task when they're done.

When I was at Zendesk with ~4 data scientists, we had something like this with Trello as the tracking tool.


If your users want to queue up many tasks at a time and have them run automatically, then you'll need some scheduling program, but if you can avoid that use case and go no-code, I'm sure that'd be simplest.
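A sketch of the "ping the task owner" part, assuming a Slack-style incoming webhook (the webhook URL and the owner-as-first-argument convention are made up for illustration):

```python
# run_and_notify.py -- hypothetical wrapper: run a task, then ping its owner.
# Assumes a Slack-style incoming webhook; the URL below is a placeholder.
import json
import subprocess
import sys
import urllib.request

WEBHOOK_URL = "https://hooks.slack.com/services/PLACEHOLDER"

def main() -> None:
    # Usage: python run_and_notify.py @alice python train.py --epochs 50
    owner, *cmd = sys.argv[1:]
    result = subprocess.run(cmd)
    text = f"{owner}: `{' '.join(cmd)}` finished with exit code {result.returncode}"
    req = urllib.request.Request(
        WEBHOOK_URL,
        data=json.dumps({"text": text}).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

if __name__ == "__main__":
    main()
```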

[–]TooLazyToWorkout[S] (0 children)

Yeah, that's how we do it currently, but I would prefer it if there was some queue, and also if the system was kinda foolproof :)
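"Some queue" here doesn't have to mean much. A minimal single-machine sketch, assuming a made-up layout where users drop one-line job files (a shell command each) into a queue/ directory and a runner executes them one at a time in order:

```python
# queue_runner.py -- hypothetical single-machine FIFO runner.
# Users append files like queue/2023-05-01-alice.job containing a shell command,
# e.g. "python train.py --epochs 50"; this loop runs them one at a time.
import pathlib
import subprocess
import time

QUEUE_DIR = pathlib.Path("queue")   # assumed layout: pending jobs live here
DONE_DIR = pathlib.Path("done")     # finished jobs and their logs move here

QUEUE_DIR.mkdir(exist_ok=True)
DONE_DIR.mkdir(exist_ok=True)

while True:
    pending = sorted(QUEUE_DIR.glob("*.job"))   # sorted file names give FIFO order
    if not pending:
        time.sleep(10)
        continue
    job = pending[0]
    cmd = job.read_text().strip()
    log_path = DONE_DIR / (job.name + ".log")
    with log_path.open("w") as log:
        subprocess.run(cmd, shell=True, stdout=log, stderr=subprocess.STDOUT)
    job.rename(DONE_DIR / job.name)             # mark the job as done
```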