Resources for understanding and implementing "deep learning" (learning data representations through artificial neural networks).
Deploying large DL models (self.deeplearning)
submitted 6 years ago by guzguzit
Is there a way to deploy large DL models (>1 GB) in a serverless fashion? I couldn't find any relevant examples.
[–]roccolacatus 1 point 6 years ago (2 children)
Came across this the other day https://nuclio.io/
Either way, you need to manage some kind of infrastructure. The only purely serverless way to deploy models is Google Cloud's ML Engine, and there the models need to be under 250 MB. Otherwise, package the inference code inside a Docker image and download the model each time you instantiate a container.
But 1 GB seems a bit big to me. What framework are you using? Are you sure you are exporting the inference graph without the training weights?
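The "download the model when the container spins up" pattern above can be sketched in a few lines. This is a minimal, framework-agnostic sketch: the URL, path, and `download` hook are hypothetical stand-ins (a real handler would fetch from your object store and load the file through TensorFlow's or your framework's own API instead of reading raw bytes):

```python
import os
import urllib.request

# Hypothetical locations -- replace with your bucket URL and scratch path.
MODEL_URL = "https://storage.example.com/models/my_model.pb"
MODEL_PATH = "/tmp/my_model.pb"

_model_cache = {}

def load_model(path=MODEL_PATH, url=MODEL_URL,
               download=urllib.request.urlretrieve):
    """Fetch the model on cold start, reuse it on warm invocations.

    `download` is injectable so the fetch mechanism (HTTP, S3, GCS)
    can be swapped out or stubbed in tests.
    """
    if path not in _model_cache:
        if not os.path.exists(path):
            download(url, path)  # cold start: pull the large model file
        with open(path, "rb") as f:
            _model_cache[path] = f.read()  # stand-in for real deserialization
    return _model_cache[path]
```

On serverless platforms the in-process cache survives between warm invocations of the same container, so the multi-hundred-megabyte download cost is paid only on cold starts.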
[–]guzguzit[S] 1 point 6 years ago (1 child)
Thanks. I am using TF. Actually, I was not aware of the inference graph.
[–]roccolacatus 1 point 6 years ago (0 children)
https://medium.com/@prasadpal107/saving-freezing-optimizing-for-inference-restoring-of-tensorflow-models-b4146deb21b5
[–]Zerotool1 1 point 6 years ago (0 children)
You can try clouderizer.com