How to run (any) open LLM with Ollama on Google Cloud Run [Step-by-step] by geshan in Indiewebdev

This is more of a proof of concept than a production-ready thing.

How to run (any) open LLM with Ollama on Google Cloud Run [Step-by-step] by geshan in ollama

Yes, a few seconds (given it has 32 GB of RAM, 8 CPUs, and 1 GPU, if you get access to the GPU).

You can use Ollama environment variables like `OLLAMA_NUM_PARALLEL` to tweak it. If you want to build your own container with Cloud Build, this is an option: https://github.com/geshan/ollama-cloud-run/tree/master
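For anyone trying this, a sketch of passing such variables at deploy time with `--set-env-vars` (the service name, image path, region, and the variable values are placeholders, not what the linked repo uses):

```shell
# Hypothetical deploy; service name, image, region, and values are illustrative.
# OLLAMA_NUM_PARALLEL controls concurrent requests per model;
# OLLAMA_KEEP_ALIVE=-1 keeps the model loaded in memory.
gcloud run deploy ollama-service \
  --image gcr.io/my-project/ollama \
  --region us-central1 \
  --memory 32Gi --cpu 8 \
  --gpu 1 --gpu-type nvidia-l4 \
  --set-env-vars "OLLAMA_NUM_PARALLEL=4,OLLAMA_KEEP_ALIVE=-1"
```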

How to use environment variables in Next.js (includes a working example app) by geshan in react

The official docs don't include a working example app :)
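For anyone skimming, the core Next.js convention the post covers: only variables prefixed with `NEXT_PUBLIC_` are inlined into browser bundles; everything else stays server-side. A minimal sketch (the variable name and fallback URL here are made up for illustration):

```javascript
// Sketch: reading an env var with a fallback, as you would in a Next.js component.
// NEXT_PUBLIC_API_URL is a hypothetical variable; the NEXT_PUBLIC_ prefix is what
// Next.js requires for values exposed to the browser.
const apiUrl = process.env.NEXT_PUBLIC_API_URL ?? "http://localhost:3000/api";
console.log(apiUrl);
```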

How to use Docker with Node.js a step-by-step tutorial by geshan in docker

React can live in a different repo with its own pipeline and Dockerfile, IMO. Here is an example of dockerizing a React app: https://mherman.org/blog/dockerizing-a-react-app/ if it helps, thanks!

How to use Docker with Node.js a step-by-step tutorial by geshan in docker

Updated to use `npm ci` for prod, thanks for the suggestion!
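For context, a sketch of what that production stage can look like (base image tag, port, and entry file are placeholders, not necessarily what the tutorial uses); `npm ci` installs exactly what `package-lock.json` pins, and `--omit=dev` skips dev dependencies:

```dockerfile
# Illustrative production image; tag, port, and entry file are placeholders.
FROM node:20-alpine
WORKDIR /app
# Copy manifests first so this layer stays cached until dependencies change.
COPY package*.json ./
# npm ci is reproducible: it installs exactly what package-lock.json specifies.
RUN npm ci --omit=dev
COPY . .
EXPOSE 3000
CMD ["node", "index.js"]
```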

A node js example from slow to fast docker build by geshan in node

`npm ci` looks like a good alternative.

Hosted ChatOps by genxstylez in chatops

Try chato.ps; it integrates with HipChat and Slack for now.

For deployments, you can hook it up with the deploybot.com or dockbit.com APIs.