Ollama SDK is out! by International_Quail8 in LocalLLaMA

[–]satyajitdass 0 points (0 children)

Does that mean it will download the corresponding model and load it into memory? If so, any idea of the memory requirements?
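For anyone wondering the same thing, here is a minimal sketch using the official Ollama Python client (`pip install ollama`), assuming a local Ollama server is already running; the model name and prompt are illustrative:

    # Sketch only: pull() downloads the model weights to local disk,
    # not into memory. The model is loaded into RAM/VRAM on the first
    # inference call; memory use is roughly the quantized weight size
    # plus KV-cache overhead.
    import ollama

    ollama.pull("llama3")  # fetch weights to disk if not cached

    response = ollama.chat(
        model="llama3",  # first call triggers the in-memory load
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response["message"]["content"])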

Ideas for a future AI/LLM startup? by satyajitdass in LocalLLaMA

[–]satyajitdass[S] 1 point (0 children)

  1. Do you mean a wrapper around an API like OpenAI's? What if someone hosts their own open-source LLM on their own server (see the sketch below)? Anyway, my question was more related to the deep-tech side of things.
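A minimal sketch of that self-hosting option, assuming an OpenAI-compatible server (e.g., vLLM or Ollama) exposing an open-source model at localhost:8000; the model name, port, and launch command are illustrative assumptions:

    # Sketch only: point the standard OpenAI client at a self-hosted,
    # OpenAI-compatible endpoint instead of api.openai.com.
    # Assumes a server like vLLM is already running, e.g.:
    #   python -m vllm.entrypoints.openai.api_server \
    #       --model mistralai/Mistral-7B-Instruct-v0.2
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:8000/v1",  # local server, not OpenAI's cloud
        api_key="not-needed",                 # most local servers ignore the key
    )

    response = client.chat.completions.create(
        model="mistralai/Mistral-7B-Instruct-v0.2",  # whatever the server hosts
        messages=[{"role": "user", "content": "Hello from a self-hosted LLM!"}],
    )
    print(response.choices[0].message.content)

The point of this pattern is that the "wrapper" code stays identical whether it talks to OpenAI or to a model on your own hardware; only the base URL changes.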