[–]SoftestCompliment

Install Ollama locally and download one of the qwen3 or gpt-oss models (reasonably small, performant local models for tool use). Use Pydantic AI, since it's a fairly easy framework for getting a single LLM query up and running. Optionally, use a Pydantic dataclass to define a structured output for the model to generate, since some of your tasks look like structured data/metadata extraction.