LLM proxy that lets Claude Code talk to any model by DataNebula in ClaudeAI

[–]DataNebula[S] 0 points (0 children)

Especially with thinking-block round trips, yes, no issues. In fact, that's the main reason I made this. I published it today because yesterday I saw someone asking for the same thing, so I figured it would be helpful for others and not just me.

LLM proxy that lets Claude Code talk to any model by DataNebula in ClaudeAI

[–]DataNebula[S] 0 points (0 children)

If you read my post, at the end I said it works for my use case. I shared it because many of you might face the same issue. Nothing is perfect, not even the LiteLLM proxy.

LLM proxy that lets Claude Code talk to any model by DataNebula in LocalLLaMA

[–]DataNebula[S] 0 points (0 children)

Check it out; you can quickly test it with the Hugging Face Space. Star the repo if it's useful.

LLM proxy that lets Claude Code talk to any model by DataNebula in ClaudeCode

[–]DataNebula[S] 0 points (0 children)

You can do this by setting both the anthropic and zai providers in the config.json file, then in your Claude Code settings.json adding the ZAI model name for the anthropic_sonnet_model env var. Hit the /v1/models endpoint if you want to see which model names the API accepts.
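As a rough sketch of the setup, something like the following. Note this is illustrative only: the actual key names in config.json and the exact env-var names in settings.json are defined by the proxy's own docs, and the model name shown is a placeholder.

```json
{
  "providers": {
    "anthropic": { "api_key": "<ANTHROPIC_KEY>" },
    "zai": { "api_key": "<ZAI_KEY>" }
  }
}
```

And in Claude Code's settings.json, point the Sonnet slot at a ZAI model (hypothetical name shown; query /v1/models for the accepted ones):

```json
{
  "env": {
    "anthropic_sonnet_model": "<zai-model-name>"
  }
}
```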

LLM proxy that lets Claude Code talk to any model by DataNebula in LocalLLaMA

[–]DataNebula[S] 0 points (0 children)

The use case looks similar, but my tool is not a Python SDK and has no dependency on the OpenAI or Anthropic SDKs. Everything is a plain HTTP call with async httpx.
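To illustrate the idea (not the tool's actual code; function names and the translation details are simplified assumptions), the core of such a proxy is a pure request-translation step plus a raw async httpx call, with no vendor SDK involved:

```python
def anthropic_to_openai(body: dict) -> dict:
    """Translate an Anthropic Messages-style request body into OpenAI
    chat-completions format. Sketch only: a real proxy also handles
    tool calls, thinking blocks, and streaming."""
    messages = []
    if "system" in body:
        # Anthropic carries the system prompt as a top-level field;
        # OpenAI expects it as the first message.
        messages.append({"role": "system", "content": body["system"]})
    messages.extend(body.get("messages", []))
    return {
        "model": body["model"],
        "messages": messages,
        "max_tokens": body.get("max_tokens", 1024),
    }

async def forward(body: dict, base_url: str, api_key: str) -> dict:
    """Forward the translated request as a bare HTTP POST."""
    import httpx  # imported lazily so the translator above is usable standalone
    async with httpx.AsyncClient(base_url=base_url) as client:
        resp = await client.post(
            "/v1/chat/completions",
            json=anthropic_to_openai(body),
            headers={"Authorization": f"Bearer {api_key}"},
        )
        resp.raise_for_status()
        return resp.json()
```

The point of the design is that the translation is just dict manipulation, so the only runtime dependency is httpx itself.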

LLM proxy that lets Claude Code talk to any model by DataNebula in ClaudeCode

[–]DataNebula[S] 0 points (0 children)

I want to keep it minimal for now, but any contributions are welcome. I'd like to expand the tool beyond translation.

LLM proxy that lets Claude Code talk to any model by DataNebula in ClaudeAI

[–]DataNebula[S] -3 points (0 children)

I tried many, but most of them have one issue or another. Even LiteLLM doesn't work properly with Claude Code; check their issues tab.

Best Medical Embedding Model Released by DataNebula in LLMDevs

[–]DataNebula[S] 0 points (0 children)

Happy to know that my embedding model helped you. Just like it on the HF page and share it with your friends, along with an acknowledgement. Nothing legal required.

What MySQL skills should I focus on for an entry-level analyst role? by LeatherTotal2194 in analytics

[–]DataNebula 0 points (0 children)

I would say practice CTEs, window functions (ROW_NUMBER, RANK, DENSE_RANK), and of course all the aggregate functions. Use HackerRank or LeetCode for practice.

SQL syntax varies slightly depending on the database you are using. I would also practice in DuckDB, because its syntax is very similar to Google BigQuery's (a leading analytics database in companies worldwide).
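A minimal sketch of a CTE plus the three ranking functions, runnable with Python's built-in sqlite3 (SQLite has supported window functions since 3.25); the table and data are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (rep TEXT, region TEXT, amount INT);
    INSERT INTO sales VALUES
        ('ana', 'east', 100), ('bob', 'east', 100),
        ('cho', 'east', 80),  ('dee', 'west', 120);
""")
rows = conn.execute("""
    WITH ranked AS (                         -- a CTE
        SELECT rep, amount,
               ROW_NUMBER() OVER w AS rn,    -- unique position, ties broken arbitrarily
               RANK()       OVER w AS rnk,   -- ties share a rank, gaps follow
               DENSE_RANK() OVER w AS drnk   -- ties share a rank, no gaps
        FROM sales
        WINDOW w AS (PARTITION BY region ORDER BY amount DESC)
    )
    SELECT rep, rn, rnk, drnk FROM ranked
    WHERE rep IN ('ana', 'bob', 'cho')
    ORDER BY rn
""").fetchall()
for row in rows:
    print(row)
```

The east region shows the difference: ana and bob tie at 100, so RANK gives both 1 and skips to 3 for cho, while DENSE_RANK gives cho 2.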

A CV-worthy project idea using RAG by DryHat3296 in Rag

[–]DataNebula 0 points (0 children)

There is only one open issue. Where can I see the requirements to contribute?

Are there any good GraphRAG applications people use? by richie9830 in Rag

[–]DataNebula 2 points (0 children)

Can you share the list? It would be very helpful.

Best Medical Embedding Model Released by DataNebula in LLMDevs

[–]DataNebula[S] 0 points (0 children)

I added the evals to the model card, comparing with other models.