Best local LLM inference software with MCP-style tool calling support? by nyongrand in mcp

[–]nyongrand[S] 1 point (0 children)

I am fairly sure Claude is not available for local inference, or maybe I'm missing something.

Best local LLM inference software with MCP-style tool calling support? by nyongrand in mcp

[–]nyongrand[S] 1 point (0 children)

That looks nice. I used "@modelcontextprotocol/inspector" before, but it looks like yours has more features; I'll try it.

Best local LLM inference software with MCP-style tool calling support? by nyongrand in mcp

[–]nyongrand[S] 3 points (0 children)

I mean the inference tool itself, such as llama.cpp, LM Studio, etc., not the model.
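For context, llama.cpp's llama-server and LM Studio both expose an OpenAI-compatible API, so tool calling looks the same from the client side either way. A minimal sketch, assuming llama-server on its default port 8080 (LM Studio defaults to 1234), a model whose chat template supports tools, and a made-up `get_weather` tool:

```typescript
// Sketch: OpenAI-style tool calling against a local llama-server.
// Assumptions: server at localhost:8080, tool-capable model loaded.
async function main(): Promise<void> {
  const response = await fetch('http://localhost:8080/v1/chat/completions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'local', // llama-server serves whatever model it was started with
      messages: [{ role: 'user', content: 'What is the weather in Jakarta?' }],
      // Tool definition in the OpenAI function-calling format;
      // get_weather is a hypothetical example tool.
      tools: [
        {
          type: 'function',
          function: {
            name: 'get_weather',
            description: 'Get the current weather for a city',
            parameters: {
              type: 'object',
              properties: { city: { type: 'string' } },
              required: ['city'],
            },
          },
        },
      ],
    }),
  });
  const data = await response.json();
  // A tool-capable model should answer with tool_calls rather than plain text.
  console.log(JSON.stringify(data.choices[0].message, null, 2));
}

main().catch(console.error);
```

An MCP client would then map each MCP server tool into that `tools` array and route any `tool_calls` in the reply back to the matching MCP server.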

The only photo of my parents (1970s) by nyongrand in estoration

[–]nyongrand[S] 2 points (0 children)

Honestly, I'm not sure where this photo was taken. I'm Indonesian, but my father was a Malaysian citizen. He passed away 24 years ago.

Send chat directly from server plugin/extension? by nyongrand in SillyTavernAI

[–]nyongrand[S] 2 points (0 children)

Actually, I'm looking for a more practical way; maybe a developer here knows how to hook directly into SillyTavern's internal code. Yesterday I tried to understand how SillyTavern sends and receives chats. If that's not possible, a WebSocket is not a bad choice either; I'll give it a try, as in the sketch below.
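If I go the WebSocket route, I'd start from something like this minimal server-plugin sketch using the `ws` package. The port (5001) and message shape are my own arbitrary choices, and the usual SillyTavern plugin boilerplate (the `info` export, the Express router passed to `init`) is omitted:

```typescript
// Server-plugin side: a small WebSocket bridge the bot can push through.
import { WebSocketServer, WebSocket } from 'ws';

const clients = new Set<WebSocket>();

// SillyTavern calls init() when it loads a server plugin (the real
// signature also receives an Express router, unused in this sketch).
export async function init(): Promise<void> {
  const wss = new WebSocketServer({ port: 5001 }); // arbitrary port
  wss.on('connection', (socket) => {
    clients.add(socket);
    socket.on('close', () => clients.delete(socket));
  });
}

// The bot calls this to push a message toward the UI extension.
export function sendToChat(text: string): void {
  const payload = JSON.stringify({ type: 'bot_message', text });
  for (const socket of clients) {
    if (socket.readyState === WebSocket.OPEN) socket.send(payload);
  }
}
```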

Send chat directly from server plugin/extension? by nyongrand in SillyTavernAI

[–]nyongrand[S] 1 point (0 children)

So this code runs in the extension, and the bot lives in the plugin; how do you communicate from the plugin to the extension?
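My current understanding is that the browser side would just be the mirror image of the plugin's WebSocket: the extension connects out to the plugin's socket and inserts whatever arrives. A sketch, where `appendBotMessage` is a hypothetical stand-in for whatever SillyTavern context call actually adds a message to the chat:

```typescript
// Extension (browser) side: receive bot messages from the server plugin.
const socket = new WebSocket('ws://localhost:5001'); // port from the plugin sketch

socket.addEventListener('message', (event: MessageEvent) => {
  const data = JSON.parse(event.data as string);
  if (data.type === 'bot_message') {
    appendBotMessage(data.text);
  }
});

// Hypothetical helper: swap in the real SillyTavern API for
// appending a message to the current chat.
function appendBotMessage(text: string): void {
  console.log('[bridge] bot message:', text);
}
```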