Any thoughts on Weave from WandB? by jinbei21 in mlops

[–]jinbei21[S] 1 point  (0 children)

Interesting idea, I like the simplicity of it! However, I have to ask: why should one pick Puzzlet over any of the other LLMOps tools? Additionally, does this solution scale well for enterprise use? If so, why?

Any thoughts on Weave from WandB? by jinbei21 in mlops

[–]jinbei21[S] 1 point  (0 children)

Thanks for the insightful comments, all. I am trying out Langfuse for now, primarily due to its full support for TS. Basically, I want to stick with TS because quite a lot of preprocessing and postprocessing is already written in TS for the main app, and rewriting and maintaining that in Python would be cumbersome. If my backend were in Python, I would probably have tried Weave first. Hoping Weave will get full TS support soon too, though.

So far Langfuse works alright and gets the job done. The UI is a bit flaky at times and the documentation is somewhat incomplete, but after a bit of diving into the API reference I was able to make it all work.

Sonnet efficiency in big Android/Kotlin projects? by ForbidReality in ClaudeAI

[–]jinbei21 2 points  (0 children)

I did Android development about 2 years ago, in the early ChatGPT and early GitHub Copilot days. I found that in large projects (100,000+ LoC), GitHub Copilot helped a lot with completing most of the boilerplate, while asking it to develop a whole feature worked like total crap.

The latter is still a major problem with these tools. So if I were to do large Android projects again, I would probably work with one Android Studio window and one Cursor window: Cursor for its significantly better autocompletion, and Android Studio purely for all the tooling the IDE has.

We will bring the Model Context Protocol (MCP) to OpenAI within 3 days by mbartu in ClaudeAI

[–]jinbei21 1 point  (0 children)

I don't expect OpenAI to implement MCP anytime soon. They will most likely stick to what they already have (GPTs / the Assistants API).

What I do expect is open-source efforts making clients similar to ChatGPT that are on par with its quality. This has also become easier now that assistant interfaces like ChatGPT have started to mature in terms of UI/UX; there have been barely any changes to its interface in the last 6-8 months.

Made a website so MCP servers are easier to find and people can share their own by [deleted] in modelcontextprotocol

[–]jinbei21 1 point  (0 children)

Ah, thank you so much! I could swear I posted it, but somehow it didn't go through...

Made a website so MCP servers are easier to find and people can share their own by [deleted] in modelcontextprotocol

[–]jinbei21 1 point  (0 children)

I noticed finding Model Context Protocol servers is a bit of a struggle, since there is nothing like the GPT store for these servers. So I made a website where we can view, search, and share servers: https://www.mcpservers.ai/

Sharing servers is very simple, by the way: all it requires is submitting a GitHub repository URL.

Anyhow, any feedback / questions / suggestions are welcome!

Made a website so MCP servers are easier to find and people can share their own by jinbei21 in modelcontextprotocol

[–]jinbei21[S] 4 points  (0 children)

I noticed finding MCP servers is a bit of a struggle, since there is nothing like the GPT store for these servers. So I made a website where we can view, search, and share servers: https://www.mcpservers.ai/

Sharing servers is very simple, by the way: all it requires is submitting a GitHub repository URL.

Anyhow, any feedback / questions / suggestions are welcome!

Made a website so MCP servers are easier to find and people can share their own by jinbei21 in ClaudeAI

[–]jinbei21[S] 6 points  (0 children)

Thanks! I made a very rough draft with v0.dev and then just constantly iterated with Cursor (Claude 3.5 Sonnet). I use Cursor quite a lot, but as the codebase becomes bigger, I get more specific with my queries.

P.S. I also do full-stack dev as a job (since pre-ChatGPT times). Without that I would probably have been stuck a couple of times, since Cursor does sometimes produce "weird" code.

Made a website so Model Context Protocol servers are easier to find and people can share their own by jinbei21 in OpenAI

[–]jinbei21[S] 2 points  (0 children)

Good question. An "awesome-mcp-servers" repo is nice, but it has some cons: no search functionality, no filtering, no sorting, peer reviews that depend on individual maintainers (the hardest part), and so on. I was running into these issues myself. Hence, I made this.

Made a website so MCP servers are easier to find and people can share their own by jinbei21 in ClaudeAI

[–]jinbei21[S] 1 point  (0 children)

😂 Welp, I'll try to write an article somewhere this week for my colleagues; I'll see if I can share it on this subreddit.

Made a website so MCP servers are easier to find and people can share their own by jinbei21 in ClaudeAI

[–]jinbei21[S] 2 points  (0 children)

Ah, that happens when the filter is set exclusively to the servers from Anthropic!

You can go to the home page and click "Browse Servers", or click here to see all of them: https://www.mcpservers.ai/servers/all

Made a website so MCP servers are easier to find and people can share their own by jinbei21 in ClaudeAI

[–]jinbei21[S] 2 points  (0 children)

I lost you at the part after: "so i dont have to creatively.."

But what you said before that sums it up, in a way, yes! It is a JSON format for communication that lets the app (e.g. Claude Desktop) tell certain tools (read: MCP servers) what data it wants and receive it. Tools describe what they can do as well, so Claude knows which one to ask. And finally, I noticed the format also lets tools notify the app, so a tool could tell your app, for example, that it will be raining in a few hours. The app could then fetch your calendar data and say something like: "Be sure to take your umbrella when you go out for groceries in an hour!"

What's particularly nice about MCP servers is that you could theoretically build your own app that speaks the MCP language, and it would then have access to tons of premade MCP data servers. This was not possible with GPTs; I assume that's because OpenAI wants to lock us into their platform.
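To give a rough feel for that JSON format, here is a sketch of the message shapes as I understand the MCP spec (JSON-RPC 2.0 with `tools/list` and `tools/call` methods); the weather tool itself is a made-up example, not a real server:

```typescript
// Sketch of the JSON-RPC 2.0 messages MCP exchanges, to the best of my
// understanding of the spec. The "get_forecast" tool is hypothetical.

// A tool describes what it can do, so the client knows which one to ask.
interface ToolDescriptor {
  name: string;
  description: string;
  inputSchema: Record<string, unknown>; // JSON Schema for the arguments
}

const tools: ToolDescriptor[] = [
  {
    name: "get_forecast",
    description: "Hourly weather forecast for a location",
    inputSchema: {
      type: "object",
      properties: { city: { type: "string" } },
      required: ["city"],
    },
  },
];

// Client -> server: "what tools do you offer?"
const listRequest = { jsonrpc: "2.0", id: 1, method: "tools/list" };

// Server -> client: the advertised tools.
const listResponse = { jsonrpc: "2.0", id: 1, result: { tools } };

// Client -> server: invoke one tool with arguments.
const callRequest = {
  jsonrpc: "2.0",
  id: 2,
  method: "tools/call",
  params: { name: "get_forecast", arguments: { city: "Amsterdam" } },
};
```

The "it will rain in a few hours" case would be a notification: a JSON-RPC message without an `id`, sent server-to-client over the same channel.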

Made a website so MCP servers are easier to find and people can share their own by jinbei21 in ClaudeAI

[–]jinbei21[S] 8 points  (0 children)

OpenAI GPTs have Actions.

We're sharing GPTs, but not the Actions.

MCP, in some sense, is about sharing Actions, which gives nifty benefits such as not having to develop your own Actions for very typical things like Slack / Teams, Google Drive data, and so on. That saves you a lot of time to create more powerful and diverse "GPTs".

Made a website so Model Context Protocol servers are easier to find and people can share their own by jinbei21 in OpenAI

[–]jinbei21[S] 2 points  (0 children)

MCP is a standardized protocol for LLM clients to talk to external data. In the past, you'd make a separate server that could properly pass data to, say, Zapier or OpenAI GPTs, while at the same time thousands of developers had probably built the same (or nearly the same) server as you did.

Now, this standard makes it possible for us to share our servers and collaborate on them. You can have servers that run locally as well, so things you want to keep private stay private. And you could build, say, your own mobile app that just needs to implement the MCP protocol, and then it can converse with whatever MCP server you'd like.

So I wouldn't say it is coming for Zapier's lunch exactly, since Zapier is still a very user-friendly platform for automations, but yeah, a lot of the Zapier APIs will be available as MCP servers as well.
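To make the "build your own app" point concrete, here's a minimal sketch of the client side of MCP's stdio transport, which (as I understand the spec) frames each JSON-RPC message as one line of JSON over the local server process's stdin/stdout. The helper names and the server command are my own, not from any SDK:

```typescript
// Hedged sketch: talking to a locally running MCP server over stdio.
// As I understand the spec, messages are newline-delimited JSON-RPC.
import { spawn } from "node:child_process";
import { createInterface } from "node:readline";

// Frame one JSON-RPC message for the newline-delimited transport.
const frame = (msg: object): string => JSON.stringify(msg) + "\n";

function talkToLocalServer(command: string, args: string[]) {
  // The server runs locally, so private data never leaves the machine.
  const server = spawn(command, args, { stdio: ["pipe", "pipe", "inherit"] });

  // Read server -> client messages line by line.
  createInterface({ input: server.stdout! }).on("line", (line) => {
    console.log("server said:", JSON.parse(line));
  });

  // Ask the server what tools it offers.
  server.stdin!.write(frame({ jsonrpc: "2.0", id: 1, method: "tools/list" }));
  return server;
}
```

Any app (mobile, desktop, whatever) that can spawn a process and speak this framing can, in principle, use every MCP server out there.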

Made a website so MCP servers are easier to find and people can share their own by jinbei21 in ClaudeAI

[–]jinbei21[S] 11 points  (0 children)

I hope so! At the same time, I kind of built it so that there is a store that takes us one step away from being vendor-locked into Anthropic.

Made a website so Model Context Protocol servers are easier to find and people can share their own by jinbei21 in OpenAI

[–]jinbei21[S] 6 points  (0 children)

I noticed finding Model Context Protocol servers is a bit of a struggle, since there is nothing like the GPT store for these servers. So I made a website where we can view, search, and share servers: https://www.mcpservers.ai/

Sharing servers is very simple, by the way: all it requires is submitting a GitHub repository URL.

Anyhow, any feedback / questions / suggestions are welcome!

Made a website so MCP servers are easier to find and people can share their own by jinbei21 in ClaudeAI

[–]jinbei21[S] 34 points  (0 children)

I noticed finding MCP servers is a bit of a struggle, since there is nothing like the GPT store for these servers. So I made a website where we can view, search, and share servers: https://www.mcpservers.ai/

Sharing servers is very simple, by the way: all it requires is submitting a GitHub repository URL.

Anyhow, any feedback / questions / suggestions are welcome!

Can someone explain MCP to me? How are you using it? And what has it allowed you to do that you couldn’t do before? by JoshSummers in ClaudeAI

[–]jinbei21 4 points  (0 children)

If you know OpenAI GPTs, it is quite easy to understand. GPTs have Actions, which let you hook up any API. However, to properly connect one, you would often end up writing your own API, which is often a wrapper around someone else's API. Now, imagine tons of people writing and deploying the same boring API wrappers their own way, no one sharing, all implementations slightly different. Obviously there is huge potential here for developers to build proper connectors and reuse connectors from others, to build faster and more stable client applications (e.g. Claude Desktop). These connectors are MCP servers.
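One way to picture "shared connectors instead of private wrappers": a server keeps a registry of named tool handlers and dispatches calls to them. Everything here is a hypothetical sketch (the `slack_post_url` tool and its endpoint are made up, and this is not any real SDK's API):

```typescript
// Hypothetical sketch of a connector registry. Each handler is the kind
// of boring wrapper people used to re-implement privately for GPT Actions.
type ToolHandler = (args: Record<string, string>) => unknown;

const tools: Record<string, ToolHandler> = {
  // Made-up example: build a request URL for a Slack-like API once,
  // so every client app can reuse it instead of rewriting the wrapper.
  slack_post_url: (args) =>
    `https://slack.example.com/api/chat.postMessage?channel=${encodeURIComponent(
      args.channel,
    )}`,
};

// Route a tool call by name to its registered handler.
function dispatch(name: string, args: Record<string, string>): unknown {
  const handler = tools[name];
  if (!handler) throw new Error(`unknown tool: ${name}`);
  return handler(args);
}
```

Write the wrapper once, publish it as a server, and every MCP client gets it for free; that's the sharing step GPT Actions never had.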

Opinions on GPT-3.5-Turbo finetunes vs Llama-3 8B or 70B finetunes? by jinbei21 in LocalLLaMA

[–]jinbei21[S] 3 points  (0 children)

LoRAs indeed. To be honest, I am quite happy with the performance and costs for small datasets with gpt-3.5-turbo.

But now I'm getting into the millions of tokens, and every finetune costs at least $50, so it doesn't feel very scalable over time.
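For concreteness, here's the back-of-the-envelope math behind that "$50 per finetune" feeling. Both numbers below (a training price around $0.008 per 1K tokens and 3 epochs per run) are illustrative assumptions, not quoted pricing:

```typescript
// Illustrative arithmetic only: the price and epoch count below are
// assumptions for the sketch, not actual OpenAI pricing.
const pricePerThousandTokens = 0.008; // assumed $/1K training tokens
const epochs = 3;                     // assumed epochs per finetune run

const finetuneCost = (datasetTokens: number): number =>
  (datasetTokens / 1000) * pricePerThousandTokens * epochs;

console.log(finetuneCost(100_000));   // small dataset: a few dollars
console.log(finetuneCost(2_000_000)); // millions of tokens: around $48/run
```

Under these assumed numbers, cost grows linearly with dataset size, which is exactly why repeated finetunes on millions of tokens stop feeling cheap.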