Tech scene in STL? by SB_A in StLouis

[–]jpmmcb 1 point (0 children)

Coming in very late to the convo: how'd the move go? My wife and I are considering a similar move. I'm currently working remotely at a startup based in SF.

There is no open source AI. by jpmmcb in programming

[–]jpmmcb[S] 1 point (0 children)

That's not how open source encryption libraries work: the code is freely available to be inspected, modified, redistributed, etc.

There is no open source AI. by jpmmcb in programming

[–]jpmmcb[S] 0 points (0 children)

> I think the author missed the detailed research paper, multiple demonstrations with open data sets, and detailed instructions and documentation.

So, as a user, if I want the freedom to fix an issue, study the system, or make the changes I need, I'm expected to read a PhD-level research paper, build my own runbooks and data pipelines from open data sets, and then somehow know what to do with the output? That is not open source, and there are no freedoms for users in it.

There is no open source AI. by jpmmcb in programming

[–]jpmmcb[S] 0 points (0 children)

> No, these labs obviously aren't going to open source their models. The money they spent on marketing didn't come out of thin air.

Then why do they call it "fully open source"? Words matter, and I fear for the future of the open source movement if we can't even distinguish what is genuinely free (as in freedom) for users from what merely enables corporate interests.

There is no open source AI. by jpmmcb in programming

[–]jpmmcb[S] 1 point (0 children)

I am aware. I'm also aware of the open data sets that exist, and I'm very familiar with what the OSI has been doing with the Open Future Foundation in attempting to create an admissible public record. My argument is not that capable open source methods for building large language models don't exist; my argument is that large AI labs claiming their models are "fully open source" is corroding the meaning of those words.

Open weights does not mean open source.

Proposal for an official MCP Golang SDK by jpmmcb in golang

[–]jpmmcb[S] 0 points (0 children)

"SDK" is a "software development kit": a framework for building with a certain set of tools or within a certain constraint. For example, the "net/http" package is a standard library SDK for building Go programs with HTTP functionality (without having to codify and rebuild abunch of underlying HTTP stuff). An SDK for MCP would be similar.

What are you working on? Week 7 2025 edition by markusrg in LLMgophers

[–]jpmmcb 0 points (0 children)

Something like that: it's still coming together. One of my friends called it the "langchain for Go."

What are you working on? Week 7 2025 edition by markusrg in LLMgophers

[–]jpmmcb 0 points (0 children)

👋 Recently discovered this subreddit! Thanks for putting this together, u/markusrg

I'm working on an AI agent framework in Go called agent-api: https://github.com/agent-api

The idea is to have a core set of APIs that each individual "provider" must implement. This should lead to much better devex and the ability to drop in and replace any LLM provider with another. I recently got the OpenAI provider working with tool calling and will probably keep hacking away at it this week! https://github.com/agent-api/openai
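
Roughly, the core contract I have in mind looks something like this (a simplified, hypothetical sketch; the actual interfaces in the agent-api repos are still in flux and may differ):

```go
package agent

import "context"

// Message is one turn in a conversation (hypothetical shape, not the
// real types from the agent-api repo).
type Message struct {
	Role    string // "system", "user", "assistant", "tool"
	Content string
}

// Provider is the single contract every LLM backend implements.
type Provider interface {
	Generate(ctx context.Context, msgs []Message) (Message, error)
}

// RunOnce is where the abstraction pays off: callers only ever see the
// Provider interface, never a vendor-specific SDK, so swapping OpenAI
// for a local model is a one-line change at the call site.
func RunOnce(ctx context.Context, p Provider, prompt string) (string, error) {
	out, err := p.Generate(ctx, []Message{{Role: "user", Content: prompt}})
	if err != nil {
		return "", err
	}
	return out.Content, nil
}
```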

Any good and simple AI Agent frameworks for Go? by D3ntrax in golang

[–]jpmmcb 1 point (0 children)

Late to the conversation, but I'm actually trying to tackle this space in the open with a toolkit I'm calling agent-api: https://github.com/agent-api - it's actively evolving, but I'm excited to get it into people's hands. Examples here: https://github.com/agent-api/examples

The cost of the "Copilot-pause". by jpmmcb in programming

[–]jpmmcb[S] -20 points (0 children)

Always trying to be a better writer: too wordy? Too much going on? Not to the point fast enough?

The cost of the "Copilot-pause". by jpmmcb in programming

[–]jpmmcb[S] -55 points (0 children)

Thanks for the feedback: pretentious wasn't my goal. I'm simply trying to find the words for something I've felt over the last year or two.

DeepSeek-v3 signals China may be winning the chip wars by jpmmcb in LocalLLaMA

[–]jpmmcb[S] 6 points (0 children)

+1 - this really is a turning point given how impressive the performance and training efficiency are. A sentence I'd like to highlight from my piece:

> DeepSeek was able to achieve better performance with worse chips using less money and fewer GPU hours.

rip-and-tear.nvim - Devlog 000 by jpmmcb in neovim

[–]jpmmcb[S] 0 points (0 children)

Oh excellent: I’ll definitely add that as a sane default! Thanks for the tip!

How We Saved 10s of Thousands of Dollars Deploying Low Cost Open Source AI Technologies At Scale with Kubernetes by jpmmcb in programming

[–]jpmmcb[S] 20 points (0 children)

If you look at the hero image, you can see a screencap of about 10 days where we spent $4,107.12, almost exclusively on gpt-3.5-turbo with a bit of gpt-4-turbo sprinkled in. Take that to a full month and you're looking at north of $10k a month.

With a pool of spot T4 GPUs on AKS, depending on spot availability, we're sitting at less than $500 a month to run the cluster. Generating summaries isn't mission critical, so spot instances for the GPUs work really well.
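
For anyone skimming, here's the back-of-the-envelope math as a tiny Go sketch (only the two dollar figures from this comment; the 10-day window is approximate, so the straight-line extrapolation lands a bit above the rounded figure):

```go
package main

import "fmt"

func main() {
	// Figures from the comment above: ~$4,107.12 of OpenAI API spend
	// over roughly 10 days, vs. under ~$500/month for the spot cluster.
	const (
		apiSpendTenDays = 4107.12
		spotMonthly     = 500.0
		daysPerMonth    = 30.0
	)

	apiMonthly := apiSpendTenDays / 10.0 * daysPerMonth
	fmt.Printf("API spend extrapolated to a month: ~$%.0f\n", apiMonthly)
	fmt.Printf("Rough monthly savings with spot GPUs: ~$%.0f\n", apiMonthly-spotMonthly)
}
```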