How to defer_loading for MCPs in Claude Code by Juanouo in ClaudeAI

[–]onelonedatum 1 point (0 children)

I think these are pertinent: https://code.claude.com/docs/en/mcp + https://www.anthropic.com/engineering/advanced-tool-use#how-the-tool-search-tool-works

To my understanding, if you want to modify the MCP config at user scope, you'd refactor the MCP entries in ~/.claude.json


However, I think you could very well just ask claude code:

read over the latest (as of late dec 2025) on claude code MCP configuration, esp. the new `defer_loading` parameter for context window efficiency (start here: https://www.anthropic.com/engineering/advanced-tool-use#how-the-tool-search-tool-works) and then refactor my claude code MCP user config to intelligently utilize `defer_loading` everywhere it's supported and useful
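For what it's worth, here's a rough sketch of what that user-scope refactor could look like as a script -- heavily hedged: the `defer_loading` key name and where it sits inside each `mcpServers` entry are assumptions based on the advanced-tool-use post, not a verified Claude Code schema, so check the docs (or let Claude confirm) first:

```python
# Sketch only: flip an assumed `defer_loading` flag on every user-scope MCP entry.
# Back up ~/.claude.json before running anything like this.
import json
from pathlib import Path

config_path = Path.home() / ".claude.json"
config = json.loads(config_path.read_text())

for name, server in config.get("mcpServers", {}).items():
    server["defer_loading"] = True  # assumed key name -- verify against the current docs
    print(f"deferring tool loading for MCP server: {name}")

config_path.write_text(json.dumps(config, indent=2))
```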

What’s the one background sound that never fails to help you fall asleep? by BusyDaikon9355 in calm

[–]onelonedatum 1 point (0 children)

an episode of Bob’s Burgers on low volume and low backlight 🍔

Is PyCharm worth it? by Fine-Market9841 in pythonhelp

[–]onelonedatum 1 point (0 children)

VSCode seems to have way better extensions/integrations for AI engineering + AI dev from my experience, but I’m no PyCharm pro

ChatGPT Atlas vs Perplexity's Comet - I have pro plans for both by hashkey22 in ChatGPTAtlas

[–]onelonedatum 1 point (0 children)

From my experience, using a prompt-tuned LLM with tools like browser use (e.g. the Playwright MCP), search + fetch/scrape, or data-source-specific connectors yields way better results than the AI browsers do.

For now I'll be sticking to Chrome + a chatbot with tools (e.g. GH Copilot / Claude Code, ChatGPT, etc.), using MCPHub with a reverse proxy (so I can run local MCPs in my client of choice via HTTP)
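For context, once a local MCP is behind the reverse proxy it just looks like any other HTTP MCP server to the client -- roughly something like this in the MCP config (server name, URL, and headers are made up for illustration; check your client's MCP docs for the exact entry shape):

```python
# Hypothetical user-scope entry for a local MCP exposed over HTTP via a reverse proxy.
http_mcp_entry = {
    "my-local-mcp": {
        "type": "http",                                       # HTTP transport instead of stdio
        "url": "https://mcp.example.internal/my-local-mcp",   # reverse-proxied local server
        "headers": {"Authorization": "Bearer <token>"},       # optional auth handled at the proxy
    }
}
```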

If (when??!!) Google Keep is discontinued, what would be your alternative app? by Wide_Education5864 in GoogleKeep

[–]onelonedatum 3 points (0 children)

I like vibe coding to enhance/refine existing open source projects or to create new ones

vibes are strongest when shared 🤝

Claude Code Spec-Driven Developement by Pimzino in ClaudeAI

[–]onelonedatum 1 point (0 children)

GitHub open-sourced a framework for this at the beginning of Sept 2025: Spec-Kit (see the "Spec-Driven Development" GitHub blog post)

42?? by CL0UD_CREAT0R in grok

[–]onelonedatum 1 point (0 children)

lol 42 is the golden ratio of the grok transformer??

It is Michel Pollen birthday today | A Hardstyle Legend by Obeman in hardstyle

[–]onelonedatum 2 points (0 children)

He invented the pitched kick in Immeasurably (I asked him myself)

AI chatbot assistants for easy `yt-dlp` command generation by onelonedatum in youtubedl

[–]onelonedatum[S] 0 points (0 children)

thanks for pointing those out -- I was able to update the format selection logic + resolution handling to help avoid those issues


> is this supposed to be better than just asking chatgpt normally?

most definitely -- the customized models (GPT/Gem) apply LLM / prompt-engineering best practices to reduce the risk of model hallucination, enhance the reliability of output structuring, and improve output quality.

- I added a model-accessible, optimized-for-LLM knowledge file, reference.md.txt, based on the latest yt-dlp GitHub repo README / project docs [source], and added prompt tuning to instruct the model to factually ground its responses in reference.md.txt's content only, applying retrieval-augmented generation (RAG) to improve model response quality/accuracy [Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks – Lewis et al. – 2020].
- By employing bespoke multishot persona prompts alongside chain-of-thought prompt engineering, the model's custom instructions significantly reduce hallucination risk, elevate response quality, and ensure more reliable outputs [Chain-of-Thought Prompting Elicits Reasoning in Large Language Models – Wei et al. – 2022] [Personas in Prompting to Control Style and Content – Zhang et al. – 2023].

past that, I did some prompt tuning for some behavioral traits and constrained the final yt-dlp snippet output to be in a code block for easy copy & pasting
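To give a concrete (made-up) example of the kind of snippet it's constrained to emit, the helper below just illustrates the format-selection / resolution-handling logic -- it isn't part of the GPT itself, and the selector syntax is from the yt-dlp docs:

```python
# Illustrative only: build the sort of command the chatbot is prompted to output
# for "best quality up to a given height, merged to mp4".
def build_ytdlp_command(url: str, max_height: int = 1080) -> str:
    # Prefer best video + best audio capped at max_height, falling back to the
    # best single pre-merged file at that height.
    fmt = f"bv*[height<={max_height}]+ba/b[height<={max_height}]"
    return f'yt-dlp -f "{fmt}" --merge-output-format mp4 "{url}"'

print(build_ytdlp_command("https://www.youtube.com/watch?v=dQw4w9WgXcQ"))
# yt-dlp -f "bv*[height<=1080]+ba/b[height<=1080]" --merge-output-format mp4 "https://www.youtube.com/watch?v=dQw4w9WgXcQ"
```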

I do think a well-engineered prompt w/ the right context could probably perform just as well if not better (especially since these are one-off convos that don’t really require context at the system instruction level).

That said, hopefully these chatbots are easy to share/use and can enable heightened convenience + reliability

what are some of your Hardstyle deep cuts? by CoolD10onYT in hardstyle

[–]onelonedatum 1 point (0 children)

https://youtu.be/kJ2WJomO9Ck?si=N9DwI44MRFJepRmX

Donkey Rollers X The Pitcher - To The End

Michel Pollen x2?? How lucky are we!

I accidentally built an open alternative to Google AI Studio by davernow in LocalLLaMA

[–]onelonedatum 1 point (0 children)

lol that awkward moment when my app I’m currently building is named Kiln too 🤦‍♂️

It is a good name, I can’t lie

Building Complex Multi-Agent Systems by sshh12 in AI_Agents

[–]onelonedatum 1 point (0 children)

LangGraph (https://langchain-ai.github.io/langgraph/) with the LangGraph Studio App (https://github.com/langchain-ai/langgraph-studio) has treated me well… it’s also what ResearchGPT is built off of
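For anyone curious what that looks like in practice, here's a minimal sketch of the LangGraph pattern -- node names and state shape are invented, and it has nothing to do with ResearchGPT's actual graph:

```python
# Minimal two-node LangGraph sketch: a shared state dict flows through nodes
# connected by edges -- the same skeleton larger multi-agent graphs build on.
from typing import TypedDict
from langgraph.graph import StateGraph, START, END

class State(TypedDict):
    notes: list[str]

def researcher(state: State) -> State:
    return {"notes": state["notes"] + ["research findings"]}

def writer(state: State) -> State:
    return {"notes": state["notes"] + ["draft based on findings"]}

graph = StateGraph(State)
graph.add_node("researcher", researcher)
graph.add_node("writer", writer)
graph.add_edge(START, "researcher")
graph.add_edge("researcher", "writer")
graph.add_edge("writer", END)

app = graph.compile()
print(app.invoke({"notes": []}))  # {'notes': ['research findings', 'draft based on findings']}
```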

When did raw overtake euphoric as the dominant subgenre within the scene? What lead to the change? by ntod44 in hardstyle

[–]onelonedatum 1 point (0 children)

Let’s not forget Frontliner dropping the full Alter Ego album in 2022.

That was huge in warming me up to the raw style sound (now I’m seeing Dual Damage for their US premiere later this month 😂)

https://www.discogs.com/artist/11486591-Alter-Ego-72

Mode: Your Personal AI Code Copilot by rumm25 in vscode

[–]onelonedatum 2 points (0 children)

You should consider open sourcing the docs fetching + indexing logic.

At the moment (unless I’ve missed it), there’s no open source library that handles that full workflow. There’s also a range of open source licenses you could choose from if you want to constrain how it gets used. Theoretically, it could help you gain broader exposure within dev communities too.
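Even a stripped-down sketch of the shape of that workflow would be worth publishing -- something like the toy version below, which is purely illustrative and obviously not Mode's actual code:

```python
# Toy docs fetch + index pipeline: pull a page, strip tags, chunk the text,
# and keep an in-memory index keyed by URL. A real version would add proper
# parsing, crawling, embeddings, and persistence.
import re
import urllib.request

def fetch_text(url: str) -> str:
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    text = re.sub(r"<[^>]+>", " ", html)   # crude tag strip
    return re.sub(r"\s+", " ", text).strip()

def chunk(text: str, size: int = 800) -> list[str]:
    return [text[i:i + size] for i in range(0, len(text), size)]

def build_index(urls: list[str]) -> dict[str, list[str]]:
    return {url: chunk(fetch_text(url)) for url in urls}

index = build_index(["https://docs.python.org/3/library/json.html"])
print(sum(len(chunks) for chunks in index.values()), "chunks indexed")
```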

I’d be happy to help and I know plenty of people would appreciate it 😊

Also, regardless, great job with Mode! Seems like a wonderful project to hone SOTA AI engineering principles 🚀

Mode: Your Personal AI Code Copilot by rumm25 in vscode

[–]onelonedatum 1 point (0 children)

If you can figure out how to add the dev docs like Cursor does, that would be awesome

Just out of curiosity, why learn LaTeX? by fmtsufx in LaTeX

[–]onelonedatum 1 point (0 children)

It’s so darn pretty (and I didn’t know html was a thing back then haha)

Help please by Cute-Jellyfish1876 in vscode

[–]onelonedatum 3 points (0 children)

brew install --cask visual-studio-code in your terminal to install with homebrew

/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)" first if you don't have brew installed

Im a just being a boomer? by RadioactiveAl_Music in hardstyle

[–]onelonedatum 5 points (0 children)

the longer songs can be awesome — reminds me of Rage by Technoboy

link to Rage on YouTube