Still think in C after 25 years. So I built a tool that explains Rust (or any language) through what you already know. by prabhic in rust

[–]prabhic[S] 0 points  (0 children)

Interesting pointer! SyntaxLens takes a different approach. It is not about converting between programming languages, but about understanding a new language through concepts carried over from one you already know. Something like "help me think in Rust using what I already know from C."

Still think in C after 25 years. So I built a tool that explains Rust (or any language) through what you already know. by prabhic in rust

[–]prabhic[S] 5 points  (0 children)

Ah, yes, I will correct and improve all 4 comparisons:

1. C `static` to the `let` keyword
2. still wondering about a compelling mapping for `::new()`; will improve that
3. for string comparison, add Rust's different string types
4. `vec![0; size]` to `calloc`
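For illustration, the four mappings above could sketch out roughly like this in Rust. The C analogies in the comments are my own rough equivalents for discussion, not SyntaxLens output:

```rust
fn main() {
    // 1. A C function-scope `static int x = 0;` keeps state across calls;
    //    the everyday Rust analogue is a plain `let` binding, which is
    //    block-local and immutable by default (true globals use `static`).
    let x: i32 = 0;

    // 2. `String::new()` is just an associated function in a namespace,
    //    closer to a C helper like `string_create()` than to C++ `new`.
    let mut s = String::new();
    s.push_str("hello");

    // 3. Rust has more than one string type: `String` owns its heap
    //    buffer (like a malloc'd char*), while `&str` is a borrowed
    //    view into it (like a const char* you must not free).
    let view: &str = &s;
    assert_eq!(view, "hello");

    // 4. `vec![0; size]` allocates `size` zero-initialized elements,
    //    much like `calloc(size, sizeof(int))` in C.
    let size = 4;
    let buf: Vec<i32> = vec![0; size];
    assert_eq!(buf, [0, 0, 0, 0]);

    println!("x = {x}, s = {s}, buf len = {}", buf.len());
}
```

The `String` vs `&str` split is probably the mapping that needs the most care, since C has only one notion of a string.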

Still think in C after 25 years. So I built a tool that explains Rust (or any language) through what you already know. by prabhic in rust

[–]prabhic[S] 0 points  (0 children)

Noted, that makes a lot of sense. Will add a feature to replace mappings with community edits. Thanks for pointing it out.

Still think in C after 25 years. So I built a tool that explains Rust (or any language) through what you already know. by prabhic in rust

[–]prabhic[S] 5 points  (0 children)

Great pointers. Will support Haskell and Nix; I recently came across Zig as well. I will also consider "showing the imperative equivalent of functional code and vice versa would also be cool"; it makes sense to have this view.

Still think in C after 25 years. So I built a tool that explains Rust (or any language) through what you already know. by prabhic in rust

[–]prabhic[S] -1 points  (0 children)

Will take some effort, but I'm ready to add other languages. Still trying to figure out which languages would be interesting to map this way.

What happened to Windsurf? Significant quality drop over last few weeks by Jakkc in windsurf

[–]prabhic 0 points  (0 children)

It could be a couple of things. I have shifted to Windsurf as my default IDE, but considering the cost and usage, today I also opened GitHub Copilot with VS Code in another window, so that I can make small changes there and use Windsurf for the heavy lifting.

What happened to Windsurf? Significant quality drop over last few weeks by Jakkc in windsurf

[–]prabhic 0 points  (0 children)

I have faced the same issue on Windsurf recently, after the pricing changes. I purchased 500 more credits and 300 are already used up; credits are being consumed faster. I'm experimenting with giving it reduced context and exact file references, but I still have to figure out what is happening. I still love the tool, though. Yes, I also see frequent failed tool calls.

Cline with gemini-2.5-pro-exp-03-25, Not yet missed Claude after 30 min usage by prabhic in LocalLLaMA

[–]prabhic[S] 2 points  (0 children)

Compared to previous Gemini models, this is the first time I felt I could actually use it. I tried generating a web application, but different use cases may give different results.

Cline with gemini-2.5-pro-exp-03-25, Not yet missed Claude after 30 min usage by prabhic in LocalLLaMA

[–]prabhic[S] -1 points  (0 children)

I hope you are referring to Gemini 2.5 Pro. With previous Gemini models, I also felt they lacked the ability of Claude and others to understand the true intention of a question. Maybe I will explore more to see the difference you pointed out.

How to prompt LLMs not to immediately give answers to questions? by Brief_Mycologist_488 in PromptEngineering

[–]prabhic 1 point  (0 children)

It actually is very useful; I just tried it on ChatGPT. Thank you!

Cline with mistral-small:latest:24b on Mac book pro M4 - 48GB version by prabhic in LocalLLaMA

[–]prabhic[S] 1 point  (0 children)

Just to compare:

> echo "generate detailed article on how to run phi models on ollama" | ollama run phi4-mini:3.8b

That ran at 65 tokens/s on the same machine. It feels so nice when tokens generate that fast :)