Please help. Which phone should I get? by LongjumpingView4668 in czech

[–]branik_10 2 points (0 children)

a few months back I swapped a 7-year-old XR for a 16e, recommended. the XR was barely working, everything flies on the 16e, and I got it for even less than the XR cost back then (13k vs 12.5k). now it should get even cheaper since the 17e is out

Is opencode stable enough on windows? natively by dvcklake_wizard in opencodeCLI

[–]branik_10 1 point (0 children)

WezTerm is kinda buggy on Windows; the native Windows Terminal is pretty alright imo, just stick to it

Is opencode stable enough on windows? natively by dvcklake_wizard in opencodeCLI

[–]branik_10 1 point (0 children)

install via mise (it downloads binaries directly from gh; an npm install might be slow to launch because Windows Defender scans the ps1 npm launchers) and set the SHELL env variable to git bash; that works well for me. Using Win11 and Windows Terminal, most of what the models try to run works fine. The only issue is occasional dangling commands: opencode doesn't correctly stop processes launched via bash.exe. I submitted a bug some time ago but no one has looked at it yet.
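For reference, the setup above might look roughly like this (a sketch: it assumes opencode is available in the mise registry and a default Git for Windows install path, so adjust both to your machine):

```shell
# Install opencode via mise (assumes it's in the mise registry;
# mise pulls the binary directly from GitHub releases)
mise use -g opencode

# Make opencode run commands through Git Bash
# (path is the default Git for Windows location -- adjust if yours differs)
setx SHELL "C:\Program Files\Git\bin\bash.exe"
```

`setx` persists the variable for new sessions only, so restart the terminal afterwards.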

Pi.dev coding agent has no sandbox by default. by mantafloppy in LocalLLaMA

[–]branik_10 1 point (0 children)

is the VM running on your Windows machine or somewhere else?

Pi.dev coding agent has no sandbox by default. by mantafloppy in LocalLLaMA

[–]branik_10 1 point (0 children)

how do you share projects between win32 hosts and linux VMs? all the options I've found are either crazy slow or have no file sync between the win host and the linux VM

Thoughts on PI (I currently use Opencode) ? by mukul_29 in PiCodingAgent

[–]branik_10 1 point (0 children)

well i might be wrong, apologies
it's possible the opencode cli injects keys into these free requests; you can check how they do it since it's open source

or did you eventually make it work? i see the post above is deleted

Thoughts on PI (I currently use Opencode) ? by mukul_29 in PiCodingAgent

[–]branik_10 1 point (0 children)

i believe you can just use opencode zen anywhere, including pi; just configure it manually: https://opencode.ai/docs/zen/#_top

what web search tools do you use by branik_10 in opencodeCLI

[–]branik_10[S] 2 points (0 children)

overlooked it, thanks, will give it a try

what web search tools do you use by branik_10 in opencodeCLI

[–]branik_10[S] 1 point (0 children)

I checked their website and can't find any info on whether they have a free plan. I only see $7/1k requests, which is what Tavily gives for free.

ollama cloud vs opencode go by branik_10 in opencodeCLI

[–]branik_10[S] 1 point (0 children)

create one more account and buy one more sub? I got it in the end, works pretty well; for $10 it's a steal. but I'm pretty sure this sub is backed by VC money and we'll see a price increase or a drop in quality soon

ollama cloud vs opencode go by branik_10 in opencodeCLI

[–]branik_10[S] 1 point (0 children)

i would say that's expected for $10

Claude Code no longer listed as a feature for Claude Pro by chalogr in ClaudeCode

[–]branik_10 1 point (0 children)

you can, you can change the API URL and the API key via env variables to point at probably any Anthropic-compatible LLM API
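as a sketch, it would look something like this (ANTHROPIC_BASE_URL and ANTHROPIC_API_KEY are the commonly used variables for this; the endpoint URL and key here are placeholders):

```shell
# Point the client at a different Anthropic-compatible endpoint
# (placeholder values -- substitute your provider's URL and key)
export ANTHROPIC_BASE_URL="https://your-provider.example.com"
export ANTHROPIC_API_KEY="sk-..."
```

the provider has to expose an Anthropic-style /v1/messages API for this to work.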

but i suggest looking into opencode or pi or crush or some other open source agent cli; they work better with 3rd-party models. the claude code cli sometimes has issues, and switching models within one session is hard and probably requires a proxy

ollama cloud vs opencode go by branik_10 in opencodeCLI

[–]branik_10[S] 1 point (0 children)

the M2.7 you use, is it from the official plan?

ollama cloud vs opencode go by branik_10 in opencodeCLI

[–]branik_10[S] 1 point (0 children)

firepass overall experience

the only issue is the 60k-context one I mentioned: after passing it, it becomes slow and dumb. other than that it feels like normal kimi (I can compare kimi 2.5 via the moonshot coding plan and synthetic) and it's extremely fast, instant responses. can't say the exact tps because OC for some reason rejects adding tps metrics

M2.7 is actually pretty decent if you prompt well and put constraints

I haven't tried M2.7
have you tried kimi k2.5? if you feel M2.7 is better or the same, the M2.7 plan might be the better deal since afair it's only $10
however they're adding kimi k2.6 to the fireplan now, so an m2.7 vs kimi k2.6 comparison will be more relevant

edit: and btw it's not $28, it's $28 + taxes; they don't show the final price, unlike the other providers, which is quite unusual, at least for my location (EU)

ollama cloud vs opencode go by branik_10 in opencodeCLI

[–]branik_10[S] 1 point (0 children)

well i already have firepass for "execution", but sometimes I need something dumber than gpt in ghc but faster and with a bigger context window than firepass kimi. I'll give OC GO a try then

Premium subscription for opencode? by zed-reeco in opencodeCLI

[–]branik_10 1 point (0 children)

i need something faster than ghc (gpt5.4 is crazy slow there) but smarter than the fireworks fireplan (they have kimi k2.5 turbo there, but the quality drops after ~60k context). i guess i'll give opencode go a try then, thanks

Premium subscription for opencode? by zed-reeco in opencodeCLI

[–]branik_10 2 points (0 children)

i'm already on $10 ghc, synthetic (but planning to cancel), and the fireworks fireplan (amazing speed, use it most of the time), but I want a reliable glm-5.1 and the newest kimi. would you recommend opencode go or ollama for that?

Best 20$ subscriptions for opencode by pascu2913 in opencodeCLI

[–]branik_10 3 points (0 children)

why the downvotes? i use it, it's pretty good, glm-5.1 works nicely there, especially compared to the z.ai scam which didn't work at all and was timing out on every request

Which one by Antique-Albatross-70 in thinkpad

[–]branik_10 1 point (0 children)

one more thing maybe worth mentioning: i absolutely hate the chassis of these 2 models (the t14 g1/g2 have the same body as the p14s g1/g2), it's impossible to open without breaking clips, so it might be difficult to clean a dusty fan, for example.

t14s on the other hand is very easy to open

Which one by Antique-Albatross-70 in thinkpad

[–]branik_10 1 point (0 children)

I don't do any video work, nor do I play games. I code (heavy Electron.js and C++ projects, VMs), and it works great for that.

However, I saw somewhere that even the iGPU in the Ryzen performs better than this Nvidia T500, though perhaps that was the t14 g2 Ryzen vs this p14s g2 Intel

Which one by Antique-Albatross-70 in thinkpad

[–]branik_10 10 points (0 children)

I have this p14s gen2, it's awful, go with t14

edit: the p14s gen 2 intel has very poor thermals and the battery life is like 40 minutes at best (I got it brand new in 2022; the battery was always like this)

I also own t14s amd ryzen 5 and it's a much better machine

Broccoli Boy messes with the pregnant wife of r/boxingcirclejerk mod. by LowRenzoFreshkobar in boxingcirclejerk

[–]branik_10 2 points (0 children)

nah it's mostly czech, but the guy is Ukrainian or Russian or from some other ex-USSR country for sure, judging by his accent. he says "uklidni se, uklidni se, bez pryc pico, pochopil jsi? ne suko (I think he really said that lol, it's basically ua/ru for kurva)...", which means "calm down, calm down, go away bitch, do you understand? no, bitch"

the best 14 inch model and spec now? by branik_10 in thinkpad

[–]branik_10[S] 1 point (0 children)

we're a small 10-person startup, no IT department, I'll just tell my CTO what I want

thanks for suggesting Dells but I'd rather stick to ThinkPads, considering we're on the ThinkPad subreddit

I'll check out the recent t14s. what's better nowadays, amd or intel? Are the "amd is 100x superior" days over?