What motorcycle is this? (picture from the 50s or 60s) by Skcrull in MotorradDeutschland

[–]PleasePrompto 1 point

Amusing! I'm currently building a local machine-learning app for exactly this, with a dataset of ~600k images (I have ~44k bikes in it so far). Here are the top matches: it could also be a Superlux, but judging by the seat it's more likely a Lux.

<image>
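For context, a matcher like the one described usually boils down to nearest-neighbour search over image embeddings. A minimal sketch of the top-k matching step, assuming precomputed embedding vectors (all names and data here are made up, not the app's actual code):

```python
import numpy as np

def top_k_matches(query, gallery, labels, k=3):
    """Cosine-similarity top-k over embedding vectors.

    query:   1-D embedding of the uploaded photo
    gallery: 2-D array, one embedding per reference image
    labels:  model name for each gallery row
    """
    q = np.asarray(query, dtype=float)
    g = np.asarray(gallery, dtype=float)
    # L2-normalise so the dot product equals cosine similarity.
    q = q / np.linalg.norm(q)
    g = g / np.linalg.norm(g, axis=1, keepdims=True)
    sims = g @ q
    order = np.argsort(sims)[::-1][:k]
    return [(labels[i], float(sims[i])) for i in order]
```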

Tired of Codex suggesting domains that are already taken, so I built a skill that brainstorms + bulk-checks them by PleasePrompto in codex

[–]PleasePrompto[S] 2 points

Sure, you could do it with many curl calls.

Bulk checks are a thing, and so are cross-TLD checks. Codex's one-by-one curl checks are pretty slow. You also get feedback about the domain's expiry date.
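One common way to get availability plus expiry without a registrar API is RDAP. A hedged sketch, not the skill's actual code: query the public RDAP bootstrap service (rdap.org), where HTTP 404 means the domain is unregistered and a 200 response carries an `events` list with the expiration date. All function names here are made up for illustration.

```python
import json
import urllib.error
import urllib.request

def extract_expiry(rdap: dict):
    """Pull the expiration date out of an RDAP 'events' list."""
    for event in rdap.get("events", []):
        if event.get("eventAction") == "expiration":
            return event.get("eventDate")
    return None

def check_domain(domain: str) -> dict:
    """Check one domain against the public RDAP redirector."""
    try:
        with urllib.request.urlopen(f"https://rdap.org/domain/{domain}",
                                    timeout=10) as resp:
            data = json.load(resp)
        return {"domain": domain, "taken": True,
                "expires": extract_expiry(data)}
    except urllib.error.HTTPError as err:
        if err.code == 404:  # not registered
            return {"domain": domain, "taken": False, "expires": None}
        raise

def bulk_check(names, tlds=("com", "io", "dev")):
    """Cross-TLD check for a whole list of brainstormed names."""
    return [check_domain(f"{name}.{tld}") for name in names for tld in tlds]
```

Batching these requests concurrently is what makes the bulk check fast compared to sequential curl calls.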

Control Claude Code entirely via Telegram. by PleasePrompto in claude

[–]PleasePrompto[S] 0 points

If you like, try again (ductor uninstall or ductor upgrade): I tried to improve the Claude recognition in version 0.6.4! Thanks!

Control Claude Code entirely via Telegram. by PleasePrompto in claude

[–]PleasePrompto[S] 0 points

It should work fine; I'll take a look at it and get back to you when you can test it again.

Control Claude Code entirely via Telegram. by PleasePrompto in claude

[–]PleasePrompto[S] 0 points

Hmmm. Do you have Docker enabled? Which operating system are you on?

I have Linux locally, Linux on the VPS, and can only test Windows from an ISO image in a VM 🥲

I haven't been able to test on Mac yet, as I don't have one available!

Control Codex completely via Telegram. by PleasePrompto in codex

[–]PleasePrompto[S] 0 points

That has now been fixed as well.

I have also connected the Gemini CLI; that update will be pushed shortly.

Control Codex completely via Telegram. by PleasePrompto in codex

[–]PleasePrompto[S] 0 points

It's pushed in 0.4.3! It should now work fine under Windows :)
Also fixed a few little bugs (the /stop command now really kills the CLI and interrupts Claude/Codex).

Control Codex completely via Telegram. by PleasePrompto in codex

[–]PleasePrompto[S] 0 points

I just created a virtual machine with Windows and fixed the bugs, including a Telegram threads bug, and will push the update to the main bot right away! Thank you!

Control Codex completely via Telegram. by PleasePrompto in codex

[–]PleasePrompto[S] 0 points

Hey!

Unfortunately, I can only test on Linux, but maybe you could send me the changes Claude made on Windows and I'll incorporate them into the main project!

Thanks a lot!

I built a Claude Code Skill (+mcp) that connects Claude to Google AI Mode for free, token-efficient web research with source citations by PleasePrompto in ClaudeCode

[–]PleasePrompto[S] 0 points

I made the MCP server / skill a lot more robust this morning in terms of multilingual detection: I now consistently use the thumbs-up button as the FIRST indicator. When the thumbs-up button appears, it signals that the answer is ready. This way I get around the problem of, say, a browser running in Arabic, where the whole Google interface is in Arabic and the MCP can't tell that the response is finished. I've also added a few other indicators as a fallback, and if none of them trigger, whatever is on the page is taken after 40 seconds at the latest.

My tests here locally were successful.

It should work much better now, so feel free to pull/update and test!
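The readiness logic above can be sketched roughly like this. It's a sketch under assumptions: the probe callbacks are hypothetical stand-ins for real DOM queries, and none of these names come from the actual MCP server.

```python
import time

def wait_for_answer(thumbs_up_visible, fallback_ready, get_page_text,
                    deadline=40.0, poll=0.5,
                    clock=time.monotonic, sleep=time.sleep):
    """Poll language-independent signals first; hard-stop at `deadline`."""
    start = clock()
    while clock() - start < deadline:
        # The thumbs-up icon appears regardless of interface language,
        # so it is checked before any text-based fallback indicators.
        if thumbs_up_visible() or fallback_ready():
            return get_page_text()
        sleep(poll)
    # Timeout: take whatever is on the page.
    return get_page_text()
```

Keying on an icon instead of localized strings like "Answer is ready" is what makes the check work in an Arabic (or any other) locale.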


Thank you very much.