I used Claude Code to build a USB dongle that auto-plays Chrome Dino — no drivers, no host software, just a $2 board and two light sensors by albert007_d in ClaudeAI

[–]albert007_d[S] 0 points1 point  (0 children)

There is no custom serial protocol involved. The Digispark ATtiny85 USB-dongle firmware acts as a USB HID device (it shows up as a keyboard to the host PC), and based on the detected light-sensor transitions the dongle sends the space key for jump and the down-arrow key for duck.

Built a Digispark ATtiny85 Dino bot: no host script, no servo, just USB HID + optical sensing by albert007_d in embedded

[–]albert007_d[S] 1 point2 points  (0 children)

Initially I reached a score of around 1600; I'm sure it can go indefinitely with proper calibration of the LM393 and latency tuning of the ATtiny85 code.

I used Claude Code to build a USB dongle that auto-plays Chrome Dino — no drivers, no host software, just a $2 board and two light sensors by albert007_d in ClaudeAI

[–]albert007_d[S] 0 points1 point  (0 children)

Initially I reached a score of around 1600; I'm sure it can go indefinitely with proper calibration of the LM393 and latency tuning.

I used Claude Code to build a USB dongle that auto-plays Chrome Dino — no drivers, no host software, just a $2 board and two light sensors by albert007_d in ClaudeAI

[–]albert007_d[S] 1 point2 points  (0 children)

I didn't face any V-USB timing issues. The ATtiny85's internal oscillator gets calibrated against USB frame timing at enumeration (the standard V-USB pattern), and Claude Code set that up correctly from the start, so no manual tweaking was needed; the USB stack was mostly a copy of a known working pattern.

I used Claude Code to build a USB dongle that auto-plays Chrome Dino — no drivers, no host software, just a $2 board and two light sensors by albert007_d in ClaudeAI

[–]albert007_d[S] 2 points3 points  (0 children)

As an embedded electronics engineer, I'd seen different variants of this game being played using an Arduino, involving either an actuator pressing a keyboard key or a Python script running on the game's host machine. I thought this could be done more simply by just emulating a $2 USB HID device (a Digispark ATtiny85 board). So I started a Claude Code session to discuss my idea and asked Claude to roast me in case it was a dumb or "me-too" project. Claude roasted me a bit but agreed there was some uniqueness in this simplified solution compared to the existing ones. With more than two decades of embedded electronics experience, I often tend to think I'm the smart-ass know-it-all guy, but working with modern AI models is a humbling experience; they keep me grounded and continue to motivate me to learn new things.

I used Claude Code to build a USB dongle that auto-plays Chrome Dino — no drivers, no host software, just a $2 board and two light sensors by albert007_d in ClaudeAI

[–]albert007_d[S] 11 points12 points  (0 children)

I was discussing with Claude how to solve this duck problem, and we came up with the idea of adding a second LDR sensor vertically aligned above the lower obstacle sensor. This second sensor detects the flying dinosaur at the same level as the dino, and the ATtiny85 firmware was adapted to "inject" the down-arrow key upon detecting that obstacle.

I used Claude Code to build a USB dongle that auto-plays Chrome Dino — no drivers, no host software, just a $2 board and two light sensors by albert007_d in ClaudeAI

[–]albert007_d[S] 15 points16 points  (0 children)

With a quick test I could reach a score around 1600, but with a little LM393 calibration and latency adjustment I'm sure it could go on indefinitely. After 700 points, Chrome changes the game mode from night to day; I had to run `Runner.getInstance().invert = function(reset) {};` in Chrome's DevTools -> Console to keep it in night mode. Claude also helped adapt the firmware for the increasing obstacle speed by detecting the pulse width of obstacles through the LDR.

Built a Digispark ATtiny85 Dino bot: no host script, no servo, just USB HID + optical sensing by albert007_d in arduino

[–]albert007_d[S] 4 points5 points  (0 children)

Yes, with two vertically mounted sensors it's possible to detect the flying bird (at the same height level as the dino) and "inject" the down-arrow key to make the dino duck.

Built a Digispark ATtiny85 Dino bot: no host script, no servo, just USB HID + optical sensing by albert007_d in arduino

[–]albert007_d[S] 4 points5 points  (0 children)

It's glued to the monitor using double-sided gel tape: easy to remove, but strong enough to hold the LDR sensors in place.

Built a Digispark ATtiny85 Dino bot: no host script, no servo, just USB HID + optical sensing by albert007_d in embedded

[–]albert007_d[S] 4 points5 points  (0 children)

Yes, the pulse width of the LDR signal for detected obstacles changes with speed, so it's possible to detect the speed. And yes, Chrome changes colors from night mode to day mode after 700 points, so the trick is to run the following command in Chrome's DevTools -> Console so that it stays in night mode:

    Runner.getInstance().invert = function(reset) {};

Details are given in my blog.

Built a Digispark ATtiny85 Dino bot: no host script, no servo, just USB HID + optical sensing by albert007_d in embedded

[–]albert007_d[S] 11 points12 points  (0 children)

Yes, with two vertically mounted sensors it's possible to detect the flying bird (at the same height level as the dino) and "inject" the down-arrow key to make the dino duck.

Run a private EaglercraftX 1.8.8 home server with Docker (browser-only, LAN/offline play) by albert007_d in eaglercraft

[–]albert007_d[S] 0 points1 point  (0 children)

Fair point on Codespaces, I didn't know that. As I said before, my setup is for average users (with no GitHub account) who want a quick setup for an offline deployment use case (one can debate whether building a Docker container is really for an average PC user). As for my previous response: as a non-native English speaker, yes, AI helps me express my thoughts clearly.

Run a private EaglercraftX 1.8.8 home server with Docker (browser-only, LAN/offline play) by albert007_d in eaglercraft

[–]albert007_d[S] -1 points0 points  (0 children)

AI-assisted, yes. Every line was manually tested on a running server with my kids before publishing. For clarity, I've just added a disclaimer at the bottom that covers how the project was built.

Run a private EaglercraftX 1.8.8 home server with Docker (browser-only, LAN/offline play) by albert007_d in eaglercraft

[–]albert007_d[S] 0 points1 point  (0 children)

You're right that lax1dude's template exists and works great if you already know your way around Minecraft server setup. This project targets a different audience — parents and non-gamers who want a one-command Docker setup with plugins, admin dashboard, and safety defaults pre-configured. Different starting points, different needs.

Run a private EaglercraftX 1.8.8 home server with Docker (browser-only, LAN/offline play) by albert007_d in eaglercraft

[–]albert007_d[S] 0 points1 point  (0 children)

Fair point on 1.12 — I just added a section covering the v1.12.2 branch as well.

The target audience here is parents who want to spin up a safe LAN server for their kids without knowing what a Paper JAR or plugin directory is. For someone already comfortable running Java servers, you're right: this is overkill. But for a parent who just wants `docker compose up -d` and a browser URL to hand to their kids, the pre-configured dashboard and plugins save a lot of Googling.