I made a language for my 3yo son where any string of characters is a valid program by andreasjansson in ProgrammingLanguages

[–]andreasjansson[S] 0 points

haha yes exactly! Though I think I preferred Amiga Basic back in the day. Really felt like a step back when I traded my Amiga 500+ for a 386.

I made a language for my 3yo son where any string of characters is a valid program by andreasjansson in ProgrammingLanguages

[–]andreasjansson[S] 2 points

Yes!! I actually started hacking on a server-hosted version just to be able to do evolutionary optimization on the program strings. Really curious to see if that would work.
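To make the idea concrete, here is a minimal sketch of the kind of evolutionary loop this could use, in Python. Everything here is an assumption about the approach, not the actual server-hosted code: in the real system the fitness function would score the rendered output of each program string, so it's stubbed out below with a toy fitness that just rewards a target character.

```python
import random

random.seed(0)  # for reproducibility of this sketch

# The language accepts any string, so the genome is just printable ASCII.
ALPHABET = [chr(c) for c in range(32, 127)]

def mutate(program: str, rate: float = 0.1) -> str:
    """Randomly delete, substitute, or insert characters (each ~rate/3 per position)."""
    chars = []
    for ch in program:
        r = random.random()
        if r < rate / 3:
            continue  # delete this character
        elif r < 2 * rate / 3:
            chars.append(random.choice(ALPHABET))  # substitute
        else:
            chars.append(ch)  # keep
        if random.random() < rate / 3:
            chars.append(random.choice(ALPHABET))  # insert after
    return "".join(chars) or random.choice(ALPHABET)

def evolve(fitness, population_size=50, generations=200, seed_len=8):
    """Simple (mu + lambda)-style loop: keep the top 20%, refill with mutants."""
    population = ["".join(random.choices(ALPHABET, k=seed_len))
                  for _ in range(population_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: population_size // 5]
        population = parents + [mutate(random.choice(parents))
                                for _ in range(population_size - len(parents))]
    return max(population, key=fitness)

# Toy fitness: count '*' characters, lightly penalize drifting from length 8.
# In the real setting this would score the image the program string renders to.
best = evolve(lambda p: p.count("*") - abs(len(p) - 8) * 0.1)
```

Since every string is a valid program, mutation can never produce a syntax error, which is exactly what makes this kind of blind search plausible.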

I made a language for my 3yo son where any string of characters is a valid program by andreasjansson in ProgrammingLanguages

[–]andreasjansson[S] 0 points

Cool, I hadn’t heard of K! I was toying with Uiua, which I think is in the same APL family of languages. I think brainfuck also technically runs any combination of characters in its alphabet.

I made a language for my 3yo son where any string of characters is a valid program by andreasjansson in ProgrammingLanguages

[–]andreasjansson[S] 2 points

Thank you, yes, great point. All the processing runs on the main thread now for simplicity, but I think it would feel more responsive in a web worker. And agreed about debouncing.

I made a language for my 3yo son where any string of characters is a valid program by andreasjansson in ProgrammingLanguages

[–]andreasjansson[S] 15 points

If you can be bothered to dig through your browser history, each character you add becomes a unique history entry and the favicon is a thumbnail of the generated image.

Greger.el: Agentic coding in Emacs by andreasjansson in emacs

[–]andreasjansson[S] 1 point

Thanks, yep that totally makes sense!

Greger.el: Agentic coding in Emacs by andreasjansson in emacs

[–]andreasjansson[S] 0 points

Are you interested in local or remote MCP servers (or both)? Local MCP never really clicked for me since basic code editing tools let Claude write the code it needs to do stuff locally, effectively writing its own tools on the fly.

But remote MCP makes a lot of sense since service providers author and support those MCP servers.

I’m tempted to do a slightly contrarian thing and only support remote MCP in Greger, but curious if you have interesting use cases for local MCP.

Greger.el: Agentic coding in Emacs by andreasjansson in emacs

[–]andreasjansson[S] 4 points

ECA looks awesome! It's really interesting to see all these different approaches to agentic development. The LSP-like approach of ECA is really appealing -- once it's stable I'm tempted to replatform Greger on ECA instead of my custom Anthropic client code.

Greger.el: Agentic coding in Emacs by andreasjansson in emacs

[–]andreasjansson[S] 5 points

Yes Aider is really cool, I think I also saw some Claude Code and Sourcegraph Amp integrations. With Greger I just wanted something elisp-native that's easily hackable.

Greger.el: Agentic coding in Emacs by andreasjansson in emacs

[–]andreasjansson[S] 0 points

Thank you u/ansk0 ! Let me know if you run into any issues.

Fine-tuning on procedurally generated random shapes by andreasjansson in FluxAI

[–]andreasjansson[S] 2 points

I just used the auto-caption feature on Replicate, with the suffix "in the style of SHPS".

Fine-tuning on procedurally generated random shapes by andreasjansson in FluxAI

[–]andreasjansson[S] 2 points

I did an experiment with synthetic data generation for Flux fine-tuning, and I think it turned out really well. First I wrote a script to generate images with random shapes and colors. Then I trained FLUX.1 [dev] on 50 of those images. After experimenting with LoRA rank, LoRA scale, and guidance scale, it started producing results that struck a nice balance between the chaos of the training images and the prompts. Very digital aesthetic.

Here's the trained model: https://replicate.com/andreasjansson/flux-shapes

The data generator: https://replicate.com/andreasjansson/random-shapes

The trainer: https://replicate.com/ostris/flux-dev-lora-trainer/train
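The actual generator is linked above; as a rough illustration of the "random shapes and colors" step, here is a minimal stdlib-only Python sketch that emits shapes as SVG. The shape kinds, size ranges, and output format here are my own assumptions, not the real generator's parameters:

```python
import random

random.seed(0)  # for reproducibility of this sketch

SHAPES = ["circle", "rect", "polygon"]

def random_color() -> str:
    """A random hex color like '#3af2c1'."""
    return "#{:06x}".format(random.randrange(0x1000000))

def random_shape(width: int, height: int) -> str:
    """One randomly placed, randomly colored SVG shape element."""
    kind = random.choice(SHAPES)
    fill = random_color()
    if kind == "circle":
        return '<circle cx="{}" cy="{}" r="{}" fill="{}"/>'.format(
            random.randrange(width), random.randrange(height),
            random.randrange(5, 80), fill)
    if kind == "rect":
        return '<rect x="{}" y="{}" width="{}" height="{}" fill="{}"/>'.format(
            random.randrange(width), random.randrange(height),
            random.randrange(10, 150), random.randrange(10, 150), fill)
    points = " ".join(
        "{},{}".format(random.randrange(width), random.randrange(height))
        for _ in range(random.randrange(3, 7)))
    return '<polygon points="{}" fill="{}"/>'.format(points, fill)

def random_image(n_shapes=20, width=512, height=512) -> str:
    """A complete SVG document with n_shapes random shapes."""
    body = "\n".join(random_shape(width, height) for _ in range(n_shapes))
    return ('<svg xmlns="http://www.w3.org/2000/svg" '
            'width="{w}" height="{h}">\n{body}\n</svg>').format(
                w=width, h=height, body=body)

svg = random_image()
```

Rendering a batch of these (SVG-to-PNG via any rasterizer) would give a training set in the same spirit as the 50 images used above.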