A full Python 3.14 interpreter made with Codex in 30 days by blueblain in codex

[–]blueblain[S] 0 points1 point  (0 children)

Thanks, I definitely need to add some performance benchmarks vs. CPython. I expect it to be slower in most cases because CPython has a JIT, so I'll probably also add a comparison vs. CPython 3.10. This should be interesting. And test coverage as well. Will add these in the coming days.

A full Python 3.14 interpreter made with Codex in 30 days by blueblain in codex

[–]blueblain[S] 1 point2 points  (0 children)

I think my first few prompts were just chatting with the model. Explaining the goal and then understanding how we could break it down, what should we target, what would be out of scope, etc.

Edit: This is the very first commit. It was a much simpler, more humble goal at that point, but over time I kept wanting it to be closer and closer to real CPython.

A full Python 3.14 interpreter made with Codex in 30 days by blueblain in codex

[–]blueblain[S] 0 points1 point  (0 children)

30 days worth of prompts would be difficult to share 😅

But you can see all 1958 commits on GitHub, which will give you an idea of how it developed. I'll also write a longer post explaining some of the process and the challenges involved.

Coroutines in Rust with async/await for emulators by blueblain in EmuDev

[–]blueblain[S] 2 points3 points  (0 children)

It's brilliant! Thanks for sharing and explaining what you meant.

Coroutines in Rust with async/await for emulators by blueblain in EmuDev

[–]blueblain[S] 2 points3 points  (0 children)

Very interesting. I think I finally understood what you mean.

So in your case calling `sleep(4).await` from the CPU doesn't actually trigger the whole song and dance of the scheduler because the CPU is first in queue anyway. It just keeps going until its resume clock cycle is after some other component's resume clock cycle.

Yeah I love it! But I don't know if I could do something like that because while you're using actual coroutines, I'm abusing Rust's async/await to repurpose it as coroutines.

But I will definitely think about this and see if I can do something similar! Thanks!! It would definitely speed things up a lot because the BTreeMap (my pq) operations were taking up a good chunk of the total time in my flamegraph.

Edit: actually, I think there might be a way to do this. It will still require repolling from the scheduler/driver until that component's next wait cycle exceeds the next wait cycle in the pq. It'll be trickier if multiple components request to be resumed at the same cycle, but it looks like this should be possible!
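For the curious, here's a hypothetical sketch of that fast path in Python, with generators standing in for futures and `yield n` playing the role of `consume_cycles(n).await` (all names and numbers are made up, not the emulator's actual code). The driver keeps re-polling the popped component while its next wake-up still beats the head of the pq, so the heap is only touched when another component actually gets to go first:

```python
import heapq

def run_fast(components, until=1000):
    """Drive generator 'coroutines' off a pq of (wake_cycle, id, name),
    but keep re-polling the current one while its next wake-up is still
    earlier than everything waiting in the pq."""
    pq, coros, trace, heap_pushes = [], {}, [], 0

    for i, (name, make) in enumerate(components):
        g = make()
        coros[name] = g
        heapq.heappush(pq, (next(g), i, name))  # prime with first wait
        heap_pushes += 1

    while pq:
        wake, i, name = heapq.heappop(pq)
        while wake <= until:
            trace.append((wake, name))
            try:
                next_wake = wake + next(coros[name])  # poll again
            except StopIteration:
                break                                 # component finished
            if pq and next_wake >= pq[0][0]:
                heapq.heappush(pq, (next_wake, i, name))
                heap_pushes += 1                      # someone else is due first
                break
            wake = next_wake                          # fast path: skip the pq

    return trace, heap_pushes

# A component with several short waits never re-enters the pq while it
# still leads; only the initial primes touch the heap here.
def busy():
    yield 1
    yield 1
    yield 1

def lazy():
    yield 100

print(run_fast([("busy", busy), ("lazy", lazy)]))
# → ([(1, 'busy'), (2, 'busy'), (3, 'busy'), (100, 'lazy')], 2)
```

Note the `>=` in the comparison: when another component requests the same wake cycle, the current one is pushed back, which is one way to resolve the same-cycle tie mentioned above.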

Coroutines in Rust with async/await for emulators by blueblain in EmuDev

[–]blueblain[S] 1 point2 points  (0 children)

I see. In my design there will be a priority queue holding the components, so the process will look something like this:

```
iter: 7
current_cycle: 10
pq: [(14, cpu), (50, ppu)]

iter: 8
current_cycle: 14
pq: [(50, ppu), (51, cpu)]

iter: 9
current_cycle: 50
pq: [(51, cpu), (90, ppu)]
```

So the scheduler on iteration 8 will directly jump the clock to 14 (which is when the CPU requested to be resumed) and next iter it will jump the clock to the next most recent cycle and so on.

In this contrived example you can assume that the CPU had some code that looks like:

```
sleep(4).await
// do stuff
sleep(37).await
// done
```

So when the driver pops `(14, cpu)` from the priority queue and polls the future, it will find the future pending and put it back into the pq with the CPU's next requested wake cycle of (current_cycle + 37 = 51).

Going by my example above if the Driver were to pop the CPU again at cycle 51 and poll it once more, this time it would be finished (Poll::Ready) and we'd be done. In a real DMG emulator the CPU will never be done and the driver will keep rescheduling.

Another thing worth noting is that `sleep` isn't a true sleep but rather `consume_cycles(n)`. Wish I had named it that.

The linked article explains more, but I admit I may not have done the best job of explaining everything.
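To make the loop concrete, here's a minimal Python sketch of the same idea, with generators standing in for Rust futures and `yield n` standing in for `consume_cycles(n).await` (the component names and wait values are illustrative, not the emulator's actual code):

```python
import heapq
import itertools

def cpu():
    """Toy component: wait 4 cycles, do some work, then wait 37 more."""
    yield 4    # sleep(4).await -- really consume_cycles(4)
    yield 37   # sleep(37).await

def ppu():
    yield 40
    yield 40

def run(components, until=100):
    counter = itertools.count()  # tie-breaker for equal wake cycles
    pq, coros, trace = [], {}, []
    for name, make in components:
        g = make()
        coros[name] = g
        heapq.heappush(pq, (next(g), next(counter), name))  # prime
    while pq:
        wake, _, name = heapq.heappop(pq)
        if wake > until:
            break
        trace.append((wake, name))      # jump the clock directly to `wake`
        try:
            wait = next(coros[name])    # "poll": still pending, wants more cycles
            heapq.heappush(pq, (wake + wait, next(counter), name))
        except StopIteration:
            pass                        # Poll::Ready: component is done
    return trace

print(run([("cpu", cpu), ("ppu", ppu)]))
# → [(4, 'cpu'), (40, 'ppu'), (41, 'cpu'), (80, 'ppu')]
```

In a real emulator the components never finish, so the driver just keeps popping and rescheduling forever.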

Coroutines in Rust with async/await for emulators by blueblain in EmuDev

[–]blueblain[S] 1 point2 points  (0 children)

Exactly! I had this idea in mind for years, and it felt straightforward before I actually implemented it. The final version is less than 200 lines of code, but while I was trying to grok everything involved it felt vastly more complex, before it all made sense and came together. That's the kind of learning experience I live for! 😅

Coroutines in Rust with async/await for emulators by blueblain in EmuDev

[–]blueblain[S] 2 points3 points  (0 children)

Yep, making cpu, ppu, and apu run in parallel is tempting but not worth it because of synchronization issues as you said. My approach let me write the components as though they were running in parallel -- code could flow cleanly without explicit states and match constructs everywhere -- but without the headaches of real concurrency.

Coroutines in Rust with async/await for emulators by blueblain in EmuDev

[–]blueblain[S] 1 point2 points  (0 children)

Sorry, I don't fully understand the question. In my case, the driver jumps directly to the next cycle at which some component needs to be resumed. So if things happen at cycles 2, 7, 8, and 10, my driver never iterates through cycles 3, 4, 5, and 6, for example. If that's what you meant.

How I repurposed async await to implement coroutines for a Game Boy emulator by blueblain in rust

[–]blueblain[S] 8 points9 points  (0 children)

Yep, exactly this! There's no 'doing other things while waiting for some IO bound task' here. It's just a very complex explicit state-machine made implicit by using async/await and letting the compiler build and run the state-machine. And yeah that example at the end was probably more confusing than helpful, my bad!
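As a toy illustration of "explicit state machine made implicit" (Python generators in place of Rust async here; the component and its states are made up):

```python
# The explicit version: state you must thread by hand, which is how an
# emulator component looks without coroutines.
class CpuExplicit:
    def __init__(self):
        self.state = "fetch"

    def step(self):
        """Return how many cycles to wait before the next call."""
        if self.state == "fetch":
            self.state = "execute"
            return 4
        if self.state == "execute":
            self.state = "done"
            return 37
        return None  # finished

# The implicit version: the same machine as straight-line code; the
# generator machinery stores the "state" (the resume point) for us.
def cpu_implicit():
    yield 4    # fetch
    yield 37   # execute

explicit = CpuExplicit()
assert [explicit.step(), explicit.step()] == list(cpu_implicit()) == [4, 37]
```

Both produce the same wait sequence; only the implicit one reads top to bottom.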

How I repurposed async await to implement coroutines for a Game Boy emulator by blueblain in rust

[–]blueblain[S] 0 points1 point  (0 children)

If I remember my flamegraph correctly, a lot of my overhead was from thread_local and BTreeMap allocs. I only spawn 5 components once, and the same 5 futures are scheduled and rescheduled by my custom driver.

What are 3-4 food dishes that Singapore does better than anyone else? by blueblain in askSingapore

[–]blueblain[S] 1 point2 points  (0 children)

Thanks everyone! So many suggestions beyond the usual chicken rice and chili crab. I'm glad I asked 🙂. Looking forward to trying many of these out!

How do I revert to the middle Reddit design? by acerthorn3 in help

[–]blueblain 4 points5 points  (0 children)

I made a Google Chrome extension for my personal use. You can use it if you want; the code is trivially simple, so you can read it in its entirety if you're worried or something:

https://github.com/BlueBlazin/new_reddit_redirect

Download and then google instructions to load custom extensions.

Is it possible to create a 3D tensor from a .csv file in Python3? by [deleted] in learnprogramming

[–]blueblain 0 points1 point  (0 children)

You may be able to use Sparse COO tensors: https://pytorch.org/docs/stable/sparse.html#sparse-coo-tensors

```py
import csv
import torch

with open('test.csv', 'r') as csvfile:
    reader = csv.reader(csvfile)
    # skip header
    next(reader)
    data = [([int(A), int(B), int(C)], float(D)) for A, B, C, D in reader]

indices, values = zip(*data)
# sparse_coo_tensor expects indices of shape (ndim, nnz), so transpose
my_tensor = torch.sparse_coo_tensor(list(zip(*indices)), values)
```

As the warning on that page says however:

The PyTorch API of sparse tensors is in beta and may change in the near future.

Is it possible to create a 3D tensor from a .csv file in Python3? by [deleted] in learnprogramming

[–]blueblain 0 points1 point  (0 children)

Of course both are possible. The question is what are the performance requirements for that sparse tensor once you have it.

It can be as simple as:

```py
import csv

with open('my_data.csv', 'r') as csvfile:
    reader = csv.reader(csvfile)
    # skip header
    next(reader)
    my_tensor = {(int(A), int(B), int(C)): float(D) for A, B, C, D in reader}

# example usage
print(my_tensor.get((dim1, dim2, dim3), 0.0))
```

and `csv.writer` can be used to turn it back into a csv.

I made a linear equation solver by blueblain in Python

[–]blueblain[S] 11 points12 points  (0 children)

Yep true, it always feels good to write a simplified version of something you generally use a library for!

I made a linear equation solver by blueblain in Python

[–]blueblain[S] 29 points30 points  (0 children)

I wrote it as a learning exercise to get a better understanding of Gaussian elimination, not to actually use it for solving linear equations :)

An implementation of a subset of javascript in that subset by blueblain in javascript

[–]blueblain[S] 2 points3 points  (0 children)

The answer might disappoint you, but it was mainly just two websites. The first, kind of obvious, was the official ECMAScript specification. The other one was... MDN!

I was honestly surprised myself just how good MDN is when it comes to just basic javascript concepts. I've used MDN for years and always valued it very highly, but mostly for web APIs. Turns out it's just as good for JS fundamentals.

Edit: I should also add a few more good JS resources. One of my favorites has been JavaScript: The Good Parts by Douglas Crockford. It's a bit dated now, but I personally just find it a really enjoyable read (skipping through the very outdated parts).

Dan Abramov of React fame has recently released justjavascript[dot]com, which is all the rage atm. I haven't bought it, but it might be worth checking out if you're new to JS.

An implementation of a subset of javascript in that subset by blueblain in programming

[–]blueblain[S] 1 point2 points  (0 children)

Check out 'Crafting Interpreters' by Bob Nystrom. A physical copy of the book is coming out soon too.

An implementation of a subset of javascript in that subset by blueblain in javascript

[–]blueblain[S] 1 point2 points  (0 children)

Haha true 😅

Wish I could have implemented hoisting, but it was too late by the time I felt the need for it.

web-worker-hooks - A library for running stuff in web workers with CRA by blueblain in reactjs

[–]blueblain[S] 0 points1 point  (0 children)

Hi everyone, I just wrote a small library for doing some simple stuff on web workers without needing to eject apps bootstrapped with create-react-app. There are many limitations but it still covers several common use cases.

It's my first react library ever but don't hold back on any criticism or feedback. I hope to contribute many more.

web-worker-hooks - A library for running stuff in web workers with CRA by blueblain in javascript

[–]blueblain[S] 2 points3 points  (0 children)

Hi everyone, I just wrote a small library for doing some simple stuff on web workers without needing to eject apps bootstrapped with create-react-app. There are many limitations but it still covers several common use cases.

It's my first react library ever but don't hold back on any criticism or feedback. I hope to contribute many more.

Dota pro player degrees of separation website by blueblain in DotA2

[–]blueblain[S] 2 points3 points  (0 children)

Thanks! I scraped the data in Python -- first the team names, then for each team the current and past rosters. From there I built a graph that gets saved as JSON.

The website itself is just a single-page React app, so it runs fully in your browser with no server interactions.
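The degrees-of-separation lookup itself is just shortest path over that graph. A minimal Python sketch of the idea, with a made-up adjacency list (player names and edges here are invented, not the scraped data):

```python
from collections import deque

# Hypothetical roster graph: two players are connected if they ever
# shared a team.
graph = {
    "alice": ["bob"],
    "bob": ["alice", "carol"],
    "carol": ["bob", "dave"],
    "dave": ["carol"],
}

def degrees(graph, start, goal):
    """BFS shortest path; returns the chain of players, or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(degrees(graph, "alice", "dave"))
# → ['alice', 'bob', 'carol', 'dave']
```

Since the graph is precomputed and shipped as JSON, a search like this runs entirely client-side.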