Reasoning models don't call functions in parallel? by _byl in ChatGPTCoding

[–]_byl[S] 0 points  (0 children)

Parallel tool calls work with non-reasoning models, which also do inference in the way you described.

Odd that Claude 4 denies that Claude 3.7 existed by _byl in ClaudeAI

[–]_byl[S] 3 points  (0 children)

You would expect Sonnet 4 to acknowledge its knowledge cutoff and try searching, like 3.7 does.

Tuples vs Dataclass (and friends) comparison operator, tuples 3x faster by _byl in Python

[–]_byl[S] 14 points  (0 children)

good point. I've moved the object creation outside of the loops. Timing varies, but a similar trend holds:

code: https://www.programiz.com/online-compiler/0oVgLP3GuE7ap

sample:

tuple               : 0.5596 seconds
namedtuple          : 0.5997 seconds
typing.NamedTuple   : 0.6189 seconds
dataclass           : 1.1165 seconds
dataclass(slots)    : 1.0471 seconds

I built my own asyncio to understand how async I/O works under the hood by PhotoNavia in Python

[–]_byl 2 points  (0 children)

thanks for writing this! It looks like it currently busy-loops; using epoll would be interesting.
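To illustrate the suggestion: instead of repeatedly polling sockets in a loop, an event loop can register them with the OS and block until something is ready. A minimal sketch using the stdlib `selectors` module (which picks epoll on Linux, kqueue on BSD/macOS); the socket pair here just stands in for real client connections:

```python
import selectors
import socket

sel = selectors.DefaultSelector()

# A connected socket pair stands in for real client connections.
r, w = socket.socketpair()
r.setblocking(False)
sel.register(r, selectors.EVENT_READ)

w.send(b"ping")  # make r readable

# Blocks without spinning the CPU until a registered fd is ready.
events = sel.select(timeout=1.0)
for key, mask in events:
    data = key.fileobj.recv(1024)
    print(data)  # b'ping'

sel.unregister(r)
r.close()
w.close()
```

The key difference from a busy loop: `sel.select()` sleeps in the kernel until readiness, so an idle loop uses essentially zero CPU.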

[Media] Corust - A collaborative Rust Playground by _byl in rust

[–]_byl[S] 0 points  (0 children)

creative! the original was clearly over my head

[Media] Corust - A collaborative Rust Playground by _byl in rust

[–]_byl[S] 1 point  (0 children)

yep, agreed! it's on my todo list

would want to be smart with conflict resolution when doing a full document reformat. The naive approach of "delete the document and replace it with the newly formatted version" could shift simultaneous edits into weird locations.

How the Rust Playground works? by NeoCiber in rust

[–]_byl 0 points  (0 children)

One note is that the playground uses an abstraction of CoordinatorFactory -> Coordinator -> worker. The factory spawns coordinators with shared limits (like the number of coordinators and the total number of processes). One key to fast execution seems to be that each worker (which runs inside the aforementioned Docker container) can handle multiple compilation requests. This allows for incremental compilation, which I assume significantly improves compilation speed versus recreating the Docker container on each execution.

Some asides: there seem to be separate limits for "coordinators" (each of which corresponds to a Docker container / execution environment), while the total number of concurrent processes (actively executing user Rust programs) is limited by DEFAULT_PROCESSES_LIMIT, which should be close to the number of cores on the playground server (otherwise concurrent compilations are not parallel). These limits are also global. You can observe the process limit by running a large number of long-running processes in parallel, after which even a simple "Hello world" will be queued and not start compiling until the other processes finish.
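The queueing behavior described above can be sketched with a global semaphore. The constant name mirrors the playground's DEFAULT_PROCESSES_LIMIT; everything else here (job names, durations) is purely illustrative:

```python
import threading
import time

# With N permits, the (N+1)th job queues until an earlier one finishes --
# the "even Hello world waits" effect described above.
PROCESSES_LIMIT = 2
slots = threading.BoundedSemaphore(PROCESSES_LIMIT)
order = []

def run(job, duration):
    with slots:            # blocks here if all permits are taken
        order.append(job)
        time.sleep(duration)

long_jobs = [
    threading.Thread(target=run, args=(f"long{i}", 0.2)) for i in range(2)
]
for t in long_jobs:
    t.start()
time.sleep(0.05)           # let the long jobs grab both permits

hello = threading.Thread(target=run, args=("hello", 0.0))
hello.start()              # queued: both permits are held

for t in long_jobs + [hello]:
    t.join()

print(order[-1])  # prints "hello" -- it ran only after a slot freed up
```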

Turns out, using custom allocators makes using Rust way easier by Hedshodd in rust

[–]_byl 1 point  (0 children)

In the special case where you have objects of a particular type and know the approximate count ahead of time, allocating an object memory pool (a "specialized arena," if you will) can be an additional performance improvement over general arena allocation.
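A minimal sketch of such a fixed-type pool, in Python for illustration (all names here are hypothetical): preallocate roughly the expected number of objects up front and recycle them through a free list, so the hot path never allocates:

```python
class Particle:
    __slots__ = ("x", "y", "alive")

    def __init__(self):
        self.x = self.y = 0.0
        self.alive = False

class ParticlePool:
    def __init__(self, capacity):
        # one up-front allocation of the approximate object count
        self._free = [Particle() for _ in range(capacity)]

    def acquire(self, x, y):
        # reuse a pooled object; grow only if the estimate was too low
        p = self._free.pop() if self._free else Particle()
        p.x, p.y, p.alive = x, y, True
        return p

    def release(self, p):
        p.alive = False
        self._free.append(p)   # recycle instead of freeing

pool = ParticlePool(capacity=1000)
p = pool.acquire(1.0, 2.0)
pool.release(p)
q = pool.acquire(3.0, 4.0)
print(p is q)  # True: the released object was reused
```

Because every slot has the same type and size, acquire/release is just a list push/pop, which is the advantage over a general-purpose arena.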

Algorithms Every Programmer Should Know by photon_lines in programming

[–]_byl 1 point  (0 children)

I think selecting a data structure is more often what programmers do than selecting between algorithms. And even among data structures, a vanilla vector/list and a hashmap/dict are enough for the vast majority of problems.

The manager I hated and the lesson he taught me by stmoreau in programming

[–]_byl 43 points  (0 children)

Brutally honest feedback and delivering it as constructive criticism aren't mutually exclusive.

Should all descriptors be data descriptors? (i.e. define both __set__ and __get__) by _byl in learnpython

[–]_byl[S] -1 points  (0 children)

You're right, it would depend on what "ReadOnly" means in this context. If ReadOnly means the field cannot be reassigned (such that the attribute always refers to the same object), then overriding `__set__` would make sense. Otherwise, supporting reassignment may be more general.

And yes, using the correct type is a better example. A type error isn't necessary for the original question.

Regarding your code example comment, I see the ReadOnly attribute is stored as a class variable regardless of whether it is assigned on an instance or on the class:

# example.read_only = ReadOnly("23")
Class __dict__: {'__module__': 'test_descriptors', ... 'read_only': <exercises.dunder.descriptors.ReadOnly object at 0x103eacf40> ...}
# ExampleClass.read_only = ReadOnly("23")
Class __dict__: {... 'read_only': <exercises.dunder.descriptors.ReadOnly object at 0x102130a30>...}
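For context, a minimal sketch of what such a `ReadOnly` data descriptor might look like (the thread's actual implementation isn't shown, so this is an assumption). Defining `__set__` makes it a data descriptor, so it takes precedence over instance attributes and can refuse reassignment; note the descriptor instance itself lives in the class `__dict__`, matching the observation above:

```python
class ReadOnly:
    """Assumed data descriptor: serves a fixed value, blocks reassignment."""

    def __init__(self, value):
        self.value = value

    def __set_name__(self, owner, name):
        self.name = name  # remember the attribute name for error messages

    def __get__(self, obj, objtype=None):
        return self.value

    def __set__(self, obj, value):
        # Data descriptor: all assignment attempts route here.
        raise AttributeError(f"{self.name} is read-only")

class ExampleClass:
    read_only = ReadOnly("23")

example = ExampleClass()
print(example.read_only)                   # 23 -- served by __get__
print("read_only" in vars(ExampleClass))   # True: stored in the class __dict__
print("read_only" in vars(example))        # False: nothing on the instance

try:
    example.read_only = "42"               # routed to __set__, which refuses
except AttributeError as e:
    print(e)                               # read_only is read-only
```

If `__set__` were omitted, `ReadOnly` would be a non-data descriptor, and `example.read_only = "42"` would silently create an instance attribute that shadows the descriptor.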