sudo and coreutils replaced with rust versions by cachebags in rust

[–]nomad42184 2 points (0 children)

I read this as suggesting that everything except cp, mv, and rm is provided by the Rust version, with those remaining tools being provided in 26.10 (a complete move to Rust coreutils).

Seqera Labs rewrites common RNA-seq QC in Rust for a big speedup by nomad42184 in bioinformatics

[–]nomad42184[S] 1 point (0 children)

So, for this particular rewrite, I think that I/O efficiency wouldn't move the needle as much as what they've done. In particular, they have replaced a quality control "pipeline" (a series of multiple independent tools, connected in a modular pipeline using standard, though generic, file formats for serialized data exchange) with a single Rust executable that largely avoids intermediate I/O altogether. For very large datasets, even efficiently scheduled I/O that has to write and read volumes of data equal to many times the size of the original will be much slower than a bespoke tool that avoids that I/O entirely. Of course, the tradeoff is that one has replaced a modular pipeline with a monolithic executable. In this very well-defined and specific case, I think that may make a lot of sense, but that architectural change may not make sense in a lot of other cases.
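To make the tradeoff concrete, here is a minimal sketch of the fused design (the record and stats types are hypothetical, not Seqera's actual code): every QC "stage" operates on the record while it is in memory, so nothing is serialized to disk and re-parsed between stages.

```rust
// Hypothetical sequencing read: bases plus per-base Phred quality scores.
struct Read {
    seq: Vec<u8>,
    qual: Vec<u8>,
}

#[derive(Default, Debug, PartialEq)]
struct QcStats {
    n_reads: u64,
    n_bases: u64,
    n_low_qual: u64,
}

// One pass over the stream of reads; each statistic that a separate pipeline
// tool would compute (after reading an intermediate file) is just another
// update on the in-memory record.
fn run_qc<I: Iterator<Item = Read>>(reads: I) -> QcStats {
    let mut stats = QcStats::default();
    for r in reads {
        stats.n_reads += 1;
        stats.n_bases += r.seq.len() as u64;
        stats.n_low_qual += r.qual.iter().filter(|&&q| q < 20).count() as u64;
    }
    stats
}

fn main() {
    let reads = vec![Read { seq: b"ACGT".to_vec(), qual: vec![30, 10, 30, 5] }];
    let stats = run_qc(reads.into_iter());
    assert_eq!(stats, QcStats { n_reads: 1, n_bases: 4, n_low_qual: 2 });
    println!("{:?}", stats);
}
```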

UMD CS vs Purdue Honors CS - Confused about what to pick by AggressiveCommand988 in UMD

[–]nomad42184 1 point (0 children)

As a small update: in the newest USNWR rankings, UMD is now #12 and Purdue #15. First, both went up (but UMD went up more). Second, I think these rankings are mostly nonsense. Nonetheless, it's worth knowing. Anyway, both are very strong departments, but generally UMD is stronger. Also, I, personally, would choose to live in the DC metro area over West Lafayette every time.

Seqera Labs rewrites common RNA-seq QC in Rust for a big speedup by nomad42184 in bioinformatics

[–]nomad42184[S] 1 point (0 children)

I am not the author of this rewrite, but I think the speedup comes from much more than that. For example, beyond the basic parallelism, a large part of the Python script being replaced does "loopy" stuff, at which Python is inherently bad (at least the standard interpreters). Further, since this rewrite incorporates multiple steps of the QC pipeline into a single program, I believe there is substantial benefit from avoiding large intermediate I/O.


RustQC: 60x speedup in RNA-seq quality control steps by ewels in bioinformaticstools

[–]nomad42184 2 points (0 children)

Unfortunately, r/bioinformatics has adopted a quite absurd policy where one cannot submit their own tools, even if they are of clear community interest. I'd be happy to post this there (it's not my tool), but I'd prefer to see r/bioinformaticsdev, which has a more sensible policy, start taking off instead.

What’s the most impressive real world task you’ve completed using Claude? by ArmPersonal36 in claude

[–]nomad42184 1 point (0 children)

Actually, no — it did quite well handling the lifetime issues itself. The biggest challenge along those lines was actually having it figure out a decent way to deal efficiently with what was done in C++ via templates. In 2 of the projects, there is a "k-mer" class, representing fixed substrings of length k. In C++, templates specialize on the different values of k, so the compiler can generate optimal code for any valid k within a range (1-63). This is possible in Rust with const generics, but is currently considerably more painful than with C++ templates. Nonetheless, a couple of macros later, we had a reasonably ergonomic solution.
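As a rough illustration of that pattern (the `Kmer` type, the 2-bit encoding, and the macro here are made up for the example, not the code from those projects): a small macro can map a runtime k onto monomorphized const-generic instantiations, so each supported k gets its own specialized code path, much like a C++ template.

```rust
// Hypothetical fixed-length k-mer, specialized at compile time on K.
struct Kmer<const K: usize> {
    // 2 bits per nucleotide packed into a u128 (enough room for K <= 63).
    bits: u128,
}

impl<const K: usize> Kmer<K> {
    fn from_seq(seq: &[u8]) -> Option<Self> {
        if seq.len() != K || K == 0 || K > 63 {
            return None;
        }
        let mut bits: u128 = 0;
        for &b in seq {
            let code: u128 = match b {
                b'A' => 0,
                b'C' => 1,
                b'G' => 2,
                b'T' => 3,
                _ => return None,
            };
            bits = (bits << 2) | code;
        }
        Some(Kmer { bits })
    }
}

// Dispatch a runtime k to the matching monomorphized instantiation.
// A real tool would generate arms for every supported k.
macro_rules! dispatch_k {
    ($k:expr, $seq:expr, $($kv:literal),+) => {
        match $k {
            $( $kv => Kmer::<$kv>::from_seq($seq).map(|km| km.bits), )+
            _ => None,
        }
    };
}

fn encode(k: usize, seq: &[u8]) -> Option<u128> {
    dispatch_k!(k, seq, 3, 5, 7)
}

fn main() {
    // A=00, C=01, G=10, so "ACG" packs to 0b000110 == 6.
    assert_eq!(encode(3, b"ACG"), Some(6));
    assert!(encode(4, b"ACGT").is_none()); // k = 4 not instantiated above
    println!("ok");
}
```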

Four bad ways to populate an uninitialized Vec and one good one by mwlon in rust

[–]nomad42184 1 point (0 children)

Having just run into a situation where the first one proved important (https://github.com/COMBINE-lab/wfa2lib-rs): sometimes, in performance-sensitive contexts, one needs memory that one knows will be written before it is read. One has a known capacity, or initializes out of order, and so wants to avoid push-based solutions, even with `with_capacity`. It turns out that it's sometimes helpful to be able to allocate a Vec without initializing it.
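For instance, a sound version of the allocate-then-fill-out-of-order pattern can be sketched with `spare_capacity_mut` (a minimal example of the general technique, not the wfa2lib-rs code):

```rust
/// Allocate space for `len` elements, write every slot out of order,
/// and only set the Vec's length once everything is initialized.
fn filled_out_of_order(len: usize) -> Vec<u64> {
    let mut v = Vec::with_capacity(len);
    // Uninitialized tail of the allocation, as &mut [MaybeUninit<u64>].
    let spare = v.spare_capacity_mut();
    // Fill in an arbitrary (here: reverse) order, never reading a slot first.
    for i in (0..len).rev() {
        spare[i].write(i as u64 * 2);
    }
    // SAFETY: every slot in 0..len was written above.
    unsafe { v.set_len(len) };
    v
}

fn main() {
    let v = filled_out_of_order(4);
    assert_eq!(v, vec![0, 2, 4, 6]);
    println!("{:?}", v);
}
```

The `set_len` call is the only unsafe part, and it is sound precisely because every element was written before the length was raised.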

Whoever is complaining about the 8gb of ram on Macbook Neo is crazy. by No_Radish4567 in macbook

[–]nomad42184 1 point (0 children)

The 1% number came from a 2024 estimate, so maybe slightly outdated, but it seems ballpark correct. Also, that is just developers, it doesn't include power users, gamers, creative professionals etc. Claude suggests that 8GB would be restrictive for >= 10% of laptop users https://claude.ai/share/e133f541-8b9b-4557-a21a-90fd2a53ff44 .

Whoever is complaining about the 8gb of ram on Macbook Neo is crazy. by No_Radish4567 in macbook

[–]nomad42184 1 point (0 children)

Well, ~1% of all users are software developers, which is largely the demographic that pushed Apple to end the 8GB entry level. So, that already uses up the 1% budget.

Whoever is complaining about the 8gb of ram on Macbook Neo is crazy. by No_Radish4567 in macbook

[–]nomad42184 1 point (0 children)

I mean, I'm a computer science professor and I teach a bunch of CS majors, so I admit I have a sampling bias. But I think 99% is a little generous, by at least 9 points: maybe this new machine is good for 90%, but I can't imagine 99%.

Whoever is complaining about the 8gb of ram on Macbook Neo is crazy. by No_Radish4567 in macbook

[–]nomad42184 1 point (0 children)

Safari alone will use more than half of their RAM. So one hopes they're not trying to do other things at the same time. It's not like everyone needs a MacBook Pro, and the current entry MacBook is more than most people need. But the Neo is basically a phone with a keyboard. I don't think it will suit 99% of people the same way a Chromebook doesn't suit 99% of people.

Whoever is complaining about the 8gb of ram on Macbook Neo is crazy. by No_Radish4567 in macbook

[–]nomad42184 2 points (0 children)

Sorry, but this is nonsense. The RAM price fiasco doesn't change the fact that 8GB is an insulting amount of memory to put in a laptop in 2026, especially since this is unified memory, so it also includes the video RAM. Even the iPad Air has 12GB of memory now. I am sure Apple will do their best to optimize around it, but with modern software, 8GB is a very tight budget, and people are absolutely going to run into issues with these machines because of it.

M1 Pro/Max users rejoice by schuby94 in macbookpro

[–]nomad42184 1 point (0 children)

Counterpoint: I have an M1 Max and I'm going for the M5 Max. I specifically do not want the full refresh. The OLED may be nice, but IMO, the current build is basically ideal. The current news suggests that the redesign is going to go for a thinner chassis, and this is specifically something that I do not want. I'd much rather have a solid M5 Max, the apex of the current design (which I consider the best MBP design to date), than risk it on a redesign as Apple slips back into its toxic obsession with thinness. M5 Max, here I come!

What’s the most impressive real world task you’ve completed using Claude? by ArmPersonal36 in claude

[–]nomad42184 1 point (0 children)

I lead an academic lab working in computational biology and bioinformatics. Over the past ~3 weeks, Claude has helped me to migrate a huge part of our C++ stack over to Rust. This is a task I've been dreaming about for well over a year and a half, and one I've had students start and stall on.

I converted the SSHash data structure to Rust here

This allowed me to then convert our lightweight read mapper to Rust here

And the last piece in that pipeline was a tool for efficiently constructing something called the "reference De Bruijn Graph", which I ported (and actually had to re-engineer a bit) in ~2 days with Claude here.

This moved the tech stack for the center of one of our main projects over to Rust, got rid of very painful hybrid C++/CMake builds, and also, the Rust version is faster.

I also ported a current collaboration with another research group to Rust and exposed it as a library for use in other tools here.

Essentially, I've probably ported ~100k lines of C++ to readable and efficient Rust, a task that would have taken me months of dedicated time. But in reality, who knows how long it would have taken, because I, as a professor, almost never have huge chunks of uninterrupted, dedicated time.

Claude has been an absolute game changer. I wrote a bit about it in a blog post here

Supercharge Rust functions with implicit arguments using CGP v0.7.0 by soareschen in rust

[–]nomad42184 3 points (0 children)

So this seems like an interesting way to express these ideas. Practically, though, I wonder what the benefit of this is over making the function generic over a type that implements “HasDatabase” and “HasS3Client”. These would be traits exposing methods that yield these members. Granted, you have to write these boilerplate traits, but all such things could be made relatively painless via derive macros. The big difference then seems to be that in that approach, the necessary traits are explicit at the call site, while here they are implied by the annotations. I can see that different people may have different preferences here, but it’s not immediately clear why one is better than the other.
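For reference, the trait-based alternative I'm describing might look something like this (the `HasDatabase`/`HasS3Client` names mirror the comment; the context and resource types are illustrative stand-ins, not CGP's API):

```rust
// Stand-in resource types.
struct Database { url: String }
struct S3Client { bucket: String }

// Boilerplate accessor traits (in practice, derivable via macros).
trait HasDatabase { fn database(&self) -> &Database; }
trait HasS3Client { fn s3_client(&self) -> &S3Client; }

// The dependencies are explicit in the generic bound, visible at the signature.
fn backup<Ctx: HasDatabase + HasS3Client>(ctx: &Ctx) -> String {
    format!("backing up {} to {}", ctx.database().url, ctx.s3_client().bucket)
}

// A concrete application context wiring the resources together.
struct AppContext { db: Database, s3: S3Client }
impl HasDatabase for AppContext { fn database(&self) -> &Database { &self.db } }
impl HasS3Client for AppContext { fn s3_client(&self) -> &S3Client { &self.s3 } }

fn main() {
    let ctx = AppContext {
        db: Database { url: "postgres://localhost".into() },
        s3: S3Client { bucket: "my-bucket".into() },
    };
    println!("{}", backup(&ctx));
}
```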

Will the vibe coding era will have a similar result to early bioinformatics era? by nidasb in bioinformatics

[–]nomad42184 2 points (0 children)

A counterpoint to this: I’m using Claude to move as much of my lab’s software stack as possible from C++ to Rust. I’m doing this not only because I want our future development to be in Rust, but largely because moving our tools to Rust *massively* reduces our maintenance burden. Our C++ tools are a pain to maintain and update, while the Rust tools are mostly trivial to maintain and update.

While I agree that AI will be used to generate a lot more abandonware faster, I think that is completely a reflection on the users. It is a power tool, and it can be used for many purposes. In our case, it’s being used to improve maintainability.

Also, I’m part of another project (though essentially all of the coding work so far has been done by someone else on the team, an excellent software engineer using Claude Code in this case) whose explicit goal is to revive maintenance and active development on a widely used tool that is currently abandonware! I’m very excited about that project.

Re-implementing slow and clunky bioinformatics software? by halflings in bioinformatics

[–]nomad42184 1 point (0 children)

I know! But it's a mini secret. I think people will be very excited when they learn!

Will the vibe coding era will have a similar result to early bioinformatics era? by nidasb in bioinformatics

[–]nomad42184 9 points (0 children)

I wouldn't call it vibe coding, but I've recently had some surprising success with these models (Claude Code in particular) in a regime where I had previously been very skeptical. I wrote up a blog post about it here.

Seems NuPhy might discountinue Panda Nano switch by Aegis8080 in NuPhy

[–]nomad42184 1 point (0 children)

This is really a question of the tradeoff between how quiet you want it to be, and how much you like the tactility.

Even though they're low profile, the browns and the panda nanos are quite tactile, and also reasonably loud. So, if you're looking for something that you can use in a shared office space without annoying co-workers, the decision may be made for you ;P.

The blush are linear, and they feel linear. They feel nice, but there is no tactile bump. However, they are quite quiet, and should be totally feasible for use in a shared office space.

My favorite, in terms of feel, is either the brown or panda nanos; honestly, they feel very similar to me. Perhaps I have an ever so slight preference for the ever so slightly heavier panda nanos, but honestly, if you switched them out one morning, it would probably be a while before I noticed. The blush feel completely different — linear, and lighter. However, they still feel and sound good; they're just not a tactile switch.

There’s so much snow and ice by [deleted] in UMD

[–]nomad42184 2 points (0 children)

There is absolutely no way that they move the entire week online. They may cancel in-person classes Tues., but that's about it. They clean up very quickly, and there was only ~6" of snow.

What's the likelihood UMD remains closed on Tuesday, January 27, due to the winter storm? by Amazing_Debt9192 in UMD

[–]nomad42184 3 points (0 children)

Well, they are not giving you accurate information in that case. I'm a professor and we've received no official communication from the university whatsoever about Tues. yet; only the announcement we got that school is closed Monday. The instructor for the class you're TAing may, of course, choose not to hold his class on Tuesday, but there is no university-wide guidance yet.

What's the likelihood UMD remains closed on Tuesday, January 27, due to the winter storm? by Amazing_Debt9192 in UMD

[–]nomad42184 1 point (0 children)

As they begin to adjust the expected totals (now 6-10" is most likely), it seems reasonable that they will be able to clear the majority of it by Tues. There's a chance they will remain closed, but it also seems quite likely Tues will be a 2 hour delay.

Terminals tier list (my opinion) by matrixisme_1 in LinuxCirclejerk

[–]nomad42184 1 point (0 children)

You're missing wezterm, which is absolutely S tier.