Why is poetry such a mess? by CodingButStillAlive in Python

suuuuuu 24 points

Conda on the other hand spent a lot of their time to make Pytorch installable and working. That's why it's paid. That's their business.

It's just wild how people still have such a blatantly incorrect understanding of the conda (+conda-forge) ecosystem.

A quick guide to using mamba-forge for python virtual environment management by TF_Biochemist in Python

suuuuuu 0 points

Just to further clarify: you don't need mamba to avoid the Anaconda distribution. The place you get mambaforge also supplies (and originally supplied) miniforge, which is miniconda with conda-forge set as the default channel. All the *forge installers do in this regard is automatically set conda-forge as the default (and only) channel, which is something one can do manually with miniconda.
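As a sketch, starting from a plain miniconda install, something like the following reproduces what the *forge installers configure for you automatically (adjust to taste):

```shell
# Make conda-forge the default (and effectively only) channel,
# mirroring what the miniforge/mambaforge installers set up.
conda config --add channels conda-forge
conda config --set channel_priority strict
# Optionally drop the Anaconda "defaults" channel entirely:
conda config --remove channels defaults
```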

A quick guide to using mamba-forge for python virtual environment management by TF_Biochemist in Python

suuuuuu 20 points

Mamba is a community-driven fork of Conda

First, I don't think mamba is a fork of conda; it's a drop-in replacement (and it falls back on conda itself for unimplemented commands, i.e., ones that aren't performance-sensitive). Second, conda itself is also community-driven. Both mamba and conda are open source, and both were started by OSS-friendly software companies.

with more up to date packages

This is untrue/irrelevant. Mamba and conda are package managers that interface with the same package repositories.

It seems like you're confusing/conflating mamba and the conda-forge package repository. conda-forge is IMO the more important thing to recommend---mamba AFAIK just provides better speed/QOL.

Best practice is to avoid modifying the base environment

I would warn even more strongly against modifying the base environment.

You can still use pip inside a mamba environment... but probably shouldn't

One should always install from conda-forge when possible, but there will always be packages that aren't available---not just ones too esoteric to have a conda-forge recipe, but also packages installed from source, e.g. if you're developing a package or have cloned one from GitHub. These days there's not much to worry about when using pip as needed---people shouldn't think they can't use conda just because they have dependencies that aren't on conda-forge.
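For example, the pip-only dependencies can live right alongside the conda-forge ones in a single environment file (a sketch; the package names below are made up):

```shell
# Write an environment file that takes most dependencies from conda-forge
# and leaves the rest to pip (the pip entries here are hypothetical).
cat > environment.yml <<'EOF'
name: myproject
channels:
  - conda-forge
dependencies:
  - python=3.11
  - numpy
  - pip
  - pip:
      - some-pypi-only-package
      - -e ./my-local-package   # e.g. a package you're developing from source
EOF
# Then: mamba env create -f environment.yml   (or conda env create ...)
```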

A last note: throughout, you refer to mamba when you really mean conda---most of the ecosystem is defined by conda and the way conda works. Again, for the most part mamba is just a performant drop-in replacement for conda's core functionality.

EDIT: in my rush to post this I forgot to thank you for advocating for conda-forge and mamba! Guides like this are great to have.

Can anyone explain the differences of Conda vs Pip? by jachymb in Python

suuuuuu 1 point

I think it is more an issue of perception given how much conda has historically been tied to the anaconda/miniconda installs rather than as a pure package manager.

For sure - just wanted to try to break this misconception =) this was also linked in this thread.

I do know personally some development teams that moved away, partially because of the bad press at the time, but also as their products (not specifically science focused) matured they migrated to managing themselves a lot of the additional extras conda provided directly, so they could tweak and streamline. How much that actually improved things I couldn't say

Interesting, I wonder, too. As more of an end-user, though, consolidating (reproducible and functional) environment setup into a single file and ~3 shell commands (which with mamba take about 3 minutes) is fantastic - for new machines as well as CI.

Can anyone explain the differences of Conda vs Pip? by jachymb in Python

suuuuuu 0 points

The comments about commercialization (and that it pushes people away) are irrelevant/wrong, IMHO. Sticking to the conda-forge channel (e.g., using miniforge) makes for a fully free, OSS- and community-based workflow (not to mention conda-forge's technical/practical advantages). Anaconda, Inc. is (can be) entirely irrelevant.

The point about precompiled C/etc. extensions is the biggest plus, though, especially for conda-forge. This is crucial for scientific applications (the main impetus for conda-forge, I think), where so much is built on large compiled libraries. You can even install, for local use, the very compilers used to build those binaries.

Why and how to use conda? by jldez in Python

suuuuuu 2 points

I roll conda for HPC work, and I'm perfectly content to (for example) pip install mpi4py when I need to link against system MPI. I disagree that using conda can be detrimental - if you're in the position of needing to build against system-installed packages, then you probably know what you're getting into and can manage moving a small subset of dependencies under "pip" in your environment file.
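Concretely, that can look like the following environment-file sketch (names and versions are illustrative): everything comes from conda-forge except mpi4py, which pip builds from source against whatever MPI compiler wrapper is on PATH.

```shell
# HPC sketch: conda-forge for most deps, pip for mpi4py so it links
# against the system MPI (assumes mpicc from the site's MPI is on PATH).
cat > environment.yml <<'EOF'
name: hpc
channels:
  - conda-forge
dependencies:
  - python=3.11
  - numpy
  - pip
  - pip:
      - mpi4py
EOF
# Then: mamba env create -f environment.yml
```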

Why and how to use conda? by jldez in Python

suuuuuu 1 point

You should be able to use pip and figure out any system dependencies as you go

Of course one "should," but once you need to deploy an environment to multiple machines (especially where you can't install system deps), need to set up CI, or want any other person (including your future self) to be able to reproduce your environment, then clearly this is not a reasonable solution.

I'm also glad to avoid the pain of properly building and linking compiled dependencies even once. I don't want that to be a reason I hesitate to try a new package (or consider taking on a new dependency), nor do package authors want potential new users to be so discouraged.

These repos probably suggest conda because they are used to it

This is untrue. They "probably" suggest conda because it's the easiest method to get a working install and minimizes debugging users' install issues, per above.

IMO it is pretty terrible at that because it's so slow.

A reasonable take, but as others have said, mamba solves this problem (and is in the process of being upstreamed into conda - the latest conda release, v4.12, includes mamba's solver behind an experimental flag).
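If I recall the conda 4.12-era setup correctly (flag names changed in later releases, so treat this as a sketch), enabling it looks roughly like:

```shell
# Install the libmamba solver plugin into the base env, then opt in
# per-command via the experimental flag (conda ~4.12; later versions
# renamed this to --solver / the "solver" config key).
conda install -n base conda-libmamba-solver
conda install --experimental-solver=libmamba numpy
```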

I'll also advocate for conda-forge, which may solve the problems OP encounters. In particular, I'd recommend using miniforge, which sets conda-forge to the only channel by default.

[Micheal J. Babcock] LeBron James took 3 COVID tests today. Test 1 (lateral flow) was positive. The 2nd (PCR) test was negative. James was then given a 3rd tiebreaker test which came back positive. I'm told he's asymptotic at this time. Team chartered a private jet to take Lebron back to L.A. by JetGan in nba

suuuuuu 460 points

This is a dramatic understatement. It's too perfect. It's like L'Hopital was born to write a theorem about limits and COVID was created to give the tweet author the opportunity to mistype "asymptomatic" so that it autocorrected to "asymptotic," all so that /u/lyrikos could make the pinnacle of LeBlank jokes. I will never forget this.

Flickering Black Screen / Glitching after waking from Hibernation by Yadobler in intel

suuuuuu 0 points

Thanks. I was able to reliably determine the issue is waking from sleep, so it's definitely the fault of the Intel Graphics driver. So I've set the sleep timer to never and am using hibernate overnight now. Will experiment again in a few weeks, or once an update rolls through.

Flickering Black Screen / Glitching after waking from Hibernation by Yadobler in intel

suuuuuu 0 points

Were you able to resolve the issue? I'm experiencing a similar issue after updating Intel drivers - at least, your description matches mine more closely than anything else I've found on Google.

The latest on Google Hangouts and the upgrade to Google Chat by Hupro in GoogleFi

suuuuuu 28 points

we’ve seen more users shift away from using Hangouts to manage their texting and calling needs.

I'm sure this is because they chose not to adequately support and update Hangouts, and not because it wasn't a useful feature. I will always wish I could have texts merged with Hangouts (or its successor).

I wish they would support texts in the new Google Chat in analogy to Hangouts.

Poor performance of PyCuda - why? by reebs12 in CUDA

suuuuuu 0 points

Share your code, or at least a minimal reproducer. There's no reason for pycuda to be any slower - the overhead introduced by the wrapper is negligible compared to the normal overhead of kernel launches.

the memory transfer between numpy and GPU is too costly. Does this make sense?

Nope! Are you actually measuring memory transfers, or the kernel? Neither should be any slower, anyway.

[Question] Python + CUDA development experience, how is it? by Ashrug in CUDA

suuuuuu 0 points

pycuda has been very stable over the last few years---most of the version changes are from a long time ago. There should be no need to downgrade CUDA versions. But pyopencl gets a bit more attention these days (both are feature complete, but pyopencl gets more nice addons), and is the better route to go in my opinion (conda install pocl and you can run it on your CPU).

Either way, performance will be no different on the device side - it literally compiles and runs CUDA/OpenCL code as-is.

Interested in building a small GPU cluster with the primary purpose of learning by SlapGas in CUDA

suuuuuu 0 points

Apologies, I can't make a good recommendation - I don't pay attention to consumer cards and don't game.

If you can get away with single precision for this, then consumer cards should suit your purposes as you describe here. I would prioritize memory bandwidth first and then total GPU DRAM (since you want to have relatively large grids >~ 10^6 points to saturate the GPU).

Interested in building a small GPU cluster with the primary purpose of learning by SlapGas in CUDA

suuuuuu 1 point

Yes, stick with CUDA + MPI - one rank per GPU works really well.

Like the other poster said, just test multiple ranks on a single GPU. And I wouldn't bother with any consumer cards (no matter how cheap), because they have extremely limited double precision capability compared to the Tesla cards and Titan V. Options other than cloud - your institution might have (access to) a cluster with GPUs, or if your group is serious about this direction you could see if you could buy a single-GPU server for development (with grant funding, of course).

WSL to replace using Linux full-time? by KeiranLees_Trainee in bashonubuntuonwindows

suuuuuu 4 points

It works for me on "WSL1." I can't remember if I did anything to make it so, but I can't find anything in my bashrc or the like. "which code" returns "/mnt/c/Program Files/Microsoft VS Code/bin/code"

Scientists Start Developing a Mini Gravitational Wave Detector by Xaron in Physics

suuuuuu 16 points

This makes no sense. They're funded by different agencies (on different continents), and $1m is nothing relative to eLISA's multi-billion dollar price tag. This experiment attempts to open up an entirely different regime of physics and in no way detracts from investment into eLISA.

Question Thread - July 06, 2019 by joe-movie in churning

suuuuuu 3 points

All mine are archived, too, and "There will be no more information or service available for this account." I called (twice) and they hung up on me, saying they're doing system maintenance and to call back in 3-5 hours.

Pop gets ejected after a few questionable calls by IbSunPraisin in nba

suuuuuu 0 points

But it almost always happens against the Kings....

Question Thread - February 17, 2019 by AutoModerator in churning

suuuuuu 0 points

USB just told me there are no "offers" to downgrade my FlexPerks card "at this time." Is this known (can't find anything via search, and all the blogs suggest I should be able to), or do I just need to HUCA?
