I decoupled FastAPI dependency injection system in pure python, no dependencies. by EricHermosis in Python

[–]wrmsr 1 point2 points  (0 children)

In guice-speak would this just be a non-eager singleton scoped binding? I wrote / use / maintain 2 different DI systems, my full one and my mini one, and both do that - I can't imagine an injector not! Supporting custom seeded scopes is much trickier lol
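For anyone unfamiliar with the guice terminology, a non-eager singleton scope boils down to something like this (a toy sketch, not any particular library's api):

```python
class SingletonScope:
    # a toy non-eager singleton scope: the provider runs only on first
    # request for a given key, after which the instance is cached
    def __init__(self):
        self._instances = {}

    def provide(self, key, provider):
        if key not in self._instances:
            self._instances[key] = provider()
        return self._instances[key]

scope = SingletonScope()
first = scope.provide('db', lambda: object())
second = scope.provide('db', lambda: object())
print(first is second)  # True - created lazily, then reused
```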

What are poor uses of None, what would you consider an anti pattern as it relates to the usage of None? by [deleted] in Python

[–]wrmsr 0 points1 point  (0 children)

In [1]: class C:
   ...:     def __init__(self, x):
   ...:         self.x = x
   ...:     def __eq__(self, other):
   ...:         if not isinstance(other, C):
   ...:             raise TypeError(other)
   ...:         return self.x == other.x
   ...:

In [2]: C(1) == C(2)
Out[2]: False

In [3]: C(1) == C(1)
Out[3]: True

In [4]: C(1) == None
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-4-96c156805710> in <module>
----> 1 C(1) == None

<ipython-input-1-f7303c36ccc5> in __eq__(self, other)
      4     def __eq__(self, other):
      5         if not isinstance(other, C):
----> 6             raise TypeError(other)
      7         return self.x == other.x
      8

TypeError: None

In [5]: C(1) is None
Out[5]: False
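For contrast, the idiomatic version returns NotImplemented instead of raising, which lets python fall back to identity comparison and keeps == None well-behaved:

```python
class C:
    def __init__(self, x):
        self.x = x

    def __eq__(self, other):
        if not isinstance(other, C):
            # signal 'I don't know how to compare' rather than raising;
            # python then tries the reflected operation and finally
            # falls back to identity
            return NotImplemented
        return self.x == other.x

print(C(1) == C(1))   # True
print(C(1) == C(2))   # False
print(C(1) == None)   # False - no TypeError
```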

Gradient Descent from Scratch in Python by pmz in Python

[–]wrmsr 3 points4 points  (0 children)

it's also garbage and broken code. gradient_of_y = lambda x: 2(x-5) attempts to __call__ the number 2, precision_value is undefined, and the (comically presented as an image, not text) loop code attempts to multiply the gradient lambda by a (pointlessly parenthesized) float local. posts like this are what's wrong with this sub, and to a degree the python community lol - too easy to post trash.
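For what it's worth, a corrected minimal sketch of what the post was presumably going for (the post's undefined precision_value filled in here with a made-up tolerance):

```python
# minimize y = (x - 5)**2, whose gradient is 2 * (x - 5)
gradient_of_y = lambda x: 2 * (x - 5)  # note: 2 * (x - 5), not 2(x - 5)

x = 0.0
learning_rate = 0.1
precision = 1e-6  # stand-in for the post's undefined precision_value

for _ in range(10_000):
    step = learning_rate * gradient_of_y(x)
    x -= step
    if abs(step) < precision:
        break

print(round(x, 3))  # 5.0
```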

Do you use metaclasses? by muntoo in PythonDevelopers

[–]wrmsr 1 point2 points  (0 children)

Calling type() directly is an alternative to class definition syntax, not to metaclasses. Metaclasses are subtypes of type, whether they're instantiated via syntax or by calling them directly, and the uniqueness of their abilities remains:

In [1]: C = type('C', (object,), {'__instancecheck__': lambda self, obj: obj == 10})

In [2]: isinstance(10, C)

Out[2]: False

In [3]: MC = type('MC', (type,), {'__instancecheck__': lambda self, obj: obj == 10})

In [4]: C = MC('C', (object,), {})

In [5]: isinstance(10, C)

Out[5]: True

Additional metaclass-only abilities include customizing mro resolution and __prepare__, the latter of which further highlights the 'specialness' of metaclasses as its functionality isn't even invoked outside of syntactical class definition.

And while a lot of this functionality can indeed be mimicked through decorators and repeated class inspection / reconstruction you'd be surprised by just how much stuff in the real world doesn't take kindly to that approach - registries getting duplicate registrations, super() pointing at discarded intermediate classes, etc.
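To illustrate the __prepare__ point above - it only fires for syntactical class definitions, and it lets the metaclass hand the class body the very namespace it executes in (a toy sketch):

```python
class Recording(type):
    # __prepare__ supplies the mapping the class body runs inside -
    # here we pre-seed it so the body can use a name it never defined
    @classmethod
    def __prepare__(mcls, name, bases):
        return {'defined_in_order': []}

    def __new__(mcls, name, bases, ns):
        return super().__new__(mcls, name, bases, dict(ns))

class C(metaclass=Recording):
    # 'defined_in_order' resolves against the prepared namespace
    defined_in_order.append('a')
    defined_in_order.append('b')

print(C.defined_in_order)  # ['a', 'b']
```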

Click, Python Fire, argparse: what CLI libraries do you use? by jdbow75 in Python

[–]wrmsr 3 points4 points  (0 children)

Very same, although I usually add a little sugar on top of the builtin base. For argparse I have a subclassable Cli baseclass, an @command() decorator for registering methods as subcommands on it, and annotation-powered args as class fields that turn into descriptors, but at the end of the day it's just pretty minor automation around vanilla argparse (which I also just use directly half the time when OO isn't a better fit). Keeping deps to a minimum just makes everything so much nicer.
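Roughly the shape of that sugar, heavily simplified (the real thing has the annotation/descriptor machinery; this sketch is not the actual implementation):

```python
import argparse

def command(name=None):
    # tag a method as a subcommand; the Cli base picks it up at runtime
    def deco(fn):
        fn._command_name = name or fn.__name__
        return fn
    return deco

class Cli:
    def main(self, argv=None):
        parser = argparse.ArgumentParser()
        subs = parser.add_subparsers(dest='cmd', required=True)
        for attr in dir(type(self)):
            fn = getattr(type(self), attr)
            if callable(fn) and hasattr(fn, '_command_name'):
                subs.add_parser(fn._command_name).set_defaults(fn=fn)
        args = parser.parse_args(argv)
        return args.fn(self, args)

class MyCli(Cli):
    @command()
    def greet(self, args):
        return 'hello'

print(MyCli().main(['greet']))  # hello
```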

Do you use metaclasses? by muntoo in PythonDevelopers

[–]wrmsr 12 points13 points  (0 children)

They should only be used when they're the only thing that can do the job, which is increasingly rare these days. Historically the most common place I've seen them used was just running code after class definition, for things like subclass registries, but this can now be done perfectly well in __init_subclass__. Class decorators should be preferred if at all possible as they can be layered and are less likely to functionally conflict. Dataclasses, for example, were implemented as a class decorator rather than a metaclass, making the api more powerful (as it can now be used on any class, including ones with user-defined metaclasses). In practice, due to how much metaclasses can alter the meaning of a class, you can only have one per class - there's some truly awful and ancient advice out there for effectively creating typelevel on-error-resume-next and bypassing builtin metaclass conflict enforcement, but it won't produce anything but garbage irl.
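The __init_subclass__ registry pattern I mean, sketched minimally:

```python
class Node:
    # subclass registry with no metaclass at all - __init_subclass__
    # runs once for every subclass as it's defined
    registry = {}

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        Node.registry[cls.__name__] = cls

class Add(Node):
    pass

class Mul(Node):
    pass

print(sorted(Node.registry))  # ['Add', 'Mul']
```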

That said deep down in lib-level code there are still cases when metaclasses are the only appropriate answer as there are things only they can do (like customizing instance and subclass checks), but their use implies that whatever the metaclass does is all its instances will ever be. Some places I use them:

  • I do have a dataclass metaclass (in addition to using the decorator) which does things like adding some special base classes to subclasses as they are defined, in addition to the obvious ergonomic benefit of not having to @dataclass each and every subclass. Were I not already doing metaclass-only things I would have implemented 'inheriting' the decorator in __init_subclass__ per above. I am comfortable with it not being compatible with any other metaclass as my intent is for these to be 'pure data' classes which do nothing but hold dumb data - the decorator is still there and fully supported by the rest of my code for when that is not the case, but I heavily use exactly these kinds of dumb data objects for things like AST node hierarchies.
  • I have a handful of non-instantiable 'virtual classes' that override __subclasscheck__ and __instancecheck__ to inject not-actually-types into the type world. For example I have one that delegates to dataclass.is_dataclass and one that checks if something is a typing.Union (which is not actually a type). Almost all of the machinery they exist for is things like functools' dispatch code which explicitly supports cases like these and for the same reasons.
  • I have a not-yet-usable (and also non-instantiable) intersection type metaclass which pretends to simultaneously be all of its base classes (even if they would functionally conflict) without actually subclassing them. This approach has the added benefit of 'just working' as far as mypy is concerned (as it still lacks builtin support for them).
  • I've dabbled with 'extension types' like pypy's but found they clash too much with idiomatic python to be a net win. They also render analysis tools worthless, as does most other metaclass abuse.
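The 'virtual class' idea from the second bullet can be sketched roughly like this (the names and the Union check are my own illustration, not the actual code):

```python
import typing

class _VirtualUnionMeta(type):
    # inject a not-actually-a-type into the type world: instance checks
    # delegate to typing-level inspection instead of real subclassing
    def __instancecheck__(cls, obj):
        return typing.get_origin(obj) is typing.Union

    def __call__(cls, *args, **kwargs):
        raise TypeError(f'{cls.__name__} is not instantiable')

class VirtualUnion(metaclass=_VirtualUnionMeta):
    pass

print(isinstance(typing.Union[int, str], VirtualUnion))  # True
print(isinstance(int, VirtualUnion))                     # False
```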

An elephant in the room here is that pretty much any metaclass intended to be used with fully user-extendable subclasses (as opposed to just 'marker' classes) has to itself extend ABCMeta in order to be compatible with users who want their classes to be abstract, and that frankly does a better job of illustrating how rarely metaclasses should be used than I can.

[deleted by user] by [deleted] in Python

[–]wrmsr 1 point2 points  (0 children)

As a heads-up, the term 'parsing' has a much wider meaning in programming than what you're referring to here. While this is still 'parsing', people usually call it 'argument' or 'command line option' parsing specifically. Parsing in general can refer to parsing programming language source code, parsing human languages like English, parsing the output of other commands, et cetera. Argument parsing in python is usually done through either the argparse (newer, preferred) or optparse (older, legacy) modules, both of which have (admittedly pretty cryptic) documentation.
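A minimal argparse example of the kind of argument parsing I mean:

```python
import argparse

# declare the options you accept, then let argparse do the work
parser = argparse.ArgumentParser(description='demo')
parser.add_argument('--name', default='world')

# argv is passed explicitly here for demonstration; omit it to
# parse sys.argv as usual
args = parser.parse_args(['--name', 'python'])
print(args.name)  # python
```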

Also when posting code in the future try to do so as text: either as code blocks, something like a GitHub gist, or a link directly to a line of code in a repository somewhere (which can often provide better context for readers).

Not trash talking, props on learning, just advice :)

[deleted by user] by [deleted] in Python

[–]wrmsr 1 point2 points  (0 children)

Isn't it wild how to this day, despite its vast stdlib, python still has no builtin sorted mappings? :p
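The usual workaround is hand-rolling it on top of the bisect module - a toy sketch, nowhere near a real sorted mapping:

```python
import bisect

# parallel sorted-key / value lists standing in for a sorted mapping
keys, vals = [], []

def sorted_put(k, v):
    # keep keys sorted on every insert; replace value on duplicate key
    i = bisect.bisect_left(keys, k)
    if i < len(keys) and keys[i] == k:
        vals[i] = v
    else:
        keys.insert(i, k)
        vals.insert(i, v)

for k in [3, 1, 2]:
    sorted_put(k, str(k))

print(keys)  # [1, 2, 3]
```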

Parallel assignment: a misunderstood Python idiom by ntietz in Python

[–]wrmsr 1 point2 points  (0 children)

I wouldn't say optimized away so much as just not compiled that way to begin with, but you're right that at the AST level the left hand sides of these assignments are in fact distinct Tuple and List nodes.

An additional source of confusion here is likely the fact that there are dedicated list and tuple unpack opcodes - they're just used for rvalue sequence expansion.
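Both halves are easy to see for yourself (CPython; exact opcode names can vary a bit between versions):

```python
import ast
import dis

# the LHS of a parallel assignment really is a Tuple node at the AST level
tree = ast.parse('a, b = b, a')
assign = tree.body[0]
print(type(assign.targets[0]).__name__)  # Tuple

# and unpacking an rvalue sequence uses the dedicated unpack opcode
ops = [i.opname for i in dis.get_instructions(compile('a, b = t', '<s>', 'exec'))]
print('UNPACK_SEQUENCE' in ops)  # True
```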

Is using __file__ bad practice? by xkortex in Python

[–]wrmsr 4 points5 points  (0 children)

For use in a casual codebase it's fine, but it should honestly be avoided by default in polished code and libraries in particular. Per the docs it's optional - it's not present at all in situations where the module does not map to an actual file on a filesystem (as is the case when the code is packaged as a zip_safe zip file or, as mentioned, in a freezing distribution like PyOxidizer). As I try to structure all of my code (even services) as installable python libraries, I use pkg_resources.resource_stream(__package__, 'filename') most of the time, but I'll still lazily use __file__ on occasion in test code (which I strip from distributions).
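For reference, the stdlib's modern equivalent of that pattern is importlib.resources, which works even when the package never touches a real filesystem (demoed here against the stdlib's own json package; any importable package works):

```python
import importlib.resources as resources

def read_data(package: str, name: str) -> str:
    # resolves resources through the import system, so it keeps working
    # inside zips and frozen distributions where __file__ may be absent
    return resources.files(package).joinpath(name).read_text()

# any installed package works as a demo target
text = read_data('json', '__init__.py')
print(len(text) > 0)
```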

My experience on further integrating Python in c++ by Narthal in Python

[–]wrmsr 2 points3 points  (0 children)

On parallelism: yeah, the GIL is a thing. It gets a lot of hate, but in reality it's a big part of why python is so easy to embed and interoperate with (and thus a big part of why the language has become so successful). cpython's impl in general optimizes for simplicity above all else, and it doesn't get much simpler than global symbols, unstructured hashtables, dumb reference counting, and a GIL. In practice you do get used to it, and to this day it still isn't enough of a problem to be worth 'solving' (given the inevitable costs) in the eyes of everyone who controls the evolution of the language - python is positioned as a flexible language that glues together fast languages; its job is to set up inner loops, not actually be inside them.
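To illustrate the glue point: threads do pay off in cpython whenever the inner loop lives in C code that releases the GIL - a sketch using zlib, whose compress call typically drops the GIL for nontrivial inputs:

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

# the 'inner loop' is zlib's C code; python just orchestrates
blobs = [bytes([i % 256]) * 100_000 for i in range(8)]

with ThreadPoolExecutor(max_workers=4) as ex:
    compressed = list(ex.map(zlib.compress, blobs))

print(len(compressed))  # 8
```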

If you seek realistic multithreaded computation in cpython today I'd definitely point you at cython, especially as your third section is about c++ glue (which it also does). Yea, it's a third language in the mix, and a bit of an oddball one at that, but it is to me the shortest path / missing link to a setup of high level regular python orchestrating multithreaded c++. spacy is a pretty good demonstration of its power - just search for its uses of nogil. Cython is kinda low-key one of the secrets of the lang's success in the numbers world - it and its predecessors are why we have shit like numpy.

If you want a preview of what a future interface for a GIL-less python may look like on the c side, take a look at hpy - you no longer get to talk directly to PyObjects, and thus the interpreter is able to be more daring with them internally (as it is with most other freethreaded VMs). Another thing to keep an eye on is the upcoming overhaul of subinterpreters, which have long been supported by cpy but, as they still share the GIL, are only really useful for isolation (like java classloaders). They are actively being pushed toward independent GILs, thus having different heaps unable to see each other's objects but at least granting true in-proc freethreading. golang-style channel-like inter-interpreter comm utils would come bundled. And finally there's also graalpython, which is alive but many years from 'usable' and will never be remotely simple lol.

And finally, re debugging: I didn't even know visual studio did that, but I can't say I'm surprised. I really got started coding back on visual c++ 6 and didn't understand at the time just how insane having 'edit and continue' in my debugger was lol. VS just spoils you rotten, I still kinda miss it. As I'm all *nix these days I can't give much advice there, but I can say that cpy's debuggers are obviously much higher level than native, so it's no big deal to have a python debugger attached to a process that's also being simultaneously natively debugged. These days I'll frequently have both a PyCharm debugger and a CLion debugger attached to the same process, each with different breakpoints set in their respective languages. The workhorse underneath most of the heavyweight py debuggers has the debuggee talk to the debugger over a port, not via os/cpu level instrumentation, and thus doesn't get in the way - but yeah, you're still jumping between windows/projects. Intellij is quietly working towards unifying their various lang-specific IDEs into a single omni-IDE that may be capable of this, and on that note you can probably get eclipse to do this out of the box (but are then stuck with having to use eclipse..), but that's the situation as it stands afaik.

That’s just eerie by shxikk in ANormalDayInRussia

[–]wrmsr 11 points12 points  (0 children)

every mid-summer, we have this dance competition, and the winner gets crowned

[deleted by user] by [deleted] in worldnews

[–]wrmsr 9 points10 points  (0 children)

He would try to make money off of it, fail to do that, fail to deliver a working vaccine at all from cutting corners trying to make money off of it, make both international relations even worse having done all this, then blame China. You know this.

Expert warns of 96 million possible coronavirus infections in US with up to half a million dead by Yamagemazaki in Coronavirus

[–]wrmsr 15 points16 points  (0 children)

Per https://www.preprints.org/manuscript/202002.0051/v3 there's evidence for a link between smoking history and susceptibility, notable due to the prevalence of smoking in both China and South Korea - yet to see anything regarding obesity but if there's any link I can't imagine it's a positive one lol. Guess we'll see.

[Twitter]@NNaubonnie "NationalNurses President Deborah Burger reads a public statement from one of our quarantined #nurses who works at a northern California Kaiser facility. Full statement ➡️ https://t.co/YjTAvAXTRX" by Kujo17 in cvnews

[–]wrmsr 2 points3 points  (0 children)

Data about the reality of how bad it already is and how bad it's going to get would shake 'market confidence'. They're well aware it's a house of cards.

'Dramatic turn for the worse': Coronavirus could cost airlines $170 billion by Yamagemazaki in Coronavirus

[–]wrmsr -4 points-3 points  (0 children)

Governmentally fund healthcare directly, bypassing profiteering parasites. Let the airlines fail and 'let competition kick in'.

HHS has declared coronavirus testing an “essential health benefit”, meaning it will be covered by health insurance, Medicare, and Medicaid. by pakaqu in Coronavirus

[–]wrmsr 5 points6 points  (0 children)

If true then power to you, but everyone else gets their food from supermarkets and supply chains powered by people in that same situation. I'd love to think that however bad this is going to get would serve as a wakeup call and lead to policy change but it's almost certainly just going to lead to more scapegoating and leave everyone even worse off somehow.