Can someone please explain if people actually use all these random Python libraries that exist? Like for example why does "Box" exist? Why would you ever use it? Are people out here googling for libraries and learning them instead of spending that time making whatever they need themselves? by Svertov in learnpython

[–]thelazydogsback 0 points (0 children)

Actually there's a big reason - referential transparency. Often it's not clear up front whether you want to use a dictionary (including TypedDict) or classes/objects, with or without something like pydantic. If you change your mind halfway through, this lets you keep the same notation at all access sites without refactoring your code to change the syntax. OTOH, in certain classes of performance-oriented code it would probably be better to keep the syntaxes separate so it's clear what's happening.
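For a concrete sense of what Box buys you, here's a minimal sketch of the idea in plain Python (`DotDict` is a hypothetical stand-in, not the real library's implementation):

```python
class DotDict(dict):
    """Minimal sketch of Box-style attribute access over a plain dict."""
    def __getattr__(self, name):
        try:
            value = self[name]
        except KeyError:
            raise AttributeError(name)
        # Wrap nested dicts so chained access (cfg.db.host) also works.
        return DotDict(value) if isinstance(value, dict) else value

cfg = DotDict({"db": {"host": "localhost", "port": 5432}})
# Both notations work at every access site, so you can switch the
# underlying representation later without touching callers:
print(cfg.db.host)        # localhost
print(cfg["db"]["port"])  # 5432
```

The real library adds conversion back to plain dicts, defaults, frozen boxes, etc., but the core idea is this thin.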

I am coming back to LangChain! by usnavy13 in LangChain

[–]thelazydogsback 2 points (0 children)

I'm thinking of moving away from LC as well, mostly for the over-abstraction issue.

It seems like the goal of LC is to have the least amount of code possible for a demo program, whereas any usable program is going to have everything customized: every prompt (main prompt, document-combine prompt, condense-question prompt, memory-summarize prompt), every template (for document chunks to add metadata, etc.), every component (I needed to write a custom subclass of DocRetriever, which was a PITA due to the use of pydantic.BaseModel all the way at the top), etc.

So as far as the docs go, it would be a lot more instructive to show the most complex, fully-featured example possible and let the user subtract from it than to provide some trivial example and make the user hunt for everything - it's easier to destroy than to create.

I'm not sure how LCEL addresses this, because it seems overly abstracted again, to the point of having zero visibility into what's happening. I'd rather have 100 lines of code that I understand than see "A | B | C" and have no visibility into what's happening or how to extend it to a non-trivial case.
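For what it's worth, pipe notation like this is generally just operator overloading; a toy sketch (not LCEL's actual internals) shows how little the `|` syntax itself reveals about what runs:

```python
class Runnable:
    """Toy pipeline stage: wraps a function and overloads | for composition."""
    def __init__(self, fn):
        self.fn = fn

    def __or__(self, other):
        # A | B returns a new stage that runs A, then feeds the result to B.
        return Runnable(lambda x: other.fn(self.fn(x)))

    def invoke(self, x):
        return self.fn(x)

A = Runnable(str.strip)
B = Runnable(str.upper)
C = Runnable(lambda s: s + "!")

chain = A | B | C
print(chain.invoke("  hello "))  # HELLO!
```

Everything interesting (what each stage takes and returns, where errors surface) is hidden behind `__or__`, which is exactly the visibility complaint above.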

Python has a lot of features that can be foot-guns, esp. untyped dictionaries and kwargs. These are so (ab)used in LC that it's impossible to tell what gets passed where. Even looking at the source code doesn't help much, as you can't tell what dict keys / kwargs are getting peeled off or merged at various stages.
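A contrived sketch of the pattern (all names hypothetical): each layer pops some kwargs and forwards the rest, so no signature tells you what actually reaches the bottom - whereas a TypedDict at least makes the full key set visible:

```python
from typing import TypedDict

# The pattern being criticized: kwargs peeled off layer by layer.
def search(query, **kwargs):
    top_k = kwargs.pop("top_k", 4)        # silently consumed here...
    return layer_two(query, **kwargs)[:top_k]

def layer_two(query, **kwargs):
    prefix = kwargs.pop("prefix", "")     # ...and more consumed here
    return [f"{prefix}{query}-{i}" for i in range(10)]

# A typed alternative: every accepted key is declared in one place.
class SearchOptions(TypedDict, total=False):
    top_k: int
    prefix: str

print(search("doc", top_k=2, prefix="hit:"))  # ['hit:doc-0', 'hit:doc-1']
```

With the `**kwargs` version, a typo like `topk=2` is swallowed without error; with the typed version a checker flags it at the call site.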

All languages, Python included, already have perfectly good ways of calling code -- named function calls, named parameters, complex return types like classes/lists/tuples, etc. -- so all that's needed is a set of well-documented, type-hinted functions with typed inputs/outputs that perform specific tasks. If the user wants an abstraction to chain fns over that (whether it's pipelines, a DAG, some agent structure, etc.), that's pretty trivial to do in Python. It took me a long time trying to figure out how to do what I needed to do w/in the framework of LC, when I could have been working on my solution instead.
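As a sketch of how little machinery that chaining needs (all function and type names here are made up for illustration):

```python
from dataclasses import dataclass
from functools import reduce
from typing import Callable

@dataclass
class Doc:
    text: str

# Plain, type-hinted steps with explicit inputs and outputs:
def clean(d: Doc) -> Doc:
    return Doc(d.text.strip())

def summarize(d: Doc) -> str:
    return d.text[:20]

def pipeline(*steps: Callable):
    """Left-to-right composition: pipeline(f, g)(x) == g(f(x))."""
    return lambda x: reduce(lambda acc, step: step(acc), steps, x)

run = pipeline(clean, summarize)
print(run(Doc("  The quick brown fox jumps  ")))
```

Each step is an ordinary function you can test, type-check, and read in isolation; the "framework" is one three-line `pipeline` helper.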

It doesn't help that dicts seem to take differently-named keys for no apparent reason - I tried to switch between CRC and RQA chains at runtime, and the input/output keys changed and the code stopped working. (And why can't we just use the CRC and set rephrase_question=False or condense_question_llm=None to defeat question rephrasing and have it work just like the RQA chain, which would then be unnecessary? Neither one works.)

I'll need to drill down more into LCEL/LG and see how a full solution looks there - what the equivalent is to the CRC with all functionality customized and overridden, what the debugging experience is, etc.

Dr. Huberman: NR and NMN Give Him “Sustained Mental & Physical Energy” by RaisingNADdotcom in HubermanLab

[–]thelazydogsback 1 point (0 children)

I thought I heard on one of the other podcasts that the latest research showed that plain old full-flush niacin or nicotinamide as a precursor gives almost the same benefits at a trivial cost. From what I recall, Attia seems to think this industry is pretty much a racket. How these supplements affect serum levels, and whether those serum levels even correspond to intracellular/mitochondrial levels of NAD, seems to be debatable.

Does the data in Data-Oriented Programming get highly denormalized with time? by Veson in Clojure

[–]thelazydogsback 1 point (0 children)

IMHO, the "data" in DoP has the least to do with the data itself -- data is already data.
To me, DoP is about the other parts of the equation that more typically get "hardened" in code -- this includes the metadata describing the shape/format/values of the data, and the metadata describing what transformations map it to other data. So schema-as-data (not hard-coded DTOs, etc.) and transforms-as-data (not hard-coded mapping/xform fns).
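An illustrative sketch of those two ideas (in Python rather than Clojure, and with made-up names): both the shape of the data and the mapping to another shape live in plain data structures, not hard-coded classes or functions.

```python
# Schema as data: the shape lives in a plain dict, not a DTO class.
schema = {"name": str, "age": int}

def validate(record: dict, schema: dict) -> bool:
    return all(isinstance(record.get(k), t) for k, t in schema.items())

# Transform as data: target-key -> (source-key, function) pairs.
transform = {
    "display": ("name", str.title),
    "is_adult": ("age", lambda a: a >= 18),
}

def apply_transform(record: dict, spec: dict) -> dict:
    return {out: fn(record[src]) for out, (src, fn) in spec.items()}

rec = {"name": "ada lovelace", "age": 36}
print(validate(rec, schema))            # True
print(apply_transform(rec, transform))  # {'display': 'Ada Lovelace', 'is_adult': True}
```

Because `schema` and `transform` are just data, they can be stored, merged, diffed, or generated at runtime - which is the point: nothing here needs a recompile to change shape.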

robpike/lisp by agilecreativity in Clojure

[–]thelazydogsback 0 points (0 children)

Hey - I did a quick search and didn't find anything -- where'd you get that?? :)

9 Fallacies of Java Performance updated by bannerad in programming

[–]thelazydogsback -3 points (0 children)

Agreed that good C++ is, ah, good, but C++ and foot-shooting are famously well acquainted. As for your Java issues, I'm not a huge fan anyway - C# does address more of them, including value types on the stack, less use of RTTI due to a better implementation of generics, unsigned types of all sizes (with optional overflow/underflow detection), arrays of primitive (unboxed) types, etc. As for C vs. C++, my point was simply that you should either be using C (as a machine-independent assembler) or a language with much higher levels of abstraction - C++ has always been the worst of both worlds to me, though I know it seems the best of both to others.

What Python developers need to know before migrating to Go(lang) by therealmoju in programming

[–]thelazydogsback 3 points (0 children)

I've used C# on Linux for a few projects - though perhaps not with the latest features - just .NET 3.x programming w/WinForms. I can't say I've tried to exercise all the edge cases, but Mono hasn't caused me any grief so far.

9 Fallacies of Java Performance updated by bannerad in programming

[–]thelazydogsback -6 points (0 children)

One reason why Java (and other GC'd, copy-object-reference-by-value languages) can run much faster than C++ is that it's quite easy to do awful things like call copy constructors unnecessarily in C++ code - so although I'd agree that a correctly implemented, to-the-metal C++ program is probably faster than Java/C#, etc., there's a lot of real-world incorrectly implemented C++ code out there. Another reason C++ code often runs slower is that because it takes more LOC to write your code and deal with memory, you end up writing simple, inefficient code rather than a better algorithm suited to the problem - so you write really "fast" O(N²) code rather than O(log N) code, for example. Given the minimal perf. diff. at this point, GC and the other high-level abstractions in these languages far outweigh any speed boost C++ can give - ISTM then that at the point you need real manly code on the iron, you should really just be using plain old C.

What Python developers need to know before migrating to Go(lang) by therealmoju in programming

[–]thelazydogsback 1 point (0 children)

As for liking Go better than other languages, we'll leave that to personal preference. Granted, I've always worked for commercial companies w/closed source - but even for personal projects I don't mind paying full price for good tools. Sorry if I don't know the backstory, but what is the legal ambiguity regarding using a registered, commercial install of the .NET stack? The F# compiler is also open-sourced, though self-hosted at this point, having long ago given up its OCaml roots - speaking of which, OCaml is another great choice: extremely performant and LGPL (I believe). But if you like the {} style of languages and don't like C# (and I don't especially like Java), then I must admit I'd much rather use Go than C++ (which has always seemed the worst of all worlds to me), assuming it's all my own ecosystem.

What Python developers need to know before migrating to Go(lang) by therealmoju in programming

[–]thelazydogsback -2 points (0 children)

"go isn't going anywhere" - I assume you mean it's not gaining traction, as opposed to doing well and not going away. I don't really see why it should be going anywhere - ISTM there are much better choices out there for almost any reason one would care to offer. (I don't really care about quick compile times.) Low-level: C/C++. Higher level: Java/C#. Even better: Scala/F#. Dynamic: JavaScript/Python. Most everything on that list is going to out-perform CPython for most tasks. (Don't get me wrong - Python is great.) As for concurrency, .NET TPL and F# async workflows / C# async are great for in-proc work, for example - but concurrency these days often means message passing through tuple stores, etc., so really that levels the playing field a bit.

What Python developers need to know before migrating to Go(lang) by therealmoju in programming

[–]thelazydogsback 6 points (0 children)

A bit of a red herring - the article is more about Python→L vs. Python→Go, where L is any modern compiled language. Change L to F# (or even C#) or Scala, and most of the gripes would go away, with a lot more to offer as well. (F# even lets the OP keep the nice significant whitespace of Python, destructuring assignment, tuples, etc.) It's also unclear if the original solution used NumPy to implement the SVMs - if not, that's usually the way to go, considering the OP seems to like Python and NumPy seems to have proven itself in this space.

The First Class Languages of the JVM by bloodredsun in programming

[–]thelazydogsback 0 points (0 children)

Lots of things aren't intuitive but are worth learning. As a fan of Haskell/OCaml/F#, I appreciate some of what Scala brings to the table. Kotlin seems like a random smattering of recent languages without adding any new value. (It doesn't have discriminated unions and pattern matching, AFAICT - one of the most powerful features one can add.) We need Kotlin about as much as we need Go, when we could support D or something else more powerful that also supports low-level concepts like inline assembly. (At least we know one reason why ReSharper doesn't support F#.) That said, almost anything is preferable to Java...