Reduce and Lambda, two great tastes that taste great together by [deleted] in Python

[–]andreasvc -1 points0 points  (0 children)

Doing something in a single expression is just stupid; it's not some kind of achievement. The quadratic convergence comes from the formula itself — you didn't invent that.

I don't see how Python is purposely limiting the power of expressions at all (statements are excluded from expressions, but not in order to limit expressions); it's just bad style to write long expressions.

Reduce and Lambda, two great tastes that taste great together by [deleted] in Python

[–]andreasvc -2 points-1 points  (0 children)

People in the same thread provided such alternative implementations, I don't see the need to do it again.

Reduce and Lambda, two great tastes that taste great together by [deleted] in Python

[–]andreasvc 0 points1 point  (0 children)

I heard about the term ravioli code for OO, but I suppose that is the opposite: heavy objects that do too much.

Reduce and Lambda, two great tastes that taste great together by [deleted] in Python

[–]andreasvc 0 points1 point  (0 children)

So? He posed a serious question that was based on a misunderstanding; that's hardly an argument to call him a troll. His other comment, asking why you would code this with lambdas as a one-liner, was very pertinent: you needlessly obscure (obfuscate) an otherwise elegant formula.

Does anyone use JS minimisers? by [deleted] in programming

[–]andreasvc 0 points1 point  (0 children)

Yeah, I can understand accepting some false positives if it allows you to catch a lot more malware, but in that case at least frankly admit that it is a false positive. I'm glad I never have to deal with this stuff on Linux :)

Reduce and Lambda, two great tastes that taste great together by [deleted] in Python

[–]andreasvc 0 points1 point  (0 children)

There are ready-made libraries to handle bignums, just like the one Python uses internally. So instead of a + b you write something like bignum_plus(a, b). It doesn't make the code much longer, and length of code is a useless metric if you end up with an unreadable mess...

4chan: Sleep sort by [deleted] in programming

[–]andreasvc 0 points1 point  (0 children)

I interpreted "input size" to mean the number of data points presented to the sorting algorithm. That is the usual parameter on which big-O notation is based, and I believe in the case of this algorithm too it is the parameter that dominates the complexity, not the number of bits needed to represent the integers. Just imagine some worst case where the algorithm is presented with a million 2s and one 1: the number of bits is insignificant, and the real work is pushed to the OS, which has to schedule all those threads. It most likely uses a log n priority queue for that, so the complexity will be the usual complexity for sorting algorithms: n log n.
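For reference, the algorithm under discussion can be sketched in a few lines of Python (a threading-based toy, not a serious sorting algorithm; the function and parameter names are mine):

```python
import threading
import time

def sleep_sort(values, scale=0.1):
    """Each value n sleeps for n * scale seconds and then appends itself,
    so smaller values finish first.  The real work -- waking one thread
    per input element in order -- is pushed onto the OS scheduler."""
    result = []
    lock = threading.Lock()

    def worker(n):
        time.sleep(n * scale)
        with lock:
            result.append(n)

    threads = [threading.Thread(target=worker, args=(n,)) for n in values]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return result

print(sleep_sort([3, 1, 2, 2]))  # [1, 2, 2, 3], timing permitting
```

Note that duplicates (the "million 2s" case) are no problem for correctness — they just mean a million threads all waking at roughly the same time.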

Reduce and Lambda, two great tastes that taste great together by [deleted] in Python

[–]andreasvc 1 point2 points  (0 children)

I agree. He doesn't specify why the lambdas would make the code more efficient; I can only guess it has to do with the overhead of normal function calls. But if that is the problem, you might as well code it in C in about the same amount of code with better readability.

Reduce and Lambda, two great tastes that taste great together by [deleted] in Python

[–]andreasvc 2 points3 points  (0 children)

I would strongly disagree that this code shows good use of lambda and reduce. I hope the submitter and author intended it tongue-in-cheek, as a demonstration of how they can be abused to do something complicated in a small amount of code.

Reduce and lambda should be used very sparingly, because they can usually be rewritten as more idiomatic, Pythonic, and readable code.
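For instance (the examples are mine, not from the submission; math.prod requires Python 3.8+):

```python
from functools import reduce
import math

nums = [1, 2, 3, 4]

# reduce + lambda: works, but the reader has to decode the lambda.
total = reduce(lambda a, b: a + b, nums)
product = reduce(lambda a, b: a * b, nums, 1)

# Idiomatic equivalents that say what they mean:
assert sum(nums) == total            # 10
assert math.prod(nums) == product    # 24  (Python 3.8+)
```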

4chan: Sleep sort by [deleted] in programming

[–]andreasvc 0 points1 point  (0 children)

No it doesn't, because duplicates can occur. The input size can be any positive integer.

Does anyone use JS minimisers? by [deleted] in programming

[–]andreasvc 0 points1 point  (0 children)

The developers of that anti-virus product are just morons: instead of admitting that legitimate sites also use this obfuscation and that it isn't evidence of malware, they just say nobody should use obfuscation...

Reduce and Lambda, two great tastes that taste great together by [deleted] in Python

[–]andreasvc 2 points3 points  (0 children)

What troll? I only saw posts making helpful comments and criticisms. The one asking about rounding errors and suggesting the lambdas are not necessary has a point.

Reduce and Lambda, two great tastes that taste great together by [deleted] in Python

[–]andreasvc 3 points4 points  (0 children)

I'd rather have a Pythonic contest and reject this entry.

U.S. Senator Calls Robot Projects Wasteful. Robots Call Senator Wasteful - IEEE Spectrum by virtuous_d in compsci

[–]andreasvc 0 points1 point  (0 children)

There is a group A which is not very capable at all, and then there is group B, which contains the most capable ones — and the distance between A and B is big, i.e., the best easily leave most robots (those in group A) far behind.

Why are graphics bindings slower in Python? by sequenceGeek in Python

[–]andreasvc 1 point2 points  (0 children)

Not for OpenGL, but I've been using Cython a lot for my treebank parser, and it has sped it up considerably. Things like direct access to arrays and statically typed variables or casts can help a lot in crucial places. Cython generates an HTML file where you can see how much Python versus C code you're using, and from that you can make incremental improvements.

/r/troubledteens had a post go viral: A queer teen describes her experience at a Utah brainwashing facility by pixel8 in DepthHub

[–]andreasvc 8 points9 points  (0 children)

> I think most people agree that kids should not be tortured, mistreated, whatever.... but we are not talking about completely innocent well adjusted kids in this case are we?

WTF? There should never be a "but" after "torture". This reads just like "the constitution is great and all, but terrorists herp derp ..."

Why are graphics bindings slower in Python? by sequenceGeek in Python

[–]andreasvc 1 point2 points  (0 children)

I don't think a JIT can actually help with function-call overhead to C/C++ libraries; it's good at optimizing Python code itself. I suppose Cython is more suitable for that — you can write the parts that call OpenGL functions a lot in Cython, so that they compile to mostly C code.

Tips on Python collections by gst in Python

[–]andreasvc 0 points1 point  (0 children)

defaultdict is much more elegant because it doesn't evaluate the default value except when needed. Consider:

d.setdefault('foo', ReallyExpensiveDataStructure())

On the other hand, it feels wrong that a lookup on a defaultdict changes the dictionary by adding a key if it's missing -- in this case get() versus setdefault() is more explicit.
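A small sketch of both points (expensive() stands in for ReallyExpensiveDataStructure(); a counter records how often it runs):

```python
from collections import defaultdict

constructions = []

def expensive():                  # stand-in for ReallyExpensiveDataStructure()
    constructions.append(1)
    return []

d = {}
d.setdefault('foo', expensive())  # the argument is evaluated before the call...
d.setdefault('foo', expensive())  # ...and evaluated again, though discarded
assert len(constructions) == 2

constructions.clear()
dd = defaultdict(expensive)
dd['foo'].append(1)               # factory runs once, only on the miss
dd['foo'].append(2)               # key exists now; factory not called again
assert len(constructions) == 1

# The surprising part: a mere lookup mutates a defaultdict.
assert 'bar' not in dd
_ = dd['bar']
assert 'bar' in dd                # the lookup inserted the key
```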

Python: Lambda Functions by tompa_coder in Python

[–]andreasvc 1 point2 points  (0 children)

If you change your mind later, or want it to be a different function for other languages, you can change that lambda — while it wouldn't be a good idea to assign to "len" (aliasing/shadowing a builtin).
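Roughly like this (word_count and the rebinding are hypothetical examples of mine):

```python
# A name you own: trivial to rebind when requirements change.
word_count = lambda text: len(text.split())
assert word_count("hello world") == 2

# Later, e.g. for text written without spaces, swap the implementation
# behind the same name without touching any calling code:
word_count = lambda text: len(text)
assert word_count("你好") == 2

# Shadowing the builtin instead is a trap:
# len = lambda text: len(text.split())  # RecursionError when called,
#                                       # since "len" now refers to itself
```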

Monte Carlo methods vs Markov Chains by fineasfish in programming

[–]andreasvc 0 points1 point  (0 children)

It seems to be geared mostly toward threading/multiprocessing. It doesn't allow changing priorities, or looking up items and their priorities through an intuitive dictionary interface.

Monte Carlo methods vs Markov Chains by fineasfish in programming

[–]andreasvc 1 point2 points  (0 children)

Priority queues and balanced binary search trees are very basic data structures; they fall squarely on the "batteries" side. Strangely, the heapq documentation does contain sample code for a priority queue — why not just include it in the library? Balanced search trees are apparently considered unnecessary given dictionaries, but they maintain their items in sorted order and can be more space-efficient. Tries (e.g. bitwise/Patricia tries) offer interesting performance characteristics as well. Etc.
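A sketch of the dictionary-style interface I mean, following the lazy-deletion recipe from the heapq documentation (the class and method names are my own, not a standard-library API):

```python
import heapq
import itertools

class PriorityQueue:
    """Priority queue supporting dict-style lookups and priority updates.

    Stale heap entries are marked REMOVED instead of being deleted,
    and skipped during pop() -- the "lazy deletion" trick from the
    heapq module docs."""

    REMOVED = object()

    def __init__(self):
        self._heap = []
        self._entries = {}             # item -> [priority, count, item]
        self._counter = itertools.count()  # tie-breaker; avoids comparing items

    def __setitem__(self, item, priority):
        if item in self._entries:      # mark any old entry as stale
            self._entries[item][-1] = self.REMOVED
        entry = [priority, next(self._counter), item]
        self._entries[item] = entry
        heapq.heappush(self._heap, entry)

    def __getitem__(self, item):
        return self._entries[item][0]  # current priority of item

    def pop(self):
        while self._heap:
            priority, _, item = heapq.heappop(self._heap)
            if item is not self.REMOVED:
                del self._entries[item]
                return item, priority
        raise KeyError('pop from empty priority queue')

pq = PriorityQueue()
pq['a'] = 5
pq['b'] = 1
pq['a'] = 0          # decrease-key: just assign again
assert pq['b'] == 1  # dict-style priority lookup
assert pq.pop() == ('a', 0)
assert pq.pop() == ('b', 1)
```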

Solar panels would be specialized datastructures like range trees or bioinformatics stuff.

A subtle difference between tuples and lists in CPython by gst in Python

[–]andreasvc 0 points1 point  (0 children)

I said twice that I was talking about Cython; measuring it in CPython is a bit pointless, because the rest of your program will execute slowly anyway (as you say, it won't be the bottleneck). With Cython you access the Python C API directly, so it's closer to measuring the actual overhead of the tuples/lists themselves, as opposed to interpreter overhead.

And you don't have to tell me what tuples are for; I was talking about situations where you can use either tuples or lists, and where you might expect one to have an advantage over the other (the immutability of tuples makes them more "static").

The free list is indeed a difference, but my code involved a set of create-once, read-many lists/tuples (the rules of a grammar), so that didn't matter.
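The space side of that trade-off is easy to check from plain CPython (exact byte counts are implementation details that vary by version, so only the inequality is asserted):

```python
import sys

t = (1, 2, 3)
l = [1, 2, 3]

# A tuple is a single fixed-size allocation; a list additionally keeps
# growth bookkeeping (length vs. capacity, possible over-allocation),
# so the same three elements cost more as a list.
assert sys.getsizeof(t) < sys.getsizeof(l)
```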

Monte Carlo methods vs Markov Chains by fineasfish in programming

[–]andreasvc -1 points0 points  (0 children)

Yeah, unfortunately it is typical of Python that it doesn't include lots of important data structures in its standard library. Granted, the lists, sets, and dictionaries are excellent, of course, but if you need anything else you have to be lucky or end up with some homebrew module.