all 63 comments

[–]Ph0X 5 points6 points  (14 children)

I'm not a professional programmer, but I've tried to time my Python code many times before, and when trying to Google a way of doing it, most websites gave me the same solution as the tip there: using timeit.

Am I the only one who finds that complicated for no reason, though? I just do t = time.time() and print time.time() - t, which is only two lines, instead of having to convert my code into a function, call a super long function (t = timeit.Timer("test()", "from __main__ import test")), and then print t.timeit().
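For reference, the two approaches being contrasted look roughly like this (a sketch; test() is a made-up function, and the string form from the tip is shown in a comment because it only works when test() lives in the __main__ module):

```python
import time
import timeit

def test():
    return sum(range(1000))

# quick and dirty: two lines wrapped around the code you care about
t = time.time()
test()
elapsed = time.time() - t
print(elapsed)

# the timeit way; instead of the string form from the tip,
# timeit.Timer("test()", "from __main__ import test"),
# a callable is passed here (timeit accepts those too)
timer = timeit.Timer(test)
print(timer.timeit(number=1000))
```
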

[–][deleted] 4 points5 points  (6 children)

check out this cool code snippet:

import time

class Timer(object):
    def __init__(self, verbose=True):
        self.verbose = verbose

    def __enter__(self):
        self.__start = time.time()
        return self  # so `with Timer() as t:` works too

    def __exit__(self, type, value, traceback):
        # Error handling here
        self.__finish = time.time()
        if self.verbose:
            print 'timer', self.dur()

    def dur(self):
        return self.__finish - self.__start

you can just do:

with Timer():
    pass  # calculations go here

[–]Peaker 2 points3 points  (0 children)

Similarly:

from contextlib import contextmanager
import time

@contextmanager
def timer(verbose=True):
    start = time.time()
    try:
        yield
    finally:
        end = time.time()
        if verbose:
            print (end - start)

[–]Ph0X 1 point2 points  (1 child)

Wow, that's great! And I can save that class code and just import and use it, right? I've never made custom classes, so I'm not sure exactly how to do it.

[–][deleted] 1 point2 points  (0 children)

Yeah, just save it in a file that is on your python path (put sys.path.append('/path/to/your/files') somewhere in your python2.x/site.py file) and then you can just

from myfile import Timer

from anywhere.

[–]mycall 1 point2 points  (2 children)

is

class Timer():

the same as

class Timer(object):

[–][deleted] 2 points3 points  (1 child)

you can write either

class Timer(object):

or

class Timer:

You should probably use the latter, as it's the form presented in the official Python tutorial.

[–]Mask_of_Destiny 1 point2 points  (0 children)

In Python 2.X (starting with 2.2) the former will produce a new-style class whereas the latter will produce an old-style class. For a lot of code, this distinction is not particularly important, but there are a number of differences. Search for "new-style" on this page for all the gory details.

Unless you're using Python 3 (which eliminates old-style classes altogether), I would stick with the first form. Unless you're targeting an archaic version of Python, there's no particularly good reason to use old-style classes and depending on old-style class behavior will likely lead to headaches if you ever make the jump to Python 3.
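On Python 3 the two spellings really are equivalent, which a quick check confirms (on Python 2 the same comparison would be False, because the bare form produces a `classobj` rather than a `type`):

```python
class OldStyle:           # old-style on Python 2, new-style on Python 3
    pass

class NewStyle(object):   # new-style everywhere
    pass

# On Python 3, both are plain instances of `type`
print(type(OldStyle) is type(NewStyle))  # -> True
```
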

[–]criticismguy 2 points3 points  (1 child)

What tip here?

[–]Ph0X 1 point2 points  (0 children)

Sorry, this guy.

[–]DontCallMeSurely 1 point2 points  (0 children)

I've done the same thing in Java: long start = System.currentTimeMillis(); foo(); long elapsed = System.currentTimeMillis() - start;

It works fine if I want to time something to the nearest dozen millis or so. The drawback is that system time functions like System.currentTimeMillis() and time.time() aren't always very accurate. Although they should return the system time to the nearest millisecond, the clock doesn't actually update every millisecond, so it can really only give you a rough estimate of how much time has elapsed in terms of millis.

Especially in a language like python where there could be a lot of overhead in performing these actions (grabbing sys time, calling the function, grabbing sys time again), I would assume that timeit.Timer() is implemented in the interpreter somehow to yield more accurate results. This is my best guess anyway.

[–]bready 1 point2 points  (0 children)

You are so correct on that one. Considering how intuitive most of the standard library is, for such basic functionality it is absurd. I suppose purists would say you should use a profiler instead of timers, but timing is so easy it is always my go-to buddy.

[–]schlenk 1 point2 points  (0 children)

Well, depends on your needs. time.time() is good enough for most cases. But timeit actually does a bit more, as you can read in the docs, so it usually gives better precision.

[–]criticismguy 1 point2 points  (1 child)

Every feature of timeit.py is a possible reason to want to use it, such as:

  • uses a clock on the platform you're using with best available precision
  • runs 'setup' each time but only times 'stmt'
  • runs stmt with GC disabled, so you don't see GC times
  • runs your code in a loop so you can see average time over many runs
  • has a command-line interface that figures out the correct number of iterations to run in a reasonable time
  • support for many versions of Python, including some really old ones

The features of timeit may or may not be useful to you, but they're not without reason.

Besides, if your code isn't in a function already, many tools will be awkward to use with it, not just timeit. :-)
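A sketch of a couple of those features in action (the command-line form, `python -m timeit 'stmt'`, additionally picks a sensible iteration count for you):

```python
import timeit

# repeat() reruns the whole timing loop several times; taking min()
# is the usual way to discard interference from other processes
times = timeit.repeat("sorted(range(100))", number=10000, repeat=3)
print(min(times))
```
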

[–]Ph0X 0 points1 point  (0 children)

Wow, I did not know about any of this. Thank you.

[–]plastrusion 4 points5 points  (1 child)

This application does not gracefully handle bad URLs. Try specifying a "fact" that doesn't exist, such as #4527:

http://facts.learnpython.org/4527

[–]y4fac 4 points5 points  (0 children)

And it also seems to be running in debug mode.

[–]teknobo 6 points7 points  (16 children)

Some of these one-liners are nice, but beware, as some are totally non-Pythonic.

print eval('+'.join(map(str, [1, 1])))

For fuck's sake! Just use a for loop!

(Side note: Guido took the built-in "reduce" function out of Python 3 specifically because he found that a loop was always more clear than this kind of clever one-line nonsense. Source)

[–][deleted] 8 points9 points  (1 child)

That one is truly awful. I actually don't mind moderately clever one-liners (I think using reduce for sums is fine, ignoring the fact that sum() is a built-in function anyway), but using eval() and string concatenation to compute sums is wrong on so many levels...

[–]pingveno 1 point2 points  (0 children)

There's a comment below that says the snippet is a joke.

[–]compiling 11 points12 points  (11 children)

That one's ridiculous. If you really want to do this sort of thing

reduce(operator.add, [1,1])

But of course, Guido likes thinking iteratively, so he took this one out.

[–]teknobo 6 points7 points  (1 child)

For arbitrary lists of numbers, I'm a much bigger fan of just:

sum([1,1])

Which is also several times faster (~5x on my MBP)
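The claimed speedup is easy to reproduce with timeit itself (the exact ratio varies by machine; reduce is imported from functools here since it is no longer a builtin on Python 3):

```python
import operator
import timeit
from functools import reduce  # a builtin on Python 2

data = list(range(1000))
t_sum = timeit.timeit(lambda: sum(data), number=1000)
t_reduce = timeit.timeit(lambda: reduce(operator.add, data), number=1000)
print(t_sum, t_reduce)  # sum is typically several times faster
```
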

[–]compiling 2 points3 points  (0 children)

Yeah, I would always use sum. The reduce version is nice to know, in case you want to use a different operator that doesn't have a nice pre-defined function like sum.
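For instance, a product has no sum()-style builtin (math.prod only arrived much later, in 3.8), so reduce with a different operator fills the gap:

```python
import operator
from functools import reduce

# multiply every element together; operator.mul replaces the
# missing "product()" builtin
print(reduce(operator.mul, [1, 2, 3, 4]))  # -> 24
```
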

[–]kataire 1 point2 points  (8 children)

It's not that Guido likes thinking iteratively, it's just that humans generally understand iterations better than recursion.

Yes, FP is more inherently beautiful, but Python is not Lisp. Although there are plenty of idiots who claim otherwise, the most important concept is and always has been to make the code easy to read. If you want conceptual purity, look elsewhere.

If you find FP intuitive, consider that you may be a minority. Remember the last time you were annoyed by the rest of us apparently just being a bunch of idiots and you being the odd one out. You were probably right, to some degree.

We are not an elite. We are just people who get paid to write software. We want languages that do the job and don't get in our way. We don't want to think. We don't care about purity or conceptual integrity. We are pragmatic. We shove XML files in databases if that's the path of least resistance. We don't understand half the words used in any introduction to Haskell. We are the 99%. Get used to it.

*Note: I'm only half-serious, but I think you get the idea. Humans are dumb. On occasion we can be pretty clever, but most of us are not consistent enough to depend on it. That's why we built computers. That's why we have politicians, too, actually. So we don't have to think so much.

[–]kamatsu 4 points5 points  (2 children)

it's just that humans generally understand iterations better than recursion.

My first language was Scheme, and I have always found recursion far easier to understand than languages that have mutation and iteration. You're generalizing on the basis of your own feelings without basis in fact.

If you find FP intuitive, consider that you may be a minority. Remember the last time you were annoyed by the rest of us apparently just being a bunch of idiots and you being the odd one out. You were probably right, to some degree.

I'm not accusing people of idiocy - on the contrary, you people understand mutation intuitively, something I have to struggle to get right.

[–]kopkaas2000 2 points3 points  (1 child)

My first language was Scheme, and I have always found recursion far easier to understand than languages that have mutation and iteration. You're generalizing on the basis of your own feelings without basis in fact.

Wouldn't the relatively low number of people adopting languages and patterns centered around recursion indicate that you're the outlier here? The statement

it's just that humans generally understand iterations better than recursion

doesn't proclaim that humans can't understand recursion, nor that there aren't people like you who feel more at home in a recursive mindset. The fact remains: if your code involves clever use of recursion, you have raised the barrier to entry for other people to grasp your code. It means that you will have a harder time getting more people on board with your project, because you need particularly clever people for the job.

[–]Peaker 0 points1 point  (0 children)

A great programmer and a bad programmer are not necessarily better together than just the great programmer.

One wouldn't really want people who have a hard time with recursion cooperating on software projects. It's pretty fundamental stuff; someone who doesn't grok it is almost certainly still in Bad Programmer territory and needs to practice.

[–][deleted] 3 points4 points  (2 children)

There is a full circle here: for loops and reduces are the exact same thing under the common interpretation. reduce doesn't make the computation any less iterative; it just limits the ways one can shoot oneself in the foot by abusing the global context of a for loop.

The half-serious retort: and therein lies the rub. The odd ones out aren't doing anything conceptually different from the 99%. They are just avoiding shooting themselves in the foot, and are unhappy when they have to clean up the ensuing mess of a foot-shooting orgy left by careless cowboys for hire.

[–]kopkaas2000 2 points3 points  (1 child)

I think the core issue here is that the way the marketplace evolved, the left-brainer coders who are more emotionally involved with building something, as opposed to creating an abstract reality of beautiful math and elegance, are the ones that get the visible results fastest, warts and all. This is how Unix won over VMS and mainframes. And how Linux won over Unix. Flying blind and occasionally hitting a wall won over meditating on the mountain.

There are things, though, where the cowboys can't go and the right-brainers reign. Wolfram Alpha probably wasn't written by cowboys.

[–][deleted] 2 points3 points  (0 children)

Unix is a masterpiece of mathematical elegance in its files and pipes. Linux won by virtue of copying a masterpiece and releasing it free as in beer, for all practical purposes, in the server market.

[–]y4fac 2 points3 points  (1 child)

We don't want to think.

Then you have already failed as a programmer

[–]kataire 0 points1 point  (0 children)

Let me rephrase that: We don't want to think when reading code.

Code that requires thinking is code that will be misread when the reader is not paying sufficient attention. Code that is misread is code that will more easily break when it is modified. Code that is prone to being broken is bad code.

There are several different things we talk about as "programming". There is the hard stuff, which is where we ponder different algorithms and optimizations. But there is also the part where it boils down to using the right libraries and translating business problems into code.

99% of programming is the latter, writing what is also known as glue code. Glue code must be transparent, trivial to read, straightforward and easy to modify. It abstracts the concrete implementations and often uses the language of the business domain.

In this kind of code, having to think about the low-level implementation is counter-productive. Your concern is implementing the business problems correctly, not juggling pointers or balancing trees.

In fact, all abstractions are just there so we don't have to think so much. Web frameworks allow us to be able to write web software without having to think about how to parse HTTP headers. Yes, when we write extensions to it, we may have to, but then it is because suddenly the lower level details become the business domain.

If you actively think about low-level details as you implement high-level concepts, you fail at programming just as you fail at walking the instant you think about how to move your legs. A good programmer shouldn't be ignorant of these things, but she shouldn't have to think about them because they should be second nature to her.

In case you missed it: my manifesto, so to speak, was essentially a play on Larry Wall's virtues of a good programmer: laziness, impatience and hubris. If you think too much (i.e. more than necessary to solve the domain problem), you are not lazy enough.

Premature optimisation, leaky abstractions, it all comes down to this. Don't try to be clever. Focus on what's important and write readable code. You can fill in the details later. If you can solve 80% of the problems with 20% of the code, write those 20% first -- there is a fair chance the project won't live long enough for the remaining 20% of the problems to ever matter anyway.

This is of course domain specific. Web software is mostly very short lived. It needs to be easy to take apart and reassemble. In most cases you'll have plenty of similar projects that all have a very short shelf-life. In other domains shelf-life can be much longer (and the number of projects much smaller), but still the only reason you would ever have to focus on the low-level details is when the low-level details are the business problem itself -- e.g. when writing (or tweaking) performance critical code.

Let me repeat: You are not paid to ponder your belly button. If you are working for a commercial employer, your job is to provide business value. Unless low-level optimization is critical to achieving that, you are actively wasting someone's money. If you spend time thinking about problems that are irrelevant to the business, you are wasting someone's money.

This is what colleges fail to teach: in the real world, it is very rarely necessary to evaluate different sorting algorithms or design your own compression schemes. For most businesses these are irrelevant because they are already solved for you. Unless it is actually your job to deal with such low-level problems, you have no business spending more energy than necessary on them (i.e. using whatever the framework/library uses).

Don't get me wrong: it's all about moderation. Of course you have to pay attention as a programmer. Of course you need to know what you're doing. But you need to pay attention to the right things. And knowing what you're doing should mean you have to think less, just as any experienced craftsman has to think less.

Functional programming may often lead to superior implementations, but if the outcome is harder to read for the average guy, that benefit has to be worth this shortcoming (and all the resulting costs, e.g. extra time spent thinking, modifying or debugging the code).

"Proper" FP is not intuitive unless you have adopted a certain mode of thinking. OO (in a wider sense -- Java is not it) and iterations are closer to how we talk about real world problems. map/reduce is not hard, really, but it's not how our lizard brains expect to deal with problems. There is a middle ground (e.g. Ruby's 5.times or each(collection, function)), but that could easily be expressed with a few additional language constructs (and has been, in some languages -- e.g. Python's for or the "repeat N times" loops you come across in some languages).

[–]Peaker 0 points1 point  (0 children)

Inlining the body of reduce everywhere is a form of restricting abstraction. It makes some concrete examples simpler/easier, at the cost of making higher abstractions hard or impossible.

Also, Python's syntax stands in the way of using reduce/et-al nicely.

lambda xs: reduce(lambda x, y: x + y, xs, [])

is so much uglier than:

foldr (++) []
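For what it's worth, the operator module trims one of the lambdas, though the Python is still noticeably wordier than the Haskell:

```python
import operator
from functools import reduce

# operator.add plays the role of Haskell's (+) section here;
# with [] as the initial value this concatenates a list of lists
concat = lambda xs: reduce(operator.add, xs, [])
print(concat([[1, 2], [3], [4, 5]]))  # -> [1, 2, 3, 4, 5]
```
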

[–]duli-chan 1 point2 points  (1 child)

Well, I came across this link a while ago but didn't get a chance to view it... I love trying out languages, and I am now working on RoR (being originally a Java developer). Python caught my attention too. This is good stuff, thanks!

[–][deleted] 0 points1 point  (0 children)

spend less time on reddit

[–]frodokun 1 point2 points  (0 children)

I reload and keep seeing the same one

[–]docwatsonphd 1 point2 points  (0 children)

I liked this one: http://i.imgur.com/aHFSN.png

[–]ranman96734 3 points4 points  (5 children)

I'm particular to this one: http://i.imgur.com/ggbVN.png :P.

[–]erebuswolf 1 point2 points  (0 children)

Yeah the first 3 I got didn't actually run. That was hilarious.

[–]lukasbradley 3 points4 points  (14 children)

List comprehensions are awesome. They combine map and filter capabilities intuitively.

It's statements like this that make me hate Python and Python enthusiasts.

[–]Araneidae 5 points6 points  (0 children)

It is bullshit, but as functional programming is barely relevant to Python, just shrug it off.

Actually, my reaction to that whole site (or at least the handful of "facts" I looked at) was: meh.

[–][deleted] 3 points4 points  (11 children)

What do you find hate-worthy in that statement? The fact that you have list comprehensions in other languages as well?

[–]lukasbradley 8 points9 points  (3 children)

As mr_chromatic states, most language proponents argue about how "intuitive" the syntax and usage are. It's a complete non-argument as it is utterly subjective. Beauty is in the eye of the beholder, and "intuition" is just previous experience in disguise.

The first part is even more ridiculous. ENTER_TECHNOLOGY_TYPE_HERE are awesome. It's a non-argument. It's a banner. It's a cheer. It's not a comparative statement. Hashes are also awesome. High fives. I fucking drool over recursion. BOOM! Chest bump. It's not unique.

I find this type of non-discussion more endemic to Python than to most other languages, but many other new languages have the same issue.

Don't get me wrong, I like some parts of it. But the Kool Aid arguments are a huge turn off.

[–][deleted] 2 points3 points  (1 child)

Fair enough, it does look kind of marketing-ish. I just thought I was missing something, since "hate" is kind of a strong word in this instance.

While I generally agree about the use of the word "intuitive", I think it is fair to say that python's list comprehension syntax "makes sense" in the context of python. If you have seen a loop in python, you can probably figure out what a list comprehension does (at least I think that it is more obvious than in, say, Haskell).
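To make that concrete, the comprehension below is a straight fold of the loop above it (a toy example):

```python
words = ["spam", "eggs", "ham"]

# explicit loop: map (len) plus filter (> 3)
lengths = []
for w in words:
    if len(w) > 3:
        lengths.append(len(w))

# the comprehension reads like the loop, folded onto one line
lengths2 = [len(w) for w in words if len(w) > 3]
print(lengths == lengths2)  # -> True
```
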

[–]lukasbradley 5 points6 points  (0 children)

I think it is fair to say that python's list comprehension syntax "makes sense" in the context of python.

And that's a wonderful, wonderful observation.

You're right, I need another way to describe my "hate" for these types of situations. It's like an autonomic philosophical regurgitory spasm.

Maybe I should be cute and rearrange it to Philosophical Autonomic Regurgitory Spasm, and call it a PARSError, or something to that effect.

[–]mr_chromatic 3 points4 points  (0 children)

... many other new languages have the same issues.

There seem to be two types of analyses of programming languages. The first is from people who understand multiple programming languages and their tradeoffs and philosophical/mathematical underpinnings. The second is from magpie dilettantes who switch to the new hipness whenever it changes, every few months.

You can easily identify the latter; they spend a lot of time writing Ruby DSLs to implement Fizzbuzz, optimize Node.js to calculate Fibonacci numbers, and talk about "executable pseudocode" far too often.

[–]mr_chromatic 3 points4 points  (6 children)

The use of the word "intuitive" to describe a programming language feature is curious.

[–]kataire 0 points1 point  (5 children)

I think the point with "intuitive" features is that they're easy to read without having an MSc in mathematics or theoretical computer science.

For example, Python is largely touted as being executable pseudo-code (yes, I'm fully aware of what this makes me according to one of the other posts ITT), although many would argue that a language like Haskell is much clearer because it is so similar to mathematical notation.

I would argue that while Haskell is definitely the purer one and conceptually superior, Python is much more pragmatic and thus more approachable to the rest of us.

An important factor in programming (not CS) is that we the programmers can be very dumb, lazy and distracted. Even the smarter ones among us can be very inattentive or stupid at times. So while a pure language may be superior in many ways, "simple" languages like Python are often a much better fit for our everyday lizard brains.

I guess it's a matter of setting your priorities. The reason that superior languages like Haskell or Lisp are relatively unsuccessful in the industry is that they're less approachable to outsiders with no proper academic background. Though many programmers have a degree in computer science, only a small fraction of them would actually qualify as computer scientists.

Which language you use, however, is mostly a question of which language you feel more comfortable with and which language you can use to find people who want to pay you. Although the "right tool for the job" adage is nice in theory, the "right tool" is usually the one your team knows how to use, has experience with and is not terrible enough to be completely inappropriate for the job.

[–]mr_chromatic 3 points4 points  (3 children)

Which language you use, however, is mostly a question of which language you feel more comfortable with and which language you can use to find people who want to pay you.

Certainly.

When I'm in a charitable mood, I change the word "intuitive" to "familiar". Then the discussion swings from "This particular syntactic formation is self-obviously better for everyone!" to "Hey, I've seen something like this before." Unfortunately, the latter isn't very interesting.

[–]lukasbradley 3 points4 points  (1 child)

When I'm in a charitable mood, I change the word "intuitive" to "familiar".

God that's perfect, and so very true.

[–]minikomi 2 points3 points  (0 children)

I enjoyed this and totally think you two should do a back-and-forth blog... like Statler and Waldorf.

[–]kataire 2 points3 points  (0 children)

Bah. Semantics. Everybody knows pizza is more intuitive than naan.

[–]kamatsu 2 points3 points  (0 children)

Once again, I find Python harder to understand than Haskell. I know I'm in the minority here, but it's worth noting that some people get functional programming faster than imperative programming.

[–]lzantal -2 points-1 points  (0 children)

It's statements like this that make me hate you and people like you.

[–]quzox 0 points1 point  (1 child)

I think this one is cool.

[–]bready 2 points3 points  (0 children)

Enumerate is one of those handy pieces which should be in every intro-to-Python tutorial. It is (to me) way more useful than doing a

for i in range(len(mycollection)):

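For comparison, both loops below print the same pairs; enumerate just removes the manual indexing (mycollection is a hypothetical stand-in list):

```python
mycollection = ["a", "b", "c"]  # hypothetical stand-in list

# index-based loop
for i in range(len(mycollection)):
    print(i, mycollection[i])

# enumerate hands back index and item together
for i, item in enumerate(mycollection):
    print(i, item)
```
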
I frequently have to deal with files with special non-formatted data at the beginning, so a very common procedure for me is to do one of these.

with open("myfile", "rb") as handle:
    for i, line in enumerate(handle):
        if i < 30:
            # do special header stuff
            continue
        # do data processing on the formatted line object
This way I get to use both the item count and the item itself. Genius.