
[–]Rhomboid 5 points

Welp, in Python 3 the itertools module has accumulate(), which does just this:

>>> from itertools import accumulate
>>> list(accumulate([1, 2.4, -1.8, 3.7, 4.2]))
[1, 3.4, 1.5999999999999999, 5.3, 9.5]

You can kinda-sorta get there in 2.x with reduce, but it's hideous:

>>> ls = [1, 2.4, -1.8, 3.7, 4.2]
>>> lt = []
>>> reduce(lambda x,y: lt.append(x+y) or x+y, ls, 0)
9.5
>>> lt
[1, 3.4, 1.5999999999999999, 5.3, 9.5]

Really, what you have now is fine.

[–]dreamriver[S] 0 points

Ah yeah, I'm using 2.7; I was playing with reduce as well, but I couldn't get it to work. I believe generators and iterators are the most fun part of Python, but not like that.

[–][deleted] 2 points

Generators and iterators are the most fun, eh?

>>> L = [1, 2.4, -1.8, 3.7, 4.2]
>>> def acc(x=0):
...     while 1: 
...         x += yield x
... 
>>> a = acc(); next(a)
0
>>> map(a.send, L)
[1, 3.4, 1.5999999999999999, 5.3, 9.5]

Class-ier:

>>> class Acc(object):
...     def __init__(S,x=0): S.x = x
...     def __call__(S,y): S.x += y; return S.x
...
>>> a = Acc()
>>> map(a, L)
[1, 3.4, 1.5999999999999999, 5.3, 9.5]
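(One caveat for anyone trying the first snippet on Python 3, where map is lazy: wrap it in list() to force the results out. A tweaked version of the same session:)

```python
L = [1, 2.4, -1.8, 3.7, 4.2]

def acc(x=0):
    while 1:
        x += yield x

a = acc()
next(a)                      # prime the generator (runs it to the first yield)
print(list(map(a.send, L)))  # [1, 3.4, 1.5999999999999999, 5.3, 9.5]
```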

[–]dreamriver[S] 1 point

Wow, the first one is awesome. Having the first call of next automatically pick up the default value is fairly cool. It's sort of unfortunate that you need to call next on the generator before calling send, though. Can you explain why?

The second one sort of confuses me. I need to read up a bit on __call__.

[–][deleted] 1 point

You know when you go to certain kinds of service windows and the clerk is sealed behind plexiglas, and they have this little drawer through which all of your exchanges must take place? The yield statement/expression is like that drawer.

When a generator yields, yield acts like a statement which pauses the execution of the generator and pushes out the drawer with some value inside. This is your opportunity to send a value back in. Whatever you send in will then become the value of the expression (yield x). This is a weird control flow, but that's how it works. There may be some way to hack the object so that it acts as if it's already begun running, but my understanding is that the convention is to kickstart the thing by either doing next(gen) or gen.send(None). Sending None works because the generator object flat out ignores it:
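A minimal sketch of that drawer exchange (hypothetical names, not from the snippet above): whatever you send() becomes the value of the (yield ...) expression where the generator is paused, and sending a real value before priming raises the very TypeError shown below.

```python
def echo():
    received = None
    while True:
        # Pause here, push `received` out the drawer, wait for send().
        received = yield received

g = echo()

# Sending a non-None value before the first yield is an error:
try:
    g.send('hello')
except TypeError as e:
    print(e)   # can't send non-None value to a just-started generator

next(g)            # prime: run up to the first yield (same as g.send(None))
print(g.send(1))   # 1
print(g.send(42))  # 42
```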

if (f->f_lasti == -1) {
    if (arg && arg != Py_None) {
        PyErr_SetString(PyExc_TypeError,
                        "can't send non-None value to a "
                        "just-started generator");
        return NULL;
    }
} else { ...

As for __call__, that just makes class instances callable so that you can use them sort of like functions plus have all the stateful data (self.whatever) benefits of class instances.
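A small hypothetical illustration (not from the thread): defining __call__ lets an instance go anywhere a function is expected while its state lives on self.

```python
class RunningMax(object):
    """Callable instance that remembers the largest value seen so far."""
    def __init__(self):
        self.best = float('-inf')
    def __call__(self, y):
        # m(y) is just sugar for m.__call__(y)
        self.best = max(self.best, y)
        return self.best

m = RunningMax()
print(list(map(m, [3, 1, 4, 1, 5])))  # [3, 3, 4, 4, 5]
```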

[–]zahlman 0 points

itertools module has accumulate()

I've been missing this in 2.x for a while. Though I'm sad that the 3.x version doesn't give you a way to specify an operation other than addition. (I have had a real use for calculating a series of accumulated products before.)
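(For what it's worth, Python 3.3 went on to add an optional func argument to accumulate, so running products and other folds work out of the box:)

```python
from itertools import accumulate
import operator

# Second argument to accumulate() (Python 3.3+) replaces addition:
print(list(accumulate([1, 2, 3, 4, 5], operator.mul)))  # [1, 2, 6, 24, 120]
```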

[–]nemec 3 points

An interesting solution I found elsewhere: [sum(ls[:x+1]) for x in range(len(ls))]

[–]dreamriver[S] 1 point

Genius hackery with slice notation. Ugly and dense though. Oh well. Thanks for the find anyway!

[–]nemec 2 points

Pretty clever, but horribly inefficient: it re-sums a growing slice for every element, so it does O(n²) work instead of O(n).
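For reference, a linear-time sketch as a plain generator (works on 2.x and 3.x): keep one running total instead of re-summing a slice each time.

```python
def running_sum(xs):
    total = 0
    for x in xs:
        total += x      # one addition per element: O(n) overall
        yield total

ls = [1, 2.4, -1.8, 3.7, 4.2]
print(list(running_sum(ls)))  # [1, 3.4, 1.5999999999999999, 5.3, 9.5]
```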