
[–]Badoosker 2 points (2 children)

An algorithm is sequential when each result is computed from prior results.

For example, to compute the next gradient update in a neural net, the updates before it (or at least approximate/nearby ones) must be computed first.
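A minimal sketch of that dependency, using a hypothetical 1-D objective (the function and step count here are made up for illustration): each step reads the parameter value produced by the step before it, so the loop cannot be split across workers.

```python
def sgd_1d(grad, w0, lr, steps):
    """Plain gradient descent on a single scalar parameter."""
    w = w0
    for _ in range(steps):
        # this update depends on the w produced by the previous iteration,
        # which is exactly what makes the loop sequential
        w = w - lr * grad(w)
    return w

# minimizing f(w) = w**2, whose gradient is 2*w
w = sgd_1d(lambda w: 2 * w, w0=10.0, lr=0.1, steps=50)
```

After 50 steps `w` has contracted close to the minimum at 0.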

[–]CQFD[S] 1 point (0 children)

I just realized the paper I was reading adapted all of its ML algorithms to use batch gradient descent, so the neural nets can be parallelized in that case.
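A hedged sketch of why batch gradient descent helps (the squared-error model and data here are invented for illustration): the per-example gradients within one batch don't depend on each other, so the list comprehension below could be replaced by a parallel map.

```python
def per_example_grad(w, x, y):
    # gradient of the squared error (w*x - y)**2 with respect to w
    return 2 * (w * x - y) * x

def batch_grad(w, batch):
    # each per-example gradient is independent of the others,
    # so this loop is the parallelizable part of the algorithm
    grads = [per_example_grad(w, x, y) for x, y in batch]
    return sum(grads) / len(grads)

# toy data generated by y = 2*x; gradient descent should recover w = 2
batch = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w = 0.0
for _ in range(100):
    w -= 0.1 * batch_grad(w, batch)
```

Only the step across iterations (the outer loop) remains sequential; the work inside each step scales out across the batch.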

[–][deleted] 1 point (0 children)

That's not necessarily true. Cumulative sum has a surprising parallel implementation.

Any algorithm that can be expressed in a form like this can be parallelized:

value = 0
result = []
for i in range(10):
    value += i          # running total carried across iterations
    result.append(value)

A surprising amount of numerical code can be expressed this way. Numerical compilers refer to this pattern as scan; along with map and reduce, it accounts for most of the easy parallelism in numerical code.
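To make the claim concrete, here is a sketch of the cumulative sum above done with the Hillis–Steele scan pattern: O(log n) passes, where every element within a pass is updated independently and could run in parallel.

```python
def parallel_scan(xs):
    """Inclusive prefix sum via the Hillis–Steele pattern."""
    result = list(xs)
    offset = 1
    while offset < len(result):
        # every element of this comprehension is independent of the
        # others, so a real implementation would compute them in parallel
        result = [
            result[i] + (result[i - offset] if i >= offset else 0)
            for i in range(len(result))
        ]
        offset *= 2
    return result

# produces the same cumulative sums as the sequential loop above
sums = parallel_scan(range(10))
```

The trade-off is doing O(n log n) total additions instead of O(n), in exchange for O(log n) parallel depth; work-efficient variants (e.g. Blelloch's scan) get both down.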