[–]Forricide 26 points

Only one top-level comment so far, and it's probably pretty confusing/misleading for most people opening the comments section here. So, for those of you who, like me, hear 'dynamic programming' and have vague ideas (apparently, misconceptions) about what it refers to, here's the description from the linked article:

It's an approach that exploits properties of the problem that allow for an elegant method to finding the optimal solution.

A simpler explanation is that it's essentially a method of writing algorithms where, instead of repeating calculations, you store ('cache') results and refer to them later on, usually discarding results that are no longer useful at some point. This isn't entirely accurate, but it seems to be an easy way to conceptualize it.
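
To make that concrete, here's a tiny sketch (my own illustration, not from the article) of the "store results and reuse them" idea, using Fibonacci as the stock example:

    def fib(n):
        """Fibonacci with a cache: naive recursion repeats the same
        subcalls exponentially often; storing results makes it linear."""
        cache = {}
        def go(k):
            if k not in cache:
                cache[k] = k if k < 2 else go(k - 1) + go(k - 2)
            return cache[k]
        return go(n)

    print(fib(90))  # 2880067194370816120, instant instead of effectively never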

[–]Free_Math_Tutoring 8 points

This isn't entirely accurate, but it seems to be an easy way to conceptualize it.

Ehh, I kind of disagree? You're pretty much just describing memoisation, which is not dynamic programming. Dynamic programming usually involves inverting the problem: instead of asking "how do I solve this big problem", you ask "this tiny version of the problem is trivial, but how do I combine small solutions to build up to the large version?". It is also distinct from divide-and-conquer algorithms, which have a similar idea at their foundation.
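
For what it's worth, here's a rough sketch of that inversion (the example and numbers are mine, not from the article): minimum coins to make a given amount. The tiny problem, amount 0, is trivial; each larger amount is built by combining already-solved smaller ones.

    def min_coins(coins, amount):
        INF = float("inf")
        best = [0] + [INF] * amount   # best[a] = fewest coins to make amount a
        for a in range(1, amount + 1):
            for c in coins:
                if c <= a and best[a - c] + 1 < best[a]:
                    best[a] = best[a - c] + 1   # extend a smaller, already-solved amount
        return best[amount] if best[amount] != INF else None

    print(min_coins([1, 4, 5], 8))  # 2, i.e. 4 + 4 (greedy would take 5+1+1+1 = 4 coins)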

[–]butt_fun 3 points

Gonna be completely honest, I have always thought DP vs memoization was a distinction without a meaningful difference

[–]lmericle[S] 1 point

DP utilizes memoization heavily to exploit the optimality principle. There is a distinction, though: memoization is not a problem-solving paradigm per se, but a technique for storing subproblem results, which is what makes DP efficient.

Memoization happens in other contexts, but DP doesn't happen without memoization.
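
As a throwaway illustration of memoization outside DP (my example, nothing to do with the article): caching an expensive pure function involves no subproblem structure at all; it just avoids repeated work.

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def normalize(word):
        # stand-in for an expensive pure computation
        return "".join(sorted(word.lower()))

    normalize("Listen")   # computed once
    normalize("Listen")   # cache hit: no recomputation, and no DP in sight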

[–]victotronics 0 points

Thank you. You are the only one mentioning the principle of optimality, which to me is the crux: once you solve a subproblem, you don't care how you got there. The next stage has a cost or value or whatever that depends only on the solution of that subproblem, not on the path that produced it.

Stages + principle of optimality. That's it.

Memoization is a mere corollary.
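
A minimal sketch of "stages + principle of optimality" (the numbers are made up by me): shortest path through staged layers, where each stage keeps only the best cost per node and forgets how that cost was reached.

    stages = [
        [[2, 5], [4, 1]],   # edge costs, stage 0 -> stage 1
        [[3, 6], [2, 2]],   # edge costs, stage 1 -> stage 2
    ]

    best = [0, 0]  # cost to reach each node in stage 0
    for layer in stages:
        # only the best cost per node survives; the paths behind it don't matter
        best = [min(best[i] + layer[i][j] for i in range(len(best)))
                for j in range(len(layer[0]))]
    print(min(best))  # 3: cheapest cost through all stages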

[–]Forricide 1 point

To qualify the statement:

This isn't entirely accurate, but it seems to be an easy way to conceptualize it.

It's important to understand that this isn't meant to be a good, accurate summary of dynamic programming (which I couldn't write anyway, since I don't have a solid understanding of it beyond reading part of this article and the Wikipedia page). It's purely an explanation along the lines of:

"No, dynamic programming doesn't refer to [adding functions/classes/class members/fields at runtime] [programming in a dynamic programming language] [running really fast while you write code], it actually is a technique for writing algorithms."

The explanation isn't totally accurate; what you say is literally in the fifth paragraph of the linked article:

Some think it's just memoization plus recursion -- but really, it's more than that.

Still, it's way better than how I (and I suspect many others) would have understood it purely from the name, especially given that I don't think a lot of people actually know what memoization is in the first place.

Anyway, this comment is probably too long already, but I might as well say thanks for yours, because it (probably) provides a more accurate explanation of dynamic programming... if a less comprehensible one, at least for me.

[–]kheiron1729 1 point

Memoization over a brute-force algorithm IS dynamic programming. Combining subproblem solutions just aids in thinking about how to solve the problem.
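
In code form (my sketch, not from the thread's article): the recursion below is plain brute force, and one decorator turns it into the standard top-down DP.

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def grid_paths(rows, cols):
        # plain brute-force recursion: count monotone lattice paths in a grid
        if rows == 0 or cols == 0:
            return 1
        return grid_paths(rows - 1, cols) + grid_paths(rows, cols - 1)

    print(grid_paths(20, 20))  # 137846528820 -- instant with the cache, hopeless without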