> In reality, it is nothing more than memoization of function calls into an array.
No, it's the next step after memoization.
Recursion is the slowest approach to implementing many algorithms, because it duplicates a lot of computation: it solves the same subproblems many times over.
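A minimal sketch of that, using the classic Fibonacci example (the function name is just for illustration):

```python
def fib(n):
    # Naive recursion: fib(n - 2) is recomputed inside fib(n - 1),
    # so the same subproblems are solved over and over and the
    # number of calls grows exponentially in n.
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)
```

fib(40) already takes on the order of hundreds of millions of calls.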
Memoization is a cheap fix: it will not solve any subproblem more than once. It stores each subproblem it has already solved, along with its solution. But it carries an ever-growing state, keeping solutions of subproblems in memory that are actually no longer needed.
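A memoized version of the same sketch; each value is computed once, but every value ever computed stays in the cache:

```python
cache = {}

def fib_memo(n):
    # Memoization: each subproblem is solved at most once, but the
    # cache keeps every solution from 0..n alive for the whole run,
    # even though only the last two are ever needed again.
    if n < 2:
        return n
    if n not in cache:
        cache[n] = fib_memo(n - 1) + fib_memo(n - 2)
    return cache[n]
```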
Dynamic programming is a transformation of algorithms that rely on memoization. It takes such an algorithm and makes the state as small as possible, so solutions of subproblems are discarded when they are no longer needed.
Like in memoization, dynamic programming will keep around previous subproblem solutions. But it will only keep around the ones it still needs in the future, and discard what is no longer needed. This often requires a change in the representation of that state and in the order in which the subproblems are solved. But it will run much faster than recursion, on less memory than memoized approaches.
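And a dynamic programming version of the same sketch, where the state shrinks to just the two solutions still needed:

```python
def fib_dp(n):
    # Bottom-up DP: solve subproblems in increasing order and keep
    # only the two most recent solutions; older ones are discarded.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```

Same subproblems, but solved bottom-up instead of top-down, with O(1) state instead of O(n).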
You: Dynamic programming means saving the results of computations
Me: That's just normal programming; there is no reason to use a term that someone made up in the 50s to be intentionally nonsensical.
You: That will blow up your memory and slow down your program!
What are you talking about here? All I said was that what you described is normal programming. I didn't come up with anything different; we're still talking about whatever you wrote. Somehow now storing data, something that happens in every program ever made, 'blows up memory'.
But there is a difference between memoization and dynamic programming. Memoization, or otherwise indiscriminately storing the results of computations, will blow up your memory use on many problems where a dynamic programming approach would not.
Dynamic programming does not mean saving the results of computations; it is about saving the _minimal_ number of results, without having to recompute already solved subproblems later on.
I agree that the name chosen is nonsensical, but the concept is very much a real thing, and quite an important one at that.
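To make the "minimal number of results" point concrete beyond Fibonacci, here's a rough sketch of edit distance (a textbook DP example, not something from this thread). The full memo table would be O(m·n), but each row depends only on the previous one, so two rows of state suffice:

```python
def edit_distance(a, b):
    # Each table row depends only on the row above it, so two rows
    # of state suffice instead of the full len(a) x len(b) table.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

print(edit_distance("kitten", "sitting"))  # 3
```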
Your previous comment:

> dynamic programming will keep around previous subproblem solutions

Your comment here:

> Dynamic programming does not mean saving the results of computations

> indiscriminately storing the results of computations
This is not something that happens. No one is out there storing a bunch of stuff they don't need on purpose, running out of memory, then calling it a day. That's like saying "some people walk straight into walls, but in dynamic walking we go around walls!".
> it is about saving the _minimal_ number of results, without having to recompute already solved subproblems later on.
Again, this is just what programming sometimes is. You save results instead of recomputing them.
> No one is out there storing a bunch of stuff they don't need on purpose, running out of memory, then calling it a day.
There are a lot of people (such as my students) who would take a DP problem, solve it with recursion, throw some caching [0] in, and call it a day. They are often surprised by the 1000x speedups from using a DP algorithm instead, no caching needed.
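The footnote isn't reproduced here, but in Python "throwing some caching in" usually means something like functools.lru_cache. Roughly the contrast I mean (fib_cached and fib_loop are just illustrative names):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_cached(n):
    # Recursion plus a cache: each subproblem is solved once, but the
    # call stack and the cache both grow with n; large enough n will
    # even hit Python's recursion limit.
    return n if n < 2 else fib_cached(n - 1) + fib_cached(n - 2)

def fib_loop(n):
    # The DP version: no recursion, no cache, constant extra memory.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```

fib_cached(5000) raises RecursionError under Python's default limit; fib_loop(5000) just returns.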
Dynamic programming was a nonsense term made up in the 50s to be opaque and vague, so its inventor wouldn't be bothered. It doesn't mean anything, but people try to make up backwards rationalizations for it.
Recursively recomputing all the previous factorials when you already have some of them done is an insane way to work. Giving a special name to something straightforward and sane, like saving some results, is like walking barefoot on broken glass and then calling putting something over your feet "shoe walking". There is no special fancy label for doing the simple, sane version of something.