Saturday 27 October 2012

To understand recursion, you must understand recursion

After what seemed like an endless stream of induction proofs in all their flavours, we have finally moved on to the second major topic of the course: Algorithm Analysis. In particular, we have been working on the analysis of recursive algorithms.

As a precise person, I was never entirely comfortable with the way we analyzed complexity in CSC165. I couldn't always come to terms with us dismissing all of the constant-time operations with a wave of the hand. It seemed like a cheap trick to me, a simplification that ruined the usefulness of our answer.

I only began to understand when I looked at complexity in a new light: I compared it to limits in calculus. When you take a limit going to infinity, only the largest-degree term has an effect on the value of the limit. In a very similar way, as the size of the problem grows arbitrarily large, only the fastest-growing term has a major effect on how efficient our algorithm is. This connection helped me rationalize 'throwing away' the constant terms.
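To convince myself of the analogy, I wrote a quick numerical check (my own toy example, not from lecture): for something like f(n) = n² + 100n + 1000, the ratio f(n)/n² should approach 1 as n grows, meaning the lower-order terms stop mattering.

```python
# Toy check of the limit analogy: the lower-order terms of
# f(n) = n^2 + 100n + 1000 become negligible as n grows,
# so f(n) / n^2 tends to 1.

def f(n):
    return n**2 + 100*n + 1000

for n in [10, 100, 10_000, 1_000_000]:
    print(n, f(n) / n**2)
```

By n = 1,000,000 the ratio is already within about 0.0001 of 1, which is exactly the hand-wave made precise.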

As for recursive algorithms, I find the whole unwinding process to be a simple but powerful tool for getting some intuition about closed forms and time complexity (I feel dumb for never thinking of it myself!). I'm still struggling with it somewhat, though. It seems pretty straightforward for recursions that cut the problem into parts (like mergeSort), but I'm having difficulty applying it to other kinds of recursion, like those where the problem size is reduced by subtraction instead of division. I'm sure more practice will get me on the right track with this.
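To make the two shapes concrete for myself, I tried translating them into toy Python functions that just count the work done (the names and code are my own sketch, not from the course). The first models a mergeSort-style recurrence T(n) = 2T(n/2) + n; the second models a subtract-one recurrence T(n) = T(n-1) + 1. Computing the counts directly lets me check the closed forms that unwinding predicts: roughly n·log₂(n) for the first (exact when n is a power of 2) and exactly n for the second.

```python
def count_halving(n):
    """Work done by a mergeSort-style recurrence: T(n) = 2T(n//2) + n."""
    if n <= 1:
        return 0
    return 2 * count_halving(n // 2) + n

def count_decrement(n):
    """Work done by a subtract-one recurrence: T(n) = T(n-1) + 1."""
    if n == 0:
        return 0
    return count_decrement(n - 1) + 1

# Unwinding predicts T(n) = n * log2(n) for the first (n a power of 2)
# and T(n) = n for the second.
print(count_halving(16))   # 64 = 16 * log2(16)
print(count_decrement(50)) # 50
```

Seeing the subtract-one case come out to exactly n (one unit of work per level, n levels) made it feel a lot less mysterious than it did on paper.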

Every day, it seems, a thorough understanding of recursion is becoming more and more important for success in CS. The folks at Google seem to agree!
