Friday, February 24, 2012

OMG! WNBMM (Part 3)

One of the biggest reasons we make algorithms is so that we can do things faster. One of the biggest reasons we study algorithms is so that we can make better algorithms to do things even faster than that. In particular, we like algorithms to do better on large scales, so we worry about big-O/big-theta complexity and so on. For recursive algorithms, that often means solving recurrences.

Consider these two recurrences:
A(n) = A(n/3) + A(2n/3) + n
B(n) = B(n/5) + B(3n/5) + n

They look pretty similar, right? But if you look at the solved complexities, A(n) is in Θ(n log n), while B(n) is in Θ(n)!
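You can see the gap numerically. Here's a quick sketch (not from the post; the base case of 1 unit of work at n ≤ 1 and the integer truncation are my assumptions) that evaluates both recurrences and compares them to n log n and n:

```python
# Numerically evaluate the two recurrences and compare growth rates.
# Base case (an assumption): constant work, 1, for n <= 1.
from functools import lru_cache
import math

@lru_cache(maxsize=None)
def A(n):
    if n <= 1:
        return 1
    # A(n) = A(n/3) + A(2n/3) + n, with integer truncation for this sketch
    return A(n // 3) + A(2 * n // 3) + n

@lru_cache(maxsize=None)
def B(n):
    if n <= 1:
        return 1
    # B(n) = B(n/5) + B(3n/5) + n
    return B(n // 5) + B(3 * n // 5) + n

for n in (10**3, 10**4, 10**5, 10**6):
    # A(n)/(n log n) hovers near a constant; B(n)/n stays bounded (below 5).
    print(n, A(n) / (n * math.log(n)), B(n) / n)
```

Running this, the A(n)/(n log n) column settles near a constant while the B(n)/n column stays bounded, matching the Θ(n log n) vs. Θ(n) claim.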

Why does this happen? It turns out that if you look at the tree of operations (where A(n) breaks into A(n/3) and A(2n/3) after doing n work at that level, and so on), the tree looks a lot like Pascal's Triangle, which in turn is related to binomial expansions.

In fact, as you work out the sum for the recurrence, it miraculously turns into a sum of binomial expansions, more or less. But while (1/3 + 2/3)^k will be 1 no matter how big k gets, (1/5 + 3/5)^k will drop to 0 (the limit for k being the height of the tree, which is in log n, as usual).

A(n)'s geometric sum is just adding 1 log n times, so of course it keeps the log n. In contrast, since B(n)'s geometric sum converges even as it goes to infinity, the log n disappears! :) SO COOL!!!
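In symbols, a sketch of those level-by-level sums (treating every branch as reaching the full height h = Θ(log n), which makes the B bound an upper bound):

```latex
% A: the fractions sum to 1, so every level of the tree contributes n work.
\sum_{k=0}^{h} n\left(\tfrac{1}{3} + \tfrac{2}{3}\right)^{k}
  \;=\; \sum_{k=0}^{h} n \;=\; \Theta(n \log n)

% B: the fractions sum to 4/5 < 1, so the levels form a convergent
% geometric series, and the log n factor vanishes.
\sum_{k=0}^{h} n\left(\tfrac{1}{5} + \tfrac{3}{5}\right)^{k}
  \;\le\; n \sum_{k=0}^{\infty} \left(\tfrac{4}{5}\right)^{k}
  \;=\; 5n \;=\; \Theta(n)
```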
