Analysis of Algorithms refers to the practice of determining the asymptotic runtime complexity of algorithms, i.e. how the algorithm behaves for large input sizes. We use Big-Oh notation to represent the runtime as N -> infinity.

## Analysis of Iterative Algorithms

Finding the runtime complexity of an iterative algorithm is straightforward. First, classify every step of the algorithm into one of three classes:

- a simple statement which takes constant time to compute
- a loop which iterates from 0 to N
- a call to another function
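The three classes can be illustrated with a short Python sketch (the function names and values here are our own, purely for illustration):

```python
import math

def simple_statement(a, b, x, y):
    # A simple statement: one constant-time computation, T(N) = c = O(1)
    return a + b * math.sqrt(x) ** y

def simple_loop(N):
    # A loop running N times over constant-time work: T(N) = N*c = O(N)
    total = 0
    for i in range(N):
        total += 1          # constant-time body
    return total

def with_function_call(N):
    # A call to another function: its cost is whatever that function costs
    return simple_loop(N)   # contributes O(N) here
```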
Simple statements like `a++` or even `a = b*sqrt(x)^y` take constant time to compute, so T(N) = c = O(1), where c is some constant (we don't care exactly how long it takes to compute--just that it's constant). For-loops which iterate N times (or approximately N times) take linear time, so T(N) = N = O(N). For-loops with something inside take N times the runtime of the inner portion: T_{outer}(N) = N*T_{inner}(N).

To find the total runtime complexity of the algorithm, multiply when something is inside a for-loop. Otherwise, add it.

For example, consider the following simple algorithm:

`1: for i = 1 to N do`
`2:   for j = 1 to N do`
`3:     a++`
`4:     foo(a, i, j, N)`
`5: for k = 1 to N do`
`6:   b++`

Let's say that foo's runtime is lgN. Applying the rules above, we see that:

- the runtime for line 3 is some constant c_{1}
- the runtime for line 4 is lgN
- the runtime for line 6 is some constant c_{2}
- the runtime for lines 3 and 4 together is c_{1} + lgN
- the runtime for lines 2, 3, and 4 is N*(c_{1} + lgN)
- the runtime for lines 1 thru 4 is N*N*(c_{1} + lgN)
- the runtime for lines 5 and 6 is N*c_{2}
- the runtime for lines 1 thru 6 is N*N*(c_{1} + lgN) + N*c_{2}
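As a sanity check, the six-line example can be simulated in Python with an operation counter; the cost of one call to `foo` is assumed here to be about lg N steps, matching its stated runtime:

```python
import math

def count_ops(N):
    """Count basic operations for the six-line example, assuming foo costs ~lg N."""
    lg = max(1, int(math.log2(N)))   # assumed cost of one call to foo
    ops = 0
    for i in range(1, N + 1):        # line 1
        for j in range(1, N + 1):    # line 2
            ops += 1                 # line 3: a++
            ops += lg                # line 4: foo(a, i, j, N)
    for k in range(1, N + 1):        # line 5
        ops += 1                     # line 6: b++
    return ops

# count_ops(N) equals N*N*(1 + lg) + N, i.e. the derived total
# N*N*(c_1 + lgN) + N*c_2 with c_1 = c_2 = 1.
```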
Simplifying, we see that the overall runtime is T(N) = c_{1}N^{2} + N^{2}lgN + Nc_{2}. Next, we wish to find the asymptotic behavior of this function (the Big-Oh notation). To do this, notice that c_{1}N^{2} and Nc_{2} are both dominated by N^{2}lgN. In other words, as N -> infinity, the terms c_{1}N^{2} + Nc_{2} become insignificant compared to N^{2}lgN (which grows at a much faster rate). Therefore, the asymptotic runtime complexity of this algorithm is O(N^{2}lgN).

## Analysis of Recursive Algorithms

The strategy above cannot be applied to recursive functions, since there are usually no for-loops and the function calls itself, not another known function. Instead, we make use of recurrence relations like this one:

T(N) = c_{1} + N + T(N-1)
T(0) = c_{2}

The above runtime is recurrent because it depends on itself. In order to find the asymptotic behavior (Big-Oh notation) of a recurrent function like this, we need to find a closed form of the function. This is done as follows:

- Start with the *base case* T(0)
- Write the recurrence relation for T(1) by substituting T(0)
- Repeat: write T(k) based on T(k-1)
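The recurrence T(N) = c_{1} + N + T(N-1) above can be unrolled numerically in Python and checked against a candidate closed form (the closed form and the constants c1 = 2, c2 = 5 are our own illustrative choices, not from the original text):

```python
def T_unrolled(n, c1=2, c2=5):
    # Unroll T(N) = c1 + N + T(N-1), T(0) = c2, bottom-up.
    total = c2                 # base case: T(0) = c2
    for k in range(1, n + 1):
        total += c1 + k        # level k contributes c1 + k
    return total

def T_closed(n, c1=2, c2=5):
    # Candidate closed form: T(N) = c2 + c1*N + N*(N+1)/2
    return c2 + c1 * n + n * (n + 1) // 2

# The two agree for every n, and the N*(N+1)/2 term dominates,
# so this particular recurrence works out to O(N^2).
```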
Eventually, you'll see a pattern as you iterate from 1 to N. If you can write an expression which matches the pattern, you will have found the closed form you need.

For example, consider the simple factorial function:

`factorial(n):`
`  if n <= 1`
`    return n`
`  else`
`    return n * factorial(n-1)`

Let's assume the if-statement takes time a and the multiplication takes time b. Then, the runtime of this function can be written as follows:

T(N) = a + b + T(N-1)
T(0) = a
T(1) = a

Notice that we actually have two base cases here. That's okay. You can just pick one or use both. Let's start "unrolling" the recurrence relation:

T(0) = a
T(1) = a
T(2) = a + b + T(1) = a + b + a = 2a + b
T(3) = a + b + T(2) = a + b + 2a + b = 3a + 2b
T(4) = a + b + T(3) = a + b + 3a + 2b = 4a + 3b

Now you should see a pattern. For T(k), the runtime is T(k) = ka + (k-1)b = ka + kb - b. Now that you have the closed form, you can easily find the asymptotic complexity (Big-Oh notation). Just notice that the terms ka and kb are linear, whereas -b is constant. Thus, you have T(N) = O(N) + O(N) - O(1) = O(N).
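As a check on the closed form, the factorial function can be instrumented in Python to count the two kinds of operations; this instrumented version is our own sketch and mirrors the pseudocode above exactly, including its base case:

```python
def factorial_counted(n, counts=None):
    """Factorial instrumented to count if-tests (cost a) and multiplications (cost b)."""
    if counts is None:
        counts = {"a": 0, "b": 0}
    counts["a"] += 1                  # the if-test: one a-cost operation
    if n <= 1:
        return n, counts
    sub, counts = factorial_counted(n - 1, counts)
    counts["b"] += 1                  # the multiplication: one b-cost operation
    return n * sub, counts

# factorial_counted(6) performs 6 if-tests and 5 multiplications,
# matching T(k) = ka + (k-1)b with k = 6.
```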