- Guess the form of the solution.
- Verify your guess using induction.
- Solve for the constants.

```
T(n) = 4*T(n/2) + n, T(1) = Theta(1)
```

Theta denotes a tight asymptotic bound.

- Guess `O(n^2)`.
- Prove `O` (upper bound) and `Omega` (lower bound) separately.
- Assume that `T(k) <= c*k^2` for `k < n`. This is our inductive hypothesis.
- Prove `T(n) <= c*n^2` by induction.

In this case, we will try substituting `c*n^2` for `T(n)`.

```
T(n) = 4*T(n/2) + n
T(n) <= 4*c*(n/2)^2 + n
T(n) <= c*n^2 + n
```

Now we want to put this in the form `(desired - residual)`, then reduce to the desired form if possible.

```
T(n) <= c*n^2 - (-n)
```

The residual is negative, so `T(n) <= c*n^2` holds for no choice of `c`.

Since the inductive step fails, we have to introduce a low-order term. Let's call it `-d*k`. Change our inductive hypothesis appropriately.

```
Inductive hypothesis: T(k) <= c*k^2 - d*k, for k < n

T(n) = 4*T(n/2) + n
T(n) <= 4*(c*(n/2)^2 - d*(n/2)) + n
T(n) <= c*n^2 - 2*d*n + n
T(n) <= c*n^2 - d*n - (d*n - n)
T(n) <= c*n^2 - d*n, if d >= 1
```

Then, pick a c that is big enough to handle the initial conditions.
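The strengthened bound can be sanity-checked numerically. This is a sketch that assumes the base case `T(1) = 1` (the notes only require it to be constant):

```python
# Check T(n) <= c*n^2 - d*n for T(n) = 4*T(n/2) + n on powers of 2.
# The base case T(1) = 1 is an assumption made for the check.
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    return 1 if n == 1 else 4 * T(n // 2) + n

c, d = 2, 1  # c covers the initial condition: T(1) = 1 <= c*1^2 - d*1
for n in (2 ** i for i in range(12)):
    assert T(n) <= c * n * n - d * n
```

With these constants the bound is in fact tight at every power of 2, which is consistent with the exact solution growing as `Theta(n^2)`.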

A recursion tree is good for generating guesses for the substitution method.

- Build a recursion tree.
- Determine the height of the tree as a function of the input size.
- Determine the cost of each tree level.
- Sum the costs of the levels; when every level costs the same, this is the per-level cost multiplied by the height.
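For the running example `T(n) = 4*T(n/2) + n`, the steps above can be sketched in a few lines (assuming `n` is a power of 2):

```python
# Tabulate the per-level cost of the recursion tree for T(n) = 4*T(n/2) + n.
# Level i has 4^i nodes, each working on a subproblem of size n/2^i.
def level_costs(n):
    costs = []
    nodes, size = 1, n
    while size >= 1:
        costs.append(nodes * size)  # total cost of this level
        nodes *= 4                  # a = 4 children per node
        size //= 2                  # b = 2: subproblem size halves
    return costs

print(level_costs(16))       # [16, 32, 64, 128, 256]: doubles each level
print(sum(level_costs(16)))  # 496 < 2*16^2, suggesting the guess O(n^2)
```

The level costs form a geometric series dominated by the leaves, which is exactly what motivates the `O(n^2)` guess fed into the substitution method.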

The Master Theorem applies to recurrences of the following form, where `a >= 1`, `b > 1`, and `f` is asymptotically positive.

`T(n) = a*T(n/b) + f(n)`

We use the Apprentice Theorem for recurrences of the following form, where `a >= 1`, `b > 1`, and `c > 0`.

`T(n) = a*T(n/b) + c*n^d`

The Apprentice Theorem is the only subset of the Master Theorem that we are shown how to use. With the initial condition made explicit, it applies to recurrences of the following form.

`T(n) = a*T(n/b) + c*n^d, T(1) = c`

After much thrashing (unrolling the recursion and summing the resulting geometric series), the following is also true, where `k = log(b, n)` is the recursion depth and the formula as written requires `u != 1`.

`T(n) = c*n^d * (u^(k+1) - 1) / (u-1); u = a / b^d`
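The closed form can be checked against the recurrence directly. This sketch uses sample constants `a=4, b=2, c=3, d=1` (so `u = 2`) and assumes `n` is a power of `b`:

```python
# Compare the recurrence T(n) = a*T(n/b) + c*n^d, T(1) = c, against the
# closed form c*n^d * (u^(k+1) - 1)/(u - 1) with u = a/b^d, k = log(b, n).
import math

a, b, c, d = 4, 2, 3, 1  # sample constants, chosen so u = 2
u = a / b ** d

def T(n):
    return c if n == 1 else a * T(n // b) + c * n ** d

def closed_form(n):
    k = round(math.log(n, b))  # recursion depth when n is a power of b
    return c * n ** d * (u ** (k + 1) - 1) / (u - 1)

for n in (1, 2, 4, 8, 16):
    assert T(n) == closed_form(n)
```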

Three cases arise from this.

- `a = b^d` (`u = 1`): the series has `k+1` equal terms, giving a factor of `log(b, n)`. `T(n) = O(n^d * log(n))`
- `a > b^d` (`u > 1`): a diverging series, dominated by its last term. `T(n) = O(n^(log(b, a)))`
- `a < b^d` (`u < 1`): a convergent series, dominated by its first term. `T(n) = O(n^d)`
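The three-way split depends only on `u = a / b^d`, so it is easy to mechanize. A sketch (the function name and string output are illustrative, not from the notes):

```python
# Classify T(n) = a*T(n/b) + c*n^d into the three cases via u = a/b^d
# and return the resulting asymptotic bound as a string.
def apprentice(a, b, d):
    u = a / b ** d
    if u == 1:
        return "O(n^%g * log(n))" % d
    if u > 1:
        return "O(n^log_%g(%g))" % (b, a)  # exponent is log base b of a
    return "O(n^%g)" % d

print(apprentice(2, 2, 1))      # case 1: O(n^1 * log(n))
print(apprentice(1, 2, 0))      # case 1: O(n^0 * log(n)) = O(log(n))
print(apprentice(1, 4 / 3, 1))  # case 3: O(n^1)
```

The three calls correspond to the quicksort, binary search, and median-finding examples below.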

**Quicksort** - `T(n) = 2*T(n/2) + c*n`. Case 1 (`u = 1`). Solution: `T(n) = O(n*log(n))`.

**Binary search** - `T(n) = T(n/2) + c`. Case 1 (`u = 1`). Solution: `T(n) = O(log(n))`.

**Median finding**^{1} - `T(n) = T(3*n/4) + c*n`. Case 3 (`u < 1`). Solution: `T(n) = O(n)`.

1. Assumes lucky partition.

We consider 3 algorithms.

- Transitive Closure
- All-Pairs Shortest Path
- Warshall's Algorithm

These algorithms are essentially the same. They are based on organizing all paths of a directed graph in a certain manner.

Given a finite set S and a relation R on S, add the pairs that are necessary in order to make R transitive. This means that if `iRk` and `kRj`, require that `iRj`.

Here is some pseudocode that accomplishes this.

```
function path-exists (M)
    foreach i in all rows in M
        foreach j in all cols in M
            if M[i,j] is defined
                new-m[i,j] = true
            else
                new-m[i,j] = false
    return new-m

function transitive-closure (weight-matrix)
    M = path-exists (weight-matrix)
    n = row-length (M)
    for k=0 upto n
        for i=0 upto n
            for j=0 upto n
                M[i,j] = or (M[i,j], and (M[i,k], M[k,j]))
    return M
```

The complexity of this operation is `O(n^3)`.
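A direct Python rendering of the pseudocode. This is a sketch; representing an undefined entry with `None` is an assumption:

```python
# Transitive closure via the triple loop above. weight is an n x n matrix;
# None marks "no edge" (standing in for "M[i,j] is defined" being false).
def transitive_closure(weight):
    n = len(weight)
    # path-exists step: True exactly where an edge is defined
    M = [[weight[i][j] is not None for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                M[i][j] = M[i][j] or (M[i][k] and M[k][j])
    return M

# 0 -> 1 and 1 -> 2 imply 0 -> 2 in the closure
edges = [[None, 1, None],
         [None, None, 1],
         [None, None, None]]
print(transitive_closure(edges)[0][2])  # True
```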

Given a graph G, find all paths in G.

This is accomplished by organizing the paths by the highest-numbered vertex that appears on the path as an intermediate vertex.

- Assume the vertices are numbered `1..n`.
- `P(0,i,j)` is defined as all paths from vertex i to vertex j that have no intermediate vertices.
- `P(k,i,j)` is defined as all paths of arbitrary length from vertex i to vertex j in which the intermediate vertices are drawn from the set `{1..k}`.

Here is some pseudocode that accomplishes this.

```
function weight (M)
    foreach i in all rows in M
        foreach j in all cols in M
            if M[i,j] is defined
                new-m[i,j] = M[i,j]
            else
                new-m[i,j] = INF
    return new-m

function all-pairs (weight-matrix)
    M = weight (weight-matrix)
    n = row-length (M)
    for k=0 upto n
        for i=0 upto n
            for j=0 upto n
                M[i,j] = min (M[i,j], M[i,k] + M[k,j])
    return M
```

The complexity of this operation is `O(n^3)`.
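The same loop in Python (the Floyd-Warshall recurrence). Again a sketch, with `None` marking a missing edge and becoming `INF` internally:

```python
# All-pairs shortest paths via the triple loop above.
INF = float("inf")

def all_pairs(weight):
    n = len(weight)
    # weight step: copy defined entries, INF elsewhere
    M = [[weight[i][j] if weight[i][j] is not None else INF
          for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                M[i][j] = min(M[i][j], M[i][k] + M[k][j])
    return M

# shortest 0 -> 2 path goes through vertex 1: 3 + 2 = 5
w = [[0, 3, None],
     [None, 0, 2],
     [None, None, 0]]
print(all_pairs(w)[0][2])  # 5
```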

- MIT OpenCourseWare lecture notes (copyright Charles E. Leiserson)
- Prof. Christoph Hoffmann's lecture notes