Algorithms - Preparation for 1st Exam

Solving recurrences
Method 1: Substitution
Method 2: Recursion tree
Method 3: Master Theorem
Path computations
Transitive closure
How to find all paths of a graph
Sources

Solving recurrences

Method 1: Substitution

Example

T(n) = 4*T(n/2) + n, T(1) = Theta(1)

Theta denotes a tight asymptotic bound; the base case takes constant time.

  1. Guess T(n) = O(n^2).
  2. Prove the O (upper) bound and the Omega (lower) bound separately.
  3. Assume T(k) <= c*k^2 for all k < n. This is our inductive hypothesis.
  4. Prove T(n) <= c*n^2 by induction.

In this case, we apply the hypothesis by substituting c*(n/2)^2 for T(n/2) in the recurrence.

T(n) = 4*T(n/2) + n
T(n) <= 4*c*(n/2)^2 + n
T(n) <= c*n^2 + n

Now we want to put this in the form (desired - residual). Then, reduce to the desired form if possible.

T(n) <= c*n^2 - (-n)

The residual here is -n, which is negative, so T(n) <= c*n^2 holds for no choice of c and the inductive step does not go through.

Since the inductive step fails, we strengthen the hypothesis by subtracting a low-order term, -d*n. Change our inductive hypothesis appropriately.

Inductive hypothesis: T(k) <= c*k^2 - d*k, for k < n
T(n) = 4*T(n/2) + n
T(n) <= 4*(c*(n/2)^2 - d*(n/2)) + n
T(n) <= c*n^2 - 2*d*n + n
T(n) <= c*n^2 - d*n - (d*n - n)
T(n) <= c*n^2 - d*n, if d >= 1

Then, pick a c that is big enough to handle the initial conditions.
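
As a quick numerical sanity check (not part of the proof; it assumes the base case T(1) = 1 and picks c = 2, d = 1), a short Python script can unroll the recurrence and compare it against c*n^2 - d*n:

from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 4*T(n/2) + n, with an assumed base case T(1) = 1
    if n <= 1:
        return 1
    return 4 * T(n // 2) + n

c, d = 2, 1
for n in [1, 2, 4, 8, 16, 1024]:
    bound = c * n * n - d * n
    print(n, T(n), bound, T(n) <= bound)   # the bound holds at every power of two tried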

Method 2: Recursion tree

A recursion tree is good for generating guesses for the substitution method.
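
For example (a sketch, reusing the recurrence T(n) = 4*T(n/2) + n from above): level i of the tree has 4^i subproblems, each costing n/2^i, so level i contributes 2^i * n in total. Summing roughly log2(n) doubling levels gives about 2*n^2, which is where the Theta(n^2) guess comes from. A small Python check of the level sums:

# Level sums of the recursion tree for T(n) = 4*T(n/2) + n (n a power of two).
n = 1024
level_costs = []
size, nodes = n, 1
while size >= 1:
    level_costs.append(nodes * size)   # 4^i nodes, each doing n/2^i work
    nodes *= 4
    size //= 2
print(level_costs)       # n, 2n, 4n, ...: the cost doubles at every level
print(sum(level_costs))  # about 2*n^2, consistent with the Theta(n^2) guess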

Method 3: Master Theorem

The Master Theorem applies to recurrences of the following form, where a >= 1, b > 1, and f(n) is asymptotically positive.

T(n) = a*T(n/b) + f(n)

We use the Apprentice Theorem for recurrences of the following form, where a >= 1, b > 1, and c > 0.

T(n) = a*T(n/b) + c*n^d

The Apprentice Theorem is a special case of the Master Theorem, and it is the only case we are shown how to use.

Methodology

This applies to recurrences of the following form.

T(n) = a*T(n/b) + c*n^d, T(1) = c

After much thrashing (repeatedly unrolling the recurrence down to the base case), the following closed form also holds, where k = log_b(n) is the depth of the recursion:

T(n) = c*n^d * (u^(k+1) - 1) / (u - 1), where u = a / b^d

(The geometric-sum formula assumes u != 1; when u = 1 the sum is simply k+1 equal terms, giving c*n^d*(k+1).)

Three cases arise, depending on how the geometric sum behaves:

  Case 1: u = 1 (a = b^d). Every level contributes equally: T(n) = O(n^d * log(n)).
  Case 2: u > 1 (a > b^d). The leaves dominate: T(n) = O(n^(log_b(a))).
  Case 3: u < 1 (a < b^d). The root dominates: T(n) = O(n^d).
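
The following Python sketch (the helper name apprentice_case is made up for illustration) computes u for a given recurrence and reports which case applies:

import math

def apprentice_case(a, b, d):
    # Classify T(n) = a*T(n/b) + c*n^d by comparing u = a / b^d with 1.
    u = a / b ** d
    if math.isclose(u, 1.0):
        return u, "Case 1: O(n^%g * log n)" % d
    if u > 1:
        return u, "Case 2: O(n^%g)" % math.log(a, b)
    return u, "Case 3: O(n^%g)" % d

print(apprentice_case(2, 2, 1))    # u = 1   -> Case 1, O(n log n)
print(apprentice_case(8, 2, 1))    # u = 4   -> Case 2, O(n^3)
print(apprentice_case(1, 4/3, 1))  # u = 3/4 -> Case 3, O(n)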

Examples

Quicksort

T(n) = 2*T(n/2) + c*n

Case 1: u=1.

Solution: T(n) = O(n*log(n))

Binary search

T(n) = T(n/2) + c

Case 1: u=1.

Solution: T(n) = O(log(n))

Median finding [1]

T(n) = T(3*n/4) + c*n

Case 3: u<1.

Solution: T(n) = O(n)


[1] Assumes a lucky partition.
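
As a numerical sanity check (assuming c = 1 and T(1) = 1 in each recurrence), the following Python snippet unrolls the three recurrences and shows that each grows like its claimed solution; the printed ratios stay roughly constant as n grows:

import math

def quick(n):    # T(n) = 2*T(n/2) + n    -> claimed O(n*log(n))
    return 1 if n <= 1 else 2 * quick(n // 2) + n

def bsearch(n):  # T(n) = T(n/2) + 1      -> claimed O(log(n))
    return 1 if n <= 1 else bsearch(n // 2) + 1

def median(n):   # T(n) = T(3*n/4) + n    -> claimed O(n)
    return 1 if n <= 1 else median(3 * n // 4) + n

for n in (2 ** 10, 2 ** 16):
    print(n,
          round(quick(n) / (n * math.log2(n)), 2),
          round(bsearch(n) / math.log2(n), 2),
          round(median(n) / n, 2))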

Path computations

We consider 3 algorithms.

These algorithms are essentially the same: each organizes all paths of a directed graph by the highest-numbered vertex that appears as an intermediate step.

Transitive closure

Given a finite set S and a relation R on S, add the pairs that are necessary in order to make R transitive. This means that if iRk and kRj, require that iRj.

Here is some pseudocode that accomplishes this.

function path-exists (M)
  foreach i in all rows in M
    foreach j in all cols in M
      if M[i,j] is defined
         new-m[i,j] = true
      else
         new-m[i,j] = false
  return new-m

function transitive-closure (weight-matrix)
  M = path-exists (weight-matrix)
  n = row-length (M)
  for k=0 upto n
    for i=0 upto n
      for j=0 upto n
        M[i,j] = or (M[i,j],
                     and (M[i,k],
                          M[k,j]))
  return M

The complexity of this operation is O(n^3).
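
A minimal runnable sketch of the same procedure in Python (assuming the weight matrix is a list of lists where None marks a missing edge):

def path_exists(weight_matrix):
    # Boolean adjacency matrix: True wherever an edge (a defined entry) exists.
    return [[cell is not None for cell in row] for row in weight_matrix]

def transitive_closure(weight_matrix):
    # Warshall's algorithm: M[i][j] ends up True iff some path leads from i to j.
    M = path_exists(weight_matrix)
    n = len(M)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                M[i][j] = M[i][j] or (M[i][k] and M[k][j])
    return M

# Edges 0->1 and 1->2 imply the pair (0,2) must be added.
W = [[None, 1, None],
     [None, None, 1],
     [None, None, None]]
print(transitive_closure(W))   # row 0: [False, True, True]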

How to find all paths of a graph

Given a weighted, directed graph G, account for all paths in G.

This is accomplished by organizing the paths by the highest-numbered vertex that appears on the path as an intermediate step. The pseudocode below applies this organization to compute the length of a shortest path between every pair of vertices (the Floyd-Warshall all-pairs shortest-paths algorithm).

Here is some pseudocode that accomplishes this.

function weight (M)
  foreach i in all rows in M
    foreach j in all cols in M
      if M[i,j] is defined
         new-m[i,j] = M[i,j]
      else
         new-m[i,j] = INF
  return new-m

function all-pairs (weight-matrix)
  M = weight (weight-matrix)
  n = row-length (M)
  for k=0 upto n
    for i=0 upto n
      for j=0 upto n
        M[i,j] = min (M[i,j],
                      M[i,k] + M[k,j])
  return M

The complexity of this operation is O(n^3).
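
A minimal runnable sketch of the same computation in Python (again using None for a missing edge and math.inf for INF):

import math

def weight(weight_matrix):
    # Replace undefined entries with infinity; keep the given edge weights.
    return [[math.inf if cell is None else cell for cell in row]
            for row in weight_matrix]

def all_pairs(weight_matrix):
    # Floyd-Warshall: M[i][j] ends up as the shortest-path length from i to j.
    M = weight(weight_matrix)
    n = len(M)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                M[i][j] = min(M[i][j], M[i][k] + M[k][j])
    return M

# The direct edge 0->2 (weight 10) loses to the two-hop path 0->1->2 (weight 3).
W = [[0, 1, 10],
     [None, 0, 2],
     [None, None, 0]]
print(all_pairs(W))   # [[0, 1, 3], [inf, 0, 2], [inf, inf, 0]]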

Sources