
We have looked at recursive algorithms where the smaller problem was just
one unit smaller. For some algorithms the smaller problems
are a fraction of the original problem size.

Merge Sort: T(n) = 2T(⌈n/2⌉) + Θ(n)

Binary search: T(n) = T(⌈n/2⌉) + Θ(1)

…
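For instance, binary search does a constant amount of comparison work and then recurses on about half the list. A standard sketch (my own code, not from the notes):

```python
# A standard binary search sketch: each call does Theta(1) work outside
# the recursion, then recurses on about half the input, matching
# T(n) = T(ceil(n/2)) + Theta(1).
def binary_search(a, target, lo=0, hi=None):
    """Return an index of target in sorted list a, or -1 if absent."""
    if hi is None:
        hi = len(a)
    if lo >= hi:                 # empty range: the Theta(1) base case
        return -1
    mid = (lo + hi) // 2         # Theta(1) work outside the recursion
    if a[mid] == target:
        return mid
    elif a[mid] < target:
        return binary_search(a, target, mid + 1, hi)  # one subproblem, half size
    else:
        return binary_search(a, target, lo, mid)
```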

All have a form related to

T(n) = a T(⌈n/b⌉) + n^{d}

We can generalize from cases where n = b^{k},
avoiding floor and ceiling functions. (The book develops this
more in the appendix. Their approach with eventually
nondecreasing functions makes the analysis easier, but could be avoided
with a fancier argument. For the algorithms we consider, it makes
heuristic sense that a bigger problem is harder to solve, but it is a
bit unsatisfying to assume that before we work out the function!
Still, for simplicity we will stick with this approach, which does
include the main idea.)

Note: You will not need to reproduce the derivation of the Master Theorem, below. You are responsible for being able to do the kind of algebra and logic in individual steps, as they come up in other (much smaller!) problems. Where I say X simplifies to Y, it may take one step, or it may take several basic algebraic steps. Have a pencil handy. We will use exponents and logs a lot in this course. These are high school topics that many students have not kept up with. Following these steps will be slow, but you will review essentially every kind of thing we do with exponents and logs, and also with geometric series. Keep the summary of log and exponent rules handy.
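For reference, these are the standard rules used repeatedly below (nothing new, just collected in one place):

```latex
% Exponent rules
b^{x+y} = b^{x} b^{y} \qquad b^{x-y} = b^{x}/b^{y} \qquad (b^{x})^{y} = b^{xy}
% Log rules (lg means log base 2)
\log_{b} n = \frac{\lg n}{\lg b} \qquad b^{\log_{b} a} = a \qquad \log_{b}(b^{k}) = k
% Geometric series with ratio r \neq 1
\sum_{i=0}^{k} r^{i} = \frac{1 - r^{k+1}}{1 - r}
```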

T(b^{k}) = a T(b^{k-1}) + b^{kd}

= a[a T(b^{k-2}) + b^{(k-1)d}] + b^{kd}

… much like the examples in the book. Another way is to think in terms of the number of problems at each level of recursion, and the sizes of all their inputs:

The work for a problem is the work outside the recursion plus the work from all the recursive calls.

Top level:

1 problem; input size b^{k}; work outside recursion for each: (b^{k})^{d} = b^{kd}; for the whole level, outside recursion: (1)b^{kd}

1 down:

a problems; input size b^{k-1}; work outside recursion for each: b^{(k-1)d}; for the whole level, outside recursion: (a)b^{(k-1)d}

2 down:

a^{2} problems; input size b^{k-2}; work outside recursion for each: b^{(k-2)d}; for the whole level, outside recursion: (a^{2})b^{(k-2)d}

…

k-1 down:

a^{k-1} problems; input size b^{1}; work outside recursion for each: b^{(1)d}; for the whole level, outside recursion: (a^{k-1})b^{(1)d}

k down:

a^{k} problems; input size b^{0}; total work for each (no recursion): b^{(0)d}; for the whole level, total: (a^{k})b^{(0)d}

In total from all levels of recursion:

(1)b^{kd} + (a)b^{(k-1)d} + (a^{2})b^{(k-2)d} + … + (a^{k-1})b^{(1)d} + (a^{k})b^{(0)d}
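As a sanity check, the level-by-level total should match the value from the original recurrence. A small Python sketch (my own, assuming the base case T(1) = 1):

```python
# Numeric sanity check: the level-by-level sum should equal the value
# from the recurrence T(b^k) = a*T(b^(k-1)) + (b^k)^d, base case T(1) = 1.
def T_recursive(a, b, d, k):
    if k == 0:
        return 1                      # T(b^0) = b^(0*d) = 1, the bottom level
    return a * T_recursive(a, b, d, k - 1) + (b ** k) ** d

def T_level_sum(a, b, d, k):
    # level i has a^i subproblems, each doing b^((k-i)*d) work outside recursion
    return sum(a ** i * b ** ((k - i) * d) for i in range(k + 1))

# e.g. merge-sort-like parameters a = 2, b = 2, d = 1:
assert T_recursive(2, 2, 1, 10) == T_level_sum(2, 2, 1, 10)
```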

The general term for the i^{th} nested recursion is

(a^{i})b^{(k-i)d}

= (a^{i})b^{kd-id}, doing exponent arithmetic

= (a^{i})b^{kd}/b^{id}, using the exponent subtraction rule

= (a/b^{d})^{i} b^{kd}, if we collect the i^{th} powers

so this is a term in a geometric series with ratio (a/b^{d}).

There are three cases, depending on how the ratio compares to 1: it is equal to, less than, or greater than 1.

Equal to 1: the case a/b^{d} = 1, or equivalently a = b^{d}, or equivalently d = log_{b} a.

Then there are k+1 equal terms in the sum and

T(b^{k}) = b^{kd} (k+1)
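A quick numeric check of this case (my own sketch; the particular a, b, d, k are arbitrary picks with a = b^{d}):

```python
# Checking the equal-ratio case numerically: when a = b^d, each of the
# k+1 levels contributes b^(k*d), so the total is b^(k*d) * (k+1).
def T_level_sum(a, b, d, k):
    return sum(a ** i * b ** ((k - i) * d) for i in range(k + 1))

a, b, d, k = 4, 2, 2, 6              # a = b^d since 4 = 2^2
assert T_level_sum(a, b, d, k) == b ** (k * d) * (k + 1)
```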

If (a/b^{d}) is not 1:

Sum the geometric series:

T(b^{k}) = b^{kd} (1 – (a/b^{d})^{k+1}) / (1 – a/b^{d})

This version will be useful if a/b^{d} < 1.

We can get a version more useful when a/b^{d} > 1 by multiplying the b^{kd} through, and multiplying numerator and denominator by –1:

T(b^{k}) = ((a/b^{d})a^{k} – b^{kd}) / (a/b^{d} – 1)
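Both forms of the closed expression can be checked against the level sum directly. A sketch in Python (my own, using exact rational arithmetic so rounding cannot cloud the comparison):

```python
from fractions import Fraction

# Check the closed form T(b^k) = ((a/b^d)*a^k - b^(k*d)) / (a/b^d - 1)
# against the level-by-level sum, with exact rationals.
def closed_form(a, b, d, k):
    r = Fraction(a, b ** d)          # the geometric ratio a / b^d, assumed != 1
    return (r * a ** k - b ** (k * d)) / (r - 1)

def T_level_sum(a, b, d, k):
    return sum(a ** i * b ** ((k - i) * d) for i in range(k + 1))

assert closed_form(8, 2, 2, 5) == T_level_sum(8, 2, 2, 5)   # a > b^d
assert closed_form(2, 2, 2, 5) == T_level_sum(2, 2, 2, 5)   # a < b^d
```

The same formula covers both unequal cases, since multiplying numerator and denominator by –1 does not change the value.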

We can restate these in terms of n = b^{k}, using n^{d} = b^{kd}, and k = log_{b} n = lg n / lg b. This means

T(n) = n^{d} (lg n / lg b + 1) if a = b^{d}

T(n) = n^{d} (1 – (a/b^{d})^{(lg n/lg b) + 1}) / (1 – a/b^{d}) if a < b^{d}

T(n) = ((a/b^{d})a^{lg n/lg b} – n^{d}) / (a/b^{d} – 1) if a > b^{d}

We usually work with orders and are not interested in these special exact results. If we only care about the order, we would like to switch the starting recurrence relation to

T(n) = a T(⌊n/b⌋) + f(n), where f(n) ∈ Θ(n^{d}),

and the same orders for the answer will work out. (I will skip the details.) Let us simplify the order expressions for the three cases.

1. Case a/b^{d} = 1:

T(n) ∈ Θ(n^{d} (lg n / lg b + 1))

easily simplifies to

T(n) ∈ Θ(n^{d} log n)

2. Case a/b^{d} < 1:

T(n) ∈ Θ(n^{d} (1 – (a/b^{d})^{(lg n/lg b) + 1}) / (1 – a/b^{d}))

As n goes to infinity, the part (a/b^{d})^{(lg n/lg b) + 1} approaches 0, since the ratio is less than 1 and the exponent grows, so

T(n) ∈ Θ(n^{d})

3. Case a/b^{d} > 1:

Taking logs, this condition means

lg a – d lg b > 0

so

(lg a)/(lg b) > d.

T(n) ∈ Θ(((a/b^{d})a^{lg n/lg b} – n^{d}) / (a/b^{d} – 1)) if a > b^{d}

We need to analyze the part

a^{lg n/lg b}

Use log/exponent rules to see that a = b^{(lg a/lg b)} and n = b^{(lg n/lg b)}, so

a^{lg n/lg b} = (b^{(lg a/lg b)})^{(lg n/lg b)}

= b^{(lg n/lg b)(lg a/lg b)}

= (b^{(lg n/lg b)})^{(lg a/lg b)} = n^{(lg a/lg b)}
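This identity is easy to spot-check numerically (my own sketch; the particular a, b, n are arbitrary picks):

```python
import math

# Spot-check the identity a^(lg n / lg b) = n^(lg a / lg b).
a, b, n = 3, 4, 4 ** 7               # n = b^7, so lg n / lg b = 7 exactly
lhs = a ** (math.log2(n) / math.log2(b))
rhs = n ** (math.log2(a) / math.log2(b))
assert math.isclose(lhs, rhs)
assert math.isclose(lhs, a ** 7)     # with n = b^k the left side is just a^k
```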

Since in this case lg a / lg b > d, the order of n^{(lg a/lg b)} is higher than the order of n^{d}, so after simplifying we can ignore the lower order term, and

T(n) ∈ Θ(n^{(lg a/lg b)})

- - - - - - - - -

We can put this all together now. It is somewhat shorter to state if we introduce the *critical exponent*,

E = lg a / lg b

In terms of E:

T(n) ∈ Θ(n^{d}) if d > E

T(n) ∈ Θ(n^{d} log n) if d = E

T(n) ∈ Θ(n^{E}) if d < E

We can combine the first and third cases to get

**The Master Recursion Theorem**

If T(n) is eventually nondecreasing and

T(n) = a T(⌊n/b⌋) + f(n), for all positive n, where f(n) ∈ Θ(n^{d}),

and we let

E = log_{b} a = lg a / lg b

then

T(n) ∈ Θ(n^{d} log n) if d = E

T(n) ∈ Θ(n^{max(d, E)}) if d ≠ E

The Theorem is also true if the floor function is replaced by ceiling and/or if the class Θ is replaced uniformly by O or Ω.
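The statement is mechanical enough to encode. A small helper (my own sketch; the function name and the returned strings are made up) that classifies a recurrence of this shape:

```python
import math

# Apply the Master Recursion Theorem to T(n) = a*T(n/b) + Theta(n^d).
def master_theorem(a, b, d):
    E = math.log2(a) / math.log2(b)      # critical exponent E = log_b a
    if math.isclose(d, E):
        return f"Theta(n^{d:g} log n)"
    return f"Theta(n^{max(d, E):g})"

# Merge sort, T(n) = 2T(n/2) + Theta(n): a=2, b=2, d=1, so E = 1 and d = E.
assert master_theorem(2, 2, 1) == "Theta(n^1 log n)"
```

Only the case analysis is automated; f(n) must really be in Θ(n^{d}) for the theorem to apply at all.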

Note: In future I will assume integer division in such recurrence relations (same as applying floor function) if I leave out floor or ceiling operations around the division.

This may be the theorem we use most often in the class! Preferably remember it. At least have it handy and keep it in mind!

Examples:

T(n) = 2T(⌊n/2⌋) + n^{2}

T(n) = T(⌊2n/3⌋) + 1

T(n) = 3T(⌈n/4⌉) + 5n + 7

T(n) = 8T(⌊n/2⌋) + n^{2}

Note: On exams, where I am not giving you a calculator, I am likely
to keep the arithmetic simple with examples like the last one, but this
is only "simple" if you remember your exponent rules!

The Θ version of the theorem does not work directly for

T(n) = 2T(⌈n/2⌉) + n lg n

What big Oh statement can you make?
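One way to get a feel for the question is to tabulate T(n) numerically before reasoning it out. A sketch (my own code, assuming the base case T(1) = 1):

```python
import math

# Tabulate T(n) from T(1) = 1, T(n) = 2*T(n/2) + n*lg(n) for n a power
# of 2, and compare its growth against two candidate orders.
def T(n):
    if n == 1:
        return 1
    return 2 * T(n // 2) + n * math.log2(n)

for k in (4, 8, 12, 16):
    n = 2 ** k
    # The n^2 ratio shrinks toward 0 while the n lg n ratio keeps growing,
    # so neither Theta(n lg n) nor Theta(n^2) is tight.
    print(n, T(n) / (n * math.log2(n)), T(n) / n ** 2)
```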

On to More Sorts and Divide and Conquer.