# Mathematical Foundations for Algorithm Analysis

## Asymptotic Notation

### Big O Definition

Definition: Function $f(n)$ is in order of growth $O(g(n))$ if $\textcolor{green}{f(n)\text{'s order of growth}}$ $\le$ $\textcolor{green}{g(n)\text{'s order of growth}}$ (within a constant multiple).

$$ \boxed{ \text{Order of Growth: } f(n) \le c \, g(n) \quad \forall \, n \ge n_0 } $$

where $c$ is a positive constant and $n_0$ is a non-negative integer.

Example: Order of growth, formally

$10n$ is $O(n^2)$

  • Choose $c=1$ and $n_0 = 10$
  • For all $n \ge 10$: $10n \le 1 \times n^2$
  • When $n=10$: $100 \le 100$
  • When $n=100$: $1000 \le 10000$, et cetera.

$5n+20$ is $O(n)$

  • Choose $c=25$ and $n_0 =1$
  • For all $n \ge 1$: $5n + 20 \le 25n$ (constant 20 becomes negligible for large $n$)
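The witness pairs $(c, n_0)$ from both examples can be sanity-checked numerically. A quick Python sketch (the helper `holds` is illustrative, not part of the definition):

```python
# Check f(n) <= c*g(n) for a sample of n >= n0 (the Big O condition).
def holds(f, g, c, n0, trials=1000):
    """Return True if f(n) <= c*g(n) for all sampled n >= n0."""
    return all(f(n) <= c * g(n) for n in range(n0, n0 + trials))

# 10n is O(n^2) with c = 1, n0 = 10
print(holds(lambda n: 10 * n, lambda n: n * n, c=1, n0=10))      # True
# 5n + 20 is O(n) with c = 25, n0 = 1
print(holds(lambda n: 5 * n + 20, lambda n: n, c=25, n0=1))      # True
```

A finite sample can't prove the bound for all $n$, of course, but it's a handy way to catch a wrong choice of $c$ or $n_0$.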

### Key Properties

1. Every function bounds itself

$$ f(n) \in O(f(n)) $$

2. Upper and lower bounds are flipsides

$$ f(n) \in O(g(n)) \text{ iff } g(n) \in \Omega ( f(n)) $$

3. Transitivity

$$ \text{If $f(n) \in O(g(n))$ and $g(n) \in O(h(n))$,} \\ \text{then $f(n) \in O(h(n))$} $$

Layman's: "If $A \le B$ and $B \le C$, then $A \le C$"

4. Addition Rule

$$ \text{If $f_1(n) \in O(g_1(n))$ and $f_2(n) \in O(g_2(n))$, then} \\ \text{$f_1 (n) + f_2 (n) \in O(\max\{ g_1(n), g_2(n) \})$} $$

Layman's: When adding functions, only the fastest-growing one matters.

  • e.g., If you have an algorithm that does $O(n^2)$ work and then $O(n)$ work, the total is $O(n^2)$ because the smaller term gets absorbed.
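That absorption is easy to see by counting operations. A small Python sketch (a made-up two-phase routine, not from the notes):

```python
# Addition rule: an n^2 phase followed by an n phase is still Theta(n^2).
def two_phase_ops(n):
    ops = 0
    for i in range(n):        # phase 1: Theta(n^2) work
        for j in range(n):
            ops += 1
    for i in range(n):        # phase 2: Theta(n) work
        ops += 1
    return ops

for n in (10, 100, 1000):
    # total = n^2 + n, so total / n^2 = 1 + 1/n -> 1 as n grows
    print(n, two_phase_ops(n) / n**2)
```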

### Limit Method

Alternatively, instead of finding $c$ and $n_0$, we can use limits to compare orders of growth.

$$ \lim_{n \to \infty} \frac{T(n)}{g(n)} = \begin{cases} 0 &: \text{$T(n)$ grows slower than $g(n)$} \\ c > 0 &: \text{$T(n)$ grows at the same rate as $g(n)$} \\ \infty &: \text{$T(n)$ grows faster than $g(n)$} \end{cases} $$

Example: Comparing Orders of Growth

Q: In each pair, which function grows faster?

  1. $10n$ v.s. $n^2$
  2. $n(n+1)/2$ v.s. $n^2$

A:

  1. $n^2$ grows faster ($\lim_{n \to \infty} \frac{10n}{n^2} = 0$)
  2. They grow at the same rate ($\lim_{n \to \infty} \frac{n(n+1)/2}{n^2} = \frac{1}{2} > 0$)

## Standard Complexity Cases

Some fundamental rules to remember.

1. Logarithm Base Doesn't Matter

All logarithmic functions belong to the same class $\Theta (\log n)$ no matter what the logarithm's base is.

  • e.g., $\log_2 n$, $\log_{10} n$, $\ln n$ are all $\Theta( \log n )$

Why?: The change-of-base formula, $\log_a n = \frac{\log_b n}{\log_b a}$, shows any two log bases differ only by the constant factor $\frac{1}{\log_b a}$.
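This constant factor is easy to observe directly; a quick Python check:

```python
import math

# log2(n) / log10(n) is the same constant for every n,
# since log_a(n) = ln(n) / ln(a):
#   log2(n) / log10(n) = ln(10) / ln(2) ~ 3.3219
for n in (10, 1000, 10**6):
    print(round(math.log2(n) / math.log10(n), 6))
```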

2. Polynomial Degree Rules

All polynomials of the same degree $k$ belong to the same class:

$$ a_k n^k + a_{k-1} n^{k-1} + ... + a_0 \in \Theta (n^k) $$

3. Exponential Bases Matter

Exponential functions $a^n$ have different orders of growth for different $a$'s.

  • e.g., $2^n$ vs $3^n$ are different growth rates (unlike logarithms)

4. The Hierarchy

You should know the fundamental ordering of complexity classes.

| Complexity | Name |
| --- | --- |
| $\Theta( 1 )$ | constant |
| $\Theta( \log n )$ | logarithmic |
| $\Theta( n )$ | linear |
| $\Theta( n \log n )$ | $n$-$\log$-$n$ or linearithmic |
| $\Theta( n^2 )$ | quadratic |
| $\Theta( n^3 )$ | cubic |
| $\Theta( 2^n )$ | exponential |
| $\Theta( n! )$ | factorial |

## Essential Summation Formulas

Notation Reference:

  • $l$: Lower bound
  • $u$: Upper bound
  • $m$: Middle point (for splitting sums)

1. Counting

For simple loops that run $n$ times.

$$ \sum_{l \le i \le u} 1 = 1 + 1 + \dots + 1 = u - l + 1 $$

  • In particular, $\sum_{1 \le i \le n} 1 = n - 1 + 1 = n \in \Theta(n)$

Example: Simple loop

```
for i <- 1 to n
	do something constant
```
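The pseudocode above translates directly into a Python count (the loop body is a stand-in for any constant-time operation):

```python
# Count the constant-time operations in the simple loop.
def count_ops(n):
    ops = 0
    for i in range(1, n + 1):   # for i <- 1 to n
        ops += 1                # do something constant
    return ops

print(count_ops(100))   # 100, matching sum_{1<=i<=n} 1 = n
```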

2. Arithmetic Series

For nested loops where the inner loop depends on the outer loop.

$$ \sum_{1 \le i \le n} i = 1 + 2 + \dots + n = n(n+1)/2 \approx n^2 / 2 \in \Theta(n^2) $$

Example: Nested loop with dependency

```
for i <- 1 to n
	for j <- 1 to i
		do something constant
```
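Counting the inner-loop executions in Python confirms the closed form (a direct translation of the pseudocode above):

```python
# Count operations in the dependent nested loop.
def count_ops(n):
    ops = 0
    for i in range(1, n + 1):       # for i <- 1 to n
        for j in range(1, i + 1):   # for j <- 1 to i
            ops += 1
    return ops

n = 100
print(count_ops(n), n * (n + 1) // 2)   # both 5050
```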

3. Sum of Squares

For loops where the inner work grows quadratically with the outer variable.

$$ \sum_{1 \le i \le n} i^2 = 1^2 + 2^2 + \dots + n^2 = n(n+1)(2n+1)/6 \approx n^3 / 3 \in \Theta(n^3) $$

Example: Quadratic dependency

```
for i <- 1 to n
	for j <- 1 to i^2
		do something constant
```

4. Geometric Series

For algorithms that double or halve the work at each step.

$$ \sum_{0 \le i \le n} a^i = 1 + a + \dots + a^n = (a^{n+1} - 1) / (a - 1) \text{ for any } a \ne 1 $$

  • In particular, $\sum_{0 \le i \le n} 2^i = 2^0 + 2^1 + ... + 2^n = 2^{n+1} - 1 \in \Theta(2^n)$

Example: Exponentially growing work

```
operations <- 1
for i <- 1 to n
	do 'operations' amount of work
	operations <- operations * 2
```
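In Python form, the loop above performs $\sum_{0 \le i \le n-1} 2^i = 2^n - 1$ total units of work (note the sum runs to $n-1$, since the loop has $n$ iterations):

```python
# Total work when the work doubles on each of n iterations.
def count_ops(n):
    ops = 0
    operations = 1
    for i in range(1, n + 1):   # for i <- 1 to n
        ops += operations       # do 'operations' amount of work
        operations *= 2         # operations <- operations * 2
    return ops

n = 10
print(count_ops(n), 2**n - 1)   # both 1023
```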

5. Summation Properties

Assorted properties for simplifying complex expressions.

$$ \sum ( a_i \pm b_i ) = \sum a_i \pm \sum b_i $$

$$ \sum c \, a_i = c \sum a_i $$

$$ \sum_{l \le i \le u} a_i = \sum_{l \le i \le m} a_i + \sum_{m + 1 \le i \le u} a_i $$