- The Role of Algorithm
- Asymptotic Complexity
- Asymptotic Notation & Growth of Functions
- Comparing Two Functions
- Order of Growth
- General Terms to remember
- Asymptotic Complexity - Examples
- Worksheet 1.1: Growth of Functions and Run-Time Estimation
- Worksheet 1.2: Asymptotic Complexity
- Space Complexity
- Reference
1. Data Structures & Algorithms (DS & Algo)
2. Hardware (H/W): e.g., one desktop vs. another
3. Compiler: e.g., a 4-6 cycle optimization
4. Operating System (OS): e.g., how it links system libraries
5. Programming Language (Prog. Lang): e.g., Microsoft Visual C++ vs. GCC

Note: Items 2-5 are considered secondary factors; none of them gives a 100-times improvement in program performance.
A good algorithm can make a significant difference in performance.
- Merge Sort: has a time complexity of $O(n \log n)$.
- Insertion Sort: has a time complexity of $O(n^2)$.

Calculation of Speedup:

- Speedup = $\frac{O(n^2)}{O(n \log n)} = \frac{n}{\log n}$
- For an input size $n = 1000$: Speedup = $\frac{1000}{\log_2(1000)} \approx \frac{1000}{10} = 100$
This shows that Merge Sort is approximately 100 times faster than Insertion Sort for an input size of 1000, highlighting the importance of choosing the right algorithm.
Limitations of Asymptotic Notation
- We don't account for memory and caching.
- We assume all operations take the same amount of time in our model.
- We ignore constants and lower-order terms because they become insignificant for large input sizes.
Example: For the function $t(n) = 4n^2 + 6n + 16$, the term $4n^2$ dominates as $n$ becomes large. Therefore, the complexity is $O(n^2)$.
Real-world Performance Example

- A 1-GHz processor can perform $10^9$ cycles per second.
- Assuming an average operation takes 10 cycles, the processor can execute $\frac{10^9}{10} = 10^8$ operations per second.
- Case 1: $O(n)$ algorithm. For an input size $n = 10^5$, the time taken is $\frac{10^5 \text{ operations}}{10^8 \text{ ops/sec}} = 10^{-3}$ seconds, or 1 millisecond.
- Case 2: $O(n^2)$ algorithm. For an input size $n = 10^5$, the number of operations is $(10^5)^2 = 10^{10}$, so the time taken is $\frac{10^{10} \text{ operations}}{10^8 \text{ ops/sec}} = 10^2$ seconds, or 100 seconds.
- This demonstrates why an asymptotically worse algorithm can become impractical for large inputs, even on a fast machine.
Consider an $O(2^n)$ exponential algorithm running on a machine that can execute $10^8$ operations/second.

Some Terms to Remember

- 1K = $2^{10} \approx 10^3$
- 1M = $2^{20} \approx 10^6$
- 1G = $2^{30} \approx 10^9$

For $n = 50$:

- Time taken, $t = \frac{\text{no. of ops}}{\text{speed}} = \frac{2^{50}}{10^8} \approx \frac{(10^3)^5}{10^8} = \frac{10^{15}}{10^8} = 10^7 \text{ seconds}$
- $10^7 \text{ seconds} \approx 0.317$ years

For $n = 100$:

- Time taken, $t = \frac{2^{100}}{10^8} = \frac{2^{50} \cdot 2^{50}}{10^8} = \frac{2^{50}}{10^8} \cdot 2^{50} \approx 0.317 \text{ years} \cdot 2^{50}$
- $2^{50}$ is an incredibly large number, so the time taken would be billions of years, even on a supercomputer.
This demonstrates that even with a significant increase in hardware speed, for a highly inefficient algorithm, the increase in input size from 50 to 100 makes the problem practically unsolvable. Therefore, we need to use efficient algorithms.
Problem:

- Consider an algorithm with an asymptotic complexity of $O(n^2)$.
- If the algorithm takes 3 seconds to process an input of size $10^4$, how much time will it take to process an input of size $10^5$ on the same machine?

Hint: $\text{Time} = \frac{\text{no. of ops}}{\text{speed}}$
Solution:

- Let $t_1$ be the time taken for input size $n_1$, and $t_2$ be the time for input size $n_2$.
- The number of operations for an $O(n^2)$ algorithm is proportional to $n^2$, and the speed of the machine is constant.
- $t_1 = \frac{n_1^2}{\text{speed}}$ (1)
- $t_2 = \frac{n_2^2}{\text{speed}}$ (2)
- Dividing equation (2) by (1): $\frac{t_2}{t_1} = \frac{n_2^2 / \text{speed}}{n_1^2 / \text{speed}} = \left(\frac{n_2}{n_1}\right)^2$
- Substituting the given values:

$$t_2 = \left(\frac{n_2}{n_1}\right)^2 \times t_1 = \left(\frac{10^5}{10^4}\right)^2 \times 3 \text{ seconds} = 100 \times 3 \text{ seconds} = 300 \text{ seconds}$$
Answer: It will take 300 seconds (or 5 minutes) to process the larger input.
- Lucky (Best Case): input already sorted, $T_{BC}(n) = \theta(n)$
- Unlucky (Worst Case): input sorted in reverse order
  - Lower Bound: order $n$, $T(n) = \Omega(n)$
  - Upper Bound: order $n^2$, $T(n) = O(n^2)$
  - Tight Bound: order $n^2$, $T(n) = \frac{n(n-1)}{2} = \theta(n^2)$
- $O$: upper bound ($\le$)
- $\Omega$: lower bound ($\ge$)
- $\theta$: tight asymptotic bound ($=$)
- $o$: exclusive upper bound ($<$)
- $\omega$: exclusive lower bound ($>$)
Given two functions $f(n)$ and $g(n)$:

- If $\lim_{n\to\infty} \frac{f(n)}{g(n)} = 0$:
  - $f(n) = o(g(n))$ [most precise]: implies $f(n) < g(n)$ asymptotically
  - $g(n) = \omega(f(n))$ [most precise]: implies $g(n) > f(n)$
  - $f(n) = O(g(n))$: implies $f(n) \le g(n)$
  - $g(n) = \Omega(f(n))$: implies $g(n) \ge f(n)$
Example

Given two functions: $f(n) = n$ and $g(n) = 2n^2$.

The limit as $n$ approaches infinity: $\lim_{n \to \infty} \frac{f(n)}{g(n)} = \lim_{n \to \infty} \frac{n}{2n^2} = \lim_{n \to \infty} \frac{1}{2n} = 0$

Since the limit is zero, $f(n)$ grows strictly slower than $g(n)$. Therefore, the following relationships hold:

- $f(n) = o(g(n))$: the most precise answer. It states that $f(n)$ is a "little-o" of $g(n)$, meaning $f(n)$ grows strictly slower than $g(n)$.
- $g(n) = \omega(f(n))$: also a precise answer. It states that $g(n)$ is a "little-omega" of $f(n)$, meaning $g(n)$ grows strictly faster than $f(n)$.
- $f(n) = O(g(n))$: correct but less precise. Big-O indicates that $f(n)$ grows at a rate less than or equal to $g(n)$, which is true since it grows strictly slower.
- $g(n) = \Omega(f(n))$: correct but less precise. Big-Omega indicates that $g(n)$ grows at a rate greater than or equal to $f(n)$, which is true since it grows strictly faster.
- If $\lim_{n\to\infty} \frac{f(n)}{g(n)} = k$ (a non-zero constant):
  - $f(n) = \theta(g(n))$ [most precise]: implies $f(n)$ and $g(n)$ grow at the same rate
  - $f(n) = O(g(n))$ [correct, less precise]
  - $f(n) = \Omega(g(n))$ [correct, less precise]
Example

Given two functions: $f(n) = n^2$ and $g(n) = 4n^2$.

The limit as $n$ approaches infinity: $\lim_{n \to \infty} \frac{f(n)}{g(n)} = \lim_{n \to \infty} \frac{n^2}{4n^2} = \frac{1}{4}$

Since the limit of $\frac{f(n)}{g(n)}$ as $n$ approaches infinity is a non-zero constant, both functions grow at the same rate. Therefore, the following relationships hold:

- $f(n) = \theta(g(n))$: the most precise answer. Big-Theta signifies that $f(n)$ and $g(n)$ have the same asymptotic growth rate, meaning they are tightly bound to each other.
- $f(n) = O(g(n))$: also correct. Big-O indicates that $f(n)$ grows at a rate less than or equal to $g(n)$, which is true since they grow at the same rate.
- $f(n) = \Omega(g(n))$: also correct. Big-Omega indicates that $f(n)$ grows at a rate greater than or equal to $g(n)$, which is also true since they grow at the same rate.
- If $\lim_{n \to \infty} \frac{f(n)}{g(n)} = \infty$:
  - $f(n) = \omega(g(n))$ [most precise]
  - $g(n) = o(f(n))$ [most precise]
  - $f(n) = \Omega(g(n))$
  - $g(n) = O(f(n))$
| Complexity |
|---|
| log(log n) |
| n |
| ... |
| ... |
| n! |
- Logarithm Rules

| Rule Name | Formula | Notes |
|---|---|---|
| Definition | $\log_a x = y \iff a^y = x$ | Base relation |
| Product Rule | $\log_a(xy) = \log_a x + \log_a y$ | Multiply → Add |
| Division Rule | $\log_a(x/y) = \log_a x - \log_a y$ | Divide → Subtract |
| Power Rule | $\log_a(x^k) = k \log_a x$ | Power comes down |
| Change of Base | $\log_a x = \frac{\log_b x}{\log_b a}$ | Convert base |
| Log of 1 | $\log_a 1 = 0$ | |
| Log of Base | $\log_a a = 1$ | |
| Inverse Rule | $\log_a(a^x) = x$ | Cancels out |
| Exponential | $a^{\log_a x} = x$ | Reverse |
- Seconds Conversion
1 second = 1000 milliseconds (ms)
1 ms = 1000 microseconds (µs) = 10^-3 seconds
1 µs = 1000 nanoseconds (ns) = 10^-6 seconds
1 ns = 10^-9 seconds
1 hour = 3600 seconds
1 day = 24 hours
1 month ≈ 30 days
1 year ≈ 365 days
```c
for (i = 3; i <= n; i++)        // step 1 -> θ(n)
    op;

for (i = 3; i <= 2*n; i++)      // bound 2n -> θ(n)
    op;

for (i = 3; i <= n; i *= 2)     // multiplicative step -> θ(log n)
    op;

for (i = 3; i <= n; i += 2)     // step 2 -> θ(n)
    op;
```
- Core Principle: In asymptotic notation ($O, \Omega, \theta$), the base of the logarithm is not significant and is conventionally dropped.
- Mathematical Proof (Change of Base Formula): The change of base formula for logarithms is: $$\log_a x = \frac{\log_b x}{\log_b a}$$
- Application to Complexity:
  - If we have an algorithm running in $O(\log_2 n)$ time, and we want to change the base to $10$: $$\log_2 n = \frac{\log_{10} n}{\log_{10} 2}$$
  - Since $\log_{10} 2$ is a constant value (approximately 0.301), the expression becomes: $$\log_2 n = (\text{Constant}) \times \log_{10} n$$
  - In asymptotic analysis, we discard all constant factors. Therefore, $O(\log_2 n)$ is equivalent to $O(\log_{10} n)$, and both are simply written as $O(\log n)$.
```c
for (i=1 ; i<=n ; i++)          // -> θ(n)
    for (j=1 ; j<=10 ; j++)     // -> θ(1)
        if (cond)
            op
        else
            break
```

```c
for (i=1 ; i<=n ; i++)          // -> θ(n)
    for (j=1 ; j<=n*n ; j++)    // -> θ(n^2)
        if (cond)
            op
        else
            break
```

```c
for (i=1 ; i<=n ; i++)          // -> θ(n)
    for (j=i ; j<=i+3 ; j++)    // -> θ(4) -> θ(1)
        if (cond)
            op
```

```c
for (i=1 ; i<=n ; i++)              // -> θ(n)
    for (j=1 ; j<=n*n ; j*=2)       // -> θ(log_2 n^2) = θ(log n)
        if (cond)
            op
        else
            break
```

```c
// Section 1: Nested Loops -> θ(n) * θ(n/5) = θ(n^2)
for (i=1 ; i<=n ; i++)
    for (j=1 ; j<=n ; j+=5)
        op

// Section 2: Sequential Loop -> θ(n/2) = θ(n)
for (k=1 ; k<=n/2 ; k++)
    op
```

Total for the last example: θ(n^2) + θ(n) = θ(n^2).

Given Functions:
$f(n) = n^2 + n + 5$ $g(n) = 5n^3 + 2$
Analysis:

As $n \to \infty$:

- For $f(n)$, the dominant term is $n^2$.
- For $g(n)$, the dominant term is $5n^3$.

Since $g(n)$'s dominant term grows faster, $\lim_{n\to\infty} \frac{f(n)}{g(n)} = \lim_{n\to\infty} \frac{n^2}{5n^3} = \lim_{n\to\infty} \frac{1}{5n} = 0$.

Since the limit is zero, the following asymptotic relationships hold:

| Relationship | Notation | Implication |
|---|---|---|
| $f(n)$ grows strictly slower than $g(n)$ | $f(n) = o(g(n))$ | Most precise |
| $g(n)$ grows strictly faster than $f(n)$ | $g(n) = \omega(f(n))$ | Most precise |
| $f(n)$ grows no faster than $g(n)$ | $f(n) = O(g(n))$ | Correct, less precise |
| $g(n)$ grows at least as fast as $f(n)$ | $g(n) = \Omega(f(n))$ | Correct, less precise |
Question 1: For each pair of functions below, first determine which function grows asymptotically faster, and then express the relation between the two functions using all the asymptotic notations (θ, O, Ω, o and ω) that apply. Only give the relations in which f(n) appears on the left-hand side. Justify your answer, and use the limit definition if necessary.
- $f(n) = n^2 + (\log n)^3$ vs. $g(n) = n^2 \log n$ (no justification is required)
- $f(n) = 4^n$ vs. $g(n) = 3^n + n$ (justify your answer using the limit definition; show your work)
Justification: We compare the highest-order terms of the functions.

- $f(n)$'s dominant term is $n^2$.
- $g(n)$'s dominant term is $n^2 \log n$.

Since $\lim_{n\to\infty} \frac{f(n)}{g(n)} = \lim_{n\to\infty} \frac{n^2 + (\log n)^3}{n^2 \log n} = \lim_{n\to\infty} \left(\frac{1}{\log n} + \frac{(\log n)^2}{n^2}\right) = 0$

Since the limit is zero, $f(n)$ grows strictly slower than $g(n)$, so $g(n)$ grows asymptotically faster.

Conclusion:

| Notation | Relation | Implication |
|---|---|---|
| Little-o (o) | $f(n) = o(g(n))$ | $f(n)$ grows strictly slower than $g(n)$ (most precise) |
| Big-O (O) | $f(n) = O(g(n))$ | $f(n)$ grows no faster than $g(n)$ (correct, less precise) |
Justification: We must use the limit definition since the bases of the exponential terms differ.

$\lim_{n\to\infty} \frac{f(n)}{g(n)} = \lim_{n\to\infty} \frac{4^n}{3^n + n}$

We can simplify the fraction by dividing the numerator and denominator by the highest-growing term in the denominator, which is $3^n$:

$\lim_{n\to\infty} \frac{(4/3)^n}{1 + n/3^n}$

We know that:

- $\lim_{n\to\infty} (4/3)^n = \infty$ (since $4/3 > 1$)
- $\lim_{n\to\infty} \frac{n}{3^n} = 0$ (exponentials grow faster than polynomials)

Since the limit is infinity, $f(n)$ grows strictly faster than $g(n)$.

Conclusion:

| Notation | Relation | Implication |
|---|---|---|
| Little-omega ($\omega$) | $f(n) = \omega(g(n))$ | $f(n)$ grows strictly faster than $g(n)$ (most precise) |
| Big-Omega ($\Omega$) | $f(n) = \Omega(g(n))$ | $f(n)$ grows at least as fast as $g(n)$ (correct, less precise) |
Question 2: If we know that the running time $T(n)$ of some algorithm satisfies the relations $T(n) = o(n^{2.5} \log n)$ and $T(n) = \Omega(n (\log n)^3)$, which of the following functions could describe $T(n)$?

- $n^{2.5} \log n$
- $n (\log n)^3$
- $n (\log n)^5$
- $n^{1.01}$
- $n^{2.6}$

This means $T(n)$ must satisfy two conditions:

- Upper Bound (Strictly Less Than): $T(n)$ must grow strictly slower than $n^{2.5} \log n$.
- Lower Bound (Greater Than or Equal To): $T(n)$ must grow at a rate equal to or faster than $n (\log n)^3$.

Checking each candidate:

| Candidate Function | Strictly slower than $n^{2.5} \log n$? | At least as fast as $n (\log n)^3$? | Result |
|---|---|---|---|
| $n^{2.5} \log n$ | No (grows at the same rate) | Yes | Fails |
| $n (\log n)^3$ | Yes | Yes (grows at the same rate) | Possible |
| $n (\log n)^5$ | Yes | Yes (grows faster than $n (\log n)^3$) | Possible |
| $n^{1.01}$ | Yes | Yes ($n^{0.01}$ grows faster than any power of $\log n$) | Possible |
| $n^{2.6}$ | No (grows faster than $n^{2.5} \log n$) | Yes | Fails |

Possible Functions: $n (\log n)^3$, $n (\log n)^5$, $n^{1.01}$
Question 3: Consider two algorithms for solving a certain problem: Algorithm X with an asymptotic complexity of $\theta(n^2 \log_2 n)$ and Algorithm Y with an asymptotic complexity of $\theta(n^3)$.

We need to find the speed of the machine running Algorithm Y ($S_Y$) such that both algorithms take the same amount of time.

Given:

- Input size: $n = 1{,}000{,}000 = 10^6$
- Algorithm X Complexity: $T_X(n) = \theta(n^2 \log_2 n)$
- Algorithm Y Complexity: $T_Y(n) = \theta(n^3)$
- Machine X Speed: $S_X = 10^7$ operations/second

For $n = 10^6$, we use the common approximation $10^3 \approx 2^{10}$, so $\log_2 10^6 \approx 20$.

Using the dominant terms for the operation counts:

- Operations for X: $n^2 \log_2 n \approx (10^6)^2 \times 20 = 2 \times 10^{13}$
- The time taken is $T_X = \frac{2 \times 10^{13}}{10^7} = 2 \times 10^6$ seconds
- Operations for Y: $n^3 = (10^6)^3 = 10^{18}$

Since both algorithms must finish in the same time: $S_Y = \frac{10^{18}}{2 \times 10^6} = 5 \times 10^{11}$ operations/second.

Answer: The machine running Algorithm Y must have a speed of approximately $5 \times 10^{11}$ operations/second.
Find the asymptotic complexity of each of the following algorithms. Assume that the input size is n and that Op is a constant-time operation. Show your analysis. If it is possible to analyze each loop separately, you must give the asymptotic complexity for each loop, and then show how you combine them to compute the asymptotic complexity for the whole algorithm.
Notation / assumptions:

- `Op` is a constant-time operation.
- `log n` means logarithm in any constant base (the Θ class is the same).
- `sqrt(n)` denotes the square root of n.
```c
for (i=1; i < n/2; i+=3) {
    for (j=1; j < sqrt(n); j*=4)
        Op;
    for (k=3; k < n*n; k+=2)
        for (m=k-2; m <= k+2; m++)
            Op;
}
```

Analysis (per loop):

- `i` loop: iterations ≈ (n/2)/3 = Θ(n).
- `j` loop: geometric by ×4 until < sqrt(n) → iterations = Θ(log_4 sqrt(n)) = Θ(log n).
- `k` loop: Θ(n²).
- `m` loop: constant 5 iterations = Θ(1).
- `k`/`m` pair: Θ(n²).
Per i iteration cost = Θ(log n) + Θ(n²) = Θ(n²).
Total = Θ(n) * Θ(n²) = Θ(n³).
Answer: Θ(n³).
```c
for (i=1; i < 4*n; i+=2) {
    for (j=i+5; j >= i; j--)
        Op;
}
for (k=1; k < n*n*n; k*=5) {
    for (m=1; m < log(n); m++)
        Op;
}
```

Analysis:

- First double loop: `i` → Θ(n), inner `j` is a fixed 6 iterations → Θ(1) ⇒ Θ(n).
- Second double loop: `k` geometric up to n³ → Θ(log n), `m` → Θ(log n) ⇒ Θ((log n)²).
- Combined: Θ(n) + Θ((log n)²) = Θ(n).
Answer: Θ(n).
```c
for (i=3; i < n; i++) {
    for (j=i; j <= 2*n; j++)
        Op;
}
```

Analysis: the inner loop runs $2n - i + 1 = \Theta(n)$ times for each of the $\Theta(n)$ values of `i`.

Answer: Θ(n²).
Array of size n (even): <3, 8, 3, 8, … , 3, 8>

```
InsertionSort(A, n) {
    for (i=1; i < n; i++) {
        key = A[i];
        for (j=i-1; j>=0 && A[j] > key; j--)
            A[j+1] = A[j];
        A[j+1] = key;
    }
}
```

Answer: Asymptotic runtime: Θ(n²). Each 3 (at the even positions) must shift past all the 8s that precede it, about $i/2$ shifts for position $i$, giving roughly $\frac{n^2}{8} = \Theta(n^2)$ shifts in total.
Total memory used by an algorithm.

Includes:

- Input space
- Output space
- Temporary variables
- Function call stack

Space Complexity = Input Space + Auxiliary Space
Auxiliary Space: extra memory used by the algorithm (excluding input).
Includes:
- Temporary variables
- Data structures
- Recursion stack
- 1️⃣ Sum of Array

```c
int sum(int arr[], int n) {
    int s = 0;
    for (int i = 0; i < n; i++)
        s += arr[i];
    return s;
}
```

Input array → O(n); extra variable → O(1)

👉 Auxiliary Space = O(1)
👉 Space Complexity = O(n)
- 2️⃣ Merge Sort
Input array → O(n) Temporary array → O(n)
👉 Auxiliary Space = O(n) 👉 Space Complexity = O(n)
- 3️⃣ Recursion (Factorial)

```c
int fact(int n) {
    if (n <= 1) return 1;   /* <= 1 also guards against n = 0 */
    return n * fact(n - 1);
}
```

👉 Recursion stack → O(n)
👉 Auxiliary Space = O(n)
👉 Space Complexity = O(n)
I would like to extend my sincere gratitude to Dr. Ghassan Shobaki for providing an incredibly helpful YouTube playlist that greatly assisted in the development and understanding of concepts of Algorithm Analysis.
The detailed explanations and clear examples in the lectures were invaluable.
- Playlist: Algorithms - Ghassan Shobaki
- Channel: Ghassan Shobaki Computer Science Lectures