
Commit a719612

Merge pull request #1622 from hashfx/issue-1519
fixes #1519
2 parents 2564ee5 + 2fc2cf2 commit a719612

22 files changed, +1821 −0 lines changed

DSA-Python/Algorithms/BigO.md

Lines changed: 96 additions & 0 deletions
@@ -0,0 +1,96 @@
# Big O Notation

### Big O notation is used to measure how the running time or space requirements of your program grow as the input size grows

## Order of Magnitude

### Time complexity does not tell you the exact number of times the code inside a loop is executed; it only shows the order of magnitude. Whether the code inside a loop runs 3n, n+5, or n/2 times, the time complexity in each case is O(n)
## Rules for Big O Notation

```time = a * n + b```

+ ### Keep the fastest growing term

#### Big O refers to very large values of n. Hence, if you have a function like:

```javascript
let n = 1000
let time = 5*n**2 + 3*n + 20

// when n is very large, the lower-order terms (3*n + 20) become irrelevant:
// time = 5*1000**2 + 3*1000 + 20
//      = 5000000 + 3020
//      ~ 5000000, since terms other than 5*n**2 contribute a negligible amount
```

#### `time = a * n` (the constant term b is dropped)

+ ### Drop constants

#### `time = n` (the constant factor a is dropped)

`time = O(n)`
## Measuring running time growth (Time Complexity)

+ ## Constant Time Complexity O(1)

### The running time of a constant-time algorithm does not depend on the input size

```js
time = a
time = O(1) // applying rules for Big O :: keep the fastest growing term (a)
```

```python
def foo(arr):      # running time is nearly constant as the input size grows
    return arr[0]  # a fixed amount of work, e.g. one lookup

# len(arr) = 100  -> 0.22 milliseconds
# len(arr) = 1000 -> 0.23 milliseconds
```
+ ## Logarithmic Time Complexity O(log n)

### A logarithmic algorithm often halves the input size at each step. The running time of such an algorithm is logarithmic, because log2(n) equals the number of times n must be divided by 2 to reach 1
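
#### As an added sketch (not part of the original file), repeatedly halving `n` and counting the steps takes about log2(n) iterations; the helper name `count_halvings` is illustrative:

```python
def count_halvings(n):
    """Count how many times n can be halved before reaching 1 -> ~log2(n) steps."""
    steps = 0
    while n > 1:
        n //= 2      # the input size is halved at every step
        steps += 1
    return steps

print(count_halvings(1024))  # 10, since 2**10 == 1024
```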
+ ## Square Root Algorithm O(sqrt(n))

### A square root algorithm is slower than O(log n) but faster than O(n). A special property of square roots is that ```sqrt(n) = n/sqrt(n)```, so the square root `sqrt(n)` lies, in some sense, in the middle of the input
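
#### As a hedged illustration (added here, not in the original file), trial-division primality testing only needs to check divisors up to sqrt(n), because any factor larger than sqrt(n) pairs with one smaller than sqrt(n):

```python
import math

def is_prime(n):
    """Trial division up to sqrt(n) -> O(sqrt(n)) time."""
    if n < 2:
        return False
    for d in range(2, math.isqrt(n) + 1):  # only ~sqrt(n) candidate divisors
        if n % d == 0:
            return False
    return True

print(is_prime(97))  # True
```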
+ ## Linear Time Complexity O(n)

### A linear algorithm goes through the input a constant number of times

```js
time = a * n + b
time = O(n) // applying rules for Big O :: keep the fastest growing term (a*n), drop the constants (a, b)
```

```python
def foo(arr):        # running time grows linearly with the input size
    return sum(arr)  # touches every element once

# len(arr) = 100  -> 0.22 milliseconds
# len(arr) = 1000 -> 2.30 milliseconds
```
+ ## O(n log n)

### This time complexity often indicates that the algorithm sorts the input, because the time complexity of efficient sorting algorithms is O(n log n)
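
#### For illustration (an addition, not part of the original file), merge sort is a classic O(n log n) algorithm: the list is halved log2(n) times, and each level of merging does O(n) work:

```python
def merge_sort(arr):
    """Classic O(n log n) sort: log2(n) levels of halving, O(n) merge work per level."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])   # sort each half recursively
    right = merge_sort(arr[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):  # merge the two sorted halves
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([12, 4, 20, 8, 16, 2]))  # [2, 4, 8, 12, 16, 20]
```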
+ ## Quadratic Time Complexity O(n^2)

### A quadratic algorithm often contains two nested loops

```js
time = a * n**2 + b
time = O(n^2) // applying rules for Big O :: keep the fastest growing term (a*n**2), drop the constants (a, b)

/* Note: the result is O(n^2), not O(n^2 + n) */
time = a * n**2 + b * n + c
time = O(n^2) // keep the fastest growing term (a*n**2), drop the constants (a, b, c)
```
+ ## Cubic Time Complexity O(n^3)

### A cubic algorithm often contains three nested loops
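
#### As an added sketch, three nested loops over the same list give O(n^3) behaviour; the helper below (hypothetical, not from the original) checks whether any triplet sums to zero:

```python
def has_zero_triplet(nums):
    """Three nested loops -> O(n^3) time."""
    n = len(nums)
    for i in range(n):
        for j in range(i + 1, n):
            for k in range(j + 1, n):
                if nums[i] + nums[j] + nums[k] == 0:
                    return True
    return False

print(has_zero_triplet([3, -1, -2, 5]))  # True: 3 + (-1) + (-2) == 0
```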
+ ## O(2^n)

### This time complexity often indicates that the algorithm iterates through all subsets of the input elements
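
#### A short added sketch (not in the original file): enumerating every subset of a list walks through 2^n bitmasks, which is why subset enumeration is O(2^n):

```python
def all_subsets(items):
    """Enumerate all 2**n subsets of items using bitmasks -> O(2^n) time."""
    n = len(items)
    subsets = []
    for mask in range(2 ** n):  # one bitmask per subset
        subsets.append([items[i] for i in range(n) if mask & (1 << i)])
    return subsets

print(len(all_subsets([1, 2, 3])))  # 2**3 == 8 subsets
```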
+ ## O(n!)

### This time complexity often indicates that the algorithm iterates through all permutations of the input elements
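
#### Similarly (an added sketch), iterating over every ordering of the input visits n! permutations; Python's itertools.permutations makes the growth easy to see:

```python
from itertools import permutations

def all_orderings(items):
    """Iterate through all n! permutations of items -> O(n!) time."""
    return list(permutations(items))

print(len(all_orderings([1, 2, 3, 4])))  # 4! == 24
```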
### Note:

#### An algorithm is polynomial if its time complexity is at most O(n^k), where k is a constant.

#### All of the above time complexities except O(2^n) and O(n!) are polynomial

### Given the input size, we can try to guess the required time complexity of the algorithm that solves the problem.

### The following table contains some useful estimates, assuming a time limit of one second
|Input Size|Time Complexity|
|---|---|
|n <= 10|O(n!)|
|n <= 20|O(2^n)|
|n <= 500|O(n^3)|
|n <= 5000|O(n^2)|
|n <= 10^6|O(n log n) or O(n)|
|n is large|O(1) or O(log n)|

DSA-Python/Algorithms/BigO.py

Lines changed: 158 additions & 0 deletions
@@ -0,0 +1,158 @@
"""
Big O Notation
Big O notation is used to measure how the running time or space requirements of your program grow as the input size grows

Order of Magnitude
Time complexity does not tell you the exact number of times the code inside a loop is executed;
it only shows the order of magnitude. Whether the code inside a loop runs 3n, n+5, or n/2 times,
the time complexity in each case is O(n)

Rules for Big O Notation
time = a * n + b
> Keep the fastest growing term
Big O refers to very large values of n. Hence, if you have a function like:
time = 5*n^2 + 3*n + 20
when n is very large, the lower-order terms (3*n + 20) become irrelevant
Example: n = 1000
time = 5*1000^2 + 3*1000 + 20
time = 5000000 + 3020
time ~= 5000000 # terms other than 5*1000^2 contribute a negligible amount

time = a * n # the constant term b is dropped

> Drop constants
time = n # the constant factor a is dropped
time = O(n)

Measuring running time growth (Time Complexity)

Constant Time Complexity O(1)
The running time of a constant-time algorithm does not depend on the input size
time = a
time = O(1) # applying rules for Big O :: keep the fastest growing term (a)

def foo(arr): # running time is nearly constant as the input size grows
    len(arr) = 100  -> 0.22 milliseconds
    len(arr) = 1000 -> 0.23 milliseconds

Logarithmic Time Complexity O(log n)
A logarithmic algorithm often halves the input size at each step. The running time of such an algorithm is
logarithmic, because log2(n) equals the number of times n must be divided by 2 to reach 1

Square Root Algorithm O(sqrt(n))
A square root algorithm is slower than O(log n) but faster than O(n). A special property of square roots is that
sqrt(n) = n/sqrt(n), so the square root sqrt(n) lies, in some sense, in the middle of the input

Linear Time Complexity O(n)
A linear algorithm goes through the input a constant number of times
time = a * n + b
time = O(n) # applying rules for Big O :: keep the fastest growing term (a*n), drop the constants (a, b)

def foo(arr):
    len(arr) = 100  -> 0.22 milliseconds
    len(arr) = 1000 -> 2.30 milliseconds

O(n log n)
This time complexity often indicates that the algorithm sorts the input, because the time complexity of
efficient sorting algorithms is O(n log n)

Quadratic Time Complexity O(n^2)
A quadratic algorithm often contains two nested loops
time = a * (n^2) + b
time = O(n^2) # applying rules for Big O :: keep the fastest growing term (a*n^2), drop the constants (a, b)
Note: the result is O(n^2), not O(n^2 + n)
time = a*(n^2) + (b * n) + c
time = O(n^2) # keep the fastest growing term (a*n^2), drop the constants (a, b, c)

Cubic Time Complexity O(n^3)
A cubic algorithm often contains three nested loops

O(2^n)
This time complexity often indicates that the algorithm iterates through all subsets of the input elements

O(n!)
This time complexity often indicates that the algorithm iterates through all permutations of the input elements

Note:
An algorithm is polynomial if its time complexity is at most O(n^k), where k is a constant.
All of the above time complexities except O(2^n) and O(n!) are polynomial

Given the input size, we can try to guess the required time complexity of the algorithm that solves the problem.
The following table contains some useful estimates, assuming a time limit of one second

n <= 10      O(n!)
n <= 20      O(2^n)
n <= 500     O(n^3)
n <= 5000    O(n^2)
n <= 10^6    O(n log n) or O(n)
n is large   O(1) or O(log n)
"""
""" Time Complexity """

# Linear Time Complexity O(n)
def squareNum(numbers):
    """
    Time Complexity: O(n) :: the loop iterates n times, linearly
    """
    sqNum = []  # empty list for the squared numbers
    for n in numbers:  # the loop iterates n times
        sqNum.append(n*n)
    return sqNum

# Constant Time Complexity O(1)
def findPE(prices, eps, index):
    pe = prices[index] / eps[index]  # one lookup and one division -> O(1)
    return pe

# O(n^2)
def duplicateInArray(numbers):
    """Two nested loops -> O(n^2); the trailing O(n) scan is dominated by the n^2 term"""
    duplicate = None
    # (n^2) iterations
    for i in range(len(numbers)):
        for j in range(i+1, len(numbers)):
            if numbers[i] == numbers[j]:
                print(f"{numbers[i]} is duplicate")
                duplicate = numbers[i]
                break
    # n iterations: print every index holding the duplicate
    for i in range(len(numbers)):
        if numbers[i] == duplicate:
            print(i)
""" Space Complexity """
129+
def Search(numbers):
130+
"""
131+
Search for 10 in list
132+
Time Complexity: O(n)
133+
"""
134+
for i in range(len(numbers)):
135+
if numbers[i] == 10:
136+
print(i)
137+
138+
139+
140+
141+
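
# Added sketch (not in the original commit): a contrasting helper whose extra
# memory grows with the input, showing O(n) space next to Search's O(1) space.
def SearchAll(numbers, key):
    """
    Collect every index where key occurs
    Time Complexity: O(n)
    Space Complexity: O(n) :: the result list can grow as large as the input
    """
    indices = []  # extra list -> O(n) space in the worst case
    for i, num in enumerate(numbers):
        if num == key:
            indices.append(i)
    return indices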

if __name__ == '__main__':
    # O(n)
    numbers = [2, 5, 7, 9]
    print(squareNum(numbers))

    # O(1)
    prices = [250, 500, 450]
    eps = [25, 50, 75]
    index = 1
    print(findPE(prices, eps, index))

    # O(n^2)
    numbers = [2, 4, 6, 8, 14, 9, 4]
    duplicateInArray(numbers)

    numbers = [1, 6, 8, 10, 4, 5]
    Search(numbers)

DSA-Python/Algorithms/BinarySearch.py

Lines changed: 105 additions & 0 deletions
@@ -0,0 +1,105 @@
"""
Binary Search Algorithm
A search algorithm that finds the position of a target value within a sorted (ascending) array.
Binary search compares the target value to the middle element of the array.

Normal life (Linear Search):
    To find an element in a list, compare elements from index[0] onward until the element is found
Mentos life (Binary Search):
    To find an element k in a list:
    > Sort the array in ascending order
    > Find the middle element of the list
    > Compare it with the key to be found
    > If the middle element is less than the key, take the middle element of the sub-list to its right
    > If the middle element is greater than the key, take the middle element of the sub-list to its left
    > Repeat the above two steps until the key is found
    > Return the index of the key if it is found in the list
    > If the key is not found in the list, return -1 or False

With every iteration, the search space is halved:
    Iteration 1 = n/2
    Iteration 2 = (n/2)/2   = n/2^2
    Iteration 3 = (n/2^2)/2 = n/2^3
    Iteration k = n/2^k
    1 = n/2^k
    n = 2^k
    log2(n) = log2(2^k)
    log2(n) = k*log2(2)
    k = log2(n)
Time Complexity: O(log n)


key: 18
 2 4 6 8 |10| 12 16 18 20  :: key not in {2, 4, 6, 8, 10}
~2 4 6 8 10~ 12 |16| 18 20 :: key not in {12, 16}
~2 4 6 8 10 12 16~ |18| 20 :: key found: 18 at index[7]
Output:
    7

Linear Search: 8 comparisons, O(n)
Binary Search: 3 comparisons, O(log n)
"""

def LinearSearch(numList, key):
    for index, element in enumerate(numList):  # yields the index along with the element
        if element == key:
            return index  # key found in the list: return its index
    return -1  # key not found

def BinarySearch(numList, key):
    left = 0                  # index of the leftmost remaining candidate
    right = len(numList) - 1  # index of the rightmost remaining candidate

    while left <= right:
        mid_index = (left + right) // 2  # '//' keeps the index an integer
        mid_num = numList[mid_index]

        if mid_num == key:  # the middle number equals the key
            return mid_index
        if mid_num < key:
            left = mid_index + 1
        else:  # mid_num > key
            right = mid_index - 1

    return -1

"""Binary Search using Recursion"""
def BinarySearchRecursion(numList, key, leftIndex, rightIndex):  # searches within [leftIndex, rightIndex]
    if rightIndex < leftIndex:  # empty search space: the key is not present
        return -1

    mid_index = (leftIndex + rightIndex) // 2  # '//' keeps the index an integer
    if mid_index >= len(numList):  # defensive check: mid must stay inside the list
        return -1

    mid_num = numList[mid_index]
    if mid_num == key:  # the middle number equals the key
        return mid_index

    if mid_num < key:
        leftIndex = mid_index + 1
    else:  # mid_num > key
        rightIndex = mid_index - 1

    return BinarySearchRecursion(numList, key, leftIndex, rightIndex)

if __name__ == '__main__':
    numList = [2, 4, 6, 8, 10, 12, 14, 16, 18, 20]
    key = 18
    # Linear Search
    index = LinearSearch(numList, key)
    print(f"Linear Search: Number found at index {index}")
    # Binary Search
    index = BinarySearch(numList, key)
    print(f"Binary Search: Number found at index {index}")
    # Binary Search using Recursion
    index = BinarySearchRecursion(numList, key, 0, len(numList) - 1)  # the right bound is the last valid index
    print(f"Binary Search Recursion: Number found at index {index}")
