
Commit e4e33fd

Standardise README for sorting algors
1 parent 51e9c31 commit e4e33fd

8 files changed: +87 additions, -59 deletions

src/algorithms/sorting/bubbleSort/BubbleSort.java

Lines changed: 0 additions & 15 deletions
@@ -15,21 +15,6 @@
  *
  * At the kth iteration of the outer loop, we only require (n-k) adjacent comparisons to get the kth largest
  * element to its correct position.
- *
- * Complexity Analysis:
- * Time:
- * - Worst case (reverse sorted array): O(n^2)
- * - Average case: O(n^2)
- * - Best case (sorted array): O(n)
- * In the worst case, during each iteration of the outer loop, the number of adjacent comparisons is upper-bounded
- * by n. Since BubbleSort requires (n-1) iterations of the outer loop to sort the entire array, the total number
- * of comparisons performed can be upper-bounded by (n-1) * n ≈ n^2.
- *
- * This implementation of BubbleSort terminates the outer loop once there are no swaps within one iteration of the
- * outer loop. This improves the best case time complexity to O(n) for an already sorted array.
- *
- * Space:
- * - O(1) since sorting is done in-place
  */
 public class BubbleSort {
     /**
src/algorithms/sorting/bubbleSort/README.md

Lines changed: 21 additions & 1 deletion
@@ -1 +1,21 @@
-![bubble sort img](../../../../assets/BubbleSort.jpeg)
+# Bubble Sort
+Bubble sort is one of the more intuitive comparison-based sorting algorithms.
+It makes repeated comparisons between neighbouring elements, 'bubbling' the
+largest (or smallest) element of the unsorted region towards the sorted region (often the front or the back) via adjacent swaps.
+
+![bubble sort img](../../../../assets/BubbleSort.jpeg)
+
+## Complexity Analysis
+**Time**:
+- Worst case (reverse sorted array): O(n^2)
+- Average case: O(n^2)
+- Best case (sorted array): O(n)
+
+In the worst case, during each iteration of the outer loop, the number of adjacent comparisons is upper-bounded
+by n. Since BubbleSort requires (n-1) iterations of the outer loop to sort the entire array, the total number
+of comparisons performed can be upper-bounded by (n-1) * n ≈ n^2.
+
+This implementation of BubbleSort terminates the outer loop once there are no swaps within one iteration of the
+outer loop. This improves the best case time complexity to O(n) for an already sorted array.
+
+**Space**: O(1) since sorting is done in-place
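The early-termination behaviour described above can be sketched as follows. This is a minimal illustration, not the repository's `BubbleSort.java`:

```java
import java.util.Arrays;

// Minimal sketch of bubble sort with the early-termination optimisation:
// stop once a full pass of the outer loop performs no swaps.
public class BubbleSortSketch {
    public static void sort(int[] arr) {
        for (int i = 0; i < arr.length - 1; i++) {
            boolean swapped = false;
            // After i passes, the last i elements are already in place,
            // so only (n - i - 1) adjacent comparisons are needed.
            for (int j = 0; j < arr.length - 1 - i; j++) {
                if (arr[j] > arr[j + 1]) {
                    int tmp = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = tmp;
                    swapped = true;
                }
            }
            if (!swapped) {
                break; // no swaps: array already sorted, best case O(n)
            }
        }
    }

    public static void main(String[] args) {
        int[] arr = {5, 1, 4, 2, 8};
        sort(arr);
        System.out.println(Arrays.toString(arr)); // [1, 2, 4, 5, 8]
    }
}
```

On an already sorted input, the first pass performs no swaps, so the outer loop exits after one O(n) scan.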

src/algorithms/sorting/countingSort/README.md

Lines changed: 4 additions & 2 deletions
@@ -1,7 +1,7 @@
 # Counting Sort
 
-Counting sort is a non-comparison based sorting algorithm and isn't bounded by the O(nlogn) lower-bound
-of most sorting algorithms.
+Counting sort is a non-comparison-based sorting algorithm and isn't bounded by the O(nlogn) lower-bound
+of most sorting algorithms. <br>
 It first obtains the frequency map of all elements (ie counting the occurrence of every element), then
 computes the prefix sum for the map. This prefix map tells us which position an element should be inserted.
 Ultimately, each group of elements will be placed together, and the groups in succession, in the sorted output.
@@ -17,3 +17,5 @@ COMMON MISCONCEPTION: Counting sort does not require total ordering of elements
 This is incorrect. It requires total ordering of elements to determine their relative positions in the sorted output.
 In fact, in conventional implementation, the total ordering property is reflected by virtue of the structure
 of the frequency map.
+
+Supplementary: Here is a [video](https://www.youtube.com/watch?v=OKd534EWcdk) if you are still having troubles.
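The frequency-map and prefix-sum steps described in the README can be sketched as follows, for non-negative integers with a known maximum value. This is an illustrative sketch, not the repository's implementation:

```java
import java.util.Arrays;

// Sketch of counting sort: build a frequency map, convert it to prefix sums,
// then place each element at the position the prefix sum dictates.
public class CountingSortSketch {
    public static int[] sort(int[] arr, int maxValue) {
        int[] count = new int[maxValue + 1];
        for (int x : arr) {                       // 1. frequency map
            count[x]++;
        }
        for (int v = 1; v <= maxValue; v++) {     // 2. prefix sums: count[v] is now
            count[v] += count[v - 1];             //    the number of elements <= v
        }
        int[] out = new int[arr.length];
        for (int i = arr.length - 1; i >= 0; i--) { // 3. place elements; iterating
            out[--count[arr[i]]] = arr[i];          //    backwards keeps the sort stable
        }
        return out;
    }

    public static void main(String[] args) {
        int[] sorted = sort(new int[]{4, 2, 2, 8, 3, 3, 1}, 8);
        System.out.println(Arrays.toString(sorted)); // [1, 2, 2, 3, 3, 4, 8]
    }
}
```

Note how step 3 never compares two elements: the total ordering is encoded in the structure of the count array itself, as the misconception note above points out.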

src/algorithms/sorting/insertionSort/InsertionSort.java

Lines changed: 3 additions & 23 deletions
@@ -3,33 +3,13 @@
 /** Here, we are implementing InsertionSort where we sort the array in increasing (or more precisely, non-decreasing)
  * order.
  *
- * Brief Description:
- * InsertionSort is a simple comparison-based sorting algorithm that builds the final sorted array one element at a
- * time. It works by repeatedly taking an element from the unsorted portion of the array and inserting it into its
- * correct position within the sorted portion. At the kth iteration, we take the element arr[k] and insert
- * it into arr[0, k-1] following sorted order, returning us arr[0, k] in sorted order.
- *
  * Implementation Invariant:
  * The loop invariant is: at the end of kth iteration, the first (k+1) items in the array are in sorted order.
  * At the end of the (n-1)th iteration, all n items in the array will be in sorted order.
- * (Note: the loop invariant here slightly differs from the lecture slides as we are using 0-based indexing.)
- *
- * Complexity Analysis:
- * Time:
- * - Worst case (reverse sorted array): O(n^2)
- * - Average case: O(n^2)
- * - Best case (sorted array): O(n)
- *
- * In the worst case, inserting an element into the sorted array of length m requires us to iterate through the
- * entire array, requiring O(m) time. Since InsertionSort does this insertion (n - 1) times, the time complexity
- * of InsertionSort in the worst case is 1 + 2 + ... + (n-2) + (n-1) = O(n^2).
- *
- * In the best case of an already sorted array, inserting an element into the sorted array of length m requires
- * O(1) time as we insert it directly behind the first position of the pointer in the sorted array. Since InsertionSort
- * does this insertion (n-1) times, the time complexity of InsertionSort in the best case is O(1) * (n-1) = O(n).
  *
- * Space:
- * - O(1) since sorting is done in-place
+ * Note:
+ * 1. The loop invariant here slightly differs from the lecture slides as we are using 0-based indexing.
+ * 2. Insertion into the sorted portion is done by 'bubbling' elements as in bubble sort.
  */
 
 public class InsertionSort {
src/algorithms/sorting/insertionSort/README.md

Lines changed: 33 additions & 0 deletions
@@ -1,3 +1,36 @@
+# Insertion Sort
+
+Insertion sort is a comparison-based sorting algorithm that builds the final sorted array one element at a
+time. It works by repeatedly taking an element from the unsorted portion of the array and
+inserting it into the sorted portion such that the portion remains sorted. Note that an element's position is not final,
+since subsequent elements from the unsorted portion may displace previously inserted elements. What's important is
+that the sorted region remains sorted. More succinctly: <br>
+At the kth iteration, we take the element arr[k] and insert
+it into arr[0, k-1] following sorted order, returning us arr[0, k] in sorted order.
+
 ![InsertionSort](../../../../assets/InsertionSort.png)
 
+## Complexity Analysis
+**Time**:
+- Worst case (reverse sorted array): O(n^2)
+- Average case: O(n^2)
+- Best case (sorted array): O(n)
+
+In the worst case, inserting an element into the sorted array of length m requires us to iterate through the
+entire array, requiring O(m) time. Since InsertionSort does this insertion (n - 1) times, the time complexity
+of InsertionSort in the worst case is 1 + 2 + ... + (n-2) + (n-1) = O(n^2).
+
+In the best case of an already sorted array, inserting an element into the sorted array of length m requires
+O(1) time as we insert it directly behind the first position of the pointer in the sorted array. Since InsertionSort
+does this insertion (n-1) times, the time complexity of InsertionSort in the best case is O(1) * (n-1) = O(n).
+
+**Space**: O(1) since sorting is done in-place
+
+## Notes
+### Common Misconception
+Insertion sort's invariant is often confused with selection sort's. In selection sort, an element in the unsorted region is
+immediately placed in its correct and final position, as it would be in the sorted array. This is not the case
+for insertion sort. However, it is this 'looser' invariant that allows insertion sort a better best-case
+time complexity.
+
 Image Source: https://www.hackerrank.com/challenges/correctness-invariant/problem
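The 'bubbling' style of insertion mentioned in the updated Javadoc can be sketched as follows. This is a minimal illustration, not the repository's `InsertionSort.java`:

```java
import java.util.Arrays;

// Minimal sketch of insertion sort. The new element is 'bubbled' leftwards
// with adjacent swaps until the sorted prefix is restored.
public class InsertionSortSketch {
    public static void sort(int[] arr) {
        for (int k = 1; k < arr.length; k++) {
            // Invariant: arr[0..k-1] is sorted. Bubble arr[k] left until it
            // sits in its correct place, restoring the invariant for arr[0..k].
            for (int j = k; j > 0 && arr[j - 1] > arr[j]; j--) {
                int tmp = arr[j - 1];
                arr[j - 1] = arr[j];
                arr[j] = tmp;
            }
        }
    }

    public static void main(String[] args) {
        int[] arr = {7, 3, 5, 1};
        sort(arr);
        System.out.println(Arrays.toString(arr)); // [1, 3, 5, 7]
    }
}
```

On a sorted input the inner loop's guard fails immediately at each step, giving the O(n) best case described above.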
src/algorithms/sorting/selectionSort/README.md

Lines changed: 19 additions & 0 deletions
@@ -1,3 +1,22 @@
+# Selection Sort
+
+Selection sort is another intuitive comparison-based sorting algorithm. It works similarly to other sorting algorithms
+like bubble and insertion sort in the sense that it maintains a sorted and an unsorted region. It does so by repeatedly
+finding the smallest (or largest) element in the unsorted region and placing it in its correct and final position, as it
+would be in the sorted array.
+
 ![SelectionSort](../../../../assets/SelectionSort.png)
 
+## Complexity Analysis
+**Time**:
+- Worst case: O(n^2)
+- Average case: O(n^2)
+- Best case: O(n^2)
+
+Regardless of how sorted the input array is, selectionSort will run the minimum element finding algorithm (n-1)
+times. For an input array of length m, finding the minimum element necessarily takes O(m) time. Therefore, the
+time complexity of selectionSort is n + (n-1) + (n-2) + ... + 2 = O(n^2).
+
+**Space**: O(1) since sorting is done in-place
+
 Image Source: https://www.hackerearth.com/practice/algorithms/sorting/selection-sort/tutorial/

src/algorithms/sorting/selectionSort/SelectionSort.java

Lines changed: 5 additions & 16 deletions
@@ -3,26 +3,15 @@
 /** Here, we are implementing SelectionSort where we sort the array in increasing (or more precisely, non-decreasing)
  * order.
  *
- * Brief Description and Implementation Invariant:
- * Let the array to be sorted be A of length n. SelectionSort works by finding the minimum element A[j] in A[i...n],
- * then swapping A[i] with A[j], for i in [0, n-1). The loop invariant is: at the end of the kth iteration, the
- * smallest k items are correctly sorted in the first k positions of the array.
+ * Implementation Invariant:
+ * Let the array of length n to be sorted be A.
+ * The loop invariant is:
+ * At the end of the kth iteration, the smallest k items are correctly sorted in the first k positions of the array.
  *
- * At the end of the (n-1)th iteration of the loop, the smallest (n-1) items are correctly sorted in the first (n-1)
+ * So, at the end of the (n-1)th iteration of the loop, the smallest (n-1) items are correctly sorted in the first (n-1)
  * positions of the array, leaving the last item correctly positioned in the last index of the array. Therefore,
  * (n-1) iterations of the loop is sufficient.
  *
- * Complexity Analysis:
- * Time:
- * - Worst case: O(n^2)
- * - Average case: O(n^2)
- * - Best case: O(n^2)
- * Regardless of how sorted the input array is, selectionSort will run the minimum element finding algorithm (n-1)
- * times. For an input array of length m, finding the minimum element necessarily takes O(m) time. Therefore, the
- * time complexity of selectionSort is n + (n-1) + (n-2) + ... + 2 = O(n^2)
- *
- * Space:
- * - O(1) since sorting is done in-place
 */
 
 public class SelectionSort {
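The invariant above (after the kth pass, the smallest k items sit in their final positions) can be sketched as follows. This is a minimal illustration, not the repository's `SelectionSort.java`:

```java
import java.util.Arrays;

// Minimal sketch of selection sort. Each pass finds the minimum of the
// unsorted region arr[i..n-1] and swaps it into position i, so the smallest
// i+1 items end up correctly sorted in the first i+1 positions.
public class SelectionSortSketch {
    public static void sort(int[] arr) {
        for (int i = 0; i < arr.length - 1; i++) {
            int minIdx = i;
            for (int j = i + 1; j < arr.length; j++) {
                if (arr[j] < arr[minIdx]) {
                    minIdx = j;
                }
            }
            int tmp = arr[i];        // place the minimum in its final position
            arr[i] = arr[minIdx];
            arr[minIdx] = tmp;
        }
    }

    public static void main(String[] args) {
        int[] arr = {64, 25, 12, 22, 11};
        sort(arr);
        System.out.println(Arrays.toString(arr)); // [11, 12, 22, 25, 64]
    }
}
```

The inner scan always runs to the end of the array regardless of input order, which is why the best, average, and worst cases are all O(n^2).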

src/dataStructures/linkedList/README.md

Lines changed: 2 additions & 2 deletions
@@ -18,14 +18,14 @@ linked lists are stored across memory and are connected to each other via pointers
 *Source: BeginnersBook*
 
 ## Analysis
-Time Complexity: Depends on operations, O(n) in general for most operations.
+**Time Complexity**: Depends on operations, O(n) in general for most operations.
 
 Most operations require iterating the linked list. For instance,
 searching for an element in a linked list requires iterating from the head to the tail, incurring O(n)
 time complexity in the worst and average case. The best case would be O(1), for instance, when the head is the desired
 element.
 
-Space Complexity: O(n) where n is the size of the linked list.
+**Space Complexity**: O(n) where n is the size of the linked list.
 
 ## Notes
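The O(n) search cost discussed in the linked list README can be sketched as follows. This is an illustrative sketch, not the repository's linkedList implementation:

```java
// Minimal singly linked list sketch illustrating why search is O(n):
// a node is reachable only by following 'next' pointers from the head.
public class LinkedListSketch {
    static class Node {
        int value;
        Node next;
        Node(int value) { this.value = value; }
    }

    Node head;

    void prepend(int value) {        // O(1): only the head pointer changes
        Node node = new Node(value);
        node.next = head;
        head = node;
    }

    boolean contains(int target) {   // O(n) worst/average; O(1) if head matches
        for (Node cur = head; cur != null; cur = cur.next) {
            if (cur.value == target) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        LinkedListSketch list = new LinkedListSketch();
        list.prepend(3);
        list.prepend(2);
        list.prepend(1);
        System.out.println(list.contains(3)); // true
        System.out.println(list.contains(9)); // false
    }
}
```

A miss (or a match at the tail) forces a full traversal, which is the worst-case O(n) the analysis refers to.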