
Commit a4098e4

Merge pull request #60 from 4ndrelim/branch-UpdateDocs
docs: Standardise READMEs and improve clarity
2 parents 0550787 + 2b5e701 commit a4098e4


18 files changed: +165 -125 lines changed


src/main/java/algorithms/sorting/bubbleSort/BubbleSort.java

Lines changed: 8 additions & 12 deletions
```diff
@@ -2,20 +2,16 @@
 
 /**
  * Here, we are implementing BubbleSort where we sort the array in increasing (or more precisely, non-decreasing)
- * order.
- * <p>
- * Brief Description and Implementation Invariant:
- * BubbleSort relies on the outer loop variant that after the kth iteration, the biggest k items are correctly sorted
- * at the final k positions of the array. The job of the kth iteration of the outer loop is to bubble the kth
- * largest element to the kth position of the array from the right (i.e. its correct position). This is done through
- * repeatedly comparing adjacent elements and swapping them if they are in the wrong order.
+ * order. Below are some details specific to this implementation.
  * <p>
+ * Early Termination:
  * At the end of the (n-1)th iteration of the outer loop, where n is the length of the array, the largest (n-1)
  * elements are correctly sorted at the final (n-1) positions of the array, leaving the last 1 element placed correctly
  * in the first position of the array. Therefore, (n-1) iterations of the outer loop is sufficient.
  * <p>
- * At the kth iteration of the outer loop, we only require (n-k) adjacent comparisons to get the kth largest
- * element to its correct position.
+ * Slight optimisation:
+ * At the kth iteration of the outer loop, we only require (n-k) adjacent comparisons to get the kth-largest element
+ * to its correct position.
  */
 public class BubbleSort {
     /**
@@ -28,10 +24,10 @@ public static int[] sort(int[] arr) {
         int n = arr.length;
         boolean swapped; // tracks of the presence of swaps within one iteration of the outer loop to
                          // facilitate early termination
-        for (int i = 0; i < n - 1; i++) { //outer loop which supports the invariant
+        for (int i = 0; i < n - 1; i++) { // outer loop which supports the invariant; n-1 suffice
            swapped = false;
-            for (int j = 0; j < n - 1 - i; j++) { //inner loop that does the adjacent comparisons
-                if (arr[j] > arr[j + 1]) { //if we changed this to <, we will sort the array in non-increasing order
+            for (int j = 0; j < n - 1 - i; j++) { // inner loop that does the adjacent comparisons
+                if (arr[j] > arr[j + 1]) { // if we changed this to <, we will sort the array in non-increasing order
                     int temp = arr[j];
                     arr[j] = arr[j + 1];
                     arr[j + 1] = temp;
```
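
For context, a minimal usage sketch of the `sort` method shown in this hunk; the demo class name, package placement, and sample values are assumptions for illustration, not part of this commit:

```java
// Hypothetical usage of BubbleSort.sort as shown in the hunk above; assumes the demo
// class sits in the same package as BubbleSort so no import of it is needed.
import java.util.Arrays;

public class BubbleSortDemo {
    public static void main(String[] args) {
        int[] arr = {5, 1, 4, 2, 8};
        int[] sorted = BubbleSort.sort(arr);          // sorts in non-decreasing order, in place
        System.out.println(Arrays.toString(sorted));  // [1, 2, 4, 5, 8]
        BubbleSort.sort(new int[] {1, 2, 3});         // already sorted: the swapped flag ends this after one pass
    }
}
```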

src/main/java/algorithms/sorting/bubbleSort/README.md

Lines changed: 11 additions & 0 deletions
```diff
@@ -1,11 +1,20 @@
 # Bubble Sort
 
+## Background
+
 Bubble sort is one of the more intuitive comparison-based sorting algorithms.
 It makes repeated comparisons between neighbouring elements, 'bubbling' (side-by-side swaps)
 largest (or smallest) element in the unsorted region to the sorted region (often the front or the back).
 
 ![bubble sort img](../../../../../../docs/assets/images/BubbleSort.jpeg)
 
+### Implementation Invariant
+After the kth iteration, the biggest k items are correctly sorted at the final k positions of the array.
+
+The job of the kth iteration of the outer loop is to bubble the kth-largest element to the kth position of the array
+from the right (i.e. its correct position).
+This is done through repeatedly comparing adjacent elements and swapping them if they are in the wrong order.
+
 ## Complexity Analysis
 
 **Time**:
@@ -22,3 +31,5 @@ This implementation of BubbleSort terminates the outer loop once there are no sw
 outer loop. This improves the best case time complexity to O(n) for an already sorted array.
 
 **Space**: O(1) since sorting is done in-place
+
+## Notes
```

src/main/java/algorithms/sorting/countingSort/README.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -15,7 +15,7 @@ At the end of the ith iteration, the ith element (of the original array) from th
 its correct position.
 
 Note: An alternative implementation from the front is easily done with minor modification.
-The catch is that this implementation is not stable.
+The catch is that this implementation would not be stable.
 
 ### Common Misconception
```
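
Since the hunk above turns on stability, a minimal counting sort sketch of the usual cumulative-count formulation may help; the method name and key range are assumptions, not necessarily the repository's countingSort code. The final loop walks the input from the back, which is what preserves the relative order of equal keys:

```java
// Sketch of a stable counting sort over keys in [0, k); names and the range k are assumptions.
static int[] countingSort(int[] arr, int k) {
    int[] count = new int[k];
    for (int x : arr) {
        count[x]++;                       // tally each key
    }
    for (int v = 1; v < k; v++) {
        count[v] += count[v - 1];         // count[v] = number of elements with key <= v
    }
    int[] out = new int[arr.length];
    for (int i = arr.length - 1; i >= 0; i--) {
        out[--count[arr[i]]] = arr[i];    // filling from the back keeps equal keys in input order
    }
    return out;
}
```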

src/main/java/algorithms/sorting/cyclicSort/generalised/README.md

Lines changed: 3 additions & 1 deletion
```diff
@@ -1,6 +1,6 @@
 # Generalized Case
 
-## More Details
+## Background
 
 Implementation of cyclic sort in the generalised case where the input can contain any integer and duplicates.
 
@@ -36,3 +36,5 @@ Read: &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
 - Average: O(n^2), it's bounded by the above two
 
 **Space**: O(1) auxiliary space, this is an in-place algorithm
+
+## Notes
```
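
As a rough sketch of the generalised case this README covers (arbitrary integers, duplicates allowed), here is the textbook cycle-sort shape; it is an assumed illustration and may differ from the repository's generalised implementation:

```java
// Sketch of generalised cyclic (cycle) sort: O(n^2) comparisons, O(1) auxiliary space,
// handles arbitrary integers including duplicates.
static void cycleSort(int[] arr) {
    int n = arr.length;
    for (int start = 0; start < n - 1; start++) {
        int item = arr[start];
        int pos = start;
        for (int i = start + 1; i < n; i++) {   // count smaller elements to find item's correct index
            if (arr[i] < item) {
                pos++;
            }
        }
        if (pos == start) {
            continue;                           // item is already in its correct position
        }
        while (item == arr[pos]) {
            pos++;                              // step past duplicates of item
        }
        int tmp = arr[pos];                     // place item; the displaced value becomes the new item
        arr[pos] = item;
        item = tmp;
        while (pos != start) {                  // rotate the rest of the cycle
            pos = start;
            for (int i = start + 1; i < n; i++) {
                if (arr[i] < item) {
                    pos++;
                }
            }
            while (item == arr[pos]) {
                pos++;
            }
            tmp = arr[pos];
            arr[pos] = item;
            item = tmp;
        }
    }
}
```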

src/main/java/algorithms/sorting/cyclicSort/simple/README.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -1,6 +1,6 @@
 # Simple Case
 
-## More Details
+## Background
 
 Cyclic Sort can achieve O(n) time complexity in cases where the elements of the collection have a known,
 continuous range, and there exists a direct, O(1) time-complexity mapping from each element to its respective index in
@@ -41,7 +41,7 @@ The algorithm does a 2-pass iteration.
 Note that the answer is necessarily between 0 and n (inclusive), where n is the length of the array,
 otherwise there would be a contradiction.
 
-## Misc
+## Notes
 
 1. It may seem quite trivial to sort integers from 0 to n-1 when one could simply generate such a sequence.
 But this algorithm is useful in cases where the integers to be sorted are keys to associated values (or some mapping)
```
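
To make the simple case concrete, a minimal sketch assuming the common special case where the array is a permutation of 0 to n-1, so each value maps directly to its index; the method name and exact variant are assumptions, not necessarily the repository's code:

```java
// Sketch of simple cyclic sort when arr is a permutation of 0..n-1.
// Each swap puts at least one element into its final slot, so total work is O(n).
static void simpleCyclicSort(int[] arr) {
    int i = 0;
    while (i < arr.length) {
        int correct = arr[i];          // element arr[i] belongs at index arr[i]
        if (arr[i] != arr[correct]) {
            int tmp = arr[i];          // swap arr[i] into its correct slot
            arr[i] = arr[correct];
            arr[correct] = tmp;
        } else {
            i++;                       // arr[i] is already in place, move on
        }
    }
}
```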

src/main/java/algorithms/sorting/insertionSort/InsertionSort.java

Lines changed: 0 additions & 4 deletions
```diff
@@ -4,10 +4,6 @@
  * Here, we are implementing InsertionSort where we sort the array in increasing (or more precisely, non-decreasing)
  * order.
  * <p>
- * Implementation Invariant:
- * The loop invariant is: at the end of kth iteration, the first (k+1) items in the array are in sorted order.
- * At the end of the (n-1)th iteration, all n items in the array will be in sorted order.
- * <p>
  * Note:
  * 1. the loop invariant here slightly differs from the lecture slides as we are using 0-based indexing
  * 2. Insertion into the sorted portion is done byb 'bubbling' elements as in bubble sort
```

src/main/java/algorithms/sorting/insertionSort/README.md

Lines changed: 10 additions & 2 deletions
```diff
@@ -1,15 +1,23 @@
 # Insertion Sort
 
+## Background
+
 Insertion sort is a comparison-based sorting algorithm that builds the final sorted array one element at a
 time. It works by repeatedly taking an element from the unsorted portion of the array and
 inserting it correctly (portion remains sorted) into the sorted portion. Note that the position is not final
 since subsequent elements from unsorted portion may displace previously inserted elements. What's important is
-the sorted region remains sorted. More succinctly: <br>
-At the kth iteration, we take the element arr[k] and insert
+the sorted region remains sorted.
+
+More succinctly, at the kth iteration, we take the element arr[k] and insert
 it into arr[0, k-1] following sorted order, returning us arr[0, k] in sorted order.
 
 ![InsertionSort](../../../../../../docs/assets/images/InsertionSort.png)
 
+### Implementation Invariant
+The loop invariant: At the end of kth iteration, the first (k+1) items in the array are in sorted order.
+
+At the end of the (n-1)th iteration, all n items in the array will be in sorted order.
+
 ## Complexity Analysis
 
 **Time**:
```
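
For context, a minimal sketch of the insert-by-bubbling idea this README describes; it is illustrative only and may differ in detail from the repository's InsertionSort.java:

```java
// Sketch of insertion sort; after the kth iteration of the outer loop,
// arr[0..k] (the first k+1 items) is in sorted order, matching the invariant above.
static void insertionSort(int[] arr) {
    for (int k = 1; k < arr.length; k++) {
        // Bubble arr[k] leftwards until it sits in its correct place within arr[0..k].
        for (int j = k; j > 0 && arr[j - 1] > arr[j]; j--) {
            int tmp = arr[j];
            arr[j] = arr[j - 1];
            arr[j - 1] = tmp;
        }
    }
}
```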

src/main/java/algorithms/sorting/quickSort/README.md

Lines changed: 6 additions & 0 deletions
```diff
@@ -20,3 +20,9 @@ pivot check.
 So, we introduced 3-way partitioning to handle duplicates by partitioning the array into three sections: elements less
 than the pivot, elements equal to the pivot, and elements greater than the pivot. Now the good pivot check ignores the
 size of the segment that comprises elements = to pivot.
+
+## Recommended Order of Reading
+1. [Hoares](./hoares)
+2. [Lomuto](./lomuto)
+3. [Paranoid](./paranoid)
+4. [3-way partitioning](./threeWayPartitioning)
```
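
To make the three sections concrete, a Dutch-national-flag style partition sketch; this is an assumed illustration and may be organised differently from the repository's threeWayPartitioning code:

```java
// Sketch of 3-way partitioning around a pivot value.
// Afterwards: arr[lo .. lt-1] < pivot, arr[lt .. gt] == pivot, arr[gt+1 .. hi] > pivot,
// so QuickSort only needs to recurse on the < and > sections.
static void partition3(int[] arr, int lo, int hi, int pivot) {
    int lt = lo;                        // next slot for an element < pivot
    int gt = hi;                        // next slot for an element > pivot
    int i = lo;
    while (i <= gt) {
        if (arr[i] < pivot) {
            int tmp = arr[lt]; arr[lt] = arr[i]; arr[i] = tmp;
            lt++;
            i++;
        } else if (arr[i] > pivot) {
            int tmp = arr[gt]; arr[gt] = arr[i]; arr[i] = tmp;
            gt--;                       // do not advance i: the swapped-in element is still unexamined
        } else {
            i++;                        // equal to pivot, leave it in the middle section
        }
    }
}
```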

src/main/java/algorithms/sorting/quickSort/lomuto/README.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,4 +1,4 @@
-This is how QuickSort works if we always pick the first element as the pivot.
+This is how QuickSort works if we always pick the first element as the pivot with Lomuto's partitioning.
 
 ![QuickSort with first element as pivot](../../../../../../../docs/assets/images/QuickSortFirstPivot.png)
```
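
A minimal sketch of a Lomuto-style partition that takes the first element as the pivot, as described in the line above; method name and bounds are assumptions and may differ from the repository's lomuto code:

```java
// Sketch of a Lomuto-style partition using arr[lo] as the pivot.
// Returns the pivot's final index p; QuickSort then recurses on [lo, p-1] and [p+1, hi].
static int partition(int[] arr, int lo, int hi) {
    int pivot = arr[lo];
    int boundary = lo;                  // arr[lo+1 .. boundary] holds elements < pivot
    for (int i = lo + 1; i <= hi; i++) {
        if (arr[i] < pivot) {
            boundary++;
            int tmp = arr[boundary];    // grow the "< pivot" region and swap arr[i] into it
            arr[boundary] = arr[i];
            arr[i] = tmp;
        }
    }
    int tmp = arr[lo];                  // finally, move the pivot between the two regions
    arr[lo] = arr[boundary];
    arr[boundary] = tmp;
    return boundary;
}
```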

src/main/java/algorithms/sorting/radixSort/README.md

Lines changed: 19 additions & 19 deletions
```diff
@@ -2,25 +2,24 @@
 
 ## Background
 
-Radix Sort is a non-comparison based, stable sorting algorithm with a counting sort subroutine.
+Radix Sort is a non-comparison based, stable sorting algorithm that conventionally uses counting sort as a subroutine.
 
-Radix Sort continuously sorts based on the least-significant segment of a element
-to the most-significant value of a element.
+Radix Sort performs counting sort several times on the numbers. It sorts starting with the least-significant segment
+to the most-significant segment.
 
+### Segments
 The definition of a 'segment' is user defined and defers from implementation to implementation.
-Within our implementation, we define each segment as a bit chunk.
+It is most commonly defined as a bit chunk.
 
 For example, if we aim to sort integers, we can sort each element
 from the least to most significant digit, with the digits being our 'segments'.
 
 Within our implementation, we take the binary representation of the elements and
-partition it into 8-bit segments, a integer is represented in 32 bits,
+partition it into 8-bit segments. An integer is represented in 32 bits,
 this gives us 4 total segments to sort through.
 
-Note that the binary representation is weighted positional,
-where each bit's value is dependent on its overall position
-within the representation (the n-th bit from the right represents *2^n*),
-hence we can actually increase / decrease the number segments we wish to conduct a split from.
+Note that the number of segments is flexible and can range up to the number of digits in the binary representation.
+(In this case, sub-routine sort is done on every digit from right to left)
 
 ![Radix Sort](https://miro.medium.com/v2/resize:fit:661/1*xFnpQ4UNK0TvyxiL8r1svg.png)
 
@@ -46,22 +45,23 @@ Hence, a stable sort is required to maintain the order as
 the sorting is done with respect to each of the segments.
 
 ## Complexity Analysis
+Let b-bit words be broken into r-bit pieces. Let n be the number of elements to sort.
 
-**Time**:
-Let *b* be the length of a single element we are sorting and *r* is the amount of bit-string
-we plan to break each element into.
-(Essentially, *b/r* represents the number of segments we
-sort on and hence the number of passes we do during our sort).
+*b/r* represents the number of segments and hence the number of counting sort passes. Note that each pass
+of counting sort takes *(2^r + n)* (O(k+n) where k is the range which is 2^r here).
 
-Note that we derive *(2^r + n)* from the counting sort subroutine,
-since we have *2^r* represents the range since we have *r* bits.
-
-We get a general time complexity of *O((b/r) * (2^r + n))*
+**Time**: *O((b/r) * (2^r + n))*
 
 **Space**: *O(n + 2^r)*
 
-## Notes
+### Choosing r
+Previously we said the number of segments is flexible. Indeed, it is but for more optimised performance, r needs to be
+carefully chosen. The optimal choice of r is slightly smaller than logn which can be justified with differentiation.
 
+Briefly, r=lgn --> Time complexity can be simplified to (b/lgn)(2n). <br>
+For numbers in the range of 0 - n^m, b = mlgn and so the expression can be further simplified to *O(mn)*.
+
+## Notes
 - Radix sort's time complexity is dependent on the maximum number of digits in each element,
 hence it is ideal to use it on integers with a large range and with little digits.
 - This could mean that Radix Sort might end up performing worst on small sets of data
```
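
Tying the segments and the counting sort passes together, a minimal LSD radix sort sketch using 8-bit segments (4 passes over 32-bit values); it assumes non-negative inputs and is illustrative only, not necessarily the repository's radixSort code:

```java
// Sketch of LSD radix sort on non-negative ints, using 8-bit segments (4 passes)
// with a stable counting sort per segment. Handling negatives would need an extra
// tweak on the most significant segment.
static void radixSort(int[] arr) {
    int[] out = new int[arr.length];
    for (int shift = 0; shift < 32; shift += 8) {      // 4 passes, least significant segment first
        int[] count = new int[257];                    // counting sort with range k = 2^8
        for (int x : arr) {
            count[((x >>> shift) & 0xFF) + 1]++;
        }
        for (int d = 0; d < 256; d++) {                // prefix sums: start index of each segment value
            count[d + 1] += count[d];
        }
        for (int x : arr) {                            // stable: equal segments keep their relative order
            out[count[(x >>> shift) & 0xFF]++] = x;
        }
        System.arraycopy(out, 0, arr, 0, arr.length);  // this pass's result feeds the next pass
    }
}
```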
