
Commit de6d161

started writing my own documentation

1 parent fa1e96c commit de6d161

File tree

2 files changed: +12 −8 lines changed


.gitignore

Lines changed: 2 additions & 0 deletions

```diff
@@ -21,6 +21,8 @@ local*
 formatting.x
 testrun.x
 storage/
+Makefile
+job.yaml
 
 # Byte-compiled / optimized / DLL files
 __pycache__/
```

doc/Johan_page.md

Lines changed: 10 additions & 8 deletions

```diff
@@ -1,22 +1,22 @@
 # Individual task for Johan
 
-## My implementation tasks
+## My implementation tasks
 
 * Data: Implement [MNIST](../CollaborativeCoding/dataloaders/mnist_4_9.py) dataset with digits between 4-9.
-* Model: [MLP-model](../CollaborativeCoding/models/johan_model.py/) with 4 hidden layers, each with 77 neurons and ReLU activation.
+* Model: [MLP-model](../CollaborativeCoding/models/johan_model.py) with 4 hidden layers, each with 77 neurons and ReLU activation.
 * Evaluation metric: [Precision](../CollaborativeCoding/metrics/precision.py).
 
 ## Implementation choices
 
-### Dataset
+### Dataset
 
-The choices regarding the dataset were mostly done in conjunction with Jan (@hzavadil98) as we were both using the MNIST dataset. Jan had the idea to download the binary files and construct the images from those. The group decided collaboratorily to make the package download the data once and store it for all of use to use. Hence the individual implementations are fairly similar, at least for the two MNIST dataloaders. Were it not for these individual tasks, there would have been one dataloader class, initialised with two separate ranges for labels 0-3 and 4-9. However, individual dataloaders had to be created to comply with the exam description. For my implementation, the labels had to be mapped to a range starting at 0: $(4-9) \to (0,5)$ since the cross-entropy loss function in PyTorch expect this range.
+The choices regarding the dataset were mostly made in conjunction with Jan (@hzavadil98), as we were both using the MNIST dataset. Jan had the idea to download the binary files and construct the images from those. The group decided collaboratively to make the package download the data once and store it for all of us to use. Hence, the individual implementations are fairly similar, at least for the two MNIST dataloaders. Were it not for these individual tasks, there would have been one dataloader class, initialised with two separate ranges for labels 0-3 and 4-9. However, individual dataloaders had to be created to comply with the exam description. For my implementation, the labels had to be mapped to a range starting at 0: $(4\text{-}9) \to (0\text{-}5)$, since the cross-entropy loss function in PyTorch expects this range.
 
-## Experiences with running someone else's code
+## Experiences with running someone else's code
 
-## Experiences having someone else to run my code
+## Experiences having someone else run my code
 
-## I learned how to use these tools during this course
+## I learned how to use these tools during this course
 
 ### Git-stuff
 
@@ -26,8 +26,10 @@ The choices regarding the dataset were mostly done in conjunction with Jan (@hza
 
 ### Proper documentation
 
-### Nice ways of testing code
+### Nice ways of testing code
 
 ### UV
 
+## General thoughts on collaboration
 
+As someone new to this university and its IT setup, this project was very fruitful. The upside of working with peers who are highly skilled in the relevant field is that the learning curve is steep, and the take-home for me was significant. The downside is that I constantly felt behind, spent half the time just understanding the changes implemented by others, and felt that my contributions to the overall project were less significant. However, working with skilled peers has boosted my understanding of how things work around here, especially Git (fancier commands than add, commit, push and pull), as well as Docker, cluster operations, Kubernetes, and logging metrics on Weights and Biases.
```
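The label remapping described in the Dataset paragraph can be sketched as follows. This is an illustrative snippet, not code from the repository; the helper name `remap_labels` is hypothetical:

```python
# Hypothetical sketch of the label shift described above: MNIST digits
# in the range 4-9 become class indices 0-5, the zero-based range that
# PyTorch's CrossEntropyLoss expects for its targets.
def remap_labels(raw_labels):
    """Map raw MNIST digits in [4, 9] to class indices in [0, 5]."""
    return [digit - 4 for digit in raw_labels]

print(remap_labels([4, 9, 5, 7]))  # [0, 5, 1, 3]
```

In the actual dataloader this shift would be applied per sample when the label tensor is constructed, so the model's 6-way output head lines up with the remapped classes.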
