Commit ac13c9e

Author: Baichuan Sun
doc: address bracket issues
1 parent 0fe8a0b commit ac13c9e

1 file changed: +4 -2 lines changed

README.md

Lines changed: 4 additions & 2 deletions
````diff
@@ -67,9 +67,11 @@ For Twin Neural Networks, multiple input data are passed through an _encoding_ n
 |:--:|
 | **Fig. 1 - Twin Neural Network Architecture**|
 
+Now that Fast.ai is installed, we can define this model architecture in pure PyTorch for Fast.ai training. Specifically, we will load the pre-trained encoding network from PyTorch and define our customised FCN, before passing the images through both.
+
 #### Encoding Network
 
-Now that Fast.ai is installed, we can create an image transformation pipeline to prepare our model for Fast.ai training. First, we will load the encoding network and the FCN and pass the images through both. For the encoding network, `ResNet50` is used, as an example, with its pre-trained weights, and the last fully connected layer is removed.[I'm not sure what you mean by this - do you mean it's not shown in the sample commands, or it's been removed from the process?]
+For the encoding network, `ResNet50` is used as an example, with its pre-trained weights; the last fully connected layer is removed, to be replaced by our own FCN in the following step.
 
 ```python
 import torchvision.models as models
````
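To make the encoder and twin-model steps described in this hunk concrete, here is a minimal PyTorch sketch: it loads a pre-trained `ResNet50`, drops the final fully connected layer, attaches a small FCN head, and wires both into a `TwinModel` (the name that appears later in the README). The head sizes and the `TwinModel` body shown here are illustrative assumptions, not the repository's exact definitions.

```python
import torch
import torch.nn as nn
import torchvision.models as models

# Load ResNet50 with pre-trained ImageNet weights
# (newer torchvision versions prefer the `weights=` argument instead).
resnet = models.resnet50(pretrained=True)

# Drop the final fully connected layer; the remaining layers act as the encoder
# and output a (batch, 2048, 1, 1) feature map after global average pooling.
encoder = nn.Sequential(*list(resnet.children())[:-1])

# Illustrative FCN head: two concatenated 2048-d feature vectors -> 2 classes.
# Layer sizes here are assumptions for illustration only.
head = nn.Sequential(
    nn.Flatten(),
    nn.Linear(2048 * 2, 256),
    nn.ReLU(),
    nn.Linear(256, 2),
)

class TwinModel(nn.Module):
    """Encode both inputs with a shared encoder, then compare them with the head."""
    def __init__(self, encoder, head):
        super().__init__()
        self.encoder, self.head = encoder, head

    def forward(self, x1, x2):
        # Concatenate the two feature maps along the channel dimension
        # before passing them to the fully connected head.
        feats = torch.cat([self.encoder(x1), self.encoder(x2)], dim=1)
        return self.head(feats)

model = TwinModel(encoder, head)
```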
````diff
@@ -145,7 +147,7 @@ model = TwinModel(encoder, head)
 
 ### Dataset and Transformations
 
-[Why is this step needed? Can you add a sentence to explain?] Import `fastai.vision` modules and download the sample data `PETS`, by:
+With the Twin model defined, we next need to prepare the dataset and the corresponding data transformations for the model to learn from. Import the `fastai.vision` modules and download the sample `PETS` dataset:
 
 ```python
 from fastai.vision.all import *
````
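For the dataset step referenced in this hunk, a minimal sketch using fastai's standard helpers is shown below. It only covers the `PETS` download and file listing via `untar_data`, `URLs.PETS`, and `get_image_files`; any labelling or transformation choices beyond that are omitted because they are not shown in this diff.

```python
from fastai.vision.all import *

# Download and extract the Oxford-IIIT Pet sample dataset bundled with fastai.
path = untar_data(URLs.PETS)

# Collect the image file paths; the transformations themselves are typically
# attached when the DataLoaders are built in the following README steps.
files = get_image_files(path/"images")
print(f"{len(files)} images, e.g. {files[0].name}")
```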
