Commit 1d01d75

Author: Baichuan Sun
Commit message: rename
1 parent f3c55e9 · commit 1d01d75

File tree: 2 files changed (+14, -14 lines)

README.md

Lines changed: 14 additions & 14 deletions
@@ -63,7 +63,7 @@ For other installation options, please refer to the fast.ai documentation. The f

For Twin Neural Networks, multiple inputs are passed through an _encoding_ neural network to generate their hyper-dimensional embedding vectors, which are concatenated before being fed into a _fully connected network_ for output, as shown in _Fig. 1_.

-| ![siamese](static/siamese_archi.png) |
+| ![twin](static/twin_archi.png) |
|:--:|
| **Fig. 1 - Twin Neural Network Architecture**|
@@ -131,7 +131,7 @@ head = Sequential(
As shown in _Fig. 1_, both images are passed through the encoding network, and the outputs are concatenated before being fed into the fully connected network.

```python
-class SiameseModel(Module):
+class TwinModel(Module):
    def __init__(self, encoder, head):
        super().__init__()
        self.encoder, self.head = encoder, head
@@ -140,7 +140,7 @@ class SiameseModel(Module):
        ftrs = torch.cat([self.encoder(x1), self.encoder(x2)], dim=1)
        return self.head(ftrs)

-model = SiameseModel(encoder, head)
+model = TwinModel(encoder, head)
```
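The hunks above only show fragments of the class. As a rough, self-contained sketch of what the twin forward pass boils down to (this uses plain `nn.Module` instead of fast.ai's `Module`, and the encoder/head below are illustrative stand-ins, not the repository's pretrained encoder and the `head = Sequential(...)` defined earlier):

```python
import torch
from torch import nn

class TwinModel(nn.Module):
    def __init__(self, encoder, head):
        super().__init__()
        self.encoder, self.head = encoder, head

    def forward(self, x1, x2):
        # Both inputs share the same encoder weights; the two embeddings
        # are concatenated along the feature dimension before the head.
        ftrs = torch.cat([self.encoder(x1), self.encoder(x2)], dim=1)
        return self.head(ftrs)

encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 224 * 224, 128))  # stand-in encoder
head = nn.Sequential(nn.Linear(2 * 128, 2))                           # stand-in head
model = TwinModel(encoder, head)
out = model(torch.randn(4, 3, 224, 224), torch.randn(4, 3, 224, 224))
print(out.shape)  # torch.Size([4, 2]): one same/different logit pair per image pair
```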

### Dataset and Transformations
@@ -174,7 +174,7 @@ image_tfm = transforms.Compose(
Per fast.ai's semantic requirements, define the basic image-pair and label data entity used for visualisation:

```python
-class SiameseImage(fastuple):
+class TwinImage(fastuple):
    @staticmethod
    def img_restore(image: torch.Tensor):
        return (image - image.min()) / (image.max() - image.min())
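`img_restore` is a plain min-max rescale. Since `image_tfm` (a `transforms.Compose`, presumably including normalisation) leaves tensor values outside `[0, 1]`, something like it is needed before an image pair can be plotted. A standalone illustration, not repository code:

```python
import torch

t = torch.randn(3, 224, 224) * 4 + 1                 # a normalised-looking image tensor
restored = (t - t.min()) / (t.max() - t.min())        # what img_restore computes
print(float(restored.min()), float(restored.max()))   # 0.0 1.0
```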
@@ -203,7 +203,7 @@ Then define the helper function parsing the image breed from its file path, and tak
def label_func(fname):
    return re.match(r'^(.*)_\d+.jpg$', fname.name).groups()[0]

-class SiameseTransform(Transform):
+class TwinTransform(Transform):
    def __init__(self, files, label_func, splits):
        self.labels = files.map(label_func).unique()
        self.lbl2files = {
@@ -218,7 +218,7 @@ class SiameseTransform(Transform):
        if (f not in self.valid) and random.random() < 0.5:
            img1, img2 = img2, img1
        img1, img2 = image_tfm(img1), image_tfm(img2)
-        return SiameseImage(img1, img2, t)
+        return TwinImage(img1, img2, t)

    def _draw(self, f):
        same = random.random() < 0.5
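To make the regex in `label_func` concrete: with Oxford-IIIT-Pet-style file names (an assumption about the dataset; the diff only shows the pattern itself), everything before the trailing `_<number>.jpg` is taken as the breed label:

```python
import re

def label_from_name(name: str) -> str:
    # Same pattern as label_func above, applied to a bare file name string.
    return re.match(r'^(.*)_\d+.jpg$', name).groups()[0]

print(label_from_name("great_pyrenees_173.jpg"))  # great_pyrenees
print(label_from_name("Abyssinian_12.jpg"))       # Abyssinian
```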
@@ -236,7 +236,7 @@ Randomly split the dataset into train/validation partitions, and construct `data
```python
splits = RandomSplitter()(files)
-tfm = SiameseTransform(files, label_func, splits)
+tfm = TwinTransform(files, label_func, splits)
tls = TfmdLists(files, tfm, splits=splits)
dls = tls.dataloaders(
    bs=32,
@@ -252,11 +252,11 @@ Now setup loss function and parameter groups, after which trigger the fast.ai tr
def loss_func(out, targ):
    return nn.CrossEntropyLoss()(out, targ.long())

-def siamese_splitter(model):
+def twin_splitter(model):
    return [params(model.encoder), params(model.head)]

learn = Learner(
-    dls, model, loss_func=loss_func, splitter=siamese_splitter, metrics=accuracy
+    dls, model, loss_func=loss_func, splitter=twin_splitter, metrics=accuracy
)

learn.freeze()
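`twin_splitter` returns two parameter groups (encoder, head), which is what lets `learn.freeze()` train only the head and lets discriminative learning rates be applied per group later. The hunk cuts off after `learn.freeze()`; a typical fast.ai schedule from this point would look like the sketch below (epoch counts and learning rates are illustrative, not taken from the README):

```python
learn.freeze()                               # only params(model.head) remain trainable
learn.fit_one_cycle(4, 1e-3)                 # warm up the freshly initialised head
learn.unfreeze()                             # make both parameter groups trainable
learn.fit_one_cycle(4, slice(1e-6, 1e-4))    # lower learning rate for the pretrained encoder
```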
@@ -285,7 +285,7 @@ torch.save(learn.encoder.state_dict(), "./model/encoder_weight.pth")
torch.save(learn.head.state_dict(), "./model/head_weight.pth")
```

-For more details about the modeling process, refer to `notebook/01_fastai_siamese.ipynb` [[link](notebook/01_fastai_siamese.ipynb)].
+For more details about the modeling process, refer to `notebook/01_fastai_twin.ipynb` [[link](notebook/01_fastai_twin.ipynb)].
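The two saved state dicts are what the serving side has to restore. A minimal sketch of loading them back for inference (this assumes `encoder` and `head` are rebuilt with the same architectures as in training; the actual logic lives in `./deployment/handler.py`):

```python
import torch

# Hypothetical re-loading step: encoder, head and TwinModel are the objects/classes
# defined earlier in this README, rebuilt identically before loading the weights.
encoder.load_state_dict(torch.load("./model/encoder_weight.pth", map_location="cpu"))
head.load_state_dict(torch.load("./model/head_weight.pth", map_location="cpu"))
model = TwinModel(encoder, head).eval()
```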

### Convolutional Neural Network Interpretation

@@ -534,19 +534,19 @@ Now it's ready to setup and launch TorchServe.
Step 1: Archive the PyTorch Model

```bash
-torch-model-archiver --model-name siamese --version 1.0 --serialized-file ./model/encoder_weight.pth --export-path model_store --handler ./deployment/handler.py -f --extra-files ./model/head_weight.pth -f
+torch-model-archiver --model-name twin --version 1.0 --serialized-file ./model/encoder_weight.pth --export-path model_store --handler ./deployment/handler.py -f --extra-files ./model/head_weight.pth -f
```

Step 2: Serve the Model

```bash
-torchserve --start --ncs --model-store model_store --models siamese.mar
+torchserve --start --ncs --model-store model_store --models twin.mar
```

Step 3: Call the API and Get the Response (here we use [httpie](https://httpie.org/)).

```bash
-time http --form POST http://127.0.0.1:8080/predictions/siamese left@sample/c1.jpg right@sample/c2.jpg cam=False
+time http --form POST http://127.0.0.1:8080/predictions/twin left@sample/c1.jpg right@sample/c2.jpg cam=False
>>>
HTTP/1.1 200
Cache-Control: no-cache; no-store, must-revalidate, private
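For completeness, the same request can be issued from Python; the sketch below mirrors the httpie call above (two multipart file fields `left`/`right` plus a `cam` form field consumed by the custom handler):

```python
import requests

with open("sample/c1.jpg", "rb") as left, open("sample/c2.jpg", "rb") as right:
    resp = requests.post(
        "http://127.0.0.1:8080/predictions/twin",   # endpoint name set by torch-model-archiver
        files={"left": left, "right": right},
        data={"cam": "False"},
    )
print(resp.status_code)
print(resp.text)
```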
@@ -653,7 +653,7 @@ This repository presented an end-to-end workflow of training a Twin Neural Netwo
## Reference

- [fast.ai · Making neural nets uncool again](https://www.fast.ai/)
-- [Tutorial - Using fastai on a custom new task](https://docs.fast.ai/tutorial.siamese.html#Patch-in-a-siampredict-method-to-Learner,-to-automatically-show-images-and-prediction)
+- [Tutorial - Using fastai on a custom new task](https://docs.fast.ai/tutorial.siamese.html)
- [Deep Learning for Coders with Fastai and Pytorch: AI Applications Without a PhD](https://www.amazon.com/Deep-Learning-Coders-fastai-PyTorch/dp/1492045527)
- [Walk with fastai: Lesson 7 - Siamese](https://walkwithfastai.com/Siamese)
- [TORCHSERVE](https://pytorch.org/serve/)
File renamed without changes.
