
Commit 778eef6 (1 parent: d627981)

Add download instructions for pretrained model in dynamic quantization tutorial

Fixes #3254
- Added wget command to download word_language_model_quantize.pth
- Included note about file placement
- Improved tutorial usability by providing missing download information

File tree: 1 file changed (+16 −4 lines)


advanced_source/dynamic_quantization_tutorial.py

Lines changed: 16 additions & 4 deletions
```diff
@@ -129,15 +129,27 @@ def tokenize(self, path):
 
 corpus = Corpus(model_data_filepath + 'wikitext-2')
 
+######################################################################
+# 3. Load the pretrained model
+# -----------------------------
+#
 ######################################################################
 # 3. Load the pretrained model
 # -----------------------------
 #
 # This is a tutorial on dynamic quantization, a quantization technique
-# that is applied after a model has been trained. Therefore, we'll simply load some
-# pretrained weights into this model architecture; these weights were obtained
-# by training for five epochs using the default settings in the word language model
-# example.
+# that is applied after a model has been trained. Therefore, we'll simply
+# load some pretrained weights into this model architecture; these
+# weights were obtained by training for five epochs using the default
+# settings in the word language model example.
+#
+# **Note:** Before running this tutorial, download the required pretrained model:
+#
+# .. code::
+#
+#    wget https://s3.amazonaws.com/pytorch-tutorial-assets/word_language_model_quantize.pth
+#
+# Place the downloaded file in the data directory or update the model_data_filepath accordingly.
 
 ntokens = len(corpus.dictionary)
```