This repository was archived by the owner on Jan 15, 2024. It is now read-only.

Commit 34fae5c

Author: WANG, Chen

[DOC] Add toc tree depth to enable multiple level menu (#1081) (#1108)

* Add toc tree depth to enable a multiple-level menu.
* Add an index page for each chapter, to enable multiple-level contents on the navigation bar.
* Refactor sub-index pages into rst format.

1 parent: 75c29a3

File tree

7 files changed: +166 additions, -16 deletions

docs/examples/index.rst

Lines changed: 12 additions & 16 deletions

@@ -26,10 +26,9 @@ Word Embedding
 
 .. toctree::
    :hidden:
-   :maxdepth: 1
+   :maxdepth: 2
 
-   word_embedding/word_embedding.ipynb
-   word_embedding/word_embedding_training.ipynb
+   word_embedding/index
 
 
 Language Model
@@ -47,9 +46,9 @@ Language Model
 
 .. toctree::
    :hidden:
-   :maxdepth: 1
+   :maxdepth: 2
 
-   language_model/language_model.ipynb
+   language_model/index
 
 
 Machine Translation
@@ -72,10 +71,9 @@ Machine Translation
 
 .. toctree::
    :hidden:
-   :maxdepth: 1
+   :maxdepth: 2
 
-   machine_translation/gnmt.ipynb
-   machine_translation/transformer.ipynb
+   machine_translation/index
 
 
 Sentence Embedding
@@ -106,11 +104,9 @@ Sentence Embedding
 
 .. toctree::
    :hidden:
-   :maxdepth: 1
+   :maxdepth: 2
 
-   sentence_embedding/elmo_sentence_representation.ipynb
-   sentence_embedding/self_attentive_sentence_embedding.ipynb
-   sentence_embedding/bert.ipynb
+   sentence_embedding/index
 
 
 Sentiment Analysis
@@ -126,9 +122,9 @@ Sentiment Analysis
 
 .. toctree::
    :hidden:
-   :maxdepth: 1
+   :maxdepth: 2
 
-   sentiment_analysis/sentiment_analysis.ipynb
+   sentiment_analysis/index
 
 
 Sequence Sampling
@@ -145,6 +141,6 @@ Sequence Sampling
 
 .. toctree::
    :hidden:
-   :maxdepth: 1
+   :maxdepth: 2
 
-   sequence_sampling/sequence_sampling.ipynb
+   sequence_sampling/index
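Every hunk in this file applies the same refactor: the top-level examples page stops listing notebooks directly and instead points at one index page per chapter, while raising `:maxdepth:` to 2 lets Sphinx pull each chapter's own toctree into the navigation bar as a second menu level. A minimal sketch of the resulting parent/child pair (paths are illustrative, following the layout in this commit):

.. Parent page (e.g. docs/examples/index.rst): link to the chapter index.

.. toctree::
   :hidden:
   :maxdepth: 2

   word_embedding/index

.. Chapter page (e.g. docs/examples/word_embedding/index.rst): list the
   actual notebooks; these become the second menu level.

.. toctree::
   :hidden:
   :maxdepth: 2

   word_embedding.ipynb
   word_embedding_training.ipynb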
docs/examples/language_model/index.rst

Lines changed: 21 additions & 0 deletions

@@ -0,0 +1,21 @@
+Language Model
+==============
+
+.. container:: cards
+
+   .. card::
+      :title: LSTM-based Language Models
+      :link: language_model.html
+
+      Learn what a language model is, what it can do, and how to train a word-level language model
+      with truncated back-propagation-through-time (BPTT).
+
+
+.. toctree::
+   :hidden:
+   :maxdepth: 2
+
+   language_model.ipynb
+
+
+
docs/examples/machine_translation/index.rst

Lines changed: 28 additions & 0 deletions

@@ -0,0 +1,28 @@
+Machine Translation
+===================
+
+.. container:: cards
+
+   .. card::
+      :title: Google Neural Machine Translation
+      :link: gnmt.html
+
+      Learn how to train Google Neural Machine Translation, a seq2seq model with attention.
+
+   .. card::
+      :title: Machine Translation with Transformer
+      :link: transformer.html
+
+      Learn how to use a pre-trained transformer translation model for English-German translation.
+
+
+
+
+.. toctree::
+   :hidden:
+   :maxdepth: 2
+
+   gnmt.ipynb
+   transformer.ipynb
+
+
docs/examples/sentence_embedding/index.rst

Lines changed: 36 additions & 0 deletions

@@ -0,0 +1,36 @@
+Sentence Embedding
+==================
+
+.. container:: cards
+
+   .. card::
+      :title: ELMo: Deep Contextualized Word Representations
+      :link: elmo_sentence_representation.html
+
+      See how to use GluonNLP's model API to automatically download the pre-trained ELMo
+      model from the NAACL 2018 best paper, and extract features with it.
+
+   .. card::
+      :title: A Structured Self-attentive Sentence Embedding
+      :link: self_attentive_sentence_embedding.html
+
+      See how to use GluonNLP to build a more advanced model structure for extracting sentence
+      embeddings to predict Yelp review ratings.
+
+   .. card::
+      :title: BERT: Bidirectional Encoder Representations from Transformers
+      :link: bert.html
+
+      See how to use GluonNLP to fine-tune a sentence pair classification model with
+      pre-trained BERT parameters.
+
+.. toctree::
+   :hidden:
+   :maxdepth: 2
+
+   elmo_sentence_representation.ipynb
+   self_attentive_sentence_embedding.ipynb
+   bert.ipynb
+
+
+
docs/examples/sentiment_analysis/index.rst

Lines changed: 19 additions & 0 deletions

@@ -0,0 +1,19 @@
+Sentiment Analysis
+==================
+
+.. container:: cards
+
+   .. card::
+      :title: Sentiment Analysis by Fine-tuning Word Language Model
+      :link: sentiment_analysis.html
+
+      See how to fine-tune a pre-trained language model to perform sentiment analysis on movie reviews.
+
+
+.. toctree::
+   :hidden:
+   :maxdepth: 2
+
+   sentiment_analysis.ipynb
+
+
docs/examples/sequence_sampling/index.rst

Lines changed: 21 additions & 0 deletions

@@ -0,0 +1,21 @@
+Sequence Sampling
+=================
+
+.. container:: cards
+
+   .. card::
+      :title: Sequence Generation with Sampling and Beam Search
+      :link: sequence_sampling.html
+
+      Learn how to generate sentences from a pre-trained language model through sampling and beam
+      search.
+
+
+.. toctree::
+   :hidden:
+   :maxdepth: 2
+
+   sequence_sampling.ipynb
+
+
+
docs/examples/word_embedding/index.rst

Lines changed: 29 additions & 0 deletions

@@ -0,0 +1,29 @@
+Word Embedding
+==============
+
+.. container:: cards
+
+   .. card::
+      :title: Pre-trained Word Embeddings
+      :link: word_embedding.html
+
+      Basics of using word embeddings with a vocabulary in GluonNLP, applied to word similarity and
+      analogy problems.
+
+   .. card::
+      :title: Word Embeddings Training and Evaluation
+      :link: word_embedding_training.html
+
+      Learn how to train fastText and word2vec embeddings on your own dataset, and determine
+      embedding quality through intrinsic evaluation.
+
+
+.. toctree::
+   :hidden:
+   :maxdepth: 2
+
+   word_embedding.ipynb
+   word_embedding_training.ipynb
+
+
+
