This package provides only one class - `AINgram`. It models the n-gram.

You can create an n-gram from any `SequenceableCollection`:

```Smalltalk
trigram := AINgram withElements: #(do not like).
tetragram := #(green eggs and ham) asNgram.
```
Or by explicitly providing the history (an n-gram of lower order) and the last element:

```Smalltalk
hist := #(green eggs and) asNgram.
w := 'ham'.
ngram := AINgram withHistory: hist last: w.
```
You can also create a zerogram - an n-gram of order 0. It is an empty sequence with no history and no last word:

```Smalltalk
AINgram zerogram.
```
### Accessing
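The body of this section is truncated in this excerpt. As a hedged sketch only, accessors mirroring the `withHistory:last:` constructor shown above might look like this - the selector names `history`, `last`, and `order` are assumptions, not confirmed API:

```Smalltalk
"Hypothetical accessors; selector names are guesses based on the
 withHistory:last: constructor, not confirmed by this excerpt."
ngram := AINgram withHistory: #(green eggs and) asNgram last: 'ham'.
ngram history.  "the lower-order n-gram #(green eggs and)"
ngram last.     "the last element, 'ham'"
ngram order.    "the order of the n-gram"
```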
#### 1. Reading the Brown corpus

```Smalltalk
brown := file contents.
```
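The corpus-loading step is truncated in this excerpt (only the final line survives). A minimal sketch in Pharo, assuming the corpus is available as a local plain-text file - the path `brown.txt` is a placeholder, not from the original README:

```Smalltalk
"Sketch only: the original README may open the corpus differently.
 'brown.txt' is a placeholder path."
file := 'brown.txt' asFileReference readStream.
brown := file contents.
file close.
```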
#### 2. Training a 2-gram language model on the corpus
```Smalltalk
model := AINgramModel order: 2.
model trainOn: brown.
```
#### 3. Generating text of 100 words
At each step the model selects the top 5 words that are most likely to follow the previous words and returns a random word from those five (this randomness ensures that the generator does not get stuck in a cycle).

```Smalltalk
generator := AINgramTextGenerator new model: model.
```
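The excerpt ends before the actual generation call. Assuming an API along the lines of the step heading above ("Generating text of 100 words"), usage might look like this - the selector `generateTextOfSize:` is a guess, not confirmed by this excerpt:

```Smalltalk
"Hypothetical usage; generateTextOfSize: is an assumed selector
 based on the step heading, not confirmed API."
generator generateTextOfSize: 100.
```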