articles/machine-learning/algorithm-module-reference/latent-dirichlet-allocation.md
5 additions and 6 deletions
@@ -25,7 +25,7 @@ This module takes a column of text, and generates these outputs:
+ A transformation, which you can save and reapply to new text used as input
-This module uses the scikit-learn library. For more information about scikit-learn, see the [GitHub repository](https://github.com/scikit-learn/scikit-learn) which includes tutorials and an explanation of the algorithm.
+This module uses the scikit-learn library. For more information about scikit-learn, see the [GitHub repository](https://github.com/scikit-learn/scikit-learn), which includes tutorials and an explanation of the algorithm.
### More about Latent Dirichlet Allocation (LDA)
31
31
@@ -37,7 +37,7 @@ A generative model can be preferable because it avoids making any strong assumpt
+ The implementation in this module is based on the [scikit-learn library](https://github.com/scikit-learn/scikit-learn/blob/master/sklearn/decomposition/_lda.py) for LDA.
-For more information, see the [Technical notes](#bkmk_AboutLDA) section.
+For more information, see the [Technical notes](#technical-notes) section.
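To make the scikit-learn relationship above concrete, here is a minimal sketch of the analogous workflow outside the designer, using `CountVectorizer` and `LatentDirichletAllocation`; the sample documents, topic count, and variable names are illustrative assumptions, not the module's internal code.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Illustrative documents standing in for the module's input text column.
docs = [
    "machine learning models for text",
    "topic models describe document collections",
    "football season starts this weekend",
]

# Convert the text to token counts (unigram features); LDA expects count data.
vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(docs)

# n_components plays the role of the module's number-of-topics setting;
# random_state only makes the sketch reproducible.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topic = lda.fit_transform(counts)

# One row per document, one column per topic; each row sums to 1.
print(doc_topic.round(3))
```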
## How to configure Latent Dirichlet Allocation
@@ -105,13 +105,12 @@ The module has two outputs:
+**Feature topic matrix**: The leftmost column contains the extracted text feature, and there is a column for each category, containing the score of that feature for that category.
-For details, see [Example of LDA results](#bkmk_Understanding).
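As a rough illustration of the layout described above, the following sketch builds an equivalent feature-topic table from scikit-learn's `components_` attribute; the documents, topic labels, and column names are illustrative assumptions rather than the module's exact output.

```python
import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "machine learning models for text",
    "topic models describe document collections",
    "football season starts this weekend",
]
vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

# components_ has one row per topic and one column per vocabulary feature.
# Transposing it puts the extracted feature in the leftmost column with one
# score column per topic, mirroring the feature topic matrix layout.
feature_topic = (
    pd.DataFrame(
        lda.components_.T,
        index=vectorizer.get_feature_names_out(),  # scikit-learn >= 1.0
        columns=[f"Topic {i + 1}" for i in range(lda.n_components)],
    )
    .rename_axis("Feature")
    .reset_index()
)
print(feature_topic)
```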
### LDA transformation
-This module also outputs the *transformation* that applies LDA to the dataset, as an [ITransform interface](itransform-interface.md).
+This module also outputs the *LDA transformation* that applies LDA to the dataset.
-You can save this transformation and re-use it for other datasets. This might be useful if you have trained on a large corpus and want to reuse the coefficients or categories.
+You can save this transformation by registering the dataset under the **Outputs+logs** tab in the right pane of the module, and reuse it for other datasets. This might be useful if you have trained on a large corpus and want to reuse the coefficients or categories.
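Outside the designer, the equivalent save-and-reuse step can be sketched with `joblib`, persisting the fitted vectorizer and LDA model together and reapplying them to new text without refitting; the file name and sample text are illustrative, and this is not the mechanism the module itself uses.

```python
import joblib
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

train_docs = [
    "machine learning models for text",
    "football season starts this weekend",
]
vectorizer = CountVectorizer(stop_words="english")
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(vectorizer.fit_transform(train_docs))

# Persist both pieces of the fitted transformation as a single artifact.
joblib.dump((vectorizer, lda), "lda_transformation.joblib")

# Later, in another pipeline: reload and apply to new documents only.
vectorizer, lda = joblib.load("lda_transformation.joblib")
new_topics = lda.transform(vectorizer.transform(["new articles about machine learning"]))
print(new_topics)  # one row per new document, one column per topic
```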
### Refining an LDA model or results
@@ -137,7 +136,7 @@ The accuracy of models based on LDA can often be improved by using natural langu
+ Named entity recognition
-For more information, see [Preprocess Text](preprocess-text.md) and [Named Entity Recognition](named-entity-recognition.md).
+For more information, see [Preprocess Text](preprocess-text.md).
In the designer, you can also use R or Python libraries for text processing: [Execute R Script](execute-r-script.md) and [Execute Python Script](execute-python-script.md).
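As a hedged sketch of that option, the snippet below shows light text cleanup in an [Execute Python Script](execute-python-script.md) module before LDA; the `azureml_main` entry point follows that module's convention, while the `text` column name and the cleanup rules are illustrative assumptions to adapt to your dataset.

```python
def azureml_main(dataframe1=None, dataframe2=None):
    # Lowercase, drop punctuation and digits, and collapse whitespace in the
    # text column before it is passed on to the LDA module.
    cleaned = (
        dataframe1["text"]
        .str.lower()
        .str.replace(r"[^a-z\s]", " ", regex=True)
        .str.replace(r"\s+", " ", regex=True)
        .str.strip()
    )
    return dataframe1.assign(text=cleaned),
```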