Commit f4deb6b

fix URLs, fix alt-text
1 parent 90d6b5a commit f4deb6b

2 files changed: 13 additions, 12 deletions

2 files changed

+13
-12
lines changed

README.Rmd

Lines changed: 6 additions & 6 deletions
@@ -93,15 +93,15 @@ cbind(test, predict(fit, test, type = "prob")) %>%
 TabNet has intrinsic explainability feature through the visualization of attention map, either **aggregated**:
 
 ```{r model-explain}
-#| fig.alt: "An expainability plot showing for each variable of the test-set on the y axis the importance along each observation on the x axis. The value is a mask agggregate."
+#| fig.alt: "A heatmap explainability plot showing, for each variable of the test set on the y axis, the importance along each observation on the x axis. The value is a mask aggregate."
 explain <- tabnet_explain(fit, test)
 autoplot(explain)
 ```
 
 or at **each layer** through the `type = "steps"` option:
 
 ```{r step-explain}
-#| fig.alt: "An small-multiple expainability plot for each step of the Tabnet network. Each plot shows for each variable of the test-set on the y axis the importance along each observation on the x axis."
+#| fig.alt: "A small-multiple heatmap explainability plot for each step of the TabNet network. Each plot shows, for each variable of the test set on the y axis, the importance along each observation on the x axis."
 autoplot(explain, type = "steps")
 ```

@@ -115,25 +115,25 @@ pretrain <- tabnet_pretrain(rec, train, epochs = 50, valid_split=0.1, learn_rate
 autoplot(pretrain)
 ```
 
-The example here is a toy example as the `train` dataset does actually contain outcomes. The vignette [`vignette("selfsupervised_training")`](articles/selfsupervised_training.html) will gives you the complete correct workflow step-by-step.
+The example here is a toy example, as the `train` dataset does actually contain outcomes. The vignette [`vignette("selfsupervised_training")`](https://mlverse.github.io/tabnet/articles/selfsupervised_training.html) will give you the complete, correct workflow step by step.
 
 ## {tidymodels} integration
 
 The integration within tidymodels workflows offers you unlimited opportunity to compare {tabnet} models with challengers.
 
-Don't miss the [`vignette("tidymodels-interface")`](articles/tidymodels-interface.html) for that.
+Don't miss the [`vignette("tidymodels-interface")`](https://mlverse.github.io/tabnet/articles/tidymodels-interface.html) for that.
 
 ## Missing data in predictors
 
 {tabnet} leverage the masking mechanism to deal with missing data, so you don't have to remove the entries in your dataset with some missing values in the predictors variables.
 
-See [`vignette("Missing_data_predictors")`](articles/Missing_data_predictors.html)
+See [`vignette("Missing_data_predictors")`](https://mlverse.github.io/tabnet/articles/Missing_data_predictors.html)
 
 ## Imbalanced binary classification
 
 {tabnet} includes a Area under the $Min(FPR,FNR)$ (AUM) loss function `nn_aum_loss()` dedicated to your imbalanced binary classification tasks.
 
-Try it out in [`vignette("aum_loss")`](articles/aum_loss.html)
+Try it out in [`vignette("aum_loss")`](https://mlverse.github.io/tabnet/articles/aum_loss.html)
 
 # Comparison with other implementations
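For context, the explainability calls whose figure alt-texts this commit edits can be reproduced with a short R sketch. This is illustrative only and not part of the diff: `fit` and `test` are placeholders for a fitted tabnet model and a held-out data frame, as in the README.

```r
library(tabnet)
library(ggplot2)   # supplies the autoplot() generic used below

# `fit` is a model returned by tabnet_fit(); `test` is held-out data.
explain <- tabnet_explain(fit, test)

autoplot(explain)                  # aggregated attention-mask heatmap
autoplot(explain, type = "steps")  # one heatmap per TabNet decision step
```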

README.md

Lines changed: 7 additions & 6 deletions
@@ -126,15 +126,15 @@ explain <- tabnet_explain(fit, test)
 autoplot(explain)
 ```
 
-<img src="man/figures/README-model-explain-1.png" alt="An expainability plot showing for each variable of the test-set on the y axis the importance along each observation on the x axis. The value is a mask agggregate." width="100%" />
+<img src="man/figures/README-model-explain-1.png" alt="A heatmap explainability plot showing, for each variable of the test set on the y axis, the importance along each observation on the x axis. The value is a mask aggregate." width="100%" />
 
 or at **each layer** through the `type = "steps"` option:
 
 ``` r
 autoplot(explain, type = "steps")
 ```
 
-<img src="man/figures/README-step-explain-1.png" alt="An small-multiple expainability plot for each step of the Tabnet network. Each plot shows for each variable of the test-set on the y axis the importance along each observation on the x axis." width="100%" />
+<img src="man/figures/README-step-explain-1.png" alt="A small-multiple heatmap explainability plot for each step of the TabNet network. Each plot shows, for each variable of the test set on the y axis, the importance along each observation on the x axis." width="100%" />
 
 ## Self-supervised pretraining
@@ -152,7 +152,7 @@ autoplot(pretrain)
 
 The example here is a toy example as the `train` dataset does actually
 contain outcomes. The vignette
-[`vignette("selfsupervised_training")`](articles/selfsupervised_training.html)
+[`vignette("selfsupervised_training")`](https://mlverse.github.io/tabnet/articles/selfsupervised_training.html)
 will gives you the complete correct workflow step-by-step.
 
 ## {tidymodels} integration
@@ -161,7 +161,7 @@ The integration within tidymodels workflows offers you unlimited
 opportunity to compare {tabnet} models with challengers.
 
 Don’t miss the
-[`vignette("tidymodels-interface")`](articles/tidymodels-interface.html)
+[`vignette("tidymodels-interface")`](https://mlverse.github.io/tabnet/articles/tidymodels-interface.html)
 for that.
 
 ## Missing data in predictors
@@ -171,15 +171,16 @@ you don’t have to remove the entries in your dataset with some missing
 values in the predictors variables.
 
 See
-[`vignette("Missing_data_predictors")`](articles/Missing_data_predictors.html)
+[`vignette("Missing_data_predictors")`](https://mlverse.github.io/tabnet/articles/Missing_data_predictors.html)
 
 ## Imbalanced binary classification
 
 {tabnet} includes a Area under the $Min(FPR,FNR)$ (AUM) loss function
 `nn_aum_loss()` dedicated to your imbalanced binary classification
 tasks.
 
-Try it out in [`vignette("aum_loss")`](articles/aum_loss.html)
+Try it out in
+[`vignette("aum_loss")`](https://mlverse.github.io/tabnet/articles/aum_loss.html)
 
 # Comparison with other implementations