
Commit 5222d50

docs: fastfit tutorial contains link to copied blog post (#4995)
<!-- Thanks for your contribution! As part of our Community Growers initiative 🌱, we're donating Justdiggit bunds in your name to reforest sub-Saharan Africa. To claim your Community Growers certificate, please contact David Berenstein in our Slack community or fill in this form https://tally.so/r/n9XrxK once your PR has been merged. -->

# Description

Change the link to the original blog post.

Closes #4994

**Type of change**
(Remember to title the PR according to the type of change)

- [ ] Documentation update

**How Has This Been Tested**
(Please describe the tests that you ran to verify your changes.)

- [ ] `sphinx-autobuild` (read [Developer Documentation](https://docs.argilla.io/en/latest/community/developer_docs.html#building-the-documentation) for more details)

**Checklist**

- [ ] I added relevant documentation
- [ ] I followed the style guidelines of this project
- [ ] I did a self-review of my code
- [ ] I made corresponding changes to the documentation
- [ ] My changes generate no new warnings
- [ ] I filled out [the contributor form](https://tally.so/r/n9XrxK) (see text above)
- [ ] I have added relevant notes to the `CHANGELOG.md` file (See https://keepachangelog.com/)
1 parent eddbcb4 commit 5222d50

File tree

2 files changed: +3 −3 lines changed

docs/_source/tutorials_and_integrations/tutorials/feedback/training-fastfit-agreement.ipynb

Lines changed: 2 additions & 2 deletions

```diff
@@ -36,7 +36,7 @@
 "\n",
 "[**FastFit**](https://github.com/IBM/fastfit) is a library that allows you to train a multi-class classifier with few-shot learning. It is based on the [**transformers**](https://huggingface.co/transformers/) library and uses a pre-trained model to fine-tune it on a small dataset. This is particularly useful when you have a small dataset and you want to train a model quickly. However, [**SetFit**](https://github.com/huggingface/setfit) is another well-know library that also allows few-shot learning with Sentence Transformers.\n",
 "\n",
-"So, why using one and not the other? Based on this [article](https://medium.com/@meetgandhi586/comparing-setfit-fastfit-and-semantic-router-finding-the-best-nlp-chatbot-intent-detection-d8161a7ad117), where the author compares FastFit, SetFit, and Semantic Router, we can determine some distinctions.\n",
+"So, why using one and not the other? Based on this [article](https://pub.towardsai.net/few-shot-nlp-intent-classification-d29bf85548aa), where the author compares FastFit, SetFit, and Semantic Router, we can determine some distinctions.\n",
 "\n",
 "| **Aspect** | **FastFit** | **SetFit** |\n",
 "|---------------------------|-----------------------------------------------|--------------------------------------------|\n",
@@ -335,7 +335,7 @@
 ],
 "source": [
 "# Sample the dataset\n",
-"dataset[\"train\"] = sample_dataset(dataset[\"train\"], label_column=\"intent\", num_samples_per_label=5, seed=42)\n",
+"dataset[\"train\"] = sample_dataset(dataset[\"train\"], label_column=\"intent\", num_samples_per_label=10, seed=42)\n",
 "\n",
 "# Rename the validation split\n",
 "dataset['validation'] = dataset.pop('val')\n",
```
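The second change bumps `num_samples_per_label` from 5 to 10 in the tutorial's `sample_dataset` call. As a rough sketch of what that sampling step does, here is a standalone stand-in (this is not fastfit's actual implementation; `sample_per_label` and the toy rows are hypothetical, for illustration only):

```python
# Hedged sketch: draw a fixed number of examples per label, as the
# tutorial's sample_dataset(..., num_samples_per_label=10, seed=42)
# call is assumed to do. Pure-Python stand-in, not the fastfit API.
import random
from collections import defaultdict

def sample_per_label(rows, label_key, num_samples_per_label, seed=42):
    """Return up to num_samples_per_label rows for each distinct label."""
    rng = random.Random(seed)
    by_label = defaultdict(list)
    for row in rows:
        by_label[row[label_key]].append(row)
    sampled = []
    for label in sorted(by_label):
        group = by_label[label]
        k = min(num_samples_per_label, len(group))
        sampled.extend(rng.sample(group, k))
    return sampled

# Toy data: 60 utterances spread evenly over 3 intents.
rows = [{"text": f"utterance {i}", "intent": i % 3} for i in range(60)]
train = sample_per_label(rows, label_key="intent", num_samples_per_label=10)
print(len(train))  # 3 labels x 10 samples each -> 30
```

Raising the cap from 5 to 10 simply doubles the few-shot budget per intent, which is what the notebook cell now requests.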

docs/_source/tutorials_and_integrations/tutorials/tutorials.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -139,7 +139,7 @@ Learn how to monitor data and model drift in a real-world scenario using differe
 
 Learn how to train an ABSA model and evaluate with Argilla.
 ```
-```{grid-item-card} # 🙌 Analyzing Annotation Metrics with FastFit Model Predictions
+```{grid-item-card} 🙌 Analyzing Annotation Metrics with FastFit Model Predictions
 :link: feedback/training-fastfit-agreement.html
 Learn how to train a FastFit model and calculate well-know annotation metrics.
 ```
````
