Rename Linear Attention section to Hybrid Attention, add lfm mapping
The section was called "Linear Attention" but the models in it
(FalconH1, Liquid LFM2, LFM2.5) are actually hybrid attention / SSM
architectures, not pure linear attention. Rename for accuracy.
Also add an "lfm" architecture key so LFM2.5_* notebooks (Conversational,
Text Completion, Translation, VL Vision) land in this section instead of
falling through to "Other". The existing "liquid" key only matched
Liquid_LFM2_* filenames, not the newer LFM2.5_* naming.
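
The mapping change described above can be sketched roughly as follows. This is a hypothetical reconstruction (the names `SECTION_KEYS` and `section_for` are assumptions, not the repository's actual code): notebook filenames are matched case-insensitively against architecture key prefixes, and anything unmatched falls through to "Other".

```python
# Hypothetical sketch of the filename-to-section mapping described in the
# commit message. Names below are illustrative, not the repo's actual code.
SECTION_KEYS = {
    "falcon_h1": "Hybrid Attention",
    "liquid": "Hybrid Attention",  # matches Liquid_LFM2_* filenames only
    "lfm": "Hybrid Attention",     # newly added: also catches LFM2.5_* filenames
}

def section_for(filename: str) -> str:
    """Return the README section for a notebook filename."""
    name = filename.lower()
    for key, section in SECTION_KEYS.items():
        if name.startswith(key):
            return section
    return "Other"
```

Without the `"lfm"` key, `LFM2.5_(1.2B)-Conversational.ipynb` matches no prefix (lowercased it starts with `lfm2.5`, not `liquid`) and lands in "Other", which is the bug this commit fixes.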
README.md (+6 -6)
@@ -201,11 +201,15 @@ Below are Colab notebooks, organized by model. You can also view all [notebooks
 | **Granite4.0** **(3B)** | Conversational | <a href="https://colab.research.google.com/github/unslothai/notebooks/blob/main/nb/Granite4.0.ipynb" target="_blank" rel="noopener noreferrer"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> |
 | **Granite4.0** **(350M)** | Conversational | <a href="https://colab.research.google.com/github/unslothai/notebooks/blob/main/nb/Granite4.0_350M.ipynb" target="_blank" rel="noopener noreferrer"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> |
 
-### Linear Attention Notebooks
+### Hybrid Attention Notebooks
 | Model | Type | Notebook Link |
 | --- | --- | --- |
+| **LFM2.5** **(1.2B)** | Conversational | <a href="https://colab.research.google.com/github/unslothai/notebooks/blob/main/nb/LFM2.5_(1.2B)-Conversational.ipynb" target="_blank" rel="noopener noreferrer"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> |
+| **LFM2.5 VL** **(1.6B)** | Vision | <a href="https://colab.research.google.com/github/unslothai/notebooks/blob/main/nb/LFM2.5_VL_(1.6B)-Vision.ipynb" target="_blank" rel="noopener noreferrer"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> |
 | **Liquid LFM2** **(1.2B)** | Conversational | <a href="https://colab.research.google.com/github/unslothai/notebooks/blob/main/nb/Liquid_LFM2_(1.2B)-Conversational.ipynb" target="_blank" rel="noopener noreferrer"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> |
 | **Liquid LFM2** | Conversational | <a href="https://colab.research.google.com/github/unslothai/notebooks/blob/main/nb/Liquid_LFM2-Conversational.ipynb" target="_blank" rel="noopener noreferrer"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> |
+| **LFM2.5** **(1.2B)** | Text Completion | <a href="https://colab.research.google.com/github/unslothai/notebooks/blob/main/nb/LFM2.5_(1.2B)-Text_Completion.ipynb" target="_blank" rel="noopener noreferrer"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> |
+| **LFM2.5** **(1.2B)** | | <a href="https://colab.research.google.com/github/unslothai/notebooks/blob/main/nb/LFM2.5_(1.2B)-Translation.ipynb" target="_blank" rel="noopener noreferrer"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> |
 | **Falcon H1** **(0.5B)** | Alpaca | <a href="https://colab.research.google.com/github/unslothai/notebooks/blob/main/nb/Falcon_H1_(0.5B)-Alpaca.ipynb" target="_blank" rel="noopener noreferrer"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> |
 | **Falcon H1** | Alpaca | <a href="https://colab.research.google.com/github/unslothai/notebooks/blob/main/nb/Falcon_H1-Alpaca.ipynb" target="_blank" rel="noopener noreferrer"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> |
@@ -308,10 +312,6 @@ Below are Colab notebooks, organized by model. You can also view all [notebooks
 | --- | --- | --- |
 | **CodeForces cot Finetune<br>for Reasoning on CodeForces** | Reasoning | <a href="https://colab.research.google.com/github/unslothai/notebooks/blob/main/nb/CodeForces-cot-Finetune_for_Reasoning_on_CodeForces.ipynb" target="_blank" rel="noopener noreferrer"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> |
 | **Synthetic Data Hackathon** | Synthetic Data | <a href="https://colab.research.google.com/github/unslothai/notebooks/blob/main/nb/Synthetic_Data_Hackathon.ipynb" target="_blank" rel="noopener noreferrer"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> |
-| **LFM2.5** **(1.2B)** | Conversational | <a href="https://colab.research.google.com/github/unslothai/notebooks/blob/main/nb/LFM2.5_(1.2B)-Conversational.ipynb" target="_blank" rel="noopener noreferrer"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> |
-| **LFM2.5 VL** **(1.6B)** | Vision | <a href="https://colab.research.google.com/github/unslothai/notebooks/blob/main/nb/LFM2.5_VL_(1.6B)-Vision.ipynb" target="_blank" rel="noopener noreferrer"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> |
-| **LFM2.5** **(1.2B)** | Text Completion | <a href="https://colab.research.google.com/github/unslothai/notebooks/blob/main/nb/LFM2.5_(1.2B)-Text_Completion.ipynb" target="_blank" rel="noopener noreferrer"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> |
-| **LFM2.5** **(1.2B)** | | <a href="https://colab.research.google.com/github/unslothai/notebooks/blob/main/nb/LFM2.5_(1.2B)-Translation.ipynb" target="_blank" rel="noopener noreferrer"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> |
 | **Unsloth** | Studio | <a href="https://colab.research.google.com/github/unslothai/notebooks/blob/main/nb/Unsloth_Studio.ipynb" target="_blank" rel="noopener noreferrer"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> |
 
 # 📒 Kaggle Notebooks
@@ -437,7 +437,7 @@ Below are Colab notebooks, organized by model. You can also view all [notebooks
 | **Granite4.0** **(3B)** | Conversational | <a href="https://www.kaggle.com/notebooks/welcome?src=https://github.com/unslothai/notebooks/blob/main/nb/Kaggle-Granite4.0.ipynb&accelerator=nvidiaTeslaT4" target="_blank" rel="noopener noreferrer"><img src="https://kaggle.com/static/images/open-in-kaggle.svg" alt="Open in Kaggle"></a> |
 | **Granite4.0** **(350M)** | Conversational | <a href="https://www.kaggle.com/notebooks/welcome?src=https://github.com/unslothai/notebooks/blob/main/nb/Kaggle-Granite4.0_350M.ipynb&accelerator=nvidiaTeslaT4" target="_blank" rel="noopener noreferrer"><img src="https://kaggle.com/static/images/open-in-kaggle.svg" alt="Open in Kaggle"></a> |
 
-### Linear Attention Notebooks
+### Hybrid Attention Notebooks
 | Model | Type | Notebook Link |
 | --- | --- | --- |
 | **Liquid LFM2** **(1.2B)** | Conversational | <a href="https://www.kaggle.com/notebooks/welcome?src=https://github.com/unslothai/notebooks/blob/main/nb/Kaggle-Liquid_LFM2_(1.2B)-Conversational.ipynb&accelerator=nvidiaTeslaT4" target="_blank" rel="noopener noreferrer"><img src="https://kaggle.com/static/images/open-in-kaggle.svg" alt="Open in Kaggle"></a> |