[Ecosystem] Unsloth #55

@danielhanchen

Description

Contact emails

daniel@unsloth.ai, michael@unsloth.ai

Project summary

Fine-tuning & reinforcement learning for LLMs. Train LLMs and VLMs 2x faster with 70% less VRAM.

Project description

  • We're one of the most popular fine-tuning, training & reinforcement learning libraries (51K GitHub stars)!
  • Unsloth makes training LLMs, VLMs, embedding, audio TTS, and OCR models 2x faster with 70% less VRAM.
  • We also contribute bug fixes to OSS models like Gemma, Qwen, Mistral, Llama, GPT-OSS, and more.
  • We have over 150 million total model downloads via https://huggingface.co/unsloth (4th largest overall), and 30K-40K pip downloads per day for our package!
  • We collaborate directly with TorchAO, OpenEnv, ExecuTorch, and the entire PyTorch team on bringing the latest features of PyTorch to everyone!

Are there any other projects in the PyTorch Ecosystem similar to yours? If, yes, what are they?

Project repo URL

https://github.com/unslothai/unsloth

Additional repos in scope of the application

No response

Project license

Apache-2.0 license

GitHub handles of the project maintainer(s)

danielhanchen, shimmyshimmer, Datta0, mmathew23

Is there a corporate or academic entity backing this project? If so, please provide the name and URL of the entity.

Unsloth AI Inc., https://unsloth.ai/

Website URL

https://unsloth.ai/

Documentation

How do you build and test the project today (continuous integration)? Please describe.

We currently test manually: ~30 notebooks on GCP every few days.
When new models are released, we test them immediately.
We also re-test once GitHub, Discord, and Reddit issues are resolved.
We're working with AMD and NVIDIA on streamlining CI/CD.

Version of PyTorch

We support 2.4.0 through 2.11 nightly.
We currently test on 2.9.1 and 2.10.0.

Components of PyTorch

Everything, since we train LLMs! We use PyTorch directly:

  1. torch.nn.*
  2. torch.nn.functional.*
  3. General torch functions and nearly all of PyTorch!

For example, https://github.com/search?q=repo%3Aunslothai%2Funsloth%20%22torch%22&type=code shows 90 files using PyTorch, over 1,000 GitHub issues mentioning PyTorch, and 130 PRs using PyTorch!
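As an illustrative sketch only (not Unsloth's actual kernels or training code), the `torch.nn.*`, `torch.nn.functional.*`, and general torch APIs named above fit together in a standard PyTorch training step roughly like this:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# A tiny hypothetical model using torch.nn.* modules --
# Unsloth's real training path is far more involved.
class TinyMLP(nn.Module):
    def __init__(self, d_in=16, d_hidden=32, d_out=4):
        super().__init__()
        self.fc1 = nn.Linear(d_in, d_hidden)
        self.fc2 = nn.Linear(d_hidden, d_out)

    def forward(self, x):
        # torch.nn.functional.* for the activation
        return self.fc2(F.relu(self.fc1(x)))

model = TinyMLP()
x = torch.randn(8, 16)                 # batch of 8 inputs
logits = model(x)                      # forward pass, shape (8, 4)
targets = torch.randint(0, 4, (8,))
loss = F.cross_entropy(logits, targets)
loss.backward()                        # general torch autograd machinery
```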

How long do you expect to maintain the project?

Forever :) This is our dream project!
Concretely > 20 years!

Additional information

We also collaborate with:

  1. TorchAO https://pytorch.org/blog/torchao-quantized-models-and-quantization-recipes-now-available-on-huggingface-hub/
  2. ExecuTorch: https://pytorch.org/blog/torchao-quantized-models-and-quantization-recipes-now-available-on-huggingface-hub/
  3. Part of the PyTorch Conference 2025: https://pytorch.org/blog/torchao-quantized-models-and-quantization-recipes-now-available-on-huggingface-hub/
  4. PyTorch at Neurips 2025: https://pytorch.org/blog/pytorch-foundation-at-neurips-2025/
  5. OpenEnv: https://huggingface.co/blog/openenv
  6. GPU Mode, OpenEnv Mini RL Conference: https://www.youtube.com/watch?v=jMSCJZAEYR8
  7. PyTorch generally: https://www.youtube.com/watch?v=MQwryfkydc0
