[SIGIR'24] The official implementation code of MOELoRA.
A generalized framework for subspace tuning methods in parameter-efficient fine-tuning.
Repository for Chat LLaMA - training a LoRA for the LLaMA (1 or 2) models on HuggingFace with 8-bit or 4-bit quantization. Research only.
LoRA: Low-Rank Adaptation of Large Language Models, implemented in PyTorch
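A minimal sketch of what such an implementation involves, assuming a PyTorch nn.Linear base layer (the class name, rank, and scaling values below are illustrative, not taken from that repository): the pretrained weight is frozen and a trainable low-rank update B·A, scaled by alpha/r, is added to the layer's output.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """A frozen nn.Linear plus a trainable low-rank update:
    y = base(x) + (alpha/r) * x @ A^T @ B^T
    """
    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)  # freeze the pretrained weights
        self.lora_A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, r))  # zero init
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scaling * (x @ self.lora_A.T @ self.lora_B.T)

# Illustrative usage: wrap a 768-wide projection with a rank-8 adapter.
layer = LoRALinear(nn.Linear(768, 768), r=8, alpha=16)
```

Initializing B to zero keeps the adapted layer identical to the frozen one at step 0, so fine-tuning starts exactly from the pretrained behavior.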
The official implementation for MTLoRA: A Low-Rank Adaptation Approach for Efficient Multi-Task Learning (CVPR '24)
Easy wrapper for inserting LoRA layers in CLIP.
Over 60 figures and diagrams of LLMs, quantization, low-rank adapters (LoRA), and chat templates FREE TO USE in your blog posts, slides, presentations, or papers.
GaLore: Memory-Efficient LLM Training by Gradient Low-Rank Projection
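In rough outline, GaLore periodically takes the top-r left singular vectors of the weight gradient as a projector, keeps optimizer state in that rank-r subspace, and projects each update back to full size, so optimizer memory scales with r rather than the full matrix dimension. Below is a simplified sketch under those assumptions (plain momentum SGD on a single weight matrix; the rank, refresh interval, and momentum coefficient are illustrative), not the official GaLore optimizer.

```python
import torch

def galore_sgd_step(weight, grad, state, rank=4, refresh_every=200, lr=1e-3, beta=0.9):
    """One momentum-SGD step with the gradient projected to a rank-`rank` subspace.
    `state` is a plain dict carried across steps.
    """
    step = state.get("step", 0)
    if step % refresh_every == 0:
        # Refresh the projector from the gradient's top-r left singular vectors.
        U, _, _ = torch.linalg.svd(grad, full_matrices=False)
        state["P"] = U[:, :rank]   # (m, r)
        state["m"] = None          # old momentum lives in the previous subspace
    P = state["P"]
    g_low = P.T @ grad             # (r, n): gradient in the low-rank subspace
    m = state["m"] if state["m"] is not None else torch.zeros_like(g_low)
    m = beta * m + g_low           # optimizer state stays rank-r
    state["m"] = m
    with torch.no_grad():
        weight -= lr * (P @ m)     # project the update back to full size
    state["step"] = step + 1
```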
This repository contains the source code and related resources for R-LoRA.
FHBI: Improving Generalization with Flat Hilbert Bayesian Inference (ICML2025)
[arXiv 2025] Official code for T-REX: Mixture-of-Rank-One-Experts with semantic-aware Intuition for Multi-task Large Language Model Finetuning
This repository contains the lab work for Coursera course on "Generative AI with Large Language Models".
[arXiv 2025] FairFedMed: Benchmarking Group Fairness in Federated Medical Imaging with FairLoRA
[CVPR 2025 Highlight] Meta LoRA / MetaPEFT: Meta-Learning Hyperparameters for Parameter-Efficient Fine-Tuning (LoRA, Adapter, Prompt Tuning) — Automatic hyperparameter optimization via bi-level meta-learning
Fine-tuning of the Segment Anything model by Meta AI
A curated list of Parameter-Efficient Fine-Tuning papers, each with a TL;DR
Fine-tuning Mistral-7B with PEFT (Parameter-Efficient Fine-Tuning) and LoRA (Low-Rank Adaptation) on the Puffin dataset (multi-turn conversations between GPT-4 and real humans)
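As a hedged illustration of how such a setup typically looks with the Hugging Face peft library (the checkpoint id, target modules, and hyperparameters below are assumptions for the sketch, not necessarily those used by the repository):

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Load the frozen base model (illustrative checkpoint id).
model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")

# Attach rank-8 LoRA adapters to the attention projections.
config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],  # illustrative choice of modules
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```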
[ICLR 2026] Official implementation (Claude Agent reproduction supported) of the paper "mtLoRA: Scalable Multi-Task Low-Rank Model Adaptation"; +2.3% over SOTA with 47% fewer parameters