We continuously update our library by making improvements and adding new features. If you use or like our library, please **star**:star: this repo. Thank you!

News
---
- [2023/06/10]: **LibAUC 1.3.0 is now available!** This update includes improvements and new features, and we have also released a new documentation website at [https://docs.libauc.org/](https://docs.libauc.org/). Please see the [release notes](https://github.com/Optimization-AI/LibAUC/releases) for details.
- [2023/06/10]: We value your thoughts and feedback! Please consider filling out [this brief survey](https://forms.gle/oWNtjN9kLT51CMdf9) to guide our future development. Thank you!
- [2023/05]: We are working on another major update, which will include new features as well as documentation and codebase improvements. We expect the new version to be released later this month!
- [2022/07]: LibAUC **1.2.0** is released! This version adds more losses and optimizers and includes performance improvements. Please check the [release note](https://github.com/Optimization-AI/LibAUC/releases/tag/v1.2.0) for more details. Thanks!

Why LibAUC?
---
LibAUC offers an easy way to directly optimize commonly used performance measures and losses through a user-friendly API. It has broad applications in AI for tackling many challenges, such as **Classification of Imbalanced Data (CID)**, **Learning to Rank (LTR)**, and **Contrastive Learning of Representations (CLR)**. LibAUC provides a unified framework that abstracts the optimization of many compositional loss functions, including surrogate losses for AUROC, AUPRC/AP, and partial AUROC that are suitable for CID; surrogate losses for NDCG, top-K NDCG, and listwise losses that are used in LTR; and global contrastive losses for CLR.

:mag: What is X-Risk?
---
X-risk refers to a family of compositional measures/losses in which each data point is compared, explicitly or implicitly, with a set of data points to define a risk function. It covers a family of widely used measures/losses, including but not limited to the following four interconnected categories (a schematic formulation is given after the list):

- **Areas under the curves**, including areas under ROC curves (AUROC), areas under Precision-Recall curves (AUPRC), and one-way and two-way partial areas under ROC curves.
- **Ranking measures/objectives**, including p-norm push for bipartite ranking, listwise losses for learning to rank (e.g., ListNet), mean average precision (mAP), normalized discounted cumulative gain (NDCG), etc.
- **Performance at the top**, including top push, top-K variants of mAP and NDCG, Recall at top-K positions (Rec@K), Precision at a certain Recall level (Prec@Rec), etc.
- **Contrastive objectives**, including supervised contrastive objectives (e.g., NCA) and global self-supervised contrastive objectives improving upon SimCLR and CLIP.
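
Schematically, these objectives share a compositional structure: an outer function is applied to an inner term that compares each example against a set of other examples. The sketch below is illustrative notation (not the library's API); the inner term is shown as an average of pairwise comparisons, which is one common instance:

```math
\min_{\mathbf{w}}\; \frac{1}{|\mathcal{S}|}\sum_{\mathbf{z}_i\in\mathcal{S}} f_i\Big(g\big(\mathbf{w};\mathbf{z}_i,\mathcal{S}_i\big)\Big),
\qquad
g\big(\mathbf{w};\mathbf{z}_i,\mathcal{S}_i\big)=\frac{1}{|\mathcal{S}_i|}\sum_{\mathbf{z}_j\in\mathcal{S}_i}\ell\big(\mathbf{w};\mathbf{z}_i,\mathbf{z}_j\big)
```

Here `S` is the training set, `S_i` is the comparison set for example `z_i` (for AUROC, for instance, each positive is compared against the negatives), `l` is a pairwise surrogate loss, and the outer function `f_i` (identity, a top-K selection, a log-sum-exp as in contrastive objectives, etc.) determines which measure is optimized. Because the outer and inner terms are coupled, naive mini-batch gradient estimates are biased, which is the difficulty LibAUC's stochastic algorithms are designed to handle.
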
:star: Key Features
---
- **Easy Installation** - Easy to install and to integrate LibAUC into existing training pipelines built on deep learning frameworks such as PyTorch.
- **Broad Applications** - Users can train different neural network architectures (e.g., MLP, CNN, GNN, Transformer) that suit their data types.
- **Efficient Algorithms** - Stochastic algorithms with provable theoretical convergence that support learning with millions of data points without requiring large batch sizes.
- **Hands-on Tutorials** - Hands-on tutorials are provided for optimizing a variety of measures and objectives belonging to the family of X-risks.

The source code of the latest version is available for download [here](https://github.com/Optimization-AI/LibAUC/releases); for more details, please check the latest [release note](https://github.com/Optimization-AI/LibAUC/releases/).

:clipboard: Usage
---
#### Example training pipeline for optimizing X-risk (e.g., AUROC)

```python
>>> optimizer.update_regularizer()
```
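
The call above is the final step of the pipeline: it refreshes the optimizer's internal regularization state between training stages. In full, a typical pipeline builds the loss, passes it to its paired optimizer, and runs an otherwise standard PyTorch training loop. The sketch below illustrates this for AUROC with `AUCMLoss` and `PESG`; the toy data, model, and hyperparameter values are placeholders, and constructor arguments may differ across LibAUC versions, so check the documentation for your installed release.

```python
# Minimal illustrative sketch of an AUROC-optimization pipeline with LibAUC.
# The dataset, model, and hyperparameters are placeholders; exact constructor
# signatures for AUCMLoss/PESG may vary across LibAUC versions.
import torch
from torch.utils.data import DataLoader, TensorDataset
from libauc.losses import AUCMLoss
from libauc.optimizers import PESG

# Toy imbalanced binary dataset (~10% positives) so the sketch is self-contained.
X = torch.randn(1000, 20)
y = (torch.rand(1000) < 0.1).float()
loader = DataLoader(TensorDataset(X, y), batch_size=64, shuffle=True)

model = torch.nn.Linear(20, 1)
loss_fn = AUCMLoss()
optimizer = PESG(model.parameters(), loss_fn=loss_fn, lr=0.1,
                 margin=1.0, epoch_decay=0.003, weight_decay=1e-4)

model.train()
for epoch in range(20):
    for data, targets in loader:
        preds = torch.sigmoid(model(data)).squeeze(1)  # AUCMLoss expects scores in [0, 1]
        loss = loss_fn(preds, targets)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    if epoch in (10, 15):
        # Decay the learning rate and refresh the optimizer's internal
        # regularization state at chosen stages (the epochs here are illustrative).
        optimizer.update_regularizer(decay_factor=10)
```

The same pattern applies to the other measures in the library: swap in the surrogate loss for your objective and the optimizer LibAUC pairs with it.
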

:notebook_with_decorative_cover: Tutorials
-------
### X-Risk

- **AUROC**: [Optimizing AUROC loss on imbalanced dataset](https://github.com/Optimization-AI/LibAUC/blob/main/examples/02_Optimizing_AUROC_with_ResNet20_on_Imbalanced_CIFAR10.ipynb)
- **AUPRC**: [Optimizing AUPRC loss on imbalanced dataset](https://github.com/Optimization-AI/LibAUC/blob/main/examples/03_Optimizing_AUPRC_Loss_on_Imbalanced_dataset.ipynb)
- **Partial AUROC**: [Optimizing Partial AUC loss on imbalanced dataset](https://github.com/Optimization-AI/LibAUC/blob/main/examples/11_Optimizing_pAUC_Loss_on_Imbalanced_data_wrapper.ipynb)
- **Compositional AUROC**: [Optimizing Compositional AUROC loss on imbalanced dataset](https://github.com/Optimization-AI/LibAUC/blob/main/examples/09_Optimizing_CompositionalAUC_Loss_with_ResNet20_on_CIFAR10.ipynb)
- **NDCG**: [Optimizing NDCG loss on MovieLens 20M](https://github.com/Optimization-AI/LibAUC/blob/main/examples/10_Optimizing_NDCG_Loss_on_MovieLens20M.ipynb)
- **SogCLR**: [Optimizing Contrastive Loss using small batch size on ImageNet-1K](https://github.com/Optimization-AI/SogCLR)

### Applications

- [A Tutorial of Imbalanced Data Sampler](https://github.com/Optimization-AI/LibAUC/blob/main/examples/placeholder.md) (Updates Coming Soon)
- [Constructing benchmark imbalanced datasets for CIFAR10, CIFAR100, CATvsDOG, STL10](https://github.com/Optimization-AI/LibAUC/blob/main/examples/01_Creating_Imbalanced_Benchmark_Datasets.ipynb)
- [Using LibAUC with PyTorch learning rate scheduler](https://github.com/Optimization-AI/LibAUC/blob/main/examples/04_Training_with_Pytorch_Learning_Rate_Scheduling.ipynb)
- [Optimizing AUROC loss on Chest X-Ray dataset (CheXpert)](https://github.com/Optimization-AI/LibAUC/blob/main/examples/05_Optimizing_AUROC_Loss_with_DenseNet121_on_CheXpert.ipynb)
- [Optimizing AUROC loss on Skin Cancer dataset (Melanoma)](https://github.com/Optimization-AI/LibAUC/blob/main/examples/08_Optimizing_AUROC_Loss_with_DenseNet121_on_Melanoma.ipynb)
- [Optimizing AUROC loss on Molecular Graph dataset (OGB-Molhiv)](https://github.com/yzhuoning/DeepAUC_OGB_Challenge)
- [Optimizing multi-label AUROC loss on Chest X-Ray dataset (CheXpert)](https://github.com/Optimization-AI/LibAUC/blob/main/examples/07_Optimizing_Multi_Label_AUROC_Loss_with_DenseNet121_on_CheXpert.ipynb)
- [Optimizing AUROC loss on Tabular dataset (Credit Fraud)](https://github.com/Optimization-AI/LibAUC/blob/main/examples/12_Optimizing_AUROC_Loss_on_Tabular_Data.ipynb)
- [Optimizing AUROC loss for Federated Learning](https://github.com/Optimization-AI/LibAUC/blob/main/examples/scripts/06_Optimizing_AUROC_loss_with_DenseNet121_on_CIFAR100_in_Federated_Setting_CODASCA.py)

Citation
---------
If you find LibAUC useful in your work, please cite the following papers (also available in [BibTex](https://github.com/Optimization-AI/LibAUC/blob/main/citations.bib)) and acknowledge our library:
```
@inproceedings{yuan2023libauc,
  title={LibAUC: A Deep Learning Library for X-Risk Optimization},
  author={Zhuoning Yuan and Dixian Zhu and Zi-Hao Qiu and Gang Li and Xuanhui Wang and Tianbao Yang},
  booktitle={29th SIGKDD Conference on Knowledge Discovery and Data Mining},
  year={2023}
}
```

:email: Contact
----------
For any technical questions, please open a new issue on GitHub. If you have any other questions, please contact us: [Zhuoning Yuan](https://zhuoning.cc) [[email protected]] and [Tianbao Yang](http://people.tamu.edu/~tianbao-yang/) [[email protected]].