
Commit 307812a

Update README.md
1 parent 0d43151 commit 307812a

1 file changed

README.md: 6 additions & 7 deletions
@@ -44,12 +44,12 @@
 ```
 $ pip install libauc
 ```
-You can also download source code from [here](https://github.com/Optimization-AI/LibAUC/releases)
+You can also download source code from [here](https://github.com/Optimization-AI/LibAUC/releases).

 :notebook_with_decorative_cover: Usage
 -------
 ### Official Tutorials:
-- Creating Imbalanced Benchmark Datasets [[Notebook](https://github.com/Optimization-AI/LibAUC/blob/main/examples/01_Creating_Imbalanced_Benchmark_Datasets.ipynb)][[Script](https://github.com/Optimization-AI/LibAUC/tree/main/examples/scripts)]
+- Creating Imbalanced Benchmark Datasets on **CIFAR10, CIFAR100, CATvsDOG, STL10** [[Notebook](https://github.com/Optimization-AI/LibAUC/blob/main/examples/01_Creating_Imbalanced_Benchmark_Datasets.ipynb)][[Script](https://github.com/Optimization-AI/LibAUC/tree/main/examples/scripts)]
 - Training with Pytorch Learning Rate Scheduling [[Notebook](https://github.com/Optimization-AI/LibAUC/blob/main/examples/04_Training_with_Pytorch_Learning_Rate_Scheduling.ipynb)][[Script](https://github.com/Optimization-AI/LibAUC/tree/main/examples/scripts)]
 - Optimizing <strong>AUROC</strong> loss with ResNet20 on Imbalanced CIFAR10 [[Notebook](https://github.com/Optimization-AI/LibAUC/blob/main/examples/02_Optimizing_AUROC_with_ResNet20_on_Imbalanced_CIFAR10.ipynb)][[Script](https://github.com/Optimization-AI/LibAUC/tree/main/examples/scripts)]
 - Optimizing <strong>AUPRC</strong> loss with ResNet18 on Imbalanced CIFAR10 [[Notebook](https://github.com/Optimization-AI/LibAUC/blob/main/examples/03_Optimizing_AUPRC_with_ResNet18_on_Imbalanced_CIFAR10.ipynb)][[Script](https://github.com/Optimization-AI/LibAUC/tree/main/examples/scripts)]
@@ -111,14 +111,13 @@

 :zap: Useful Tips
 ---
-Checklist before Running Experiments:
-- [ ] Your data should have binary labels **0,1** and **1** is the **minority class** and **0** is the **majority class**
-- [ ] Compute the **imbalance_ratio** from your train set and pass it to `AUCMLoss(imratio=xxx)`
-- [ ] Adopt a proper **initial learning rate**, e.g., **lr=[0.1, 0.05]** usually works better
+- [ ] Your dataset should have **0,1** labels, where **1** is the **minority class** and **0** is the **majority class**
+- [ ] Compute `imratio=#pos/#total` on the training set and pass it to `AUCMLoss(imratio=xxx)`
+- [ ] Adopt a proper `initial learning rate`, e.g., `lr=[0.1, 0.05]` usually works better
 - [ ] Choose `libauc.optimizers.PESG` to optimize `AUCMLoss(imratio=xxx)`
 - [ ] Use `optimizer.update_regularizer(decay_factor=10)` to update the learning rate and regularizer in a stagewise manner
 - [ ] Add an activation layer, e.g., `torch.sigmoid(logits)`, before passing model outputs to the loss function
-- [ ] Reshape both variables **preds** and **targets** to `(N, 1)` before calling loss function
+- [ ] Reshape both variables `y_preds` and `y_targets` to `(N, 1)` before calling the loss function

 :page_with_curl: Citation
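
Taken together, the checklist above amounts to a short training loop. The sketch below is a minimal, illustrative example, not the library's official recipe: only `AUCMLoss(imratio=...)`, `libauc.optimizers.PESG`, `optimizer.update_regularizer(decay_factor=10)`, `torch.sigmoid(logits)`, and the `(N, 1)` reshaping come from the README text; the `libauc.losses` import path, the `PESG` constructor keywords, the toy dataset, the linear model, and the epoch schedule are assumptions that may not match your installed LibAUC version.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from libauc.losses import AUCMLoss      # assumed import path
from libauc.optimizers import PESG

# Toy imbalanced binary data: labels are 0/1 and 1 is the minority class.
X = torch.randn(1000, 20)
y = (torch.rand(1000) < 0.1).float()    # roughly 10% positives
imratio = y.mean().item()               # imratio = #pos / #total, computed on the training set
train_loader = DataLoader(TensorDataset(X, y), batch_size=64, shuffle=True)

model = torch.nn.Linear(20, 1)          # stand-in for ResNet20/ResNet18; outputs raw logits
loss_fn = AUCMLoss(imratio=imratio)

# NOTE: PESG's constructor arguments differ between LibAUC releases (older versions
# take the loss's internal variables directly); the keywords below are assumptions,
# so check the docs of the version you installed.
optimizer = PESG(model, loss_fn=loss_fn, lr=0.1, margin=1.0, weight_decay=1e-4)

for epoch in range(30):
    if epoch in (15, 25):
        # Stagewise schedule (epochs chosen arbitrarily for this sketch):
        # decay the learning rate and refresh the regularizer.
        optimizer.update_regularizer(decay_factor=10)
    for data, targets in train_loader:
        logits = model(data)
        y_preds = torch.sigmoid(logits).reshape(-1, 1)   # activation before the loss
        y_targets = targets.reshape(-1, 1)               # both reshaped to (N, 1)
        loss = loss_fn(y_preds, y_targets)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

The stagewise `update_regularizer(decay_factor=10)` call takes the place of a standard PyTorch learning-rate scheduler here, which is why no `torch.optim.lr_scheduler` appears in the loop.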
