This is the official implementation of the paper "Stochastic Optimization of Areas Under Precision-Recall Curves with Provable Convergence", published at NeurIPS 2021.
Benchmark datasets used in the paper:
Image: CIFAR10, CIFAR100, Melanoma
Graph: HIV, MUV, AICures
The main algorithm, SOAP, has been implemented in LibAUC. Install the library with:
pip install libauc
Then import the optimizers:
>>> from libauc.optimizers import SOAP_SGD, SOAP_ADAM
You can also design your own loss. The following is a use case:
>>> # import libraries
>>> import torch
>>> from libauc.losses import APLoss_SH
>>> from libauc.optimizers import SOAP_SGD, SOAP_ADAM
...
>>> # define loss and optimizer
>>> Loss = APLoss_SH()
>>> optimizer = SOAP_ADAM(model.parameters())
...
>>> # training loop
>>> model.train()
>>> for index, data, targets in trainloader:
...     data, targets = data.cuda(), targets.cuda()
...     logits = model(data)
...     preds = torch.sigmoid(logits)
...     loss = Loss(preds, targets, index)
...     optimizer.zero_grad()
...     loss.backward()
...     optimizer.step()
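For reference, the overall shape of the loop above can be sketched as a self-contained script. This is only an illustrative stand-in, not the repo's method: a plain `BCELoss` substitutes for `APLoss_SH` and vanilla `Adam` substitutes for `SOAP_ADAM`, and the toy model and data are made up, so it runs on CPU without LibAUC installed.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy imbalanced binary-classification data (AP maximization targets
# such imbalanced settings; ~20% positives here).
X = torch.randn(64, 10)
y = (torch.rand(64) < 0.2).float()

model = nn.Linear(10, 1)
criterion = nn.BCELoss()  # placeholder for APLoss_SH
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)  # placeholder for SOAP_ADAM

model.train()
for epoch in range(20):
    logits = model(X).squeeze(1)
    preds = torch.sigmoid(logits)  # scores in [0, 1], as the AP loss expects
    loss = criterion(preds, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(f"final loss: {loss.item():.4f}")
```

With the real `APLoss_SH`, the sample `index` from the dataloader is also passed to the loss, since SOAP maintains per-example moving-average estimates.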
To download the code that reproduces the table results reported in the NeurIPS 2021 paper, go to the Graph/ and Image/ subdirectories and follow the README.md inside each.
If you find this repo helpful, please cite the following paper:
@article{qi2021stochastic,
title={Stochastic Optimization of Area Under Precision-Recall Curve for Deep Learning with Provable Convergence},
author={Qi, Qi and Luo, Youzhi and Xu, Zhao and Ji, Shuiwang and Yang, Tianbao},
journal={arXiv preprint arXiv:2104.08736},
year={2021}
}
If you have any questions, please contact Qi Qi [qi-qi@uiowa.edu] or Tianbao Yang [tianbao-yang@uiowa.edu], or open a new issue on GitHub.