LibAUC 1.3.0
Introducing LibAUC 1.3.0
We are thrilled to release LibAUC 1.3.0! This version brings a number of improvements and new features to the library. We have also launched a new documentation website at https://docs.libauc.org/, where you can browse the source code and its documentation. Finally, we are happy to announce that our LibAUC paper has been accepted by KDD 2023!
Major Improvements
- Improved the implementations of `DualSampler` and `TriSampler` for better efficiency.
- Merged the `DataSampler` for `NDCGLoss` into `TriSampler` and added a new string argument `mode` to switch between classification mode (for multi-label classification) and ranking mode (for movie recommendations).
- Improved `AUCMLoss` and added a new version, v2 (requires `DualSampler`), which removes the class prior `p` required by the previous version, v1. To switch versions, set `version='v1'` or `version='v2'` in `AUCMLoss` (see the sketch after this list).
- Improved `CompositionalAUCLoss`, which now allows multiple updates for optimizing the inner loss by setting `k` in the loss. Similar to `AUCMLoss`, we introduced a v2 version of this loss that does not use the class prior `p`. By default, `k` is 1 and the version is v1.
- Improved code quality of `APLoss` and `pAUCLoss` (including `pAUC_CVaR_Loss`, `pAUC_DRO_Loss`, and `tpAUC_KL_Loss`) for better efficiency and readability.
- API change for all optimizer methods: please pass `model.parameters()` to the optimizer instead of `model`, e.g., `PESG(model.parameters())`, as shown below.
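As a minimal, illustrative sketch of how these pieces fit together (the `libauc.losses`, `libauc.optimizers`, and `libauc.sampler` import paths, the `labels`/`sampling_rate` keywords of `DualSampler`, and the `loss_fn`/`lr` keywords of `PESG` are assumptions based on typical LibAUC usage, not a definitive API reference):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from libauc.losses import AUCMLoss        # import paths assumed
from libauc.optimizers import PESG
from libauc.sampler import DualSampler

# A toy imbalanced binary dataset and a tiny model, just to keep the sketch self-contained.
X = torch.randn(1000, 10)
y = (torch.rand(1000) < 0.1).float()      # ~10% positive labels
train_dataset = TensorDataset(X, y)
model = torch.nn.Linear(10, 1)

# v2 of AUCMLoss no longer needs the class prior p, but it expects batches drawn by DualSampler.
loss_fn = AUCMLoss(version='v2')

# DualSampler controls the positive/negative mix in each mini-batch
# (the labels/sampling_rate keywords are illustrative assumptions).
sampler = DualSampler(train_dataset, batch_size=64, labels=y, sampling_rate=0.5)
train_loader = DataLoader(train_dataset, batch_size=64, sampler=sampler)

# v1.3.0 API change: pass model.parameters() to the optimizer, not the model itself.
optimizer = PESG(model.parameters(), loss_fn=loss_fn, lr=0.1)

for data, targets in train_loader:
    preds = torch.sigmoid(model(data)).squeeze(-1)
    loss = loss_fn(preds, targets)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The key v1.3.0 change here is `PESG(model.parameters(), ...)`, which replaces passing the model object directly.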
New Features
- Launched an official documentation site at http://docs.libauc.org/ with source code and parameter information.
- Introduced a new library logo for X-Risk, designed by Zhuoning Yuan and Tianbao Yang.
- Introduced MIDAM for multi-instance learning. It supports two pooling functions: `MIDAMLoss('softmax')` for softmax pooling and `MIDAMLoss('attention')` for attention-based pooling.
- Introduced a new `GCLoss` wrapper for contrastive self-supervised learning, which can be optimized by two algorithms in the backend: SogCLR and iSogCLR.
- Introduced iSogCLR for automatic temperature individualization in self-supervised contrastive learning. To use iSogCLR, set `GCLoss('unimodal', enable_isogclr=True)` or `GCLoss('bimodal', enable_isogclr=True)` (see the construction sketch after this list).
- Introduced three new multi-label losses: `mAPLoss` for optimizing mean AP, `MultiLabelAUCMLoss` for multi-label AUC, and `MultiLabelpAUCLoss` for multi-label partial AUC.
- Introduced `PairwiseAUCLoss` to support optimization of traditional pairwise AUC losses.
- Added more evaluation metrics: `ndcg_at_k`, `map_at_k`, `precision_at_k`, and `recall_at_k` (illustrated after this list).
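For quick reference, the new loss wrappers quoted above can be constructed as shown below. This is a minimal sketch assuming the `libauc.losses` import path; some losses may require additional dataset-specific arguments (e.g., dataset size or number of labels) not shown here.

```python
from libauc.losses import MIDAMLoss, GCLoss  # import path assumed

# Multi-instance learning (MIDAM) with the two supported pooling functions.
midam_softmax = MIDAMLoss('softmax')        # softmax pooling
midam_attention = MIDAMLoss('attention')    # attention-based pooling

# Contrastive self-supervised learning; SogCLR runs in the backend by default.
gcl = GCLoss('unimodal')

# iSogCLR: automatic temperature individualization, for unimodal or bimodal data.
gcl_isogclr_uni = GCLoss('unimodal', enable_isogclr=True)
gcl_isogclr_bi = GCLoss('bimodal', enable_isogclr=True)

# mAPLoss, MultiLabelAUCMLoss, MultiLabelpAUCLoss, and PairwiseAUCLoss are constructed
# similarly; see the documentation for their required arguments.
```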
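The new ranking metrics follow the standard top-k definitions. As a plain-Python illustration of what precision@k and NDCG@k compute (this is not the library's implementation; the helper names below are purely local):

```python
import math

def precision_at_k_demo(relevant: set, ranked: list, k: int) -> float:
    """Fraction of the top-k ranked items that are relevant."""
    top_k = ranked[:k]
    return sum(item in relevant for item in top_k) / k

def ndcg_at_k_demo(relevance: list, k: int) -> float:
    """NDCG@k for a list of graded relevances given in ranked order."""
    dcg = sum(rel / math.log2(i + 2) for i, rel in enumerate(relevance[:k]))
    ideal = sorted(relevance, reverse=True)
    idcg = sum(rel / math.log2(i + 2) for i, rel in enumerate(ideal[:k]))
    return dcg / idcg if idcg > 0 else 0.0

# Example: items ranked by predicted score, with ground-truth relevance.
print(precision_at_k_demo({"a", "c"}, ["a", "b", "c", "d"], k=3))  # 2/3
print(ndcg_at_k_demo([3, 0, 2, 1], k=3))
```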
Acknowledgment
Team: Zhuoning Yuan, Dixian Zhu, Zi-Hao Qiu, Gang Li, Tianbao Yang (Advisor)
Feedback
We value your thoughts and feedback! Please fill out this brief survey to help guide our future development. Thank you for your time! For other questions, please contact Zhuoning Yuan [yzhuoning@gmail.com] or Tianbao Yang [tianbao-yang@tamu.edu].