Padam

This repository contains our PyTorch implementation of the Partially Adaptive Momentum Estimation method (Padam) from the paper Closing the Generalization Gap of Adaptive Gradient Methods in Training Deep Neural Networks (accepted by IJCAI 2020).
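The core idea in the paper is a single change to AMSGrad's update: the second-moment denominator is raised to a partially adaptive power p in (0, 1/2], where p = 1/2 recovers AMSGrad and p close to 0 approaches SGD with momentum. Below is a minimal illustrative sketch of one update step, re-derived from the paper for reference; it is not the code shipped in this repository, and `padam_step`, its defaults, and the eps placement are our own choices (bias correction is omitted for brevity):

```python
import torch

def padam_step(param, grad, state, lr=0.1, betas=(0.9, 0.999),
               partial=0.125, eps=1e-8):
    """One Padam update on a single tensor (illustrative sketch only).

    Padam keeps AMSGrad's moment estimates but raises the second-moment
    denominator to a power p = `partial` in (0, 1/2]:
        theta <- theta - lr * m / (v_max + eps)^p
    p = 1/2 recovers AMSGrad; p -> 0 approaches SGD with momentum.
    """
    beta1, beta2 = betas
    if not state:  # lazily initialize the moment buffers
        state["m"] = torch.zeros_like(param)
        state["v"] = torch.zeros_like(param)
        state["v_max"] = torch.zeros_like(param)
    m, v, v_max = state["m"], state["v"], state["v_max"]

    m.mul_(beta1).add_(grad, alpha=1 - beta1)            # first moment
    v.mul_(beta2).addcmul_(grad, grad, value=1 - beta2)  # second moment
    torch.maximum(v_max, v, out=v_max)                   # AMSGrad max step

    denom = v_max.add(eps).pow(partial)  # partially adaptive denominator
    param.addcdiv_(m, denom, value=-lr)

# quick sanity check: minimize ||w||^2
w = torch.full((3,), 5.0)
state = {}
for _ in range(200):
    padam_step(w, 2 * w, state, lr=0.1)
print(w)  # entries decay toward 0
```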

Prerequisites:

  • PyTorch
  • CUDA

Usage:

Run run_cnn_test_cifar10.py for experiments on CIFAR-10 and run_cnn_test_cifar100.py for experiments on CIFAR-100.

Command Line Arguments:

  • --lr: initial learning rate
  • --method: optimization method, e.g., "sgdm", "adam", "amsgrad", "padam"
  • --net: network architecture, e.g., "vggnet", "resnet", "wideresnet"
  • --partial: partially adaptive parameter p for the Padam method
  • --wd: weight decay
  • --Nepoch: number of training epochs
  • --resume: whether to resume from a previous training checkpoint

Usage Examples:

  • Run experiments on CIFAR-10:
  -  python run_cnn_test_cifar10.py --lr 0.1 --method "padam" --net "vggnet" --partial 0.125 --wd 5e-4
  • Run experiments on CIFAR-100:
  -  python run_cnn_test_cifar100.py --lr 0.1 --method "padam" --net "resnet" --partial 0.125 --wd 5e-4
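To use the optimizer in your own training loop rather than through the provided scripts, a hypothetical drop-in sketch follows. The import path and constructor arguments here are assumptions chosen to mirror the CLI flags above; check Padam.py in this repository for the exact class name and signature:

```python
# Hypothetical drop-in usage in a custom training loop. The import path and
# constructor signature are assumptions; check Padam.py in this repository.
import torch
import torchvision
from Padam import Padam  # adjust if the repo layout differs

model = torchvision.models.resnet18(num_classes=10)
optimizer = Padam(model.parameters(), lr=0.1, partial=0.125,
                  weight_decay=5e-4)  # mirrors the CLI flags above

criterion = torch.nn.CrossEntropyLoss()
inputs = torch.randn(8, 3, 32, 32)    # stand-in for a CIFAR-10 batch
targets = torch.randint(0, 10, (8,))

optimizer.zero_grad()
loss = criterion(model(inputs), targets)
loss.backward()
optimizer.step()
```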

Citation

Please check our paper for technical details and full results.

@inproceedings{chen2020closing,
  title={Closing the Generalization Gap of Adaptive Gradient Methods in Training Deep Neural Networks},
  author={Chen, Jinghui and Zhou, Dongruo and Tang, Yiqi and Yang, Ziyan and Cao, Yuan and Gu, Quanquan},
  booktitle={International Joint Conferences on Artificial Intelligence},
  year={2020}
}
