# Train CIFAR10 with PyTorch
I'm playing with PyTorch on the CIFAR10 dataset.
## Pros & cons
Pros:
- Built-in data loading and augmentation, very nice! (See the sketch after this list.)
- Training is fast, maybe even a little bit faster.
- Very memory efficient!
Cons:
- No progress bar, sad :(
- No built-in log.
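
As a rough illustration of the built-in loading and augmentation, here is a minimal sketch using `torchvision`; the exact transforms and hyperparameters in `main.py` may differ, and the normalization statistics below are the commonly used CIFAR10 values, assumed here rather than taken from the repo.

```python
import torch
import torchvision
import torchvision.transforms as transforms

# Typical CIFAR10 training augmentation: random crop with padding, horizontal flip,
# then normalization with commonly used per-channel mean/std (assumed values).
transform_train = transforms.Compose([
    transforms.RandomCrop(32, padding=4),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)),
])

# torchvision downloads and unpacks the dataset automatically.
trainset = torchvision.datasets.CIFAR10(
    root='./data', train=True, download=True, transform=transform_train)
trainloader = torch.utils.data.DataLoader(
    trainset, batch_size=128, shuffle=True, num_workers=2)
```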
## Accuracy
| Model | Acc. |
|---|---|
| VGG16 | 92.64% |
| ResNet18 | 93.02% |
| ResNet50 | 93.62% |
| ResNet101 | 93.75% |
| MobileNetV2 | 94.43% |
| ResNeXt29(32x4d) | 94.73% |
| ResNeXt29(2x64d) | 94.82% |
| DenseNet121 | 95.04% |
| PreActResNet18 | 95.11% |
| DPN92 | 95.16% |
## Learning rate adjustment
I manually change the learning rate during training (see the sketch after the list):
- 0.1 for epoch [0, 150)
- 0.01 for epoch [150, 250)
- 0.001 for epoch [250, 350)
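
A small sketch of this piecewise schedule is below. The helper `adjust_learning_rate` is hypothetical (not a function from `main.py`, where the same effect is achieved by restarting with a new `--lr`), and the SGD hyperparameters are typical values, assumed here.

```python
import torch.nn as nn
import torch.optim as optim

def adjust_learning_rate(optimizer, epoch):
    """Set the learning rate according to the piecewise schedule above."""
    if epoch < 150:
        lr = 0.1
    elif epoch < 250:
        lr = 0.01
    else:
        lr = 0.001
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr

# Example usage with a dummy model; hyperparameters are assumed, not from main.py.
model = nn.Linear(3 * 32 * 32, 10)
optimizer = optim.SGD(model.parameters(), lr=0.1, momentum=0.9, weight_decay=5e-4)
for epoch in range(350):
    adjust_learning_rate(optimizer, epoch)
    # ... run one training epoch here ...
```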
Resume training with `python main.py --resume --lr=0.01`.