For experiments on ViTs, we select backbones from the DeiT and PVT families. This code is built on top of the official implementations of DeiT and PVT.
Results for ViT backbones trained on ImageNet, with and without NOAH:
| Backbones | Name | Params | | | Top-1 Acc(%) | Google Drive |
|---|---|---|---|---|---|---|
| DeiT-Base | deit_base_patch16_224 | 86.86M | 4 | 1/2 | 81.85 | model |
| + NOAH | noah_deit_base_patch16_224 | 86.86M | 4 | 1/2 | 82.22 | model |
| DeiT-Small | deit_small_patch16_224 | 22.06M | 4 | 1/2 | 79.78 | model |
| + NOAH | noah_deit_small_patch16_224 | 22.06M | 4 | 1/2 | 80.56 | model |
| DeiT-Tiny (×1.0) | deit_tiny_patch16_224 | 5.72M | 4 | 1/2 | 72.16 | model |
| + NOAH | noah_deit_tiny_patch16_224 | 5.72M | 4 | 1/2 | 74.29 | model |
| DeiT-Tiny (×0.75) | deit_tiny_075_patch16_224 | 3.29M | 4 | 1/2 | 62.55 | model |
| + NOAH | noah_deit_tiny_075_patch16_224 | 3.30M | 4 | 1/2 | 66.64 | model |
| DeiT-Tiny (×0.5) | deit_tiny_050_patch16_224 | 1.53M | 4 | 1/2 | 51.36 | model |
| + NOAH | noah_deit_tiny_050_patch16_224 | 1.54M | 4 | 1/2 | 56.66 | model |
| Backbones | Name | Params | | | Top-1 Acc(%) | Google Drive |
|---|---|---|---|---|---|---|
| PVT-Tiny (×1.0) | pvt_tiny | 13.23M | 4 | 1/2 | 75.10 | model |
| + NOAH | noah_pvt_tiny | 13.24M | 4 | 1/2 | 76.51 | model |
| PVT-Tiny (×0.75) | pvt_tiny_075 | 7.62M | 4 | 1/2 | 71.81 | model |
| + NOAH | noah_pvt_tiny_075 | 7.62M | 4 | 1/2 | 74.22 | model |
| PVT-Tiny (×0.5) | pvt_tiny_050 | 3.54M | 4 | 1/2 | 65.33 | model |
| + NOAH | noah_pvt_tiny_050 | 3.55M | 4 | 1/2 | 68.50 | model |
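The checkpoints in the tables above are hosted on Google Drive. If you prefer to fetch them from the command line, one option is the third-party `gdown` tool (not part of this repository); the file id placeholder and the output filename below are illustrative:

```
# optional: download a checkpoint from its Google Drive file id
pip install gdown
gdown {google drive file id} -O ./checkpoints/noah_deit_tiny_patch16_224.pth
```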
Please follow the DeiT repository's instructions to prepare the environment, then add our code to the original project.
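For reference, a minimal setup sketch following the DeiT README (check that repository for the exact PyTorch, torchvision, and timm versions it pins):

```
# clone the official DeiT codebase and install its core dependencies
git clone https://github.com/facebookresearch/deit.git
cd deit
conda install -c pytorch pytorch torchvision
pip install timm==0.3.2
# then copy the NOAH model and training files from this repository into the DeiT project
```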
To train DeiT models:
```
python -m torch.distributed.launch --nproc_per_node={ngpus} --use_env main.py \
--model {model name} --batch-size {batch size} --data-path {path to dataset} --output_dir {path to checkpoint}
```
For example, you can use the following command to train DeiT-Tiny with NOAH:
```
python -m torch.distributed.launch --nproc_per_node=8 --use_env main.py \
--model noah_deit_tiny_patch16_224 --batch-size 128 --data-path ./datasets/ILSVRC2012 --output_dir ./checkpoints/noah_deit_tiny
```
To evaluate a pre-trained DeiT model:
```
python main.py --eval --resume {path to model} --model {model name} --data-path {path to dataset}
```
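For instance, to evaluate the NOAH DeiT-Tiny checkpoint from the table above (the checkpoint filename is a placeholder for the file downloaded from Google Drive):

```
# evaluate a downloaded NOAH DeiT-Tiny checkpoint on the ImageNet validation set
python main.py --eval --resume ./checkpoints/noah_deit_tiny_patch16_224.pth \
--model noah_deit_tiny_patch16_224 --data-path ./datasets/ILSVRC2012
```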
Please follow the PVT repository's instructions to prepare the environment, then add our code to the original project.
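As with DeiT, a rough setup sketch (see the PVT repository for the exact dependency versions):

```
# clone the official PVT codebase; its classification code also builds on DeiT/timm
git clone https://github.com/whai362/PVT.git
cd PVT/classification
pip install timm==0.3.2
# then copy the NOAH model and config files from this repository into the PVT project
```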
To train PVT models:
```
bash dist_train.sh {path to config file} {ngpus} --data-path {path to dataset}
```
For example, you can use the following command to train PVT-Tiny with NOAH:
```
bash dist_train.sh configs/pvt/noah_pvt_tiny.py 8 --data-path ./datasets/ILSVRC2012
```
To evaluate a pre-trained PVT model:
```
bash dist_train.sh {path to config file} {ngpus} --data-path {path to dataset} --resume {path to model} --eval
```
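For instance, to evaluate the NOAH PVT-Tiny checkpoint from the table above (the config path and checkpoint filename follow the naming convention used here and may differ in your checkout):

```
# evaluate a downloaded NOAH PVT-Tiny checkpoint; adjust the config path and
# checkpoint filename to match the actual files on your machine
bash dist_train.sh configs/pvt/noah_pvt_tiny.py 8 --data-path ./datasets/ILSVRC2012 \
--resume ./checkpoints/noah_pvt_tiny.pth --eval
```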