open-print

Published: 2023-12-09 15:52:57  Author: 太好了还有脑子可以用

1. Stage 1 print output

/home/zy/anaconda3/envs/py/bin/python /home/zy/pycharm/project/OpenLongTailRecognition-OLTR/main.py
Loading dataset from: /home/zy/pycharm/project/ImageNet2012
{'criterions': {'PerformanceLoss': {'def_file': './loss/SoftmaxLoss.py',
                                    'loss_params': {},
                                    'optim_params': None,
                                    'weight': 1.0}},
 'memory': {'centroids': False, 'init_centroids': False},
 'networks': {'classifier': {'def_file': './models/DotProductClassifier.py',
                             'optim_params': {'lr': 0.1,
                                              'momentum': 0.9,
                                              'weight_decay': 0.0005},
                             'params': {'dataset': 'ImageNet_LT',
                                        'in_dim': 512,
                                        'num_classes': 1000,
                                        'stage1_weights': False}},
              'feat_model': {'def_file': './models/ResNet10Feature.py',
                             'fix': False,
                             'optim_params': {'lr': 0.1,
                                              'momentum': 0.9,
                                              'weight_decay': 0.0005},
                             'params': {'dataset': 'ImageNet_LT',
                                        'dropout': None,
                                        'stage1_weights': False,
                                        'use_fc': False,
                                        'use_modulatedatt': False}}},
 'training_opt': {'batch_size': 128,
                  'dataset': 'ImageNet_LT',
                  'display_step': 10,
                  'feature_dim': 512,
                  'log_dir': './logs/ImageNet_LT/stage1',
                  'num_classes': 1000,
                  'num_epochs': 30,
                  'num_workers': 8,
                  'open_threshold': 0.1,
                  'sampler': None,
                  'scheduler_params': {'gamma': 0.1, 'step_size': 10}}}
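
For reference, 'scheduler_params': {'gamma': 0.1, 'step_size': 10} presumably corresponds to a standard step decay: starting from the base LR of 0.1, the rate drops to 0.01 after epoch 10 and to 0.001 after epoch 20. A minimal PyTorch sketch of that schedule; the model and optimizer here are placeholders, not the repo's actual objects:

import torch

# Placeholder model; the repo builds its own networks from the config above.
model = torch.nn.Linear(512, 1000)
# Matches the 'optim_params' above: lr=0.1, momentum=0.9, weight_decay=0.0005.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9, weight_decay=0.0005)
# Matches 'scheduler_params': {'gamma': 0.1, 'step_size': 10}.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.1)

for epoch in range(30):  # 'num_epochs': 30
    # ... train one epoch ...
    scheduler.step()  # LR: 0.1 for epochs 1-10, 0.01 for 11-20, 0.001 for 21-30
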
Enter load_data
I want to know  phase:%s
Loading data from ./data/ImageNet_LT/ImageNet_LT_train.txt
No sampler.
Shuffle is True.
Enter load_data
I want to know  phase:%s
Loading data from ./data/ImageNet_LT/ImageNet_LT_val.txt
No sampler.
Shuffle is True.
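
The literal 'phase:%s' in the two debug lines above suggests the print statement was left without its formatting argument. A guess at the intended statement; the variable name phase is assumed from context:

# Presumed intent: interpolate the current phase ('train' / 'val') into the message.
print('I want to know  phase:%s' % phase)
# or, in f-string form:
print(f'I want to know  phase:{phase}')
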
Using 1 GPUs.
Loading Scratch ResNet 10 Feature Model.
No Pretrained Weights For Feature Model.
Loading Dot Product Classifier.
Random initialized classifier weights.
------Enter run_networks train----
Using steps for training.
Initializing model optimizer.
Loading Softmax Loss.
Phase: train
Epoch: [1/30] Step:     0  Minibatch_loss_performance: 6.958 Minibatch_accuracy_micro: 0.000
Epoch: [1/30] Step:   900  Minibatch_loss_performance: 5.773 Minibatch_accuracy_micro: 0.039
Phase: val
100%|██████████| 157/157 [00:27<00:00,  5.66it/s]
 
 
 Phase: val 
 
 Overall accuracy (Evaluation_accuracy_micro_top1): 0.016 
 Averaged F-measure: 0.004 
 Many-shot accuracy (Many_shot_accuracy_top1): 0.042  Median-shot accuracy (Median_shot_accuracy_top1): 0.000  Low-shot accuracy (Low_shot_accuracy_top1): 0.000 
 
Epoch: [2/30] Step:     0  Minibatch_loss_performance: 5.838 Minibatch_accuracy_micro: 0.070
Epoch: [2/30] Step:   900  Minibatch_loss_performance: 5.226 Minibatch_accuracy_micro: 0.102
Phase: val
100%|██████████| 157/157 [00:27<00:00,  5.64it/s]
 
 
 Phase: val 
 
 Overall accuracy (Evaluation_accuracy_micro_top1): 0.022 
 Averaged F-measure: 0.008 
 Many-shot accuracy (Many_shot_accuracy_top1): 0.053  Median-shot accuracy (Median_shot_accuracy_top1): 0.002  Low-shot accuracy (Low_shot_accuracy_top1): 0.000 
 
Epoch: [3/30] Step:     0  Minibatch_loss_performance: 5.210 Minibatch_accuracy_micro: 0.062
Epoch: [3/30] Step:   900  Minibatch_loss_performance: 4.887 Minibatch_accuracy_micro: 0.133
Phase: val
100%|██████████| 157/157 [00:26<00:00,  5.92it/s]
 
 
 Phase: val 
 
 Overall accuracy (Evaluation_accuracy_micro_top1): 0.033 
 Averaged F-measure: 0.013 
 Many-shot accuracy (Many_shot_accuracy_top1): 0.084  Median-shot accuracy (Median_shot_accuracy_top1): 0.001  Low-shot accuracy (Low_shot_accuracy_top1): 0.000 
 
Epoch: [4/30] Step:     0  Minibatch_loss_performance: 4.857 Minibatch_accuracy_micro: 0.117
Phase: val
100%|██████████| 157/157 [00:24<00:00,  9.53it/s]
 
 
 Phase: val 
 
 Overall accuracy (Evaluation_accuracy_micro_top1): 0.259 
 Averaged F-measure: 0.211 
 Many-shot accuracy (Many_shot_accuracy_top1): 0.493  Median-shot accuracy (Median_shot_accuracy_top1): 0.143  Low-shot accuracy (Low_shot_accuracy_top1): 0.006 
 
Epoch: [30/30] Step:     0  Minibatch_loss_performance: 2.552 Minibatch_accuracy_micro: 0.445
Epoch: [30/30] Step:    10  Minibatch_loss_performance: 2.570 Minibatch_accuracy_micro: 0.469
Epoch: [30/30] Step:    20  Minibatch_loss_performance: 2.714 Minibatch_accuracy_micro: 0.453
Epoch: [30/30] Step:    30  Minibatch_loss_performance: 2.458 Minibatch_accuracy_micro: 0.492
Epoch: [30/30] Step:    40  Minibatch_loss_performance: 2.490 Minibatch_accuracy_micro: 0.477
Epoch: [30/30] Step:    50  Minibatch_loss_performance: 2.590 Minibatch_accuracy_micro: 0.398
Epoch: [30/30] Step:    60  Minibatch_loss_performance: 2.514 Minibatch_accuracy_micro: 0.492
Epoch: [30/30] Step:    70  Minibatch_loss_performance: 2.822 Minibatch_accuracy_micro: 0.398
Epoch: [30/30] Step:    80  Minibatch_loss_performance: 2.642 Minibatch_accuracy_micro: 0.375
Epoch: [30/30] Step:    90  Minibatch_loss_performance: 2.389 Minibatch_accuracy_micro: 0.477
Epoch: [30/30] Step:   100  Minibatch_loss_performance: 2.305 Minibatch_accuracy_micro: 0.500
Epoch: [30/30] Step:   110  Minibatch_loss_performance: 2.620 Minibatch_accuracy_micro: 0.430
Epoch: [30/30] Step:   120  Minibatch_loss_performance: 2.502 Minibatch_accuracy_micro: 0.453
Epoch: [30/30] Step:   130  Minibatch_loss_performance: 2.354 Minibatch_accuracy_micro: 0.461
Epoch: [30/30] Step:   140  Minibatch_loss_performance: 2.532 Minibatch_accuracy_micro: 0.469
Epoch: [30/30] Step:   150  Minibatch_loss_performance: 2.524 Minibatch_accuracy_micro: 0.492
Epoch: [30/30] Step:   160  Minibatch_loss_performance: 2.945 Minibatch_accuracy_micro: 0.391
Epoch: [30/30] Step:   170  Minibatch_loss_performance: 2.498 Minibatch_accuracy_micro: 0.461
Epoch: [30/30] Step:   180  Minibatch_loss_performance: 2.608 Minibatch_accuracy_micro: 0.445
Epoch: [30/30] Step:   190  Minibatch_loss_performance: 2.591 Minibatch_accuracy_micro: 0.453
Epoch: [30/30] Step:   200  Minibatch_loss_performance: 2.370 Minibatch_accuracy_micro: 0.531
Epoch: [30/30] Step:   210  Minibatch_loss_performance: 2.481 Minibatch_accuracy_micro: 0.523
Epoch: [30/30] Step:   220  Minibatch_loss_performance: 2.447 Minibatch_accuracy_micro: 0.469
Epoch: [30/30] Step:   230  Minibatch_loss_performance: 2.541 Minibatch_accuracy_micro: 0.461
Epoch: [30/30] Step:   240  Minibatch_loss_performance: 2.581 Minibatch_accuracy_micro: 0.492
Epoch: [30/30] Step:   250  Minibatch_loss_performance: 2.785 Minibatch_accuracy_micro: 0.438
Epoch: [30/30] Step:   260  Minibatch_loss_performance: 2.376 Minibatch_accuracy_micro: 0.500
Epoch: [30/30] Step:   270  Minibatch_loss_performance: 2.592 Minibatch_accuracy_micro: 0.414
Epoch: [30/30] Step:   280  Minibatch_loss_performance: 2.532 Minibatch_accuracy_micro: 0.445
Epoch: [30/30] Step:   290  Minibatch_loss_performance: 2.585 Minibatch_accuracy_micro: 0.438
Epoch: [30/30] Step:   300  Minibatch_loss_performance: 3.031 Minibatch_accuracy_micro: 0.414
Epoch: [30/30] Step:   310  Minibatch_loss_performance: 2.546 Minibatch_accuracy_micro: 0.398
Epoch: [30/30] Step:   320  Minibatch_loss_performance: 2.952 Minibatch_accuracy_micro: 0.422
Epoch: [30/30] Step:   330  Minibatch_loss_performance: 2.688 Minibatch_accuracy_micro: 0.516
Epoch: [30/30] Step:   340  Minibatch_loss_performance: 2.745 Minibatch_accuracy_micro: 0.359
Epoch: [30/30] Step:   350  Minibatch_loss_performance: 2.765 Minibatch_accuracy_micro: 0.422
Epoch: [30/30] Step:   360  Minibatch_loss_performance: 2.433 Minibatch_accuracy_micro: 0.508
Epoch: [30/30] Step:   370  Minibatch_loss_performance: 2.718 Minibatch_accuracy_micro: 0.484
Epoch: [30/30] Step:   380  Minibatch_loss_performance: 2.307 Minibatch_accuracy_micro: 0.500
Epoch: [30/30] Step:   390  Minibatch_loss_performance: 2.836 Minibatch_accuracy_micro: 0.445
Epoch: [30/30] Step:   400  Minibatch_loss_performance: 2.403 Minibatch_accuracy_micro: 0.453
Epoch: [30/30] Step:   410  Minibatch_loss_performance: 2.764 Minibatch_accuracy_micro: 0.406
Epoch: [30/30] Step:   420  Minibatch_loss_performance: 2.469 Minibatch_accuracy_micro: 0.516
Epoch: [30/30] Step:   430  Minibatch_loss_performance: 2.680 Minibatch_accuracy_micro: 0.414
Epoch: [30/30] Step:   440  Minibatch_loss_performance: 2.146 Minibatch_accuracy_micro: 0.562
Epoch: [30/30] Step:   450  Minibatch_loss_performance: 2.557 Minibatch_accuracy_micro: 0.445
Epoch: [30/30] Step:   460  Minibatch_loss_performance: 2.604 Minibatch_accuracy_micro: 0.453
Epoch: [30/30] Step:   470  Minibatch_loss_performance: 2.752 Minibatch_accuracy_micro: 0.391
Epoch: [30/30] Step:   480  Minibatch_loss_performance: 2.318 Minibatch_accuracy_micro: 0.547
Epoch: [30/30] Step:   490  Minibatch_loss_performance: 2.462 Minibatch_accuracy_micro: 0.438
Epoch: [30/30] Step:   500  Minibatch_loss_performance: 2.541 Minibatch_accuracy_micro: 0.477
Epoch: [30/30] Step:   510  Minibatch_loss_performance: 2.681 Minibatch_accuracy_micro: 0.398
Epoch: [30/30] Step:   520  Minibatch_loss_performance: 2.608 Minibatch_accuracy_micro: 0.492
Epoch: [30/30] Step:   530  Minibatch_loss_performance: 2.566 Minibatch_accuracy_micro: 0.430
Epoch: [30/30] Step:   540  Minibatch_loss_performance: 2.698 Minibatch_accuracy_micro: 0.430
Epoch: [30/30] Step:   550  Minibatch_loss_performance: 2.329 Minibatch_accuracy_micro: 0.523
Epoch: [30/30] Step:   560  Minibatch_loss_performance: 2.531 Minibatch_accuracy_micro: 0.477
Epoch: [30/30] Step:   570  Minibatch_loss_performance: 2.595 Minibatch_accuracy_micro: 0.453
Epoch: [30/30] Step:   580  Minibatch_loss_performance: 2.625 Minibatch_accuracy_micro: 0.461
Epoch: [30/30] Step:   590  Minibatch_loss_performance: 2.839 Minibatch_accuracy_micro: 0.398
Epoch: [30/30] Step:   600  Minibatch_loss_performance: 3.175 Minibatch_accuracy_micro: 0.383
Epoch: [30/30] Step:   610  Minibatch_loss_performance: 2.679 Minibatch_accuracy_micro: 0.367
Epoch: [30/30] Step:   620  Minibatch_loss_performance: 2.710 Minibatch_accuracy_micro: 0.422
Epoch: [30/30] Step:   630  Minibatch_loss_performance: 2.895 Minibatch_accuracy_micro: 0.414
Epoch: [30/30] Step:   640  Minibatch_loss_performance: 2.462 Minibatch_accuracy_micro: 0.484
Epoch: [30/30] Step:   650  Minibatch_loss_performance: 2.285 Minibatch_accuracy_micro: 0.516
Epoch: [30/30] Step:   660  Minibatch_loss_performance: 2.644 Minibatch_accuracy_micro: 0.453
Epoch: [30/30] Step:   670  Minibatch_loss_performance: 2.943 Minibatch_accuracy_micro: 0.422
Epoch: [30/30] Step:   680  Minibatch_loss_performance: 2.542 Minibatch_accuracy_micro: 0.508
Epoch: [30/30] Step:   690  Minibatch_loss_performance: 2.716 Minibatch_accuracy_micro: 0.453
Epoch: [30/30] Step:   700  Minibatch_loss_performance: 2.673 Minibatch_accuracy_micro: 0.438
Epoch: [30/30] Step:   710  Minibatch_loss_performance: 2.660 Minibatch_accuracy_micro: 0.445
Epoch: [30/30] Step:   720  Minibatch_loss_performance: 3.312 Minibatch_accuracy_micro: 0.359
Epoch: [30/30] Step:   730  Minibatch_loss_performance: 2.430 Minibatch_accuracy_micro: 0.484
Epoch: [30/30] Step:   740  Minibatch_loss_performance: 2.376 Minibatch_accuracy_micro: 0.523
Epoch: [30/30] Step:   750  Minibatch_loss_performance: 2.464 Minibatch_accuracy_micro: 0.500
Epoch: [30/30] Step:   760  Minibatch_loss_performance: 2.506 Minibatch_accuracy_micro: 0.438
Epoch: [30/30] Step:   770  Minibatch_loss_performance: 2.920 Minibatch_accuracy_micro: 0.383
Epoch: [30/30] Step:   780  Minibatch_loss_performance: 2.449 Minibatch_accuracy_micro: 0.547
Epoch: [30/30] Step:   790  Minibatch_loss_performance: 3.060 Minibatch_accuracy_micro: 0.430
Epoch: [30/30] Step:   800  Minibatch_loss_performance: 2.656 Minibatch_accuracy_micro: 0.453
Epoch: [30/30] Step:   810  Minibatch_loss_performance: 2.757 Minibatch_accuracy_micro: 0.406
Epoch: [30/30] Step:   820  Minibatch_loss_performance: 2.482 Minibatch_accuracy_micro: 0.445
Epoch: [30/30] Step:   830  Minibatch_loss_performance: 2.679 Minibatch_accuracy_micro: 0.430
Epoch: [30/30] Step:   840  Minibatch_loss_performance: 2.907 Minibatch_accuracy_micro: 0.375
Epoch: [30/30] Step:   850  Minibatch_loss_performance: 2.605 Minibatch_accuracy_micro: 0.445
Epoch: [30/30] Step:   860  Minibatch_loss_performance: 2.474 Minibatch_accuracy_micro: 0.469
Epoch: [30/30] Step:   870  Minibatch_loss_performance: 2.651 Minibatch_accuracy_micro: 0.422
Epoch: [30/30] Step:   880  Minibatch_loss_performance: 2.678 Minibatch_accuracy_micro: 0.414
Epoch: [30/30] Step:   890  Minibatch_loss_performance: 2.647 Minibatch_accuracy_micro: 0.445
Epoch: [30/30] Step:   900  Minibatch_loss_performance: 2.643 Minibatch_accuracy_micro: 0.477
Phase: val
100%|██████████| 157/157 [00:24<00:00,  6.47it/s]
 
 
 Phase: val 
 
 Overall accuracy (Evaluation_accuracy_micro_top1): 0.259 
 Averaged F-measure: 0.210 
 Many-shot accuracy (Many_shot_accuracy_top1): 0.491  Median-shot accuracy (Median_shot_accuracy_top1): 0.142  Low-shot accuracy (Low_shot_accuracy_top1): 0.007 
 
 
Training Complete.
Best validation accuracy is 0.259 at epoch 29
Done
ALL COMPLETED.
 
Process finished with exit code 0

2. Stage 2 (meta-embedding) print output

/home/zy/anaconda3/envs/py/bin/python /home/zy/pycharm/project/OpenLongTailRecognition-OLTR/main.py
Loading dataset from: /home/zy/pycharm/project/ImageNet2012
{'criterions': {'FeatureLoss': {'def_file': './loss/DiscCentroidsLoss.py',
                                'loss_params': {'feat_dim': 512,
                                                'num_classes': 1000},
                                'optim_params': {'lr': 0.01,
                                                 'momentum': 0.9,
                                                 'weight_decay': 0.0005},
                                'weight': 0.01},
                'PerformanceLoss': {'def_file': './loss/SoftmaxLoss.py',
                                    'loss_params': {},
                                    'optim_params': None,
                                    'weight': 1.0}},
 'memory': {'centroids': True, 'init_centroids': True},
 'networks': {'classifier': {'def_file': './models/MetaEmbeddingClassifier.py',
                             'optim_params': {'lr': 0.1,
                                              'momentum': 0.9,
                                              'weight_decay': 0.0005},
                             'params': {'dataset': 'ImageNet_LT',
                                        'in_dim': 512,
                                        'num_classes': 1000,
                                        'stage1_weights': True}},
              'feat_model': {'def_file': './models/ResNet10Feature.py',
                             'fix': False,
                             'optim_params': {'lr': 0.01,
                                              'momentum': 0.9,
                                              'weight_decay': 0.0005},
                             'params': {'dataset': 'ImageNet_LT',
                                        'dropout': None,
                                        'stage1_weights': True,
                                        'use_fc': True,
                                        'use_modulatedatt': True}}},
 'training_opt': {'batch_size': 128,
                  'dataset': 'ImageNet_LT',
                  'display_step': 10,
                  'feature_dim': 512,
                  'log_dir': './logs/ImageNet_LT/meta_embedding',
                  'num_classes': 1000,
                  'num_epochs': 60,
                  'num_workers': 8,
                  'open_threshold': 0.1,
                  'sampler': {'def_file': './data/ClassAwareSampler.py',
                              'num_samples_cls': 4,
                              'type': 'ClassAwareSampler'},
                  'scheduler_params': {'gamma': 0.1, 'step_size': 20}}}
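
Stage 2 optimizes a weighted sum of the two criterions listed above: the softmax (performance) loss with weight 1.0 plus the discriminative-centroids (feature) loss with weight 0.01, each criterion carrying its own optim_params. A minimal sketch of that combination; the criterion and tensor names below are placeholders, not the repo's exact variables:

# Weights come from the 'criterions' config above.
loss_perf = performance_criterion(logits, labels)    # ./loss/SoftmaxLoss.py, weight 1.0
loss_feat = feature_criterion(features, labels)      # ./loss/DiscCentroidsLoss.py, weight 0.01
total_loss = 1.0 * loss_perf + 0.01 * loss_feat
total_loss.backward()
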
Loading data from ./data/ImageNet_LT/ImageNet_LT_train.txt
Use data transformation: Compose(
    RandomResizedCrop(size=(224, 224), scale=(0.08, 1.0), ratio=(0.75, 1.3333), interpolation=bilinear)
    RandomHorizontalFlip(p=0.5)
    ColorJitter(brightness=[0.6, 1.4], contrast=[0.6, 1.4], saturation=[0.6, 1.4], hue=None)
    ToTensor()
    Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])
)
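
The training transform printed above can be rebuilt with standard torchvision calls. A sketch, assuming the ColorJitter range [0.6, 1.4] comes from a jitter factor of 0.4 (the printed scale/ratio values are the RandomResizedCrop defaults):

from torchvision import transforms

train_transform = transforms.Compose([
    transforms.RandomResizedCrop(224),               # scale=(0.08, 1.0), ratio=(0.75, 1.3333) by default
    transforms.RandomHorizontalFlip(),               # p=0.5 by default
    transforms.ColorJitter(brightness=0.4,           # factor 0.4 -> range [0.6, 1.4]
                           contrast=0.4,
                           saturation=0.4),          # hue left at 0, which prints as None
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
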
Using sampler.
Sample 4 samples per-class.
Loading data from ./data/ImageNet_LT/ImageNet_LT_val.txt
Use data transformation: Compose(
    Resize(size=256, interpolation=bilinear)
    CenterCrop(size=(224, 224))
    ToTensor()
    Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])
)
No sampler.
Shuffle is True.
Loading data from ./data/ImageNet_LT/ImageNet_LT_train.txt
Use data transformation: Compose(
    Resize(size=256, interpolation=bilinear)
    CenterCrop(size=(224, 224))
    ToTensor()
    Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])
)
No sampler.
Shuffle is True.
Using 1 GPUs.
Loading Scratch ResNet 10 Feature Model.
Using fc.
Using self attention.
Loading ImageNet_LT Stage 1 ResNet 10 Weights.
Pretrained feature model weights path: ./logs/ImageNet_LT/stage1/final_model_checkpoint.pth
Loading Meta Embedding Classifier.
Loading ImageNet_LT Stage 1 Classifier Weights.
Pretrained classifier weights path: ./logs/ImageNet_LT/stage1/final_model_checkpoint.pth
------Enter run_networks train----
Using steps for training.
Initializing model optimizer.
Loading Softmax Loss.
Loading Discriminative Centroids Loss.
Initializing criterion optimizer.
Calculating centroids.
100%|██████████| 906/906 [02:21<00:00,  6.40it/s]
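
Since 'init_centroids' is True, the pass above presumably computes, for each of the 1000 classes, the mean of the stage-1 features of its training images. A rough sketch of that idea (the loader name is a placeholder), not the repo's exact code:

import torch

num_classes, feat_dim = 1000, 512
centroids = torch.zeros(num_classes, feat_dim)
counts = torch.zeros(num_classes)

# features: (N, 512) extracted by the feature model, labels: (N,) class indices
for features, labels in centroid_loader:  # placeholder loader over the training set
    centroids.index_add_(0, labels, features)
    counts.index_add_(0, labels, torch.ones(labels.size(0)))

centroids /= counts.clamp(min=1).unsqueeze(1)
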
Phase: train
./loss/DiscCentroidsLoss.py:39: UserWarning: This overload of addmm_ is deprecated:
	addmm_(Number beta, Number alpha, Tensor mat1, Tensor mat2)
Consider using one of the following signatures instead:
	addmm_(Tensor mat1, Tensor mat2, *, Number beta, Number alpha) (Triggered internally at  /pytorch/torch/csrc/utils/python_arg_parser.cpp:1005.)
  distmat.addmm_(1, -2, feat.clone(), self.centroids.clone().t())
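
The warning itself spells out the fix: move beta and alpha into keyword arguments of the new addmm_ signature. Applied to line 39 of DiscCentroidsLoss.py, the call would become:

# old (deprecated): distmat.addmm_(1, -2, feat.clone(), self.centroids.clone().t())
distmat.addmm_(feat.clone(), self.centroids.clone().t(), beta=1, alpha=-2)
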
Epoch: [1/60] Step:     0 Minibatch_loss_feature: 0.109 Minibatch_loss_performance: 7.195 Minibatch_accuracy_micro: 0.000
Epoch: [1/60] Step:   900 Minibatch_loss_feature: 0.302 Minibatch_loss_performance: 5.015 Minibatch_accuracy_micro: 0.078
Phase: val
100%|██████████| 157/157 [00:25<00:00,  6.11it/s]
 
 
 Phase: val 
 
 Overall accuracy (Evaluation_accuracy_micro_top1): 0.163 
 Averaged F-measure: 0.142 
 Many-shot accuracy (Many_shot_accuracy_top1): 0.189  Median-shot accuracy (Median_shot_accuracy_top1): 0.155  Low-shot accuracy (Low_shot_accuracy_top1): 0.122 
 
Epoch: [2/60] Step:     0 Minibatch_loss_feature: 0.291 Minibatch_loss_performance: 4.765 Minibatch_accuracy_micro: 0.141
Epoch: [2/60] Step:   900 Minibatch_loss_feature: 0.220 Minibatch_loss_performance: 4.135 Minibatch_accuracy_micro: 0.211
Phase: val
100%|██████████| 157/157 [00:25<00:00,  6.12it/s]
 
 
 Phase: val 
 
 Overall accuracy (Evaluation_accuracy_micro_top1): 0.199 
 Averaged F-measure: 0.179 
 Many-shot accuracy (Many_shot_accuracy_top1): 0.235  Median-shot accuracy (Median_shot_accuracy_top1): 0.189  Low-shot accuracy (Low_shot_accuracy_top1): 0.132 
 
Epoch: [3/60] Step:     0 Minibatch_loss_feature: 0.213 Minibatch_loss_performance: 4.383 Minibatch_accuracy_micro: 0.172
Phase: val
100%|██████████| 157/157 [00:25<00:00,  6.28it/s]
 
 
 Phase: val 
 
 Overall accuracy (Evaluation_accuracy_micro_top1): 0.399 
 Averaged F-measure: 0.382 
 Many-shot accuracy (Many_shot_accuracy_top1): 0.484  Median-shot accuracy (Median_shot_accuracy_top1): 0.391  Low-shot accuracy (Low_shot_accuracy_top1): 0.197 
 
Epoch: [60/60] Step:     0 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.730 Minibatch_accuracy_micro: 0.727
Epoch: [60/60] Step:    10 Minibatch_loss_feature: 0.010 Minibatch_loss_performance: 2.181 Minibatch_accuracy_micro: 0.547
Epoch: [60/60] Step:    20 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.547 Minibatch_accuracy_micro: 0.703
Epoch: [60/60] Step:    30 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.771 Minibatch_accuracy_micro: 0.648
Epoch: [60/60] Step:    40 Minibatch_loss_feature: 0.012 Minibatch_loss_performance: 1.961 Minibatch_accuracy_micro: 0.617
Epoch: [60/60] Step:    50 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.664 Minibatch_accuracy_micro: 0.656
Epoch: [60/60] Step:    60 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.954 Minibatch_accuracy_micro: 0.641
Epoch: [60/60] Step:    70 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.115 Minibatch_accuracy_micro: 0.602
Epoch: [60/60] Step:    80 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.132 Minibatch_accuracy_micro: 0.555
Epoch: [60/60] Step:    90 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.798 Minibatch_accuracy_micro: 0.656
Epoch: [60/60] Step:   100 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.915 Minibatch_accuracy_micro: 0.609
Epoch: [60/60] Step:   110 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.024 Minibatch_accuracy_micro: 0.609
Epoch: [60/60] Step:   120 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.863 Minibatch_accuracy_micro: 0.641
Epoch: [60/60] Step:   130 Minibatch_loss_feature: 0.010 Minibatch_loss_performance: 1.613 Minibatch_accuracy_micro: 0.703
Epoch: [60/60] Step:   140 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.846 Minibatch_accuracy_micro: 0.609
Epoch: [60/60] Step:   150 Minibatch_loss_feature: 0.010 Minibatch_loss_performance: 1.774 Minibatch_accuracy_micro: 0.617
Epoch: [60/60] Step:   160 Minibatch_loss_feature: 0.010 Minibatch_loss_performance: 1.796 Minibatch_accuracy_micro: 0.703
Epoch: [60/60] Step:   170 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.594 Minibatch_accuracy_micro: 0.750
Epoch: [60/60] Step:   180 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.685 Minibatch_accuracy_micro: 0.672
Epoch: [60/60] Step:   190 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.944 Minibatch_accuracy_micro: 0.648
Epoch: [60/60] Step:   200 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.977 Minibatch_accuracy_micro: 0.617
Epoch: [60/60] Step:   210 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.672 Minibatch_accuracy_micro: 0.672
Epoch: [60/60] Step:   220 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.535 Minibatch_accuracy_micro: 0.789
Epoch: [60/60] Step:   230 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.938 Minibatch_accuracy_micro: 0.633
Epoch: [60/60] Step:   240 Minibatch_loss_feature: 0.012 Minibatch_loss_performance: 1.961 Minibatch_accuracy_micro: 0.641
Epoch: [60/60] Step:   250 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.791 Minibatch_accuracy_micro: 0.688
Epoch: [60/60] Step:   260 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.663 Minibatch_accuracy_micro: 0.688
Epoch: [60/60] Step:   270 Minibatch_loss_feature: 0.012 Minibatch_loss_performance: 1.754 Minibatch_accuracy_micro: 0.656
Epoch: [60/60] Step:   280 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.112 Minibatch_accuracy_micro: 0.641
Epoch: [60/60] Step:   290 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.025 Minibatch_accuracy_micro: 0.594
Epoch: [60/60] Step:   300 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.946 Minibatch_accuracy_micro: 0.656
Epoch: [60/60] Step:   310 Minibatch_loss_feature: 0.010 Minibatch_loss_performance: 2.333 Minibatch_accuracy_micro: 0.547
Epoch: [60/60] Step:   320 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.724 Minibatch_accuracy_micro: 0.672
Epoch: [60/60] Step:   330 Minibatch_loss_feature: 0.010 Minibatch_loss_performance: 2.139 Minibatch_accuracy_micro: 0.602
Epoch: [60/60] Step:   340 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.079 Minibatch_accuracy_micro: 0.648
Epoch: [60/60] Step:   350 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.184 Minibatch_accuracy_micro: 0.617
Epoch: [60/60] Step:   360 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.165 Minibatch_accuracy_micro: 0.602
Epoch: [60/60] Step:   370 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.021 Minibatch_accuracy_micro: 0.578
Epoch: [60/60] Step:   380 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.073 Minibatch_accuracy_micro: 0.594
Epoch: [60/60] Step:   390 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.257 Minibatch_accuracy_micro: 0.531
Epoch: [60/60] Step:   400 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.067 Minibatch_accuracy_micro: 0.570
Epoch: [60/60] Step:   410 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.824 Minibatch_accuracy_micro: 0.648
Epoch: [60/60] Step:   420 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.715 Minibatch_accuracy_micro: 0.711
Epoch: [60/60] Step:   430 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.050 Minibatch_accuracy_micro: 0.578
Epoch: [60/60] Step:   440 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.672 Minibatch_accuracy_micro: 0.664
Epoch: [60/60] Step:   450 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.062 Minibatch_accuracy_micro: 0.625
Epoch: [60/60] Step:   460 Minibatch_loss_feature: 0.012 Minibatch_loss_performance: 1.850 Minibatch_accuracy_micro: 0.688
Epoch: [60/60] Step:   470 Minibatch_loss_feature: 0.012 Minibatch_loss_performance: 1.812 Minibatch_accuracy_micro: 0.625
Epoch: [60/60] Step:   480 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.685 Minibatch_accuracy_micro: 0.695
Epoch: [60/60] Step:   490 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.878 Minibatch_accuracy_micro: 0.641
Epoch: [60/60] Step:   500 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.120 Minibatch_accuracy_micro: 0.547
Epoch: [60/60] Step:   510 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.913 Minibatch_accuracy_micro: 0.625
Epoch: [60/60] Step:   520 Minibatch_loss_feature: 0.012 Minibatch_loss_performance: 1.823 Minibatch_accuracy_micro: 0.641
Epoch: [60/60] Step:   530 Minibatch_loss_feature: 0.012 Minibatch_loss_performance: 2.112 Minibatch_accuracy_micro: 0.602
Epoch: [60/60] Step:   540 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.955 Minibatch_accuracy_micro: 0.609
Epoch: [60/60] Step:   550 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.973 Minibatch_accuracy_micro: 0.609
Epoch: [60/60] Step:   560 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.888 Minibatch_accuracy_micro: 0.625
Epoch: [60/60] Step:   570 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.763 Minibatch_accuracy_micro: 0.656
Epoch: [60/60] Step:   580 Minibatch_loss_feature: 0.012 Minibatch_loss_performance: 1.704 Minibatch_accuracy_micro: 0.672
Epoch: [60/60] Step:   590 Minibatch_loss_feature: 0.012 Minibatch_loss_performance: 1.939 Minibatch_accuracy_micro: 0.570
Epoch: [60/60] Step:   600 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.996 Minibatch_accuracy_micro: 0.594
Epoch: [60/60] Step:   610 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.449 Minibatch_accuracy_micro: 0.531
Epoch: [60/60] Step:   620 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.634 Minibatch_accuracy_micro: 0.680
Epoch: [60/60] Step:   630 Minibatch_loss_feature: 0.010 Minibatch_loss_performance: 1.785 Minibatch_accuracy_micro: 0.664
Epoch: [60/60] Step:   640 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.854 Minibatch_accuracy_micro: 0.617
Epoch: [60/60] Step:   650 Minibatch_loss_feature: 0.012 Minibatch_loss_performance: 1.745 Minibatch_accuracy_micro: 0.672
Epoch: [60/60] Step:   660 Minibatch_loss_feature: 0.012 Minibatch_loss_performance: 1.661 Minibatch_accuracy_micro: 0.656
Epoch: [60/60] Step:   670 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.806 Minibatch_accuracy_micro: 0.633
Epoch: [60/60] Step:   680 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.190 Minibatch_accuracy_micro: 0.570
Epoch: [60/60] Step:   690 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.810 Minibatch_accuracy_micro: 0.648
Epoch: [60/60] Step:   700 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.198 Minibatch_accuracy_micro: 0.617
Epoch: [60/60] Step:   710 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.770 Minibatch_accuracy_micro: 0.703
Epoch: [60/60] Step:   720 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.869 Minibatch_accuracy_micro: 0.680
Epoch: [60/60] Step:   730 Minibatch_loss_feature: 0.012 Minibatch_loss_performance: 1.875 Minibatch_accuracy_micro: 0.594
Epoch: [60/60] Step:   740 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.889 Minibatch_accuracy_micro: 0.656
Epoch: [60/60] Step:   750 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.946 Minibatch_accuracy_micro: 0.672
Epoch: [60/60] Step:   760 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.834 Minibatch_accuracy_micro: 0.711
Epoch: [60/60] Step:   770 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.133 Minibatch_accuracy_micro: 0.586
Epoch: [60/60] Step:   780 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.007 Minibatch_accuracy_micro: 0.672
Epoch: [60/60] Step:   790 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.139 Minibatch_accuracy_micro: 0.609
Epoch: [60/60] Step:   800 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.847 Minibatch_accuracy_micro: 0.664
Epoch: [60/60] Step:   810 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.224 Minibatch_accuracy_micro: 0.602
Epoch: [60/60] Step:   820 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 2.099 Minibatch_accuracy_micro: 0.594
Epoch: [60/60] Step:   830 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.889 Minibatch_accuracy_micro: 0.672
Epoch: [60/60] Step:   840 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.989 Minibatch_accuracy_micro: 0.656
Epoch: [60/60] Step:   850 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.987 Minibatch_accuracy_micro: 0.625
Epoch: [60/60] Step:   860 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.919 Minibatch_accuracy_micro: 0.680
Epoch: [60/60] Step:   870 Minibatch_loss_feature: 0.012 Minibatch_loss_performance: 1.547 Minibatch_accuracy_micro: 0.734
Epoch: [60/60] Step:   880 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.712 Minibatch_accuracy_micro: 0.641
Epoch: [60/60] Step:   890 Minibatch_loss_feature: 0.011 Minibatch_loss_performance: 1.613 Minibatch_accuracy_micro: 0.727
Epoch: [60/60] Step:   900 Minibatch_loss_feature: 0.010 Minibatch_loss_performance: 2.221 Minibatch_accuracy_micro: 0.586
Phase: val
100%|██████████| 157/157 [00:25<00:00,  6.93it/s]
 
 
 Phase: val 
 
 Overall accuracy (Evaluation_accuracy_micro_top1): 0.399 
 Averaged F-measure: 0.381 
 Many-shot accuracy (Many_shot_accuracy_top1): 0.485  Median-shot accuracy (Median_shot_accuracy_top1): 0.389  Low-shot accuracy (Low_shot_accuracy_top1): 0.199 
 
 
Training Complete.
Best validation accuracy is 0.400 at epoch 57
Done
ALL COMPLETED.
 
Process finished with exit code 0