Reproducing WS-DAN (Weakly Supervised Data Augmentation Network)

Posted by WinstonYF

1. Introduction to WS-DAN

Original paper: "See Better Before Looking Closer: Weakly Supervised Data Augmentation Network for Fine-Grained Visual Classification"

There are plenty of write-ups online; here is one you can read on your own:
【细粒度】WS-DAN

Bilibili video:
【论文讲解+复现】WS-DAN WSDAN (Weakly Supervised Data Augmentation Network)
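Before diving into the reproduction, it helps to know the paper's core trick: attention maps guide the data augmentation, either by cropping the most-attended region and zooming in on it, or by erasing it so the network is forced to find other discriminative parts. Below is a minimal PyTorch sketch of that idea; the function names, tensor shapes, and thresholds are my own assumptions for illustration, not the repository's actual code.

import torch
import torch.nn.functional as F

def attention_crop(images, attn, theta_c=0.5):
    # images: (B, 3, H, W); attn: (B, h, w) -- one attention map per image
    B, _, H, W = images.shape
    crops = []
    for i in range(B):
        a = F.interpolate(attn[i][None, None], size=(H, W),
                          mode='bilinear', align_corners=False)[0, 0]
        mask = a >= theta_c * a.max()              # most-attended region
        ys, xs = torch.nonzero(mask, as_tuple=True)
        y0, y1 = ys.min().item(), ys.max().item() + 1
        x0, x1 = xs.min().item(), xs.max().item() + 1
        crop = images[i:i + 1, :, y0:y1, x0:x1]    # crop the attended part...
        crops.append(F.interpolate(crop, size=(H, W),
                                   mode='bilinear', align_corners=False))
    return torch.cat(crops, 0)                     # ...and zoom back to full size

def attention_drop(images, attn, theta_d=0.5):
    # Erase the most-attended region so the network must look elsewhere.
    a = F.interpolate(attn[:, None], size=images.shape[-2:],
                      mode='bilinear', align_corners=False)
    keep = (a < theta_d * a.amax(dim=(2, 3), keepdim=True)).float()
    return images * keep

In the paper, one of the model's M attention maps is sampled per image before each of these operations; both tensors are assumed to be on the same device.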

2. Preparation

2.1 Platform

极链AI云
The whole reproduction runs on this platform because it ships with ready-made PyTorch and CUDA environments: you code online directly, with no local setup.

2.2 Source code

Source on Gitee (faster from mainland China)
Source on GitHub

3. Starting the reproduction

3.1 Creating an instance

Go to the 极链AI云 platform and pick the cheapest machine.

Select the following image:
PyTorch 1.8, Python 3.8, CUDA 11.1.1

Once the instance is created, open it with the third-party JupyterLab tool; you will land in the JupyterLab interface.
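Optionally, you can verify in a notebook or Python shell that the environment matches the image we selected (these are standard PyTorch calls):

import torch
print(torch.__version__)          # expect 1.8.x
print(torch.version.cuda)         # expect 11.1
print(torch.cuda.is_available())  # expect True on a GPU instance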

3.2 Creating a folder in JupyterLab

Create a wsdan folder under /home.

Open a terminal.

Type ls to check the contents of the current directory.

Then, in the terminal, enter:

cd /home/wsdan/

This puts us in the wsdan directory; now we can download the WS-DAN project.

3.3 Downloading the WS-DAN project

Enter the command:

git clone https://gitee.com/YFwinston/WS-DAN.PyTorch.git

After the clone finishes, the WS-DAN.PyTorch project directory appears under /home/wsdan.

3.4 Preparing the dataset

I use the bird dataset CUB-200-2011 as the example.
Official download page: CUB-200-2011 (Bird); downloading it from mainland China may require a proxy.

On the 极链AI云 platform I have already asked the vendor to add this dataset; it lives at:
/datasets/CUB-200-2011-Bird/
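Before training, you can optionally sanity-check the dataset layout. A standard CUB-200-2011 release contains an images/ folder plus metadata files such as images.txt, image_class_labels.txt, train_test_split.txt, and classes.txt. The path below is the platform mount point from above and is an assumption; adjust it if the archive unpacks into a CUB_200_2011 subfolder:

import os

DATAPATH = '/datasets/CUB-200-2011-Bird'  # assumed mount point; adjust as needed
for name in ['images', 'images.txt', 'image_class_labels.txt',
             'train_test_split.txt', 'classes.txt']:
    print(name, 'OK' if os.path.exists(os.path.join(DATAPATH, name)) else 'MISSING')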

3.5 Modifying bird_dataset.py

Change the dataset path specified in bird_dataset.py.

Edit it as follows (set it to the path where the dataset actually lives):

#DATAPATH = '/home/guyuchong/DATA/FGVC/CUB-200-2011'
DATAPATH = '/home/dataset/CUB_200_2011'
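For reference, a CUB-200-2011 dataset class typically consumes DATAPATH as in the minimal sketch below. This is my own illustration of the common pattern, not the repo's actual bird_dataset.py (the class and variable names are assumptions):

import os
from PIL import Image
from torch.utils.data import Dataset

class BirdDataset(Dataset):
    # Minimal CUB-200-2011 loader (illustrative only).
    def __init__(self, datapath, phase='train', transform=None):
        self.transform = transform
        with open(os.path.join(datapath, 'images.txt')) as f:
            paths = [line.split()[1] for line in f]
        with open(os.path.join(datapath, 'image_class_labels.txt')) as f:
            labels = [int(line.split()[1]) - 1 for line in f]  # 1-based -> 0-based
        with open(os.path.join(datapath, 'train_test_split.txt')) as f:
            is_train = [line.split()[1] == '1' for line in f]
        want_train = (phase == 'train')
        self.samples = [(os.path.join(datapath, 'images', p), y)
                        for p, y, t in zip(paths, labels, is_train)
                        if t == want_train]

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        path, label = self.samples[idx]
        image = Image.open(path).convert('RGB')
        if self.transform is not None:
            image = self.transform(image)
        return image, label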

3.6 Training

Yes, it really is this quick: we can already start training. Often the most sophisticated papers need only the simplest operations to reproduce.

In the terminal, change into the project directory:

cd /home/wsdan/WS-DAN.PyTorch

Then start training:

python3 train.py
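One detail visible in the log below: the learning rate starts at 0.001 and decays by a factor of 0.9 every 2 epochs (0.001, 0.0009, 0.00081, ...). That pattern corresponds to a standard StepLR schedule; here is a sketch, not the repo's exact training code:

import torch

params = [torch.nn.Parameter(torch.zeros(1))]  # dummy parameter for the demo
optimizer = torch.optim.SGD(params, lr=0.001)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=2, gamma=0.9)
for epoch in range(1, 7):
    print(epoch, optimizer.param_groups[0]['lr'])  # 0.001, 0.001, 0.0009, ...
    scheduler.step()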

3.7 Training results

Below is the training log; everything we need is in it. The two numbers in each accuracy pair are top-1 and top-5 accuracy, and Raw/Crop/Drop denote the predictions on the raw image, the attention-cropped image, and the attention-dropped image, respectively:

2021-08-10 20:38:58,851: INFO: [inception.py:177]: Inception3: All params loaded
2021-08-10 20:38:59,323: INFO: [wsdan.py:97]: WSDAN: using inception_mixed_6e as feature extractor, num_classes: 200, num_attentions: 32
2021-08-10 20:39:02,305: INFO: [train.py:93]: Network weights save to ./FGVC/CUB-200-2011/ckpt/
2021-08-10 20:39:02,353: INFO: [train.py:126]: Start training: Total epochs: 160, Batch size: 8, Training size: 5994, Validation size: 5794
2021-08-10 20:39:02,353: INFO: [train.py:128]: 
2021-08-10 20:39:02,353: INFO: [train.py:136]: Epoch 001, Learning Rate 0.001
2021-08-10 20:42:35,529: INFO: [train.py:247]: Train: Loss 5.4726, Raw Acc (28.40, 49.73), Crop Acc (19.95, 38.02), Drop Acc (18.92, 38.67), Time 213.17
2021-08-10 20:43:10,918: INFO: [train.py:302]: Valid: Val Loss 2.1645, Val Acc (54.06, 83.26), Time 35.38
2021-08-10 20:43:10,918: INFO: [train.py:303]: 
2021-08-10 20:43:11,094: INFO: [train.py:136]: Epoch 002, Learning Rate 0.001
2021-08-10 20:46:44,574: INFO: [train.py:247]: Train: Loss 2.8880, Raw Acc (64.60, 88.47), Crop Acc (50.30, 76.13), Drop Acc (48.68, 76.54), Time 213.48
2021-08-10 20:47:18,815: INFO: [train.py:302]: Valid: Val Loss 1.5154, Val Acc (69.14, 89.30), Time 34.24
2021-08-10 20:47:18,815: INFO: [train.py:303]: 
2021-08-10 20:47:19,097: INFO: [train.py:136]: Epoch 003, Learning Rate 0.0009
2021-08-10 20:50:52,811: INFO: [train.py:247]: Train: Loss 1.8485, Raw Acc (79.46, 95.43), Crop Acc (67.42, 87.20), Drop Acc (66.00, 88.99), Time 213.71
2021-08-10 20:51:27,392: INFO: [train.py:302]: Valid: Val Loss 1.1276, Val Acc (75.58, 93.23), Time 34.58
2021-08-10 20:51:27,393: INFO: [train.py:303]: 
2021-08-10 20:51:27,672: INFO: [train.py:136]: Epoch 004, Learning Rate 0.0009
2021-08-10 20:55:02,738: INFO: [train.py:247]: Train: Loss 1.3692, Raw Acc (84.45, 97.23), Crop Acc (75.79, 92.71), Drop Acc (72.82, 92.68), Time 215.06
2021-08-10 20:55:37,500: INFO: [train.py:302]: Valid: Val Loss 1.0774, Val Acc (76.53, 94.56), Time 34.75
2021-08-10 20:55:37,501: INFO: [train.py:303]: 
2021-08-10 20:55:37,770: INFO: [train.py:136]: Epoch 005, Learning Rate 0.00081
2021-08-10 20:59:13,200: INFO: [train.py:247]: Train: Loss 1.0293, Raw Acc (88.47, 98.53), Crop Acc (82.12, 95.58), Drop Acc (79.63, 95.93), Time 215.43
2021-08-10 20:59:48,232: INFO: [train.py:302]: Valid: Val Loss 0.8782, Val Acc (80.96, 95.60), Time 35.03
2021-08-10 20:59:48,232: INFO: [train.py:303]: 
2021-08-10 20:59:48,496: INFO: [train.py:136]: Epoch 006, Learning Rate 0.00081
2021-08-10 21:03:24,037: INFO: [train.py:247]: Train: Loss 0.8415, Raw Acc (91.02, 99.27), Crop Acc (85.30, 96.71), Drop Acc (82.83, 96.98), Time 215.54
2021-08-10 21:03:59,091: INFO: [train.py:302]: Valid: Val Loss 0.9179, Val Acc (80.82, 95.91), Time 35.05
2021-08-10 21:03:59,091: INFO: [train.py:303]: 
2021-08-10 21:03:59,092: INFO: [train.py:136]: Epoch 007, Learning Rate 0.000729
2021-08-10 21:07:36,231: INFO: [train.py:247]: Train: Loss 0.6588, Raw Acc (93.91, 99.70), Crop Acc (90.12, 98.31), Drop Acc (86.60, 97.58), Time 217.14
2021-08-10 21:08:11,372: INFO: [train.py:302]: Valid: Val Loss 0.7636, Val Acc (83.29, 96.43), Time 35.13
2021-08-10 21:08:11,373: INFO: [train.py:303]: 
2021-08-10 21:08:11,642: INFO: [train.py:136]: Epoch 008, Learning Rate 0.000729
2021-08-10 21:11:52,144: INFO: [train.py:247]: Train: Loss 0.5629, Raw Acc (95.30, 99.68), Crop Acc (92.46, 98.75), Drop Acc (88.30, 98.03), Time 220.50
2021-08-10 21:12:26,958: INFO: [train.py:302]: Valid: Val Loss 0.7243, Val Acc (83.50, 96.48), Time 34.81
2021-08-10 21:12:26,959: INFO: [train.py:303]: 
2021-08-10 21:12:27,248: INFO: [train.py:136]: Epoch 009, Learning Rate 0.0006561
2021-08-10 21:16:07,619: INFO: [train.py:247]: Train: Loss 0.4597, Raw Acc (97.26, 99.93), Crop Acc (94.29, 98.88), Drop Acc (91.29, 98.58), Time 220.37
2021-08-10 21:16:42,933: INFO: [train.py:302]: Valid: Val Loss 0.6472, Val Acc (85.88, 96.96), Time 35.31
2021-08-10 21:16:42,934: INFO: [train.py:303]: 
2021-08-10 21:16:43,229: INFO: [train.py:136]: Epoch 010, Learning Rate 0.0006561
2021-08-10 21:20:23,629: INFO: [train.py:247]: Train: Loss 0.4084, Raw Acc (97.86, 99.98), Crop Acc (95.68, 99.15), Drop Acc (92.43, 98.92), Time 220.40
2021-08-10 21:20:58,538: INFO: [train.py:302]: Valid: Val Loss 0.6557, Val Acc (84.90, 96.65), Time 34.90
2021-08-10 21:20:58,538: INFO: [train.py:303]: 
2021-08-10 21:20:58,540: INFO: [train.py:136]: Epoch 011, Learning Rate 0.00059049
2021-08-10 21:24:39,002: INFO: [train.py:247]: Train: Loss 0.3567, Raw Acc (98.63, 100.00), Crop Acc (96.06, 98.87), Drop Acc (94.63, 99.33), Time 220.46
2021-08-10 21:25:13,825: INFO: [train.py:302]: Valid: Val Loss 0.6422, Val Acc (85.33, 97.24), Time 34.82
2021-08-10 21:25:13,826: INFO: [train.py:303]: 
(The log is truncated here; training continues in the same pattern up to the configured 160 epochs.)
