Multi-Label Learning with PyTorch (Predicting Multiple Attributes from a Single Image)
Table of Contents
- Error
- Cause
- Solution
  - one-hot
  - Invocation
- Also
- References
Error
```
Traceback (most recent call last):
  File "/home/user1/main.py", line 544, in <module>
    main()
  File "/home/user1/main.py", line 273, in main
    validate(test_loader_lfwa, model, criterion)
  File "/home/user1/main.py", line 476, in validate
    loss.append(criterion(output[j], target_j))
  File "/home/user1/miniconda3/lib/python3.7/site-packages/torch/nn/modules/module.py", line 493, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/user1/miniconda3/lib/python3.7/site-packages/torch/nn/modules/loss.py", line 942, in forward
    ignore_index=self.ignore_index, reduction=self.reduction)
  File "/home/user1/miniconda3/lib/python3.7/site-packages/torch/nn/functional.py", line 2056, in cross_entropy
    return nll_loss(log_softmax(input, 1), target, weight, None, ignore_index, None, reduction)
  File "/home/user1/miniconda3/lib/python3.7/site-packages/torch/nn/functional.py", line 1871, in nll_loss
    ret = torch._C._nn.nll_loss(input, target, weight, _Reduction.get_enum(reduction), ignore_index)
RuntimeError: multi-target not supported at /opt/conda/conda-bld/pytorch_1556653215914/work/aten/src/THCUNN/generic/ClassNLLCriterion.cu:15
```
Cause
- The target was not one-hot encoded at test time. (This error can also appear when one of the prediction and the ground-truth target is one-hot encoded and the other is not; the loss computation then naturally fails. You have to go through the code to find out which side is wrong.)
- In PyTorch, neither nn.MultiLabelSoftMarginLoss() nor nn.BCEWithLogitsLoss converts your labels to one-hot for you: both expect a 0/1 target with the same shape as the prediction. So class-index labels cannot be fed to them directly; for multi-label learning you may first have to convert the labels to one-hot (multi-hot) form (see the sketch after this list).
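As a minimal, self-contained sketch of the target conventions involved (shapes chosen here only for illustration): nn.CrossEntropyLoss wants a 1-D tensor of class indices, while nn.BCEWithLogitsLoss and nn.MultiLabelSoftMarginLoss want a 0/1 float target with the same shape as the logits.

```python
import torch
import torch.nn as nn

batch, num_classes = 4, 5
logits = torch.randn(batch, num_classes)                 # raw, unnormalized scores

# nn.CrossEntropyLoss: target is a 1-D LongTensor of class indices, shape (batch,)
ce_target = torch.randint(0, num_classes, (batch,))
print(nn.CrossEntropyLoss()(logits, ce_target))

# nn.BCEWithLogitsLoss / nn.MultiLabelSoftMarginLoss: target is a 0/1 FloatTensor
# with the same shape as the logits, i.e. (batch, num_classes)
ml_target = torch.randint(0, 2, (batch, num_classes)).float()
print(nn.BCEWithLogitsLoss()(logits, ml_target))
print(nn.MultiLabelSoftMarginLoss()(logits, ml_target))

# Handing CrossEntropyLoss a (batch, num_classes) one-hot target is what produced
# the "multi-target not supported" RuntimeError above (on older PyTorch versions).
```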
Solution
one-hot:
```python
import torch

def one_hot(src, batchSize):
    # src is one attribute column, e.g. target[:, j], with integer values in {0, 1}
    target_j = src.tolist()
    target_j = torch.tensor([[i] for i in target_j])                # shape (batchSize, 1), dtype int64
    target_j = torch.zeros(batchSize, 2).scatter_(1, target_j, 1)   # (batchSize, 2) one-hot floats
    target_j = target_j.cuda(non_blocking=True)                     # move to GPU
    return target_j
```
Invocation
```python
target_j = one_hot(target[:, j], target.shape[0])
loss.append(criterion(output[j], target_j))
```
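For context, here is a hedged, self-contained sketch of how such a per-attribute loop might look, using toy tensors and a CPU-only copy of the one_hot helper so it runs without a GPU; the attribute count, head layout, and choice of nn.BCEWithLogitsLoss are assumptions for illustration, not the article's actual training code.

```python
import torch
import torch.nn as nn

# Hypothetical setup: 40 binary attributes, one 2-way output head per attribute.
batch, num_attrs = 8, 40
target = torch.randint(0, 2, (batch, num_attrs))             # one integer label per attribute
output = [torch.randn(batch, 2) for _ in range(num_attrs)]   # one (batch, 2) logit tensor per head
criterion = nn.BCEWithLogitsLoss()

def one_hot(src, batchSize):                                  # CPU-only variant of the helper above
    idx = torch.tensor([[i] for i in src.tolist()])
    return torch.zeros(batchSize, 2).scatter_(1, idx, 1)

loss = []
for j in range(num_attrs):
    target_j = one_hot(target[:, j], target.shape[0])         # (batch, 2) one-hot floats
    loss.append(criterion(output[j], target_j))
total = sum(loss) / len(loss)                                 # aggregate over all attributes
print(total)
```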
When running the one_hot conversion above, you may hit another error:
File "main_mt_arcf.py", line 155, in <module>main()File "main_mt_arcf.py", line 126, in maintrain_loss, train_acc, train_bacc = train(model, optimizer, train_loader, criterion, weights=attrWeights)File "/home/user1/train.py", line 53, in trainlabel = one_hot(label, labels.shape[0])File "/home/user1/dataset/celeba.py", line 27, in one_hottarget_j = torch.zeros(batchSize, 2).scatter_(1, target_j, 1)
RuntimeError: invalid argument 3: Index tensor must either be empty or have same dimensions as output tensor at /opt/conda/conda-bld/pytorch_1556653215914/work/aten/src/TH/generic/THTensorEvenMoreMath.cpp:533
Cause: scatter_ requires the index tensor to be a LongTensor with the same number of dimensions as the tensor it writes into. Judging from the traceback, one_hot is being given the whole label tensor here rather than a single attribute column, so the index built inside the function no longer matches the 2-D zeros(batchSize, 2) tensor.
Solution: make sure the index handed to scatter_ ends up with shape (batchSize, 1) and dtype int64, for example by one-hot encoding one attribute column at a time or by reshaping src explicitly, as in the sketch below.
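One possible fix, sketched below; the num_classes parameter and the assert are additions of mine for illustration, not part of the original code.

```python
import torch

def one_hot(src, batchSize, num_classes=2):
    # scatter_ needs an index with the same number of dimensions as the
    # output tensor and dtype int64, so reshape/cast explicitly.
    idx = src.reshape(-1, 1).long()
    assert idx.shape[0] == batchSize, "src must hold exactly one label per sample"
    return torch.zeros(batchSize, num_classes).scatter_(1, idx, 1)

labels = torch.randint(0, 2, (8, 40))    # e.g. a CelebA-style (batch, num_attrs) label matrix
print(one_hot(labels[:, 0], 8).shape)    # torch.Size([8, 2]) -- one column at a time works
# one_hot(labels, 8) would trip the assert: that shape mismatch is what the
# RuntimeError above is complaining about.
```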
Another one-hot encoding helper:
```python
import torch

def label2onehot(labels, dim):
    """Convert a LongTensor of label indices to one-hot vectors."""
    # list(labels.size()) returns the shape of labels as a Python list
    out = torch.zeros(list(labels.size()) + [dim]).to('cpu')
    print(out.shape)
    # labels.unsqueeze(-1) appends a dimension at the end,
    # e.g. torch.Size([10, 9, 9]) -> torch.Size([10, 9, 9, 1])
    print(labels.unsqueeze(-1).shape)
    print(len(out.size()) - 1)
    # scatter 1s along the last dimension at the positions given by the labels
    out.scatter_(len(out.size()) - 1, labels.unsqueeze(-1), 1.)
    return out
```
https://blog.csdn.net/NockinOnHeavensDoor/article/details/88757073
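A quick usage check for label2onehot (the shapes below are arbitrary; the function above must already be defined, and labels must be a LongTensor of class indices):

```python
import torch

labels = torch.randint(0, 5, (10, 9, 9))   # integer class indices in [0, 5)
onehot = label2onehot(labels, 5)           # -> torch.Size([10, 9, 9, 5])
print(onehot.shape)

# On recent PyTorch versions the built-in torch.nn.functional.one_hot does the same:
# torch.nn.functional.one_hot(labels, num_classes=5).float()
```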
Also:
A similar problem: ValueError: Error when checking target: expected activation_7 to have shape (2,) but got array with …
Another possible cause of this kind of error is that you used criterion = nn.CrossEntropyLoss() but also one-hot encoded the target. With this criterion the target should not be one-hot encoded: even though the output for each sample is two-dimensional (one score per class), a one-dimensional target of class indices is all that is needed, because CrossEntropyLoss handles the index-to-class mapping internally.
If, on the other hand, you have already aligned the output and target dimensions and then hit this error, stop using the cross-entropy loss and switch to BCEWithLogitsLoss. (Both correct pairings are sketched below.)
https://blog.csdn.net/qq_22210253/article/details/85222093
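To make the two correct pairings concrete, here is a small sketch using the functional API and toy tensors (F.one_hot assumes a reasonably recent PyTorch):

```python
import torch
import torch.nn.functional as F

logits = torch.randn(4, 2)                # per-sample 2-class output
idx_target = torch.randint(0, 2, (4,))    # plain class indices

# Pairing 1: cross-entropy with 1-D index targets -- no one-hot needed.
print(F.cross_entropy(logits, idx_target))

# Pairing 2: if the target has already been expanded to the same (batch, C)
# shape as the output, use BCE-with-logits instead of cross-entropy.
onehot_target = F.one_hot(idx_target, num_classes=2).float()
print(F.binary_cross_entropy_with_logits(logits, onehot_target))
```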
References
https://discuss.pytorch.org/t/runtimeerror-multi-target-not-supported-newbie/10216