Many thanks for the pointer. Posting my code here for everyone's reference. The fix is in this line: loss = loss_function(pred+1e-8, labels.to(device)) — adding a tiny constant (1e-8) to pred keeps the predicted values from being exactly zero.
pred = model(images.to(device))
# add a tiny epsilon; without it: RuntimeError: Function 'LogSoftmaxBackward0' returned nan values in its 0th output.
loss = loss_function(pred + 1e-8, labels.to(device))
print('loss=', loss)
loss.backward()
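The idea behind the epsilon trick can be shown with plain Python, without PyTorch: log-based losses blow up when the probability they receive is exactly zero, while shifting the input by a tiny constant keeps the logarithm finite. This is only an illustrative sketch of the numerical issue, not the author's training code.

```python
import math

eps = 1e-8

# log of an exact zero is undefined -- the analogue of the NaN gradient
try:
    math.log(0.0)
    underflow = False
except ValueError:
    underflow = True

# shifting by a tiny epsilon keeps the value finite: log(1e-8) is about -18.42
safe = math.log(0.0 + eps)
```

Note that in the snippet above `pred` holds raw logits, and `loss_function` (cross-entropy / focal loss) applies log_softmax internally, so the epsilon acts as a safeguard on that internal computation rather than on an explicit probability.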
Problem description: during PyTorch training, the loss is normal in one iteration but becomes NaN in the next. The network has four stages (each stage can be roughly thought of as one convolutional layer), and a loss is computed on each stage's output. After replacing PyTorch's built-in CE loss with focal loss...
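Since the NaN appeared after switching to focal loss, the usual culprit is the log(p_t) term when the predicted probability of the true class underflows to zero. A minimal sketch of a single-sample focal loss with the probability clamped away from 0 and 1 (the function name, the alpha/gamma defaults, and the clamping constant are my assumptions, not from the original post):

```python
import math

def focal_loss(p, alpha=0.25, gamma=2.0, eps=1e-8):
    """Focal loss for one sample, where p is the predicted
    probability of the true class. FL(p) = -alpha * (1-p)^gamma * log(p)."""
    # clamp p into [eps, 1-eps] so log(p) stays finite even when p == 0
    p = min(max(p, eps), 1.0 - eps)
    return -alpha * (1.0 - p) ** gamma * math.log(p)
```

In a real PyTorch implementation the same idea is typically expressed as `probs.clamp(min=eps)` (or by adding eps inside the `log`) before computing the focal term, which avoids the NaN without perturbing the logits themselves.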