
Dynamically adjusting the learning rate in PyTorch


https://blog.csdn.net/bc521bc/article/details/85864555

That blog post explains it in detail, but exactly how to use it in code was still a bit fuzzy to me. I tried it out myself and am jotting it down here; it is actually very simple: define the scheduler right after the optimizer, then call its step() once in every epoch. My initial mistake was putting scheduler.step() immediately after T_optimizer.step() inside the batch loop, which decayed the learning rate on every batch, so after one epoch it was too small to be visible.
import numpy as np
from torch.optim import SGD, lr_scheduler

# Define the scheduler right after the optimizer it wraps
T_optimizer = SGD(net.parameters(), lr=LR, weight_decay=0.0005, momentum=0.9)
scheduler = lr_scheduler.StepLR(T_optimizer, step_size=30, gamma=0.5)

for epoch in range(startepoch, startepoch + EPOCH):
    loss_np = []
    for anc, pos, neg in Triplet_data:
        net.zero_grad()
        anc_feat = net(anc.to(device))
        pos_feat = net(pos.to(device))
        neg_feat = net(neg.to(device))
        tri_loss = T_loss(anc_feat, pos_feat, neg_feat)
        tri_loss.backward()
        T_optimizer.step()
        loss_np.append(tri_loss.item())
    ave_loss = np.mean(loss_np)
    scheduler.step()  # decay the LR once per epoch, not per batch
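To see what the placement of scheduler.step() changes, here is a minimal, self-contained sketch (the toy model, lr=0.1, and loop bounds are made up for illustration) that prints the learning rate as epochs pass. With StepLR(step_size=30, gamma=0.5) the rate is halved every 30 calls to scheduler.step(), so calling it once per batch instead of once per epoch multiplies the decay by the number of batches.

import torch
from torch.optim import SGD, lr_scheduler

# Toy model and optimizer just to drive the scheduler (hypothetical names)
model = torch.nn.Linear(10, 2)
optimizer = SGD(model.parameters(), lr=0.1)
scheduler = lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.5)

for epoch in range(90):
    # ... training batches would run here, each ending in optimizer.step() ...
    scheduler.step()  # one scheduler step per epoch
    if (epoch + 1) % 30 == 0:
        # the current LR can be read from the optimizer's param_groups
        print(epoch + 1, optimizer.param_groups[0]['lr'])

# Prints 30 0.05 / 60 0.025 / 90 0.0125: halved every 30 epochs.
# If scheduler.step() ran once per batch instead, that same halving would
# happen every 30 batches, shrinking the LR within a single epoch.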

Source: https://www.cnblogs.com/jiangnanyanyuchen/p/11794825.html