
tensorflow2.0 -- backpropagation


# -*- coding:utf-8 -*-

import tensorflow as tf

# Trainable parameter, initialized to 5.0; the loss (w + 1)^2 is minimized at w = -1
w = tf.Variable(tf.constant(5, dtype=tf.float32))

EPOCHS = 40      # number of gradient-descent iterations
LR_BASE = 0.2    # initial learning rate
LR_DECAY = 0.99  # exponential decay rate of the learning rate
LR_STEP = 1      # decay the learning rate every LR_STEP epochs

for epoch in range(EPOCHS):
    # exponentially decayed learning rate for this epoch
    lr = LR_BASE * LR_DECAY ** (epoch / LR_STEP)
    with tf.GradientTape() as tape:   # record the forward pass for automatic differentiation
        loss = tf.square(w + 1)       # loss = (w + 1)^2
    grads = tape.gradient(loss, w)    # dloss/dw = 2 * (w + 1)

    w.assign_sub(lr * grads)          # gradient-descent update: w <- w - lr * grads
    print("After %d epoch, w is %f, loss is %f, lr is %f" % (epoch, w.numpy(), loss, lr))
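This loop performs plain gradient descent on the loss (w + 1)^2, whose gradient is 2(w + 1), so w converges toward the minimum at w = -1. For example, on the first epoch lr = 0.2 and the gradient is 2 * (5 + 1) = 12, so w moves from 5 to 5 - 0.2 * 12 = 2.6.

For comparison, the same exponential decay schedule can also be expressed with Keras optimizer utilities instead of updating lr by hand. The following is only a minimal sketch under that assumption, using tf.keras.optimizers.SGD and tf.keras.optimizers.schedules.ExponentialDecay, which the original post does not use:

# A sketch (not from the original post): same decay schedule via Keras utilities
import tensorflow as tf

w = tf.Variable(tf.constant(5, dtype=tf.float32))

# decayed lr = 0.2 * 0.99 ** (step / 1), matching LR_BASE * LR_DECAY ** (epoch / LR_STEP)
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.2, decay_steps=1, decay_rate=0.99)
optimizer = tf.keras.optimizers.SGD(learning_rate=lr_schedule)

for epoch in range(40):
    with tf.GradientTape() as tape:
        loss = tf.square(w + 1)
    grads = tape.gradient(loss, w)
    optimizer.apply_gradients([(grads, w)])   # w <- w - lr * grads, lr taken from the schedule
    print("After %d epoch, w is %f, loss is %f, lr is %f"
          % (epoch, w.numpy(), loss.numpy(), lr_schedule(epoch).numpy()))

With one optimizer step per epoch, the learning rate used at each step matches the hand-computed lr in the original loop, and w converges to the same values.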


Source: https://www.cnblogs.com/ai-tech/p/15145922.html