Wu Yuxiong -- TensorFlow2 Tutorial: Activation Functions and Their Gradients
Author: 互联网
import tensorflow as tf

a = tf.linspace(-10., 10., 10)
a

with tf.GradientTape() as tape:
    tape.watch(a)  # a is a plain tensor, not a tf.Variable, so it must be watched explicitly
    y = tf.sigmoid(a)

grads = tape.gradient(y, [a])
grads
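The tape differentiates sigmoid automatically, but sigmoid also has the closed-form derivative σ'(x) = σ(x)(1 − σ(x)), which peaks at 0.25 when x = 0 and vanishes in both tails (the classic vanishing-gradient regime). A minimal framework-free sketch, verifying the closed form against a central finite difference (pure Python, no TensorFlow assumed):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # closed form: sigma'(x) = sigma(x) * (1 - sigma(x))
    s = sigmoid(x)
    return s * (1.0 - s)

# cross-check against a central finite difference at a few points
h = 1e-6
for x in (-2.0, 0.0, 3.0):
    numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2.0 * h)
    assert abs(numeric - sigmoid_grad(x)) < 1e-5
```

Note that sigmoid_grad(0.0) is exactly 0.25, so gradients shrink quickly once inputs move away from zero.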
a = tf.linspace(-5., 5., 10)
a

tf.tanh(a)
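tanh behaves like a rescaled sigmoid: its derivative is 1 − tanh²(x), equal to 1 at x = 0 and decaying toward 0 in both tails. A small sketch checking that identity numerically (pure Python, no TensorFlow assumed):

```python
import math

def tanh_grad(x):
    # closed form: tanh'(x) = 1 - tanh(x)**2
    return 1.0 - math.tanh(x) ** 2

# cross-check against a central finite difference
h = 1e-6
for x in (-1.5, 0.0, 2.0):
    numeric = (math.tanh(x + h) - math.tanh(x - h)) / (2.0 * h)
    assert abs(numeric - tanh_grad(x)) < 1e-5
```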
a = tf.linspace(-1., 1., 10)
a

tf.nn.relu(a)
tf.nn.leaky_relu(a)
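Unlike sigmoid and tanh, ReLU's gradient does not saturate for positive inputs: it is 1 for x > 0 and 0 for x < 0, so negative units stop learning ("dead ReLU"). Leaky ReLU fixes this by keeping a small slope alpha on the negative side (tf.nn.leaky_relu defaults to alpha=0.2). A minimal scalar sketch of both functions and their gradients, assuming the framework convention of using 0 for ReLU's gradient at x = 0:

```python
def relu(x):
    return x if x > 0 else 0.0

def relu_grad(x):
    # 1 for x > 0, 0 for x < 0; undefined at 0, where frameworks typically use 0
    return 1.0 if x > 0 else 0.0

def leaky_relu(x, alpha=0.2):
    # small slope alpha on the negative side keeps gradients flowing
    return x if x > 0 else alpha * x

def leaky_relu_grad(x, alpha=0.2):
    return 1.0 if x > 0 else alpha
```

The only design choice is alpha: larger values pass more signal for negative inputs but move the function further from plain ReLU.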
Source: https://www.cnblogs.com/tszr/p/12228118.html