TensorFlow 2.0: Cross-Entropy Loss
```python
# -*- coding:utf-8 -*-
import tensorflow as tf
import numpy as np

# One-hot ground-truth labels (5 samples, 3 classes).
y_ = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 0, 0], [0, 1, 0]])
# Raw network outputs (logits) for the same 5 samples.
y = np.array([[12, 3, 2], [3, 10, 1], [1, 2, 5], [4, 6.5, 1.2], [3, 6, 1]])

# Two-step version: apply softmax first, then compute categorical
# cross-entropy on the resulting probabilities.
y_pro = tf.nn.softmax(y)
loss_ce1 = tf.losses.categorical_crossentropy(y_, y_pro)

# Fused version: softmax and cross-entropy combined in a single op.
loss_ce2 = tf.nn.softmax_cross_entropy_with_logits(y_, y)

print('Result of the two-step computation:\n', loss_ce1)
print('Result of the fused computation:\n', loss_ce2)
```
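For reference, the same per-sample losses can be reproduced by hand with NumPy. The snippet below is a minimal sketch, not part of the original post: it assumes the same `y_` and `y` arrays and recomputes the softmax and the cross-entropy directly, which should match both `loss_ce1` and `loss_ce2`.

```python
# Illustrative sketch (assumption: same y_ and y as above), showing what the
# two TensorFlow calls compute internally.
import numpy as np

y_ = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 0, 0], [0, 1, 0]], dtype=np.float64)
y = np.array([[12, 3, 2], [3, 10, 1], [1, 2, 5], [4, 6.5, 1.2], [3, 6, 1]], dtype=np.float64)

# Numerically stable softmax along the class axis.
exp_shifted = np.exp(y - y.max(axis=1, keepdims=True))
probs = exp_shifted / exp_shifted.sum(axis=1, keepdims=True)

# Cross-entropy per sample: -sum(true_prob * log(predicted_prob)).
manual_ce = -np.sum(y_ * np.log(probs), axis=1)
print('Manual result:\n', manual_ce)
```

In practice the fused `tf.nn.softmax_cross_entropy_with_logits` is usually preferred over applying softmax and cross-entropy separately, because it works directly on the logits and is more numerically stable than taking the log of probabilities that may underflow to zero.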
Source: https://www.cnblogs.com/ai-tech/p/15145927.html