TensorFlow Examples
Author: Internet
Implementing an MLP with Keras
I. Building an Image Classifier with the Sequential API
1. Load the Fashion MNIST dataset: 70,000 grayscale images, 28×28 pixels each, in 10 classes, all of them clothing items.
2. Load the dataset with Keras
# Import Keras
import tensorflow as tf
from tensorflow import keras

# Load the data and split it into a training set and a test set
fashion_mnist = keras.datasets.fashion_mnist
(x_train, y_train), (x_test, y_test) = fashion_mnist.load_data()
# Create a validation set, scaling pixel values to [0, 1]
x_valid, x_train = x_train[:5000] / 255.0, x_train[5000:] / 255.0
y_valid, y_train = y_train[:5000], y_train[5000:]
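The split above can be sketched on a dummy array to check the resulting shapes (the sizes here are small stand-ins for the real 60,000-image training set):

```python
import numpy as np

# Stand-in for the training set: 100 fake 28x28 "images" with pixel values 0-255
x_train = np.random.randint(0, 256, size=(100, 28, 28)).astype("float64")
y_train = np.random.randint(0, 10, size=100)

# First 5 images become the validation set; pixel values are scaled to [0, 1]
x_valid, x_train = x_train[:5] / 255.0, x_train[5:] / 255.0
y_valid, y_train = y_train[:5], y_train[5:]

print(x_valid.shape, x_train.shape)  # (5, 28, 28) (95, 28, 28)
```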
3. The 10 class labels, stored in a list:
class_names = ["T-shirt/top", "Trouser", "Pullover", "Dress", "Coat", "Sandal", "Shirt", "Sneaker", "Bag", "Ankle boot"]
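Since the labels are integers 0–9, class names can be looked up by NumPy fancy indexing (the sample labels below are made up for illustration):

```python
import numpy as np

class_names = ["T-shirt/top", "Trouser", "Pullover", "Dress", "Coat",
               "Sandal", "Shirt", "Sneaker", "Bag", "Ankle boot"]

y_sample = np.array([9, 0, 5])  # hypothetical integer labels
print(np.array(class_names)[y_sample])  # ['Ankle boot' 'T-shirt/top' 'Sandal']
```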
II. Building the Neural Network Model (a classification MLP with two hidden layers)
model = keras.models.Sequential()
model.add(keras.layers.Flatten(input_shape=[28, 28]))
model.add(keras.layers.Dense(300, activation="relu"))
model.add(keras.layers.Dense(100, activation="relu"))
model.add(keras.layers.Dense(10, activation="softmax"))
model.summary()
*The Flatten layer converts each input image into a 1D array: it computes X.reshape(-1, 784). The input shape is specified with input_shape=[28, 28].
*Next, add a Dense hidden layer with 300 neurons, using the ReLU activation function.
*Add another Dense hidden layer with 100 neurons, also using ReLU.
*Finally, the output layer contains 10 neurons and uses the softmax activation function for classification.
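The Flatten step can be reproduced with plain NumPy (a toy batch of two images stands in for real data):

```python
import numpy as np

batch = np.zeros((2, 28, 28))      # a toy batch of two 28x28 "images"
flat = batch.reshape(-1, 28 * 28)  # what the Flatten layer does to a batch
print(flat.shape)  # (2, 784)
```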
4. model.summary() displays all of the model's layers, including each layer's parameter count, output shape, and so on.
Model: "sequential_1"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
flatten (Flatten) (None, 784) 0
_________________________________________________________________
dense (Dense) (None, 300) 235500
_________________________________________________________________
dense_1 (Dense) (None, 100) 30100
_________________________________________________________________
dense_2 (Dense) (None, 10) 1010
=================================================================
Total params: 266,610
Trainable params: 266,610
Non-trainable params: 0
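The parameter counts in the summary can be checked by hand: each Dense layer has inputs × neurons weights plus one bias per neuron.

```python
# Dense layer parameters = inputs * neurons + neurons (biases)
dense   = 784 * 300 + 300   # 235500
dense_1 = 300 * 100 + 100   # 30100
dense_2 = 100 * 10 + 10     # 1010
print(dense + dense_1 + dense_2)  # 266610, matching "Total params: 266,610"
```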
III. Compiling the Model
model.compile(loss="sparse_categorical_crossentropy", optimizer="sgd", metrics=["accuracy"])
*This specifies the loss function and the optimizer. "sgd" means the model is trained with stochastic gradient descent, with a default learning rate of 0.01; sparse_categorical_crossentropy is used because the labels are sparse integer class indices. Since this is a classifier, it is useful to measure its accuracy during training.
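The "sparse" in the loss name means the targets are integer class indices rather than one-hot vectors; for a single sample the loss is simply the negative log of the probability the model assigns to the true class. A minimal NumPy sketch (the probabilities are invented):

```python
import numpy as np

# Model's predicted probabilities for one sample over 3 classes (made up)
proba = np.array([0.1, 0.7, 0.2])
y_true = 1  # sparse label: an integer index, not a one-hot vector

loss = -np.log(proba[y_true])  # sparse categorical cross-entropy, one sample
print(round(float(loss), 4))  # 0.3567
```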
IV. Training and Evaluating the Model
history=model.fit(x_train,y_train,epochs=30,validation_data=(x_valid,y_valid))
*We pass it the input features x_train, the target classes y_train, the number of epochs to train for (epochs=30), and a validation set.
Training output:
Epoch 1/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.7117 - accuracy: 0.7678 - val_loss: 0.5185 - val_accuracy: 0.8244
Epoch 2/30
1719/1719 [==============================] - 4s 2ms/step - loss: 0.4833 - accuracy: 0.8325 - val_loss: 0.4603 - val_accuracy: 0.8438
Epoch 3/30
1719/1719 [==============================] - 4s 2ms/step - loss: 0.4371 - accuracy: 0.8473 - val_loss: 0.4155 - val_accuracy: 0.8592
Epoch 4/30
1719/1719 [==============================] - 4s 2ms/step - loss: 0.4099 - accuracy: 0.8556 - val_loss: 0.3928 - val_accuracy: 0.8616
Epoch 5/30
1719/1719 [==============================] - 4s 2ms/step - loss: 0.3904 - accuracy: 0.8623 - val_loss: 0.3761 - val_accuracy: 0.8702
Epoch 6/30
1719/1719 [==============================] - 5s 3ms/step - loss: 0.3745 - accuracy: 0.8678 - val_loss: 0.3881 - val_accuracy: 0.8638
Epoch 7/30
1719/1719 [==============================] - 4s 2ms/step - loss: 0.3620 - accuracy: 0.8718 - val_loss: 0.3618 - val_accuracy: 0.8766
Epoch 8/30
1719/1719 [==============================] - 4s 2ms/step - loss: 0.3501 - accuracy: 0.8761 - val_loss: 0.3518 - val_accuracy: 0.8760
Epoch 9/30
1719/1719 [==============================] - 4s 2ms/step - loss: 0.3407 - accuracy: 0.8786 - val_loss: 0.3574 - val_accuracy: 0.8766
Epoch 10/30
1719/1719 [==============================] - 4s 2ms/step - loss: 0.3309 - accuracy: 0.8839 - val_loss: 0.3490 - val_accuracy: 0.8792
Epoch 11/30
1719/1719 [==============================] - 3s 2ms/step - loss: 0.3226 - accuracy: 0.8839 - val_loss: 0.3404 - val_accuracy: 0.8782
Epoch 12/30
1719/1719 [==============================] - 3s 2ms/step - loss: 0.3149 - accuracy: 0.8880 - val_loss: 0.3415 - val_accuracy: 0.8798
Epoch 13/30
1719/1719 [==============================] - 3s 2ms/step - loss: 0.3075 - accuracy: 0.8911 - val_loss: 0.3289 - val_accuracy: 0.8852
Epoch 14/30
1719/1719 [==============================] - 3s 2ms/step - loss: 0.3020 - accuracy: 0.8912 - val_loss: 0.3278 - val_accuracy: 0.8832
Epoch 15/30
1719/1719 [==============================] - 3s 2ms/step - loss: 0.2943 - accuracy: 0.8940 - val_loss: 0.3305 - val_accuracy: 0.8814
Epoch 16/30
1719/1719 [==============================] - 3s 2ms/step - loss: 0.2897 - accuracy: 0.8965 - val_loss: 0.3286 - val_accuracy: 0.8834
Epoch 17/30
1719/1719 [==============================] - 3s 2ms/step - loss: 0.2827 - accuracy: 0.8979 - val_loss: 0.3115 - val_accuracy: 0.8892
Epoch 18/30
1719/1719 [==============================] - 3s 2ms/step - loss: 0.2770 - accuracy: 0.9013 - val_loss: 0.3188 - val_accuracy: 0.8870
Epoch 19/30
1719/1719 [==============================] - 3s 2ms/step - loss: 0.2722 - accuracy: 0.9008 - val_loss: 0.3146 - val_accuracy: 0.8852
Epoch 20/30
1719/1719 [==============================] - 4s 2ms/step - loss: 0.2671 - accuracy: 0.9048 - val_loss: 0.3159 - val_accuracy: 0.8862
Epoch 21/30
1719/1719 [==============================] - 4s 2ms/step - loss: 0.2629 - accuracy: 0.9053 - val_loss: 0.3120 - val_accuracy: 0.8864
Epoch 22/30
1719/1719 [==============================] - 4s 2ms/step - loss: 0.2569 - accuracy: 0.9078 - val_loss: 0.3179 - val_accuracy: 0.8866
Epoch 23/30
1719/1719 [==============================] - 3s 2ms/step - loss: 0.2540 - accuracy: 0.9082 - val_loss: 0.3063 - val_accuracy: 0.8900
Epoch 24/30
1719/1719 [==============================] - 3s 2ms/step - loss: 0.2496 - accuracy: 0.9101 - val_loss: 0.3160 - val_accuracy: 0.8900
Epoch 25/30
1719/1719 [==============================] - 3s 2ms/step - loss: 0.2448 - accuracy: 0.9118 - val_loss: 0.3115 - val_accuracy: 0.8864
Epoch 26/30
1719/1719 [==============================] - 3s 2ms/step - loss: 0.2414 - accuracy: 0.9126 - val_loss: 0.3020 - val_accuracy: 0.8906
Epoch 27/30
1719/1719 [==============================] - 3s 2ms/step - loss: 0.2365 - accuracy: 0.9149 - val_loss: 0.3274 - val_accuracy: 0.8818
Epoch 28/30
1719/1719 [==============================] - 4s 2ms/step - loss: 0.2321 - accuracy: 0.9173 - val_loss: 0.3153 - val_accuracy: 0.8878
Epoch 29/30
1719/1719 [==============================] - 3s 2ms/step - loss: 0.2293 - accuracy: 0.9184 - val_loss: 0.3064 - val_accuracy: 0.8876
Epoch 30/30
1719/1719 [==============================] - 3s 2ms/step - loss: 0.2254 - accuracy: 0.9190 - val_loss: 0.2931 - val_accuracy: 0.8950
*The fit method returns a History object, which contains the training parameters (history.params), the list of epochs (history.epoch), and, most importantly, a dictionary (history.history) with the loss and metrics measured at the end of each epoch on the training and validation sets.
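history.history is a plain dict of per-epoch lists, so it can be inspected directly, e.g. to pick the epoch with the best validation accuracy (the values below are the last three epochs from the log above):

```python
import numpy as np

# A slice of history.history: the last three epochs from the training log
history_dict = {
    "val_loss":     [0.3153, 0.3064, 0.2931],
    "val_accuracy": [0.8878, 0.8876, 0.8950],
}
best = int(np.argmax(history_dict["val_accuracy"]))
print(best, history_dict["val_loss"][best])  # 2 0.2931
```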
# Plot the learning curves
import pandas as pd
import matplotlib.pyplot as plt

pd.DataFrame(history.history).plot(figsize=(8, 5))
plt.grid(True)
plt.gca().set_ylim(0, 1)
plt.show()
*If you are not satisfied with the model's performance, tune the hyperparameters. The first thing to check is the learning rate; after that, try a different optimizer, then continue with the model's other hyperparameters. Use the evaluate method to measure performance on the test set:
model.evaluate(x_test, y_test)
#[57.97081756591797, 0.853600025177002]
*Note: the test loss here is abnormally high because x_test was never scaled by 255.0 the way the training and validation sets were; scaling it the same way (x_test = x_test / 255.0) brings the test loss in line with the validation loss.
V. Making Predictions with the Model
import numpy as np

x_new = x_test[:3]
y_proba = model.predict(x_new)
y_proba.round(2)
# model.predict_classes was removed in TF 2.6; take the argmax instead
y_pred = np.argmax(y_proba, axis=1)
np.array(class_names)[y_pred]
*Use the model's predict method to make predictions on new inputs: for each input instance, the model estimates the probability of each class, from class 0 to class 9.
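Turning the probability rows into class names is just a row-wise argmax followed by the name lookup shown earlier (the probability rows below are invented for illustration):

```python
import numpy as np

class_names = ["T-shirt/top", "Trouser", "Pullover", "Dress", "Coat",
               "Sandal", "Shirt", "Sneaker", "Bag", "Ankle boot"]

# Invented probability rows for 2 instances (each row sums to 1)
y_proba = np.array([
    [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.03, 0.97],
    [0.0, 0.0, 0.9, 0.0, 0.02, 0.0, 0.08, 0.0, 0.0, 0.0],
])
y_pred = np.argmax(y_proba, axis=1)  # most probable class per row
print(np.array(class_names)[y_pred])  # ['Ankle boot' 'Pullover']
```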
Source: https://www.cnblogs.com/wprgogogo/p/16191007.html