TensorFlow: the MNIST Dataset

The MNIST dataset contains 70,000 images in total:

  • 60,000 labeled 28×28-pixel images of handwritten digits 0–9, used for training
  • 10,000 labeled 28×28-pixel images of handwritten digits 0–9, used for testing
import tensorflow as tf
import matplotlib.pyplot as plt

# Load the dataset
mnist = tf.keras.datasets.mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()

# Take a quick look at the data to get a feel for it
print(x_train.shape, y_train.shape, x_test.shape, y_test.shape)
print(x_train[0].shape)
print(x_train[0])

# Visualize one sample (index chosen arbitrarily)
plt.imshow(x_train[24], cmap="gray")
plt.show()

Output:

(60000, 28, 28) (60000,) (10000, 28, 28) (10000,)
(28, 28)
[[  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   3  18  18  18 126 136
  175  26 166 255 247 127   0   0   0   0]
 [  0   0   0   0   0   0   0   0  30  36  94 154 170 253 253 253 253 253
  225 172 253 242 195  64   0   0   0   0]
 [  0   0   0   0   0   0   0  49 238 253 253 253 253 253 253 253 253 251
   93  82  82  56  39   0   0   0   0   0]
 [  0   0   0   0   0   0   0  18 219 253 253 253 253 253 198 182 247 241
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0  80 156 107 253 253 205  11   0  43 154
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0  14   1 154 253  90   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0 139 253 190   2   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0  11 190 253  70   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0  35 241 225 160 108   1
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0  81 240 253 253 119
   25   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0  45 186 253 253
  150  27   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0  16  93 252
  253 187   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0 249
  253 249  64   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0  46 130 183 253
  253 207   2   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0  39 148 229 253 253 253
  250 182   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0  24 114 221 253 253 253 253 201
   78   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0  23  66 213 253 253 253 253 198  81   2
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0  18 171 219 253 253 253 253 195  80   9   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0  55 172 226 253 253 253 253 244 133  11   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0 136 253 253 253 212 135 132  16   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0]]

The rendered image:

[figure: 1.png — grayscale rendering of x_train[24]]
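Each entry in the matrix printed above is a grayscale intensity from 0 (black background) to 255 (brightest stroke). The normalization applied before training simply divides by 255.0 to map these into [0, 1]; a tiny sketch of that mapping on a few values taken from the printout:

```python
# A few raw pixel values copied from the matrix above, rescaled to [0, 1].
raw = [0, 18, 126, 255]
scaled = [v / 255.0 for v in raw]
print(scaled)  # first value maps to 0.0, last to 1.0
```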

Handwritten Digit Recognition with Sequential

import tensorflow as tf


mnist = tf.keras.datasets.mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # Normalize input features: map pixel values from 0~255 into [0, 1], which the network handles better

model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(),  # flatten each 28x28 image into a 1-D array of 784 values
    tf.keras.layers.Dense(128, activation="relu"),  # first layer: 128 neurons with ReLU activation
    tf.keras.layers.Dense(10, activation="softmax")  # second layer: 10 neurons; softmax makes the outputs a probability distribution
])

# Configure training: Adam optimizer, loss function, and evaluation metric
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=False),
              metrics=["sparse_categorical_accuracy"])

model.fit(x_train, y_train, batch_size=32, epochs=5, validation_data=(x_test, y_test),
          validation_freq=1)
model.summary()
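After training, each input image yields a 10-way probability distribution from the softmax layer, and the predicted digit is the index with the highest probability. A minimal sketch of that decoding step (the probability vector below is made up for illustration, not an actual model output):

```python
# Hypothetical softmax output for one image: ten class probabilities summing to 1.
probs = [0.01, 0.02, 0.01, 0.05, 0.02, 0.70, 0.05, 0.04, 0.05, 0.05]

# The predicted class is the index of the largest probability (argmax).
predicted_digit = max(range(len(probs)), key=probs.__getitem__)
print(predicted_digit)  # → 5
```

With the real model, `model.predict` returns one such row per input image, and the same argmax step recovers the predicted digit.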

Output:

Epoch 5/5
1875/1875 [==============================] - 3s 1ms/step - loss: 0.0467 - sparse_categorical_accuracy: 0.9859 - val_loss: 0.0869 - val_sparse_categorical_accuracy: 0.9750
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
flatten (Flatten)            multiple                  0         
_________________________________________________________________
dense (Dense)                multiple                  100480    
_________________________________________________________________
dense_1 (Dense)              multiple                  1290      
=================================================================
Total params: 101,770
Trainable params: 101,770
Non-trainable params: 0
_________________________________________________________________
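The parameter counts in the summary can be checked by hand: a Dense layer has one weight per input-output pair plus one bias per neuron, and Flatten has no parameters. A quick sanity check of the numbers above:

```python
# Flatten turns each 28x28 image into 784 inputs and has no parameters itself.
flat = 28 * 28

# Dense params = inputs * units + biases (one per unit).
dense = flat * 128 + 128    # first Dense layer
dense_1 = 128 * 10 + 10     # second Dense layer

print(dense, dense_1, dense + dense_1)  # → 100480 1290 101770
```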