Training a Vehicle Classifier with VGG16

First, load the VGG16 convolutional base pretrained on ImageNet. include_top=False drops the original fully connected classifier, and input_shape=(150,150,3) matches the 150×150 RGB images used throughout.

from keras.applications import VGG16
conv_base = VGG16(weights='imagenet',
                    include_top=False,
                    input_shape=(150,150,3))
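
A quick optional sanity check: with include_top=False and a 150×150 input, the base maps each image to a 4×4×512 feature map, which is also what the model summary below reports.

print(conv_base.output_shape)   # (None, 4, 4, 512)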

import os 
import numpy as np 
from keras.preprocessing.image import ImageDataGenerator
base_dir = '/Users/chenyin/Documents/深度學(xué)習(xí)/car'
train_dir = os.path.join(base_dir,'train')
validation_dir = os.path.join(base_dir,'val')
test_dir = os.path.join(base_dir,'test')
from keras import models
from keras import layers
from keras import optimizers
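
flow_from_directory (used below) expects one subfolder per class. The dataset here is presumably laid out roughly like this (the ten class names appear in the labels list near the end; the test folder is flat, since the prediction code later lists its image files directly):

car/
├── train/   # 1400 images in 10 class subfolders (SUV, bus, family sedan, ...)
├── val/     # 200 images in the same 10 subfolders
└── test/    # 200 images, flat, used for spot-check predictions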

Add the classification head

Stack a Flatten layer and two Dense layers on top of the convolutional base:

model = models.Sequential()
model.add(conv_base)
model.add(layers.Flatten())
model.add(layers.Dense(256,activation='relu'))
model.add(layers.Dense(10, activation='softmax'))  # softmax, not sigmoid: single-label, 10-class output
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
vgg16 (Model)                (None, 4, 4, 512)         14714688  
_________________________________________________________________
flatten (Flatten)            (None, 8192)              0         
_________________________________________________________________
dense (Dense)                (None, 256)               2097408   
_________________________________________________________________
dense_1 (Dense)              (None, 10)                2570      
=================================================================
Total params: 16,814,666
Trainable params: 16,814,666
Non-trainable params: 0
_________________________________________________________________
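
Where those counts come from: Flatten turns the 4×4×512 feature map into 4·4·512 = 8192 values; the first Dense layer therefore has 8192×256 + 256 = 2,097,408 parameters, the output layer 256×10 + 10 = 2,570, and the VGG16 base contributes the remaining 14,714,688, for 16,814,666 in total.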
Before freezing, all 30 weight tensors are trainable: 13 convolutional layers plus the two new Dense layers, each contributing a kernel and a bias. Freezing the base (which must happen before compiling) leaves only the 4 Dense tensors.

len(model.trainable_weights)
30
conv_base.trainable = False
len(model.trainable_weights)
4
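
For reference, a common second stage once the frozen-base training below converges is fine-tuning: unfreeze only the top convolutional block and recompile with a very small learning rate. A minimal sketch, not used in this run (block5 layer names are standard in Keras's VGG16):

conv_base.trainable = True
for layer in conv_base.layers:
    # train only the last convolutional block; keep everything else frozen
    layer.trainable = layer.name.startswith('block5')
model.compile(loss='sparse_categorical_crossentropy',
              optimizer=optimizers.RMSprop(learning_rate=1e-5),
              metrics=['acc'])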
# Augment the small training set with random transforms; validation and test data are only rescaled.
train_datagen = ImageDataGenerator(
    rescale=1./255,
    rotation_range=40,
    width_shift_range=0.2,
    height_shift_range=0.2,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True,
    fill_mode='nearest'
)
test_datagen = ImageDataGenerator(rescale=1./255)
train_generator = train_datagen.flow_from_directory(
    train_dir,
    target_size=(150,150),
    batch_size=20,
    class_mode='sparse'  # integer class labels, matching sparse_categorical_crossentropy
)
validation_generator = test_datagen.flow_from_directory(
    validation_dir,
    target_size=(150,150),
    batch_size=20,
    class_mode='sparse'  # integer class labels, matching sparse_categorical_crossentropy
)
Found 1400 images belonging to 10 classes.
Found 200 images belonging to 10 classes.
model.compile(loss='sparse_categorical_crossentropy',
    optimizer=optimizers.RMSprop(learning_rate=2e-5),  # lr was renamed to learning_rate in newer Keras
    metrics=['acc'])
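Optionally, standard Keras callbacks can checkpoint the best weights and stop a stalled run early. A minimal sketch, not used in the run below (pass callbacks=callbacks to model.fit to enable it):

from keras.callbacks import ModelCheckpoint, EarlyStopping
callbacks = [
    ModelCheckpoint('car_Vgg16_best.h5', monitor='val_loss', save_best_only=True),
    EarlyStopping(monitor='val_loss', patience=5, restore_best_weights=True),
]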
history = model.fit(
    train_generator,
    steps_per_epoch=100,
    epochs=30,
    validation_data=validation_generator,
    validation_steps=50
)
Epoch 1/30
(TensorFlow AutoGraph printed "could not transform … will run it as-is" warnings for the train and test functions here; they are harmless and can be silenced by decorating the functions with @tf.autograph.experimental.do_not_convert.)
100/100 [==============================] - 199s 2s/step - loss: 2.2270 - acc: 0.2045 - val_loss: 2.0976 - val_acc: 0.3400
Epoch 2/30
100/100 [==============================] - 194s 2s/step - loss: 1.9879 - acc: 0.3965 - val_loss: 1.8054 - val_acc: 0.5260
Epoch 3/30
100/100 [==============================] - 198s 2s/step - loss: 1.7472 - acc: 0.5060 - val_loss: 1.5587 - val_acc: 0.5880
Epoch 4/30
100/100 [==============================] - 178s 2s/step - loss: 1.5686 - acc: 0.5600 - val_loss: 1.3564 - val_acc: 0.6710
Epoch 5/30
100/100 [==============================] - 179s 2s/step - loss: 1.4308 - acc: 0.5940 - val_loss: 1.2241 - val_acc: 0.6850
Epoch 6/30
100/100 [==============================] - 185s 2s/step - loss: 1.3169 - acc: 0.6170 - val_loss: 1.0816 - val_acc: 0.7150
Epoch 7/30
100/100 [==============================] - 179s 2s/step - loss: 1.2277 - acc: 0.6530 - val_loss: 1.0170 - val_acc: 0.7210
Epoch 8/30
100/100 [==============================] - 178s 2s/step - loss: 1.1479 - acc: 0.6790 - val_loss: 0.9509 - val_acc: 0.7530
Epoch 9/30
100/100 [==============================] - 179s 2s/step - loss: 1.0915 - acc: 0.6845 - val_loss: 0.8450 - val_acc: 0.7820
Epoch 10/30
100/100 [==============================] - 177s 2s/step - loss: 1.0540 - acc: 0.6915 - val_loss: 0.8223 - val_acc: 0.7720
Epoch 11/30
100/100 [==============================] - 182s 2s/step - loss: 0.9687 - acc: 0.7200 - val_loss: 0.7857 - val_acc: 0.7700
Epoch 12/30
100/100 [==============================] - 179s 2s/step - loss: 0.9658 - acc: 0.7180 - val_loss: 0.7357 - val_acc: 0.7820
Epoch 13/30
100/100 [==============================] - 178s 2s/step - loss: 0.9198 - acc: 0.7295 - val_loss: 0.7101 - val_acc: 0.7780
Epoch 14/30
100/100 [==============================] - 181s 2s/step - loss: 0.8924 - acc: 0.7425 - val_loss: 0.6926 - val_acc: 0.7940
Epoch 15/30
100/100 [==============================] - 179s 2s/step - loss: 0.8465 - acc: 0.7480 - val_loss: 0.6716 - val_acc: 0.7920
Epoch 16/30
100/100 [==============================] - 178s 2s/step - loss: 0.8407 - acc: 0.7435 - val_loss: 0.6581 - val_acc: 0.7780
Epoch 17/30
100/100 [==============================] - 176s 2s/step - loss: 0.8111 - acc: 0.7490 - val_loss: 0.5969 - val_acc: 0.7950
Epoch 18/30
100/100 [==============================] - 182s 2s/step - loss: 0.7752 - acc: 0.7705 - val_loss: 0.5955 - val_acc: 0.8000
Epoch 19/30
100/100 [==============================] - 176s 2s/step - loss: 0.7615 - acc: 0.7735 - val_loss: 0.5915 - val_acc: 0.8010
Epoch 20/30
100/100 [==============================] - 178s 2s/step - loss: 0.7400 - acc: 0.7790 - val_loss: 0.5745 - val_acc: 0.8060
Epoch 21/30
100/100 [==============================] - 175s 2s/step - loss: 0.7564 - acc: 0.7735 - val_loss: 0.5657 - val_acc: 0.7950
Epoch 22/30
100/100 [==============================] - 180s 2s/step - loss: 0.7097 - acc: 0.7865 - val_loss: 0.5864 - val_acc: 0.7930
Epoch 23/30
100/100 [==============================] - 179s 2s/step - loss: 0.7047 - acc: 0.7930 - val_loss: 0.5537 - val_acc: 0.8150
Epoch 24/30
100/100 [==============================] - 175s 2s/step - loss: 0.6993 - acc: 0.7830 - val_loss: 0.5388 - val_acc: 0.7970
Epoch 25/30
100/100 [==============================] - 178s 2s/step - loss: 0.6872 - acc: 0.7900 - val_loss: 0.5653 - val_acc: 0.8100
Epoch 26/30
100/100 [==============================] - 180s 2s/step - loss: 0.6430 - acc: 0.8040 - val_loss: 0.5413 - val_acc: 0.7990
Epoch 27/30
100/100 [==============================] - 176s 2s/step - loss: 0.6613 - acc: 0.8005 - val_loss: 0.5245 - val_acc: 0.8290
Epoch 28/30
100/100 [==============================] - 177s 2s/step - loss: 0.6185 - acc: 0.8040 - val_loss: 0.5202 - val_acc: 0.8300
Epoch 29/30
100/100 [==============================] - 178s 2s/step - loss: 0.6425 - acc: 0.8085 - val_loss: 0.5097 - val_acc: 0.8150
Epoch 30/30
100/100 [==============================] - 178s 2s/step - loss: 0.6132 - acc: 0.8145 - val_loss: 0.5098 - val_acc: 0.8410
model.save("./car_Vgg16.h5")#保存模型
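
The saved HDF5 file can be reloaded later, e.g. in a fresh session, before running the prediction code below:

from keras.models import load_model
model = load_model("./car_Vgg16.h5")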
import matplotlib.pyplot as plt

acc = history.history['acc']
val_acc = history.history['val_acc']
loss = history.history['loss']
val_loss = history.history['val_loss']
epochs = range(1, len(acc) + 1)
plt.plot(epochs, acc, 'bo', label='Training acc') 
plt.plot(epochs, val_acc, 'b', label='Validation acc') 
plt.title('Training and validation accuracy') 
plt.legend()

plt.figure()

plt.plot(epochs, loss, 'bo', label='Training loss') 
plt.plot(epochs, val_loss, 'b', label='Validation loss') 
plt.title('Training and validation loss') 
plt.legend()

plt.show()
test_list=os.listdir(test_dir)
# train_generator.class_indices.keys() returns the class names in this order
labels=['SUV', 'bus', 'family sedan', 'fire engine', 'heavy truck', 'jeep', 'minibus', 'racing car', 'taxi', 'truck']
from keras.preprocessing import image
img_path=os.path.join(test_dir,test_list[180])
img = image.load_img(img_path,target_size=(150,150))
img.show()
x=image.img_to_array(img)
x=np.expand_dims(x,axis=0)
x=x/255.0  # match the 1./255 rescaling applied during training
labels[np.argmax(model.predict(x), axis=1)[0]]  # predict_classes was removed in newer Keras
'jeep'
len(test_list)
200
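
To label every image in the flat test folder rather than just one, a minimal sketch (assuming all 200 entries are image files; there are no ground-truth labels here, so this yields predictions only, not an accuracy figure):

preds = []
for fname in test_list:
    img = image.load_img(os.path.join(test_dir, fname), target_size=(150, 150))
    x = np.expand_dims(image.img_to_array(img), axis=0) / 255.0
    preds.append(labels[np.argmax(model.predict(x), axis=1)[0]])
print(list(zip(test_list, preds))[:5])   # first few (filename, predicted class) pairs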
