I. Introduction
Deep learning for image recognition.
II. Using Pretrained Networks
1. Load and display an image
img1 = imread('file01.jpg');   % read the image from file
imshow(img1)                   % display it
2. Prediction
deepnet = alexnet;                 % load the pretrained model
pred1 = classify(deepnet, img1);   % classify img1
3. Obtaining other pretrained models
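For example (a minimal sketch; googlenet and vgg16 are not used elsewhere in these notes and each requires its own support package to be installed):
net = googlenet;   % GoogLeNet pretrained network (requires its support package)
net = vgg16;       % VGG-16 pretrained network (requires its support package)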
4. Examine network layers
deepnet = alexnet;                  % load the pretrained network
ly = deepnet.Layers;                % layer array of the network
inlayer = ly(1);                    % input layer
insz = inlayer.InputSize;           % input layer size
outlayer = ly(end);                 % output layer
categorynames = outlayer.Classes;   % class names of the output layer
5. Investigating predictions
The classify function returns the predicted class for an input image, but is there a way to know how "confident" the network is in that classification? Taking this confidence into account can be important when deciding what to do with the output.
To classify an input into one of n classes, the network has an output layer of n neurons, one per class. Passing an input through the network computes a numeric value for each neuron, and these values represent the network's predicted probability that the input belongs to each class.
img = imread('file01.jpg');
imshow(img)
net = alexnet;
categorynames = net.Layers(end).ClassNames;
[pred, scores] = classify(net, img);      % get the prediction and the confidence scores
bar(scores);                              % display all scores
highscores = scores > 0.01;               % threshold the scores
bar(scores(highscores));                  % display only the thresholded scores
xticklabels(categorynames(highscores));   % add tick labels for those classes
III. Managing Collections of Data
1. Creating a datastore
ls *.jpg
net = alexnet;
imds = imageDatastore('file*.jpg');   % create a datastore of the matching image files
fname = imds.Files;                   % extract the file names
img = readimage(imds, 7);             % read the 7th image
preds = classify(net, imds);          % classify every image in the datastore
2. Preparing Images to Use as Input: Adjust input images
Process Images for Classification
img = imread('file01.jpg');
imshow(img);
sz = size(img);                    % size of the original image
net = alexnet;
insz = net.Layers(1).InputSize;    % input size the network expects
img = imresize(img, [227, 227]);   % resize the image to match the input layer
imshow(img);
3. Processing Images in a Datastore: (2/3) Creating an augmented image datastore
Resize Images in a Datastore
ls *.jpg
net = alexnet;
imds = imageDatastore('*.jpg');
auds = augmentedImageDatastore([227,227], imds);   % create an augmented image datastore that resizes images to 227x227
preds = classify(net, auds)
Processing Images in a Datastore: (3/3) Color preprocessing with augmented image datastores
An augmentedImageDatastore can also apply color preprocessing, for example converting grayscale images to RGB so that every image has the three color channels the network expects.
ls *.jpg
net = alexnet;
imds = imageDatastore('file*.jpg');
montage(imds);   % display the images in imds
auds = augmentedImageDatastore([227,227], imds, 'ColorPreprocessing', 'gray2rgb')   % create augmented image datastore with grayscale-to-RGB conversion
preds = classify(net, auds)
Create a Datastore Using Subfolders
net = alexnet;
flwrds = imageDatastore('Flowers', 'IncludeSubfolders',true);   % include images in all subfolders of Flowers
preds = classify(net,flwrds)
IV. Transfer Learning
1. Why transfer learning
(1) An existing pretrained network cannot effectively solve your particular problem.
(2) Training an entirely new network (a new architecture starting from random weights) requires expertise in network design, a large amount of training data, and a lot of computation time.
Transfer learning avoids most of that cost by reusing a pretrained network and retraining only the layers that need to change for the new task.
2. Components Needed for Transfer Learning: (1/2) The components of transfer learning
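In short, three things are needed, and the rest of this chapter covers each in turn: a pretrained network whose final layers are modified for the new task, labeled training images, and training algorithm options.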
3. Preparing Training Data: (1/3) Labeling images
Label Images in a Datastore
load pathToImages
flwrds = imageDatastore(pathToImages,'IncludeSubfolders',true);   % creates a datastore of 960 flower images
flowernames = flwrds.Labels
flwrds = imageDatastore(pathToImages,'IncludeSubfolders',true,'LabelSource','foldernames')   % create datastore with labels taken from the folder names
flowernames = flwrds.Labels   % extract the new labels
Preparing Training Data: (2/3) Split data for training and testing
Split Data for Training and Testing
This code creates a datastore of 960 flower images.
load pathToImages
flwrds = imageDatastore(pathToImages,'IncludeSubfolders',true,'LabelSource','foldernames')
Task 1: Split datastore
[flwrTrain, flwrTest] = splitEachLabel(flwrds, 0.6)   % first 60% of each label for training, the rest for testing
Task 2: Split datastore randomly
[flwrTrain, flwrTest] = splitEachLabel(flwrds, 0.8, 'randomized')   % randomly pick 80% of each label for training
Task 3: Split datastore by number of images
[flwrTrain, flwrTest] = splitEachLabel(flwrds,50)   % 50 images of each label for training
Preparing Training Data: (3/3) Augmented training data
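No code accompanied this part of the notes; a minimal sketch, assuming the flwrTrain datastore produced by the split above and AlexNet's 227x227 input size:
trainds = augmentedImageDatastore([227 227], flwrTrain);   % resize training images on the fly to the network's input size
The resulting augmented datastore can then be passed to trainNetwork in place of the raw image datastore.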
4. Fine-tuning approach
(1)Recall that a feed-forward network is represented in MATLAB as an array of layers. This makes it easy to index into the layers of a network and change them.
(2)To modify a preexisting network, you create a new layer
(3)then index into the layer array that represents the network and overwrite the chosen layer with the newly created layer.
(4) As with any indexed assignment in MATLAB, you can combine these steps into one line, as in the sketch below.
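A minimal sketch of that combined form, assuming AlexNet (whose 23rd layer is the final fully connected layer) and 12 target classes as in the tasks below:
anet = alexnet;                         % pretrained network
layers = anet.Layers;                   % its layer array
layers(23) = fullyConnectedLayer(12);   % create the replacement layer and assign it in one step
layers(end) = classificationLayer;      % same pattern for the output layer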
Modifying Network Layers: (2/2) Modify layers of a pretrained network
Modify Network Layers
This code imports AlexNet and extracts its layers.
anet = alexnet;
layers = anet.Layers
Task 1: Create new layer
fc = fullyConnectedLayer(12)   % new fully connected layer with 12 outputs, one per flower class
Task 2: Replace 23rd layer
layers(23) = fc   % overwrite AlexNet's final fully connected layer
Task 3: Replace last layer
layers(end) = classificationLayer   % new, untrained classification output layer
Setting Training Options
Set Training Options
Task 1: Set default options
opts = trainingOptions('sgdm');   % default options for stochastic gradient descent with momentum
Task 2: Set initial learning rate
opts = trainingOptions('sgdm','InitialLearnRate',0.001);   % lower the initial learning rate so the pretrained weights change only slightly
Training the Network: (4/4) Summary example
Transfer Learning Example Script
The code below implements transfer learning for the flower species example in this chapter. It is available as the script trainflowers.mlx in the course example files, which you can download from the help menu in the top-right corner. You can find more information on this dataset on the 17 Category Flower Dataset page from the University of Oxford.
Note that this example can take some time to run on a computer that does not have a supported GPU.
Get training images
flower_ds = imageDatastore('Flowers','IncludeSubfolders',true,'LabelSource','foldernames');
[trainImgs,testImgs] = splitEachLabel(flower_ds,0.6);
numClasses = numel(categories(flower_ds.Labels));
Create a network by modifying AlexNet
net = alexnet;
layers = net.Layers;
layers(end-2) = fullyConnectedLayer(numClasses);
layers(end) = classificationLayer;
Set training algorithm options
options = trainingOptions('sgdm','InitialLearnRate', 0.001);
Perform training
[flowernet,info] = trainNetwork(trainImgs, layers, options);
Use trained network to classify test images
testpreds = classify(flowernet,testImgs);
Evaluating Performance: (1/3) Evaluating training and test performance
Evaluate Performance
This code loads the training information of flowernet.
load pathToImages
load trainedFlowerNetwork flowernet info
Task 1: Plot training loss
plot(info.TrainingLoss)   % training loss recorded at each iteration
This code creates a datastore of the flower images.
dsflowers = imageDatastore(pathToImages,'IncludeSubfolders',true,'LabelSource','foldernames');
[trainImgs,testImgs] = splitEachLabel(dsflowers,0.98);
Task 2: Classify images
flwrPreds = classify(flowernet,testImgs)   % classify the held-out test images
Evaluating Performance: (2/3) Investigating test performance
Investigate test performance
This code sets up the Workspace for this activity.
load pathToImages.mat
pathToImages
flwrds = imageDatastore(pathToImages,'IncludeSubfolders',true,'LabelSource','foldernames');
[trainImgs,testImgs] = splitEachLabel(flwrds,0.98);
load trainedFlowerNetwork flwrPreds
Task 1: Extract labels
flwrActual = testImgs.Labels   % true labels of the test images
Task 2: Count correct
numCorrect = nnz(flwrPreds == flwrActual)   % number of correct predictions
Task 3: Calculate fraction correct
fracCorrect = numCorrect/numel(flwrPreds)   % overall accuracy on the test set
Task 4: Display confusion matrix
confusionchart(testImgs.Labels,flwrPreds)   % per-class breakdown of predictions
Evaluating Performance: (3/3) Improving performance
Transfer Learning Summary
Transfer Learning Function Summary
Create a network
alexnet: Load pretrained network "AlexNet"
supported networks: View list of available pretrained networks
fullyConnectedLayer: Create new fully connected network layer
classificationLayer: Create new output layer for a classification network

Get training images
imageDatastore: Create datastore reference to image files
augmentedImageDatastore: Preprocess a collection of image files
splitEachLabel: Divide datastore into multiple datastores

Set training algorithm options
trainingOptions: Create variable containing training algorithm options

Perform training
trainNetwork: Perform training

Use trained network to perform classifications
classify: Obtain trained network's classifications of input images

Evaluate trained network
nnz: Count non-zero elements in an array
confusionchart: Calculate confusion matrix
heatmap: Visualize confusion matrix as a heatmap
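The summary mentions heatmap, but the notes never show it in use; a minimal sketch, assuming the flwrPreds and testImgs variables from the evaluation tasks and that confusionmat (Statistics and Machine Learning Toolbox) is available:
[cm, order] = confusionmat(testImgs.Labels, flwrPreds);   % confusion counts and the class order they correspond to
heatmap(string(order), string(order), cm)                 % visualize the confusion counts as a heatmap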