Train set, validation set and test set of a neural network

Using the results we can plot the way the cost changes as the network learns (this and the next four graphs were generated by the program overfitting.py):

[Figure: cost on the training data, epochs 200 through 399]
This looks encouraging, showing a smooth decrease in the cost, just as we expect. Note that I've only shown training epochs 200 through 399. This gives us a nice up-close view of the later stages of learning, which, as we'll see, turns out to be where the interesting action is.
Let's now look at how the classification accuracy on the test data changes over time:

[Figure: classification accuracy (%) on the test data, epochs 200 through 399]
Again, I've zoomed in quite a bit. In the first 200 epochs (not shown) the accuracy rises to just under 82 percent. The learning then gradually slows down. Finally, at around epoch 280 the classification accuracy pretty much stops improving. Later epochs merely see small stochastic fluctuations near the value of the accuracy at epoch 280. Contrast this with the earlier graph, where the cost associated to the training data continues to smoothly drop. If we just look at that cost, it appears that our model is still getting "better". But the test accuracy results show the improvement is an illusion. Just like the model that Fermi disliked, what our network learns after epoch 280 no longer generalizes to the test data. And so it's not useful learning. We say the network is overfitting or overtraining beyond epoch 280.
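The program overfitting.py isn't reproduced here, but the per-epoch bookkeeping behind graphs like these can be written as a minimal sketch. The net object and its train_epoch/total_cost/accuracy methods below are illustrative assumptions, not the actual API of overfitting.py, and accuracy is assumed to return a fraction:

# Minimal monitoring sketch; `net`, `training_data` and `test_data`
# are assumed to exist already (illustrative names, not overfitting.py's API).
training_costs, test_accuracies = [], []
for epoch in range(400):
    net.train_epoch(training_data, eta=0.5)               # one pass of SGD
    training_costs.append(net.total_cost(training_data))  # what the first graph plots
    test_accuracies.append(net.accuracy(test_data))       # what the second graph plots
    print("Epoch %d: training cost %.4f, test accuracy %.2f%%"
          % (epoch, training_costs[-1], 100 * test_accuracies[-1]))

Plotting the two recorded lists side by side is exactly what exposes the divergence described above: the first keeps falling while the second flattens out.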
To put it another way, you can think of the validation data as a type of training data that helps us learn good hyper-parameters.

The training and validation sets are used during training.

for each epoch
    for each training data instance
        propagate error through the network
        adjust the weights
        calculate the accuracy over training data
    for each validation data instance
        calculate the accuracy over the validation data
    if the threshold validation accuracy is met
        exit training
    else
        continue training

Once you've finished training, you run the network against your test set and verify that the accuracy is sufficient.
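Here is a hedged Python sketch of that loop. train_one_epoch and accuracy are assumed helpers (not from any particular library), and the 95% threshold is an arbitrary illustration:

# Sketch of the pseudocode above: train on the training set, check the
# validation set each epoch, stop once a target validation accuracy is
# reached, and only then touch the test set.
TARGET_VAL_ACCURACY = 0.95   # arbitrary illustrative threshold
MAX_EPOCHS = 1000

for epoch in range(MAX_EPOCHS):
    train_one_epoch(net, training_data)            # backprop + weight adjustment
    val_accuracy = accuracy(net, validation_data)  # no weight updates here
    if val_accuracy >= TARGET_VAL_ACCURACY:
        break                                      # threshold met: exit training

test_accuracy = accuracy(net, test_data)           # final, one-time check
print("Test accuracy: %.2f%%" % (100 * test_accuracy))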

The error surface will be different for different subsets of your data set (batch learning). Therefore, if you find a very good local minimum for your training set data, that may not be a very good point, and may even be a very bad point, on the surface generated by some other set of data for the same problem. Therefore you need a model which not only finds a good weight configuration for the training set but can also predict new data (data not in the training set) with low error. In other words, the network should generalize from the examples, so that it learns the underlying data and does not simply memorize the training set by overfitting it.
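For concreteness, here is one minimal way to carve a data set into the three subsets. The 70/15/15 split is a common but arbitrary choice, and the function name is ours, not from any particular library:

import random

def train_val_test_split(data, val_frac=0.15, test_frac=0.15, seed=0):
    # Shuffle a copy so every subset is drawn from the same distribution,
    # then slice off the test and validation portions.
    data = list(data)
    random.Random(seed).shuffle(data)
    n_test = int(len(data) * test_frac)
    n_val = int(len(data) * val_frac)
    test_data = data[:n_test]
    validation_data = data[n_test:n_test + n_val]
    training_data = data[n_test + n_val:]
    return training_data, validation_data, test_data

training_data, validation_data, test_data = train_val_test_split(range(1000))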
The validation data set is a set of examples of the function you want to learn which you are not using directly to train the network. You train the network with a set of data which you call the training data set. If you are using a gradient-based algorithm to train the network, then the error surface and the gradient at any point depend entirely on the training data set; thus the training data set is being used directly to adjust the weights. To make sure you don't overfit the network, you feed the validation data set to the network and check that its error stays within some range. Because the validation set is not used directly to adjust the weights of the network, a good error on the validation set (and also the test set) indicates that the network not only predicts the training examples well but is also expected to perform well on new examples that were not used in the training process.
Early stopping is a way to stop training. There are different variations available; the main outline is that both the training and validation set errors are monitored. The training error decreases at each iteration (backpropagation and its relatives), and at first the validation error decreases too. The training is stopped at the moment the validation error starts to rise. The weight configuration at this point indicates a model which predicts the training data well, as well as data the network has not seen. But because the validation data is used to select the weight configuration, it indirectly affects the model. This is where the test set comes in. This set of data is never used in the training process. Once a model is selected based on the validation set, the test set data is applied to the network model and the error for this set is found. This error is representative of the error we can expect from absolutely new data for the same problem.
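A minimal sketch of that procedure follows, again assuming the train_one_epoch helper from above plus an error helper and get_weights/set_weights methods on net (all illustrative names). The patience of 10 epochs is a common but arbitrary choice; many variants stop on the very first rise instead:

import copy

# Early-stopping sketch: keep the weights from the epoch with the lowest
# validation error, and stop once the validation error has failed to
# improve for PATIENCE consecutive epochs.
PATIENCE = 10
MAX_EPOCHS = 1000
best_val_error = float("inf")
best_weights = None
epochs_since_improvement = 0

for epoch in range(MAX_EPOCHS):
    train_one_epoch(net, training_data)
    val_error = error(net, validation_data)
    if val_error < best_val_error:
        best_val_error = val_error
        best_weights = copy.deepcopy(net.get_weights())
        epochs_since_improvement = 0
    else:
        epochs_since_improvement += 1
        if epochs_since_improvement >= PATIENCE:
            break                          # validation error has started to rise

net.set_weights(best_weights)              # restore the selected model
test_error = error(net, test_data)         # expected error on absolutely new data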

More details: function of train set, validation set and test set
