Deep Learning Modeling Workflow Summary

  1. Environment setup:

    Hardware: GPU, CPU

    Software: Ubuntu, TensorFlow (GPU build), Seaborn, Matplotlib

    Requirement: the TensorFlow-GPU version must match the GPU software stack (CUDA/cuDNN driver versions).

    Tips:

    # Watch GPU usage from the command line, refreshing every 10 seconds
    watch -n 10 nvidia-smi
    # One-off snapshot of GPU usage
    nvidia-smi
    

    Tips on applying for and using AWS academic servers.

  2. Modeling workflow:

    1. Choose a neural network framework: Keras, TensorFlow, PyTorch, ...
    2. Preprocess the existing data to fit the chosen framework's data types. Make good use of pandas and sklearn to load the dataset (read_csv), split it (train_test_split), and handle missing values; use seaborn and matplotlib to visualize the data and support intuition. A sketch of these steps follows this list.
    3. Build the model
    4. Evaluate model performance
    5. Perform targeted error analysis
    6. Use the model to predict and generate results
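
    A minimal Python/Keras sketch of steps 2-6, assuming a hypothetical train.csv whose rows hold pixel values plus a "label" column; every file name and hyperparameter here is illustrative, not prescriptive:

    import pandas as pd
    from sklearn.model_selection import train_test_split
    from tensorflow import keras

    # Step 2: load, clean, and split the data ("train.csv" and the
    # "label" column are placeholder names).
    df = pd.read_csv("train.csv")
    df = df.fillna(df.median(numeric_only=True))    # simple missing-value handling
    X = df.drop(columns=["label"]).values / 255.0   # scale pixel values to [0, 1]
    y = df["label"].values
    X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.1)

    # Step 3: build a small fully connected model.
    model = keras.Sequential([
        keras.Input(shape=(X.shape[1],)),
        keras.layers.Dense(128, activation="relu"),
        keras.layers.Dropout(0.5),
        keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    # Steps 4-6: train with validation, then predict.
    model.fit(X_train, y_train, validation_data=(X_val, y_val), epochs=5)
    predictions = model.predict(X_val).argmax(axis=1)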
  3. Development tips:

    1. Make good use of libraries: sklearn, seaborn, matplotlib, pandas

Useful Q&A:

  1. Q: Just one small question: why do you take batch_size to be 86? Is it just a random value, or does it change the result?

    A: It would be really interesting to hear from the author about this, but I believe you would get pretty much the same results if you chose 32, 64, or 128 as the batch size. It might even run faster because of CPU optimizations...

    A: Batch size is mainly a constraint on your own computer. The larger the batch size, the more data you are chunking into memory for your model to train on at once; the smaller the batch size, the less data per step, but your computation will be slower.

    It's a tradeoff between speed and memory.
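
    To make the trade-off concrete: batch size is just an argument to the training call. A minimal Keras sketch on synthetic stand-in data (all shapes and names are made up for illustration):

    import numpy as np
    from tensorflow import keras

    # Synthetic stand-in data, just to illustrate the batch_size argument.
    X = np.random.rand(1000, 20).astype("float32")
    y = np.random.randint(0, 2, size=1000)

    model = keras.Sequential([
        keras.Input(shape=(20,)),
        keras.layers.Dense(16, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")

    # batch_size sets how many samples go through the network per gradient
    # update: larger batches use more memory per step, smaller batches mean
    # more (and therefore slower overall) steps per epoch.
    model.fit(X, y, batch_size=86, epochs=1)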

  2. Q: In my first try I used:

    In -> [ Conv2D (3,3) -> relu -> MaxPool2D ]*2 -> Conv2D (3,3) -> relu -> MaxPool2D -> Flatten -> Dense -> Dropout -> Out

    (I got good accuracy in the Cats vs. Dogs competition with this architecture), and the accuracy here was 0.95.

    How can we find a good CNN architecture for any given type of problem?

    A: There are many convolutional neural network models proposed in papers, and each tends to give better accuracy than the one before it: AlexNet performs better than LeNet, and GoogLeNet is better than AlexNet. But in general, with some error analysis and trials, you should find the number of layers and the architecture that fit the task.

    Q: So, when facing a new image problem, how should beginners start their neural network? Any suggestions for a starting architecture? Thanks.

    A: You may try to find a paper or an algorithm that has been proven to work well on similar tasks, then modify it to fit your task according to the results you get. You may also avoid reinventing the wheel: instead of implementing an algorithm from scratch, use one of the well-known architectures from ImageNet or other challenges, such as VGG-16, VGG-19, or YOLO, depending on the task. Transfer learning makes training easier because the network comes pre-trained, but you will have to decide how many layers to freeze according to the training data you have.
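
    A minimal sketch of that transfer-learning recipe in Keras, assuming a hypothetical 10-class task with 224x224 RGB inputs:

    from tensorflow import keras

    # Load VGG-16 pre-trained on ImageNet, without its classification head.
    base = keras.applications.VGG16(include_top=False,
                                    weights="imagenet",
                                    input_shape=(224, 224, 3))
    base.trainable = False  # freeze every convolutional layer

    # Attach a small task-specific head on top of the frozen features.
    model = keras.Sequential([
        base,
        keras.layers.GlobalAveragePooling2D(),
        keras.layers.Dense(256, activation="relu"),
        keras.layers.Dropout(0.5),
        keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    A common next step, once the new head has converged, is to unfreeze the top convolutional block and fine-tune it with a low learning rate.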

  3. Q:

    1. Training accuracy seems to be lower than validation accuracy. Is this because the training data is augmented and thus harder to identify than the validation data?
    2. You chose In -> [[Conv2D->relu]*2 -> MaxPool2D -> Dropout]*2 -> Flatten -> Dense -> Dropout -> Out as your CNN structure. Could you provide some reasoning for stacking two Conv2D layers before max pooling? Why is this structure better than In -> [Conv2D -> relu -> MaxPool2D -> Dropout]*2 -> Flatten -> Dense -> Dropout -> Out?

    A:

    1. Yes, exactly!

    2. This dataset is composed of digit images of the same small size, and the images are already somewhat normalized, so we are facing an easy problem; there is no need for a very deep network. Still, it is better to stack consecutive Conv+ReLU layers before each max-pooling layer; with this technique you can increase the number of filters from block to block. Take a look at GoogLeNet or the VGG-16/19 networks: they are very deep, but very well built to extract features from images.
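
    For reference, a minimal Keras sketch of the [[Conv2D->relu]*2 -> MaxPool2D -> Dropout]*2 structure discussed above; the filter counts, dropout rates, and 28x28 grayscale input are assumptions for an MNIST-style digits task:

    from tensorflow import keras

    model = keras.Sequential([
        keras.Input(shape=(28, 28, 1)),
        # Block 1: two stacked Conv+ReLU layers, then pooling and dropout.
        keras.layers.Conv2D(32, (3, 3), activation="relu", padding="same"),
        keras.layers.Conv2D(32, (3, 3), activation="relu", padding="same"),
        keras.layers.MaxPooling2D((2, 2)),
        keras.layers.Dropout(0.25),
        # Block 2: same pattern with twice the filters.
        keras.layers.Conv2D(64, (3, 3), activation="relu", padding="same"),
        keras.layers.Conv2D(64, (3, 3), activation="relu", padding="same"),
        keras.layers.MaxPooling2D((2, 2)),
        keras.layers.Dropout(0.25),
        # Head: Flatten -> Dense -> Dropout -> Out.
        keras.layers.Flatten(),
        keras.layers.Dense(256, activation="relu"),
        keras.layers.Dropout(0.5),
        keras.layers.Dense(10, activation="softmax"),
    ])
    model.summary()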
