Diary

Start from GAN
If all data are fed in, after enough iterations every output of the generator becomes a 1 (on the MNIST dataset), which is the simplest digit --> "the generator fools the discriminator with garbage"
Training a GAN for each class individually --> 1. the GAN structure suits some classes, but training on other classes leads to mode collapse; 2. it is not easy to select a model for each class

then to conditional GAN
similar structure, but with a one-hot label concatenated to the inputs of G and D (a minimal sketch follows below)
Advantage: no need to train a model for each class individually
Note: learning rate set to 0.001; 0.0001 leads to bad results
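A minimal sketch of this label conditioning (PyTorch; the dense layer sizes here are illustrative assumptions, not the exact model used in these experiments):

```python
import torch
import torch.nn as nn

NUM_CLASSES, LATENT_DIM, IMG_DIM = 10, 100, 28 * 28  # MNIST-sized assumptions

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        # input: noise vector concatenated with a one-hot class label
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM + NUM_CLASSES, 256),
            nn.ReLU(),
            nn.Linear(256, IMG_DIM),
            nn.Tanh(),
        )

    def forward(self, z, one_hot):
        return self.net(torch.cat([z, one_hot], dim=1))

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        # input: flattened image concatenated with the same one-hot label
        self.net = nn.Sequential(
            nn.Linear(IMG_DIM + NUM_CLASSES, 256),
            nn.LeakyReLU(0.2),
            nn.Linear(256, 1),
            nn.Sigmoid(),
        )

    def forward(self, x, one_hot):
        return self.net(torch.cat([x, one_hot], dim=1))
```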

then ACGAN
Current tests show that ACGAN does not work well with two dense layers; the reason might be that ACGAN only works with a convolutional D and G (see the discriminator sketch below)
todo: pretrain D
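For reference, a minimal sketch of what distinguishes the ACGAN discriminator: a shared trunk with two heads, one real/fake output and one auxiliary class output (PyTorch; the convolutional trunk and sizes are illustrative assumptions for 28x28 inputs):

```python
import torch.nn as nn

class ACDiscriminator(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        # convolutional trunk (illustrative sizes for 28x28 single-channel inputs)
        self.trunk = nn.Sequential(
            nn.Conv2d(1, 64, 4, stride=2, padding=1),    # 28 -> 14
            nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1),  # 14 -> 7
            nn.LeakyReLU(0.2),
            nn.Flatten(),
        )
        feat = 128 * 7 * 7
        self.adv_head = nn.Linear(feat, 1)            # real/fake logit
        self.cls_head = nn.Linear(feat, num_classes)  # auxiliary class logits

    def forward(self, x):
        h = self.trunk(x)
        return self.adv_head(h), self.cls_head(h)
```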

then Wasserstein GAN


  1. January
    refine the proposal

10-12. January

  • implement a DC classifier in preparation for implementing the discriminator
  • read Improved GAN; focus on this paper in the following days
  1. January
  • DC classifier has no bugs, but performs awfully
  • install theano and lasagne to run the improvedGAN code
  1. - 19. January
  • finally install theano and its GPU backend correctly and fix a lot of deprecation issues
  1. January
  • try to translate it to keras; find a way to implement the loss function
  1. January
  • the translation to keras is way too complicated; first try paviaU in the original theano code
  • the 1D improved GAN trains too badly on paviaU (maybe the training data is the reason; check the training and testing data and resave them)
  1. January
  • prepare questions for tomorrow's meeting:
  • the loss function in the code does not match the loss in the paper, and the former has a very strange form
  • l_lab and train_err are the same thing
  • there is no explicit implementation of the K+1 class (see the note below)
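These three observations may be connected: in the Improved GAN formulation the logit of the (K+1)-th "fake" class can be fixed at 0, so the unlabeled and generated terms reduce to softplus of a log-sum-exp over the K real-class logits, which makes the code look quite different from the loss written in the paper. A minimal PyTorch sketch of that reading (my reconstruction, not the original theano code):

```python
import torch
import torch.nn.functional as F

def improved_gan_d_loss(logits_lab, labels, logits_unl, logits_gen):
    """Discriminator loss of Improved GAN (Salimans et al., 2016),
    with the (K+1)-th 'fake' logit implicitly fixed at 0."""
    # supervised part: ordinary cross entropy over the K real classes
    loss_lab = F.cross_entropy(logits_lab, labels)

    # with the fake logit at 0, D(x) = Z(x) / (Z(x) + 1),
    # where log Z(x) = logsumexp over the K real-class logits
    z_unl = torch.logsumexp(logits_unl, dim=1)
    z_gen = torch.logsumexp(logits_gen, dim=1)

    # -log D(x_unl) - log(1 - D(G(z))), rewritten with softplus
    loss_unl = (-z_unl + F.softplus(z_unl)).mean() + F.softplus(z_gen).mean()
    return loss_lab + loss_unl
```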
  1. February
  • as to the 3d convolution, an idea: set stride=(1,1,2), which downsamples only the spectral dimension (first sketch after this list)
  • try a semi-supervised gan on the mnist dataset: the discriminator classifies labeled samples into the real classes and generated samples as class k+1; for unlabeled training data, set the target label to [0.1, 0.1, 0.1, ..., 0], i.e. uniform over the real classes and zero for k+1 (second sketch after this list)
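A quick check of the stride idea, assuming the spectral axis is the last dimension of the 5-D input tensor (PyTorch; channel counts and kernel size are illustrative):

```python
import torch
import torch.nn as nn

# input layout: (batch, channels, height, width, spectral)
x = torch.randn(4, 1, 9, 9, 103)  # e.g. 9x9 paviaU patches, 103 bands

# stride=(1,1,2) keeps the spatial size and halves only the spectral axis
conv = nn.Conv3d(1, 16, kernel_size=(3, 3, 7), stride=(1, 1, 2), padding=(1, 1, 3))
print(conv(x).shape)  # torch.Size([4, 16, 9, 9, 52])
```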
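And a minimal sketch of the unlabeled-sample target from the second bullet (hypothetical helper names; K=10 real classes plus one fake class):

```python
import torch

K = 10  # number of real classes; index K is the generated/"fake" class

def unlabeled_target(batch_size):
    """Soft target [0.1, ..., 0.1, 0]: uniform over the K real classes,
    zero probability on the K+1 (fake) class."""
    t = torch.full((batch_size, K + 1), 1.0 / K)
    t[:, K] = 0.0
    return t

def soft_cross_entropy(logits, target):
    # cross entropy against a soft target distribution
    return -(target * torch.log_softmax(logits, dim=1)).sum(dim=1).mean()
```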
  1. Feb. - 9. Feb.
  • 1D tryout seems good; needs more tests
  1. March
    ready to test:
  • (replace conv3d with conv2d)
  • different training data size (count)
  • different patch size
  • different channel number
  • (different batch size)
  • (different depthwise conv channel counts)
  1. March
    found a case: randomly choosing 200 samples from the whole image as the training set gives much better results than randomly choosing 200 samples from the training set

  2. April

  • email to cluster team
  • try cross validation
  • ask Amir how to determine the final result
  • read the "discr_loss" blog, and try their code
  • read gan paper
  1. April
  • adam vs sgd
    the validation curve when using adam goes up and down --> not suitable for a normal early stopping algorithm
    possible fix: use a smaller learning rate

  • alternative for progress (early stopping)
    do not compute the ratio of the average training loss to the minimum training loss within a training strip, but the ratio of the current average training loss to the past average training loss (sketch below)
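A small sketch of both variants, following Prechelt-style training strips (the strip handling and names are my assumptions):

```python
def progress_standard(strip_losses):
    """Prechelt's P_k: how much the average training loss within the
    current strip exceeds the minimum loss in that strip (in permille)."""
    avg = sum(strip_losses) / len(strip_losses)
    return 1000.0 * (avg / min(strip_losses) - 1.0)

def progress_alternative(strip_losses, prev_strip_losses):
    """Proposed variant: compare the current strip's average loss
    to the previous strip's average loss instead of the strip minimum."""
    avg = sum(strip_losses) / len(strip_losses)
    prev_avg = sum(prev_strip_losses) / len(prev_strip_losses)
    return 1000.0 * (avg / prev_avg - 1.0)
```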

  • learning rate decay strategy

  • optimizer for G and optimizer for D

  • use the cross entropy loss of only the first 9 labels to decide when to early-stop

  • double-check the Dataloader in demoGAN (Zhu et al.) (pytorch)

  1. April
  • test feature matching, starting from a one-layer model (ssgan_improved_pytorch); a sketch of the loss is below
  • try to implement a custom loss function, as in keras
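A minimal sketch of the feature-matching generator loss from Improved GAN (PyTorch; `d_features` is a hypothetical callable returning an intermediate discriminator activation):

```python
import torch

def feature_matching_loss(d_features, real_x, fake_x):
    """Improved-GAN feature matching: match the batch-mean intermediate
    discriminator features of real and generated samples."""
    f_real = d_features(real_x).mean(dim=0).detach()  # treat real-batch stats as constants
    f_fake = d_features(fake_x).mean(dim=0)
    return ((f_real - f_fake) ** 2).mean()
```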