GAN

Tue Aug 6 14:07:09 CST 2019

What I want to know

  • What is GAN?
  • Why does it work?
  • How does it learn the data distribution?
  • How can it help my project?

Materials

[Thesis] Learning to Synthesize and Manipulate Natural Images (SIG18 best PhD thesis)

[Video] Introduction to GANs, NIPS 2016, Ian Goodfellow, OpenAI (30mins)

<span id='nips2h'></span>
[Video] Ian Goodfellow: Generative Adversarial Networks (NIPS 2016 tutorial) (2hrs)

[WebPage] MSRA: What exactly is a generative adversarial network (GAN)?

<span id="web2"></span>
[WebPage] A Beginner's Guide to Generative Adversarial Networks (GANs)

Notes

Basics

A GAN consists of a generator \mathbf{G} and a discriminator \mathbf{D}. The generator's goal is to learn the data distribution; the discriminator's goal is to decide whether its input follows the real data distribution.

Optimization objective

\min_\mathbf{G}\max_\mathbf{D}\{E_{x\sim P_r}[\log\mathbf{D}(x)]+E_{x\sim P_g}[\log(1-\mathbf{D}(x))]\}
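As a concrete reading of the two expectation terms, here is a minimal sketch (PyTorch assumed, not part of the referenced materials). `d_real` and `d_fake` stand for \mathbf{D}'s probability outputs on a real batch and a generated batch; the helper names are placeholders of mine.

```python
import torch

def discriminator_loss(d_real, d_fake):
    # D maximizes E[log D(x_real)] + E[log(1 - D(G(z)))]; minimize the negative instead.
    return -(torch.log(d_real).mean() + torch.log(1.0 - d_fake).mean())

def generator_loss(d_fake):
    # G minimizes E[log(1 - D(G(z)))]; the non-saturating variant -E[log D(G(z))]
    # is often preferred in practice because it gives stronger gradients early on.
    return torch.log(1.0 - d_fake).mean()
```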

The two parts of the network (sketched in code after the list)

  • input noise z -> \mathbf{G} -> x sampled from \mathbf{G}, x=\mathbf{G}(z) -> \mathbf{D} -> \mathbf{G} tries to make \mathbf{D}(x)\rightarrow1, \mathbf{D} tries to make \mathbf{D}(x)\rightarrow0

  • x sampled from data -> \mathbf{D} -> \mathbf{D}(x)\rightarrow1
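A minimal sketch of these two parts, assuming PyTorch; the layer sizes (`latent_dim`, `data_dim`, hidden width 256) are illustrative placeholders rather than values from any referenced paper.

```python
import torch.nn as nn

latent_dim, data_dim = 100, 784   # illustrative sizes, e.g. flattened 28x28 images

# G: noise z -> fake sample x = G(z)
G = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, data_dim), nn.Tanh(),       # outputs scaled to [-1, 1]
)

# D: sample x -> probability that x came from the real data
D = nn.Sequential(
    nn.Linear(data_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)
```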

Training

The most straightforward training procedure is to iterate between D and G alternately: fix G and optimize D for a while, then fix D and optimize G, and repeat until the process converges.
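A minimal sketch of that alternating loop, reusing the toy `G`, `D`, and `discriminator_loss` sketched above; `dataloader` and the Adam learning rate are assumed placeholders. For simplicity it takes one D step and one G step per batch (using the non-saturating generator loss) rather than optimizing each side "for a while".

```python
import torch

opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)

for real in dataloader:                        # real: a batch of flattened training samples
    z = torch.randn(real.size(0), latent_dim)

    # --- fix G, update D ---
    fake = G(z).detach()                       # detach so this step does not update G
    loss_d = discriminator_loss(D(real), D(fake))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # --- fix D, update G (non-saturating generator loss) ---
    loss_g = -torch.log(D(G(z))).mean()
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```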

High level understanding

Basically, a GAN consists of a generator and a discriminator. The generator takes a feature vector, in this case random noise, and outputs a sample that mimics the data distribution. The discriminator takes a sample, either drawn from the original data or produced by the generator, and its job is to distinguish the real samples from the fake ones.

The discriminator is supervised by the ground-truth real/fake labels, while the generator is supervised by the discriminator.

Looked at separately, the discriminator performs a standard binary classification task: real or fake. The network returns a probability between 0 and 1, where 1 means the sample is predicted to be authentic and 0 means it is predicted to be fake.

The generator takes a feature vector and expands it into a data sample, the reverse of what the discriminator does. Its goal is to learn the data distribution. How can you tell whether the distribution has been learned successfully? The generator produces data from random noise; if the discriminator judges that data to be real, the generator can be said to have learned the real distribution of the dataset.
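One rough way to see this with the toy `G`/`D` sketched above: at the ideal equilibrium the discriminator can no longer separate real from fake, so \mathbf{D}(\mathbf{G}(z)) should drift toward 0.5.

```python
with torch.no_grad():
    z = torch.randn(1000, latent_dim)
    print(D(G(z)).mean())   # drifts toward ~0.5 as G matches the data and D can no longer tell
```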

The question a generative algorithm tries to answer is: Assuming this email is spam, how likely are these features (words it contains)? While discriminative models care about the relation between y (label) and x (data), generative models care about “how you get x.” They allow you to capture p(x\vert y), the probability of x given y, or the probability of features given a label or category.

A discriminative model, on the other hand, captures p(y\vert x), the probability of y given x: given an email (the words it contains), how likely is it to be spam?
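A toy contrast of the two views on the spam example, assuming scikit-learn; the four emails and their labels are made up for illustration. Naive Bayes models p(x\vert y) per class and inverts it with Bayes' rule, while logistic regression models p(y\vert x) directly.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB          # generative: models p(x | y)
from sklearn.linear_model import LogisticRegression    # discriminative: models p(y | x)

emails = ["win free money now", "meeting at noon", "free prize claim now", "lunch tomorrow"]
labels = [1, 0, 1, 0]                                  # 1 = spam, 0 = not spam

X = CountVectorizer().fit_transform(emails)
print(MultinomialNB().fit(X, labels).predict_proba(X[:1]))       # p(y | x) obtained via Bayes' rule over p(x | y)
print(LogisticRegression().fit(X, labels).predict_proba(X[:1]))  # p(y | x) modeled directly
```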

GANs, Encoder-decoder, Autoencoders and VAEs

The generator in a GAN seems to serve a similar function to the decoder in an encoder-decoder network. What are the differences?

[quote]
You can bucket generative algorithms into one of three types:

  • Given a label, they predict the associated features (Naive Bayes)
  • Given a hidden representation, they predict the associated features (VAE, GAN)
  • Given some of the features, they predict the rest (inpainting, imputation)

Applications

  • Same domain
    • super resolution
    • image inpainting / repair
  • Cross domain transfer
    • 2d to 3d
    • text to image
    • picture style transfer
  • Learn joint distribution
    • learn attributes from images

Notes on NIPS tutorial

Roadmap

  • Why study generative modeling?
  • How do generative models work? How do GANs compare to others?
  • How do GANs work?
  • Tips and tricks
  • Research frontiers
  • Combining GANs with other methods