Train word2vec in TensorFlow

Let’s apply these steps to building our skip-gram word2vec model.

Phase 1: Assemble the graph

  1. Define placeholders for input and output
    Input is the center word and output is the target (context) word. Instead of using one-hot vectors, we input the index of those words directly. For example, if the center word is the 1001st word in the vocabulary, we input the number 1001.
    Each sample input is a scalar, so the placeholder for BATCH_SIZE sample inputs will have shape [BATCH_SIZE].
    Similarly, the placeholder for BATCH_SIZE sample outputs will have shape [BATCH_SIZE, 1], since tf.nn.nce_loss (used below) expects labels with a second num_true dimension.



    Note that center_words and target_words are fed in as word indices (one scalar per example) in our vocabulary, not as one-hot vectors.
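A minimal sketch of these two placeholders, assuming the TensorFlow 1.x graph API (reached through tf.compat.v1 on newer installs); the BATCH_SIZE value is an assumed hyperparameter:

```python
import tensorflow.compat.v1 as tf  # TF 1.x-style graph API
tf.disable_eager_execution()

BATCH_SIZE = 128  # assumed hyperparameter

# Each example is a word index (a scalar), so a batch is a vector of indices.
center_words = tf.placeholder(tf.int32, shape=[BATCH_SIZE], name='center_words')
# nce_loss later expects labels of shape [batch_size, num_true], hence the extra 1.
target_words = tf.placeholder(tf.int32, shape=[BATCH_SIZE, 1], name='target_words')
```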

  2. Define the weight (in this case, embedding matrix)
    Each row corresponds to the representation vector of one word. If each word is represented by a vector of size EMBED_SIZE, then the embedding matrix has shape [VOCAB_SIZE, EMBED_SIZE]. We initialize the embedding matrix with values drawn from a random distribution; in this case, let’s choose the uniform distribution.


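A sketch of the embedding matrix, initialized uniformly in [-1, 1]; the VOCAB_SIZE and EMBED_SIZE values are assumed for illustration:

```python
import tensorflow.compat.v1 as tf  # TF 1.x-style graph API
tf.disable_eager_execution()

VOCAB_SIZE = 50000  # assumed vocabulary size
EMBED_SIZE = 128    # assumed embedding dimension

# One row per vocabulary word, initialized from a uniform distribution.
embed_matrix = tf.Variable(
    tf.random.uniform([VOCAB_SIZE, EMBED_SIZE], -1.0, 1.0),
    name='embed_matrix')
```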
  3. Inference (compute the forward path of the graph)
    Our goal is to get the vector representations of words in our dictionary. Remember that embed_matrix has dimension VOCAB_SIZE x EMBED_SIZE, with each row corresponding to the vector representation of the word at that index. So to get the representation of all the center words in the batch, we slice out the corresponding rows of the embedding matrix. TensorFlow provides a convenient method for this called tf.nn.embedding_lookup().



This method is really useful when it comes to matrix multiplication with one-hot vectors, because it saves us from doing a bunch of unnecessary computation that would return 0 anyway. Chris McCormick has a nice illustration of multiplying a one-hot vector with a matrix.

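The point is easy to verify numerically: multiplying a one-hot row vector by a matrix just selects one row, so looking the row up by index gives the same result without the wasted multiplications. A small NumPy illustration (not part of the original post):

```python
import numpy as np

# A tiny "embedding matrix": 4 words, 3 dimensions.
M = np.arange(12, dtype=np.float64).reshape(4, 3)

one_hot = np.zeros(4)
one_hot[2] = 1.0  # one-hot vector for word index 2

product = one_hot @ M   # full matrix multiplication: mostly multiplies by zero
lookup = M[2]           # direct row lookup, which is what embedding_lookup does

print(np.array_equal(product, lookup))  # prints True
```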

So, to get the embedding (or vector representation) of the input center words, we look up their rows in the embedding matrix with tf.nn.embedding_lookup().
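The lookup might look like the following sketch; embed_matrix and center_words are defined as in the earlier steps, with assumed sizes:

```python
import tensorflow.compat.v1 as tf  # TF 1.x-style graph API
tf.disable_eager_execution()

VOCAB_SIZE, EMBED_SIZE, BATCH_SIZE = 50000, 128, 128  # assumed hyperparameters

center_words = tf.placeholder(tf.int32, shape=[BATCH_SIZE], name='center_words')
embed_matrix = tf.Variable(
    tf.random.uniform([VOCAB_SIZE, EMBED_SIZE], -1.0, 1.0), name='embed_matrix')

# Select the rows of embed_matrix indexed by the center words in the batch.
embed = tf.nn.embedding_lookup(embed_matrix, center_words, name='embed')
```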
  4. Define the loss function
    While NCE is cumbersome to implement in pure Python, TensorFlow has already implemented it for us as tf.nn.nce_loss().

Note that, by the way the function is implemented, the third positional argument is actually inputs and the fourth is labels; this ordering has changed across TensorFlow versions, so it is safest to pass them as keyword arguments. The nce_loss source code can be found in the TensorFlow repository.

For nce_loss, we also need to define weights and biases (the output layer of the network) to calculate the NCE loss.



Then we define loss:


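A sketch of the NCE variables and the loss, using keyword arguments to sidestep the argument-order pitfall mentioned above; all sizes and NUM_SAMPLED are assumed values:

```python
import tensorflow.compat.v1 as tf  # TF 1.x-style graph API
tf.disable_eager_execution()

VOCAB_SIZE, EMBED_SIZE, BATCH_SIZE, NUM_SAMPLED = 50000, 128, 128, 64  # assumed

center_words = tf.placeholder(tf.int32, shape=[BATCH_SIZE])
target_words = tf.placeholder(tf.int32, shape=[BATCH_SIZE, 1])

embed_matrix = tf.Variable(tf.random.uniform([VOCAB_SIZE, EMBED_SIZE], -1.0, 1.0))
embed = tf.nn.embedding_lookup(embed_matrix, center_words)

# Weights and biases for the NCE (output) layer.
nce_weight = tf.Variable(
    tf.truncated_normal([VOCAB_SIZE, EMBED_SIZE], stddev=1.0 / EMBED_SIZE ** 0.5))
nce_bias = tf.Variable(tf.zeros([VOCAB_SIZE]))

# Keyword arguments make the inputs/labels order unambiguous.
loss = tf.reduce_mean(tf.nn.nce_loss(weights=nce_weight,
                                     biases=nce_bias,
                                     labels=target_words,
                                     inputs=embed,
                                     num_sampled=NUM_SAMPLED,
                                     num_classes=VOCAB_SIZE))
```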
  5. Define the optimizer
    We will use the good old gradient descent.


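A sketch of the optimizer definition; the learning rate is an assumed value, and a trivial quadratic loss stands in for the NCE loss so the snippet is self-contained:

```python
import tensorflow.compat.v1 as tf  # TF 1.x-style graph API
tf.disable_eager_execution()

LEARNING_RATE = 0.1  # assumed hyperparameter

# A stand-in differentiable loss; in the real model this is the NCE loss above.
w = tf.Variable(2.0)
loss = tf.square(w)

optimizer = tf.train.GradientDescentOptimizer(LEARNING_RATE).minimize(loss)
```

Running sess.run(optimizer) once moves w from 2.0 to 1.6 (the gradient of w^2 is 2w = 4.0, scaled by the 0.1 learning rate).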

Phase 2: Execute the computation

We will create a session; then, within the session, we use the good old feed_dict to feed inputs and outputs into the placeholders, run the optimizer to minimize the loss, and fetch the loss value to report back to us.


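Putting both phases together, a self-contained sketch of the training loop; random integer batches stand in for a real corpus here, and all sizes and step counts are assumed, not from the original post:

```python
import numpy as np
import tensorflow.compat.v1 as tf  # TF 1.x-style graph API
tf.disable_eager_execution()

VOCAB_SIZE, EMBED_SIZE, BATCH_SIZE, NUM_SAMPLED = 1000, 32, 16, 8  # assumed
LEARNING_RATE, NUM_TRAIN_STEPS = 1.0, 10                           # assumed

# Phase 1: assemble the graph.
center_words = tf.placeholder(tf.int32, shape=[BATCH_SIZE])
target_words = tf.placeholder(tf.int32, shape=[BATCH_SIZE, 1])

embed_matrix = tf.Variable(tf.random.uniform([VOCAB_SIZE, EMBED_SIZE], -1.0, 1.0))
embed = tf.nn.embedding_lookup(embed_matrix, center_words)

nce_weight = tf.Variable(
    tf.truncated_normal([VOCAB_SIZE, EMBED_SIZE], stddev=1.0 / EMBED_SIZE ** 0.5))
nce_bias = tf.Variable(tf.zeros([VOCAB_SIZE]))

loss = tf.reduce_mean(tf.nn.nce_loss(weights=nce_weight, biases=nce_bias,
                                     labels=target_words, inputs=embed,
                                     num_sampled=NUM_SAMPLED,
                                     num_classes=VOCAB_SIZE))
optimizer = tf.train.GradientDescentOptimizer(LEARNING_RATE).minimize(loss)

# Phase 2: execute the computation.
rng = np.random.RandomState(0)
losses = []
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(NUM_TRAIN_STEPS):
        # Fake (center, target) index batches standing in for real skip-gram pairs.
        centers = rng.randint(0, VOCAB_SIZE, size=BATCH_SIZE).astype(np.int32)
        targets = rng.randint(0, VOCAB_SIZE, size=(BATCH_SIZE, 1)).astype(np.int32)
        loss_val, _ = sess.run([loss, optimizer],
                               feed_dict={center_words: centers,
                                          target_words: targets})
        losses.append(loss_val)
```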