1------------------------------------------------
201706
2------------------------------------------------
3------------------------------------------------
Prof. Lin asked in the PhD group chat whether anyone would be willing to help him review CIKM submissions. I hadn't meant to take on extra work, but I volunteered. I want to review; I just have no experience writing comments. The big boss said he would teach us how.
Sometimes, the less initiative you take, the less drive and the less passion you have.
Everything is hard, but overcoming every difficulty and finally achieving something is the real reward.
I feel my foundations are weak, but they are actually much better than those of a master's student just entering the field. Still, if I keep moping, the gap really will grow wider and wider.
4------------------------------------------------
Little Ant
On a weathered wooden bench by the Liwa River she sits quietly; white magnolias bloom in silence, and the pond's green lotuses push upward, big blossom after big blossom of green
In this dull, stifling weather, little ants shuttle back and forth
She watches for a while. They pace across the dead wood, searching everywhere for something
Walking so fast
Food? Or a home?
She blocks one's path with her foot
If this were a human life, what a setback that would be
She thinks to herself: if, after this setback, it still comes back to continue down this path, she will tell it that this is a road that invites punishment
At this moment, she feels a little sad
A little ant has strayed into the threshold she set
She grieves that she can alter, on a whim, the way another creature lives
Or perhaps the sadness is only because
The sunset draws near and the light slows
The wind scatters her hair and blows the years away
5------------------------------------------------
Every day: at night I don't want to sleep, and in the morning I can't get up.
2017.06.05
Short-term goal: improve the accuracy of joint non-negative matrix factorization (a minimal sketch of the plain NMF updates follows after this list).
- Finish writing the comments and revisions for the TKDE papers.
- Attend the weekly paper-sharing report.
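Since the joint model builds on standard NMF, here is a minimal sketch of the multiplicative update rules I would start from; the toy matrix and dimensions are assumptions, not our real data.

```python
import numpy as np

def nmf(V, k, n_iter=200, eps=1e-10):
    """Factor a non-negative V (n_docs x n_terms) into W (n x k) and H (k x m)."""
    n, m = V.shape
    rng = np.random.default_rng(0)
    W = rng.random((n, k))
    H = rng.random((k, m))
    for _ in range(n_iter):
        # Lee & Seung multiplicative updates for the Frobenius objective.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

V = np.abs(np.random.default_rng(1).random((20, 50)))  # toy doc-term matrix
W, H = nmf(V, k=5)
print(np.linalg.norm(V - W @ H))  # reconstruction error
```

A joint variant would factor several such matrices with shared factors; the sketch above is only the single-matrix building block.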
6------------------------------------------------
2017.06.08
Short-term goal: construct an evolution path for the latent topics.
1. Coding: change the update functions. The first step is discovering latent topics from text; the second is constructing an evolution path for those topics. The main problem now is the second step. Judging from other papers, there are two ways to implement the evolution part: building a topic tree (or graph) structure by computing topic similarity, or using an HMM-based model to learn the shift states (see the sketch after this list).
2. Attend the weekly report.
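A hedged sketch of the first option, linking topics across adjacent time steps by cosine similarity of their topic-word vectors; `evolution_edges` is a hypothetical helper, and the threshold and toy matrices are assumptions.

```python
import numpy as np

def evolution_edges(H_prev, H_next, threshold=0.8):
    """H_prev, H_next: (k x n_terms) topic-word matrices from two adjacent
    time steps. Returns (i, j, sim) edges linking old topic i to new topic j."""
    A = H_prev / (np.linalg.norm(H_prev, axis=1, keepdims=True) + 1e-10)
    B = H_next / (np.linalg.norm(H_next, axis=1, keepdims=True) + 1e-10)
    sims = A @ B.T  # pairwise cosine similarity between topics
    return [(i, j, float(sims[i, j]))
            for i in range(sims.shape[0])
            for j in range(sims.shape[1])
            if sims[i, j] >= threshold]

# Toy topic-word matrices (assumption: 5 topics over a 50-term vocabulary).
H1 = np.abs(np.random.default_rng(0).random((5, 50)))
H2 = np.abs(np.random.default_rng(1).random((5, 50)))
print(evolution_edges(H1, H2))
```

Collecting these edges over every pair of adjacent time steps yields the topic graph; the HMM-based alternative would instead learn transition states over the topic sequence.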
7------------------------------------------------
2017.06.09
- Read the paper <Topic Aware Neural Response Generation> and prepared the weekly paper-sharing presentation.
-- The paper presents a topic-aware seq2seq model that injects pre-trained LDA topic vectors to influence word generation during decoding.
- Brainstorming
-- Talked with SongYang and ChenQin about how to build a GAN-style model for realistic topic classification (a rough sketch of one reading follows below).
Generator: use LDA-generated and NMF-generated topics as the generator;
Discriminator: use a classifier as the discriminator.
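The idea is still vague, so this is only one hedged reading of the brainstorm: treat the LDA and NMF doc-topic vectors as two "generators" and train a classifier as the discriminator; if the discriminator cannot tell them apart, the two topic spaces look alike. The random doc-term matrix and all dimensions are toy assumptions.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation, NMF
from sklearn.linear_model import LogisticRegression

# Toy document-term counts (assumption; real input would be our corpus).
X = np.abs(np.random.default_rng(0).random((100, 200)))

# Two "generators": doc-topic vectors from LDA and from NMF.
lda_vecs = LatentDirichletAllocation(n_components=10, random_state=0).fit_transform(X)
nmf_vecs = NMF(n_components=10, random_state=0, max_iter=400).fit_transform(X)
nmf_vecs /= nmf_vecs.sum(axis=1, keepdims=True) + 1e-10  # match LDA's row scale

# Discriminator: label 1 for LDA vectors, 0 for NMF vectors.
Z = np.vstack([lda_vecs, nmf_vecs])
y = np.array([1] * len(lda_vecs) + [0] * len(nmf_vecs))
clf = LogisticRegression(max_iter=1000).fit(Z, y)
print("discriminator accuracy:", clf.score(Z, y))  # near 0.5 = indistinguishable
```

A true GAN would backpropagate the discriminator's signal into the generator, which neither LDA nor NMF supports directly; that gap is exactly what we would need to work out.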
8------------------------------------------------
Do you choose to innovate proactively, or to respond passively?
When living itself becomes a process of innovation, a whole new experience of life unfolds before you: you find you have merged into the very essence of living.
9------------------------------------------------
2017.06.10
Analyzed the papers of my predecessor, a senior labmate who has already graduated.
When I first arrived last year, I stared at those papers in total confusion and had no direction at all. This time, in half an hour, I actually understood all three of his published papers, and the creativity doesn't seem great either. Two C's and one B, and one of the C's doesn't look like a real C: I couldn't find it on the CCF list.
The B and one C both do context-aware TV program recommendation: since a TV is shared by the whole household, and men and women, young and old, watch different programs at different times, it essentially fuses users' habits into the recommendation by adding a time dimension. No theoretical innovation.
The data is a non-public dataset.
The other C adds user relationships to the recommendation, and is no longer about TV or video. It defines social influence and adds it to the objective function for matrix factorization.
Add a business case, add a scenario, and leave the model largely unchanged.
I think I won't make big changes to models either.
Think up another model and plug things in. Build something first, enough not to feel utterly without accomplishment. Improve bit by bit!
10------------------------------------------------
Tonight it's my turn to give the presentation
On using neural networks to generate automatic chat dialogue
I did understand the paper,
but implementing it still feels very hard
I only understood it with help from a few of the lab's star students.
I really should talk with others more; sitting alone in silence does nothing but feed my worries
A bit sleepy.
11------------------------------------------------
In my 24th year, my zodiac year, I obtained a consecrated bracelet of lobular red sandalwood beads.
But that year, starting with failing a course at New Year's, was anything but calm. Love, life, nothing went smoothly.
As my 24th year ended, the bracelet scattered across the floor.
I gathered them up: 24 beads, not one lost.
At 25, coming to Shanghai, I brought one of them along. The rest I left in an old friend's care.
This one bead lies quietly, sealing away a stretch of old time.
Recently my friend returned the others to me. Looking at them, I remembered the self who would still turn them over in her hands during class...
Remembered how, before sleep, I would put them away so carefully
It was so sacred
Back then the sandalwood prayer beads glowed a lustrous red
Now they have oxidized to black.
A little sad. A little heartbroken.
Time past will always pass,
and things past are gone for good.
The sandalwood beads: I plan to string them anew.
Stripped of its aura, it is just an ordinary bracelet.
12------------------------------------------------
Falling apart.
13------------------------------------------------
Had a dream
I was with a PhD senior sister, playing on a beautiful island surrounded by still, quiet water
Amid all that green, we took photos and fooled around.
From afar came waves of clamor
We rushed over
and were stunned by the scene before us:
the river came surging and churning in; great stretches of land were submerged, and people were swept into the water.
We ran like mad, even knowing that reaching the mountaintop might still be futile.
The vast upheavals of the natural world.
It was a feeling of despair.
I woke, not knowing whether the me in the dream escaped that curse.
Life is still as sunny as ever.
14------------------------------------------------
Dad video-called me on WeChat; in the end I never got out the words: Happy Father's Day, Dad, I love you.
I am unhappy, and seeing that makes them unhappy too.
My effective study time is very short.
The lab is full of negative energy every day, the clatter of gaming mice everywhere.
It is far too oppressive.
15------------------------------------------------
2017.06.28
Short-term goal: construct an evolution path for the topics && compare and analyze topics from two time steps.
1. Communicated with Prof. He about the experiments. To simplify the task, try decomposing the documents into two time steps for analysis.
Yang Song suggested I try adding some new models to support the hypothesis.
2. Read and ran the source code of a seq2seq model used for text summarization (a minimal sketch follows below).
I had already read some relevant papers on this before.
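For reference, a minimal sketch of the encoder-decoder (seq2seq) idea behind such summarizers; the PyTorch framing, toy vocabulary, and dimensions are my assumptions, not the actual code I ran.

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, src, tgt):
        # Encode the source document into a final hidden state.
        _, h = self.encoder(self.embed(src))
        # Decode the summary conditioned on that state (teacher forcing).
        dec_out, _ = self.decoder(self.embed(tgt), h)
        return self.out(dec_out)  # per-position logits over the vocabulary

model = Seq2Seq(vocab_size=1000)
src = torch.randint(0, 1000, (2, 20))  # batch of 2 "documents", 20 tokens
tgt = torch.randint(0, 1000, (2, 5))   # batch of 2 "summaries", 5 tokens
print(model(src, tgt).shape)           # torch.Size([2, 5, 1000])
```

Training would minimize cross-entropy between these logits and the reference summaries, shifted by one token.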
16------------------------------------------------
Stayed home for a day to celebrate my husband's birthday.
Before that I hadn't been home for two weeks. I go back less and less often.
After his birthday we "split the household": my parents-in-law moved out, and my husband began living alone.
They asked me to come home more often when I have time and keep him company. They worry about their son: will he smoke at night? Stay up too late again? Fail to look after himself...
They say I'm quiet at home anyway and can study there too.
Honestly, I feel quite sad and quite guilty that my studies keep my family on edge as well.
This past year I have often felt enormous pressure; it has made me negative, as if I have gone numb, yet I am still struggling to hold on.
I don't know whether this persistence will pay off, or whether my expectations of the payoff are simply too high.
In any case, my heart is tired.
17------------------------------------------------
Had a dream: not far from my home stood a beautiful mountain
covered in snow that glittered gold in the sunlight
Suddenly a plane flew past, clipped the summit, and crashed! Right before my eyes.
Lately I keep dreaming of terrifying things,
one moment a tsunami, the next a plane.
Sigh
18------------------------------------------------
2017.06.29
Short-term goal: construct an evolution path for the topics && compare and analyze topics from two time steps.
Read the paper <A Neural Model for Joint Event Detection and Summarization> (IJCAI 2017). This is the first neural-method paper I have read that does event detection by learning a pairwise tweet similarity function and identifying first stories in a tweet stream.
There is always a lot of noise in the tweet stream, and the neural method detects and summarizes events in the following steps:
1. Tweet Filtering: classify tweets as relevant or irrelevant to events with an MLP.
2. Event Clustering: decide whether a tweet belongs to an existing event cluster or describes a new event. They use a siamese neural network to calculate similarity (a sketch follows at the end of this entry).
3. Event Summarization: rank all the tweets in the cluster by a probability score and select the top n to build the summary.
This is similar to <Learning Similarity Functions for Topic Detection in Online Reputation Monitoring> (SIGIR 2014), which proposes a traditional but useful method for the topic detection task:
- learn a pairwise tweet similarity function from previously annotated data, using all kinds of content-based and Twitter-based features;
- apply a clustering algorithm on top of the learned similarity function.
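A hedged sketch of the siamese idea: encode both tweets with the same shared-weight network and score the pair by cosine similarity. The GRU encoder, vocabulary, and dimensions are my assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class SiameseScorer(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, hid_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encode = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def vec(self, tweet):
        _, h = self.encode(self.embed(tweet))
        return h.squeeze(0)  # (batch, hid_dim) tweet representation

    def forward(self, a, b):
        # Both branches share the same weights; cosine similarity is the score.
        return nn.functional.cosine_similarity(self.vec(a), self.vec(b))

scorer = SiameseScorer(vocab_size=500)
a = torch.randint(0, 500, (4, 12))  # batch of 4 tweet pairs, 12 tokens each
b = torch.randint(0, 500, (4, 12))
print(scorer(a, b))  # similarity in [-1, 1]; train against same-event labels
```

Thresholding this score decides whether a tweet joins an existing event cluster or opens a new one.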