2018-07-11

[1805.09393] Pouring Sequence Prediction using Recurrent Neural Network
https://arxiv.org/abs/1805.09393

Value Propagation Networks: a method for planning in more complex dynamic environments | 機器之心
https://www.jiqizhixin.com/articles/2018-06-21

DeepMind proposes the Relational RNN: the RMC memory module tackles relational reasoning | 機器之心
https://www.jiqizhixin.com/articles/070104

The fastest way to train neural networks today: the AdamW optimizer + super-convergence | 機器之心
https://www.jiqizhixin.com/articles/2018-07-03-14

"Graph Learning" | Graph propagation algorithms (part 2) - 簡書
http://www.reibang.com/p/e7fb897b1d09

Paper notes: Deep Convolutional Networks on Graph-Structured Data - CSDN博客
https://blog.csdn.net/BVL10101111/article/details/53437940

Better than a VAE: adding the Wasserstein distance to the Gaussian mixture model, a universal approximator | 機器之心
https://www.jiqizhixin.com/articles/2018-07-07-4

Convolutional neural networks can't handle "graph"-structured data? This article gives you the answer | 雷鋒網(wǎng)
https://www.leiphone.com/news/201706/ppA1Hr0M0fLqm7OP.html

Research | Neural networks meet Gaussian processes: DeepMind releases two back-to-back papers opening a new direction for deep learning
https://mp.weixin.qq.com/s?__biz=MzA3MzI4MjgzMw==&mid=2650744847&idx=4&sn=6d04d771485c0970742e33b57dc452a9&chksm=871aec71b06d65671e386229eb75641539aef9e1525e45f2c0f70f6fe9f845d088af9c9cd9fa&scene=38#wechat_redirect

[1807.03402] IGLOO: Slicing the Features Space to Represent Long Sequences
https://arxiv.org/abs/1807.03402

量子位: Shanghai Jiao Tong University's SRNN, a mere 135× faster than a vanilla RNN
https://mp.weixin.qq.com/s/wfOzCxe3L2t11VguYLGC9Q

[1807.03379] Online Scoring with Delayed Information: A Convex Optimization Viewpoint
https://arxiv.org/abs/1807.03379

An introduction to Graph Convolutional Networks (GCNs) - AHU-WangXiao - 博客園
https://www.cnblogs.com/wangxiaocvpr/p/8059769.html

How to understand the Graph Convolutional Network (GCN)? - 知乎
https://www.zhihu.com/question/54504471

How powerful are Graph Convolutions? (review of Kipf & Welling, 2016)
https://www.inference.vc/how-powerful-are-graph-convolutions-review-of-kipf-welling-2016-2/

Reinforcement learning’s foundational flaw
https://thegradient.pub/why-rl-is-flawed/

tkipf/gcn: Implementation of Graph Convolutional Networks in TensorFlow
https://github.com/tkipf/gcn

Graph Convolutional Networks | Thomas Kipf | PhD Student @ University of Amsterdam
http://tkipf.github.io/graph-convolutional-networks/

[1807.03379] Online Scoring with Delayed Information: A Convex Optimization Viewpoint
https://arxiv.org/abs/1807.03379

We consider a system where agents enter in an online fashion and are evaluated based on their attributes or context vectors. There can be practical situations where this context is partially observed, and the unobserved part arrives after some delay. We assume that an agent, once departed, cannot re-enter the system. The job of the system is therefore to provide an estimated score for the agent based on her instantaneous score, possibly together with some inference of the delayed score from the instantaneous one. In this paper, we estimate the delayed context via an online convex game between the agent and the system. We argue that the error in the score estimate accumulated over T iterations is small if the regret of the online convex game is small. Further, we leverage side information about the delayed context in the form of a correlation function with the known context. We consider settings where the delay is fixed or chosen arbitrarily by an adversary. Furthermore, we extend the formulation to the setting where the contexts are drawn from a Banach space. Overall, we show that the average penalty for not knowing the delayed context while making a decision scales with O(√T), which can be improved to O(log T) in a special setting.
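As a hedged illustration of the flavour of this result (not the paper's actual game), the sketch below estimates a delayed context component with online gradient descent on a squared loss; the linear ground truth, the step-size schedule, and all dimensions are assumptions made up for the example:

```python
import numpy as np

# Hypothetical sketch: the system learns to predict an agent's delayed
# context component from the observed part via online gradient descent.
rng = np.random.default_rng(0)
d, T = 5, 2000
w_true = rng.normal(size=d)      # hidden linear map: delayed score = w_true . x
w = np.zeros(d)                  # the system's running estimate
cum_penalty = 0.0
for t in range(1, T + 1):
    x = rng.normal(size=d)       # observed (instantaneous) context
    pred = w @ x                 # estimate of the delayed component
    y = w_true @ x               # delayed component, revealed afterwards
    cum_penalty += (pred - y) ** 2
    eta = 0.05 / np.sqrt(t)      # standard O(1/sqrt(t)) OGD step size
    w -= eta * 2.0 * (pred - y) * x

print(cum_penalty / T)           # average penalty shrinks as T grows
```

The point mirrored from the abstract: when the online learner's regret is small, the accumulated score-estimation error stays small, so the per-round penalty averages out.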
————————————————————

[1807.03402] IGLOO: Slicing the Features Space to Represent Long Sequences
https://arxiv.org/abs/1807.03402

We introduce a new neural network architecture, IGLOO, which aims at providing a representation for long sequences where RNNs fail to converge. The structure uses the relationships between random patches sliced out of the feature space of a backbone one-dimensional CNN to find a representation. This paper explains the implementation of the method, provides results on benchmarks commonly used for RNNs, and compares IGLOO to other recently published structures. IGLOO is found to deal with sequences of up to 25,000 time steps. It is also effective on shorter sequences, achieving the highest score in the literature for the permuted MNIST task. Benchmarks further show that IGLOO runs at the speed of the CuDNN-optimised GRU or LSTM without being tied to any specific hardware.
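A minimal NumPy sketch of the patch-slicing idea, not the authors' implementation: the patch count, patch size, and the simple linear projection are all assumptions; the CNN feature map is replaced by random data.

```python
import numpy as np

# Sketch: gather non-contiguous random slices from a 1-D CNN feature map
# and project each patch to a scalar, giving a fixed-size representation.
rng = np.random.default_rng(1)
T, C = 25_000, 32            # sequence length x feature channels
features = rng.normal(size=(T, C)).astype(np.float32)  # stand-in for CNN output

n_patches, patch_size = 100, 4
# Each patch joins 4 random time steps, so a single patch can relate
# positions that are thousands of steps apart without recurrence.
idx = rng.integers(0, T, size=(n_patches, patch_size))
patches = features[idx]                          # (n_patches, patch_size, C)
W = rng.normal(size=(patch_size, C)).astype(np.float32)
representation = (patches * W).sum(axis=(1, 2))  # one scalar per patch

print(representation.shape)  # (100,): fixed-size code regardless of T
```

The design point is that the representation size depends only on the number of patches, not on the 25,000-step sequence length.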
————————————————————

[1807.03523] DLOPT: Deep Learning Optimization Library
https://arxiv.org/abs/1807.03523
Deep learning hyper-parameter optimization is a tough task. Finding an appropriate network configuration is key to success; however, most of the time this labour is done only roughly. In this work we introduce a novel library to tackle this problem, the Deep Learning Optimization Library: DLOPT. We briefly describe its architecture and present a set of usage examples. This is an open-source project developed under the GNU GPL v3 license and it is freely available at this https URL
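The abstract does not describe DLOPT's API, so the following is only a generic sketch of the underlying task: searching a configuration space for the best-scoring network setup. The space and the scoring function are made up for illustration.

```python
import itertools

# A made-up network configuration space (not DLOPT's).
space = {
    "layers": [1, 2, 3],
    "units": [32, 64, 128, 256],
    "dropout": [0.0, 0.2, 0.5],
}

def score(cfg):
    # Stand-in for a real train/validate cycle on one configuration.
    return -abs(cfg["layers"] - 2) - abs(cfg["units"] - 128) / 64 - cfg["dropout"]

# Exhaustive grid search: evaluate every combination, keep the best.
best = max(
    (dict(zip(space, vals)) for vals in itertools.product(*space.values())),
    key=score,
)
print(best)  # {'layers': 2, 'units': 128, 'dropout': 0.0}
```

In practice the grid grows combinatorially, which is exactly why dedicated tooling (random search, evolutionary search, libraries such as DLOPT) exists.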

————————————————————
[1807.03710] Recurrent Auto-Encoder Model for Large-Scale Industrial Sensor Signal Analysis
https://arxiv.org/abs/1807.03710
A recurrent auto-encoder summarises sequential data through an encoder into a fixed-length vector and then reconstructs the original sequence through a decoder. The summarised vector can be used to represent time-series features. In this paper, we propose relaxing the dimensionality of the decoder output so that it performs partial reconstruction; the fixed-length vector then represents features in the selected dimensions only. In addition, we propose a rolling fixed-window approach to generate training samples from unbounded time-series data. The change of time-series features over time can then be summarised as a smooth trajectory path. The fixed-length vectors are further analysed using additional visualisation and unsupervised clustering techniques. The proposed method can be applied to sensor signal analysis in large-scale industrial processes, where clusters of the vector representations can reflect the operating states of the industrial system.
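The rolling fixed-window step can be sketched in NumPy as below; the window length and stride are assumptions, and the encoder/decoder themselves are omitted.

```python
import numpy as np

def rolling_windows(stream, window, stride=1):
    """Cut a (T, C) sensor stream into (n_windows, window, C) training samples."""
    n = (len(stream) - window) // stride + 1
    return np.stack([stream[i * stride : i * stride + window] for i in range(n)])

# Toy stream: 10 time steps from 2 sensors.
stream = np.arange(20, dtype=np.float32).reshape(10, 2)
samples = rolling_windows(stream, window=4, stride=2)
print(samples.shape)  # (4, 4, 2): 4 overlapping fixed-length samples
```

Each sample would be encoded to one fixed-length vector; plotting those vectors in order is what yields the "smooth trajectory path" the abstract mentions.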

————————————————————

[1807.03748] Representation Learning with Contrastive Predictive Coding
https://arxiv.org/abs/1807.03748
While supervised learning has enabled great progress in many applications, unsupervised learning has not seen such widespread adoption, and remains an important and challenging endeavor for artificial intelligence. In this work, we propose a universal unsupervised learning approach to extract useful representations from high-dimensional data, which we call Contrastive Predictive Coding. The key insight of our model is to learn such representations by predicting the future in latent space by using powerful autoregressive models. We use a probabilistic contrastive loss which induces the latent space to capture information that is maximally useful to predict future samples. It also makes the model tractable by using negative sampling. While most prior work has focused on evaluating representations for a particular modality, we demonstrate that our approach is able to learn useful representations achieving strong performance on four distinct domains: speech, images, text and reinforcement learning in 3D environments.
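The probabilistic contrastive loss can be sketched as an InfoNCE-style log-softmax over one positive future latent and several negatives. Everything below (the dimensions, the dot-product score, the toy latents) is an assumption for illustration, not the paper's architecture.

```python
import numpy as np

# Sketch: the context vector must rank the true future latent (row 0)
# above negatives sampled from elsewhere in the batch/sequence.
rng = np.random.default_rng(2)
d, n_neg = 16, 7
c_t = rng.normal(size=d)                  # context from the autoregressive model
z_pos = c_t + 0.1 * rng.normal(size=d)    # true future latent (correlated)
z_neg = rng.normal(size=(n_neg, d))       # negative samples
candidates = np.vstack([z_pos, z_neg])    # positive is row 0

scores = candidates @ c_t                 # simple dot-product score f(z, c)
# numerically stable log-softmax, then cross-entropy against row 0
m = scores.max()
log_probs = scores - (np.log(np.exp(scores - m).sum()) + m)
loss = -log_probs[0]
print(loss)  # small when the positive clearly beats the negatives
```

Minimising this loss is what pushes the latent space to keep exactly the information that is predictive of future samples; the negatives make it tractable, as the abstract notes.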

[(Colab / tf.keras + eager) RNN text-generation example] "Text Generation using a RNN - end-to-end example of generating Shakespeare-like text using tf.keras + eager" [web link]

[A tour of reinforcement learning from the continuous-control viewpoint] "A Tour of Reinforcement Learning: The View from Continuous Control" by Benjamin Recht [UC Berkeley] [web link]

最后編輯于
© All rights reserved by the author. For reprints or content collaboration, please contact the author.