【Knowledge Graphs: a Knowledge Base in the Machine's Brain】 I recently wrote a short popular-science piece, "Knowledge Graphs: a Knowledge Base in the Machine's Brain"; it may become a chapter of a book later on. Posting it first for your comments and corrections: [link]
【Course materials: John Preskill's quantum computation (Caltech)】《Physics 219/Computer Science 219 - Quantum Computation》, John Preskill [link]
【Use cases for DNN feature representations in unsupervised learning】《Unsupervised Learning: Use Cases》 [link]
【Paper: transfer entropy analysis of microblog time series】《The dynamic of information-driven coordination phenomena: a transfer entropy analysis》, J. Borge-Holthoefer, N. Perra, B. Gonçalves, S. González-Bailón, A. Arenas (2015) [link]
《愛可可老師今日視野 (15.07.26)》 (Aikeke's daily digest, shared via @簡書) [link]
【Paper: distributed matrix completion and (robust) factorization】《Distributed Matrix Completion and Robust Factorization》, L. Mackey, A. Talwalkar, M. I. Jordan (JMLR 2015) [link]; DFC & code: [link]
【Paper: TUPAQ, automating model search for large-scale machine learning】《Automating Model Search for Large Scale Machine Learning》, E. Sparks, A. Talwalkar, D. Haas, M. Franklin, M. I. Jordan, T. Kraska (SOCC 2015); pdf: [link]; arXiv version: 《TuPAQ: An Efficient Planner for Large-scale Predictive Analytic Queries》 [link]
【Paper: fast and guaranteed tensor decomposition via sketching】《Fast and Guaranteed Tensor Decomposition via Sketching》, Y. Wang, H. Tung, A. Smola, A. Anandkumar (2015) [link]
【Analyzing Seattle's work habits from bicycle-count data with Python/Pandas/Matplotlib/Scikit-learn】《Learning Seattle's Work Habits from Bicycle Counts (Updated!)》 [link]
【Open source: RNN text generation on Torch】 "Playground for some RNN stuff in Torch"; GitHub: [link]
#state-of-the-art# New book: 【Convex Optimization Algorithms】 [link] [link]; MIT course: 【Convex Analysis and Optimization】 [link]
《A Simple Introduction to MPI》 [link]
#Mirror Descent# "The Mirror Descent Algorithm" [link]; "Mirror descent and nonlinear projected subgradient methods" (2003) [link]; tutorial: "Mirror Descent Algorithms for Large-Scale Deterministic and Stochastic Convex Optimization" (2012) [link]
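For readers who just want the core of what these references analyze, the mirror descent update can be stated in one line. A standard formulation (notation mine): with step sizes $\eta_t$, subgradients $g_t \in \partial f(x_t)$, and a strongly convex mirror map $\psi$,

```latex
x_{t+1} = \arg\min_{x \in \mathcal{X}}
  \Big\{ \eta_t \langle g_t, x \rangle + D_\psi(x, x_t) \Big\},
\qquad
D_\psi(x, y) = \psi(x) - \psi(y) - \langle \nabla\psi(y),\, x - y \rangle .
```

With $\psi(x) = \tfrac{1}{2}\|x\|_2^2$ this reduces to projected subgradient descent; with the negative entropy $\psi(x) = \sum_i x_i \log x_i$ on the simplex it becomes the exponentiated-gradient (multiplicative-weights) update.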
A COLT open problem: "The landscape of the loss surfaces of multilayer networks"; the AISTATS paper "The Loss Surfaces of Multilayer Networks" gives results on that question. Links: COLT [link], AISTATS [link]
arXiv [1507.06411] "Arbitrariness of peer review: A Bayesian analysis of the NIPS experiment" — a Bayesian look at the NIPS experiment and what it says about the meaning of peer review [link]
"An introduction to RNNs and LSTMs, with the formulas worked through" — RNN-LSTM is extremely hot this year; climbing the Image Caption leaderboards lately basically depends on it. I've been reading a lot about RNNs recently, so here are my notes. 《RNN以及LSTM的介紹和公式梳理》, DarkScope's blog on CSDN.NET [link]
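For anyone skimming past the blog post, one common formulation of the LSTM step (without peephole connections; notation varies across papers) is:

```latex
\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) \\
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
```

The input, forget, and output gates $i_t, f_t, o_t$ modulate writing to, erasing, and reading from the cell state $c_t$; the additive cell update is what eases the vanishing-gradient problem of plain RNNs.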
Simulation studies seem to have become an obligatory ritual in statistics papers, but if you look carefully some papers do without them, e.g. Antoniak (1974), "Mixtures of Dirichlet processes with applications to Bayesian nonparametric problems", The Annals of Statistics, 1152-1174. [link] — What other famous papers have no simulation? This link is not paywalled: [link]
My CIPS summer-school lecture on information extraction at Peking University covered three parts: basic named-entity extraction methods and practice, some recent progress in relation extraction, and the latest joint models of inference and extraction. The slides are now on my CMU homepage and can be downloaded: [link]
Lifelong Machine Learning in the Big Data Era - tutorial at IJCAI 2015, Zhiyuan Chen and Bing Liu. [link]
Facebook natural image generation using ConvNets; code: [link]
Introducing Jupyter Notebooks in Azure ML Studio: [link]
New transcription for Google Voice: using LSTM, cut the transcription errors by 49%. [link]
【A new look at the system, algorithm, and theory foundations of scalable machine learning】 Tutorial T29 at IJCAI-15, Eric P. Xing and Qirong Ho. [link]
I'm optimistic about the future of probabilistic graphical models (PGMs): 1) Deep learning currently gets its good results from massive labeled datasets; to exploit unlabeled data for feature learning, I believe generative models built with PGMs are the most promising route. 2) PGMs are now also being blended with deep learning, using networks to fit several probability distributions and thereby ease the difficulties of PGM learning and inference; see the papers from DeepMind and from Kingma & Welling. If you want to reach the moon, you can't get there by climbing trees; you have to build a rocket, however hard that is.
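The Kingma & Welling direction mentioned in point 2 can be made concrete: a neural network $q_\phi(z \mid x)$ stands in for the intractable posterior of a latent-variable graphical model $p_\theta(x \mid z)\,p(z)$, and both networks are trained by maximizing the evidence lower bound (sketched here in standard notation, not tied to any one paper's exact formulation):

```latex
\log p_\theta(x) \;\ge\;
\mathbb{E}_{q_\phi(z \mid x)}\big[\log p_\theta(x \mid z)\big]
\;-\;
\mathrm{KL}\big(q_\phi(z \mid x) \,\|\, p(z)\big)
```

This is exactly "using networks to fit the distributions" of the PGM: the hard inference step is amortized into the learned $q_\phi$, while the model stays generative and can be trained on unlabeled data.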
Python for Image Understanding: Deep Learning with Convolutional Neural Nets [link]
Easy Bayesian Bootstrap in R: [link]
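The idea behind the Bayesian bootstrap (Rubin, 1981) is small enough to sketch without R: instead of resampling the data with replacement, each draw reweights the observations with uniform Dirichlet weights. A minimal Python sketch for the mean (the function name and toy `data` are mine; the R post's interface will differ):

```python
import random

def bayes_boot_mean(data, n_draws=4000, seed=0):
    """Bayesian bootstrap posterior draws for the mean of `data`.

    Each draw uses Dirichlet(1, ..., 1) weights, obtained by
    normalizing i.i.d. rate-1 exponentials, instead of the
    multinomial counts of the classical bootstrap.
    """
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        w = [rng.expovariate(1.0) for _ in data]
        s = sum(w)
        # Weighted mean under this Dirichlet weight vector
        draws.append(sum(wi / s * xi for wi, xi in zip(w, data)))
    return draws

data = [2.1, 3.4, 1.9, 5.0, 4.2, 3.3]
post = bayes_boot_mean(data)
post_mean = sum(post) / len(post)
```

Because every draw is a convex combination of the observations, the posterior draws always stay inside the data's range, and their average concentrates near the sample mean — a handy sanity check for any implementation.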