【Paper: AW-SGD, accelerating stochastic gradient descent with online-learned adaptive sampling】《Accelerating Stochastic Gradient Descent via Online Learning to Sample》G Bouchard, T Trouillon, J Perez, A Gaidon (2015) [link]
【Paper: an asynchronous parallel implementation of stochastic gradient for nonconvex optimization】《Asynchronous Parallel Stochastic Gradient for Nonconvex Optimization》X Lian, Y Huang, Y Li, J Liu (2015) [link]
Companion introductory article: "Facebook AI chief Yann LeCun on the limitations of deep learning" [link]
【Slides: Yann LeCun's CVPR 2015 talk on the limitations of deep learning】《What's Wrong with Deep Learning?》[link] Intro: [link] Mirror: [link]
【Paper: AttentionNet, accurate CNN-based object detection】《AttentionNet: Aggregating Weak Directions for Accurate Object Detection》D Yoo, S Park, JY Lee, A Paek, IS Kweon (2015) [link]
【An attempt at understanding Boosting】[link] Boosting is a machine learning meta-algorithm for reducing bias in supervised learning. It grew out of Michael Kearns's question: can a collection of "weak learners" be combined into a single "strong learner"? A weak learner is a classifier whose results are only slightly better than random guessing; a strong learner is one whose predictions are very close to the ground truth.
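The weak-to-strong idea above can be sketched with a toy AdaBoost loop: decision stumps, each barely better than chance on its own, are trained on reweighted data and combined into an accurate ensemble. A minimal pure-Python sketch, not any particular library's implementation; the helper names and the 1-D toy data are made up for illustration:

```python
import math

def stump_predict(threshold, polarity, x):
    # Weak learner: a decision stump on a single 1-D feature.
    return polarity if x > threshold else -polarity

def train_adaboost(xs, ys, rounds=3):
    n = len(xs)
    w = [1.0 / n] * n                      # uniform sample weights to start
    ensemble = []
    for _ in range(rounds):
        # Pick the stump (threshold, polarity) with lowest weighted error.
        best = None
        for t in xs:
            for pol in (1, -1):
                err = sum(wi for wi, x, y in zip(w, xs, ys)
                          if stump_predict(t, pol, x) != y)
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        err = max(err, 1e-10)              # avoid log(0) on a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, t, pol))
        # Re-weight: misclassified points gain weight, so the next
        # round's weak learner focuses on the hard cases.
        w = [wi * math.exp(-alpha * y * stump_predict(t, pol, x))
             for wi, x, y in zip(w, xs, ys)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    # Strong learner: sign of the alpha-weighted vote of all stumps.
    score = sum(a * stump_predict(t, pol, x) for a, t, pol in ensemble)
    return 1 if score > 0 else -1
```

Usage: `predict(train_adaboost([1, 2, 3, 7, 8, 9], [-1, -1, -1, 1, 1, 1]), 8)` classifies by the learned weighted vote.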
【Designing machine learning models: a tale of precision and recall】《Designing Machine Learning Models: A Tale of Precision and Recall》by Ariana Radianto, from Airbnb [link]
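As a quick refresher on the two metrics in the title: precision is the fraction of predicted positives that are truly positive, and recall is the fraction of true positives that the model recovers. A minimal sketch (the function name and toy labels are illustrative, not from the article):

```python
def precision_recall(y_true, y_pred, positive=1):
    # Count the three outcomes that define both metrics.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0  # of flagged, how many right
    recall = tp / (tp + fn) if tp + fn else 0.0     # of real, how many found
    return precision, recall
```

With `y_true = [1, 1, 1, 0, 0]` and `y_pred = [1, 1, 0, 1, 0]` there are 2 true positives, 1 false positive, and 1 false negative, so both precision and recall come out to 2/3; tightening the classifier's threshold typically trades one against the other.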
【Slides: large-scale deep learning】《10 Billion Parameter Neural Networks in your Basement》by Adam Coates, Stanford University [link] Mirror: [link]
Another Caffe-based implementation, cnn-vis, by Justin Johnson [link]
【IPython notebook: Google's Deep Dream art generation (Caffe-based)】《Deep Dreams (with Caffe)》GitHub: [link] Notebook: [link] See also: @愛可可-愛生活
【Open source: Palladium, a Scikit-Learn-based framework for predictive analytics services】GitHub: [link] Tutorial: [link]
【The 20 hottest open-source Python machine learning projects】《Top 20 Python Machine Learning Open Source Projects》Scikit-Learn/Pylearn2/NuPIC/Nilearn/PyBrain/Pattern/Fuel/Bob/skdata/MILK/IEPY/Quepy/Hebel/mlxtend/nolearn/Ramp/Feature Forge/REP/Python Machine Learning Samples/ELM [link]
Implementation code for the WordEmbeddingAutoencoder mentioned in the article: [link]
【Paper: count-based models strike back at word embeddings】《Rehabilitation of Count-based Models for Word Vector Representations》R Lebret, R Collobert (2015) [link] See also 《Improving Distributional Similarity with Lessons Learned from Word Embeddings》@愛可可-愛生活
【Interactive DNN hallucination art built on Theano/Lasagne/Pylearn2 (source code provided)】《Interactive Deep Neural Net Hallucinations (+source code) - Large Scale Deep Neural Net visualizing top level features》[link] GitHub: [link] See also: @愛可可-愛生活
【GPU: a workhorse for parallel computing】The processor on a graphics card is called the graphics processing unit (GPU) and is the "heart" of the card. It is similar to a CPU, except that the GPU is purpose-built for executing complex mathematical and geometric computations. [link]
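The kind of workload a GPU excels at is data-parallel: each output element depends only on its own inputs, so thousands of elements can be computed simultaneously. A hedged Python sketch of the classic SAXPY kernel (`a*x + y`), using a thread pool merely as a stand-in for the GPU's many lightweight threads; the function names and sizes are illustrative:

```python
from concurrent.futures import ThreadPoolExecutor

def saxpy(a, xs, ys):
    """Compute a*x + y elementwise, the textbook data-parallel kernel."""
    def kernel(i):
        # Each index touches only xs[i] and ys[i]: no shared state,
        # no ordering constraints -- exactly the access pattern a GPU
        # exploits by launching one lightweight thread per element.
        return a * xs[i] + ys[i]
    # A CPU thread pool stands in here for the GPU's thread grid.
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(kernel, range(len(xs))))
```

On a real GPU the same kernel runs as thousands of hardware threads in lockstep groups, which is why embarrassingly parallel math and geometry dominate GPU workloads.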
【Optunity, a Python library for hyperparameter tuning】GitHub: [link] Doc: [link]
【Paper + code: QUESO, a parallel C++ statistical library for Bayesian inference】《The Parallel C++ Statistical Library for Bayesian Inference: QUESO》D McDougall, N Malaya, RD Moser (2015) [link] [link] QUESO: [link] GitHub: [link]
pdf: [link] // Yann LeCun joins the discussion! [link] // Jürgen Schmidhuber's latest commentary, 《Critique of Paper by "Deep Learning Conspiracy" (Nature 521 p 436)》[link]
【Nature: the latest article "Deep learning" by LeCun/Bengio/Hinton】《Deep learning》Yann LeCun, Yoshua Bengio, Geoffrey Hinton (2015) [link] Mirror: [link]
I highly recommend Andrew Ng's machine learning slides. After all these years, many CMU instructors still consult Andrew's classic slides when teaching machine learning courses. Page / Tutorial
【Slides: a 100+ page introduction to the Vowpal Wabbit machine learning platform】《Vowpal Wabbit: A Machine Learning System》by John Langford [Microsoft Research] [link] Mirror: [link]
【Video + code: hands-on (R) data science with Kaggle's Titanic 101 data】《Introduction to Data Science with R - Data Analysis》YouTube: [link] [link] Mirror: [link] [link] GitHub: [link]