A University Slacker's ISLR Notes (10): Unsupervised Learning

Most of this book concerns supervised learning methods such as regression and classification. In the supervised learning setting, we typically have access to a set of p features X1, X2, . . . , Xp, measured on n observations, and a response Y also measured on those same n observations. The goal is then to predict Y using X1, X2, . . . , Xp.

This chapter will instead focus on unsupervised learning, a set of statistical tools intended for the setting in which we have only a set of features X1, X2, . . . , Xp measured on n observations.

We are not interested in prediction, because we do not have an associated response variable Y. Rather, the goal is to discover interesting things about the measurements on X1, X2, . . . , Xp. Is there an informative way to visualize the data? Can we discover subgroups among the variables or among the observations? Unsupervised learning refers to a diverse set of techniques for answering questions such as these. In this chapter, we will focus on two particular types of unsupervised learning: principal components analysis, a tool used for data visualization or data pre-processing before supervised techniques are applied, and clustering, a broad class of methods for discovering unknown subgroups in data.

The Challenge of Unsupervised Learning

Compared with supervised learning, unsupervised learning is often much more challenging. The exercise tends to be more subjective, and there is no simple goal for the analysis, such as prediction of a response.

Unsupervised learning is often performed as part of an exploratory data analysis. Furthermore, it can be hard to assess the results obtained from unsupervised learning methods, since there is no universally accepted mechanism for performing cross-validation or validating results on an independent data set. In unsupervised learning there is no way to check our work, because we do not know the true answer: the problem is unsupervised.

Principal Components Analysis

When faced with a large set of correlated variables, principal components allow us to summarize this set with a smaller number of representative variables that collectively explain most of the variability in the original set.
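For a sense of what "collectively explain most of the variability" means in practice, here is a minimal sketch using scikit-learn on simulated correlated variables (my own illustrative data, not an ISLR data set): the proportion of variance explained by each component shows how few components are needed.

    import numpy as np
    from sklearn.decomposition import PCA

    # Simulate 10 correlated variables driven by 2 underlying factors.
    rng = np.random.default_rng(0)
    factors = rng.normal(size=(200, 2))
    X = factors @ rng.normal(size=(2, 10)) + 0.1 * rng.normal(size=(200, 10))

    pca = PCA().fit(X)
    # Proportion of variance explained by each principal component;
    # here the first two components account for nearly all of it.
    print(pca.explained_variance_ratio_.round(3))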

What Are Principal Components?

PCA provides a tool to do just this. It finds a low-dimensional representation of a data set that contains as much as possible of the variation. The idea is that each of the n observations lives in p-dimensional space, but not all of these dimensions are equally interesting. PCA seeks a small number of dimensions that are as interesting as possible, where the concept of interesting is measured by the amount that the observations vary along each dimension. Each of the dimensions found by PCA is a linear combination of the p features. We now explain the manner in which these dimensions, or principal components, are found.

The first principal component of a set of features X1, X2, . . . , Xp is the normalized linear combination of the features

Z_1 = \phi_{11} X_1 + \phi_{21} X_2 + \cdots + \phi_{p1} X_p

that has the largest variance. By normalized, we mean that \sum_{j=1}^{p} \phi_{j1}^{2} = 1. The elements \phi_{11}, \ldots, \phi_{p1} are the loadings of the first principal component.
Given an n × p data set X, how do we compute the first principal component? Since we are only interested in variance, we assume that each of the variables in X has been centered to have mean zero (that is, the column means of X are zero). We then look for the linear combination of the sample feature values of the form

z_{i1} = \phi_{11} x_{i1} + \phi_{21} x_{i2} + \cdots + \phi_{p1} x_{ip}

that has the largest sample variance. In other words, the first principal component loading vector \phi_1 = (\phi_{11}, \phi_{21}, \ldots, \phi_{p1})^T solves the optimization problem

\max_{\phi_{11}, \ldots, \phi_{p1}} \frac{1}{n} \sum_{i=1}^{n} \Bigl( \sum_{j=1}^{p} \phi_{j1} x_{ij} \Bigr)^2 \quad \text{subject to} \quad \sum_{j=1}^{p} \phi_{j1}^{2} = 1.

Since the x_{ij} have mean zero, the objective is simply the sample variance of the z_{i1}.
Another Interpretation of Principal Components

Principal components provide low-dimensional linear surfaces that are closest to the observations. For instance, the first principal component loading vector defines the line in p-dimensional space that lies closest to the n observations, where closeness is measured by average squared Euclidean distance.
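A small numerical check of this interpretation, assuming NumPy and simulated data: the projection of the centered observations onto the first M principal components is the closest rank-M linear approximation in total squared distance.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))
    X = X - X.mean(axis=0)                  # center the columns

    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    M = 2
    X_hat = U[:, :M] * s[:M] @ Vt[:M, :]    # projection onto the first M PCs

    # Total squared distance between observations and their projections;
    # no other M-dimensional linear surface can make this smaller.
    print(np.sum((X - X_hat) ** 2))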


Clustering Methods

In this section we focus on perhaps the two best-known clustering approaches: K-means clustering and hierarchical clustering.

K-Means Clustering
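K-means partitions the observations into a pre-specified number K of non-overlapping clusters by minimizing the total within-cluster variation, restarting from several random initial assignments and keeping the best solution. A minimal sketch with scikit-learn, using simulated two-cluster data rather than the ISLR example:

    import numpy as np
    from sklearn.cluster import KMeans

    # Simulated data: two groups of 25 observations in two dimensions.
    rng = np.random.default_rng(2)
    X = np.vstack([rng.normal(0, 1, size=(25, 2)),
                   rng.normal(3, 1, size=(25, 2))])

    # n_init=20 runs K-means from 20 random initial assignments and keeps
    # the solution with the lowest total within-cluster sum of squares.
    km = KMeans(n_clusters=2, n_init=20, random_state=0).fit(X)
    print(km.labels_)      # cluster assignment of each observation
    print(km.inertia_)     # total within-cluster sum of squares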



Hierarchical Clustering
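Hierarchical clustering does not require committing to a number of clusters in advance: it builds a dendrogram from the bottom up by repeatedly fusing the most similar pair of clusters. A minimal sketch with SciPy, assuming complete linkage, Euclidean dissimilarity, and simulated data:

    import numpy as np
    from scipy.cluster.hierarchy import linkage, cut_tree

    # Simulated observations; complete linkage with Euclidean dissimilarity.
    rng = np.random.default_rng(3)
    X = rng.normal(size=(30, 2))

    Z = linkage(X, method='complete', metric='euclidean')  # build the dendrogram
    labels = cut_tree(Z, n_clusters=3).ravel()             # cut it to get 3 clusters
    print(labels)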


最后編輯于
?著作權歸作者所有,轉載或內(nèi)容合作請聯(lián)系作者
  • 序言:七十年代末酗钞,一起剝皮案震驚了整個濱河市比伏,隨后出現(xiàn)的幾起案子,更是在濱河造成了極大的恐慌,老刑警劉巖铲掐,帶你破解...
    沈念sama閱讀 216,591評論 6 501
  • 序言:濱河連續(xù)發(fā)生了三起死亡事件,死亡現(xiàn)場離奇詭異饼疙,居然都是意外死亡衅谷,警方通過查閱死者的電腦和手機,發(fā)現(xiàn)死者居然都...
    沈念sama閱讀 92,448評論 3 392
  • 文/潘曉璐 我一進店門膀斋,熙熙樓的掌柜王于貴愁眉苦臉地迎上來梭伐,“玉大人,你說我怎么就攤上這事仰担『叮” “怎么了?”我有些...
    開封第一講書人閱讀 162,823評論 0 353
  • 文/不壞的土叔 我叫張陵摔蓝,是天一觀的道長赂苗。 經(jīng)常有香客問我,道長贮尉,這世上最難降的妖魔是什么哑梳? 我笑而不...
    開封第一講書人閱讀 58,204評論 1 292
  • 正文 為了忘掉前任,我火速辦了婚禮绘盟,結果婚禮上,老公的妹妹穿的比我還像新娘悯仙。我一直安慰自己龄毡,他們只是感情好,可當我...
    茶點故事閱讀 67,228評論 6 388
  • 文/花漫 我一把揭開白布锡垄。 她就那樣靜靜地躺著沦零,像睡著了一般。 火紅的嫁衣襯著肌膚如雪货岭。 梳的紋絲不亂的頭發(fā)上路操,一...
    開封第一講書人閱讀 51,190評論 1 299
  • 那天疾渴,我揣著相機與錄音,去河邊找鬼屯仗。 笑死搞坝,一個胖子當著我的面吹牛,可吹牛的內(nèi)容都是我干的魁袜。 我是一名探鬼主播桩撮,決...
    沈念sama閱讀 40,078評論 3 418
  • 文/蒼蘭香墨 我猛地睜開眼,長吁一口氣:“原來是場噩夢啊……” “哼峰弹!你這毒婦竟也來了店量?” 一聲冷哼從身側響起,我...
    開封第一講書人閱讀 38,923評論 0 274
  • 序言:老撾萬榮一對情侶失蹤鞠呈,失蹤者是張志新(化名)和其女友劉穎融师,沒想到半個月后,有當?shù)厝嗽跇淞掷锇l(fā)現(xiàn)了一具尸體蚁吝,經(jīng)...
    沈念sama閱讀 45,334評論 1 310
  • 正文 獨居荒郊野嶺守林人離奇死亡旱爆,尸身上長有42處帶血的膿包…… 初始之章·張勛 以下內(nèi)容為張勛視角 年9月15日...
    茶點故事閱讀 37,550評論 2 333
  • 正文 我和宋清朗相戀三年,在試婚紗的時候發(fā)現(xiàn)自己被綠了灭将。 大學時的朋友給我發(fā)了我未婚夫和他白月光在一起吃飯的照片疼鸟。...
    茶點故事閱讀 39,727評論 1 348
  • 序言:一個原本活蹦亂跳的男人離奇死亡,死狀恐怖庙曙,靈堂內(nèi)的尸體忽然破棺而出空镜,到底是詐尸還是另有隱情,我是刑警寧澤捌朴,帶...
    沈念sama閱讀 35,428評論 5 343
  • 正文 年R本政府宣布吴攒,位于F島的核電站,受9級特大地震影響砂蔽,放射性物質(zhì)發(fā)生泄漏洼怔。R本人自食惡果不足惜,卻給世界環(huán)境...
    茶點故事閱讀 41,022評論 3 326
  • 文/蒙蒙 一左驾、第九天 我趴在偏房一處隱蔽的房頂上張望镣隶。 院中可真熱鬧,春花似錦诡右、人聲如沸安岂。這莊子的主人今日做“春日...
    開封第一講書人閱讀 31,672評論 0 22
  • 文/蒼蘭香墨 我抬頭看了看天上的太陽域那。三九已至,卻和暖如春猜煮,著一層夾襖步出監(jiān)牢的瞬間次员,已是汗流浹背败许。 一陣腳步聲響...
    開封第一講書人閱讀 32,826評論 1 269
  • 我被黑心中介騙來泰國打工, 沒想到剛下飛機就差點兒被人妖公主榨干…… 1. 我叫王不留淑蔚,地道東北人市殷。 一個月前我還...
    沈念sama閱讀 47,734評論 2 368
  • 正文 我出身青樓,卻偏偏與公主長得像束倍,于是被迫代替她去往敵國和親被丧。 傳聞我的和親對象是個殘疾皇子,可洞房花燭夜當晚...
    茶點故事閱讀 44,619評論 2 354

推薦閱讀更多精彩內(nèi)容