Homework 5 INF 552, Instructor: Mohammad Reza Rajati

1. Multi-class and Multi-Label Classification Using Support Vector Machines

(a) Download the Anuran Calls (MFCCs) Data Set from: https://archive.ics.uci.edu/ml/datasets/Anuran+Calls+%28MFCCs%29. Choose 70% of the data randomly as the training set.

(b) Each instance has three labels: Families, Genus, and Species. Each of the labels has multiple classes. We wish to solve a multi-class and multi-label problem. One of the most important approaches to multi-class classification is to train a classifier for each label. We first try this approach:

i. Research exact match and Hamming score/loss methods for evaluating multi-label classification and use them in evaluating the classifiers in this problem.

ii. Train an SVM for each of the labels, using Gaussian kernels and one-versus-all classifiers. Determine the weight of the SVM penalty and the width of the Gaussian kernel using 10-fold cross-validation.[1] You are welcome to try to solve the problem with both standardized[2] and raw attributes and report the results.

iii. Repeat 1(b)ii with L1-penalized SVMs.[3] Remember to standardize[4] the attributes. Determine the weight of the SVM penalty using 10-fold cross-validation.

iv. Repeat 1(b)iii by using SMOTE or any other method you know to remedy class imbalance. Report your conclusions about the classifiers you trained.

v. Extra Practice: Study the Classifier Chain method and apply it to the above problem.

vi. Extra Practice: Research how confusion matrices, precision, recall, ROC, and AUC are defined for multi-label classification and compute them for the classifiers you trained above.

2. K-Means Clustering on a Multi-Class and Multi-Label Data Set

Monte-Carlo Simulation: Perform the following procedures 50 times, and report the average and standard deviation of the 50 Hamming distances that you calculate.

[1] How to choose parameter ranges for SVMs? One can use wide ranges for the parameters and a fine grid (e.g.
1000 points) for cross-validation; however, this method may be computationally expensive. An alternative way is to train the SVM with very large and very small parameters on the whole training data and find very large and very small parameters for which the training accuracy is not below a threshold (e.g., 70%). Then one can select a fixed number of parameters (e.g., 20) between those points for cross-validation. For the penalty parameter, one usually has to consider increments in log(λ). For example, if one found that the accuracy of a support vector machine will not be below 70% for λ = 10^-3 and λ = 10^6, one has to choose log(λ) ∈ {-3, -2, ..., 4, 5, 6}. For the Gaussian kernel parameter, one usually chooses linear increments, e.g. σ ∈ {0.1, 0.2, ..., 2}. When both σ and λ are to be chosen using cross-validation, combinations of very small and very large λ's and σ's that keep the accuracy above a threshold (e.g. 70%) can be used to determine the ranges for σ and λ. Please note that these are very rough rules of thumb, not general procedures.

[2] It seems that the data are already normalized.

[3] The convention is to use the L1 penalty with a linear kernel.

[4] It seems that the data are already normalized.

(a) Use k-means clustering on the whole Anuran Calls (MFCCs) Data Set (do not split the data into train and test, as we are not performing supervised learning in this exercise). Choose k ∈ {1, 2, ..., 50} automatically based on one of the methods provided in the slides (CH or Gap Statistics or scree plots or Silhouettes) or any other method you know.

(b) In each cluster, determine which family is the majority by reading the true labels. Repeat for genus and species.

(c) Now for each cluster you have a majority label triplet (family, genus, species). Calculate the average Hamming distance, Hamming score, and Hamming loss[5] between the true labels and the labels assigned by clusters.

3. ISLR 10.7.2

4. Extra Practice: The rest of the problems in 10.7.

[5] Research what these scores are. For example, see the paper "A Literature Survey on Algorithms for Multi-label Learning" by Mohammad Sorower.
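For 1(b)i and 2(c), the three evaluation measures take only a few lines. A minimal NumPy sketch (the function names are my own; each row of `y_true`/`y_pred` is one sample's (family, genus, species) triplet):

```python
import numpy as np

def exact_match(y_true, y_pred):
    """Fraction of samples whose entire label vector is predicted correctly."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.mean(np.all(y_true == y_pred, axis=1)))

def hamming_loss(y_true, y_pred):
    """Fraction of individual label positions that are predicted incorrectly."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.mean(y_true != y_pred))

def hamming_score(y_true, y_pred):
    """Complement of the Hamming loss."""
    return 1.0 - hamming_loss(y_true, y_pred)
```

The average Hamming distance asked for in 2(c) is the Hamming loss multiplied by the number of labels (here, 3).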
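One possible shape for 1(b)ii, using scikit-learn's `SVC` (Gaussian/RBF kernel) wrapped in `OneVsRestClassifier` for explicit one-versus-all training, with `GridSearchCV` over C (the penalty weight) and gamma (related to the kernel width by gamma = 1/(2σ²)). The data here are a small synthetic stand-in for the MFCC features, and the grid is far coarser than footnote [1] suggests:

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))             # stand-in for the 22 MFCC attributes
Y = np.column_stack([
    np.digitize(X[:, 0], [-0.5, 0.5]),    # 3-class stand-in for "family"
    (X[:, 1] > 0).astype(int),            # stand-in for "genus"
    (X[:, 0] + X[:, 1] > 0).astype(int),  # stand-in for "species"
])

models = {}
for j, name in enumerate(["family", "genus", "species"]):
    pipe = make_pipeline(StandardScaler(),
                         OneVsRestClassifier(SVC(kernel="rbf")))
    # C is the penalty weight; gamma sets the Gaussian kernel width.
    # A real run would use the finer grids discussed in footnote [1].
    grid = {"onevsrestclassifier__estimator__C": np.logspace(-3, 6, 4),
            "onevsrestclassifier__estimator__gamma": [0.1, 0.5, 2.0]}
    models[name] = GridSearchCV(pipe, grid, cv=10).fit(X, Y[:, j])
```

Training one independent tuned classifier per label, as above, is exactly the "binary relevance"-style approach the problem describes; `models[name].best_params_` then reports the selected C and gamma for each label.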
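For 1(b)iii, footnote [3]'s convention (L1 penalty with a linear kernel) corresponds to scikit-learn's `LinearSVC` with `penalty="l1"`, which requires the primal formulation (`dual=False`). A sketch on the same kind of synthetic stand-in data, tuning only C:

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 5))
y = (X[:, 0] - X[:, 2] > 0).astype(int)   # linearly separable stand-in label

# L1 regularization drives irrelevant coefficients to exactly zero, so it
# doubles as feature selection; only the penalty weight C needs tuning.
search = GridSearchCV(
    make_pipeline(StandardScaler(),
                  LinearSVC(penalty="l1", dual=False, max_iter=10000)),
    param_grid={"linearsvc__C": np.logspace(-3, 6, 10)},
    cv=10,
).fit(X, y)
```

For the real data, the same pipeline would be fit once per label inside the loop from the previous sketch (`LinearSVC` handles multi-class targets one-versus-rest by default).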
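For 1(b)iv, the standard tool is `SMOTE` from the imbalanced-learn package (`imblearn.over_sampling.SMOTE`). To keep this sketch dependency-free, here is a minimal hand-rolled version of the same idea, with my own function name: synthesize minority samples by interpolating between a minority point and a random one of its k nearest minority neighbours.

```python
import numpy as np

def smote_oversample(X, y, minority_class, k=5, seed=0):
    """Minimal SMOTE-style sketch: create synthetic minority samples until
    the minority class matches the rest of the data in size."""
    rng = np.random.default_rng(seed)
    X_min = X[y == minority_class]
    n_new = int(np.sum(y != minority_class)) - len(X_min)
    if n_new <= 0:
        return X, y
    # k nearest neighbours computed within the minority class only
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)                 # exclude self-matches
    neighbours = np.argsort(d, axis=1)[:, :k]
    synthetic = np.empty((n_new, X.shape[1]))
    for t in range(n_new):
        i = rng.integers(len(X_min))            # random minority sample
        j = rng.choice(neighbours[i])           # one of its k neighbours
        lam = rng.random()                      # interpolation coefficient
        synthetic[t] = X_min[i] + lam * (X_min[j] - X_min[i])
    return (np.vstack([X, synthetic]),
            np.concatenate([y, np.full(n_new, minority_class, dtype=y.dtype)]))
```

Oversampling must be applied to the training folds only, never to the validation or test data; imbalanced-learn's `SMOTE` (and its `Pipeline`) handles multi-class resampling and such edge cases properly, so this sketch only illustrates the mechanism.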
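Problem 2(a)-(c) can be outlined as follows, here with scikit-learn's `KMeans`, the silhouette criterion for choosing k (any of the listed methods would do), and three well-separated synthetic blobs standing in for the real data. The k range is shrunk from {1, ..., 50} to keep the demo fast (the silhouette is undefined at k = 1 anyway):

```python
import numpy as np
from collections import Counter
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# stand-in data: three tight blobs, each with a known (family, genus, species)
X = np.vstack([rng.normal(c, 0.3, size=(60, 4)) for c in (0.0, 3.0, 6.0)])
Y = np.repeat([["A", "a", "x"], ["B", "b", "y"], ["C", "c", "z"]], 60, axis=0)

# (a) choose k by the silhouette criterion
scores = {k: silhouette_score(X, KMeans(n_clusters=k, n_init=10,
                                        random_state=0).fit_predict(X))
          for k in range(2, 8)}
best_k = max(scores, key=scores.get)
labels = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit_predict(X)

# (b) majority (family, genus, species) triplet in each cluster
majority = {c: [Counter(Y[labels == c, j]).most_common(1)[0][0]
                for j in range(3)]
            for c in np.unique(labels)}

# (c) average Hamming distance between true triplets and cluster triplets
assigned = np.array([majority[c] for c in labels])
avg_hamming_distance = float(np.mean(np.sum(assigned != Y, axis=1)))
```

Wrapping this in a loop over 50 different random seeds and collecting `avg_hamming_distance` each time gives the Monte-Carlo mean and standard deviation the problem asks for.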