Short-Answer Questions
1. The cost function has the form:
J(\theta)=\frac{1}{2}\sum_{i=1}^{N}\left(h_\theta(x^{i})-y^{i}\right)^2
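As a sketch, the cost above can be computed directly with NumPy, assuming a linear hypothesis h_θ(x) = θᵀx (the toy data and array names here are illustrative, not from the original notes):

```python
import numpy as np

# Toy data: N samples with a bias column prepended (illustrative)
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])  # first column is the bias term
y = np.array([2.0, 3.0, 4.0])
theta = np.array([1.0, 1.0])  # parameters of the linear hypothesis

def cost(theta, X, y):
    # J(theta) = 1/2 * sum over i of (h_theta(x^i) - y^i)^2
    residuals = X @ theta - y
    return 0.5 * np.sum(residuals ** 2)

print(cost(theta, X, y))  # theta fits this toy data exactly, so the cost is 0.0
```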
2
3. The KNN algorithm proceeds roughly as follows:
- 1. Compute the distance between every sample in the dataset and the point to be classified;
- 2. Select the K samples closest to that point;
- 3. Find the class that the majority of those K samples belong to; that class is the predicted class of the point.

K: K is usually chosen as a small odd number such as 1, 3, 5, or 7. Too large a K increases the classification bias, and an even K can produce tied votes within a neighborhood, so a small odd value is preferred.
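The three steps above can be sketched as a minimal NumPy implementation (the function name and toy data are illustrative):

```python
import numpy as np
from collections import Counter

def knn_classify(X_train, y_train, x, k=3):
    # 1. Distance from every training sample to the query point
    dists = np.linalg.norm(X_train - x, axis=1)
    # 2. Indices of the k nearest samples
    nearest = np.argsort(dists)[:k]
    # 3. Majority vote among those k labels
    return Counter(y_train[nearest]).most_common(1)[0][0]

X_train = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.2, 4.9]])
y_train = np.array([0, 0, 1, 1])
print(knn_classify(X_train, y_train, np.array([4.8, 5.1])))  # prints 1
```

With k=3 the query point's neighbors are the two class-1 samples plus one class-0 sample, so the majority vote returns class 1.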
8. Stochastic gradient descent
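The item above only names stochastic gradient descent; a minimal sketch for the squared-error cost of question 1 might look like this (the learning rate, epoch count, and toy data are all illustrative choices, not from the original notes):

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])  # bias column + feature
y = np.array([3.0, 5.0, 7.0, 9.0])  # generated from theta = (1, 2) with no noise

theta = np.zeros(2)
lr = 0.05  # illustrative learning rate
for epoch in range(200):
    for i in rng.permutation(len(X)):  # visit one sample at a time, in random order
        grad = (X[i] @ theta - y[i]) * X[i]  # gradient of 1/2 * (h(x^i) - y^i)^2
        theta -= lr * grad

print(np.round(theta, 2))  # converges toward (1, 2)
```

Unlike batch gradient descent, each update uses the gradient of a single sample, so the parameters move after every example rather than after a full pass over the data.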
Programming Questions
1
```python
from sklearn import neighbors
from sklearn import datasets

# Train a KNN classifier on the full iris dataset
knn = neighbors.KNeighborsClassifier()
iris = datasets.load_iris()
knn.fit(iris.data, iris.target)

# Predict the class of one new sample
# (features: sepal length, sepal width, petal length, petal width)
predictedLabel = knn.predict([[3.5, 0.4, 2.3, 2.5]])
print(predictedLabel)
```
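To gauge how well such a classifier generalizes, the same model can be scored on a held-out split (a sketch; the 0.25 test fraction and fixed random seed are arbitrary choices):

```python
from sklearn import datasets, neighbors
from sklearn.model_selection import train_test_split

iris = datasets.load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.25, random_state=0)

knn = neighbors.KNeighborsClassifier(n_neighbors=5)  # K = 5 neighbors
knn.fit(X_train, y_train)
print(knn.score(X_test, y_test))  # fraction of test samples classified correctly
```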
2
```python
import numpy as np

a = np.array([[0, 1, 2, 3],
              [0, 1, 2, 3],
              [0, 1, 2, 3],
              [0, 1, 2, 3]])

def transform(a):
    # Transpose the square matrix in place: swap each element
    # above the diagonal with its mirror image below the diagonal
    for i in range(len(a) - 1):
        for j in range(i + 1, len(a[i])):
            a[i][j], a[j][i] = a[j][i], a[i][j]
    print(a)

transform(a)
```
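The hand-rolled swap can be checked against NumPy's built-in transpose, `a.T` (this snippet is only for verification and is not part of the original answer):

```python
import numpy as np

a = np.array([[0, 1, 2, 3]] * 4)
expected = a.T.copy()  # NumPy's built-in transpose, copied for comparison

# In-place swap of each element above the diagonal with its mirror below it
for i in range(len(a) - 1):
    for j in range(i + 1, len(a)):
        a[i][j], a[j][i] = a[j][i], a[i][j]

print(np.array_equal(a, expected))  # prints True
```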