A pandas implementation of the kNN algorithm from Machine Learning in Action, shared here for reference; the details are as follows.
I have started working through Machine Learning in Action and plan to go back to Zhou Zhihua's Machine Learning after finishing it. The code in Machine Learning in Action is written entirely with numpy, which is somewhat cumbersome, so I decided to reimplement it with pandas, which also lets me review what I learned from Python for Data Analysis. The testing approach in the current chapter feels too crude, so I will leave it for now and rewrite it once I have learned more.
# coding: gbk
import pandas as pd
import numpy as np


def getdata(path):
    # Read a tab-separated file: all columns except the last are features,
    # the last column is the label.
    data = pd.read_csv(path, header=None, sep='\t')
    character = data.iloc[:, :-1]
    label = data.iloc[:, -1]
    # Min-max normalization of every feature column
    chara_max = character.max()
    chara_min = character.min()
    chara_range = chara_max - chara_min
    normal_chara = (character - chara_min) / chara_range
    return normal_chara, label  # return the normalized features and the labels


def knn(inX, normal_chara, label, k):
    # Euclidean distance from the query point inX to every sample
    data_sub = normal_chara - inX
    data_square = data_sub.applymap(np.square)
    data_sum = data_square.sum(axis=1)
    data_sqrt = data_sum.map(np.sqrt)
    # Positions of the samples sorted by distance, nearest first
    dis_sort = data_sqrt.argsort()
    # Majority vote among the k nearest neighbours
    k_label = label[dis_sort[:k]]
    label_sort = k_label.value_counts()
    res_label = label_sort.index[0]
    return res_label  # classify with the kNN algorithm
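As a quick sanity check, the two functions above can be wired together as follows. This is only a minimal usage sketch: the file name datingTestSet2.txt is a hypothetical placeholder for any tab-separated dataset whose columns are numeric features followed by one label column, and the query point is simply the first (already normalized) row of the data.

# Minimal usage sketch, not from the book; 'datingTestSet2.txt' is a placeholder path.
normal_chara, label = getdata('datingTestSet2.txt')
inX = normal_chara.iloc[0]              # classify the first sample against the rest
print(knn(inX, normal_chara, label, k=3))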
Here is another snippet to share: a basic KNN implementation with numpy.
# -*- coding: utf-8 -*-
import numpy as np
import operator


def get_data(dataset):
    # Split the raw string array into float features and string labels
    x = dataset[:, :-1].astype(float)
    y = dataset[:, -1]
    return x, y


def knn_classifier(dataset, predict, k=3):
    x, y = get_data(dataset)
    # Euclidean distance from the query point to every sample
    distance = np.sum((predict - x) ** 2, axis=1) ** 0.5
    sorted_index = np.argsort(distance)  # sample indices, nearest first
    countLabel = {}
    for i in range(k):
        label = y[sorted_index[i]]
        countLabel[label] = countLabel.get(label, 0) + 1
    # Majority vote: sort the (label, count) pairs by count in descending order
    sorted_count = sorted(countLabel.items(), key=operator.itemgetter(1), reverse=True)
    return sorted_count[0][0]


if __name__ == '__main__':
    dataset = np.loadtxt("dataset.txt", dtype=str, delimiter=",")
    predict = np.array([2, 2])
    label = knn_classifier(dataset, predict, 3)
    print(label)
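The script expects a comma-separated dataset.txt whose columns are numeric features followed by a label. For a self-contained smoke test without any file, you could also build a tiny array in memory and call knn_classifier on it directly; the points below are purely illustrative values I made up to show the layout, not data from the book.

# Illustrative toy data (made up): class 'A' clustered near (1, 1), class 'B' near (4, 4).
toy = np.array([
    ['1.0', '1.1', 'A'],
    ['1.2', '0.9', 'A'],
    ['4.1', '3.9', 'B'],
    ['3.9', '4.2', 'B'],
])
print(knn_classifier(toy, np.array([1.1, 1.0]), k=3))  # expected to print 'A'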
That is the pandas take on the kNN algorithm from Machine Learning in Action. For more articles on this topic, please follow the related posts on haodaima.com!