The main idea of SVM can be summarized in two points: find the decision boundary that maximizes the margin to the nearest training points (the support vectors), and tolerate a controlled amount of error when the data is not perfectly separable. These correspond to the hard-margin and soft-margin variants below.
Hard Margin SVM
Everything starts from the formula for the distance from a point to a hyperplane in n-dimensional space.
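For a hyperplane $w^T x + b = 0$, the distance from a point $x$ is:

$$d = \frac{\lvert w^T x + b \rvert}{\lVert w \rVert}$$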
Every red and blue point must satisfy an inequality constraint: each class has to lie on or beyond its own side of the margin. With labels $y_i \in \{-1, +1\}$, both cases combine into $y_i(w^T x_i + b) \ge 1$.
The problem finally reduces to this: maximizing the margin is equivalent to minimizing $\lVert w \rVert$ while every point satisfies its margin constraint. That is, it becomes a constrained optimization problem.
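Written out, this is the standard hard-margin formulation:

$$\min_{w,b}\ \frac{1}{2}\lVert w \rVert^2 \qquad \text{s.t.}\quad y_i\left(w^T x_i + b\right) \ge 1,\ i = 1, \dots, m$$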
Soft Margin SVM
Soft margin adds slack values $\zeta_i$ to the objective as a regularization-style term, giving the model some tolerance for errors. The larger C is, the smaller the tolerance; the smaller C is, the larger the tolerance.
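With slack variables $\zeta_i$ and penalty weight C, the standard soft-margin objective is:

$$\min_{w,b,\zeta}\ \frac{1}{2}\lVert w \rVert^2 + C\sum_{i=1}^{m}\zeta_i \qquad \text{s.t.}\quad y_i\left(w^T x_i + b\right) \ge 1 - \zeta_i,\ \zeta_i \ge 0$$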
Using SVM in scikit-learn
```python
from sklearn import datasets
import numpy as np
import matplotlib.pyplot as plt

# prepare the data: two iris classes, first two features only
iris = datasets.load_iris()
X = iris['data']
y = iris['target']
X = X[y < 2, :2]
y = y[y < 2]

# standardize the data (SVC is distance-based, so features should be scaled)
from sklearn.preprocessing import StandardScaler
stdScaler = StandardScaler()
stdScaler.fit(X)
X_standard = stdScaler.transform(X)

# instantiate an SVC object and train the model
from sklearn.svm import LinearSVC
svc = LinearSVC(C=1e9)
svc.fit(X_standard, y)
```
```python
def plot_svc_decision_boundary(model, axis):
    x0, x1 = np.meshgrid(
        np.linspace(axis[0], axis[1], int((axis[1] - axis[0]) * 100)),
        np.linspace(axis[2], axis[3], int((axis[3] - axis[2]) * 100))
    )
    X_new = np.c_[x0.ravel(), x1.ravel()]
    y_predict = model.predict(X_new)
    zz = y_predict.reshape(x0.shape)

    from matplotlib.colors import ListedColormap
    custom_cmap = ListedColormap(['#EF9A9A', '#FFF59D', '#90CAF9'])
    plt.contourf(x0, x1, zz, cmap=custom_cmap)

    # besides the decision boundary, also draw the two margin lines
    # that pass through the support vectors
    w = model.coef_[0]
    b = model.intercept_[0]
    # decision boundary: w0*x0 + w1*x1 + b = 0  =>  x1 = -w0/w1 * x0 - b/w1
    # margin lines:      w0*x0 + w1*x1 + b = ±1
    plot_x = np.linspace(axis[0], axis[1], 200)
    up_y = -w[0] / w[1] * plot_x - b / w[1] + 1 / w[1]
    down_y = -w[0] / w[1] * plot_x - b / w[1] - 1 / w[1]
    # keep only the parts of the margin lines inside the plot area
    up_index = (up_y >= axis[2]) & (up_y <= axis[3])
    down_index = (down_y >= axis[2]) & (down_y <= axis[3])
    plt.plot(plot_x[up_index], up_y[up_index], color='black')
    plt.plot(plot_x[down_index], down_y[down_index], color='black')

plot_svc_decision_boundary(svc, axis=[-3, 3, -3, 3])
plt.scatter(X_standard[y == 0, 0], X_standard[y == 0, 1])
plt.scatter(X_standard[y == 1, 0], X_standard[y == 1, 1])
plt.show()
```
With `svc = LinearSVC(C=1e9)`, C is so large that the model behaves like a hard-margin SVM: no training point is allowed inside the margin. Retraining with `svc = LinearSVC(C=0.1)` widens the tolerance, and some points fall inside or even across the margin.
Applying polynomial features to SVM
```python
# generate a toy dataset with make_moons, noise level 0.15
X, y = datasets.make_moons(noise=0.15)
plt.scatter(X[y == 0, 0], X[y == 0, 1])
plt.scatter(X[y == 1, 0], X[y == 1, 1])
plt.show()

from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import Pipeline

def PolynomialSVC(degree, C=1.0):
    return Pipeline([
        ("poly", PolynomialFeatures(degree=degree)),
        ("std_standard", StandardScaler()),
        ("svc", LinearSVC(C=C))
    ])
```
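The excerpt defines the pipeline but never fits it; a plausible usage looks like the following (degree=3 is an arbitrary choice, and `plot_decision_boundary` is the helper defined later in the RBF section):

```python
poly_svc = PolynomialSVC(degree=3)
poly_svc.fit(X, y)

plot_decision_boundary(poly_svc, axis=[-1.5, 2.5, -1.0, 1.5])
plt.scatter(X[y == 0, 0], X[y == 0, 1])
plt.scatter(X[y == 1, 0], X[y == 1, 1])
plt.show()
```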
SVM with a polynomial kernel
```python
from sklearn.svm import SVC

def PolynomialKernelSVC(degree, C=1.0):
    return Pipeline([
        ("std_scaler", StandardScaler()),
        ("kernelSVC", SVC(kernel="poly", degree=degree, C=C))
    ])

poly_kernel_svc = PolynomialKernelSVC(degree=5)
poly_kernel_svc.fit(X, y)

# plot_decision_boundary is defined in the RBF section below
plot_decision_boundary(poly_kernel_svc, axis=[-1.5, 2.5, -1.0, 1.5])
plt.scatter(X[y == 0, 0], X[y == 0, 1])
plt.scatter(X[y == 1, 0], X[y == 1, 1])
plt.show()
```
Unlike the LinearSVC approach above, there is no explicit PolynomialFeatures step here: SVC(kernel="poly") performs the polynomial transformation implicitly through the kernel trick.
How it works
In the dual form of the optimization problem, training samples appear only through inner products $x_i \cdot x_j$. The kernel trick rewrites the objective so that every inner product is replaced by a kernel function $K$:

$$\max_{\alpha}\ \sum_{i=1}^{m}\alpha_i - \frac{1}{2}\sum_{i=1}^{m}\sum_{j=1}^{m}\alpha_i\alpha_j y_i y_j K(x_i, x_j) \qquad \text{s.t.}\quad 0 \le \alpha_i \le C,\ \sum_{i=1}^{m}\alpha_i y_i = 0$$
The explicit polynomial approach first maps each $x_i$, $x_j$ into a new higher-dimensional feature matrix and then takes inner products. The polynomial kernel is exactly such a function $K$: it computes the entries of that new matrix directly from the original vectors, achieving the same effect as the explicit polynomial transformation without ever materializing it.
For the degree-2 case, the kernel is computed as $K(x, y) = (x \cdot y + 1)^2$: expanding the square yields exactly the inner product of the explicit degree-2 feature vectors. In general, with $d$ standing for the degree, $K(x, y) = (x \cdot y + 1)^d$ computes the new matrix the polynomial-kernel way.
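As a sanity check (illustrative code, not from the original post): the explicit degree-2 feature map `phi` below is constructed so that its inner product reproduces $(x \cdot y + 1)^2$:

```python
import numpy as np

def phi(x):
    """Explicit degree-2 feature map whose inner product equals (x.y + 1)**2."""
    n = len(x)
    feats = [1.0]
    feats += [np.sqrt(2) * xi for xi in x]          # linear terms
    feats += [xi * xi for xi in x]                  # squared terms
    feats += [np.sqrt(2) * x[i] * x[j]              # cross terms
              for i in range(n) for j in range(i + 1, n)]
    return np.array(feats)

x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0])

explicit = phi(x) @ phi(y)      # inner product in the expanded feature space
kernel   = (x @ y + 1) ** 2     # polynomial kernel, computed directly
print(explicit, kernel)         # both print 144.0
```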
RBF Kernel (Gaussian kernel)
gamma is the hyperparameter of the Gaussian kernel $K(x, y) = e^{-\gamma \lVert x - y \rVert^2}$: the larger gamma, the narrower the Gaussian around each sample and the more complex the resulting model.
```python
def RBFKernelSVC(gamma=1.0):
    return Pipeline([
        ("std_scaler", StandardScaler()),
        ("svc", SVC(kernel="rbf", gamma=gamma))
    ])

svc = RBFKernelSVC()
svc.fit(X, y)
```
```python
def plot_decision_boundary(model, axis):
    x0, x1 = np.meshgrid(
        np.linspace(axis[0], axis[1], int((axis[1] - axis[0]) * 100)),
        np.linspace(axis[2], axis[3], int((axis[3] - axis[2]) * 100))
    )
    X_new = np.c_[x0.ravel(), x1.ravel()]
    y_predict = model.predict(X_new)
    zz = y_predict.reshape(x0.shape)

    from matplotlib.colors import ListedColormap
    custom_cmap = ListedColormap(['#EF9A9A', '#FFF59D', '#90CAF9'])
    plt.contourf(x0, x1, zz, cmap=custom_cmap)

plot_decision_boundary(svc, axis=[-1.5, 2.5, -1.0, 1.5])
plt.scatter(X[y == 0, 0], X[y == 0, 1])
plt.scatter(X[y == 1, 0], X[y == 1, 1])
plt.show()
```
With gamma=1.0, the Gaussian-kernel SVC fits the moons data well. With gamma=100, the decision region shrinks to narrow islands around individual training points: overfitting. With gamma=0.1, the boundary degenerates to an almost linear split: underfitting.
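The three plots are not preserved in this excerpt; a minimal sketch to regenerate them, reusing `RBFKernelSVC` and `plot_decision_boundary` from above:

```python
# compare several gamma values on the moons data
for gamma in [0.1, 1.0, 100]:
    svc = RBFKernelSVC(gamma=gamma)
    svc.fit(X, y)
    plot_decision_boundary(svc, axis=[-1.5, 2.5, -1.0, 1.5])
    plt.scatter(X[y == 0, 0], X[y == 0, 1])
    plt.scatter(X[y == 1, 0], X[y == 1, 1])
    plt.title(f"gamma={gamma}")
    plt.show()
```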
Using the SVM idea to solve regression problems
SVR turns the margin idea around: instead of pushing points out of the margin, it fits a tube of width epsilon around the regression line, and only points falling outside the tube contribute to the loss.
```python
# Boston housing data
# note: load_boston was removed in scikit-learn 1.2; on newer versions,
# substitute another regression dataset such as fetch_california_housing
boston = datasets.load_boston()
X = boston['data']
y = boston['target']

from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y)

from sklearn.svm import LinearSVR

# epsilon is the hyperparameter: the width of the no-penalty tube
def StandardLinearSVR(epsilon=0.1):
    return Pipeline([
        ("std_scaler", StandardScaler()),
        ("svc", LinearSVR(epsilon=epsilon))
    ])

lin_svr = StandardLinearSVR()
lin_svr.fit(X_train, y_train)
lin_svr.score(X_test, y_test)
# >>> 0.6735924094720267
```
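Besides epsilon, LinearSVR also takes the usual C; a quick, illustrative way to probe both is a grid search (the grid values here are arbitrary):

```python
from sklearn.model_selection import GridSearchCV

param_grid = {
    "svc__epsilon": [0.01, 0.1, 1.0],   # width of the no-penalty tube
    "svc__C": [0.1, 1.0, 10.0],         # regularization strength
}
grid = GridSearchCV(StandardLinearSVR(), param_grid, cv=5)
grid.fit(X_train, y_train)
print(grid.best_params_, grid.best_score_)
```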
Author: 冰源_63ad
Link: https://www.jianshu.com/p/51ab4c904dc3