
python – ValueError when using recursive feature elimination with an RBF-kernel SVM in scikit-learn


I am trying to use the recursive feature elimination (RFE) function in scikit-learn but keep getting the error ValueError: coef_ is only available when using a linear kernel. I am trying to do feature selection for a support vector classifier (SVC) with an RBF kernel. The following example from the website runs fine:

print(__doc__)

from sklearn.svm import SVC
from sklearn.cross_validation import StratifiedKFold
from sklearn.feature_selection import RFECV
from sklearn.datasets import make_classification
from sklearn.metrics import zero_one_loss

# Build a classification task using 3 informative features
X, y = make_classification(n_samples=1000, n_features=25, n_informative=3,
                       n_redundant=2, n_repeated=0, n_classes=8,
                       n_clusters_per_class=1, random_state=0)

# Create the RFE object and compute a cross-validated score.
svc = SVC(kernel="linear")
rfecv = RFECV(estimator=svc, step=1, cv=StratifiedKFold(y, 2),
          scoring='accuracy')
rfecv.fit(X, y)

print("Optimal number of features : %d" % rfecv.n_features_)

# Plot number of features VS. cross-validation scores
import pylab as pl
pl.figure()
pl.xlabel("Number of features selected")
pl.ylabel("Cross validation score (nb of misclassifications)")
pl.plot(range(1, len(rfecv.grid_scores_) + 1), rfecv.grid_scores_)
pl.show()

However, simply changing the kernel type from linear to rbf, as shown below, produces the error:

print(__doc__)

from sklearn.svm import SVC
from sklearn.cross_validation import StratifiedKFold
from sklearn.feature_selection import RFECV
from sklearn.datasets import make_classification
from sklearn.metrics import zero_one_loss

# Build a classification task using 3 informative features
X, y = make_classification(n_samples=1000, n_features=25, n_informative=3,
                       n_redundant=2, n_repeated=0, n_classes=8,
                       n_clusters_per_class=1, random_state=0)

# Create the RFE object and compute a cross-validated score.
svc = SVC(kernel="rbf")
rfecv = RFECV(estimator=svc, step=1, cv=StratifiedKFold(y, 2),
          scoring='accuracy')
rfecv.fit(X, y)

print("Optimal number of features : %d" % rfecv.n_features_)

# Plot number of features VS. cross-validation scores
import pylab as pl
pl.figure()
pl.xlabel("Number of features selected")
pl.ylabel("Cross validation score (nb of misclassifications)")
pl.plot(range(1, len(rfecv.grid_scores_) + 1), rfecv.grid_scores_)
pl.show()

This seems like it could be a bug, but it would be great if someone could point out what I am doing wrong. I am running Python 2.7.6 with scikit-learn version 0.14.1.

Thanks for the help!

Answer:

This appears to be the expected behaviour. RFECV requires the estimator to have a coef_ attribute that expresses feature importances:

estimator : object

A supervised learning estimator with a fit method that updates a coef_ attribute that holds the fitted parameters. Important features must correspond to high absolute values in the coef_ array.
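As a minimal sketch of an estimator that does satisfy this requirement (assuming a recent scikit-learn, where cv can simply be an integer instead of the 0.14-era StratifiedKFold(y, 2) call), a linear-kernel SVC exposes coef_ and therefore works with RFECV:

from sklearn.datasets import make_classification
from sklearn.feature_selection import RFECV
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=10, n_informative=3,
                           random_state=0)

# A linear-kernel SVC exposes coef_, so RFECV can rank features by their weights.
selector = RFECV(estimator=SVC(kernel="linear"), step=1, cv=3, scoring="accuracy")
selector.fit(X, y)
print(selector.ranking_)     # rank 1 marks the features RFECV keeps
print(selector.n_features_)  # optimal number of features found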

According to the documentation, changing the kernel to RBF makes the SVC non-linear, and the coef_ attribute is then no longer available:

coef_

array, shape = [n_class-1, n_features]

Weights assigned to the features (coefficients in the primal problem). This is only available in the case of linear kernel.

When the kernel is not linear, the SVC (source) raises this error as soon as RFECV tries to access coef_.
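As a minimal sketch, the underlying failure can be reproduced without RFECV at all (the exception type has varied across scikit-learn versions: older releases raise ValueError, newer ones AttributeError):

from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=100, n_features=10, random_state=0)

svc_rbf = SVC(kernel="rbf").fit(X, y)
try:
    svc_rbf.coef_   # defined only when kernel="linear"
except (ValueError, AttributeError) as exc:
    print(exc)      # "coef_ is only available when using a linear kernel"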

Tags: python, scikit-learn, rfe
Source: https://codeday.me/bug/20190725/1530671.html