How to get the selected features from GridSearchCV in sklearn in Python
I am using recursive feature elimination with cross-validation (RFECV) as the feature selection step inside GridSearchCV.
My code is as follows.
from sklearn.model_selection import train_test_split, StratifiedKFold, GridSearchCV
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFECV

X = df[my_features_all]
y = df['gold_standard']

x_train, x_test, y_train, y_test = train_test_split(X, y, random_state=0)

k_fold = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
clf = RandomForestClassifier(random_state=42, class_weight="balanced")
rfecv = RFECV(estimator=clf, step=1, cv=k_fold, scoring='roc_auc')

param_grid = {'estimator__n_estimators': [200, 500],
              'estimator__max_features': ['auto', 'sqrt', 'log2'],
              'estimator__max_depth': [3, 4, 5]}

CV_rfc = GridSearchCV(estimator=rfecv, param_grid=param_grid, cv=k_fold,
                      scoring='roc_auc', verbose=10, n_jobs=5)
CV_rfc.fit(x_train, y_train)
print("Finished feature selection and parameter tuning")
Now I want to get the optimal number of features and the list of selected features from the code above.
To do that, I ran the following code.
# feature selection results
print("Optimal number of features : %d" % rfecv.n_features_)
features = list(X.columns[rfecv.support_])
print(features)
However, this raises the following error:
AttributeError: 'RFECV' object has no attribute 'n_features_'
Is there another way to get these details?
I am happy to provide more details if needed.
Solution:
The rfecv object you pass to GridSearchCV is never fitted by it. GridSearchCV first clones it, and it is those clones that are fitted to the data when evaluating all the different combinations of hyperparameters.
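To make this concrete, here is a minimal check (not part of the original answer, and assuming scikit-learn >= 0.22, where check_is_fitted can be called without naming a specific attribute): the rfecv instance you created stays unfitted, while the clone stored on the grid search is fitted.

from sklearn.exceptions import NotFittedError
from sklearn.utils.validation import check_is_fitted

try:
    check_is_fitted(rfecv)                   # the object you passed to GridSearchCV
except NotFittedError:
    print("rfecv itself was never fitted")

check_is_fitted(CV_rfc.best_estimator_)      # the fitted clone; raises nothing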
Therefore, to access the selected features, you need to go through the best_estimator_ attribute of the GridSearchCV object:
CV_rfc.fit(x_train, y_train)
print("Finished feature selection and parameter tuning")

print("Optimal number of features : %d" % CV_rfc.best_estimator_.n_features_)
features = list(X.columns[CV_rfc.best_estimator_.support_])
print(features)
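As a follow-up (not part of the original answer), the fitted grid search also exposes the winning hyperparameters and score, and the RFECV clone keeps the random forest refitted on the surviving features on its estimator_ attribute, so you can pull per-feature importances as well. A short sketch, using only standard scikit-learn attributes:

# tuned hyperparameters and the cross-validated roc_auc of the best pipeline
print(CV_rfc.best_params_)
print(CV_rfc.best_score_)

# random forest refitted on the selected features; importances align with `features`
importances = CV_rfc.best_estimator_.estimator_.feature_importances_
print(dict(zip(features, importances)))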
Tags: scikit-learn, machine-learning, data-mining, python, gridsearchcv  Source: https://codeday.me/bug/20191210/2105154.html