
python – How are class_weights applied in sklearn logistic regression?


I'm interested in how sklearn applies the class weights we supply. The documentation doesn't clearly state where and how the class weights are applied, and reading the source didn't help either (it seems sklearn.svm.liblinear is used for the optimization, and I couldn't read that source because it's a .pyd file ...).

But I would guess it acts on the cost function: when class weights are specified, the cost terms for the corresponding class are multiplied by that class's weight. For example, if I have two observations, one from class 0 (weight = 0.5) and one from class 1 (weight = 1), then the cost function would be:

Cost = 0.5*log(…X_0,y_0…) + 1*log(…X_1,y_1…) + penalization

Does anyone know whether this is correct?
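To make this concrete, here is a small numeric sketch of the cost I have in mind (the predicted probabilities are made up and the penalization term is left out):

import numpy as np

# Made-up predicted probabilities for the two observations (illustrative only)
p_0 = 0.3   # P(y=1 | X_0), true label y_0 = 0, class weight 0.5
p_1 = 0.8   # P(y=1 | X_1), true label y_1 = 1, class weight 1.0

# Each observation's log-loss term is scaled by its class weight
cost = 0.5 * -np.log(1 - p_0) + 1.0 * -np.log(p_1)
print(cost)  # ≈ 0.4015 (plus the penalization term in the real objective)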

Solution:

Check the following lines in the source code:

le = LabelEncoder()
if isinstance(class_weight, dict) or multi_class == 'multinomial':
    class_weight_ = compute_class_weight(class_weight, classes, y)
    sample_weight *= class_weight_[le.fit_transform(y)]
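
In other words, the dictionary passed as class_weight is expanded into a per-sample weight vector. One way to convince yourself (a minimal sketch; the toy data and the weight dictionary are made up) is to compare fitting with class_weight against fitting with an equivalent sample_weight:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, random_state=0)
cw = {0: 0.5, 1: 1.0}  # made-up class weights

# Option 1: let LogisticRegression apply the class weights internally
clf_cw = LogisticRegression(class_weight=cw).fit(X, y)

# Option 2: expand the class weights into per-sample weights by hand
sw = np.where(y == 1, cw[1], cw[0])
clf_sw = LogisticRegression().fit(X, y, sample_weight=sw)

# Both fits solve the same weighted problem, so the coefficients
# should agree up to the solver's numerical tolerance
print(np.allclose(clf_cw.coef_, clf_sw.coef_))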

Here is the relevant part of the source code of the compute_class_weight() function:

...
else:
    # user-defined dictionary
    weight = np.ones(classes.shape[0], dtype=np.float64, order='C')
    if not isinstance(class_weight, dict):
        raise ValueError("class_weight must be dict, 'balanced', or None,"
                         " got: %r" % class_weight)
    for c in class_weight:
        i = np.searchsorted(classes, c)
        if i >= len(classes) or classes[i] != c:
            raise ValueError("Class label {} not present.".format(c))
        else:
            weight[i] = class_weight[c]
...
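
compute_class_weight() can also be called directly, which makes it easy to see the per-class weights before they are broadcast onto the samples (a small sketch with made-up labels):

import numpy as np
from sklearn.utils.class_weight import compute_class_weight

y = np.array([0, 0, 0, 1])   # made-up, imbalanced labels
classes = np.unique(y)

# User-defined dictionary: the weights are taken verbatim from the dict
print(compute_class_weight({0: 0.5, 1: 1.0}, classes=classes, y=y))  # [0.5 1. ]

# 'balanced': weights are n_samples / (n_classes * np.bincount(y))
print(compute_class_weight("balanced", classes=classes, y=y))        # ≈ [0.667 2.]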

In the snippet above, the class weights are multiplied into sample_weight, which is then used in internal functions such as _logistic_loss_and_grad, _logistic_loss, etc.:

# Logistic loss is the negative of the log of the logistic function.
out = -np.sum(sample_weight * log_logistic(yz)) + .5 * alpha * np.dot(w, w)
# NOTE: --->  ^^^^^^^^^^^^^^^
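
To see how the class weights end up scaling the loss, here is a minimal numpy re-implementation of that weighted term (a sketch of the same formula, not sklearn's actual _logistic_loss; log_logistic(t) is simply log(sigmoid(t))):

import numpy as np

def weighted_logistic_loss(w, X, y, sample_weight, alpha):
    # y is expected in {-1, +1}; sample_weight already carries the class weights
    yz = y * (X @ w)                       # per-sample margin
    log_logistic = -np.logaddexp(0, -yz)   # log(sigmoid(yz)), numerically stable
    return -np.sum(sample_weight * log_logistic) + 0.5 * alpha * np.dot(w, w)

# Doubling a sample's weight doubles its contribution to the (unpenalized) loss
w = np.array([1.0, -2.0])
X = np.array([[0.5, 1.0], [1.5, -0.5]])
y = np.array([1.0, -1.0])
print(weighted_logistic_loss(w, X, y, np.array([0.5, 1.0]), alpha=1.0))
print(weighted_logistic_loss(w, X, y, np.array([1.0, 1.0]), alpha=1.0))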

Tags: python, scikit-learn, logistic-regression
Source: https://codeday.me/bug/20190705/1388571.html