
Machine Learning Basics (2)


CONTENTS

Capacity, Overfitting and Underfitting

The No Free Lunch Theorem

Regularization

Hyperparameters and Validation Sets

Cross-Validation

The $k$-fold cross-validation algorithm.

Estimators, Bias and Variance

Point Estimation

Bias

Variance and Standard Error

Trading off Bias and Variance to Minimize Mean Squared Error

The relationship between bias and variance is tightly linked to the machine learning concepts of capacity, underfitting and overfitting. When generalization error is measured by the MSE (where bias and variance are meaningful components of generalization error), increasing capacity tends to increase variance and decrease bias, tracing again the U-shaped curve of generalization error as a function of capacity.
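
To make the tradeoff concrete, recall the decomposition that motivates it: for an estimator $\hat{\theta}$ of a parameter $\theta$,

$$\mathrm{MSE} = \mathbb{E}\big[(\hat{\theta} - \theta)^2\big] = \mathrm{Bias}(\hat{\theta})^2 + \mathrm{Var}(\hat{\theta}).$$

The sketch below is a minimal illustration, not from the original post: the sine target, noise level, sample sizes, and the choice of polynomial least-squares fits as the model family are all assumptions made for the demo. It estimates the squared bias and variance of predictions over repeated draws of the training set, with polynomial degree standing in for capacity.

```python
# Minimal sketch (assumed setup): estimate bias^2 and variance of
# polynomial regressors of increasing degree, averaged over many
# resampled training sets.
import numpy as np

rng = np.random.default_rng(0)
f = lambda x: np.sin(2 * np.pi * x)        # "true" function (assumed for the demo)
x_test = np.linspace(0, 1, 100)
n_train, n_trials, noise = 20, 200, 0.2    # illustrative constants

for degree in (1, 3, 9):                   # capacity grows with degree
    preds = np.empty((n_trials, x_test.size))
    for t in range(n_trials):
        x = rng.uniform(0, 1, n_train)
        y = f(x) + noise * rng.standard_normal(n_train)
        coef = np.polyfit(x, y, degree)    # least-squares polynomial fit
        preds[t] = np.polyval(coef, x_test)
    mean_pred = preds.mean(axis=0)         # average prediction over trials
    bias2 = np.mean((mean_pred - f(x_test)) ** 2)   # squared bias
    var = preds.var(axis=0).mean()                  # variance
    print(f"degree={degree}: bias^2={bias2:.4f}  var={var:.4f}  "
          f"sum={bias2 + var:.4f}")
```

Low degrees show large bias$^2$ and small variance (underfitting), high degrees the reverse (overfitting); their sum, the reducible part of the test MSE, is smallest at an intermediate capacity.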

Consistency

Maximum Likelihood Estimation

Conditional Log-Likelihood and Mean Squared Error

Properties of Maximum Likelihood

Bayesian Statistics

Maximum A Posteriori (MAP) Estimation

Source: https://blog.csdn.net/qq_40061421/article/details/118462267