
Andrew Ng's Coursera Machine Learning Specialization, Machine Learning: Supervised Machine Learning: Regression and Classification, Part 1


Practice quiz: Supervised vs unsupervised learning

Question 1: Which are the two common types of supervised learning? (Choose two)

[Correct] Regression
[Explanation] Regression predicts a number from among potentially infinitely many possible numbers.
[Incorrect] Clustering
[Correct] Classification
[Explanation] Classification predicts from among a limited set of categories (also called classes). These could be a limited set of numbers or labels such as "cat" or "dog".

Question 2: Which of these is a type of unsupervised learning?

Classification
[Correct] Clustering
Regression
[Explanation] Clustering groups data into groups or clusters based on how similar each item (such as a hospital patient or shopping customer) is to the others.
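To make the clustering idea concrete, here is a minimal 1-D k-means sketch in NumPy. This is an illustrative example, not course code; the data, function names, and choice of k=2 are all invented for this sketch.

```python
import numpy as np

def kmeans(points, k, iters=10, seed=0):
    """Minimal k-means: group unlabeled points into k clusters by similarity."""
    rng = np.random.default_rng(seed)
    # Start from k distinct points as initial cluster centers.
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        dists = np.abs(points[:, None] - centroids[None, :])
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned points.
        centroids = np.array([points[labels == j].mean() for j in range(k)])
    return labels, centroids

# Two visually obvious groups of 1-D values (e.g., two kinds of patients):
data = np.array([1.0, 1.2, 0.9, 10.0, 10.3, 9.8])
labels, centroids = kmeans(data, k=2)
print(np.sort(centroids))  # roughly [1.03, 10.03]
```

No labels are given to the algorithm; the grouping emerges purely from how close the values are to each other, which is what distinguishes this from supervised learning.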

Practice quiz: Regression

Question 1: For linear regression, the model is f_{w,b}(x) = wx + b. Which of the following are the inputs, or features, that are fed into the model and with which the model is expected to make a prediction?

w and b
[Correct] x
m
(x, y)
[Explanation] The x, the input features, are fed into the model to generate a prediction f_{w,b}(x).
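A tiny sketch of this model in NumPy may help: the feature x is the input, while w and b are parameters held fixed at prediction time. The data and parameter values below are invented for illustration (e.g., house size in 1000s of sq. ft. predicting price in $1000s).

```python
import numpy as np

# Hypothetical training data: sizes (x) and prices (y).
x_train = np.array([1.0, 2.0])
y_train = np.array([300.0, 500.0])

def predict(x, w, b):
    """Linear model f_{w,b}(x) = w*x + b: maps an input feature x to a prediction."""
    return w * x + b

# With illustrative parameter values w=200, b=100:
print(predict(1.2, 200.0, 100.0))  # 340.0
```

Note that (x, y) pairs are only available for the training set; at prediction time the model sees x alone.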

Question 2: For linear regression, if you find parameters w and b so that J(w,b) is very close to zero, what can you conclude?

[Correct] The selected values of the parameters w and b cause the algorithm to fit the training set really well.
The selected values of the parameters w and b cause the algorithm to fit the training set really poorly.
This is never possible -- there must be a bug in the code.
[Explanation] When the cost is small, the model fits the training set well.
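This can be checked numerically with the squared-error cost from the course, J(w,b) = (1/2m) Σ (f_{w,b}(x⁽ⁱ⁾) − y⁽ⁱ⁾)². The toy data below is invented; it is chosen so that one parameter setting fits it exactly and drives the cost to zero.

```python
import numpy as np

def compute_cost(x, y, w, b):
    """Squared-error cost J(w,b) = (1/2m) * sum((w*x_i + b - y_i)^2)."""
    m = x.shape[0]
    errors = w * x + b - y
    return np.sum(errors ** 2) / (2 * m)

x_train = np.array([1.0, 2.0])
y_train = np.array([300.0, 500.0])

# Parameters that fit this tiny training set exactly give zero cost:
print(compute_cost(x_train, y_train, w=200.0, b=100.0))  # 0.0
# Poorly chosen parameters give a large cost:
print(compute_cost(x_train, y_train, w=0.0, b=0.0))      # 85000.0
```

A near-zero cost says the model fits the training set well; it says nothing yet about how it generalizes to new examples.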

Practice quiz: Train the model with gradient descent

Question 1: Gradient descent is an algorithm for finding values of parameters w and b that minimize the cost function J. When \frac{\partial J(w,b)}{\partial w} is a negative number (less than zero), what happens to w after one update step?

[Correct] w increases.
w decreases.
It is not possible to tell if w will increase or decrease.
w stays the same.
[Explanation] The learning rate is always a positive number, so if you take w minus a negative number, you end up with a new value for w that is larger (more positive).

Question 2: For linear regression, what is the update step for parameter b?

[Correct] b = b - \alpha \frac{1}{m}\sum_{i=1}^{m}\left(f_{w,b}(x^{(i)}) - y^{(i)}\right)
[Explanation] The update subtracts the learning rate \alpha times the partial derivative of the cost J(w,b) with respect to b.
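The two quiz answers above can be sketched together in NumPy: each step updates w and b simultaneously by subtracting the learning rate times the respective partial derivative, so a negative derivative makes the parameter increase. The toy data, learning rate, and iteration count below are invented for illustration, not taken from the course.

```python
import numpy as np

def compute_gradients(x, y, w, b):
    """Partial derivatives of J(w,b) = (1/2m) * sum((w*x_i + b - y_i)^2)."""
    m = x.shape[0]
    errors = w * x + b - y
    dj_dw = np.sum(errors * x) / m
    dj_db = np.sum(errors) / m
    return dj_dw, dj_db

def gradient_step(x, y, w, b, alpha):
    """One simultaneous update: w := w - alpha*dJ/dw, b := b - alpha*dJ/db."""
    dj_dw, dj_db = compute_gradients(x, y, w, b)
    return w - alpha * dj_dw, b - alpha * dj_db

x_train = np.array([1.0, 2.0])
y_train = np.array([300.0, 500.0])

# Repeated steps converge to the parameters that fit this data exactly.
w, b = 0.0, 0.0
for _ in range(5000):
    w, b = gradient_step(x_train, y_train, w, b, alpha=0.1)
print(round(w, 2), round(b, 2))  # 200.0 100.0
```

Starting from w = 0, the derivative with respect to w is negative (the model underpredicts), so the very first step makes w larger, matching the answer to Question 1.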

Tags: Andrew Ng, set, learning, parameters, number, Machine, Learning, model, Regression
Source: https://www.cnblogs.com/chuqianyu/p/16438303.html