
Basics of Neural Network Programming - Computation Graph


These are my notes from studying the class Neural Networks & Deep Learning by Andrew Ng, section 2.7, Computation Graph. I share them with you and hope they help.


The computations of a neural network are organized in terms of a forward propagation step, in which we compute the output of the network, followed by a backward propagation step, which we use to compute gradients or derivatives. The computation graph explains why it is organized this way.

Let's use a simpler example than logistic regression to explain how a computation graph works.

Suppose:

J(a,b,c)=3(a+bc)

Computing this function actually involves three distinct steps:

u = bc
v = a + u
J = 3v

We can take these three steps and draw them in a computation graph, as shown below:

In addition, we've shown a concrete example in the computation graph by setting the variables a, b, and c to specific values.
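To make the left-to-right computation concrete, here is a minimal Python sketch of the forward pass, one line per node of the graph. The input values a = 5, b = 3, c = 2 are assumed for illustration (the lecture's figure with its concrete numbers is not reproduced here):

```python
# Forward pass through the computation graph, one step per node.
# The inputs a = 5, b = 3, c = 2 are assumed values for illustration.
a, b, c = 5, 3, 2

u = b * c   # first node:  u = bc     -> 6
v = a + u   # second node: v = a + u  -> 11
J = 3 * v   # output node: J = 3v     -> 33

print(u, v, J)  # 6 11 33
```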

The computation graph comes in handy when there is some special output variable, such as J in this case, that you want to optimize. In the case of logistic regression, J is the cost function we try to optimize. In this little example, a left-to-right pass lets us compute the value of J. In the next couple of classes, we'll see that in order to compute derivatives, there will be a right-to-left pass going in the direction opposite to the red arrows.
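As a preview of that right-to-left pass, here is a hedged sketch of how the chain rule propagates derivatives back through the same graph. It uses the same assumed inputs as above, and the variable names (dJ_dv and so on) are my own shorthand for dJ/dv, not notation from the lecture:

```python
# Backward (right-to-left) pass: apply the chain rule node by node,
# starting from the output J and reusing the forward-pass values.
a, b, c = 5, 3, 2   # same assumed inputs as the forward pass
u = b * c           # 6
v = a + u           # 11

dJ_dv = 3           # J = 3v    => dJ/dv = 3
dJ_du = dJ_dv * 1   # v = a + u => dv/du = 1, so dJ/du = 3
dJ_da = dJ_dv * 1   # v = a + u => dv/da = 1, so dJ/da = 3
dJ_db = dJ_du * c   # u = bc    => du/db = c, so dJ/db = 6
dJ_dc = dJ_du * b   # u = bc    => du/dc = b, so dJ/dc = 9

print(dJ_da, dJ_db, dJ_dc)  # 3 6 9
```

Notice that each derivative is computed from quantities already available one step to its right, which is exactly why the backward pass runs in the opposite direction to the forward pass.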


Source: https://blog.csdn.net/edward_wang1/article/details/118066832