
Word Vectors/Attention - hands on after lecture

Author: 互联网

1. Write the co-occurrence matrix $X$ for this sentence, using a 4-word context window (i.e. two context words on either side of the central word).
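A minimal sketch of building such a matrix, assuming a hypothetical example sentence (the actual sentence from the lecture is not reproduced here):

```python
import torch

# Hypothetical example sentence; substitute the sentence from the lecture.
sentence = "the quick brown fox jumps over the lazy dog"
tokens = sentence.split()
vocab = sorted(set(tokens))
idx = {w: i for i, w in enumerate(vocab)}

window = 2  # two context words on either side -> a 4-word context window
X = torch.zeros(len(vocab), len(vocab))
for center, w in enumerate(tokens):
    lo = max(0, center - window)
    hi = min(len(tokens), center + window + 1)
    for j in range(lo, hi):
        if j != center:  # count each context word around the central word
            X[idx[w], idx[tokens[j]]] += 1
```

Because the window is symmetric, the resulting matrix is symmetric as well.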

 

2. Use torch.svd() to compute the singular value decomposition of this matrix: $X = USV^T$.
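A short sketch of this step, using a random matrix as a stand-in for the co-occurrence matrix from step 1. Note that torch.svd() is deprecated in recent PyTorch in favor of torch.linalg.svd(), which returns $V^T$ directly rather than $V$:

```python
import torch

torch.manual_seed(0)
X = torch.rand(8, 8)  # stand-in for the co-occurrence matrix from step 1

# torch.svd returns U, S, V such that X = U @ diag(S) @ V^T
U, S, V = torch.svd(X)
X_rebuilt = U @ torch.diag(S) @ V.T
```

Reconstructing X from the three factors is a quick sanity check that the decomposition is correct.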

 

3. Extract a word representation from the first two columns of $U$ and use matplotlib to plot the words on a 2-dimensional graph.
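A sketch of the plotting step, assuming the hypothetical vocabulary from step 1 and a random matrix in place of the real co-occurrence counts:

```python
import torch
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt

# Hypothetical vocabulary; in the exercise this comes from the given sentence.
words = ["brown", "dog", "fox", "jumps", "lazy", "over", "quick", "the"]
torch.manual_seed(0)
U, _, _ = torch.svd(torch.rand(len(words), len(words)))

coords = U[:, :2]  # first two columns of U give a 2-D vector per word
fig, ax = plt.subplots()
ax.scatter(coords[:, 0], coords[:, 1])
for i, w in enumerate(words):
    ax.annotate(w, (coords[i, 0].item(), coords[i, 1].item()))
fig.savefig("word_plot.png")
```

Annotating each point with its word makes it easy to eyeball which words the low-rank representation places near each other.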

Source: https://www.cnblogs.com/selfmade-Henderson/p/16486817.html