
https://blog.csdn.net/FnqTyr45/article/details/110675336


Reviewer #1: To build a shared learning model that trains parameters at the distributed edge while protecting data privacy, this paper designs a novel distributed hierarchical tensor depth-optimization algorithm. The model parameters, which live in a high-dimensional tensor space, are compressed into a union of low-dimensional subspaces to reduce the bandwidth consumption and storage requirements of federated learning. Experimental results show that the proposed algorithm reduces both the communication-bandwidth load and the energy consumption at the edge. The article is original and presents a distinctive idea, but several areas need improvement, as follows:
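The summary above describes the compression idea only at a high level. As a rough illustration of the general technique it names (low-rank compression of model parameters before they leave an edge device), the sketch below shrinks one parameter block with a truncated SVD. The function names, the fixed `rank`, and the single-matrix setting are assumptions made here for illustration; this is not the paper's actual hierarchical tensor algorithm.

```python
import numpy as np

def compress(weights: np.ndarray, rank: int):
    """Project a 2-D parameter block onto a rank-`rank` subspace via truncated SVD.

    Illustrative only: it shows how a low-rank factorization shrinks the payload
    a client would upload during a federated-learning round.
    """
    u, s, vt = np.linalg.svd(weights, full_matrices=False)
    return u[:, :rank], s[:rank], vt[:rank, :]  # three small factors

def decompress(u, s, vt):
    """Reassemble the low-rank approximation on the aggregator side."""
    return (u * s) @ vt

# Toy round-trip: a 512x512 layer at rank 16 uploads ~6% of the original floats.
w = np.random.randn(512, 512)
u, s, vt = compress(w, rank=16)
sent = u.size + s.size + vt.size
print(f"compression ratio: {sent / w.size:.3f}")
w_hat = decompress(u, s, vt)
print(f"relative error: {np.linalg.norm(w - w_hat) / np.linalg.norm(w):.3f}")
```

The trade-off this exposes is the usual one: a smaller `rank` lowers bandwidth and storage but increases the approximation error of the reconstructed parameters.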

1. The Introduction is not sufficiently hierarchical or logical. It should make the relationships between paragraphs explicit, refer to recent literature, and express certain terms more accurately.
2. A related-work section that discusses the contributions of this paper and its differences from previous works should be included.
3. Some terminology in the paper is used incorrectly. For example, a pooling layer has no weights; this needs to be corrected (see the brief check after this list).
4. The reference format is inaccurate and should be revised; if recent references are too few, more should be added. Finally, emphasis should be placed on strengthening the language and grammar: there are grammar and spelling errors throughout.
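On point 3, the claim is easy to verify empirically: standard pooling layers carry no trainable parameters. A minimal check, assuming PyTorch is available (this snippet is illustrative and not taken from the manuscript under review):

```python
import torch.nn as nn

# A pooling layer has no learnable weights: its parameter count is zero.
pool = nn.MaxPool2d(kernel_size=2)
print(sum(p.numel() for p in pool.parameters()))  # 0

# A convolutional layer, by contrast, has weights and biases.
conv = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=3)
print(sum(p.numel() for p in conv.parameters()))  # 8*3*3*3 + 8 = 224
```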

Source: https://www.cnblogs.com/devilmaycry812839668/p/16294632.html