
BERT


Preface

Full paper title and link: "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"

Project repository: google-research/bert

BERT stands for Bidirectional Encoder Representations from Transformers.
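
To make the name concrete, below is a minimal sketch of obtaining contextual token representations from a pre-trained BERT model. It assumes the Hugging Face `transformers` library and the `bert-base-uncased` checkpoint, neither of which is mentioned in this post (the post only links the google-research/bert repository); it is illustrative, not the repository's own API.

```python
# Minimal sketch: contextual embeddings from a pre-trained BERT encoder.
# Assumes the Hugging Face `transformers` library (not from the original post).
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# Tokenize a sentence and run it through the bidirectional encoder.
inputs = tokenizer("BERT produces bidirectional representations.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Hidden states: (batch_size, sequence_length, hidden_size=768 for bert-base)
print(outputs.last_hidden_state.shape)
```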

Source: https://www.cnblogs.com/zjuhaohaoxuexi/p/16412560.html