
BERT pretrained model names and download paths for PyTorch


Google's BERT pretrained models:

PyTorch BERT pretrained model names (valid values for `pretrained_model_name_or_path`) and the download URLs of their vocabulary files:

PRETRAINED_VOCAB_ARCHIVE_MAP = {
    'bert-base-uncased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
    'bert-large-uncased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt",
    'bert-base-cased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt",
    'bert-large-cased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt",
    'bert-base-multilingual-uncased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-vocab.txt",
    'bert-base-multilingual-cased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt",
    'bert-base-chinese': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt",
}
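As a sketch of how this map is used: when one of the names above is passed as `pretrained_model_name_or_path`, the library looks the name up in the map and downloads the vocabulary file from the matching URL. A minimal, self-contained illustration (the `vocab_url` helper is hypothetical and added here only for demonstration; just two entries of the map are repeated):

```python
# Assumption: a name-to-URL lookup like the one pytorch-pretrained-bert
# performs internally. Only two entries are shown for brevity.
PRETRAINED_VOCAB_ARCHIVE_MAP = {
    'bert-base-uncased': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt",
    'bert-base-chinese': "https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt",
}

def vocab_url(name: str) -> str:
    """Return the vocabulary download URL for a pretrained model name.

    Hypothetical helper, not part of the library's public API.
    """
    try:
        return PRETRAINED_VOCAB_ARCHIVE_MAP[name]
    except KeyError:
        raise ValueError(f"unknown pretrained model name: {name!r}")

print(vocab_url('bert-base-chinese'))
# prints the bert-base-chinese vocab URL from the map
```

An unknown name raises `ValueError` rather than silently falling through, which mirrors how the library reports an unrecognized `pretrained_model_name_or_path`.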


Tags: bert, layer, 12, heads, BERT, path, pytorch, base
Source: https://www.cnblogs.com/zxcayumi/p/16195958.html