Adding Knowledge to Language Models
Add pre-trained entity embeddings
- entity linking: map mentions occurring in text to entities in a knowledge base.
- how to incorporate pretrained entity embeddings that live in a different embedding space? Use a fusion layer that combines the contextual word representation with the linked entity's embedding (a sketch appears after this list).
- ERNIE: pretraining tasks are MLM, NSP, and a knowledge pretraining task over word-entity pairs.
- KnowBERT: learns an integrated entity linker as part of pretraining, so downstream use does not require entity annotations.
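The fusion-layer bullet above boils down to projecting a token's contextual representation and its linked entity's pretrained embedding into a shared space and combining them. Below is a minimal PyTorch sketch; the class name `EntityFusion`, the dimensions, and the GELU activation are illustrative assumptions, not the exact ERNIE or KnowBERT architecture.

```python
import torch
import torch.nn as nn

class EntityFusion(nn.Module):
    """Fuse a token representation with a pretrained entity embedding.

    Rough recipe: h = act(W_t * w + W_e * e + b). Sizes and activation are
    assumptions for illustration only.
    """
    def __init__(self, word_dim: int, entity_dim: int, hidden_dim: int):
        super().__init__()
        self.word_proj = nn.Linear(word_dim, hidden_dim)
        self.entity_proj = nn.Linear(entity_dim, hidden_dim, bias=False)
        self.act = nn.GELU()

    def forward(self, word_repr: torch.Tensor, entity_emb: torch.Tensor) -> torch.Tensor:
        # word_repr: (batch, seq, word_dim); entity_emb: (batch, seq, entity_dim).
        # entity_emb is all zeros for tokens not linked to any entity.
        return self.act(self.word_proj(word_repr) + self.entity_proj(entity_emb))

# Toy usage: token representations aligned with (possibly zero) entity embeddings.
fusion = EntityFusion(word_dim=768, entity_dim=100, hidden_dim=768)
words = torch.randn(2, 10, 768)
entities = torch.randn(2, 10, 100)
fused = fusion(words, entities)   # shape (2, 10, 768)
```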
Use an external memory
Advantage: an external memory is easy to inspect and modify (think of editing a SQL database) without retraining the model.
- KGLM: builds a local knowledge graph while iterating over the sequence. An LSTM predicts the type of the next word: a related entity (already in the local KG), a new entity (not in the local KG), or not an entity. Entity aliases are handled by adding an alias vocabulary alongside the standard vocabulary.
- kNN-LM: based on the observation that it is easier to learn similarities between text sequences than to predict the next word directly. Store (context representation, next word) pairs in a datastore; at inference, interpolate the nearest-neighbour distribution with the ordinary LM distribution, where lambda controls the mixing ratio (a toy sketch follows).
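To make the lambda interpolation in the kNN-LM bullet concrete, here is a small NumPy sketch. The L2 distance, the softmax over negative distances, and the toy datastore are simplified assumptions standing in for the real FAISS-backed datastore of hidden states.

```python
import numpy as np

def knn_lm_probs(context_vec, lm_probs, datastore_keys, datastore_next_ids,
                 vocab_size, k=4, lam=0.25):
    """Interpolate a kNN distribution with the base LM distribution.

    p(w) = lam * p_kNN(w) + (1 - lam) * p_LM(w)
    """
    # Squared L2 distances from the current context to every stored context.
    dists = np.sum((datastore_keys - context_vec) ** 2, axis=1)
    nn_idx = np.argsort(dists)[:k]

    # Softmax over negative distances of the k nearest neighbours
    # (shifted by the minimum distance for numerical stability).
    weights = np.exp(-(dists[nn_idx] - dists[nn_idx].min()))
    weights /= weights.sum()

    # Aggregate neighbour weights onto the next tokens they stored.
    knn_probs = np.zeros(vocab_size)
    for w, tok in zip(weights, datastore_next_ids[nn_idx]):
        knn_probs[tok] += w

    return lam * knn_probs + (1.0 - lam) * lm_probs

# Toy usage with random vectors standing in for real hidden states.
rng = np.random.default_rng(0)
keys = rng.normal(size=(100, 16))          # stored context representations
next_ids = rng.integers(0, 50, size=100)   # their observed next tokens
ctx = rng.normal(size=16)                  # current context representation
base = np.full(50, 1 / 50)                 # uniform base LM distribution
mixed = knn_lm_probs(ctx, base, keys, next_ids, vocab_size=50)
```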
Modify the training data
Mask or corrupt the training data to introduce additional training tasks (e.g., judging whether a stated fact is true or false).
Advantages: No additional memory or computation requirements. No modification of the architecture required.
- WKLM: Weakly Supervised Knowledge-Pretrained Language Model. Trains the model to distinguish between true and false knowledge, using both an MLM (token-level) loss and an entity replacement (entity-level) loss.
- Learn inductive biases through masking (i.e., cleverer masking strategies; a toy span-masking sketch follows this list):
  - ERNIE: phrase-level masking and entity-level masking (for Chinese text)
  - Salient span masking: mask whole salient spans such as named entities and dates, rather than random tokens
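The masking strategies above all replace random-token masking with masking of whole informative spans. A hedged Python sketch of span-level masking; the `[MASK]` symbol and the assumption that entity spans are already provided (e.g., by an NER tagger) are simplifications for illustration.

```python
import random

MASK = "[MASK]"

def mask_salient_spans(tokens, entity_spans, mask_prob=0.5, seed=0):
    """Mask whole entity/date spans instead of random individual tokens.

    tokens: list of token strings.
    entity_spans: list of (start, end) index pairs marking salient spans,
                  assumed to come from an NER tagger or similar.
    """
    rng = random.Random(seed)
    masked = list(tokens)
    for start, end in entity_spans:
        if rng.random() < mask_prob:
            for i in range(start, end):
                masked[i] = MASK
    return masked

tokens = ["Charles", "Darwin", "was", "born", "in", "1809", "."]
spans = [(0, 2), (5, 6)]          # "Charles Darwin" and "1809"
print(mask_salient_spans(tokens, spans))
```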
Evaluating knowledge in LMs
- LAMA: Language Model Analysis probe. An exam for the model: cloze-style fill-in-the-blank queries, with no fine-tuning (a minimal probe sketch appears after this list).
- LAMA-UnHelpful Names (LAMA-UHN): removes examples from LAMA that can be answered without relational knowledge (e.g., from surface clues in entity names alone).
- Better prompts to query knowledge in LMs: the model can be sensitive to how a query is phrased; sometimes slightly rewriting the query helps the model recall what it has learned.
- Knowledge-driven downstream tasks (require fine-tuning): question answering, entity typing, relation extraction.
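A LAMA-style probe simply asks a pretrained masked LM to fill in the blank of a factual statement, with no fine-tuning. A minimal sketch using the Hugging Face `transformers` fill-mask pipeline; the choice of `bert-base-uncased` and the example query are illustrative assumptions.

```python
from transformers import pipeline

# No fine-tuning: just query the pretrained masked LM with a cloze sentence.
fill = pipeline("fill-mask", model="bert-base-uncased")

for pred in fill("The capital of France is [MASK]."):
    # Each prediction contains the filled token and the model's probability for it.
    print(f"{pred['token_str']:>10}  {pred['score']:.3f}")
```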