tuananhfrtk / combining-contextual-words-and-kg-embeddings
Contextual embeddings are able to encode word meaning and polysemy to some degree. However, richer semantic information requires representations beyond text, such as knowledge graphs (KG). The goal of this project is to design a model that combines contextual and KG embeddings.
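A minimal sketch of one way such a combination could work, purely illustrative: project a contextual word embedding (e.g. from BERT) and a KG entity embedding (e.g. from TransE) into a shared space and concatenate them. All dimensions, the random stand-in vectors, and the linear projections are assumptions of this sketch, not the project's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions: 768 for the contextual side, 200 for the KG side,
# 256 for each projected half of the combined representation.
ctx_dim, kg_dim, out_dim = 768, 200, 256

# Random stand-ins for pretrained embeddings of the same word/entity.
ctx_emb = rng.normal(size=ctx_dim)  # contextual word embedding
kg_emb = rng.normal(size=kg_dim)    # knowledge-graph entity embedding

# Hypothetical learned projections into a shared space (here: random).
W_ctx = rng.normal(size=(out_dim, ctx_dim)) / np.sqrt(ctx_dim)
W_kg = rng.normal(size=(out_dim, kg_dim)) / np.sqrt(kg_dim)

# Combine by concatenating the two projected views.
combined = np.concatenate([W_ctx @ ctx_emb, W_kg @ kg_emb])
print(combined.shape)  # (512,)
```

In practice the projections would be trained jointly with a downstream objective; concatenation is only one of several fusion options (others include summation or gated mixing).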