Structure embedding for knowledge base completion and analytics
2017 International Joint Conference on Neural Networks (IJCNN), 2017
https://doi.org/10.1109/IJCNN.2017.7965925

Abstract
To explore the latent information in human knowledge, analysis of knowledge bases (KBs) such as WordNet and Freebase is essential. Several existing KB element embedding frameworks are used for KB structure analysis and completion. These frameworks represent the large sets of entities and relations in a KB as low-dimensional vectors, from which vector representations of entities and relations not yet contained in the KB can be inferred. While the embedding idea is sound, current embedding methods have difficulty producing proper embeddings for KB elements. These methods take entity-relation-entity triplets, which most current KBs contain, as training data and output embedding representations of entities and relations. To measure the truth of a triplet (whether the knowledge it represents is true or false), some embedding methods such as Structured Embedding (SE) project entity vectors into subspaces whose meaning is unclear for knowledge reasoning. Other methods such as TransE represent relations by simple linear vector transformations (e.g., vector addition or subtraction), which cannot handle the multiple-relation or multiple-entity matching problem, i.e., cases where multiple relations hold between two entities, or multiple entities share the same relation with one entity. Inspired by previous structured embedding methods for KB elements, we propose a new method, Bipartite Graph Network Structured Embedding (BGNSE). BGNSE combines current KB embedding methods with the bipartite graph network model, which is widely used in many fields, including image data compression and collaborative filtering. BGNSE embeds each entity-relation-entity triplet into a bipartite graph network, representing each entity by one layer of the bipartite graph and the relation by the link weight matrix between the two layers. This model has the following advantages. First, BGNSE uses a single matrix per relation, so the relation transformation between two entities can be carried out directly by forward and backward propagation through the bipartite graph network, with no need for subspace projection. Second, because of the bipartite graph network, the relation transformations between entities are nonlinear (network layer propagation), so the multiple-relation and multiple-entity matching problems can be handled. The learned entity and relation embeddings can be used for tasks such as knowledge base completion.
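A minimal sketch of the scoring idea described in the abstract, under stated assumptions: the sigmoid nonlinearity, the bidirectional reconstruction-error score, and the names `bgnse_score` and `W_r` are illustrative choices, not the paper's exact formulation or training objective. Each entity is a vector (one bipartite layer), each relation is a link weight matrix, and a triplet is scored by how well forward propagation of the head reconstructs the tail and backward propagation of the tail reconstructs the head.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bgnse_score(h, t, W_r):
    """Score a (head, relation, tail) triplet in a BGNSE-style model.

    h, t : entity embedding vectors (one bipartite layer each)
    W_r  : link weight matrix of the bipartite graph for the relation

    Forward propagation maps the head layer to a predicted tail layer;
    backward propagation maps the tail layer back to a predicted head
    layer. A plausible triplet should give small reconstruction error
    in both directions, so lower scores mean more plausible triplets.
    """
    t_pred = sigmoid(W_r @ h)    # forward pass: head layer -> tail layer
    h_pred = sigmoid(W_r.T @ t)  # backward pass: tail layer -> head layer
    return np.linalg.norm(t - t_pred) + np.linalg.norm(h - h_pred)

# Toy usage with random embeddings (dimensions are illustrative only):
rng = np.random.default_rng(0)
dim = 8
h = rng.random(dim)
t = rng.random(dim)
W_r = rng.normal(size=(dim, dim))
print(bgnse_score(h, t, W_r))
```

Because the layer propagation is nonlinear, two different relation matrices can map the same head entity to different tails, which is the property the abstract invokes against purely additive models such as TransE.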
References (16)
- R. Davis, H. Shrobe, and P. Szolovits, "What is a knowledge representation?" AI Magazine, vol. 14, no. 1, p. 17, 1993.
- A. Bordes, J. Weston, R. Collobert, and Y. Bengio, "Learning structured embeddings of knowledge bases," in Conference on Artificial Intelligence, no. EPFL-CONF-192344, 2011.
- A. Bordes, N. Usunier, A. Garcia-Duran, J. Weston, and O. Yakhnenko, "Translating embeddings for modeling multi-relational data," in Advances in Neural Information Processing Systems, 2013, pp. 2787-2795.
- G. E. Hinton and T. J. Sejnowski, "Learning and relearning in Boltzmann machines," Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vol. 1, pp. 282-317, 1986.
- R. Kindermann and J. L. Snell, Markov Random Fields and Their Applications. American Mathematical Society, 1980.
- G. E. Hinton and R. R. Salakhutdinov, "Reducing the dimensionality of data with neural networks," Science, vol. 313, no. 5786, pp. 504-507, 2006.
- A. Bordes, X. Glorot, J. Weston, and Y. Bengio, "Joint learning of words and meaning representations for open-text semantic parsing." in AISTATS, vol. 351, 2012, pp. 423-424.
- Z. Wang, J. Zhang, J. Feng, and Z. Chen, "Knowledge graph embedding by translating on hyperplanes." in AAAI. Citeseer, 2014, pp. 1112-1119.
- R. Socher, D. Chen, C. D. Manning, and A. Ng, "Reasoning with neural tensor networks for knowledge base completion," in Advances in Neural Information Processing Systems, 2013, pp. 926-934.
- Y. Lin, Z. Liu, M. Sun, Y. Liu, and X. Zhu, "Learning entity and relation embeddings for knowledge graph completion," in AAAI, 2015, pp. 2181-2187.
- J. Pearl, Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann, 2014.
- D. H. Ackley, G. E. Hinton, and T. J. Sejnowski, "A learning algorithm for Boltzmann machines," Cognitive Science, vol. 9, no. 1, pp. 147-169, 1985.
- R. Salakhutdinov, A. Mnih, and G. Hinton, "Restricted Boltzmann machines for collaborative filtering," in Proceedings of the 24th International Conference on Machine Learning. ACM, 2007, pp. 791-798.
- H. Lee, A. Battle, R. Raina, and A. Y. Ng, "Efficient sparse coding algorithms," in Advances in Neural Information Processing Systems, 2006, pp. 801-808.
- A. Y. Ng, "Feature selection, L1 vs. L2 regularization, and rotational invariance," in Proceedings of the Twenty-First International Conference on Machine Learning. ACM, 2004, p. 78.
- A. Bordes, X. Glorot, J. Weston, and Y. Bengio, "A semantic matching energy function for learning with multi-relational data," Machine Learning, vol. 94, no. 2, pp. 233-259, 2014.