
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 1452–1461, Florence, Italy, July 28 - August 2, 2019. ©2019 Association for Computational Linguistics.


    Multi-Channel Graph Neural Network for Entity Alignment

Yixin Cao1, Zhiyuan Liu2, Chengjiang Li3, Zhiyuan Liu3, Juanzi Li3, Tat-Seng Chua1

1 School of Computing, National University of Singapore, Singapore; 2 School of Science, Xi'an Jiaotong University, Xi'an, China; 3 Department of CST, Tsinghua University, Beijing, China

{caoyixin2011,acharkq,iamlockelightning}@gmail.com, {liuzy,lijuanzi}@tsinghua.edu.cn, [email protected]

    Abstract

Entity alignment typically suffers from the issues of structural heterogeneity and limited seed alignments. In this paper, we propose a novel Multi-channel Graph Neural Network model (MuGNN) to learn alignment-oriented knowledge graph (KG) embeddings by robustly encoding two KGs via multiple channels. Each channel encodes KGs via different relation weighting schemes with respect to self-attention towards KG completion and cross-KG attention for pruning exclusive entities, respectively, which are further combined via pooling techniques. Moreover, we also infer and transfer rule knowledge for completing the two KGs consistently. MuGNN is expected to reconcile the structural differences of two KGs, and thus make better use of seed alignments. Extensive experiments on five publicly available datasets demonstrate our superior performance (5% Hits@1 up on average). Source code and data used in the experiments can be accessed at https://github.com/thunlp/MuGNN.

    1 Introduction

Knowledge Graphs (KGs) store world knowledge in the form of directed graphs, where nodes denote entities and edges their relations. Since the concept was proposed, many KGs have been constructed (e.g., YAGO (Rebele et al., 2016)) to provide structural knowledge for different applications and languages. These KGs usually contain complementary contents, attracting researchers to integrate them into a unified KG, which shall benefit many knowledge-driven tasks, such as information extraction (Cao et al., 2018a) and recommendation (Wang et al., 2018a).

It is non-trivial to align different KGs due to their distinct surface forms, which makes symbolic-based methods (Suchanek et al., 2011) not always effective.


Figure 1: Illustration of the structural differences (dashed lines and ellipse) between different KGs.

Instead, recent work utilizes general KG embedding methods (e.g., TransE (Bordes et al., 2013)) and aligns equivalent entities into a unified vector space based on a few seed alignments (Chen et al., 2017; Sun et al., 2017; Zhu et al., 2017; Chen et al., 2018; Sun et al., 2018; Wang et al., 2018b). The assumption is that entities and their counterparts in different KGs should have similar structures and thus similar embeddings. However, alignment performance is unsatisfactory, mainly due to the following challenges:

Heterogeneity of Structures Different KGs usually differ a lot, and this may mislead the representation learning and the alignment information from seeds. Take the entity Jilin City as an example (Figure 1): KG1 and KG2 present its subgraphs derived from English and Chinese Wikipedia, respectively. Since it is a Chinese city, KG2 is more informative than KG1 (denoted by dashed lines and ellipse), such as the relations Dialect and Nearby, and the entity Liu Fei through relation Mayor. Clearly, the province Jilin in KG1 and Jilin City in KG2, which are an incorrect alignment, are more likely to be close in the vector space, because they have more similar structures (e.g.,



Northeastern Mandarin and Changchun). What's worse, this incorrect alignment shall spread further over the graph.

Limited Seed Alignments Recent efforts based on general embedding methods heavily rely on existing alignments as training data, while seed alignments are usually insufficient (Chen et al., 2017) for high-quality entity embeddings. Wang et al. (2018b) introduce Graph Convolution Network (GCN) (Kipf and Welling, 2017) to enhance the entity embeddings by modeling structural features, but fail to consider structural heterogeneity.

To address these issues, we propose to perform KG inference and alignment jointly to explicitly reconcile the structural differences between KGs, and to utilize a graph-based model to make better use of seed alignment information. The basic idea of structural reconciliation is to complete missing relations and prune exclusive entities. As shown in Figure 1, to reconcile the differences around Jilin City, it is necessary to complete the missing relations Dialect and Nearby in KG1, and to filter out the entity Liu Fei, which is exclusive to KG2. The asymmetric entities and relations are caused not only by the inherent incompleteness of KGs, but also by their different construction demands.

In this paper, we propose a novel Multi-channel Graph Neural Network model, MuGNN, which can encode different KGs to learn alignment-oriented embeddings. For each KG, MuGNN utilizes different channels towards KG completion and pruning, so as to reconcile two types of structural differences: missing relations and exclusive entities. Different channels are combined via pooling techniques, so entity embeddings are enhanced with reconciled structures from different perspectives, making effective and efficient use of seed alignments. Between KGs, each channel transfers structural knowledge via shared parameters.

Specifically, for KG completion, we first employ AMIE+ (Galárraga et al., 2015) on each KG to induce rules, then transfer them between KGs towards consistent completion. Following Graph Attention Network (GAT) (Velickovic et al., 2018), we utilize KG self-attention to weight relations for the GNN channels. For KG pruning, we design cross-KG attention to filter out exclusive entities by assigning low weights to the corresponding relations. We summarize the main contributions as follows:

• We propose a novel Multi-channel GNN model, MuGNN, that learns alignment-oriented embeddings by encoding graphs from different perspectives: completion and pruning, so as to be robust to structural differences.

• We propose to perform KG inference and alignment jointly, so that the heterogeneity of KGs is explicitly reconciled through completion by rule inference and transfer, and pruning via cross-KG attention.

• We perform extensive experiments on five publicly available datasets for entity alignment tasks, and achieve significant improvements of 5% Hits@1 on average. Further ablation study demonstrates the effectiveness of our key components.

    2 Preliminaries and Framework

    2.1 Preliminaries

A KG is a directed graph G = (E, R, T) involving a set of entities E, relation types R, and triplets T. Each triplet t = (e_i, r_ij, e_j) ∈ T denotes that head entity e_i is related to tail entity e_j through relation r_ij ∈ R.

Rule knowledge K = {k} can be induced from a KG, e.g., in the form of ∀x, y ∈ E : (x, r_s, y) ⇒ (x, r_c, y), stating that two entities might be related through r_c if they are related through r_s. The left side of the arrow is defined as the premise, and the right side as the conclusion. We denote a rule as k = (r_c | r_s1, ..., r_sp), consisting of one or multiple (p) premises and only one conclusion.

Rule Grounding is to find suitable triplets satisfying the premise-conclusion relationship defined by rules. For rule k, we denote one of its groundings as g(k) = (t_c | t_s1, ..., t_sp), including p + 1 triplets. The triplets satisfy t_s1 ∧ ... ∧ t_sp ⇒ t_c, where ∧ is the logical conjunction that plays a similar role as 'and'. Other compositions include disjunction ∨ (similar to 'or') and negation ¬ (similar to 'not'). For example, given a rule bornIn(x, y) ∧ cityOf(y, z) ⇒ nationality(x, z), we ground it in a KG and obtain: bornIn(Obama, Hawaii) ∧ cityOf(Hawaii, United States) ⇒ nationality(Obama, United States). We use G(k) = {g(k)} to denote all groundings of rule k.

Entity alignment takes two heterogeneous KGs G and G′ = (E′, R′, T′) as input; the goal is to find as many alignments as possible, A_e = {(e, e′) ∈ E × E′ | e ↔ e′}, for which an equivalence relation ↔ holds between e and e′.


Figure 2: Framework. Rectangles denote two main steps, and rounded rectangles denote the key components of the corresponding step. After rule inference and transfer, we utilize rules to complete each KG, denoted by dashed lines r′3. Through relation weighting, we obtain multiple weighted graphs for different GNN channels, in which relation r4 is weighted to 0.0, which prunes exclusive entities. These channels are combined as the input of the align model for alignment-oriented KG embeddings.

That is, e and e′ are in different KGs, but denote the same thing. As shown in Figure 1, Jilin City in English Wikipedia (i.e., KG1) and in Chinese Wikipedia (i.e., KG2) has different structures, but denotes the same Chinese city. Normally, some prior alignments of entities A^s_e and relations A^s_r = {(r, r′) ∈ R × R′ | r ↔ r′} can be easily obtained manually or by simple lexicon-based methods (e.g., entity title translation), namely seed alignments (seeds for short). We use bold-face letters to denote the vector representations of the corresponding terms throughout the paper.
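To make the notation concrete, here is a minimal Python sketch of how the objects above (triplets T, a Horn rule k, and seed alignments A^s_e, A^s_r) might be represented; the entity and relation names come from Figure 1 and are purely illustrative, not the authors' data format.

```python
# Triplets t = (e_i, r_ij, e_j) of each KG, kept as plain tuples.
kg1_triples = {
    ("Jilin City", "province", "Jilin"),
    ("Jilin", "dialect", "Northeastern Mandarin"),
}
kg2_triples = {
    ("吉林市", "province", "吉林省"),
    ("吉林市", "dialect", "东北话"),
    ("吉林市", "mayor", "刘非"),
}

# A Horn rule k = (r_c | r_s1, r_s2): province(x, y) ∧ dialect(y, z) ⇒ dialect(x, z).
rule = ([("province", "x", "y"), ("dialect", "y", "z")], ("dialect", "x", "z"))

# Seed alignments: a few labeled cross-KG pairs of entities and relations.
seed_entities = {("Jilin City", "吉林市")}
seed_relations = {("province", "province"), ("dialect", "dialect")}
```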

    2.2 Framework

MuGNN aims at learning alignment-oriented KG embeddings for entity alignment. It introduces KG inference and transfer to explicitly complete KGs, and utilizes different relation weighting schemes, KG self-attention and cross-KG attention, to encode KGs robustly. As shown in Figure 2, there are two main steps in our framework:

KG Completion aims at reconciling the structural differences by completing missing relations. It not only induces rules by using a popular rule mining system, AMIE+ (Galárraga et al., 2015), but also transfers them between the two KGs based on seed aligned relations. Rule transfer is based on the assumption that knowledge can be generalized into various KGs, no matter in which languages or domains.

Multi-channel Graph Neural Network encodes each KG through different channels. The channels enhance the entity embeddings from different perspectives, towards completion and pruning, so that the entities and their counterparts have similar structures. MuGNN contains three main components: (1) relation weighting, which generates a weight matrix for each KG according to two schemes: KG self-attention and cross-KG attention. Each type of attention refers to a GNN channel that shares parameters between KGs for structural knowledge transfer; (2) a GNN encoder that models the entire graph features by improving entity embeddings with their neighbors, so that the seed alignment information is propagated over the entire graph. We combine the outputs of the GNN encoders in different channels via pooling techniques as the input of (3) the align model, which embeds the two KGs into a unified vector space by pushing the aligned entities (and relations) of seeds together.

    3 KG Completion

In this section, we introduce how to utilize rule knowledge to explicitly complete a KG: we first infer rules from each KG, then transfer these rules between KGs based on the knowledge invariant assumption, and finally ground the rules in each KG for consistent completion.


3.1 Rule Inference and Transfer

Since the acquisition of rule knowledge is not our focus in this paper, we utilize AMIE+ (Galárraga et al., 2015), a modern rule mining system, to efficiently find Horn rules from large-scale KGs, such as marriedTo(x, y) ∧ liveIn(x, z) ⇒ liveIn(y, z). Its source code is available online¹.

Formally, given two KGs G and G′, we first mine rules separately and obtain two sets of rule knowledge K and K′. These rule sets are quite different, since KGs are constructed to meet different demands of applications or languages. Although they can be used to complete their own KGs separately, we further transfer the two sets of rules into each other through the Knowledge Invariant Assumption:

Knowledge has universality no matter in which languages or domains.

Given aligned relations A^s_r and a rule k ∈ K, we replace all relations involved in the rule k = (r_c | r_s1, ..., r_sp) with their counterparts if there are (r_c, r′_c), (r_si, r′_si) ∈ A^s_r, i = 1, ..., p. Thus, we obtain such a rule k′ = (r′_c | r′_s1, ..., r′_sp) and add it to K̃′ = K′ ∪ {k′} if k′ ∉ K′. Real examples of transferred rules can be found in the experiments. Note that there may be no transferred rules if no aligned relations can be found (A^s_r = ∅).
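As a concrete illustration of this transfer step, the following hedged Python sketch maps every relation in a rule onto its aligned counterpart and drops rules whose relations have no counterpart; `Rule`, `transfer_rules`, and `aligned_rels` (standing in for A^s_r) are illustrative names, not the released MuGNN code.

```python
from collections import namedtuple

Rule = namedtuple("Rule", ["premises", "conclusion"])  # each item: (relation, var, var)

def transfer_rules(rules, aligned_rels):
    """Replace every relation in a rule with its aligned counterpart; skip
    rules containing a relation that has no counterpart in the other KG."""
    transferred = []
    for rule in rules:
        try:
            premises = [(aligned_rels[r], x, y) for (r, x, y) in rule.premises]
            rc, xc, yc = rule.conclusion
            transferred.append(Rule(premises, (aligned_rels[rc], xc, yc)))
        except KeyError:  # no aligned counterpart for some relation
            continue
    return transferred

# province(x, y) ∧ dialect(y, z) ⇒ dialect(x, z), mined on KG2 and moved to KG1.
k = Rule([("province", "x", "y"), ("dialect", "y", "z")], ("dialect", "x", "z"))
print(transfer_rules([k], {"province": "province", "dialect": "dialect"}))
```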

3.2 Rule Grounding

We now ground each rule set on the corresponding KG for completion, which not only improves the efficiency of the align model through a denser KG for propagation, but also adds extra constraints that are helpful for high-quality entity embedding learning.

Take KG G as an example. Given a rule k ∈ K, we collect its groundings whose premise triplets can be found in the KG but whose conclusion triplet cannot: G(k) = {g(k) | t_s1, ..., t_sp ∈ T, t_c ∉ T}. We then add all conclusion triplets into the KG: G̃ = G ∪ {t_c | g(k) ∈ G(k)}. Similarly, we can complete KG G′ to G̃′.

As shown in Figure 1, we obtain the rule province(x, y) ∧ dialect(y, z) ⇒ dialect(x, z) from the informative KG2, then transfer it to KG1 based on the aligned relations province and dialect. Thus, in KG1, we find the suitable triplets province(Jilin City, Jilin) ∧ dialect(Jilin, Northeastern Mandarin), and obtain a new triplet dialect(Jilin City, Northeastern Mandarin).
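A hedged sketch of this grounding-and-completion step for the two-premise chain pattern used in the example (r1(x, y) ∧ r2(y, z) ⇒ r3(x, z)); function and variable names are illustrative only.

```python
def ground_chain_rule(triples, r1, r2, r3):
    """Find premise matches in T and add the missing conclusion triples,
    i.e., return the completed KG  G̃ = G ∪ {t_c}."""
    new_triples = set()
    for (h1, rel1, t1) in triples:
        if rel1 != r1:
            continue
        for (h2, rel2, t2) in triples:
            if rel2 == r2 and h2 == t1 and (h1, r3, t2) not in triples:
                new_triples.add((h1, r3, t2))
    return triples | new_triples

kg1 = {("Jilin City", "province", "Jilin"),
       ("Jilin", "dialect", "Northeastern Mandarin")}
kg1 = ground_chain_rule(kg1, "province", "dialect", "dialect")
# adds ("Jilin City", "dialect", "Northeastern Mandarin")
```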

¹ https://www.mpi-inf.mpg.de/

It is worth noting that the inferred rules do not hold in all cases; one could consider a confidence value for each grounding. We leave this for future work.

    4 Multi-Channel Graph Neural Network

In this section, we describe the three main components involved in MuGNN that encode different graphs towards alignment-oriented embedding learning: relation weighting, the multi-channel GNN encoder, and the align model.

    4.1 Relation Weighting

Relation weighting generates a weighted connectivity matrix A based on a graph G as the input structural features of the GNN encoder, which will be detailed later. Each element a_ij in the matrix denotes the weighted relation between e_i and e_j.

As mentioned in Section 1, there are two types of structural differences: missing relations, due to the incompleteness nature of KGs, and exclusive entities, caused by the different construction demands of applications or languages. We utilize two channels of the GNN encoder for each KG, so as to reconcile the two types of differences separately. That is, we generate two adjacency matrices for each channel: A1 based on KG self-attention and A2 based on cross-KG attention. Next, we describe how to compute each element a_ij in A1 and A2. Similarly, we can obtain A′1 and A′2 for KG G′.

    KG Self-Attention

KG self-attention aims at making better use of seed alignments based on the KG structure itself. This component selects informative neighbors according to the current entity and assigns them high weights. Following GAT (Velickovic et al., 2018), we define the normalized element a_ij in A1, representing the connectivity from entity e_i to e_j, as follows:

$$a_{ij} = \mathrm{softmax}(c_{ij}) = \frac{\exp(c_{ij})}{\sum_{e_k \in N_{e_i} \cup \{e_i\}} \exp(c_{ik})} \qquad (1)$$

where e_k ∈ N_{e_i} ∪ {e_i} denotes a neighbor of e_i (with self-loop), and c_ij is the attention coefficient measuring the importance of e_i to e_j, which is



calculated by an attention function attn as follows:

$$c_{ij} = attn(\mathbf{W}\mathbf{e}_i, \mathbf{W}\mathbf{e}_j) = \mathrm{LeakyReLU}\big(\mathbf{p}[\mathbf{W}\mathbf{e}_i \,\|\, \mathbf{W}\mathbf{e}_j]\big) \qquad (2)$$

where || indicates vector concatenation, and W and p are trainable parameters.
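The two equations above can be read as a standard GAT-style attention layer. Below is a hedged PyTorch sketch of Eqs. (1)-(2) that computes the weighted connectivity matrix A1 for one KG; tensor names and shapes are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def kg_self_attention(entity_emb, adj_mask, W, p):
    """entity_emb: [n, d] entity embeddings (H in the paper)
    adj_mask:   [n, n] bool, True where e_j ∈ N_{e_i} ∪ {e_i}
    W:          [d, d'] shared linear transform
    p:          [2 * d'] attention vector"""
    h = entity_emb @ W                                   # W e_i for all entities
    # c_ij = LeakyReLU(p [W e_i || W e_j]), split into the two halves of p
    left = (h @ p[: h.size(1)]).unsqueeze(1)             # p_left · W e_i  -> [n, 1]
    right = (h @ p[h.size(1):]).unsqueeze(0)             # p_right · W e_j -> [1, n]
    c = F.leaky_relu(left + right)                       # [n, n]
    c = c.masked_fill(~adj_mask, float("-inf"))          # keep neighbors + self only
    return torch.softmax(c, dim=-1)                      # a_ij, Eq. (1)

n, d = 5, 8
A1 = kg_self_attention(torch.randn(n, d), torch.eye(n, dtype=torch.bool),
                       torch.randn(d, d), torch.randn(2 * d))
```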

Cross-KG Attention

Cross-KG attention aims at modeling the common subgraph of two KGs as structural features towards consistency. It prunes exclusive entities by assigning lower weights to relations that have no counterparts in the other KG. We define a_ij in A2 as follows:

$$a_{ij} = \max_{r \in R,\, r' \in R'} \mathbf{1}\big((e_i, r, e_j) \in T\big)\, \mathrm{sim}(r, r') \qquad (3)$$

where 1(·) is 1 if the condition holds true and 0 otherwise, and sim(·) is a similarity measure between relation types, defined as the inner product sim(r, r′) = rᵀr′. Thus, a_ij finds the best mapping between the two KGs, and is 0 if there are no such relation types for exclusive entities.
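A hedged sketch of Eq. (3): each edge of one KG is weighted by the best inner-product match between its relation embedding and any relation embedding of the other KG, so edges whose relations match poorly receive low weights. All names (`cross_kg_attention`, `ent_index`, `rel_emb_other`) are illustrative.

```python
import torch

def cross_kg_attention(triples, ent_index, rel_emb, rel_emb_other, n_entities):
    A2 = torch.zeros(n_entities, n_entities)
    sim = rel_emb @ rel_emb_other.t()          # sim(r, r') = r^T r' for all pairs
    best = sim.max(dim=1).values               # max over r' ∈ R' for each r ∈ R
    for head, rel, tail in triples:            # 1((e_i, r, e_j) ∈ T) · sim(r, r')
        i, j = ent_index[head], ent_index[tail]
        A2[i, j] = torch.maximum(A2[i, j], best[rel])
    return A2                                  # entries stay 0 where no triple links e_i and e_j

triples = [("Jilin City", 0, "Jilin")]         # relation ids index into rel_emb
ent_index = {"Jilin City": 0, "Jilin": 1}
A2 = cross_kg_attention(triples, ent_index, torch.randn(3, 8), torch.randn(4, 8), 2)
```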

    4.2 Multi-Channel GNN Encoder

A GNN is a type of neural network model that deals with graph-structured data; its main idea is similar to a propagation model: enhance the features of a node (i.e., entity) according to its neighbor nodes. Thus, we may stack multiple (L) layers of GNNs to achieve further propagation.

One of its variants is based on spectral graph convolutions, such as GCN (Kipf and Welling, 2017). Every GNN encoder takes the hidden states of node representations in the current layer as inputs, and computes new node representations as:

$$\mathrm{GNN}(A, H, W) = \sigma(AHW) \qquad (4)$$

where A is an adjacency matrix showing the connectivity between nodes, H is the current node representations, W is the learned parameters, and σ is the activation function, chosen as ReLU(·) = max(0, ·).

Inspired by multi-head attention networks (Velickovic et al., 2018), we use the two above-mentioned strategies to calculate connectivity matrices as different channels to propagate information from different aspects, and aggregate them with a pooling function. Our multi-channel GNN encoder is built by stacking multiple GNN encoders defined as:

$$\mathrm{MultiGNN}(H^{l}; A_1, \cdots, A_c) = \mathrm{Pooling}(H^{l+1}_{1}, \cdots, H^{l+1}_{c}) \qquad (5)$$

where c is the number of channels, A_i is the connectivity matrix of the i-th channel, and H^{l+1}_i is the hidden state computed in the (l+1)-th layer and i-th channel, formulated as:

$$H^{l+1}_{i} = \mathrm{GNN}(A_i, H^{l}, W_i) \qquad (6)$$

where W_i is the weight parameter of the i-th channel. Here, we set i = 1, 2, referring to the above two attention schemes. We set H^0 to entity embeddings initialized randomly. In experiments, we select average pooling as the pooling function due to its superior performance.

We use such multi-channel GNN encoders to encode each KG, and obtain H^L and H′^L representing the enhanced entity embeddings, where each channel shares parameters W_1 = W′_1 and W_2 = W′_2 for structural knowledge transfer.
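Putting Eqs. (4)-(6) together, a single multi-channel layer can be sketched as below (average pooling over two channels, shared across KGs by reusing the same module); this is an illustrative PyTorch sketch, not the released MuGNN code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiChannelGNNLayer(nn.Module):
    def __init__(self, in_dim, out_dim, n_channels=2):
        super().__init__()
        # one weight matrix W_i per channel; sharing the module across KGs
        # realizes W_1 = W'_1 and W_2 = W'_2
        self.weights = nn.ParameterList(
            [nn.Parameter(torch.empty(in_dim, out_dim)) for _ in range(n_channels)])
        for w in self.weights:
            nn.init.xavier_uniform_(w)

    def forward(self, H, adjacencies):
        # H^{l+1}_i = ReLU(A_i H^l W_i), Eq. (6); then average pooling, Eq. (5)
        channels = [F.relu(A @ H @ W) for A, W in zip(adjacencies, self.weights)]
        return torch.stack(channels, dim=0).mean(dim=0)

layer = MultiChannelGNNLayer(in_dim=8, out_dim=8)
H0 = torch.randn(5, 8)                       # randomly initialized entity embeddings
A1, A2 = torch.rand(5, 5), torch.rand(5, 5)  # the two weighted connectivity matrices
H1 = layer(H0, [A1, A2])
```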

4.3 Align Model

The align model embeds two KGs into a unified vector space by pushing the seed alignments of entities (and relations) together. We judge whether two entities or two relations are equivalent by the distance between them. The objective of the align model is given below:

$$L_a = \sum_{(e, e') \in A^s_e} \sum_{(e_-, e'_-) \in A^s_{e_-}} \big[d(e, e') + \gamma_1 - d(e_-, e'_-)\big]_+ + \sum_{(r, r') \in A^s_r} \sum_{(r_-, r'_-) \in A^s_{r_-}} \big[d(r, r') + \gamma_2 - d(r_-, r'_-)\big]_+ \qquad (7)$$

where [·]_+ = max{0, ·} represents the maximum of 0 and the input, d(·) = ||·||_2 is the distance measure chosen as L2 distance, A^s_{e_-} and A^s_{r_-} represent the negative pair sets of A^s_e and A^s_r, respectively, and γ1 > 0 and γ2 > 0 are margin hyper-parameters separating positive and negative entity and relation alignments. During the experiments, by calculating cosine similarity, we select the 25 entities closest to the corresponding entity in the same KG as negative samples (Sun et al., 2018). Negative samples are re-calculated every 5 epochs.
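For illustration, here is a hedged PyTorch sketch of the entity part of Eq. (7) and of the cosine-based nearest-neighbour negative sampling described above; it pairs each positive seed with one pre-sampled negative for simplicity, and all names are assumptions rather than the authors' code.

```python
import torch
import torch.nn.functional as F

def align_loss(emb, emb_other, pos_pairs, neg_pairs, gamma=1.0):
    """pos_pairs / neg_pairs: LongTensors of shape [m, 2] holding
    (index in G, index in G') for seed pairs and sampled negatives."""
    d_pos = torch.norm(emb[pos_pairs[:, 0]] - emb_other[pos_pairs[:, 1]], p=2, dim=1)
    d_neg = torch.norm(emb[neg_pairs[:, 0]] - emb_other[neg_pairs[:, 1]], p=2, dim=1)
    return torch.clamp(d_pos + gamma - d_neg, min=0).sum()   # [x]_+ = max(0, x)

def sample_negatives(emb, k=25):
    """Nearest neighbours by cosine similarity (25 per entity in the paper),
    recomputed periodically during training."""
    sims = F.normalize(emb, dim=1) @ F.normalize(emb, dim=1).t()
    sims.fill_diagonal_(float("-inf"))       # exclude the entity itself
    return sims.topk(k, dim=1).indices       # indices of the k closest entities

emb, emb_other = torch.randn(50, 8), torch.randn(50, 8)
pos = torch.tensor([[0, 0], [1, 1]])
neg = torch.tensor([[0, 7], [1, 9]])
loss = align_loss(emb, emb_other, pos, neg)
```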


Rule Knowledge Constraints

Since we have changed the KG structure by adding new triplets (i.e., grounded rules), we also introduce a triplet loss to keep the grounded rules valid in the unified vector space.

Taking KG G as an example, following Guo et al. (2016), we define the loss function as follows:

$$L_r = \sum_{g^+ \in G(K)} \sum_{g^- \in G^-(K)} \big[\gamma_r - I(g^+) + I(g^-)\big]_+ + \sum_{t^+ \in T} \sum_{t^- \in T^-} \big[\gamma_r - I(t^+) + I(t^-)\big]_+ \qquad (8)$$

where g is short for rule grounding g(k), G(K) and T denote all rule groundings and all triplets, and G^-(K) and T^- are negative sample sets obtained by replacing one of the involved entities using nearest sampling (Sun et al., 2018). I(·) is the true value function; for a triplet t:

$$I(t) = 1 - \frac{1}{3\sqrt{d}}\,\|\mathbf{e}_i + \mathbf{r}_{ij} - \mathbf{e}_j\|_2 \qquad (9)$$

or for a grounding g = (t_c | t_s1, ..., t_sp), which is recursively calculated by:

$$I(t_s) = I(t_{s1} \wedge t_{s2}) = I(t_{s1}) \cdot I(t_{s2}), \qquad I(t_s \Rightarrow t_c) = I(t_s) \cdot I(t_c) - I(t_s) + 1 \qquad (10)$$

where d is the embedding size. Similarly, we obtain the loss L′_r for KG G′. Thus, the overall loss function for the multi-channel GNN is:

$$L = L_a + L'_r + L_r \qquad (11)$$
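A hedged sketch of the truth values in Eqs. (9)-(10), which the rule constraint loss of Eq. (8) ranks against negative samples; embedding lookups and names are illustrative.

```python
import torch

def triple_truth(e_head, rel, e_tail):
    """I(t) = 1 - ||e_i + r_ij - e_j||_2 / (3 * sqrt(d)), Eq. (9)."""
    d = e_head.size(-1)
    return 1.0 - torch.norm(e_head + rel - e_tail, p=2) / (3.0 * d ** 0.5)

def grounding_truth(premise_triples, conclusion_triple):
    """I(ts1 ∧ ... ∧ tsp ⇒ tc) via the recursion of Eq. (10)."""
    i_s = triple_truth(*premise_triples[0])
    for t in premise_triples[1:]:
        i_s = i_s * triple_truth(*t)          # I(ts1 ∧ ts2) = I(ts1) · I(ts2)
    i_c = triple_truth(*conclusion_triple)
    return i_s * i_c - i_s + 1.0              # I(ts ⇒ tc) = I(ts) · I(tc) - I(ts) + 1

d = 128
e1, r, e2 = torch.randn(d), torch.randn(d), torch.randn(d)
score = grounding_truth([(e1, r, e2)], (e1, r, e2))
```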

    5 Experiment

In this section, we conduct experiments on five publicly available datasets involving both different language pairs and sources. We further investigate the key components of MuGNN and analyze how the knowledge inference and transfer mechanism contributes to KG alignment.

    5.1 Experiment Settings

Datasets Following Sun et al. (2017, 2018), we conduct experiments on the benchmark datasets DBP15K and DWY100K. DBP15K contains three cross-lingual datasets: DBP_ZH-EN (Chinese to English), DBP_JA-EN (Japanese to English), and DBP_FR-EN (French to English). All of the above datasets are extracted from multilingual DBpedia and include 15,000 entity pairs as seed alignments. DWY100K consists of two large-scale cross-resource datasets: DWY-WD (DBpedia to Wikidata) and DWY-YG (DBpedia to YAGO3). Each dataset includes 100,000 alignments of entities in advance. As for the seed alignments of relations, we employ the official relation alignment list published by DBpedia for DWY100K. For DWY-YG, we manually align the relations because there is only a small set of relation types (31) in YAGO3. The statistics² are listed in Table 1.

Datasets    |A^s_r|   #Relation   #Entity    #Triple
DBP_ZH        891       2,830      66,469    153,929
DBP_EN                  2,317      98,125    237,674
DBP_JA        582       2,043      65,744    164,373
DBP_EN                  2,096      95,680    233,319
DBP_FR         75       1,379      66,858    192,191
DBP_EN                  2,209     105,889    278,590
DWY_DB         62         330     100,000    463,294
DWY_WD                    220     100,000    448,774
DWY_DB         24         302     100,000    428,952
DWY_YG                     31     100,000    502,563

Table 1: Statistics of DBP15K and DWY100K.

² |A^s_r| denotes the number of seed alignments of relations.

For each dataset, we employ AMIE+ for rule mining by setting the maximum number of premises to p = 2 and the PCA confidence to not less than 0.8. The statistics of rules, transferred rules (Tr.Rule for short), ground triples, and ground triples based on transferred rules (Tr.ground for short) are exhibited in Table 2.

Datasets   #Rule   #Tr.Rule   #Ground   #Tr.ground
DBP_ZH     2,279    1,058      46,959     19,278
DBP_EN     1,906      578      78,450     24,018
DBP_JA     1,440      651      61,733     25,337
DBP_EN     1,316      259      77,614     17,838
DBP_FR     1,263       25      77,342      1,527
DBP_EN     1,252       12      75,338      1,364
DWY_DB       843       40     281,271     13,136
DWY_WD       630       51     184,010     56,373
DWY_DB       503        4     277,031     92,923
DWY_YG        39       16     129,334     10,446

Table 2: Statistics of KG inference and transfer.

Baselines To investigate MuGNN's ability on entity alignment, we select four competitive baselines for comparison, including three translation-based models and one graph-based model.


Methods           DBP_ZH-EN            DBP_JA-EN            DBP_FR-EN            DBP-WD               DBP-YG
                  H@1   H@10  MRR      H@1   H@10  MRR      H@1   H@10  MRR      H@1   H@10  MRR      H@1   H@10  MRR
MTransE           .308  .614  .364     .279  .575  .349     .244  .556  .335     .281  .520  .363     .252  .493  .334
JAPE              .412  .745  .490     .363  .685  .476     .324  .667  .430     .318  .589  .411     .236  .484  .320
AlignEA           .472  .792  .581     .448  .789  .563     .481  .824  .599     .566  .827  .655     .633  .848  .707
GCN-Align         .413  .744  .549     .399  .745  .546     .373  .745  .532     .506  .772  .600     .597  .838  .682
MuGNN w/o A^s_r   .479  .833  .597     .487  .851  .604     .496  .869  .621     .590  .887  .693     .730  .934  .801
MuGNN             .494  .844  .611     .501  .857  .621     .495  .870  .621     .616  .897  .714     .741  .937  .810

Table 3: Overall performance.

MTransE (Chen et al., 2017) trains independent embeddings of knowledge graphs with TransE, and assigns the entity pairs in seed alignments similar embeddings by minimizing their Euclidean distances. JAPE (Sun et al., 2017) learns the representations of entities and relations from different KGs in a unified embedding space. It takes advantage of attribute triples to capture homogeneous entity properties across KGs. GCN-Align (Wang et al., 2018b) employs Graph Convolution Networks to construct entity representations by propagating information from the neighborhood. AlignEA (Sun et al., 2018) swaps aligned entities in triples to calibrate the embeddings of KGs in a unified embedding space. AlignEA is the most recent non-iterative state-of-the-art model.

Training Details Following Sun et al. (2017, 2018), we split 30% of the entity seed alignments as training data and leave the remaining data for testing. By convention, Hits@N and Mean Reciprocal Rank (MRR) are used as evaluation metrics. Hits@N indicates the percentage of targets that have been correctly ranked in the top N (H in Table 3 for short). MRR is the average of the reciprocal ranks of the results. Higher Hits@N and MRR indicate higher performance.
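For reference, Hits@N and MRR can be computed from the rank of each test entity's true counterpart, as in the following minimal sketch (ranks are 1-based; names are illustrative).

```python
def hits_at_n(ranks, n):
    """Fraction of test entities whose correct alignment is ranked in the top N."""
    return sum(1 for r in ranks if r <= n) / len(ranks)

def mean_reciprocal_rank(ranks):
    """Average of the reciprocal ranks of the correct alignments."""
    return sum(1.0 / r for r in ranks) / len(ranks)

ranks = [1, 3, 2, 50, 1]   # rank of the correct alignment per test entity
print(hits_at_n(ranks, 1), hits_at_n(ranks, 10), mean_reciprocal_rank(ranks))
```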

    To make a fair comparison, we set embeddingsize to 128 for MuGNN and all baselines. Allgraph models stack two layers of GNN. We uti-lize Adagrad (Duchi et al., 2011) as the optimizer.For the margins in MuGNN, we empirically setγ1 = 1.0 and γ2 = 1.0. We set γr = 0.12 to en-sure rule knowledge constraints have less impactthan the alignment model. Other hyperparame-ters are chosen by running an exhaustively searchover the following possible values: learning ratein {0.1, 0.01, 0.001}, L2 in {0.01, 0.001, 0.0001},dropout in {0.1, 0.2, 0.5}. The optimal configu-ration of MuGNN for entity alignment is: learn-

The optimal configuration of MuGNN for entity alignment is: learning rate = 0.001, L2 = 0.01, dropout = 0.2. We implement MuGNN with PyTorch-1.0. The experiments are conducted on a server with two 6-core Intel Xeon E5-2620 CPUs, two GeForce GTX TITAN X GPUs, and 128 GB of memory. 500 epochs take nearly one hour.
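A hedged sketch of the grid search described above is given below; `train_and_evaluate` is a hypothetical placeholder for training MuGNN once and returning a validation score, not a function from the released code.

```python
import itertools
import random

def train_and_evaluate(lr, l2, dropout):
    """Hypothetical stand-in: train one configuration and return a validation
    metric such as Hits@1. Replace the random score with the real trainer."""
    return random.random()

# The search space reported in the paper.
search_space = {
    "lr": [0.1, 0.01, 0.001],
    "l2": [0.01, 0.001, 0.0001],
    "dropout": [0.1, 0.2, 0.5],
}

best_score, best_config = float("-inf"), None
for lr, l2, dropout in itertools.product(*search_space.values()):
    score = train_and_evaluate(lr, l2, dropout)
    if score > best_score:
        best_score, best_config = score, {"lr": lr, "l2": l2, "dropout": dropout}

print(best_config, best_score)
```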

    5.2 Overall Performance

Table 3 shows the experimental results on DBP15K and DWY100K. In general, MuGNN significantly outperforms all baselines on all metrics, mainly because it reconciles structural differences via two different schemes for KG completion and pruning, which are jointly modeled by the multi-channel GNN.

More specifically, on the three small-scale cross-lingual datasets, the average gains of MuGNN in Hits@1, Hits@10, and MRR are 3%, 6%, and 4%, respectively, while on the large-scale datasets MuGNN achieves larger improvements (8% for each of Hits@1, Hits@10, and MRR). This is mainly because the large-scale datasets (e.g., DBP-YG) provide more prior knowledge for rule mining (more than 3.5 facts per entity vs. fewer than 2.5 facts in DBP15K), so our method is better able to reconcile the structural differences between KGs and thus make better use of seed alignments.

Since some methods do not rely on seed alignments of relations, we also test MuGNN without them, marked as MuGNN w/o Asr; this also implies that no rules are transferred between the KGs. We can see that our method still performs competitively, and even achieves the best Hits@1 and MRR on DBPFR-EN. This is because the cultural difference between French and English is much smaller than that between Chinese/Japanese and English, so only a few exclusive rules are mined from each KG and can be transferred for consistent completion (25 and 12 rules transferred between the two KGs, respectively, as shown in Table 2).


Figure 3: Sensitivity to entity seed alignments. Each panel plots Hits@1 and Hits@10 (y-axis) against the proportion of seed alignments used for training (x-axis, 10%–50%) for AlignEA, GCN-Align, and MuGNN on (A) DBPZH-EN, (B) DBPJA-EN, and (C) DBPFR-EN.


We also observe that the GNN-based method (i.e., GCN-Align) performs better than the translation-based methods except AlignEA. To better understand their advantages and disadvantages, we further conduct ablation studies as follows.

5.3 Impact of Two Channels and Rule Transfer

Figure 4: Impact of two channels and rule transfer. The two panels report Hits@1 (left) and MRR (right) on DBPZH-EN, DBPJA-EN, and DBPFR-EN for MuGNN and its ablations MuGNN w/o CroAtt, MuGNN w/o SelAtt, and MuGNN w/o RulTra.

The core components of MuGNN are the two channels based on KG self-attention and cross-KG attention, and the rule transfer towards consistent completion based on the knowledge invariant assumption. We therefore remove each of them from our model to investigate its contribution to reconciling structural differences, marked as MuGNN w/o SelAtt, MuGNN w/o CroAtt, and MuGNN w/o RulTra, respectively.
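To make the ablation setup more tangible, the following is a minimal sketch of a two-channel graph encoder whose channel outputs are combined by average pooling. It is an illustration under simplifying assumptions (dense adjacency matrices, a single linear transform per channel), not the exact MuGNN architecture; the flags merely mimic the w/o SelAtt and w/o CroAtt variants.

```python
import torch
import torch.nn as nn

class TwoChannelEncoder(nn.Module):
    """Illustrative two-channel graph encoder: each channel aggregates neighbours
    under its own relation-weighted adjacency, and the channel outputs are
    combined by average pooling. Disabling a flag drops that channel."""

    def __init__(self, dim, use_self_att=True, use_cross_att=True):
        super().__init__()
        self.use_self_att = use_self_att
        self.use_cross_att = use_cross_att
        self.w_self = nn.Linear(dim, dim)
        self.w_cross = nn.Linear(dim, dim)

    def forward(self, x, adj_self, adj_cross):
        # adj_* are (N, N) relation-weighted adjacency matrices, one per channel.
        outputs = []
        if self.use_self_att:
            outputs.append(torch.relu(adj_self @ self.w_self(x)))
        if self.use_cross_att:
            outputs.append(torch.relu(adj_cross @ self.w_cross(x)))
        # Average pooling over the remaining channels.
        return torch.stack(outputs, dim=0).mean(dim=0)

# Toy usage: 4 entities, 8-dimensional embeddings, random adjacencies.
x = torch.randn(4, 8)
adj = torch.rand(4, 4)
enc = TwoChannelEncoder(8, use_cross_att=False)   # ablate the cross-KG channel
print(enc(x, adj, adj).shape)
```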

As shown in Figure 4, both MuGNN w/o SelAtt and MuGNN w/o CroAtt show a performance drop compared to MuGNN, which demonstrates the effectiveness of both channels. Specifically, the performance decreases more when the cross-KG attention channel is removed than when the KG self-attention channel is removed, which highlights the importance of utilizing cross-KG information for entity alignment. As for rule transfer, we can see that in most cases it contributes considerably to performance.

However, the performance difference between MuGNN and MuGNN w/o RulTra is negligible on DBPFR-EN. The reason is that the numbers of ground rule triples for the French and English datasets are limited (Table 2), amounting to less than 1% of the oracle triples, so rule transfer cannot provide sufficient cross-graph heterogeneous structure information. In contrast, DBPJA-EN and DBPZH-EN provide more than 10k ground rule triples and thus gain decent performance improvements from rule transfer.

5.4 Impact of Seed Alignments

To investigate the advantages and disadvantages of GNN-based and translation-based methods, we test MuGNN, GCN-Align, and AlignEA using different sizes of seed alignments. We gradually increase the proportion of entity seeds from 10% to 50% to observe each model's sensitivity to seed alignments.

As shown in Figure 3, the GNN-based methods perform better than the translation-based methods when only limited seeds are available (10%), but perform worse as the number of seed alignments increases. This is because graph models can make better use of seeds by propagating them over the entire structure, but they also suffer from the heterogeneity between KGs, since GNNs are sensitive to structural differences, which leads to the aggregation of propagation errors. In contrast, the performance of translation-based methods increases steadily with the growing number of seeds, since they can implicitly complete the KGs via knowledge representation learning such as TransE. MuGNN utilizes AMIE+ to explicitly complete the two KGs via rule mining and transfer, which reconciles the structural differences; meanwhile, the GNN encoders make better use of the seed information via the two channels over the graphs.


(U.S., leaderTitle, U.S. President) ∧ (U.S. Secretary of State, reports to, U.S. President) ⇒ (U.S. Secretary of State, seat, U.S.)
(Chiang Kaishek, party, Kuomintang) ∧ (Chiang Weikuo, president, Chiang Kaishek) ⇒ (Chiang Weikuo, party, Kuomintang)

    Table 4: Examples of groundings of transferred rules.

    5.5 Qualitative Analysis

We qualitatively analyze how the rules work by presenting transferred rules and their groundings in Table 4. The rule grounding in the first line expresses common knowledge about the United States and is thus easily mined from the English KG DBPEN. Meanwhile, we find that such knowledge is missing in DBPZH, the Chinese KG. By transferring the corresponding rule from DBPEN to DBPZH, this asymmetry is smoothed: corresponding entities in DBPZH then have structures, and hence embeddings, similar to those of their counterparts in DBPEN. That is, MuGNN indeed reconciles structural differences by rule transfer and learns alignment-oriented embeddings. The second line presents a similar case, in which common rule knowledge from the Chinese KG is transferred into the English KG. This demonstrates the effectiveness of rule transfer.

Error Analysis: As shown in Table 2, the only 4 rules transferred from YAGO3 to DBpedia are grounded to 92,923 new ground rule triples, which is surprisingly many and not informative. Further investigation finds that the rule (a, team, b) ⇒ (a, affiliation, b) alone contributes 92,743 ground rule triples. Although the rule is logically correct, it is doubtful that such a rule, which merely establishes a near-duplicate relation between entities, benefits entity alignment. We will deal with such noise in future work.
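To illustrate why a single broad rule can blow up the number of groundings, here is a small, hypothetical sketch that grounds a one-premise rule such as (a, team, b) ⇒ (a, affiliation, b) over a toy triple set. The relation names come from the example above; the function is not part of the released code.

```python
def ground_single_premise_rule(triples, premise_rel, conclusion_rel):
    """Ground a rule (a, premise_rel, b) => (a, conclusion_rel, b): return the
    conclusion triples that are implied but not yet present in the KG."""
    existing = set(triples)
    return [(h, conclusion_rel, t) for (h, r, t) in triples
            if r == premise_rel and (h, conclusion_rel, t) not in existing]

# Toy KG: every `team` fact implies a (possibly missing) `affiliation` fact,
# so a KG dense in `team` triples yields many new groundings.
kg = [("player_1", "team", "club_A"), ("player_2", "affiliation", "club_B")]
print(ground_single_premise_rule(kg, "team", "affiliation"))
```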

    6 Related Work

Merging different KGs into a unified one has attracted much attention, since it benefits many knowledge-driven applications, such as information extraction (Cao et al., 2017a, 2018b), question answering (Zhang et al., 2015), and recommendation (Cao et al., 2019). Early approaches to entity alignment leverage various features to overcome the heterogeneity between KGs, such as machine translation and external lexicons (Suchanek et al., 2011; Wang et al., 2013). Following the success of KG representation learning, recent work embeds entities of different KGs into a low-dimensional vector space with the help of seed alignments (Chen et al., 2017). However, the limited seeds and structural differences

have a strong negative impact on the quality of the KG embeddings, leading to poor alignment performance. JAPE (Sun et al., 2017) and KDCoE (Chen et al., 2018) introduce attribute or description information to improve entity embeddings, while IPTransE (Zhu et al., 2017) and BootEA (Sun et al., 2018) iteratively enlarge the seed set by selecting predicted alignments with high confidence.

Clearly, the above strategies can be seen as general enhancements for most alignment approaches (Sun et al., 2018); we therefore focus on improving alignment performance without any external information and in a non-iterative way. Inspired by Wang et al. (2018b), which utilizes a Graph Convolutional Network (GCN) (Kipf and Welling, 2017) to encode entire KGs, we aim at reconciling the heterogeneity between KGs through completion and pruning, and at learning alignment-oriented KG embeddings by modeling structural features from different perspectives via multi-channel GNNs.

    7 Conclusions

In this paper, we propose a novel Multi-channel Graph Neural Network model, MuGNN, which learns alignment-oriented KG embeddings for entity alignment. It alleviates the negative impacts caused by structural heterogeneity and limited seed alignments. Through two channels, MuGNN not only explicitly completes the KGs but also prunes exclusive entities, using different relation weighting schemes: KG self-attention and cross-KG attention, showing robust graph encoding capability. Extensive experiments on five publicly available datasets and further analysis demonstrate the effectiveness of our method.

In the future, we are interested in introducing textual information of entities for alignment by considering word ambiguity (Cao et al., 2017b), and meanwhile exploiting cross-KG entity proximity (Cao et al., 2015).

    Acknowledgments

NExT++ research is supported by the National Research Foundation, Prime Minister's Office, Singapore under its IRC@SG Funding Initiative.


References

Antoine Bordes, Nicolas Usunier, Alberto Garcia-Duran, Jason Weston, and Oksana Yakhnenko. 2013. Translating embeddings for modeling multi-relational data. In NIPS.

Yixin Cao, Lei Hou, Juanzi Li, and Zhiyuan Liu. 2018a. Neural collective entity linking. In Proceedings of the 27th International Conference on Computational Linguistics, pages 675–686.

Yixin Cao, Lei Hou, Juanzi Li, Zhiyuan Liu, Chengjiang Li, Xu Chen, and Tiansi Dong. 2018b. Joint representation learning of cross-lingual words and entities via attentive distant supervision. In EMNLP.

Yixin Cao, Lifu Huang, Heng Ji, Xu Chen, and Juanzi Li. 2017a. Bridge text and knowledge by learning multi-prototype entity mention embedding. In ACL.

Yixin Cao, Juanzi Li, Xiaofei Guo, Shuanhu Bai, Heng Ji, and Jie Tang. 2015. Name list only? Target entity disambiguation in short texts. In EMNLP.

Yixin Cao, Jiaxin Shi, Juanzi Li, Zhiyuan Liu, and Chengjiang Li. 2017b. On modeling sense relatedness in multi-prototype word embedding. In IJCNLP.

Yixin Cao, Xiang Wang, Xiangnan He, Tat-Seng Chua, et al. 2019. Unifying knowledge graph learning and recommendation: Towards a better understanding of user preferences. arXiv preprint arXiv:1902.06236.

Muhao Chen, Yingtao Tian, Kai-Wei Chang, Steven Skiena, and Carlo Zaniolo. 2018. Co-training embeddings of knowledge graphs and entity descriptions for cross-lingual entity alignment. In IJCAI.

Muhao Chen, Yingtao Tian, Mohan Yang, and Carlo Zaniolo. 2017. Multilingual knowledge graph embeddings for cross-lingual knowledge alignment. In IJCAI.

John Duchi, Elad Hazan, and Yoram Singer. 2011. Adaptive subgradient methods for online learning and stochastic optimization. Journal of Machine Learning Research.

Luis Galárraga, Christina Teflioudi, Katja Hose, and Fabian M. Suchanek. 2015. Fast rule mining in ontological knowledge bases with AMIE+. The VLDB Journal.

Shu Guo, Quan Wang, Lihong Wang, Bin Wang, and Li Guo. 2016. Jointly embedding knowledge graphs and logical rules. In ACL.

Thomas N. Kipf and Max Welling. 2017. Semi-supervised classification with graph convolutional networks. In ICLR.

Thomas Rebele, Fabian Suchanek, Johannes Hoffart, Joanna Biega, Erdal Kuzey, and Gerhard Weikum. 2016. YAGO: A multilingual knowledge base from Wikipedia, WordNet, and GeoNames. In ISWC.

Fabian M. Suchanek, Serge Abiteboul, and Pierre Senellart. 2011. PARIS: Probabilistic alignment of relations, instances, and schema. Proceedings of the VLDB Endowment.

Zequn Sun, Wei Hu, and Chengkai Li. 2017. Cross-lingual entity alignment via joint attribute-preserving embedding. In ISWC.

Zequn Sun, Wei Hu, Qingheng Zhang, and Yuzhong Qu. 2018. Bootstrapping entity alignment with knowledge graph embedding. In IJCAI.

Petar Velickovic, Guillem Cucurull, Arantxa Casanova, Adriana Romero, Pietro Liò, and Yoshua Bengio. 2018. Graph attention networks. In ICLR.

Xiang Wang, Dingxian Wang, Canran Xu, Xiangnan He, Yixin Cao, and Tat-Seng Chua. 2018a. Explainable reasoning over knowledge graphs for recommendation. arXiv preprint arXiv:1811.04540.

Zhichun Wang, Juanzi Li, and Jie Tang. 2013. Boosting cross-lingual knowledge linking via concept annotation. In IJCAI.

Zhichun Wang, Qingsong Lv, Xiaohan Lan, and Yu Zhang. 2018b. Cross-lingual knowledge graph alignment via graph convolutional networks. In EMNLP.

Mengdi Zhang, Tao Huang, Yixin Cao, and Lei Hou. 2015. Target detection and knowledge learning for domain restricted question answering. In NLPCC.

Hao Zhu, Ruobing Xie, Zhiyuan Liu, and Maosong Sun. 2017. Iterative entity alignment via joint knowledge embeddings. In IJCAI.