
Directed Graph Grammars for Sequence-based Learning

Source: arXiv

Abstract

Directed acyclic graphs (DAGs) are a class of graphs commonly used in practice, with examples that include electronic circuits, Bayesian networks, and neural architectures. While many effective encoders exist for DAGs, it remains challenging to decode them in a principled manner, because the nodes of a DAG can have many different topological orders. In this work, we propose a grammar-based approach to constructing a principled, compact and equivalent sequential representation of a DAG. Specifically, we view a graph as derivations over an unambiguous grammar, where the DAG corresponds to a unique sequence of production rules. Equivalently, the procedure to construct such a description can be viewed as a lossless compression of the data. Such a representation has many uses, including building a generative model for graph generation, learning a latent space for property prediction, and leveraging the sequence representational continuity for Bayesian Optimization over structured data. Code is available at https://github.com/shiningsunnyday/induction.
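The decoding difficulty the abstract points to can be seen on a toy diamond-shaped DAG: even with four nodes there is more than one valid topological order, so a naive "nodes in topological order" sequence target is not unique. A minimal sketch (the node and edge names are illustrative, not from the paper):

```python
from itertools import permutations

# Toy diamond DAG: A -> B, A -> C, B -> D, C -> D.
edges = {("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")}
nodes = ["A", "B", "C", "D"]

def is_topological(order, edges):
    """An ordering is topological iff every edge points forward in it."""
    pos = {v: i for i, v in enumerate(order)}
    return all(pos[u] < pos[v] for u, v in edges)

orders = [p for p in permutations(nodes) if is_topological(p, edges)]
print(orders)  # -> [('A', 'B', 'C', 'D'), ('A', 'C', 'B', 'D')]
```

Both orderings describe the same graph, which is exactly why the paper seeks a grammar whose derivation sequence for a DAG is unique.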

Michael Sun, Orion Foo, Gang Liu, Wojciech Matusik, Jie Chen

Subject: Computing Technology, Computer Science

Michael Sun, Orion Foo, Gang Liu, Wojciech Matusik, Jie Chen. Directed Graph Grammars for Sequence-based Learning [EB/OL]. (2025-05-28) [2025-06-25]. https://arxiv.org/abs/2505.22949.
