Neural-Symbolic Recursive Machine for Systematic Generalization

Qing Li¹   Yixin Zhu³   Yitao Liang¹,³   Ying Nian Wu²   Song-Chun Zhu¹,³   Siyuan Huang¹
¹Beijing Institute for General Artificial Intelligence (BIGAI)   ²UCLA   ³Peking University

TL;DR: We present the Neural-Symbolic Recursive Machine (NSR) for systematic generalization, achieving state-of-the-art performance on SCAN, PCFG, HINT, and a compositional machine translation task.

Abstract

Current learning models often struggle with human-like systematic generalization, particularly in learning compositional rules from limited data and extrapolating them to novel combinations. We introduce the Neural-Symbolic Recursive Machine (NSR), whose core is a Grounded Symbol System (GSS) in which combinatorial syntax and semantics emerge directly from training data. The NSR adopts a modular design that integrates neural perception, syntactic parsing, and semantic reasoning; these components are trained jointly through a novel deduction-abduction algorithm. Our findings demonstrate that NSR's design, which embeds the inductive biases of equivariance and compositionality, gives it the expressiveness to handle diverse sequence-to-sequence tasks and to generalize systematically. We evaluate NSR on four challenging benchmarks designed to probe systematic generalization: SCAN for semantic parsing, PCFG for string manipulation, HINT for arithmetic reasoning, and a compositional machine translation task. The results show that NSR outperforms contemporary neural and hybrid models in both generalization and transferability.
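To make the modular design concrete, below is a minimal Python sketch of the three-stage pipeline the abstract describes (neural perception, syntactic parsing, recursive semantic reasoning). All class and function names here are illustrative assumptions, not the authors' actual implementation or API; the perceive, parse, and apply components stand in for the learned modules.

from dataclasses import dataclass, field
from typing import Any, Callable, List

@dataclass
class Node:
    # A grounded symbol: a discrete token emitted by perception,
    # its syntactic children, and its semantic value once evaluated.
    token: str
    children: List["Node"] = field(default_factory=list)
    value: Any = None

class NSRPipeline:
    # Hypothetical wiring of NSR's three modules; in the paper each is a
    # learned component, passed in here as a plain callable for clarity.
    def __init__(self,
                 perceive: Callable[[Any], str],           # raw input -> symbol
                 parse: Callable[[List[Node]], Node],      # symbols -> parse tree
                 apply: Callable[[str, List[Any]], Any]):  # symbol + args -> value
        self.perceive, self.parse, self.apply = perceive, parse, apply

    def run(self, raw_inputs: List[Any]) -> Any:
        symbols = [Node(self.perceive(x)) for x in raw_inputs]
        tree = self.parse(symbols)
        return self.evaluate(tree)

    def evaluate(self, node: Node) -> Any:
        # Recursive semantic reasoning: evaluate children bottom-up,
        # then apply the semantics of the node's own token.
        args = [self.evaluate(child) for child in node.children]
        node.value = self.apply(node.token, args)
        return node.value

As a toy example in the spirit of HINT, running this pipeline on a handwritten expression such as "1 + 2" would ground each image to a token, parse "+" as the root with "1" and "2" as children, and evaluate bottom-up to 3. Since the intermediate symbols, parse tree, and semantics are not directly supervised, the paper's deduction-abduction algorithm trains the modules by searching for an internal symbolic trace consistent with the observed final answer.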

BibTeX

@inproceedings{li2024nsr,
    title={Neural-Symbolic Recursive Machine for Systematic Generalization},
    author={Li, Qing and Zhu, Yixin and Liang, Yitao and Wu, Ying Nian and Zhu, Song-Chun and Huang, Siyuan},
    booktitle={International Conference on Learning Representations (ICLR)},
    year={2024}
}