Scalable Interconnect Learning in Boolean Networks
Learned Differentiable Boolean Logic Networks (DBNs) already deliver efficient inference on resource-constrained hardware. We extend them with a trainable, differentiable interconnect whose parameter count remains constant as input width grows, allowing DBNs to scale to far wider layers than earlier learnable-interconnect designs while preserving their accuracy advantages. To further reduce model size, we propose two complementary pruning stages: a SAT-based logic-equivalence pass that removes redundant gates without affecting performance, and a similarity-based, data-driven pass that outperforms a magnitude-style greedy baseline and offers a superior compression-accuracy trade-off.
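As a rough illustration of the SAT-based redundancy-removal idea mentioned in the abstract (not the authors' actual implementation, whose encoding and tooling are not given here), the sketch below checks whether two gate outputs in a CNF-encoded circuit are logically equivalent: they are interchangeable exactly when no assignment can make them differ. The function name, the tiny example circuit, and the use of the `python-sat` package are illustrative assumptions.

```python
# Minimal sketch of SAT-based gate-equivalence checking (assumed setup,
# not the paper's code). Requires `pip install python-sat`.
from pysat.solvers import Glucose3


def gates_equivalent(clauses, out_a, out_b):
    """Return True if variables out_a and out_b are forced equal by `clauses`.

    `clauses` is a CNF (list of DIMACS clauses) encoding the circuit, e.g.
    Tseitin clauses for each gate; out_a / out_b are the variable ids of the
    two gate outputs being compared.
    """
    solver = Glucose3(bootstrap_with=clauses)
    # Instead of building an explicit XOR miter, solve twice under assumptions
    # that force the two outputs to disagree in each direction.
    differ_pos = solver.solve(assumptions=[out_a, -out_b])
    differ_neg = solver.solve(assumptions=[-out_a, out_b])
    solver.delete()
    # Equivalent iff no satisfying assignment makes the outputs differ.
    return not (differ_pos or differ_neg)


# Toy example: gate 3 = AND(1, 2) and gate 4 = AND(2, 1) compute the same
# function, so one of them is redundant and could be pruned.
cnf = [
    [-3, 1], [-3, 2], [3, -1, -2],   # v3 <-> (v1 AND v2)
    [-4, 2], [-4, 1], [4, -2, -1],   # v4 <-> (v2 AND v1)
]
print(gates_equivalent(cnf, 3, 4))   # True
```

In a pruning pass of this style, gates found equivalent can be merged by rewiring their fanout to a single representative, which leaves the network's input-output behavior unchanged.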
Fabian Kresse, Emily Yu, Christoph H. Lampert
Computing Technology, Computer Technology
Fabian Kresse, Emily Yu, Christoph H. Lampert. Scalable Interconnect Learning in Boolean Networks [EB/OL]. (2025-07-03) [2025-07-18]. https://arxiv.org/abs/2507.02585.