Paper Title
DevFormer: A Symmetric Transformer for Context-Aware Device Placement
Paper Authors
Paper Abstract
In this paper, we present DevFormer, a novel transformer-based architecture for addressing the complex and computationally demanding problem of hardware design optimization. Despite the demonstrated efficacy of transformers in domains including natural language processing and computer vision, their use in hardware design has been limited by the scarcity of offline data. Our approach addresses this limitation by introducing strong inductive biases, such as relative positional embeddings and action-permutation symmetry, that effectively capture the hardware context and enable efficient design optimization with limited offline data. We apply DevFormer to the problem of decoupling capacitor placement and show that it outperforms state-of-the-art methods in both simulated and real hardware, leading to improved performance while reducing the number of components by more than $30\%$. Finally, we show that our approach achieves promising results in other offline contextual learning-based combinatorial optimization tasks.