AQT: Adversarial Query Transformers for Domain Adaptive Object Detection

Wei Jie Huang, Yu Lin Lu, Shih Yao Lin, Yusheng Xie, Yen Yu Lin

Research output: Conference contribution › Peer-reviewed

18 Citations (Scopus)

Abstract

Adversarial feature alignment is widely used in domain adaptive object detection. Despite its effectiveness on CNN-based detectors, its applicability to transformer-based detectors is less studied. In this paper, we present AQT (adversarial query transformers) to integrate adversarial feature alignment into detection transformers. The generator is a detection transformer which yields a sequence of feature tokens, and the discriminator consists of a novel adversarial token and a stack of cross-attention layers. The cross-attention layers take the adversarial token as the query and the feature tokens from the generator as the key-value pairs. Through adversarial learning, the adversarial token in the discriminator attends to the domain-specific feature tokens, while the generator produces domain-invariant features, especially on the attended tokens, hence realizing adversarial feature alignment on transformers. Thorough experiments over several domain adaptive object detection benchmarks demonstrate that our approach performs favorably against the state-of-the-art methods. Source code is available at https://github.com/weii41392/AQT.
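The core mechanism described above — a learnable adversarial token that queries the generator's feature tokens through cross-attention to produce a domain prediction — can be illustrated with a minimal sketch. The code below is a hypothetical single-head, NumPy-only illustration, not the authors' implementation (see the linked repository for that); all names and dimensions are assumptions for exposition.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query, keys, values):
    """Scaled dot-product cross-attention: the query attends over the tokens.

    query:  (1, d)        -- the adversarial token
    keys:   (seq_len, d)  -- feature tokens from the detection transformer
    values: (seq_len, d)  -- same tokens, used as values
    """
    d = query.shape[-1]
    scores = query @ keys.T / np.sqrt(d)   # (1, seq_len) similarity scores
    weights = softmax(scores)              # attention distribution over tokens
    return weights @ values, weights       # attended feature, attention weights

# Hypothetical shapes: one adversarial token querying 10 feature tokens.
rng = np.random.default_rng(0)
d_model, seq_len = 32, 10
adv_token = rng.normal(size=(1, d_model))             # learnable query
feature_tokens = rng.normal(size=(seq_len, d_model))  # generator output

attended, weights = cross_attention(adv_token, feature_tokens, feature_tokens)
# `attended` would feed a small domain classifier; through adversarial
# training the weights concentrate on domain-specific tokens.
```

In the full method, several such layers are stacked, and a gradient-reversal-style adversarial objective pushes the generator to make the attended tokens domain-invariant.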

Original language: English
Title of host publication: Proceedings of the 31st International Joint Conference on Artificial Intelligence, IJCAI 2022
Editor: Luc De Raedt
Publisher: International Joint Conferences on Artificial Intelligence
Pages: 972-979
Number of pages: 8
ISBN (Electronic): 9781956792003
Publication status: Published - 2022
Event: 31st International Joint Conference on Artificial Intelligence, IJCAI 2022 - Vienna, Austria
Duration: 23 Jul 2022 - 29 Jul 2022

Publication series

Name: IJCAI International Joint Conference on Artificial Intelligence
ISSN (Print): 1045-0823

Conference

Conference: 31st International Joint Conference on Artificial Intelligence, IJCAI 2022
Country/Territory: Austria
City: Vienna
Period: 23/07/22 - 29/07/22
