2021 journal article

Robust Multi-Modal Sensor Fusion: An Adversarial Approach

IEEE SENSORS JOURNAL, 21(2), 1885–1896.

co-author countries: United States of America 🇺🇸
author keywords: Sensor fusion; Sensor phenomena and characterization; Generators; Sensor systems; Feature extraction; Multi-modal sensors; target detection; Generative Adversarial Networks (GAN); Event Driven Fusion
Source: Web of Science
Added: January 19, 2021

In recent years, multi-modal fusion has attracted considerable research interest in both academia and industry. Multi-modal fusion entails combining information from a set of different sensor types. By exploiting complementary information from different sensors, we show that target detection and classification can greatly benefit from this fusion approach, with a measurable gain in performance. Achieving this gain, however, requires a principled fusion strategy to ensure that the additional information is used constructively and has a positive impact on performance. We further demonstrate the viability of the proposed fusion approach by weakening its dependence on the functionality of every sensor, introducing additional flexibility and lifting a severe limitation in unconstrained surveillance settings subject to environmental effects. Our data-driven approach to multi-modal fusion exploits selected optimal features from a latent space estimated across all modalities. This latent space is learned via a generative network conditioned on the individual sensor modalities. The latent space, as an intrinsic structure, is then exploited to detect damaged sensors and thereby safeguard the performance of the fused sensor system. Experimental results show that this approach achieves automatic robustness against noisy or damaged sensors.
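
The following is a minimal, illustrative sketch (in PyTorch, not the authors' code) of the two ingredients the abstract describes: a generator conditioned on a modality identity that maps a shared latent code to that modality's features, and a simple residual test in feature space for flagging a damaged sensor. All names, layer sizes, and the thresholded-residual heuristic are assumptions made for illustration; the paper's adversarial training of the latent space is not reproduced here.

    # Minimal sketch, assuming PyTorch; dimensions and names are hypothetical.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    LATENT_DIM, FEAT_DIM, NUM_MODALITIES = 32, 128, 3

    class ConditionedGenerator(nn.Module):
        """Maps a shared latent code z plus a one-hot modality ID to that
        modality's feature vector (one generator shared across modalities)."""
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(LATENT_DIM + NUM_MODALITIES, 256),
                nn.ReLU(),
                nn.Linear(256, FEAT_DIM),
            )

        def forward(self, z, modality_id):
            onehot = F.one_hot(modality_id, NUM_MODALITIES).float()
            return self.net(torch.cat([z, onehot], dim=-1))

    def flag_damaged(gen, z, observed, threshold=1.0):
        """Flag modalities whose observed features deviate strongly from what
        the shared latent code predicts (a residual test; our assumption)."""
        flags = []
        for m in range(NUM_MODALITIES):
            mid = torch.full((z.shape[0],), m, dtype=torch.long)
            pred = gen(z, mid)                       # features implied by z
            residual = (pred - observed[m]).pow(2).mean(dim=-1)
            flags.append(residual > threshold)       # True = suspect sensor
        return flags

In the full approach the latent code would be estimated adversarially, with a discriminator driving the generator; here z is taken as given, and a modality whose observed features deviate strongly from the generator's prediction is treated as suspect, so its contribution can be down-weighted during fusion.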