SeeByte to Develop Enhanced Situational Awareness for Future Active Protection Systems
Future Active Protection Systems (APS), and specifically Modular Integrated Protection Systems (MIPS), are likely to incorporate passive sensor subsystems as a crucial element of their sensing suites.
With advanced image processing techniques, passive imaging sensors could provide enhanced situational awareness (SA) capabilities, such as object detection, identification and tracking, image segmentation, and range estimation, whilst also performing their core APS function.
Under Phase 2 of the “Advanced Vision 2020 and Beyond” competition, run by the Defence Science and Technology Laboratory (Dstl) and the Defence and Security Accelerator (DASA), SeeByte will develop a state-of-the-art multi-task deep neural network (DNN) framework. The multi-task architecture provides semantic image segmentation, object detection, and depth estimation (bearing and range) outputs for monocular electro-optic/infrared (EO/IR) sensors.
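The core idea of a multi-task architecture is a single shared feature encoder feeding several lightweight task-specific heads. The sketch below is purely illustrative and is not SeeByte's design: it uses a toy per-pixel dense encoder (standing in for a convolutional backbone) and three assumed heads for segmentation, detection scoring, and depth, with all dimensions and weights invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Hypothetical toy dimensions: a 16x16 single-channel frame
H, W = 16, 16
F = 32  # shared feature width (assumed)

# --- Shared encoder: one dense layer applied per pixel, a toy
# stand-in for the convolutional backbone a real multi-task DNN uses ---
W_enc = rng.normal(scale=0.1, size=(1, F))

def encode(img):
    # img: (H, W) -> per-pixel shared features (H, W, F)
    return relu(img[..., None] @ W_enc)

# --- Task heads, all reading the SAME shared features ---
N_CLASSES = 4  # number of segmentation classes (assumed)
W_seg = rng.normal(scale=0.1, size=(F, N_CLASSES))
W_det = rng.normal(scale=0.1, size=(F, 1))  # per-pixel objectness
W_dep = rng.normal(scale=0.1, size=(F, 1))  # per-pixel range

def forward(img):
    feats = encode(img)
    seg = feats @ W_seg            # (H, W, N_CLASSES) class logits
    det = feats @ W_det            # (H, W, 1) detection scores
    depth = relu(feats @ W_dep)    # (H, W, 1) non-negative range
    return seg, det[..., 0], depth[..., 0]

frame = rng.random((H, W))
seg, det, depth = forward(frame)
print(seg.shape, det.shape, depth.shape)  # (16, 16, 4) (16, 16) (16, 16)
```

Because the three heads share one encoder, the bulk of the computation is amortised across tasks, which is also what makes the model-compression result mentioned below plausible.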
SeeByte’s previous experience with multi-task DNN architectures has demonstrated that model size and complexity can be substantially compressed without a loss of performance.
In later phases, SeeByte will also address the scarcity of imagery datasets containing relevant target objects by using Generative Adversarial Networks (GANs) to inject synthetic objects into real imagery.
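The injection step itself can be pictured as compositing a generated object patch into a real frame. The sketch below is a minimal illustration under assumed inputs: the `patch` would in practice come from a trained GAN generator, but here it is random data, and the soft alpha `mask` and placement are invented for the example.

```python
import numpy as np

def inject_object(frame, patch, mask, top, left):
    """Alpha-composite a synthetic object patch into a real frame.

    frame: (H, W) real image; patch/mask: (h, w) synthetic object and
    its soft alpha mask. In a real pipeline the patch would be produced
    by a GAN generator; here it is random data for illustration.
    """
    out = frame.copy()
    h, w = patch.shape
    region = out[top:top + h, left:left + w]
    out[top:top + h, left:left + w] = mask * patch + (1 - mask) * region
    return out

rng = np.random.default_rng(1)
frame = rng.random((64, 64))   # stand-in for a real EO/IR frame
patch = rng.random((8, 8))     # stand-in for a GAN-generated object
mask = np.full((8, 8), 0.9)    # soft blend so edges are not hard

augmented = inject_object(frame, patch, mask, top=20, left=30)
print(augmented.shape)  # (64, 64)
```

Augmented frames like this can then be fed back into training, enlarging the effective dataset of imagery that actually contains the target objects.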