Posted: 2021/08/18 17:02 | Author: NICA

The first author of the article is Dan Li. Title: “Deep Modality Assistance Co-Training Network for Semi-Supervised Multi-Label Semantic Decoding”

Abstract: Multi-label semantic decoding is a challenging task of great scientific significance and application value. Existing methods mainly focus on label learning and ignore the information contained in the samples themselves, especially non-image samples, which may limit their performance. To address these issues, we propose a novel semi-supervised modality assistance co-training network, which utilizes the image modality to assist the non-image modality in multi-label learning. In real applications, there are two thorny issues: (i) the non-image modality tends to be missing owing to the difficulty of obtaining it; (ii) although images are easy to obtain from the Internet, image label annotation is still time-consuming and expensive. Therefore, the proposed method utilizes a small number of paired and labeled images and non-image modalities, together with a large number of unpaired and unlabeled images from web sources, to improve results. It consists of modality-specific feature generators, semantic translators, and a label relation network. Specifically, the modality-specific feature generators produce different features (views) for each modality. The semantic translators capture the relationship between the paired modalities and impute the missing modality's features using the unpaired and unlabeled images. The label relation network is a graph convolutional network (GCN) that captures the correlations between labels. To mine the information in unlabeled features, a co-training mechanism is adopted; with this mechanism, we introduce a multi-view orthogonality constraint and a multi-label co-regularization constraint. Extensive experiments on three computer vision and neuroscience datasets demonstrate the effectiveness of the proposed method.
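To make the two co-training constraints named in the abstract more concrete, below is a minimal PyTorch sketch. The paper's exact loss definitions are not given in the abstract, so the forms here are assumptions: a cross-correlation penalty as the multi-view orthogonality constraint and an agreement penalty between per-label predictions as the multi-label co-regularization constraint. All tensor names, shapes, and weights are hypothetical.

```python
import torch
import torch.nn.functional as F

def orthogonality_loss(view_a: torch.Tensor, view_b: torch.Tensor) -> torch.Tensor:
    """Multi-view orthogonality: encourage the features from the two
    generators to be decorrelated so the views carry complementary
    information. view_a, view_b: (batch, dim) feature matrices."""
    a = F.normalize(view_a, dim=1)
    b = F.normalize(view_b, dim=1)
    # Penalize the squared Frobenius norm of the cross-correlation matrix.
    return (a.t() @ b).pow(2).sum()

def co_regularization_loss(logits_a: torch.Tensor, logits_b: torch.Tensor) -> torch.Tensor:
    """Multi-label co-regularization: push the two views' per-label
    predictions toward agreement on unlabeled samples.
    logits_a, logits_b: (batch, num_labels) raw label scores."""
    p_a = torch.sigmoid(logits_a)
    p_b = torch.sigmoid(logits_b)
    return F.mse_loss(p_a, p_b)

# Usage on an unlabeled batch: combine the constraints with weights
# (lambda values are hypothetical hyperparameters).
feat_a, feat_b = torch.randn(32, 128), torch.randn(32, 128)
log_a, log_b = torch.randn(32, 20), torch.randn(32, 20)
loss = 0.5 * orthogonality_loss(feat_a, feat_b) \
     + 1.0 * co_regularization_loss(log_a, log_b)
```

In a co-training setup of this kind, the orthogonality term keeps the two views from collapsing into copies of each other, while the co-regularization term transfers supervision between them on unlabeled data; the abstract indicates the method pairs these with supervised losses on the labeled subset.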

Article link: https://ieeexplore.ieee.org/document/9516952