
Multi-Label Noise Robust Collaborative Learning for Remote Sensing Image Classification

Overview of attention for article published in IEEE Transactions on Neural Networks and Learning Systems, May 2024

About this Attention Score

  • Above-average Attention Score compared to outputs of the same age (53rd percentile)
  • Above-average Attention Score compared to outputs of the same age and source (62nd percentile)

Mentioned by

3 X users

Citations

15 Dimensions

Readers on

13 Mendeley
Title
Multi-Label Noise Robust Collaborative Learning for Remote Sensing Image Classification
Published in
IEEE Transactions on Neural Networks and Learning Systems, May 2024
DOI 10.1109/tnnls.2022.3209992
Authors

Ahmet Kerem Aksoy, Mahdyar Ravanbakhsh, Begüm Demir

Abstract

The development of accurate methods for multi-label classification (MLC) of remote sensing (RS) images is one of the most important research topics in RS. MLC methods based on convolutional neural networks (CNNs) have shown strong performance gains in RS. However, they usually require a large number of reliable training images annotated with multiple land-cover class labels. Collecting such data is time-consuming and costly. To address this problem, publicly available thematic products, which can include noisy labels, can be used to annotate RS images at zero labeling cost. However, multi-label noise (which can be associated with wrong and missing label annotations) can distort the learning process of MLC methods. To address this challenge, we propose a novel multi-label noise robust collaborative learning (RCML) method to alleviate the negative effects of multi-label noise during the training phase of a CNN model. RCML identifies, ranks, and excludes noisy multi-labels in RS images based on three main modules: 1) the discrepancy module; 2) the group lasso module; and 3) the swap module. The discrepancy module ensures that the two networks learn diverse features while producing the same predictions. The task of the group lasso module is to detect the potentially noisy labels assigned to multi-labeled training images, while the swap module is devoted to exchanging ranking information between the two networks. Unlike existing methods that make assumptions about the noise distribution, the proposed RCML does not make any prior assumption about the type of noise in the training set. The experiments conducted on two multi-label RS image archives confirm the robustness of the proposed RCML under extreme multi-label noise rates. Our code is publicly available at: http://www.noisy-labels-in-rs.org.
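The collaborative ranking-and-swap idea from the abstract can be sketched in a highly simplified NumPy form. This is an illustrative approximation, not the authors' implementation: the group lasso module is approximated here by simply ranking per-label losses and flagging the highest-loss entries as potentially noisy, and all function names (`per_label_bce`, `rank_noisy_labels`, `swapped_masked_loss`) are hypothetical.

```python
import numpy as np

def per_label_bce(probs, labels, eps=1e-7):
    # Element-wise binary cross-entropy for each (sample, label) pair.
    probs = np.clip(probs, eps, 1 - eps)
    return -(labels * np.log(probs) + (1 - labels) * np.log(1 - probs))

def rank_noisy_labels(losses, k):
    # Stand-in for the group lasso module: rank (sample, label) entries by
    # loss and flag the k highest-loss entries as potentially noisy.
    # Returns a boolean mask that is False at the flagged entries.
    flat = losses.ravel()
    noisy_idx = np.argsort(flat)[-k:]
    mask = np.ones_like(flat, dtype=bool)
    mask[noisy_idx] = False
    return mask.reshape(losses.shape)

def swapped_masked_loss(probs_a, probs_b, labels, k):
    # Swap module: each network's loss is masked using the OTHER network's
    # noisy-label ranking, so neither network trusts its own mistakes alone.
    loss_a = per_label_bce(probs_a, labels)
    loss_b = per_label_bce(probs_b, labels)
    mask_from_b = rank_noisy_labels(loss_b, k)
    mask_from_a = rank_noisy_labels(loss_a, k)
    return (loss_a * mask_from_b).mean(), (loss_b * mask_from_a).mean()

# Toy usage: the label at position (1, 1) disagrees strongly with the
# prediction (prob 0.05 vs. label 1), so it is excluded from the loss.
probs = np.array([[0.9, 0.1], [0.8, 0.05]])
labels = np.array([[1.0, 0.0], [1.0, 1.0]])
la, lb = swapped_masked_loss(probs, probs, labels, k=1)
```

In the actual RCML method the two networks are full CNNs trained jointly, and a discrepancy term additionally pushes them to learn diverse features while agreeing on predictions; this sketch only shows the ranking-and-swap mechanics on fixed probabilities.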

X Demographics


The data shown below were collected from the profiles of the 3 X users who shared this research output.
Mendeley readers


The data shown below were compiled from readership statistics for the 13 Mendeley readers of this research output.

Geographical breakdown

Country    Count   As %
Unknown       13   100%

Demographic breakdown

Readers by professional status         Count   As %
Student > Ph. D. Student                   3    23%
Professor > Associate Professor            1     8%
Lecturer                                   1     8%
Student > Doctoral Student                 1     8%
Unknown                                    7    54%

Readers by discipline                  Count   As %
Computer Science                           4    31%
Agricultural and Biological Sciences       1     8%
Psychology                                 1     8%
Engineering                                1     8%
Unknown                                    6    46%
Attention Score in Context


This research output has an Altmetric Attention Score of 2. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 27 October 2022.
All research outputs
#16,383,995
of 25,864,668 outputs
Outputs from IEEE Transactions on Neural Networks and Learning Systems
#1,422
of 3,418 outputs
Outputs of similar age
#73,833
of 169,369 outputs
Outputs of similar age from IEEE Transactions on Neural Networks and Learning Systems
#14
of 40 outputs
Altmetric has tracked 25,864,668 research outputs across all sources so far. This one is in the 34th percentile – i.e., 34% of other outputs scored the same or lower than it.
So far Altmetric has tracked 3,418 research outputs from this source. They receive a mean Attention Score of 2.7. This one has gotten more attention than average, scoring higher than 57% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 169,369 tracked outputs that were published within six weeks on either side of this one in any source. This one has gotten more attention than average, scoring higher than 53% of its contemporaries.
We're also able to compare this research output to 40 others from the same source and published within six weeks on either side of this one. This one has gotten more attention than average, scoring higher than 62% of its contemporaries.