Automatic Sea-Ice Classification of SAR Images Based on Spatial and Temporal Features Learning

Perra C.
2021-01-01

Abstract

Sea ice has a significant effect on climate change and ship navigation. Hence, it is crucial to draw sea-ice maps that reflect the geographical distribution of different types of sea ice. Many automatic sea-ice classification methods for synthetic aperture radar (SAR) images are based on the polarimetric characteristics or image texture features of sea ice. They either require expert knowledge to design parameters and features or are sensitive to noise and changing conditions, and they often ignore how the ice changes over time. In this article, we propose a new SAR sea-ice image classification method that jointly learns spatial and temporal features with a residual convolutional neural network (ResNet) and a long short-term memory (LSTM) network, achieving automatic and refined classification of sea-ice types. First, we construct a seven-type ice data set according to the Canadian Ice Service ice charts. We then extract spatial feature vectors from a time series of sea-ice samples using a trained ResNet. Using these feature vectors as inputs, the LSTM network further learns how the sea-ice samples vary over time. Finally, the extracted high-level features are fed into a softmax classifier that outputs the most recent ice type. By taking both spatial features and temporal variation into account, our method achieves a classification accuracy of 95.7% for seven ice types and automatically produces more objective sea-ice interpretation maps, revealing the detailed distribution of sea ice and improving the efficiency of sea-ice monitoring tasks.
Keywords: Ice charts; long short-term memory (LSTM); residual convolution network; sea-ice classification; synthetic aperture radar (SAR); surface roughness; radar polarimetry; sea surface
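The pipeline described in the abstract (ResNet spatial features extracted from a time series of SAR patches, an LSTM that models their temporal evolution, and a softmax classifier over seven ice types) can be outlined with a minimal PyTorch sketch. This is an illustrative reconstruction rather than the authors' published code: the ResNet-18 backbone, hidden size, patch size, and three-channel input are assumptions made for demonstration.

import torch
import torch.nn as nn
from torchvision import models


class SpatialTemporalIceClassifier(nn.Module):
    """Minimal ResNet + LSTM sketch for sequence-based sea-ice classification."""

    def __init__(self, num_classes: int = 7, hidden_size: int = 256):
        super().__init__()
        # ResNet-18 backbone used purely as a spatial feature extractor
        # (the paper does not specify the ResNet depth; 18 is an assumption).
        backbone = models.resnet18(weights=None)
        feat_dim = backbone.fc.in_features          # 512 for ResNet-18
        backbone.fc = nn.Identity()                 # drop the ImageNet head
        self.backbone = backbone
        # The LSTM models how the per-patch feature vectors evolve over time.
        self.lstm = nn.LSTM(feat_dim, hidden_size, batch_first=True)
        # Linear layer producing class logits; softmax is applied by the loss
        # (nn.CrossEntropyLoss) during training or explicitly at inference.
        self.classifier = nn.Linear(hidden_size, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, channels, height, width) sequence of SAR patches
        b, t, c, h, w = x.shape
        feats = self.backbone(x.reshape(b * t, c, h, w))   # (b*t, feat_dim)
        feats = feats.reshape(b, t, -1)                    # (b, t, feat_dim)
        _, (h_n, _) = self.lstm(feats)                     # final hidden state
        return self.classifier(h_n[-1])                    # (b, num_classes)


# Example: 2 sequences of 5 time steps, 3-channel 128 x 128 patches
# (channel count and patch size are also assumptions).
model = SpatialTemporalIceClassifier()
logits = model(torch.randn(2, 5, 3, 128, 128))
print(logits.shape)  # torch.Size([2, 7])

Replacing the ResNet classification head with an identity mapping turns the backbone into a per-patch feature extractor, so only the final hidden state of the LSTM is classified; consistent with the abstract's two-stage description, the backbone would typically be trained on single SAR patches first and then reused for the temporal model.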
Files in this record:
  Automatic_Sea-Ice_Classification_of_SAR_Images_Based_on_Spatial_and_Temporal_Features_Learning.pdf
  Description: online article (Early Access)
  Type: publisher's version (VoR)
  Size: 6.48 MB
  Format: Adobe PDF
  Access: archive administrators only (a copy may be requested)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11584/321731
Citations
  • Scopus: 31
  • Web of Science (ISI): 24