Pre-processing training data improves accuracy and generalisability of convolutional neural network based landscape semantic segmentation

04/28/2023
by Andrew Clark, et al.

In this paper, we trialled different methods of data preparation for Convolutional Neural Network (CNN) training and semantic segmentation of land use land cover (LULC) features within aerial photography over the Wet Tropics and Atherton Tablelands, Queensland, Australia. This was conducted by trialling and ranking various training patch selection sampling strategies, patch and batch sizes, and data augmentations and scaling. We also compared model accuracy when producing the LULC classification from a single pass of a grid of patches against averaging multiple grid passes and three rotated versions of each patch. Our results showed: a stratified random sampling approach for producing training patches improved the accuracy of classes with a smaller area while having minimal effect on larger classes; a smaller number of larger patches, compared to a larger number of smaller patches, improved model accuracy; applying data augmentations and scaling is imperative for creating a generalised model able to accurately classify LULC features in imagery from a different date and sensor; and producing the output classification by averaging multiple grids of patches and three rotated versions of each patch produced a more accurate and aesthetic result. Combining the findings from the trials, we fully trained five models on the 2018 training image and applied them to the 2015 test image, with the output LULC classifications achieving an average kappa of 0.84, user accuracy of 0.81 and producer accuracy of 0.87. This study has demonstrated the importance of data pre-processing for developing a generalised deep-learning model for LULC classification which can be applied to a different date and sensor. Future research using CNN and earth observation data should implement the findings of this study to increase LULC model accuracy and transferability.
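To illustrate the rotation-averaged inference described in the abstract, the sketch below averages per-pixel class probabilities over a patch and its three rotations before taking the argmax. This is a minimal sketch, not the authors' implementation: the Keras-style `model.predict` interface, array shapes and NumPy-based rotation handling are assumptions.

```python
import numpy as np

def predict_patch_rotation_averaged(model, patch):
    """Average class probabilities over a patch and its three rotations.

    Assumes `model.predict` maps a (1, H, W, bands) array to
    (1, H, W, n_classes) softmax probabilities (Keras-style API).
    """
    preds = []
    for k in range(4):  # 0, 90, 180 and 270 degree rotations
        rotated = np.rot90(patch, k=k, axes=(0, 1))
        prob = model.predict(rotated[np.newaxis, ...])[0]
        preds.append(np.rot90(prob, k=-k, axes=(0, 1)))  # rotate back to original orientation
    return np.mean(preds, axis=0)

# Per-pixel class labels for one patch; repeating this over several offset
# grids of patches and averaging again would give the multi-grid output.
# class_map = np.argmax(predict_patch_rotation_averaged(model, patch), axis=-1)
```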
