Fast Solar Image Classification Using Deep Learning and its Importance for Automation in Solar Physics
The volume of data being collected in solar physics has increased exponentially over the past decade, and with the introduction of the Daniel K. Inouye Solar Telescope (DKIST) we will be entering the age of petabyte solar data. Automated feature detection will be an invaluable tool for post-processing solar images to create catalogues of data ready for researchers to use. We propose a deep learning model to accomplish this; a deep convolutional neural network is adept at feature extraction and at processing images quickly. We train our network on Hinode/Solar Optical Telescope (SOT) Hα images of a small subset of solar features with different geometries: filaments, prominences, flare ribbons, sunspots and the quiet Sun (i.e. the absence of any of the other four features). We achieve near-perfect performance (≈99.9%) when classifying unseen images from SOT, in 4.66 seconds. We also explore, for the first time, transfer learning in a solar context. Transfer learning uses pre-trained deep neural networks to help train new deep learning models, i.e. it teaches a new model. We show that our network is robust to changes in resolution by degrading images from SOT resolution (≈0.33″ at λ = 6563 Å) to Solar Dynamics Observatory/Atmospheric Imaging Assembly (SDO/AIA) resolution (≈1.2″) without any change in the performance of our network. However, we also observe where the network fails to generalise: to sunspots in the SDO/AIA 1600/1700 Å bands, due to small-scale brightenings around the sunspots, and to prominences in SDO/AIA 304 Å, due to coronal emission.
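As a rough illustration of the kind of classifier described above, the sketch below shows a small convolutional network that assigns an Hα image to one of the five classes. This is a minimal sketch assuming a PyTorch implementation; the architecture, layer sizes, class ordering and the SolarCNN name are illustrative assumptions and are not taken from the paper.

```python
# Minimal 5-class solar image classifier sketch (assumed PyTorch implementation;
# the actual architecture used in the paper is not specified in this abstract).
import torch
import torch.nn as nn

CLASSES = ["filament", "prominence", "flare ribbon", "sunspot", "quiet Sun"]

class SolarCNN(nn.Module):
    def __init__(self, n_classes: int = len(CLASSES)):
        super().__init__()
        # Stacked convolution + pooling blocks perform the feature extraction.
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Global pooling keeps the head independent of input resolution,
        # which matters when moving between SOT and degraded SDO/AIA-like scales.
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.classifier = nn.Linear(128, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        x = self.pool(x).flatten(1)
        return self.classifier(x)

if __name__ == "__main__":
    model = SolarCNN()
    # One single-channel 256x256 Hα image (batch size 1), random data as a stand-in.
    dummy = torch.randn(1, 1, 256, 256)
    probs = torch.softmax(model(dummy), dim=1)
    print(CLASSES[int(probs.argmax())])
```

In the transfer-learning setting mentioned in the abstract, the weights of a network like this, once trained on SOT images, would be loaded as the starting point for training on a new instrument's data rather than initialising from scratch; the details of how the authors do this are in the full paper.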