Neural Memory Networks for Robust Classification of Seizure Type

David Ahmedt-Aristizabal, et al. (CSIRO; QUT) · 12/10/2019

Classification of seizure type is a key step in the clinical process for evaluating an individual who presents with seizures. It determines the course of clinical diagnosis and treatment, and its impact stretches beyond the clinical domain to epilepsy research and the development of novel therapies. Automated identification of seizure type may facilitate understanding of the disease, and seizure detection and prediction have been the focus of recent research that has sought to exploit the benefits of machine learning and deep learning architectures. Nevertheless, there is not yet a definitive solution for automating the classification of seizure type, a task that must currently be performed by an expert epileptologist. Inspired by recent advances in neural memory networks (NMNs), we introduce a novel approach for the classification of seizure type using electrophysiological data. We first explore the performance of traditional deep learning techniques which use convolutional and recurrent neural networks, and enhance these architectures by using external memory modules with trainable neural plasticity. We show that our model achieves a state-of-the-art weighted F1 score of 0.945 for seizure type classification on the TUH EEG Seizure Corpus with the IBM TUSZ preprocessed data. This work highlights the potential of neural memory networks to support the field of epilepsy research, along with biomedical research and signal analysis more broadly.


I Introduction

Epilepsy is one of the most prevalent neurological conditions, and people with epilepsy experience recurrent seizures. Separating individual seizures into different types helps guide antiepileptic therapies [19]. Classification of seizures serves many purposes: it is informative of the potential triggers for a patient's seizures, of the risks of comorbidities including intellectual disability and learning difficulties, of mortality risks such as sudden unexpected death in epilepsy, and of psychiatric features such as autism spectrum disorder [19].

Together with observation of clinical signs, electroencephalography (EEG) plays a major role in seizure type evaluation, and automating this process can support clinical evaluation. Recent advances in artificial intelligence and deep learning have demonstrated high success in other healthcare applications using brain signals [15]. However, the application of these architectures within neuroscience, and specifically to the processing of EEG recordings for epilepsy research, has been limited to date [4, 13]. Current deep learning approaches have mostly focused on the goals of seizure detection [48, 5] and seizure onset prediction [26], and deep convolutional neural networks (CNNs) and recurrent neural networks (RNNs) have been the most common architectures proposed to capture patterns during seizures [49, 43, 21]. Nevertheless, the automated capability to discriminate among seizure types (e.g. focal or generalized seizures) is a challenging and largely underdeveloped field due to both a lack of datasets and the highly complex nature of the task. A significant data resource, the TUH EEG Corpus [40], has recently become available for epilepsy research, creating a unique opportunity to evaluate deep learning techniques. To date, a limited set of methods have been applied to this challenging corpus in full for the task of seizure classification [35, 7, 42], although some researchers have used a small number of data samples from selected seizure types as input for their models [36, 37, 41].

Motivated by the tremendous success of neural memory networks in precisely storing and retrieving relevant information [32, 18, 17], we propose a novel approach based on long-term memory modules to identify and exploit relationships across the entire EEG data set of seizure events. We capture the variability, both intrasubject (seizures of the same patient) and intersubject (seizures across patients), for each epilepsy type in this long-term relationship. One of the main limitations of using traditional recurrent neural networks such as Long Short-Term Memory (LSTM) [22] or Gated Recurrent Unit (GRU) [11] layers with seizure recordings is that they focus more on the recent history, and previous memories are lost after updates [10]; i.e. they consider dependencies only within a given input sequence. To address this limitation, we need to extract and store events over time, and this is possible with an external memory bank. In this scenario, a framework with an external memory should also learn when to store an event, as well as when to recall it for use in the future [17]. With the help of the external memory, the network no longer needs to squeeze all useful past information into the state variable (the cell state that saves information from the past) of the LSTM or GRU. We also adopt the concept of synaptic plasticity, which emulates the biological process of the same name to enable efficient lifelong learning, and to enhance the attention-based knowledge retrieval mechanisms used in memory networks [31]. The plastic neural memory exploits both static and dynamic connection weights in the memory read, write and output generation procedures (i.e. a connection between any two neurons has both a fixed component and a plastic component) [16].

In this research, we perform cross-patient seizure type classification, with the application of supporting the analysis of scalp EEG seizure recordings where epileptologists are not available. We first explore the feasibility of adapting deep learning algorithms that have shown promising results for seizure detection to the specific task of seizure type classification. Then, we introduce our framework based on memory networks and trainable neural plasticity [31], which is a mechanism for knowledge discovery, i.e. a dynamic strategy to read and write relevant information to capture temporal relationships. We expand on the work introduced for anomaly detection [16] to demonstrate the potential of our architecture for the complex task of multi-class classification of seizures, and compare the results with previously published baseline methods.

Our technical contributions are summarized as follows:

  1. We present baseline results that compare several traditional deep learning algorithms originally proposed for seizure detection, optimised and evaluated for the task of classifying seizure types.

  2. We propose a robust approach based on neural memory networks which outperforms state-of-the-art methods for seizure type classification on the TUH EEG Seizure Corpus [40] and the IBM TUSZ pre-processed data [35].

  3. We introduce the first application of memory modules, which are capable of mapping long-term relationships, to the field of epilepsy research and demonstrate how they can provide a clear separation between classes using the extracted memory embeddings.

II Methodology

In this paper, we propose a neural memory network (NMN), which facilitates trainable neural plasticity for robust classification of seizure types. We compare and explore the differences between traditional deep learning techniques such as recurrent convolutional neural networks (RCNN) and our proposed framework using an external memory module. Traditional deep learning techniques exploit short spatio-temporal relationships to model sequential data. Memory modules, on the other hand, act as a large knowledge-store, and instead of making decisions based on the current observation (input data sample), map long-term relationships across all seizure recordings. A typical memory module [47] consists of a memory stack for information storage, a read controller to query the knowledge stored in the memory, an update controller to update the memory with new knowledge, and an output controller which controls what results are passed out from the memory. An abstract view of these components and their interaction with the specific application of seizure classification is given in Fig. 1. We compare the proposed approach with baseline algorithms for seizure detection [29, 1, 33, 23, 5, 44, 43, 21] and classification [35, 7, 42], and train all methods using supervised learning for direct comparison.

Fig. 1: Overview of the framework proposed for classifying seizure types using sequential and neural memory networks. 1. We use the TUH EEG Seizure Corpus, which contains scalp EEG data from seizure recordings, and a pre-processing strategy based on the fast Fourier transform. 2. We encode each data sample with 2 stacked LSTMs, whose output is the input to the memory model. 3. External memory model: the state of the memory at time instant $t$ is $M_{t-1}$. The input controller receives the encoded hidden state $h_t$ and determines what facts within the input data to use to query the memory, generating the query vector $q_t$. An attention score vector $a_t$ is used to quantify the similarity between the content stored in each slot of $M_{t-1}$ and the query vector $q_t$, generating the input to the output controller. The output controller regulates what results from the memory stack ($M_{t-1}$) are passed out of the memory module as the output for the current state ($r_t$). The update controller updates the memory state to $M_t$ based on the output of the memory module and propagates it to the next time step. These controllers utilise a combination of fixed weights and plastic components. 4. The output of the memory model is fed to a dense layer with a soft-max activation to predict each seizure type.
Seizure Type Seizures Patients Data samples
1. Focal Non-Specific Seizure (FNSZ) 992 108 292,725
2. Generalized Non-Specific Seizure (GNSZ) 415 44 137,033
3. Simple Partial Seizure (SPSZ) 44 2 6,028
4. Complex Partial Seizure (CPSZ) 342 34 132,200
5. Absence Seizure (ABSZ) 99 12 3,087
6. Tonic Seizure (TNSZ) 67 2 4,888
7. Tonic Clonic Seizure (TCSZ) 50 11 22,524
TABLE I: Total counts of seizures, patients and data samples for each seizure type.

II-A Seizure Dataset

We use the world's largest publicly available database of EEG recordings, the Temple University Hospital EEG (TUH EEG) database [40]. We focus on its subset, the TUH EEG Seizure Corpus (TUSZ, v1.4.0), which was developed to motivate research on seizure detection. Recordings were sampled at 250 Hz and contain the standard channels of a 10-20 configuration. The seizure corpus contains 2012 seizures of varying lengths and eight types of seizure. Some seizures of the same patient are categorised with different seizure types. Seizure recordings were annotated based on the following manifestations: electrographic, electroclinical, and clinical. For the seizure type classification experiments, we exclude only myoclonic seizures because of the small number of seizures recorded (three seizure events). The seven types of seizure selected for analysis are Focal Non-Specific Seizure (FNSZ), Generalized Non-Specific Seizure (GNSZ), Simple Partial Seizure (SPSZ), Complex Partial Seizure (CPSZ), Absence Seizure (ABSZ), Tonic Seizure (TNSZ), and Tonic Clonic Seizure (TCSZ). The data for one seizure event consists of only the interval that contained a seizure, based on the labeling reported in [40]. One class is defined as the combination of all seizure recordings across sessions and patients for the same seizure type. Although we consider here the seven classes of seizure labeled in the corpus based on neurologists' reports as described in [40], we note that these are not clinically disjoint classes. Clinically, SPSZ and CPSZ are more specific subclasses of FNSZ, while ABSZ, TNSZ, and TCSZ are more specific subclasses of GNSZ. Thus, in cases where there was insufficient evidence to classify the type of seizure more finely, the corpus categorises the seizure event as the more general class of FNSZ or GNSZ, depending on how and where it began in the brain.

We adopt the preprocessed version of TUSZ known as the IBM TUSZ pre-processed data (v1.0.0, method #1) [35]. This work used the temporal central parasagittal (TCP) montage [39] to select 20 montage channels as the input. In this preprocessing method, the authors computed time-frequency representations of EEG patterns to capture rich information from the raw EEG signals. A fast Fourier transform (FFT) was applied to overlapping 1-second windows across all EEG recording channels, retaining a fixed set of frequency bands per window, as illustrated in Fig. 1. The transformed data of all channels in one time window constitutes one data sample; thus the task here is to perform classification based on 1 second of EEG data. The number of data samples per seizure type corresponds to the total number of windows across all seizures and all patients. The input representation used to train and test the model for seizure type classification has the shape [#data samples, #channels, #frequency bands]. We adopt this input data to compare the performance of our framework with baseline results using traditional machine learning and deep learning techniques. Table I summarises the total number of seizures, patients and data samples available for each seizure type.
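For illustration, a minimal sketch of this style of FFT-based windowing is given below. The window length of 1 second follows the description above, but the 0.5-second overlap, the 1-24 Hz band selection and the log scaling are assumptions used only to make the example concrete, not the exact IBM TUSZ preprocessing parameters.

```python
import numpy as np

def fft_features(eeg, fs=250, win_sec=1.0, overlap_sec=0.5, max_freq_hz=24):
    """Convert multi-channel EEG [channels, samples] into FFT-based data
    samples of shape [#windows, #channels, #frequency_bands].

    overlap_sec, max_freq_hz and the log scaling are illustrative
    assumptions, not the exact IBM TUSZ preprocessing configuration.
    """
    win = int(win_sec * fs)
    step = win - int(overlap_sec * fs)
    n_channels, n_samples = eeg.shape
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    band_idx = np.where((freqs >= 1) & (freqs <= max_freq_hz))[0]

    samples = []
    for start in range(0, n_samples - win + 1, step):
        window = eeg[:, start:start + win]                # [channels, win]
        spectrum = np.abs(np.fft.rfft(window, axis=1))    # magnitude spectrum
        samples.append(np.log1p(spectrum[:, band_idx]))   # log-scaled bands
    return np.stack(samples)                              # [#windows, ch, bands]

# Example: 30 s of synthetic 20-channel EEG sampled at 250 Hz
dummy_eeg = np.random.randn(20, 30 * 250)
X = fft_features(dummy_eeg)
print(X.shape)   # e.g. (59, 20, 24)
```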

II-B Traditional deep learning methods and baseline models

Deep learning has revolutionised many medical applications, and with the increasing availability of EEG datasets, these algorithms have been applied to quantify information regarding seizures [2, 4]. We aim to adapt well-known classical deep learning structures from related domains and evaluate them for the specific task of seizure type classification. While the objective of seizure detection is to classify the input data into two classes (a seizure class and a non-seizure class), seizure type classification aims to identify different types of epileptic seizures; i.e. it is a multi-class classification task. As such, methods initially proposed for seizure detection can be applied to the task of seizure type classification.

As it is not possible to consider all existing methods for seizure detection and classification in our study, we adopt only the most significant approaches, selected based on their overall precision, the compatibility of their input data (e.g. preprocessing, image-based EEG) with the IBM TUSZ pre-processed data [35], and the level of documentation provided by the original authors to ensure full and correct reproducibility of each model.

II-B1 Techniques adapted from the seizure detection domain

The following methods are adapted to the task of seizure classification and are evaluated in this paper:

  • Stacked auto-encoders (SAE): SAEs are an unsupervised learning technique composed of multiple sparse auto-encoders [27]. An auto-encoder consists of two parts, an encoder and a decoder. The encoder maps the input data to a hidden representation, and the decoder reconstructs the input data from the hidden representation. SAE-based approaches have been used by [29, 20].

  • Convolutional neural networks (CNNs): A CNN consists of multiple stacked layers of different types: convolutional layers, nonlinear layers, and pooling layers, followed by fully connected layers. Compared with traditional feed-forward neural networks, CNNs exploit spatial locality by enforcing local connectivity and parameter sharing [28]. The purpose of pooling is to achieve invariance to small local distortions and reduce the dimensionality of the feature space [28]. The differences between the various proposed CNN architectures lie in the number of layers included in the framework and the layer parameters. Some methods fine-tune a well-known architecture such as VGG or ResNet, while others design their own deep or shallow network. Methods including [34, 6, 1, 45, 33, 23, 46] are examples of this approach.

  • Recurrent neural networks (RNNs): RNNs introduce the notion of time into a deep learning model by including recurrent edges that span adjacent time steps [30]. RNNs are termed recurrent as they perform the same task for every element of a sequence, with the output being dependent on the previous computations. LSTMs [22] were proposed to provide more flexibility to RNNs by employing an external memory, termed the cell state, to deal with the vanishing gradient problem. Three logic gates are also introduced to adjust this external memory and the internal memory. GRUs [11] are a variant of LSTMs which combine the forget and input gates, making the model simpler. RNNs are used by papers including [5, 44].

  • Hybrid networks: Hybrid or cascaded networks such as recurrent convolutional neural networks (RCNNs) are used to better exploit variable-length sequential data [9], and to extract spatio-temporal features and classify them through an end-to-end deep learning model [14]. In this scenario, RCNN denotes a number of convolution layers followed by stacked recurrent units (LSTMs or GRUs). Such methods have been proposed in [43, 21, 39].

Given the dynamic nature of EEG data, RCNNs appear to be a reasonable choice for modeling the temporal evolution of brain activity. Therefore, we have designed a shallow RCNN based on architectures used for seizure detection with video recordings as an input [2, 3]. We aim to demonstrate that shallow architectures are capable of reaching similar results to more complex traditional deep learning models. Through extensive experiments, the network architecture that showed the best performance consists of 1) CNN: two stacked convolutional layers (32 kernels each) followed by one max-pooling layer and a fully-connected layer (512 nodes); and 2) LSTM: two LSTM layers, each with 128 cells, followed by a densely connected layer with a softmax activation.
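A minimal Keras sketch of a shallow CNN-LSTM of this kind is given below. The kernel size (3x3), pooling size (2x2), the assumed number of frequency bands, and the way the 512-dimensional feature vector is unrolled into a short sequence for the stacked LSTMs are all assumptions, since those details are not recoverable from the description above; the layer counts and sizes follow the text.

```python
from tensorflow.keras import layers, models

N_CHANNELS, N_BANDS, N_CLASSES = 20, 24, 7   # N_BANDS is an assumed value

def build_shallow_rcnn():
    """Sketch of a shallow CNN-LSTM of the type described above.
    Kernel size (3x3), pooling size (2x2) and the reshaping of the dense
    features into a sequence for the LSTMs are assumptions."""
    inputs = layers.Input(shape=(N_CHANNELS, N_BANDS, 1))
    x = layers.Conv2D(32, (3, 3), padding='same', activation='relu')(inputs)
    x = layers.Conv2D(32, (3, 3), padding='same', activation='relu')(x)
    x = layers.MaxPooling2D((2, 2))(x)
    x = layers.Dense(512, activation='relu')(layers.Flatten()(x))
    x = layers.Dropout(0.5)(x)                 # dropout in the fully connected layer
    x = layers.Reshape((4, 128))(x)            # 512 = 4 x 128, assumed unrolling
    x = layers.LSTM(128, return_sequences=True)(x)
    x = layers.LSTM(128)(x)
    outputs = layers.Dense(N_CLASSES, activation='softmax')(x)
    return models.Model(inputs, outputs)

model = build_shallow_rcnn()
model.summary()
```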

II-B2 Baseline methods for seizure type classification

The following methods are compared directly with the proposed approach:

  • Traditional machine learning techniques: K-Nearest Neighbors, SGD, XGBoost, and AdaBoost classifiers were proposed in [35].

  • Traditional CNNs: a residual network ResNet50 was retrained to perform classification in [35]. Three pretrained models, AlexNet, VGG16 and VGG19, were used in [42] to solve the classification problem. However, an additional class of non-seizure events was included in this publication.

  • SeizureNet [7]: the authors proposed two sub-networks, a deep convolutional network (multiple bottleneck convolutions interconnected through dense connections) and a classification network.

II-C Neural memory networks and neural plasticity

The design of the network architecture for the task of seizure type classification from EEG recordings is displayed in Fig. 1. This approach aims to update the external memory model (a memory stack for information storage) with new information from each data sample, and as such the memory learns to store distinctive characteristics of each seizure type across patients. First, to model short-term relationships within each data sample, we use LSTMs. To extract the relevant attributes through long-term dependencies (across seizures and patients), we employ the proposed neural memory architecture. The seizure classification output is generated using a dense layer with a softmax activation.

The neural memory architecture is composed of a memory stack $M \in \mathbb{R}^{k \times l}$, with $l$ memory slots each with an embedding size $k$, and its respective input, output and update controllers. Each of these controllers is composed of an LSTM cell, following [32, 17]. The input controller receives the encoded hidden state from the stacked LSTMs, $h_t$, at time instant $t$ and generates a query vector, $q_t$, to retrieve the salient information from the knowledge stored in the memory. We generate an attention score vector $a_t$ to quantify the similarity between $q_t$ and the content of each slot of $M_{t-1}$. Then, the output controller retrieves the memory output, $r_t$, for the current state. We pass this resultant embedding through the update controller to generate an update vector $m_t$, which is used to update the memory and propagate it to the next time step. We update the content of each memory slot based on the informativeness reflected in the score vector $a_t$ [16]. We define the input, output and update operations such that,

$q_t = f^{\mathrm{LSTM}}_{\mathrm{in}}(h_t)$   (1)
$a_t = \mathrm{softmax}(q_t^{\top} M_{t-1})$   (2)
$r_t = f^{\mathrm{LSTM}}_{\mathrm{out}}(M_{t-1} a_t^{\top})$   (3)
$m_t = f^{\mathrm{LSTM}}_{\mathrm{up}}(r_t)$   (4)
$M_t = M_{t-1} \odot \big(J - (a_t \otimes e_k)^{\top}\big) + (m_t \otimes e_l) \odot (a_t \otimes e_k)^{\top}$   (5)

where $J$ is a matrix of ones, $e_k$ and $e_l$ are vectors of ones, $\otimes$ denotes the outer vector product, which duplicates its left vector $k$ or $l$ times to form a matrix, and $\odot$ denotes element-wise multiplication. Ideally, we expect that the memory output $r_t$ should capture salient information from both the input and the stored history that can be used to estimate each type of seizure.
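As a concrete illustration of this data flow, a minimal numpy sketch of one read/update step under the notation above follows; the LSTM controllers are replaced by simple linear maps purely for brevity, so this is a sketch of the memory mechanics rather than the exact controllers.

```python
import numpy as np

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

def memory_step(h_t, M, W_in, W_out, W_up):
    """One read/update step of the external memory (Eqs. 1-5), with the LSTM
    controllers replaced by linear maps for brevity (an assumption).
    M has shape [k, l] (embedding size k, l slots); h_t, q_t, r_t, m_t are length-k."""
    q_t = W_in @ h_t                      # input controller: query vector (Eq. 1)
    a_t = softmax(q_t @ M)                # attention over the l memory slots (Eq. 2)
    r_t = W_out @ (M @ a_t)               # output controller: memory output (Eq. 3)
    m_t = W_up @ r_t                      # update controller: update vector (Eq. 4)
    # Erase each slot in proportion to its attention score, then write m_t there (Eq. 5)
    M_new = M * (1.0 - a_t[None, :]) + np.outer(m_t, a_t)
    return r_t, M_new

# Toy example: l = 8 memory slots, embedding size k = 16
k, l = 16, 8
rng = np.random.default_rng(0)
M = rng.standard_normal((k, l))
W_in, W_out, W_up = (rng.standard_normal((k, k)) * 0.1 for _ in range(3))
r_t, M = memory_step(rng.standard_normal(k), M, W_in, W_out, W_up)
print(r_t.shape, M.shape)    # (16,) (16, 8)
```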

Inspired by the success of [31] in demonstrating how neural plasticity can be optimized by gradient descent in recurrent networks, we adopt neural plasticity to enhance the memory access mechanisms in the memory model. To inject plasticity into the memory components, we adopt the formulation of the Hebbian rule for its flexibility and simplicity ("neurons that fire together, wire together") [31]. We define a fixed component (a traditional connection weight $w_{i,j}$) and a plastic component for each pair of neurons $i$ and $j$, where the plastic component is stored in a Hebbian trace $\mathrm{Hebb}_{i,j}$, which evolves over time based on the inputs and outputs. The Hebbian trace is simply a running average of the product of pre- and post-synaptic activity. Thus, the network equations for the output $x_j(t)$ of neuron $j$ are:

$x_j(t) = \sigma\Big( \sum_{i} \big[ w_{i,j} + \alpha_{i,j}\,\mathrm{Hebb}_{i,j}(t) \big]\, x_i(t-1) \Big)$   (6)
$\mathrm{Hebb}_{i,j}(t+1) = \eta\, x_i(t-1)\, x_j(t) + (1-\eta)\,\mathrm{Hebb}_{i,j}(t)$   (7)

Here, $\alpha_{i,j}$ controls the contribution from the fixed and plastic terms of a particular weight connection, and $\eta$ is the learning rate of the plastic components. Thus, we replace the fixed connection weights of the input, output and update controllers with these fixed-plus-plastic connections to produce a plastic neural memory such that,

$q_t = \hat{f}^{\mathrm{LSTM}}_{\mathrm{in}}(h_t)$   (8)
$r_t = \hat{f}^{\mathrm{LSTM}}_{\mathrm{out}}(M_{t-1} a_t^{\top})$   (9)
$m_t = \hat{f}^{\mathrm{LSTM}}_{\mathrm{up}}(r_t)$   (10)
$M_t = M_{t-1} \odot \big(J - (a_t \otimes e_k)^{\top}\big) + (m_t \otimes e_l) \odot (a_t \otimes e_k)^{\top}$   (11)

where $\hat{f}^{\mathrm{LSTM}}$ denotes an LSTM controller whose connection weights combine the fixed and plastic components defined in Eqs. (6) and (7).

Further technical information on neural memory networks and plasticity can be found in [32, 18, 17, 16].
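To make the plastic update concrete, the following sketch applies the Hebbian rule of Eqs. (6)-(7) to a single dense connection layer; treating the layer as a stand-alone dense map, rather than as part of an LSTM controller, is a simplification purely for illustration.

```python
import numpy as np

class PlasticDense:
    """A dense layer whose effective weights combine a fixed component w and a
    plastic component alpha * Hebb, with the Hebbian trace updated as in
    Eqs. (6)-(7). Using a plain dense map instead of an LSTM controller is a
    simplification for illustration."""

    def __init__(self, n_in, n_out, eta=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.standard_normal((n_in, n_out)) * 0.1      # fixed weights
        self.alpha = rng.standard_normal((n_in, n_out)) * 0.1  # plasticity gains
        self.hebb = np.zeros((n_in, n_out))                    # Hebbian trace
        self.eta = eta                                         # plasticity learning rate

    def forward(self, x):
        # Effective weight = fixed + plastic component (Eq. 6)
        y = np.tanh(x @ (self.w + self.alpha * self.hebb))
        # Running average of the product of pre- and post-synaptic activity (Eq. 7)
        self.hebb = self.eta * np.outer(x, y) + (1.0 - self.eta) * self.hebb
        return y

layer = PlasticDense(n_in=16, n_out=16)
x = np.random.default_rng(1).standard_normal(16)
for _ in range(3):          # the trace evolves with each forward pass
    y = layer.forward(x)
print(y.shape)              # (16,)
```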

III Evaluation

III-A Experimental setup

All models were assessed through a 5-fold cross validation (CV) strategy to ensure that the data used for hyperparameter tuning and the data used to test the algorithm were disjoint. For each fold, we used 80% of the data samples of each seizure type for training the models, and 20% for testing. We used a weighted F1 score to measure performance, which is more informative than accuracy given the uneven class distribution. The F1 score is the harmonic mean of precision and recall; therefore, it takes both false positives and false negatives into account. In the weighted F1 score, we weight the F1 score of each class by the number of samples from that class.
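A minimal sketch of this evaluation protocol is shown below. The use of StratifiedKFold is one way to realise the per-class 80/20 split described above (an assumption about the exact splitting procedure), and the random predictions are a stand-in for a trained model.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import f1_score

# Dummy data standing in for the FFT-based data samples and seizure-type labels
X = np.random.randn(1000, 20 * 24)
y = np.random.randint(0, 7, size=1000)

scores = []
for train_idx, test_idx in StratifiedKFold(n_splits=5, shuffle=True,
                                           random_state=0).split(X, y):
    # A real run would train a model on X[train_idx], y[train_idx] here;
    # random predictions are used only so the sketch runs end to end.
    y_pred = np.random.randint(0, 7, size=len(test_idx))
    # Per-class F1 scores weighted by the number of true samples in each class
    scores.append(f1_score(y[test_idx], y_pred, average='weighted'))
print(np.mean(scores))
```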

For each traditional deep learning model adapted for the task of seizure type classification, we follow the specifications provided by the original authors and train the architecture by optimizing the categorical cross-entropy loss. Our proposed shallow RCNN model was also trained by optimizing the categorical cross-entropy loss. We used the Adam optimizer [25] with a fixed learning rate, and decay rates for the first and second moments of 0.9 and 0.999 respectively. For regularization, we employed dropout with a probability of 50% in the fully connected layer. The batch size was set to 32. We trained the model over 150 epochs using the default initialization parameters from Keras [12].
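Continuing from the build_shallow_rcnn() sketch above, a Keras training configuration matching this description might look as follows. The learning rate value is not recoverable from the text, so the Keras default of 1e-3 is used purely as a placeholder, and the dummy arrays stand in for the real FFT features and one-hot labels.

```python
import numpy as np
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.utils import to_categorical

model = build_shallow_rcnn()   # defined in the RCNN sketch above

# Dummy data: [#samples, #channels, #frequency bands, 1] and one-hot labels
X_train = np.random.randn(256, 20, 24, 1)
y_train = to_categorical(np.random.randint(0, 7, 256), num_classes=7)

# 1e-3 is a placeholder learning rate; the moment decay rates follow the text
model.compile(optimizer=Adam(learning_rate=1e-3, beta_1=0.9, beta_2=0.999),
              loss='categorical_crossentropy',
              metrics=['accuracy'])
model.fit(X_train, y_train, batch_size=32, epochs=150)
```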

For the proposed plastic NMN model, we also adopt the Adam optimizer and the categorical cross-entropy loss, and train for 50 epochs. The hyper-parameters (the hidden state dimension, the memory length, and the learning rate of plasticity) were evaluated experimentally, and the values providing the best accuracy on the validation set were chosen.

III-B Classification of epilepsy types

Table II summarizes the results of seizure type classification on the TUH EEG Seizure Corpus with the IBM TUSZ pre-processed data using our proposed framework, along with the baseline methods and the methodologies adapted from the seizure detection domain. It is evident that, through the utilization of the proposed external memory model with read and write mechanisms augmented by plasticity, we are able to achieve superior classification results. Fig. 2 shows the normalized confusion matrices of the seven types of seizure for the proposed Plastic NMN method. The normalized confusion matrix allows the visualization of performance, where each column of the matrix represents the instances in a predicted class and each row represents the instances in an actual class. From this confusion matrix, we can identify that the most difficult seizure types to discriminate are those which have a small number of seizure recordings available for training and testing (i.e. SPSZ, ABSZ, TNSZ and TCSZ).

To qualitatively illustrate the significance of the salient information and what the model has learned in terms of its activations, we randomly sample 500 inputs from the test set, apply PCA [24], and plot the top two components in 2D. The embeddings are extracted from the last LSTM layer in the RCNN model and from the external memory in the plastic NMN. Fig. 3 and Fig. 4 depict the resultant plots, where each seizure type is indicated based on the ground truth class identity. We observe a clear separation between the seven types of seizure using the memory embeddings compared to the features learnt by our proposed RCNN model or SeizureNet [7]. This clearly demonstrates that the resultant sparse vectors are sufficient to discriminate between classes with simple classifiers.
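A short sketch of this visualisation step is given below; the random embeddings and labels are placeholders for the extracted memory (or LSTM) embeddings and the ground-truth seizure types.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

# Placeholders for 500 extracted embeddings (dimension 128 assumed) and labels
rng = np.random.default_rng(0)
embeddings = rng.standard_normal((500, 128))
labels = rng.integers(0, 7, size=500)
class_names = ['FNSZ', 'GNSZ', 'SPSZ', 'CPSZ', 'ABSZ', 'TNSZ', 'TCSZ']

# Project onto the top two principal components and plot per class
top2 = PCA(n_components=2).fit_transform(embeddings)
for c, name in enumerate(class_names):
    idx = labels == c
    plt.scatter(top2[idx, 0], top2[idx, 1], s=8, label=name)
plt.legend()
plt.title('Top two principal components of the extracted embeddings')
plt.show()
```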

Baseline methods Weighted-F1 score
Adaboost [35] 0.509
SGD [35] 0.649
XGBoost [35] 0.782
KNN [35] 0.884
CNN (ResNet50) [35] 0.723
CNN (AlexNet) [42] 0.802
SeizureNet [7] 0.900
Baseline from adapted methods Weighted-F1 score
SAE (based on [29]) 0.675
CNN (based on [1]) 0.716
CNN (based on  [33]) 0.826
CNN (based on [23]) 0.901
LSTM (based on [5]) 0.692
LSTM (based on [44]) 0.701
CNN-LSTM (based on [43]) 0.795
CNN-LSTM (based on [21]) 0.831
CNN-LSTM (this work) 0.824
Proposed framework Weighted-F1 score
Plastic NMN (this work) 0.945
TABLE II: Cross-validation performance for the classification of seizure types, using a sampling frequency of 24 Hz.
Fig. 2: Normalized confusion matrices for seizure type classification on the TUH EEG Seizure Corpus for the proposed Plastic NMN model.

IV Discussion

Several studies have demonstrated that machine learning models, specifically deep learning networks, can successfully detect and/or predict the onset of seizures from scalp and intracranial EEG. Although such models may be useful in identifying biomarkers of an existing epileptic condition, they are rarely of use for discriminating between different types of seizures. In this paper, we have evaluated traditional deep learning methods proposed in the epilepsy domain for cross-patient seizure type classification, and we have improved on existing reported results by presenting a neural memory network based framework.

We note that RCNNs reached better performance than models based on CNNs or RNNs alone for the task of seizure type classification, which mirrors their relative performance reported for the seizure detection task. Given the inherent temporal structure of EEGs, we expected that recurrent networks would be more widely employed than models that do not consider time dependencies. However, almost half of the models proposed in the epilepsy domain have used CNNs. This observation supports recent discussions regarding the effectiveness of CNNs for processing time series data [8]. Another finding of our study of baseline models is that the proposed shallow RCNN performed as well as deep CNN models. This supports other research that has preferred shallow networks for analysing EEG data. Schirrmeister et al. [38] focused on this aspect, comparing the performance of architectures with different depths and structures, and showed that shallower fully convolutional models outperformed their deeper counterparts. However, we note that hyperparameter tuning of baseline models may be key to using deeper architectures with physiological recordings.

The potential of recurrent neural networks to handle sequence information was evident in the experimental results. However, it is essential to consider historic behaviour over the full length of seizures, and to map long-term dependencies between seizures, to generate more precise classifications. The process of capturing seizure behaviour is highly complex because of the increased heterogeneity of participants and the temporal evolution during epileptic seizures. Analyzing dynamic changes during a seizure is a major aspect of epilepsy patient assessment. Even though an RNN model has the ability to capture temporal information, it considers only the relationships within the current sequence due to its internal memory structure, making accurate long-term prediction intractable. RNNs such as LSTMs or GRUs exhibit one common limitation related to their storage capacity, because their internal state is modified, heavily or slightly, at each computation step. By incorporating a neural memory network, we are able to increase the model's storage capacity without having to increase the size of the model, as demonstrated by [32], who compared neural memory networks, which map long-term dependencies among the stored facts, with LSTMs, which map dependencies within the input sequence.

The memory network proposed in this paper is capable of capturing both short-term (within each data sample) and long-term (across the entire collection of data samples) relationships to predict a seizure type (i.e. working memory and long-term memory). Therefore, our proposed system eliminates the deficiencies of current baseline models in epilepsy classification, which only consider within-sequence relationships. An additional benefit of the implemented memory network is that we have introduced the concept of synaptic plasticity through the read and write operations of the learnable controllers. We apply local plasticity rules (a Hebbian trace) to update the feed-forward synaptic weights following feedback projection. The plastic nature of the memory access mechanisms in the neural memory model allows our system to provide a varying level of attention to the stored information, i.e. the plastic network acts as a content-addressable memory.

To allow comparison with baseline methods [35, 7, 42], we defined the classification task here similarly: to separate the seven classes of seizure labeled in the Corpus. As noted above, these classes are not actually clinically disjoint, but form a hierarchy; this semantic structure is not exploited by the present method. Where, for example in Fig. 4, more specific seizure classes such as ABSZ are readily separated from the more general class GNSZ, this may indicate overfitting due to the small number of distinct patients for some seizure classes in the Corpus. This shows the value of continuing to expand the seizure corpus with more patients in future work.

Fig. 3: 2D illustration of the extracted embeddings from the CNN-LSTM model for 500 randomly selected samples from the test set.
Fig. 4: 2D illustration of the extracted memory embeddings from the Plastic NMN for 500 randomly selected samples from the test set.

V Conclusion

This paper presents a deep learning based framework which consists of a neural memory network with neural plasticity for EEG-based seizure type classification. A brief overview of commonly used deep learning approaches in the epilepsy domain is also presented. The proposed approach is capable of modelling long-term relationships, which enables the model to learn rich and highly discriminative features for seizure type classification. With increasing computational capabilities and the collection of larger datasets, clinicians and researchers will increasingly benefit from the significant progress already made in applying these techniques to epilepsy. An accurate classification of seizures, along with neuroimaging and behavioural analysis, is one step towards more accurate prognosis. In future, we plan to investigate the introduction of the memory component to map relationships directly from raw intracranial EEG recordings without a preprocessing phase, i.e. without extracting information contained in the frequency transform of the time-series EEG.

References

  • [1] U. R. Acharya, S. L. Oh, Y. Hagiwara, J. H. Tan, and H. Adeli (2018) Deep convolutional neural network for the automated detection and diagnosis of seizure using eeg signals. Computers in biology and medicine 100, pp. 270–278. Cited by: 2nd item, §II, TABLE II.
  • [2] D. Ahmedt-Aristizabal, S. Denman, K. Nguyen, S. Sridharan, S. Dionisio, and C. Fookes (2019) Understanding patients’ behavior: vision-based analysis of seizure disorders. IEEE journal of biomedical and health informatics. Cited by: §II-B1, §II-B.
  • [3] D. Ahmedt-Aristizabal, C. Fookes, S. Denman, K. Nguyen, S. Sridharan, and S. Dionisio (2019) Aberrant epileptic seizure identification: a computer vision perspective. Seizure 65, pp. 65–71. Cited by: §II-B1.
  • [4] D. Ahmedt-Aristizabal, C. Fookes, S. Dionisio, K. Nguyen, J. P. S. Cunha, and S. Sridharan (2017) Automated analysis of seizure semiology and brain electrical activity in presurgery evaluation of epilepsy: a focused survey. Epilepsia 58 (11), pp. 1817–1831. Cited by: §I, §II-B.
  • [5] D. Ahmedt-Aristizabal, C. Fookes, K. Nguyen, and S. Sridharan (2018) Deep classification of epileptic signals. In EMBC, pp. 332–335. Cited by: §I, 3rd item, §II, TABLE II.
  • [6] A. Antoniades, L. Spyrou, C. C. Took, and S. Sanei (2016) Deep learning for epileptic intracranial eeg data. In MLSP, pp. 1–6. Cited by: 2nd item.
  • [7] U. Asif, S. Roy, J. Tang, and S. Harrer (2019) SeizureNet: a deep convolutional neural network for accurate seizure type classification and seizure detection. arXiv preprint arXiv:1903.03232. Cited by: §I, 3rd item, §II, §III-B, TABLE II, §IV.
  • [8] S. Bai, J. Z. Kolter, and V. Koltun (2018) An empirical evaluation of generic convolutional and recurrent networks for sequence modeling. arXiv preprint arXiv:1803.01271. Cited by: §IV.
  • [9] P. Bashivan, I. Rish, M. Yeasin, and N. Codella (2015) Learning representations from eeg with deep recurrent-convolutional neural networks. arXiv preprint arXiv:1511.06448. Cited by: 4th item.
  • [10] Q. Chen, X. Zhu, Z. Ling, S. Wei, and H. Jiang (2016) Enhancing and combining sequential and tree lstm for natural language inference. arXiv preprint arXiv:1609.06038. Cited by: §I.
  • [11] K. Cho, B. Van Merriënboer, C. Gulcehre, D. Bahdanau, F. Bougares, H. Schwenk, and Y. Bengio (2014) Learning phrase representations using rnn encoder-decoder for statistical machine translation. arXiv preprint arXiv:1406.1078. Cited by: §I, 3rd item.
  • [12] F. Chollet et al. (2017) Keras. Cited by: §III-A.
  • [13] A. Craik, Y. He, and J. L. P. Contreras-Vidal (2019) Deep learning for electroencephalogram (EEG) classification tasks: a review. Journal of neural engineering. Cited by: §I.
  • [14] J. Donahue, L. Anne Hendricks, S. Guadarrama, M. Rohrbach, S. Venugopalan, K. Saenko, and T. Darrell (2015) Long-term recurrent convolutional networks for visual recognition and description. In CVPR, pp. 2625–2634. Cited by: 4th item.
  • [15] O. Faust, Y. Hagiwara, T. J. Hong, O. S. Lih, and U. R. Acharya (2018) Deep learning for healthcare applications based on physiological signals: a review. Computer methods and programs in biomedicine 161, pp. 1–13. Cited by: §I.
  • [16] T. Fernando, S. Denman, D. Ahmedt-Aristizabal, S. Sridharan, K. Laurens, P. Johnston, and C. Fookes (2019) Neural memory plasticity for anomaly detection. arXiv preprint arXiv:1910.05448. Cited by: §I, §I, §II-C, §II-C.
  • [17] T. Fernando, S. Denman, A. McFadyen, S. Sridharan, and C. Fookes (2018) Tree memory networks for modelling long-term temporal dependencies. Neurocomputing 304, pp. 64–81. Cited by: §I, §II-C, §II-C.
  • [18] T. Fernando, S. Denman, S. Sridharan, and C. Fookes (2018) Task specific visual saliency prediction with memory augmented conditional generative adversarial networks. In WACV, pp. 1539–1548. Cited by: §I, §II-C.
  • [19] R. S. Fisher, J. H. Cross, J. A. French, N. Higurashi, E. Hirsch, F. E. Jansen, L. Lagae, S. L. Moshé, J. Peltola, E. Roulet Perez, et al. (2017) Operational classification of seizure types by the international league against epilepsy: position paper of the ilae commission for classification and terminology. Epilepsia 58 (4), pp. 522–530. Cited by: §I.
  • [20] M. Golmohammadi, A. H. Harati Nejad Torbati, S. Lopez de Diego, I. Obeid, and J. Picone (2019) Automatic analysis of eegs using big data and hybrid deep learning architectures. Frontiers in human neuroscience 13, pp. 76. Cited by: 1st item.
  • [21] M. Golmohammadi, S. Ziyabari, V. Shah, E. Von Weltin, C. Campbell, I. Obeid, and J. Picone (2017) Gated recurrent networks for seizure detection. In SPMB, pp. 1–5. Cited by: §I, 4th item, §II, TABLE II.
  • [22] K. Greff, R. K. Srivastava, J. Koutník, B. R. Steunebrink, and J. Schmidhuber (2016) LSTM: a search space odyssey. IEEE transactions on neural networks and learning systems 28 (10), pp. 2222–2232. Cited by: §I, 3rd item.
  • [23] Y. Hao, H. M. Khoo, N. von Ellenrieder, N. Zazubovits, and J. Gotman (2018) DeepIED: an epileptic discharge detector for eeg-fmri based on deep learning. NeuroImage: Clinical 17, pp. 962–975. Cited by: 2nd item, §II, TABLE II.
  • [24] I. Jolliffe (2011) Principal component analysis. Springer. Cited by: §III-B.
  • [25] D. P. Kingma and J. Ba (2014) Adam: a method for stochastic optimization. arXiv preprint arXiv:1412.6980. Cited by: §III-A.
  • [26] L. Kuhlmann, K. Lehnertz, M. P. Richardson, B. Schelter, and H. P. Zaveri (2018) Seizure prediction—ready for a new era. Nature Reviews Neurology, pp. 1. Cited by: §I.
  • [27] H. Larochelle, Y. Bengio, J. Louradour, and P. Lamblin (2009) Exploring strategies for training deep neural networks. Journal of machine learning research 10 (Jan), pp. 1–40. Cited by: 1st item.
  • [28] Y. LeCun, Y. Bengio, and G. Hinton (2015) Deep learning. Nature 521 (7553), pp. 436–444. Cited by: 2nd item.
  • [29] Q. Lin, S. Ye, X. Huang, S. Li, M. Zhang, Y. Xue, and W. Chen (2016) Classification of epileptic eeg signals with stacked sparse autoencoder based on deep learning. In ICIC, pp. 802–810. Cited by: 1st item, §II, TABLE II.
  • [30] Z. C. Lipton, J. Berkowitz, and C. Elkan (2015) A critical review of recurrent neural networks for sequence learning. arXiv preprint arXiv:1506.00019. Cited by: 3rd item.
  • [31] T. Miconi, K. Stanley, and J. Clune (2018) Differentiable plasticity: training plastic neural networks with backpropagation. In ICML, pp. 3556–3565. Cited by: §I, §I, §II-C.
  • [32] T. Munkhdalai and H. Yu (2017) Neural semantic encoders. In Proceedings of the conference. Association for Computational Linguistics. Meeting, Vol. 1, pp. 397. Cited by: §I, §II-C, §II-C, §IV.
  • [33] A. O’Shea, G. Lightbody, G. Boylan, and A. Temko (2018) Investigating the impact of cnn depth on neonatal seizure detection performance. In EMBC, pp. 5862–5865. Cited by: 2nd item, §II, TABLE II.
  • [34] A. Page, C. Shea, and T. Mohsenin (2016) Wearable seizure detection using convolutional neural networks with transfer learning. In ISCAS, pp. 1086–1089. Cited by: 2nd item.
  • [35] S. Roy, U. Asif, J. Tang, and S. Harrer (2019) Machine learning for seizure type classification: setting the benchmark. arXiv preprint arXiv:1902.01012. Cited by: item 2, §I, 1st item, 2nd item, §II-A, §II-B, §II, TABLE II, §IV.
  • [36] I. R. D. Saputro, N. D. Maryati, S. R. Solihati, I. Wijayanto, S. Hadiyoso, and R. Patmasari (2019) Seizure type classification on EEG signal using support vector machine. In Journal of Physics: Conference Series, Vol. 1201, pp. 012065. Cited by: §I.
  • [37] I. R. D. Saputro, R. Patmasari, and S. Hadiyoso (2018) Tonic clonic seizure classification based on EEG signal using artificial neural network method. In SOFTT, Cited by: §I.
  • [38] R. T. Schirrmeister, J. T. Springenberg, L. D. J. Fiederer, M. Glasstetter, K. Eggensperger, M. Tangermann, F. Hutter, W. Burgard, and T. Ball (2017) Deep learning with convolutional neural networks for eeg decoding and visualization. Human brain mapping 38 (11), pp. 5391–5420. Cited by: §IV.
  • [39] V. Shah, M. Golmohammadi, S. Ziyabari, E. Von Weltin, I. Obeid, and J. Picone (2017) Optimizing channel selection for seizure detection. In SPMB, pp. 1–5. Cited by: 4th item, §II-A.
  • [40] V. Shah, E. Von Weltin, S. Lopez de Diego, J. R. McHugh, L. Veloso, M. Golmohammadi, I. Obeid, and J. Picone (2018) The temple university hospital seizure detection corpus. Frontiers in Neuroinformatics 12, pp. 83. Cited by: item 2, §I, §II-A.
  • [41] X. Song, L. Aguilar, A. Herb, and S. Yoon (2019) Dynamic modeling and classification of epileptic EEG data. In NER, pp. 49–52. Cited by: §I.
  • [42] N. Sriraam, Y. Temel, S. V. Rao, P. L. Kubben, et al. (2019) A convolutional neural network based framework for classification of seizure types. In EMBC, pp. 2547–2550. Cited by: §I, 2nd item, §II, TABLE II, §IV.
  • [43] P. Thodoroff, J. Pineau, and A. Lim (2016) Learning robust features using deep learning for automatic seizure detection. In MLHC, pp. 178–190. Cited by: §I, 4th item, §II, TABLE II.
  • [44] K. M. Tsiouris, V. C. Pezoulas, M. Zervakis, S. Konitsiotis, D. D. Koutsouris, and D. I. Fotiadis (2018) A long short-term memory deep learning network for the prediction of epileptic seizures using eeg signals. Computers in biology and medicine 99, pp. 24–37. Cited by: 3rd item, §II, TABLE II.
  • [45] I. Ullah, M. Hussain, H. Aboalsamh, et al. (2018) An automated system for epilepsy detection using eeg brain signals based on deep learning approach. Expert Systems with Applications 107, pp. 61–71. Cited by: 2nd item.
  • [46] Z. Wei, J. Zou, J. Zhang, and J. Xu (2019) Automatic epileptic eeg detection using convolutional neural network with improvements in time-domain. Biomedical Signal Processing and Control 53, pp. 101551. Cited by: 2nd item.
  • [47] C. Xiong, S. Merity, and R. Socher (2016) Dynamic memory networks for visual and textual question answering. In ICML, pp. 2397–2406. Cited by: §II.
  • [48] M. Zabihi, S. Kiranyaz, V. Jantti, T. Lipping, and M. Gabbouj (2019) Patient-specific seizure detection using nonlinear dynamics and nullclines. IEEE journal of biomedical and health informatics. Cited by: §I.
  • [49] Y. Zhang, Y. Guo, P. Yang, W. Chen, and B. Lo (2019) Epilepsy seizure prediction on eeg using common spatial pattern and convolutional neural network. IEEE Journal of Biomedical and Health Informatics. Cited by: §I.