Learning Spatial and Temporal EEG Patterns for Alzheimer’s Disease Detection with CNN-LSTM Networks
Abstract
Alzheimer's disease (AD) is a progressive neurological disorder that severely impairs memory and cognition. Early detection of AD is critical for effective treatment. Electroencephalography (EEG) has emerged as a promising, non-invasive, and low-cost method for detecting brain abnormalities associated with AD. However, conventional machine learning techniques rely on manually engineered features and struggle to capture the complex spatial and temporal patterns in EEG data. This paper presents a hybrid deep learning model that combines Convolutional Neural Networks (CNN) and Long Short-Term Memory (LSTM) networks to automatically learn spatial and temporal patterns from EEG data and classify them. Using a publicly available EEG dataset containing recordings from Alzheimer's patients and healthy controls, we developed a robust filtering pipeline and trained the CNN-LSTM model to discriminate between the two groups. The model achieved an accuracy of 93.2%, a precision of 91.5%, a recall of 94.8%, and an F1-score of 93.1%, outperforming baselines including SVM, Random Forest, CNN-only, and LSTM-only architectures. These results show that combining spatial and temporal feature extraction enables accurate EEG-based AD detection, with strong potential for real-time and clinical diagnostic use.
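To make the described pipeline concrete, the following is a minimal PyTorch sketch of a CNN-LSTM hybrid of the kind the abstract outlines: a 1-D convolution learns filters across the EEG channels, and an LSTM models the temporal dynamics of the resulting feature sequence. The layer sizes, the 19-channel montage, and the 256-sample epoch length are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

class CNNLSTM(nn.Module):
    """Hypothetical CNN-LSTM sketch for binary EEG classification
    (AD vs. healthy control). All hyperparameters are illustrative."""

    def __init__(self, n_channels=19, n_classes=2,
                 cnn_filters=32, lstm_hidden=64):
        super().__init__()
        # Spatial feature extraction: convolve across the channel dimension
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, cnn_filters, kernel_size=7, padding=3),
            nn.BatchNorm1d(cnn_filters),
            nn.ReLU(),
            nn.MaxPool1d(4),  # downsample the time axis
        )
        # Temporal modelling of the CNN feature sequence
        self.lstm = nn.LSTM(cnn_filters, lstm_hidden, batch_first=True)
        self.fc = nn.Linear(lstm_hidden, n_classes)

    def forward(self, x):
        # x: (batch, channels, time) -- one EEG epoch per sample
        feats = self.cnn(x)             # (batch, filters, time // 4)
        feats = feats.permute(0, 2, 1)  # (batch, time // 4, filters)
        _, (h_n, _) = self.lstm(feats)  # last hidden state summarises the epoch
        return self.fc(h_n[-1])         # logits for the two classes

# Shape check on a random batch: 8 epochs, 19 channels, 256 samples each
logits = CNNLSTM()(torch.randn(8, 19, 256))
print(logits.shape)  # torch.Size([8, 2])
```

In practice the logits would be trained with cross-entropy against the AD/control labels after the filtering and epoching steps described above.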