Summary: New research shows that deep learning can use EEG signals to distinguish Alzheimer’s disease from frontotemporal dementia with high accuracy. By analyzing both the timing and frequency of brain activity, the model uncovered distinct patterns: broader disruption across multiple regions in Alzheimer’s and more localized frontal and temporal changes in frontotemporal dementia.
The system also estimated disease severity, offering clinicians faster insight than traditional tools. These findings suggest that affordable EEG technology, paired with advanced AI, may streamline diagnosis and personalize care for people experiencing cognitive decline.
Key Facts
- EEG Biomarkers: Slow delta waves in frontal and central areas signaled disease in both conditions.
- Distinct Patterns: Alzheimer’s showed widespread disruption, while frontotemporal dementia remained more localized.
- High Accuracy: A two-stage deep learning system reached 84% accuracy in separating the two disorders.
Source: FAU
Dementia is a group of disorders that gradually impair memory, thinking and daily functioning. Alzheimer’s disease (AD), the most common form of dementia, is estimated to affect about 7.2 million Americans aged 65 and older in 2025.
Frontotemporal dementia (FTD), while rarer, is the second most common cause of early-onset dementia, often striking people in their 40s to 60s.
Although both diseases damage the brain, they do so in distinct ways. AD primarily affects memory and spatial awareness, while FTD targets regions responsible for behavior, personality and language.
Because their symptoms can overlap, misdiagnosis is common. Distinguishing between them is not just a scientific challenge but a clinical necessity, as an accurate diagnosis can profoundly affect treatment, care and quality of life.
MRI and PET scans are effective for diagnosing AD but are costly, time-consuming and require specialized equipment. Electroencephalography (EEG) offers a portable, non-invasive and affordable alternative by measuring brain activity with sensors across various frequency bands.
However, EEG signals are often noisy and vary between individuals, making analysis difficult. Even when machine learning is applied to EEG data, results have been inconsistent, and differentiating AD from FTD remains difficult.
To tackle this issue, researchers from the College of Engineering and Computer Science at Florida Atlantic University have created a deep learning model that detects and evaluates AD and FTD. The model improves the accuracy and interpretability of EEG-based diagnosis by analyzing both frequency- and time-based patterns of brain activity linked to each disease.
The results of the study, published in the journal Biomedical Signal Processing and Control, found that slow delta brain waves were an important biomarker for both AD and FTD, mainly in the frontal and central regions of the brain.
In AD, brain activity was more widely disrupted, also affecting other regions of the brain and frequency bands like beta, indicating more extensive brain damage. These differences help explain why AD is typically easier to detect than FTD.
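The study does not publish code, but band-specific EEG power of the kind described here is commonly estimated with Welch’s method. The sketch below is a generic illustration rather than the authors’ pipeline; the band edges, channel count and sampling rate are assumptions.

```python
# Hypothetical sketch: per-channel EEG band power (delta, theta, alpha, beta)
# via Welch's method. Band definitions and recording parameters are
# illustrative only; they are not taken from the FAU study.
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(eeg, fs, bands=BANDS):
    """eeg: array of shape (n_channels, n_samples); fs: sampling rate in Hz.
    Returns {band name: per-channel power} by integrating the Welch PSD."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs), axis=-1)
    out = {}
    for name, (lo, hi) in bands.items():
        mask = (freqs >= lo) & (freqs < hi)
        out[name] = np.trapz(psd[:, mask], freqs[mask], axis=-1)
    return out

# Example: a synthetic 19-channel, 30-second recording sampled at 256 Hz
rng = np.random.default_rng(0)
powers = band_powers(rng.standard_normal((19, 30 * 256)), fs=256)
print(powers["delta"].shape)  # (19,) -> one delta-power value per channel
```

Features like these, computed per electrode, are what allow frontal and central delta activity to be compared across patient groups.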
The model achieved more than 90% accuracy in distinguishing individuals with dementia (AD or FTD) from cognitively normal participants. It also predicted disease severity with relative errors of less than 35% for AD and about 15.5% for FTD.
Because AD and FTD share similar symptoms and brain activity, telling them apart was difficult. Using feature selection, the researchers boosted the model’s specificity – how well it identified people without the disease – from 26% to 65%.
Their two-stage design – first detecting healthy individuals, then separating AD from FTD – achieved 84% accuracy, ranking among the best EEG-based methods so far.
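As a rough illustration of that two-stage decision flow (not the study’s actual implementation), the sketch below assumes two placeholder classifiers with scikit-learn-style predict_proba methods and a hypothetical decision threshold.

```python
# Hypothetical sketch of a two-stage decision: stage 1 separates cognitively
# normal (CN) participants from dementia, stage 2 separates AD from FTD.
# The classifier objects and threshold are placeholders, not the study's models.
def two_stage_predict(features, stage1_model, stage2_model, cn_threshold=0.5):
    """features: feature matrix for one EEG recording (shape (1, n_features)).
    stage1_model.predict_proba -> probability of dementia vs. CN
    stage2_model.predict_proba -> probability of AD vs. FTD"""
    p_dementia = stage1_model.predict_proba(features)[0, 1]
    if p_dementia < cn_threshold:
        return "CN"
    p_ad = stage2_model.predict_proba(features)[0, 1]
    return "AD" if p_ad >= 0.5 else "FTD"
```

Splitting the easier decision (dementia vs. healthy) from the harder one (AD vs. FTD) lets the second stage work only on the cases where the two diseases must be told apart.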
The model merges convolutional neural networks and attention-based LSTMs to detect both the type and severity of dementia from EEG data. Grad-CAM shows which brain signals influenced the model, helping clinicians understand its decisions.
This approach offers a new view of how brain activity evolves and which regions and frequencies drive diagnosis – something traditional tools rarely capture.
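The paper’s exact architecture is not reproduced here, but a minimal PyTorch sketch in the same spirit, a small CNN whose features feed an attention-weighted LSTM, might look like the following. Layer sizes, the 19-channel montage and the 4-second input window are illustrative assumptions; Grad-CAM would then be applied to the convolutional layers for interpretation.

```python
# Hypothetical sketch of a CNN feeding an attention-pooled LSTM for EEG
# classification. Dimensions are illustrative, not the published model.
import torch
import torch.nn as nn

class CNNAttnLSTM(nn.Module):
    def __init__(self, n_channels=19, n_classes=2, hidden=64):
        super().__init__()
        # 1-D convolutions over time extract local temporal/spectral patterns
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(4),
        )
        # LSTM models longer-range temporal structure of the CNN features
        self.lstm = nn.LSTM(64, hidden, batch_first=True)
        # Simple additive attention pools the LSTM outputs into one vector
        self.attn = nn.Linear(hidden, 1)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                # x: (batch, n_channels, n_samples)
        feats = self.cnn(x)              # (batch, 64, time)
        feats = feats.permute(0, 2, 1)   # (batch, time, 64)
        out, _ = self.lstm(feats)        # (batch, time, hidden)
        weights = torch.softmax(self.attn(out), dim=1)  # (batch, time, 1)
        pooled = (weights * out).sum(dim=1)             # (batch, hidden)
        return self.head(pooled)         # class logits

model = CNNAttnLSTM()
logits = model(torch.randn(2, 19, 1024))  # two 4-second epochs at 256 Hz
print(logits.shape)                       # torch.Size([2, 2])
```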
“What makes our study novel is how we used deep learning to extract both spatial and temporal information from EEG signals,” said Tuan Vo, first author and a doctoral student in the FAU Department of Electrical Engineering and Computer Science.
“By doing this, we can detect subtle brainwave patterns linked to Alzheimer’s and frontotemporal dementia that would otherwise go unnoticed. Our model doesn’t just identify the disease – it also estimates how severe it is, offering a more complete picture of each patient’s condition.”
The findings also revealed that AD tends to be more severe, impacting a wider range of brain areas and leading to lower cognitive scores, while FTD’s effects are more localized to the frontal and temporal lobes.
These insights align with previous neuroimaging studies but add new depth by showing how these patterns appear in EEG data – an inexpensive and noninvasive diagnostic tool.
“Our findings show that Alzheimer’s disease disrupts brain activity more broadly, especially in the frontal, parietal and temporal regions, while frontotemporal dementia mainly affects the frontal and central areas,” said Hanqi Zhuang, Ph.D., co-author and associate dean and professor, FAU Department of Electrical Engineering and Computer Science.
“This difference explains why Alzheimer’s is often easier to detect. However, our work also shows that careful feature selection can significantly improve how well we distinguish FTD from Alzheimer’s.”
Overall, the study shows that deep learning can streamline dementia diagnosis by combining detection and severity assessment in one system, cutting down on lengthy evaluations and giving clinicians real-time tools to track disease progression.
“This work demonstrates how merging engineering, AI and neuroscience can transform how we confront major health challenges,” said Stella Batalama, Ph.D., dean of the College of Engineering and Computer Science.
“With millions affected by Alzheimer’s and frontotemporal dementia, breakthroughs like this open the door to earlier detection, more personalized care, and interventions that can truly improve lives.”
Study co-authors are Ali K. Ibrahim, Ph.D., an assistant professor of teaching; and Chiron Bang, a doctoral student, both with the FAU Department of Electrical Engineering and Computer Science.
Key Questions Answered:
Q: Why are Alzheimer’s disease and frontotemporal dementia so often confused?
A: Their symptoms and EEG signatures often overlap, leading to misdiagnosis without specialized imaging.
Q: How does the new deep learning model improve EEG-based diagnosis?
A: It analyzes spatial and temporal features simultaneously, revealing subtle brainwave differences missed by standard methods.
Q: Can the model also gauge how advanced the disease is?
A: Yes. It estimates severity levels for both conditions, helping clinicians track progression more effectively.
Editorial Notes:
- This article was edited by a Neuroscience News editor.
- Journal paper reviewed in full.
- Additional context added by our staff.
About this AI and neurotech research news
Author: Gisele Galoustian
Source: FAU
Contact: Gisele Galoustian – FAU
Image: The image is credited to Neuroscience News
Original Research: Open access.
“Extraction and interpretation of EEG features for diagnosis and severity prediction of Alzheimer’s Disease and Frontotemporal dementia using deep learning” by Tuan Vo et al. Biomedical Signal Processing and Control
Abstract
Extraction and interpretation of EEG features for diagnosis and severity prediction of Alzheimer’s Disease and Frontotemporal dementia using deep learning
Alzheimer’s Disease (AD) is the most common form of dementia, characterized by progressive cognitive decline and memory loss. Frontotemporal dementia (FTD), the second most common form of dementia, affects the frontal and temporal lobes, causing changes in personality, behavior, and language.
Due to overlapping symptoms, FTD is often misdiagnosed as AD. Although electroencephalography (EEG) is portable, non-invasive, and cost-effective, its diagnostic potential for AD and FTD is limited by the similarities between the two diseases.
To address this, we introduce an EEG-based feature extraction method to identify and predict the severity of AD and FTD using deep learning. Key findings include increased delta band activities in the frontal and central regions as biomarkers.
By extracting temporal and spectral features from EEG signals, our model combines a Convolutional Neural Network with an attention-based Long Short-Term Memory (aLSTM) network, achieving over 90% accuracy in distinguishing AD and FTD from cognitively normal (CN) individuals.
It also predicts severity with relative errors of less than 35% for AD and approximately 15.5% for FTD. Differentiating FTD from AD remains challenging due to shared characteristics.
However, applying a feature selection procedure improves the specificity in separating AD from FTD, increasing it from 26% to 65%. Building on this, we developed a two-stage approach to classify AD, CN, and FTD simultaneously. In this approach, CN is identified first, followed by the differentiation of FTD from AD.
This method achieves an overall accuracy of 84% in classifying AD, CN, and FTD.

