A Comprehensive Study of Fifteen Approaches Including Deep Learning and Probabilistic Models
| Model | Architecture Type | Input Format | Temporal Modeling | Parameters (approx.) | Training Speed | Accuracy | Explainability |
|---|---|---|---|---|---|---|---|
| FFNN | Feature MLP | Statistical features | None | Few (~100-1K) | Fastest | Good | Moderate |
| Transformer | Single-scale Attention | Raw signals | Global | Many (~100K) | Moderate | Excellent | High (attention) |
| 3stageFormer | Multi-scale Attention | Raw signals (3 resolutions) | Multi-scale | Most (>100K) | Slowest | Best | High (hierarchical) |
| CNN | Convolutional | Raw signals | Local | Moderate (~10K-100K) | Fast | Good-Excellent | Moderate |
| LSTM | Recurrent | Raw signals | Sequential | Moderate (~10K-100K) | Moderate | Good-Excellent | High (sequential) |
| Hopfield | Energy-based | Raw signals | Associative | Moderate (~10K-100K) | Moderate | Good-Excellent | Moderate |
| VAE | Variational Autoencoder | Raw signals | Latent factors | Moderate (~10K-100K) | Moderate | Good-Excellent | Highest (latent factors) |
| LTC | Continuous-time Neural ODE | Raw signals | Continuous-time | Moderate (~10K-100K) | Moderate | Good-Excellent | Moderate |
| HMM | Probabilistic Sequence Model | Raw signals (discretized) | Hidden states | Few (~1K) | Fast | Good | Moderate |
| Hierarchical HMM | Multi-level HMM | Raw signals (discretized) | Multi-scale hidden states | Few (~1.5K) | Fast | Good-Excellent | Moderate |
| DBN | Temporal Bayesian Network | Raw signals | Temporal dependencies | Moderate (~50K) | Moderate | Good-Excellent | High (uncertainty) |
| MDP | Sequential Decision Process | Raw signals | Decision process | Few (~5K) | Moderate | Good | Moderate |
| PO-MDP | Partially Observable MDP | Raw signals | Hidden-state decisions | Moderate (~8K) | Moderate | Good | Moderate |
| MRF | Spatial-temporal Graphical Model | Raw signals | Dependency modeling | Moderate (~40K) | Moderate | Good-Excellent | Moderate |
| Granger | Causal Analysis | Raw signals | Causal relationships | Moderate (~30K) | Moderate | Good | High (causal) |
The study systematically compares fifteen distinct machine learning architectures on standardized metrics: accuracy, precision, recall, F1-score, training time, and inference time.
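A minimal sketch of how such metrics can be computed with scikit-learn; the label arrays below are synthetic placeholders rather than the repository's data, and `benchmark.py` may compute them differently:

```python
import time

import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Hypothetical labels and predictions for a 5-class ECG task.
rng = np.random.default_rng(0)
y_true = rng.integers(0, 5, size=1000)
y_pred = rng.integers(0, 5, size=1000)

start = time.perf_counter()
accuracy = accuracy_score(y_true, y_pred)
precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="macro", zero_division=0
)
elapsed = time.perf_counter() - start

print(f"accuracy={accuracy:.3f} precision={precision:.3f} "
      f"recall={recall:.3f} f1={f1:.3f} ({elapsed * 1e3:.1f} ms)")
```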
The Three-Stage Hierarchical Transformer (3stageFormer) is the only model that processes ECG signals at three temporal resolutions (1000, 500, and 250 timesteps), enabling comprehensive pattern recognition.
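The exact architecture lives in `three_stage_former.py`; the sketch below only illustrates the multi-resolution idea, assuming average pooling to derive the 500- and 250-step views, with illustrative layer sizes and a hypothetical five-class output:

```python
import torch
import torch.nn.functional as F
from torch import nn

class ThreeScaleEncoder(nn.Module):
    """Illustrative three-resolution encoder (not the repo's exact 3stageFormer)."""

    def __init__(self, d_model: int = 64, nhead: int = 4):
        super().__init__()
        self.embed = nn.Linear(1, d_model)
        self.stages = nn.ModuleList(
            nn.TransformerEncoder(
                nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
                num_layers=1,
            )
            for _ in range(3)
        )
        self.head = nn.Linear(3 * d_model, 5)  # 5 classes, assumed

    def forward(self, x):  # x: (batch, 1000) raw ECG
        pooled = []
        for stage, factor in zip(self.stages, (1, 2, 4)):  # 1000, 500, 250 steps
            xs = F.avg_pool1d(x.unsqueeze(1), factor).transpose(1, 2)  # (B, T, 1)
            h = stage(self.embed(xs))          # self-attention at this resolution
            pooled.append(h.mean(dim=1))       # average over time
        return self.head(torch.cat(pooled, dim=-1))

logits = ThreeScaleEncoder()(torch.randn(8, 1000))
print(logits.shape)  # torch.Size([8, 5])
```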
The Variational Autoencoder encodes each signal into 21 interpretable latent factors (following the FactorECG approach), enabling both clinical interpretability and generative capabilities.
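A minimal VAE sketch with the 21-dimensional latent space mentioned above; the encoder/decoder sizes and the loss below are assumptions, not the repository's implementation:

```python
import torch
from torch import nn

class ECGVAE(nn.Module):
    """Minimal VAE with 21 latent factors; layer sizes are illustrative."""

    def __init__(self, signal_len: int = 1000, latent_dim: int = 21):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(signal_len, 256), nn.ReLU())
        self.to_mu = nn.Linear(256, latent_dim)
        self.to_logvar = nn.Linear(256, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(), nn.Linear(256, signal_len)
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterize
        return self.decoder(z), mu, logvar

x = torch.randn(4, 1000)
recon, mu, logvar = ECGVAE()(x)
# ELBO objective: reconstruction error plus KL divergence to the unit Gaussian prior
kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=1).mean()
loss = nn.functional.mse_loss(recon, x) + kl
```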
The CNN offers the best balance between accuracy and computational efficiency, making it well suited to practical deployment.
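The repository's CNN is in `cnn_lstm_ecg.py`; the following is only a generic sketch of a compact 1D CNN of this kind, with channel widths, kernel sizes, and a five-class head as illustrative assumptions:

```python
import torch
from torch import nn

# Illustrative compact 1-D CNN; all hyperparameters are assumptions.
cnn = nn.Sequential(
    nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
    nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(4),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
    nn.Linear(32, 5),  # 5 classes, assumed
)
logits = cnn(torch.randn(8, 1, 1000))  # (batch, channels, timesteps)
print(logits.shape)  # torch.Size([8, 5])
```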
The Hopfield Network provides pattern completion and noise robustness through its energy-based associative memory.
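To make the associative-memory mechanism concrete, here is a classical Hopfield network in NumPy (Hebbian storage, sign updates) completing a corrupted pattern; the pattern count and dimensionality are arbitrary, and `hopfield_ecg.py` may use a different (e.g., modern continuous) formulation:

```python
import numpy as np

rng = np.random.default_rng(1)
patterns = rng.choice([-1, 1], size=(3, 200))   # 3 stored +/-1 "beat templates"

# Hebbian weights: W = (1/N) * sum_p x_p x_p^T, with zero diagonal.
W = patterns.T @ patterns / patterns.shape[1]
np.fill_diagonal(W, 0.0)

# Corrupt a stored pattern by flipping 20% of its entries.
probe = patterns[0].copy()
flip = rng.choice(200, size=40, replace=False)
probe[flip] *= -1

# Sign updates descend the energy toward the nearest stored pattern.
for _ in range(10):
    probe = np.sign(W @ probe)
    probe[probe == 0] = 1

print("recovered:", np.array_equal(probe, patterns[0]))
```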
The LSTM performs bidirectional sequential processing with explicit memory gates, supporting rhythm analysis and temporal pattern recognition.
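A minimal bidirectional LSTM classifier sketch; hidden size, depth, and class count are assumptions:

```python
import torch
from torch import nn

class BiLSTMClassifier(nn.Module):
    """Illustrative bidirectional LSTM; hyperparameters are assumptions."""

    def __init__(self, hidden: int = 64, num_classes: int = 5):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, num_layers=2,
                            batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, num_classes)  # forward + backward states

    def forward(self, x):                 # x: (batch, timesteps, 1)
        out, _ = self.lstm(x)             # gated memory over the sequence
        return self.head(out[:, -1, :])   # classify from the final timestep

logits = BiLSTMClassifier()(torch.randn(8, 1000, 1))
print(logits.shape)  # torch.Size([8, 5])
```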
The Liquid Time-Constant (LTC) network models ECG signals as continuous-time processes using neural ODEs whose time constants adapt to the input, capturing both fast and slow temporal patterns.
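A hedged NumPy sketch of one LTC cell step using the fused explicit-Euler update from the original LTC formulation (Hasani et al., 2021); all weights and dimensions below are illustrative, not the repository's:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ltc_step(x, inp, W_in, W_rec, b, tau, A, dt=0.01):
    # Dynamics: dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A,
    # so the effective time constant 1 / (1/tau + f) adapts to the input.
    f = sigmoid(W_in @ inp + W_rec @ x + b)
    # Fused explicit-Euler step (closed-form, numerically stable update).
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

rng = np.random.default_rng(2)
n_neurons, n_inputs = 16, 1
W_in = rng.normal(size=(n_neurons, n_inputs))
W_rec = rng.normal(size=(n_neurons, n_neurons)) * 0.1
b = np.zeros(n_neurons)
tau = np.ones(n_neurons)   # base time constants
A = np.ones(n_neurons)     # target/reversal levels

x = np.zeros(n_neurons)
for sample in rng.normal(size=(1000, n_inputs)):  # stand-in for one ECG trace
    x = ltc_step(x, sample, W_in, W_rec, b, tau, A)
print(x[:4])
```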
3stageFormer achieves the highest accuracy through multi-scale hierarchical processing. The Transformer delivers excellent accuracy with global attention. CNN, LSTM, VAE, Hopfield, and LTC offer competitive accuracy with different architectural strengths.
The FFNN is the fastest to train and run, making it well suited to real-time applications. The CNN provides the best accuracy-efficiency balance. The 3stageFormer is the slowest but the most accurate.
The VAE offers the highest explainability through its interpretable latent factors. The Transformer and 3stageFormer provide attention-based interpretability, and the LSTM's sequential processing is itself interpretable.
Models that process raw signals (all except the FFNN) generalize better. The 3stageFormer excels at multi-scale patterns, and the Hopfield network shows superior noise robustness.
```bash
# Install dependencies and run the full benchmark
pip install -r requirements.txt
python benchmark.py
```
```bash
# Feedforward Neural Network
python neural_network.py

# Transformer Model
python transformer_ecg.py

# Three-Stage Hierarchical Transformer
python three_stage_former.py

# CNN and LSTM
python cnn_lstm_ecg.py

# Hopfield Network
python hopfield_ecg.py

# Variational Autoencoder
python vae_ecg.py

# Liquid Time-Constant Network
python ltc_ecg.py

# Hidden Markov Model
python hmm_ecg.py

# Dynamic Bayesian Network
python dbn_ecg.py

# Markov Decision Process / PO-MDP
python mdp_ecg.py

# Markov Random Field
python mrf_ecg.py

# Granger Causality
python granger_ecg.py
```
If you use this code or findings, please cite:
```bibtex
@article{chandra2025ecg,
  title={Comparative Analysis of Machine Learning Architectures for ECG Classification: A Comprehensive Study of Fifteen Approaches Including Deep Learning and Probabilistic Models},
  author={Chandra, Shyamal Suhana},
  journal={Sapana Micro Software Research},
  year={2025},
  note={Implementation and benchmarking of FFNN, Transformer, 3stageFormer, CNN, LSTM, Hopfield, VAE, LTC, HMM, Hierarchical HMM, DBN, MDP, PO-MDP, MRF, and Granger Causality architectures}
}
```
This work builds upon the following foundational research: