Kullback-Leibler divergence in feature selection: a methodology for improved detection of heart valve disorders

Written on 16/05/2026
by V H Preetha

Phys Eng Sci Med. 2026 May 16. doi: 10.1007/s13246-026-01737-z. Online ahead of print.

ABSTRACT

Dimensionality reduction is crucial for the effective management of high-dimensional datasets, particularly in the healthcare industry. Feature selection identifies the most relevant attributes, reducing computational overhead and ensuring robust performance, especially in resource-constrained environments. This study introduces a Kullback-Leibler Divergence (KLD)-based feature selection method for heart sound analysis to diagnose valvular heart diseases. Mel-frequency cepstral coefficients and mel spectrograms were extracted from the dataset as input features. The KLD was applied to identify the most informative features, which were subsequently validated using various classifiers. This approach resulted in an accuracy of 99% in diagnosing five distinct heart sounds, outperforming classifiers using the full feature set. The method prioritizes critical features, leading to improved performance across all evaluated classifiers. The study also targets heart sound classification on an embedded platform, enabling efficient analysis and accurate diagnosis of cardiovascular conditions. These findings highlight the potential of KLD-based feature selection for the real-time detection of heart valve disorders. By reducing the processing overhead while preserving classification accuracy, this approach supports the development of efficient, cost-effective edge-based tools, ultimately improving diagnostic precision and healthcare resource efficiency.
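
As an illustration only (the abstract does not give the authors' implementation), the minimal Python sketch below shows one plausible way a KLD score could rank features such as pooled MFCC statistics between classes of heart sound recordings. It assumes class-conditional distributions are approximated as univariate Gaussians per feature; the function names (gaussian_kld, kld_feature_scores), the Gaussian approximation, and the top-k cutoff are illustrative assumptions, not details taken from the paper.

    import numpy as np

    def gaussian_kld(mu_p, var_p, mu_q, var_q):
        # KL(p || q) between two univariate Gaussians, evaluated per feature.
        return 0.5 * (np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)

    def kld_feature_scores(X, y):
        # Score each feature by the mean symmetric KLD between the
        # class-conditional Gaussian approximations of its distribution.
        # X: (n_samples, n_features) feature matrix, e.g. per-recording MFCC summaries.
        # y: (n_samples,) integer class labels (five heart sound classes in the study).
        classes = np.unique(y)
        scores = np.zeros(X.shape[1])
        eps = 1e-8  # numerical floor to avoid division by zero variances
        n_pairs = 0
        for i, a in enumerate(classes):
            for b in classes[i + 1:]:
                Xa, Xb = X[y == a], X[y == b]
                mu_a, var_a = Xa.mean(axis=0), Xa.var(axis=0) + eps
                mu_b, var_b = Xb.mean(axis=0), Xb.var(axis=0) + eps
                scores += gaussian_kld(mu_a, var_a, mu_b, var_b)
                scores += gaussian_kld(mu_b, var_b, mu_a, var_a)
                n_pairs += 1
        return scores / (2 * n_pairs)

    # Usage sketch: keep the top-k ranked features, then train any classifier on them.
    # scores = kld_feature_scores(X_train, y_train)
    # top_k = np.argsort(scores)[::-1][:20]   # k = 20 is an arbitrary illustrative choice
    # X_reduced = X_train[:, top_k]

Ranking features by a divergence between class-conditional distributions is a standard filter-style selection strategy; the actual feature set, divergence estimator, and cutoff used in the paper may differ.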

PMID:42143178 | DOI:10.1007/s13246-026-01737-z