Mathematical Evaluation of Deep Learning Architecture with Feature Fusion for Cervical Cancer Detection Classification
Abstract
Cervical cancer remains one of the leading causes of cancer-related death among women worldwide, underscoring the need for accurate and timely diagnostic methods. This study investigates how the accuracy of cervical cancer detection can be improved with a deep learning-based classification system built on feature fusion. The proposed approach combines classical feature extraction methods, namely Histogram of Oriented Gradients (HOG) and Scale-Invariant Feature Transform (SIFT), with deep learning architectures, namely Graph Convolutional Networks (GCNs) and Capsule Networks, each of which contributes distinct strengths to the fusion model. HOG and SIFT capture salient texture and structural features and remain robust to changes in imaging conditions. GCNs learn relational information between image regions, providing a richer understanding of spatial context that is especially important in medical imaging. Capsule Networks, in turn, model the hierarchical patterns in cervical cell images by encoding the spatial hierarchies and rotational variations that are critical for cancer detection. The proposed fusion model integrates these multi-level features, exploiting the complementary strengths of deep learning and traditional feature extraction. Validation on standard cervical cancer image datasets showed that this fusion approach classifies with substantially higher accuracy and precision than the individual methods. Quantitative evaluation also showed higher sensitivity and specificity, indicating a reduced rate of false negatives and false positives, which is essential in clinical diagnosis. These results demonstrate that the feature fusion model not only provides a robust and accurate approach to cervical cancer classification but also lays the groundwork for applications in other complex areas of medical imaging.
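For concreteness, the sketch below illustrates the general fusion idea described in the abstract: handcrafted HOG and SIFT descriptors are concatenated with a deep embedding (e.g. from a GCN or Capsule branch) before classification. The class name, pooling choice, and layer sizes (FusionClassifier, mean-pooled SIFT descriptors, the 256-unit hidden layer) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of handcrafted + deep feature fusion (assumed structure, not the paper's code).
import numpy as np
import cv2
import torch
import torch.nn as nn
from skimage.feature import hog

def handcrafted_features(gray_img, sift_dim=128):
    """HOG descriptor plus mean-pooled SIFT descriptors of one grayscale image (uint8)."""
    hog_vec = hog(gray_img, pixels_per_cell=(16, 16), cells_per_block=(2, 2))
    sift = cv2.SIFT_create()
    _, desc = sift.detectAndCompute(gray_img, None)
    sift_vec = desc.mean(axis=0) if desc is not None else np.zeros(sift_dim)
    return np.concatenate([hog_vec, sift_vec]).astype(np.float32)

class FusionClassifier(nn.Module):
    """Concatenates a deep embedding (e.g. from a GCN or Capsule branch)
    with the handcrafted vector and classifies the fused representation."""
    def __init__(self, deep_dim, handcrafted_dim, n_classes=2):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(deep_dim + handcrafted_dim, 256),
            nn.ReLU(),
            nn.Linear(256, n_classes),
        )

    def forward(self, deep_feat, handcrafted_feat):
        fused = torch.cat([deep_feat, handcrafted_feat], dim=1)
        return self.head(fused)
```

In this reading, the deep branch and the handcrafted descriptors contribute complementary information, and only the fused vector is passed to the final classifier; the actual branch architectures and fusion strategy are detailed in the full paper.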