

Gender Detection from Hand Palm Images: A PySpark-Driven Approach with VGG19 Feature Extraction and MLP Classification
This paper presents a comprehensive methodology for gender detection from hand palm images, leveraging image processing techniques and PySpark for scalable and efficient processing. The approach encompasses a meticulous image preprocessing pipeline incorporating essential stages such as grayscale conversion, application of the Difference of Gaussians (DoG) filter, and adaptive histogram equalization. This pipeline not only refines image features but also ensures scalability, accommodating large datasets seamlessly. After the hand images are preprocessed, the VGG19 model is employed as a powerful feature extractor, using its pre-trained convolutional layers to capture high-level abstract features from the palm images. The final step trains a Multilayer Perceptron (MLP), a feedforward neural network for classification tasks in PySpark, on the tabular data and extracted features. The MLP, with its ability to learn complex patterns, proves effective in accurately predicting gender categories from the extracted features. The model achieved 94% training accuracy and 87% testing accuracy, indicating that it generalizes well to unseen data. Additional evaluation metrics were also computed: Precision 0.87, Recall 0.87, F1 Score 0.87, and AUC 0.92. © 2024 IEEE.
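The preprocessing stages named above (grayscale conversion, DoG filtering, histogram equalization) can be sketched as follows. This is an illustrative reconstruction, not the paper's code: the sigma values are assumptions, and a global rank-based equalization stands in for the adaptive (tile-based) variant, which in practice is usually done with OpenCV's CLAHE.

```python
import numpy as np
from scipy.ndimage import gaussian_filter


def preprocess_palm(image_rgb, sigma_low=1.0, sigma_high=2.0):
    """Sketch of the preprocessing pipeline described in the abstract.

    The paper does not specify its filter parameters; sigma_low and
    sigma_high here are illustrative choices.
    """
    # 1. Grayscale conversion (standard luminosity weights)
    gray = image_rgb @ np.array([0.299, 0.587, 0.114])

    # 2. Difference of Gaussians: subtract a heavily blurred copy from a
    #    lightly blurred one to emphasize palm lines and ridges
    dog = gaussian_filter(gray, sigma_low) - gaussian_filter(gray, sigma_high)

    # 3. Histogram equalization via rank normalization, a global stand-in
    #    for the adaptive variant used in the paper
    flat = dog.ravel()
    ranks = np.argsort(np.argsort(flat))
    equalized = (ranks / (flat.size - 1)).reshape(dog.shape)
    return equalized
```

The output is a single-channel image with values spread uniformly over [0, 1], ready to be fed to the feature extractor.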
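The classification step, training Spark's feedforward MLP on extracted feature vectors, might look like the sketch below. The layer sizes, seed, and toy four-dimensional features are assumptions for illustration; the paper's actual architecture and the dimensionality of its VGG19 features are not stated in the abstract.

```python
from pyspark.sql import SparkSession
from pyspark.ml.classification import MultilayerPerceptronClassifier
from pyspark.ml.evaluation import MulticlassClassificationEvaluator
from pyspark.ml.linalg import Vectors

# Local session for demonstration; a real run would use a cluster master.
spark = (SparkSession.builder
         .master("local[1]")
         .appName("palm-gender-mlp")
         .getOrCreate())

# Stand-in rows of (features, label); in the paper these would be the
# VGG19 activations per palm image with a binary gender label.
rows = [(Vectors.dense([0.0, 0.1, 0.2, 0.3]), 0.0),
        (Vectors.dense([0.9, 0.8, 0.7, 0.6]), 1.0)] * 10
df = spark.createDataFrame(rows, ["features", "label"])

# layers = [input_dim, hidden_dim, n_classes]; sizes are illustrative
mlp = MultilayerPerceptronClassifier(layers=[4, 8, 2], seed=42, maxIter=50)
model = mlp.fit(df)
preds = model.transform(df)

acc = MulticlassClassificationEvaluator(metricName="accuracy").evaluate(preds)
spark.stop()
```

The same `MulticlassClassificationEvaluator` can be reconfigured (via `metricName`) to report weighted precision, recall, and F1, mirroring the metrics reported in the abstract.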