%A D. Bhavana
%A K. Kishore Kumar
%A Medasani Bipin Chandra
%A P.V. Sai Krishna Bhargav
%A D. Joy Sanjanaa
%A G. Mohan Gopi
%T Hand Sign Recognition using CNN
%0 Journal Article
%D 2021
%J Int J Performability Eng
%R 10.23940/ijpe.21.03.p7.314321
%P 314-321
%V 17
%N 3
%U https://www.ijpe-online.com/CN/abstract/article_4559.shtml
%8 2021-03-27
%X

Our aim is to produce a model that can recognize hand gestures and signs. We train a simple gesture-recognition model for sign language conversion, which will help people converse with those who are congenitally deaf or mentally disabled. This project could be implemented with several methods, such as k-nearest neighbors (KNN), logistic regression, naïve Bayes classification, or a support vector machine (SVM); we chose a convolutional neural network (CNN) because it gives better accuracy than the other methods. A computer program written in Python is used to train the model based on the CNN algorithm. The program recognizes hand gestures by comparing the input against a pre-existing dataset built from American Sign Language. Sign language is converted into text output so that users can understand the signs presented by the signer. The model is implemented in JupyterLab, which is included with the Anaconda platform. To further improve the system, camera input is converted to black and white and processed with background subtraction. With a mask tuned to detect human skin, the model does not require a plain background and can be deployed with a basic camera and a computing device.
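
The article itself does not include source code; the following is a minimal illustrative sketch, in Python with OpenCV and Keras, of how the pipeline described in the abstract (skin-color masking of camera frames, conversion to black and white, and a small CNN classifier over ASL signs) might be assembled. The architecture, image size, HSV skin thresholds, and the names IMG_SIZE, NUM_CLASSES, skin_mask, and build_model are assumptions for illustration, not taken from the paper.

    import cv2
    import numpy as np
    from tensorflow import keras
    from tensorflow.keras import layers

    IMG_SIZE = 64        # assumed input resolution
    NUM_CLASSES = 26     # assumed: one class per ASL letter

    def skin_mask(frame_bgr):
        """Keep only skin-colored pixels so a plain background is not required."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        # Rough HSV skin range; these bounds are an assumption and need tuning.
        lower, upper = np.array([0, 40, 60]), np.array([25, 255, 255])
        mask = cv2.inRange(hsv, lower, upper)
        masked = cv2.bitwise_and(frame_bgr, frame_bgr, mask=mask)
        gray = cv2.cvtColor(masked, cv2.COLOR_BGR2GRAY)   # black-and-white input
        return cv2.resize(gray, (IMG_SIZE, IMG_SIZE)) / 255.0

    def build_model():
        """Small CNN of the kind the abstract describes (architecture assumed)."""
        return keras.Sequential([
            layers.Input(shape=(IMG_SIZE, IMG_SIZE, 1)),
            layers.Conv2D(32, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Conv2D(64, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Flatten(),
            layers.Dense(128, activation="relu"),
            layers.Dense(NUM_CLASSES, activation="softmax"),
        ])

    model = build_model()
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    # model.fit(train_images, train_labels, epochs=10)  # dataset loading omitted

For the background-subtraction step mentioned in the abstract, OpenCV's cv2.createBackgroundSubtractorMOG2() is one readily available option that could be applied to camera frames before the skin mask.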