Virtual Mouse Using Hand & Eye Gesture and Chatbot


Authors: Prof. Supriya G Purohit, Mr. Tanveer Ahmed, Mr. Syed Adnan, Mr. Mohammed Zaid Noman

Abstract: In an increasingly digital world, the need for accessible and intuitive human-computer interaction (HCI) solutions is more critical than ever—especially for individuals with physical disabilities. This project proposes a Virtual Mouse System that integrates hand gestures, eye tracking, and an AI-powered chatbot to offer a seamless, multimodal interface for touch-free computing. By combining real-time computer vision, deep learning, and natural language processing (NLP), the system replaces traditional input devices such as keyboards and mice with a more inclusive and efficient alternative. The hand gesture module utilizes OpenCV, MediaPipe, and Convolutional Neural Networks (CNNs) to detect and interpret finger movements and predefined gestures for cursor movement, clicking, and scrolling. The eye-tracking module employs Haar cascade classifiers, the Hough Transform, and Eye Aspect Ratio (EAR) techniques to track gaze and blinks for cursor control and selection, enabling hands-free navigation. To enhance interactivity, a chatbot module powered by NLP models such as BERT or GPT handles voice- and text-based queries for performing system-level commands and basic computational tasks. Communication among these modules is managed through a Flask-based backend, ensuring synchronized, responsive interaction. Designed for both general users and those with motor impairments, the system addresses limitations found in standalone gesture- or voice-based solutions, such as lighting sensitivity, gesture misrecognition, and voice command errors. By integrating multiple input modalities, the system enhances accuracy, usability, and user autonomy. Applications span accessibility tools, smart environments, virtual reality, and beyond. Future work includes improving gaze estimation through deep learning and enhancing chatbot capabilities for broader conversational interaction.
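The Eye Aspect Ratio (EAR) blink detection mentioned in the abstract has a standard formulation: given six eye landmarks p1..p6 (p1 and p4 at the horizontal corners, p2/p3 on the upper lid, p6/p5 on the lower lid), EAR = (‖p2−p6‖ + ‖p3−p5‖) / (2·‖p1−p4‖), and a blink is registered when the ratio drops below a threshold. A minimal sketch of that computation in Python follows; the 0.2 threshold is an illustrative default, not a value stated in the paper, and would be tuned per user and camera in a real system:

```python
import math

def eye_aspect_ratio(landmarks):
    """Compute EAR from six (x, y) eye landmarks in the order p1..p6.

    p1, p4: horizontal eye corners; p2, p3: upper lid; p6, p5: lower lid.
    An open eye yields a larger ratio; a closed eye drives it toward zero.
    """
    p1, p2, p3, p4, p5, p6 = landmarks
    vertical = math.dist(p2, p6) + math.dist(p3, p5)   # lid-to-lid distances
    horizontal = math.dist(p1, p4)                     # corner-to-corner width
    return vertical / (2.0 * horizontal)

def is_blink(ear, threshold=0.2):
    # Illustrative threshold; in practice a blink is usually confirmed only
    # after EAR stays below the threshold for several consecutive frames.
    return ear < threshold
```

In the full system, the six landmarks would come from a face-landmark detector each frame, and a confirmed blink would trigger the click action in place of a physical mouse button.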

DOI: https://doi.org/10.5281/zenodo.18169212
