FlowBeats: Gesture‑Based Control Technique For Intelligent Music Interaction System


Authors: Siddhi Pawar, Anuradha Raut, Tanuja Suryawanshi, Shravani Wadghare, Pradnya Satpute

Abstract: Gesture recognition has become an important research area in Human–Computer Interaction (HCI), enabling users to control digital systems with natural hand movements instead of traditional input devices such as keyboards or touch screens. This paper presents FlowBeats, a gesture‑controlled music interaction system that lets users control music playback through simple hand gestures captured by a standard webcam. The system applies computer vision techniques with OpenCV and MediaPipe to detect hand landmarks and classify gestures in real time, and maps the recognized gestures to commands such as play, pause, next track, and previous track. The result is a low‑cost, touchless, and intuitive interface for music control. The paper reviews existing gesture recognition techniques and describes the system architecture, algorithm design, and advantages of the proposed solution. The motivation behind FlowBeats is to provide a more natural and convenient way to interact with multimedia applications: traditional music control requires physical contact with a device, which is not always practical, whereas gesture-based interaction lets users control playback without touching the device, improving accessibility and user comfort. The proposed framework is designed to run efficiently on commonly available hardware such as a standard webcam, delivering smooth interaction and reliable real-time performance. The study also highlights the potential of gesture-based interfaces in future multimedia systems and interactive technologies.
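The gesture-to-command pipeline described above (detect hand landmarks, classify the gesture, map it to a playback action) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the specific mapping from raised-finger counts to commands is an assumption, and the landmark indices follow MediaPipe's 21-point hand model, where in real use the landmarks would come from `mediapipe.solutions.hands` running on webcam frames captured with `cv2.VideoCapture`.

```python
# Hypothetical sketch of a gesture classifier for a system like FlowBeats.
# Each landmark is a normalized (x, y) pair in MediaPipe's 21-point hand
# model; y increases downward in image coordinates.

FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertip indices
FINGER_PIPS = [6, 10, 14, 18]   # the corresponding PIP (middle) joints

def count_extended_fingers(landmarks):
    """A finger counts as extended when its tip sits above its PIP joint."""
    return sum(
        1
        for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
        if landmarks[tip][1] < landmarks[pip][1]
    )

# Assumed command table: number of raised fingers -> playback action.
# The paper does not specify which poses trigger which commands.
COMMANDS = {0: "pause", 1: "play", 2: "next", 3: "previous"}

def classify_gesture(landmarks):
    """Map a 21-point hand landmark list to a playback command string."""
    return COMMANDS.get(count_extended_fingers(landmarks), "none")
```

In a full system, each webcam frame would be passed through MediaPipe's hand detector, the resulting landmark list fed to `classify_gesture`, and the returned command forwarded to the music player, with some debouncing so a held pose does not fire repeatedly.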
