Human-Computer Interaction (HCI) technology has evolved tremendously, and the computer mouse remains one of its major innovations. While wireless and Bluetooth mice have become common, they still depend on batteries and, in many cases, USB dongles, so they are not truly "device-free." To address this limitation and push the boundaries of touch-free computing, we introduce a groundbreaking solution: a hand-gesture-controlled virtual mouse.
Our virtual mouse uses a webcam or built-in camera to track hand movements with computer vision. Built in Python and powered by the OpenCV library, the system performs precise hand tracking and translates gestures into mouse actions. With the integration of MediaPipe, Pynput, Autopy, and PyAutoGUI, it supports left-clicks, right-clicks, scrolling, and more, entirely through hand gestures.
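To give a feel for how this fits together, here is a minimal sketch of the gesture-tracking loop using MediaPipe's Hands solution, OpenCV, and PyAutoGUI. The pinch threshold and the click behavior below are illustrative assumptions, not the project's exact gesture mappings:

```python
import math

import cv2
import mediapipe as mp
import pyautogui

mp_hands = mp.solutions.hands
screen_w, screen_h = pyautogui.size()

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.flip(frame, 1)  # mirror the image so movement feels natural
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            hand = results.multi_hand_landmarks[0]
            index_tip = hand.landmark[8]  # index fingertip in MediaPipe's hand model
            thumb_tip = hand.landmark[4]  # thumb tip
            # Map normalized camera coordinates onto the screen, clamped away
            # from the corners so PyAutoGUI's fail-safe is not tripped.
            x = min(max(int(index_tip.x * screen_w), 1), screen_w - 2)
            y = min(max(int(index_tip.y * screen_h), 1), screen_h - 2)
            pyautogui.moveTo(x, y)
            # Illustrative pinch gesture: thumb close to index triggers a click.
            # A real implementation would debounce this to avoid repeat clicks.
            if math.hypot(index_tip.x - thumb_tip.x,
                          index_tip.y - thumb_tip.y) < 0.04:
                pyautogui.click()
        cv2.imshow("AI Virtual Mouse (sketch)", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
            break
cap.release()
cv2.destroyAllWindows()
```

The same landmark distances can drive right-clicks and scrolling by choosing different finger pairs or counting raised fingers; the sketch shows only cursor movement and a single pinch-click.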
To further enhance the interaction, we integrate a voice assistant into the system. Using speech recognition and language-processing algorithms, the assistant listens for specific commands, filters out background noise, and offers a seamless, efficient experience.
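A minimal sketch of such a command loop, assuming the SpeechRecognition package with its free Google web recognizer (the command phrases below are placeholders, not the project's actual vocabulary):

```python
import speech_recognition as sr

recognizer = sr.Recognizer()

def listen_for_command() -> str:
    """Capture one utterance from the default microphone and transcribe it."""
    with sr.Microphone() as source:
        # Sample ambient sound first so steady background noise is ignored.
        recognizer.adjust_for_ambient_noise(source, duration=0.5)
        audio = recognizer.listen(source)
    try:
        return recognizer.recognize_google(audio).lower()
    except sr.UnknownValueError:
        return ""  # speech was unintelligible
    except sr.RequestError:
        return ""  # recognition service was unreachable

while True:
    command = listen_for_command()
    if "scroll down" in command:
        print("scrolling down")  # e.g. pyautogui.scroll(-200)
    elif "stop listening" in command:
        break                    # placeholder exit phrase
```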
In essence, our AI Virtual Mouse project represents the future of touch-free computing. It aims to enhance the efficiency and interactivity of computer systems, leading the way to a more natural and intuitive HCI experience.
Experience the AI Virtual Mouse in action! Watch the demo video showcasing the hand gesture control and voice assistant functionalities.