NeuroFocusAI


Authors: V. Mounica, Peteti Anuneha, Shaik Abdul Karimulla, Sadhu B.S.V.V.N.S.R. Prasanth, Rangineni Sai Swarup, Badviti Sai Deepak

Abstract: Student engagement monitoring in modern classroom and online learning environments presents a significant challenge, as traditional attendance-based systems measure physical presence but fail to quantify cognitive attention. This paper presents NeuroFocusAI, an AI-based student concentration monitoring system that evaluates real-time attention levels using a multi-modal analysis pipeline comprising facial landmark tracking, eye gaze estimation, blink detection, emotion recognition, and environmental noise analysis. The system processes live webcam input using the MediaPipe FaceMesh model, which detects 468 facial landmark points to enable precise iris-based gaze tracking and Eye Aspect Ratio (EAR) blink detection. Emotional state classification is performed using the DeepFace library across six emotion categories. Environmental noise levels are concurrently measured using Root Mean Square (RMS) audio signal processing via the SoundDevice library. A weighted scoring algorithm combines gaze direction (60%), emotion state (20%), and environmental noise (20%) to compute a concentration score between 0 and 100, which is stored periodically for session analytics. The backend is implemented using FastAPI, with SQLite as the persistent data store, and a React.js-based dashboard provides real-time analytics for both students and teachers. Experimental results demonstrate that the system accurately classifies student attention into three levels — High Focus (80–100), Moderate Focus (60–79), and Low Focus (0–59) — with significant improvements over traditional attendance-based engagement measurement.
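The core computations described in the abstract — the Eye Aspect Ratio used for blink detection, the RMS measure of environmental noise, and the weighted 60/20/20 concentration score with its three focus bands — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names and the assumption that each sub-score is already normalized to a 0–100 scale are hypothetical.

```python
import math

def eye_aspect_ratio(p1, p2, p3, p4, p5, p6):
    """Standard EAR formula from six (x, y) eye landmarks:
    (|p2-p6| + |p3-p5|) / (2 * |p1-p4|). A low EAR indicates a closed eye."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def rms_level(samples):
    """Root Mean Square amplitude of an audio frame (list of samples),
    used here as a proxy for environmental noise."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def concentration_score(gaze_score, emotion_score, noise_score):
    """Weighted combination from the paper: gaze 60%, emotion 20%,
    noise 20%. Inputs are assumed pre-normalized to [0, 100]."""
    score = 0.6 * gaze_score + 0.2 * emotion_score + 0.2 * noise_score
    return max(0.0, min(100.0, score))

def focus_level(score):
    """Map a 0-100 concentration score to the paper's three bands."""
    if score >= 80:
        return "High Focus"
    if score >= 60:
        return "Moderate Focus"
    return "Low Focus"
```

For example, a student looking at the screen (gaze 90) with a neutral emotion score of 70 in a moderately noisy room (noise 50) would receive 0.6·90 + 0.2·70 + 0.2·50 = 78, i.e. Moderate Focus.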

DOI:
