TLDR: This research introduces an explainable machine learning framework to detect student disengagement from voluntary online quizzes in higher distance education. By analyzing student interaction data, the framework identifies three key behavioral patterns (erratic, delayed, irregular) and categorizes students into risk levels. Achieving 91% balanced accuracy, the system provides actionable insights for educators to implement tailored interventions, fostering better student engagement and academic success.
In the evolving landscape of higher distance education, keeping students engaged in their studies is a significant challenge. Disengagement, particularly from non-mandatory tasks like voluntary quizzes, can lead to serious consequences, including academic dropout. A new research paper explores an innovative approach to tackle this issue: using explainable machine learning to detect and understand student disengagement.
The study, conducted across 42 online courses over four semesters at a distance-based university, focused on identifying patterns in student log data from Moodle, a popular learning management system. The researchers meticulously extracted and processed informative student interaction data, such as quiz attempts, activity timestamps, and periods of inactivity. This rich dataset formed the foundation for their machine learning models.
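As a rough illustration, engagement features like these can be derived from raw log rows. The log schema and feature names below are hypothetical stand-ins, not the paper's actual pipeline:

```python
from datetime import datetime

# Hypothetical Moodle-style log rows: (student_id, event, ISO timestamp).
LOG = [
    ("s1", "quiz_attempt", "2024-03-04T10:00:00"),  # a Monday
    ("s1", "quiz_view",    "2024-03-09T22:30:00"),  # a Saturday
    ("s1", "quiz_attempt", "2024-03-21T09:15:00"),
]

def extract_features(rows):
    """Derive simple engagement features from one student's log rows."""
    times = sorted(datetime.fromisoformat(t) for _, _, t in rows)
    gaps = [(b - a).total_seconds() / 86400 for a, b in zip(times, times[1:])]
    weekend = sum(t.weekday() >= 5 for t in times)
    return {
        "n_events": len(times),
        # Long inactivity after starting: a "delayed" signal.
        "max_inactivity_days": max(gaps, default=0.0),
        # Unpredictable spacing between interactions: an "irregular" signal.
        "gap_spread_days": max(gaps, default=0.0) - min(gaps, default=0.0),
        # Weekday/weekend imbalance: an "erratic" signal.
        "weekend_share": weekend / len(times),
    }

feats = extract_features(LOG)
print(feats)
```

Features in this spirit (counts, inactivity gaps, timing spread) are what the models downstream consume.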
At the core of their solution is an explainable machine learning framework. After comparing eight different machine learning algorithms, a Neural Network (NN) model emerged as the most effective, achieving a balanced accuracy of 91% in detecting disengagement. In practice, the model correctly identified approximately 85% of disengaged students, a crucial step towards timely intervention.
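Balanced accuracy is the mean of per-class recall, which keeps the 91% figure meaningful on imbalanced data, where disengaged students are typically the minority. A minimal stdlib sketch of the metric:

```python
def balanced_accuracy(y_true, y_pred):
    """Mean of per-class recall: robust to class imbalance."""
    recalls = []
    for c in set(y_true):
        idx = [i for i, y in enumerate(y_true) if y == c]
        recalls.append(sum(y_pred[i] == c for i in idx) / len(idx))
    return sum(recalls) / len(recalls)

# Imbalanced toy data: 8 engaged (0) vs 2 disengaged (1).
y_true = [0] * 8 + [1] * 2
y_pred = [0] * 8 + [1, 0]   # model catches only 1 of 2 disengaged students
print(balanced_accuracy(y_true, y_pred))  # 0.75, though plain accuracy is 0.9
```

A majority-class predictor would score 90% plain accuracy here but only 50% balanced accuracy, which is why the latter is the fairer yardstick for disengagement detection.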
What makes this research particularly impactful is its emphasis on ‘explainability’. Unlike traditional black-box AI models, this framework uses the SHAP (SHapley Additive exPlanations) method to help educators understand *why* a student is predicted to be disengaged. This transparency is vital for designing effective support strategies. The study identified three key behavioral patterns linked to disengagement:
Types of Disengagement

- Erratic Behavior: Characterized by inconsistent activity patterns, such as significant differences in engagement between weekdays and weekends, indicating poor time management.
- Delayed Behavior: Marked by long periods of inactivity after initially starting a task, suggesting procrastination or a lack of motivation.
- Irregular Behavior: Evidenced by unpredictable intervals between interactions, indicating hesitation or uncertainty about completing the quiz.
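SHAP attributes a prediction to individual features via Shapley values. For a toy risk score standing in for the paper's trained neural network (the coefficients and baseline values below are made up for illustration), exact Shapley values can be computed by enumerating feature coalitions, which is what the SHAP library approximates at scale:

```python
from itertools import combinations
from math import factorial

FEATURES = ["max_inactivity_days", "gap_spread_days", "weekend_share"]
# Hypothetical "typical student" baseline used for absent features.
BASELINE = {"max_inactivity_days": 2.0, "gap_spread_days": 1.0, "weekend_share": 0.2}

def risk_score(x):
    """Toy linear disengagement score (stand-in for the paper's NN)."""
    return (0.05 * x["max_inactivity_days"]
            + 0.04 * x["gap_spread_days"]
            + 0.3 * x["weekend_share"])

def shapley_values(x):
    """Exact Shapley values: each feature's marginal contribution to the
    score, averaged over all coalitions of the other features."""
    n = len(FEATURES)
    phi = {}
    for f in FEATURES:
        others = [g for g in FEATURES if g != f]
        total = 0.0
        for k in range(n):
            for coal in combinations(others, k):
                with_f = {g: x[g] if (g in coal or g == f) else BASELINE[g]
                          for g in FEATURES}
                without = {g: x[g] if g in coal else BASELINE[g]
                           for g in FEATURES}
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (risk_score(with_f) - risk_score(without))
        phi[f] = total
    return phi

student = {"max_inactivity_days": 12.0, "gap_spread_days": 6.0, "weekend_share": 0.6}
print(shapley_values(student))
```

The per-feature values sum exactly to the gap between this student's score and the baseline score, so an educator can read off which behavior pattern drove the prediction.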
Based on these behavioral patterns, the framework categorizes students into high, medium, and low-risk levels of disengagement. For instance, a student exhibiting both erratic and delayed behaviors is considered high-risk, while irregular behavior alone might indicate a low risk.
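The article's two examples suggest a simple rule-of-thumb mapping. The medium-risk rule in the sketch below is an assumption for illustration, since only the high- and low-risk cases are spelled out:

```python
def risk_level(erratic: bool, delayed: bool, irregular: bool) -> str:
    """Map detected behavior patterns to a risk tier, following the
    article's examples; the medium-risk rule is an assumed placeholder."""
    if erratic and delayed:
        return "high"      # both patterns together: high risk (per the article)
    if erratic or delayed:
        return "medium"    # assumption: a single strong pattern
    if irregular:
        return "low"       # irregular behavior alone: low risk (per the article)
    return "none"

print(risk_level(erratic=True, delayed=True, irregular=False))   # high
print(risk_level(erratic=False, delayed=False, irregular=True))  # low
```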
The paper doesn’t stop at detection; it also proposes tailored intervention strategies aligned with each risk level. For high-risk students, it recommends immediate, structured support such as fixed study time slots and personalized motivational messages. Medium-risk students might benefit from gamification elements or progressive deadlines, while low-risk students could be helped by self-reflection tools and individualized time slot suggestions.
This research offers a powerful tool for educators in distance learning. By accurately detecting disengagement and providing clear explanations for these predictions, it enables instructors to offer timely, personalized support, ultimately helping students stay on track and succeed in their academic journeys.


