Aligning Educational Stakeholder Perceptions of Learner Profiling with Explainable AI
DOI: https://doi.org/10.3991/ijet.v20i04.55861

Keywords: Learning Analytics, Learner Profile, Explainable AI, Conceptual Modeling, Stakeholder Requirements, Machine Learning

Abstract
The education community continuously develops learner profile (LP) models to support decision-making in learning analytics (LA). However, a gap persists in aligning abstract stakeholder requirements with complex machine learning (ML) patterns. Educational stakeholders such as decision-makers, educators, and pedagogical engineers perceive and categorize LPs (e.g., learners in difficulty, active/inactive learners, learners in progress, and success-oriented vs. at-risk learners) through mental models. These mental models reflect real-world perceptions and pedagogical practices grounded in common educational concepts. To bridge this gap, data scientists must ensure that technical ML insights align with stakeholder needs by selecting relevant features, addressing explainability, mitigating biases, and validating patterns against domain assumptions. For example, a learner who generates extensive log data through repeated solution attempts may appear engaged from a human perspective yet exhibit disengagement according to unexpected ML-discovered patterns, highlighting biases in data interpretation or human perception. We propose Req2XAI (From Requirements to Explainable Machine Learning Models), a framework that establishes a bidirectional mapping between stakeholder requirements for LP analysis and ML-driven learner profiles. Req2XAI externalizes stakeholders' mental models of LPs into requirements and goals via a conceptual model, and formalizes an end-to-end workflow from stakeholder objectives to explainable ML models, ensuring transparency at each stage. A proof-of-concept prototype is implemented through a use case based on the requirements of the Steering Committee of the écri+ project. This work also identifies open research challenges associated with the Req2XAI framework that merit further exploration.
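To make the kind of requirement-to-profile mapping described in the abstract concrete, the following minimal sketch (not the paper's Req2XAI implementation; feature names, thresholds, and labels are hypothetical) shows how a stakeholder requirement such as "flag at-risk learners" might be expressed as explainable rules over observable log features, so that each classification can be traced back to the stakeholder's own vocabulary:

```python
# Illustrative sketch only: a stakeholder requirement ("flag at-risk learners")
# expressed as named, human-readable rules over hypothetical log features.
# Real ML-driven profiling would replace these rules with learned patterns
# plus an explainability layer; the point here is the bidirectional mapping.

AT_RISK_RULES = {
    "low_activity": lambda f: f["logins_per_week"] < 2,
    "many_failed_attempts": lambda f: f["failed_attempts"] > 10 and f["successes"] == 0,
}

def classify(features):
    """Return the profile label and the names of the rules that fired,
    so the decision is explainable in the stakeholder's terms."""
    fired = [name for name, rule in AT_RISK_RULES.items() if rule(features)]
    label = "at-risk" if fired else "not flagged"
    return label, fired

# A learner with heavy log activity but no successes: extensive logs can mask
# disengagement, echoing the abstract's example of misleading surface signals.
learner = {"logins_per_week": 5, "failed_attempts": 14, "successes": 0}
label, reasons = classify(learner)
print(label, reasons)  # at-risk ['many_failed_attempts']
```

Keeping each rule named after a stakeholder concept is what allows the mapping to run in both directions: from requirements to model behavior, and from a model's output back to an explanation the stakeholder recognizes.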
Copyright (c) 2025 Abdelkader Ouared, Madeth May, Claudine Piau-Toffolon, Nicolas Dugué

This work is licensed under a Creative Commons Attribution 4.0 International License.