Intelligent Physical Education via Wearable Sensors and Smartphone Interaction
DOI: https://doi.org/10.3991/ijim.v20i09.61737

Keywords: wearable sensors; smartphone interaction; PE; motion recognition; error diagnosis; adaptive feedback; edge–cloud collaboration; multimodal fusion

Abstract
Traditional physical education (PE) instruction suffers from delayed motion feedback, one-size-fits-all guidance strategies, and the reinforcement of incorrect movement patterns. Existing sensor-based motion recognition systems often struggle to simultaneously achieve real-time responsiveness, personalized instruction, and robust adaptation to complex instructional scenarios. To address these challenges, this study proposes an intelligent PE system that integrates wearable sensors with smartphone-based interaction, forming a full-chain technical framework from multimodal data acquisition to adaptive intervention. The proposed system introduces several key innovations: (1) a multimodal semantic fusion strategy that enhances motion recognition accuracy and contextual understanding; (2) an early event detection mechanism that predicts motion errors before they are completed and delivers millisecond-level real-time feedback; (3) a personalized adaptive intervention mechanism that dynamically accommodates learners at different skill levels; (4) an edge–cloud collaborative architecture that balances real-time processing on mobile devices with in-depth analytical capabilities; and (5) the construction of a multimodal sports motion dataset and evaluation benchmark for instructional scenarios. This study provides a novel technical paradigm for intelligent mobile PE, and the released dataset and benchmarks offer valuable resources for future research, promoting the digitalization and intelligent transformation of physical education.
License
Copyright (c) 2026 Liyuan Xie, Qian Sun, Jinggang Li, Yongliang Zhang

This work is licensed under a Creative Commons Attribution 4.0 International License.

