Limb Trajectory Perception in Calisthenics Training: A Multi-Sensor Fusion and Spatial Behavior Analysis Framework
Abstract
Calisthenics training, with its structured and rhythmic movement patterns, represents a form of orchestrated spatial behavior within a defined environment. This paper reconceptualizes such training as a subject for spatial analysis and proposes a technical framework to digitally capture and quantify its dynamics. By integrating an 8-node wireless Inertial Measurement Unit (IMU) network with a Kinect V2 depth camera, we establish a multi-sensor fusion system for precise limb trajectory perception. An Extended Kalman Filter (EKF) is employed to fuse high-frequency inertial data with absolute vision-based positioning, reconstructing drift-corrected 3D trajectories of key limbs. These trajectories are subsequently analyzed by a hybrid CNN-LSTM model to automatically recognize fundamental movement patterns with 97.8% accuracy. Experiments demonstrate a 68% improvement in trajectory accuracy over IMU-only methods. The contribution of this work is twofold: it presents a robust, low-cost framework for high-fidelity motion capture, and it positions rhythmic physical training as a viable domain for computational spatial behavior analysis, with potential implications for the design of intelligent training environments and human-centered architectural spaces.
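To make the fusion step concrete, the following is a minimal sketch of how drift-prone inertial integration can be anchored to absolute Kinect position fixes in a predict/correct filter. It models a single limb with a position-velocity state; the paper's full EKF presumably also carries orientation states with nonlinear dynamics, which are omitted here. All class names, noise values, and rates are illustrative assumptions, not the authors' actual parameters.

```python
# Sketch of IMU + Kinect fusion for one limb, assuming state x = [p, v]
# (3D position and velocity). The IMU supplies high-rate, gravity-compensated
# acceleration for prediction; the Kinect supplies low-rate absolute position
# for correction. Values below are illustrative assumptions.
import numpy as np

class ImuKinectFuser:
    def __init__(self, accel_noise=0.5, kinect_noise=0.02):
        self.x = np.zeros(6)                  # [px, py, pz, vx, vy, vz]
        self.P = np.eye(6) * 1e-2             # state covariance
        self.q = accel_noise ** 2             # IMU accel noise, (m/s^2)^2
        self.R = np.eye(3) * kinect_noise**2  # Kinect position noise, m^2

    def predict(self, accel_world, dt):
        """Propagate state with IMU acceleration (runs at IMU rate, e.g. 100 Hz)."""
        F = np.eye(6)
        F[:3, 3:] = np.eye(3) * dt
        B = np.vstack([np.eye(3) * 0.5 * dt**2, np.eye(3) * dt])
        self.x = F @ self.x + B @ accel_world
        Q = B @ B.T * self.q                  # process noise from accel noise
        self.P = F @ self.P @ F.T + Q

    def correct(self, kinect_pos):
        """Absorb an absolute Kinect fix (runs at camera rate, e.g. 30 Hz)."""
        H = np.hstack([np.eye(3), np.zeros((3, 3))])
        y = kinect_pos - H @ self.x           # innovation: vision minus prediction
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)   # Kalman gain
        self.x += K @ y
        self.P = (np.eye(6) - K @ H) @ self.P
```

Between Kinect frames the filter follows the smooth, high-frequency IMU prediction; each vision fix then pulls the estimate back toward an absolute reference, which is the mechanism by which the integration drift of IMU-only tracking is suppressed.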
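The movement-recognition stage can likewise be sketched as a compact hybrid network: 1D convolutions extract short-range motion features from the fused trajectory windows, and an LSTM models the longer-range rhythm of each exercise. Input layout, layer sizes, and the number of movement classes below are assumptions for illustration; the abstract does not specify the authors' architecture in detail.

```python
# Sketch of a hybrid CNN-LSTM classifier over fused limb trajectories,
# assuming input windows shaped (batch, time, channels), where channels are
# the stacked 3D coordinates of the tracked limbs (e.g. 8 nodes x 3 = 24).
import torch
import torch.nn as nn

class CnnLstmClassifier(nn.Module):
    def __init__(self, in_channels=24, num_classes=10):
        super().__init__()
        # Convolutions capture local motion patterns per time step
        self.cnn = nn.Sequential(
            nn.Conv1d(in_channels, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # LSTM captures the temporal structure of the whole movement
        self.lstm = nn.LSTM(input_size=128, hidden_size=64, batch_first=True)
        self.head = nn.Linear(64, num_classes)

    def forward(self, x):                 # x: (batch, time, channels)
        x = x.transpose(1, 2)             # -> (batch, channels, time) for Conv1d
        x = self.cnn(x)                   # -> (batch, 128, time/2)
        x = x.transpose(1, 2)             # -> (batch, time/2, 128) for LSTM
        _, (h, _) = self.lstm(x)          # h: final hidden state per layer
        return self.head(h[-1])           # class logits
```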
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
How to Cite
Limb Trajectory Perception in Calisthenics Training: A Multi-Sensor Fusion and Spatial Behavior Analysis Framework. (2025). Architecture Image Studies, 6(3), 735-747. https://doi.org/10.62754/ais.v6i3.303