Project: Emotion on Motion

Motivation

Emotion shapes the way we move — the way we walk, rise from a chair, and initiate movement all carry subtle signatures of our internal state. Yet most emotion recognition research focuses on faces and voices, modalities that are easy to fake, difficult to capture passively, and impossible to use for people with facial or speech impairments. Gait and functional movement transitions like sit-to-walk are different: they're automatic, difficult to consciously suppress, and directly tied to the motor and postural control systems that mood disorders are known to affect. If we could reliably decode emotional state from movement, we'd have a low-burden, manipulation-resistant signal with real clinical potential — for monitoring mood disorders like bipolar disorder and depression, tracking rehabilitation progress, or flagging emotional dysregulation before it becomes clinically apparent. Beyond emotion recognition, these same biomechanical shifts have direct implications for fall prevention: emotions like fear and sadness alter dynamic balance in ways that matter for vulnerable populations.

Our Solution

We built a laboratory pipeline to capture full-body 3D biomechanics across five emotional states — anger, sadness, joy, fear, and neutral — using an autobiographical memory induction paradigm to elicit genuine emotional recall. Participants performed walking and sit-to-walk trials under each condition while instrumented with reflective markers tracked by a Vicon motion capture system and processed through Visual3D. From these recordings we extracted over 150 biomechanical variables spanning spatiotemporal parameters and full-body joint kinematics across all major segments. One study applied machine learning — including Random Forest, XGBoost, KNN, Logistic Regression, and MLP — with SMOTE for class balancing and Leave-One-Participant-Out cross-validation to classify emotional states from gait. A parallel study extended the same movement framework to classify depression and anxiety symptoms in college students using clinically validated screening instruments, testing whether gait and sit-to-walk biomechanics could objectively flag elevated symptom levels. A third study validated these spatiotemporal findings using ankle-mounted inertial measurement units, testing whether IMUs could replicate the emotion-linked gait changes previously documented with motion capture and extending the work to gait variability.
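The evaluation scheme described above — class balancing applied only inside each training fold, with Leave-One-Participant-Out cross-validation — can be sketched roughly as follows. This is a minimal illustration on synthetic data, assuming scikit-learn; the participant counts, trial counts, and feature dimensions here are placeholders, and a naive random oversampler stands in for SMOTE:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(0)

# Synthetic stand-in data: 10 participants x 5 emotions x 3 trials, 20 features.
# (The real studies used 150+ biomechanical variables exported from Visual3D.)
n_participants, n_emotions, n_trials, n_features = 10, 5, 3, 20
X = rng.normal(size=(n_participants * n_emotions * n_trials, n_features))
y = np.tile(np.repeat(np.arange(n_emotions), n_trials), n_participants)
groups = np.repeat(np.arange(n_participants), n_emotions * n_trials)

def oversample(X, y, rng):
    """Naive random oversampling to equalize class counts (stand-in for SMOTE)."""
    classes, counts = np.unique(y, return_counts=True)
    n_max = counts.max()
    idx = np.concatenate([
        rng.choice(np.flatnonzero(y == c), size=n_max, replace=True)
        for c in classes
    ])
    return X[idx], y[idx]

accs = []
# Leave-One-Participant-Out: each fold holds out every trial from one participant.
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    # Balance classes inside the training fold only, never on held-out data.
    X_tr, y_tr = oversample(X[train_idx], y[train_idx], rng)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    accs.append(clf.score(X[test_idx], y[test_idx]))

print(f"LOPO mean accuracy: {np.mean(accs):.2f}")
```

Grouping folds by participant matters here: trials from the same person are highly correlated, so mixing one participant's trials across train and test would inflate accuracy estimates.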

My Contributions

As first author on the dynamic balance study, I ran participants through the full walking and sit-to-walk protocol: I applied 74 reflective markers, operated the Vicon motion capture system, cleaned data in Vicon Nexus, computed COM-to-ankle angles at heel-strike, toe-off, and seat-off events in Visual3D, and applied eligibility-based trial exclusion. I also performed all statistical analysis, finding that fear produced the greatest deviation during gait initiation while joy most significantly altered balance during sit-to-walk. I presented these findings as first author at the SCASB Annual Meeting in Fort Worth, TX (2024).
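For readers unfamiliar with the COM-to-ankle measure, the idea is to quantify how far the body's center of mass leans relative to the base of support at key gait events. The sketch below is illustrative only — the actual angles were computed in Visual3D, and the axis convention (z vertical) and angle definition (inclination of the ankle-to-COM vector from vertical) are assumptions for this example:

```python
import numpy as np

def com_ankle_angle(com, ankle):
    """Inclination of the ankle-to-COM vector from vertical, in degrees.

    Positions are 3D (x, y, z) points in a lab frame with z pointing up.
    0 degrees means the COM sits directly above the ankle; larger angles
    mean the body is leaning further relative to its base of support.
    """
    v = np.asarray(com, float) - np.asarray(ankle, float)
    vertical = np.array([0.0, 0.0, 1.0])
    cosang = np.dot(v, vertical) / np.linalg.norm(v)
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

# COM directly above the ankle -> 0 degrees of inclination.
print(com_ankle_angle([0.0, 0.0, 1.0], [0.0, 0.0, 0.1]))
```

Evaluating this angle at heel strike, toe-off, and seat-off gives a compact picture of dynamic balance at the moments when postural demand peaks.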

For the emotion classification study (Stout et al., Gait & Posture 2025), I contributed to data collection, data curation, and feature extraction from the 155 biomechanical variables computed in Visual3D, and served as a writing reviewer and editor on the manuscript. For the depression and anxiety classification study (Stout et al., Gait & Posture 2026), I contributed to data collection and curation, as well as both original drafting and review of the manuscript. The best models in that study achieved 75% accuracy for gait and 77% for sit-to-walk under nested participant-level holdout validation across 30 participants. I also contributed to data collection and analysis for the IMU validation study (Alvarez et al., Sensors 2025), which confirmed that ankle-mounted IMUs can replicate emotion-linked spatiotemporal changes previously documented only with laboratory motion capture.
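"Nested participant-level holdout" means hyperparameters are tuned on inner participant-grouped splits and performance is reported only on outer held-out participants. A minimal sketch of that structure, assuming scikit-learn — all data shapes, labels, and the hyperparameter grid here are synthetic placeholders, not the study's actual configuration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, GroupShuffleSplit

rng = np.random.default_rng(1)

# Synthetic stand-in: 30 participants x 6 trials, 12 features,
# with one binary symptom label per participant (repeated over their trials).
X = rng.normal(size=(30 * 6, 12))
y = np.repeat(rng.integers(0, 2, size=30), 6)
groups = np.repeat(np.arange(30), 6)

outer = GroupShuffleSplit(n_splits=5, test_size=0.2, random_state=0)
scores = []
for tr, te in outer.split(X, y, groups):
    # Inner splits are also grouped by participant, so tuning never
    # sees trials from the participants held out in the outer split.
    inner = GroupShuffleSplit(n_splits=3, test_size=0.25, random_state=0)
    search = GridSearchCV(
        LogisticRegression(max_iter=1000),
        {"C": [0.1, 1.0, 10.0]},
        cv=inner.split(X[tr], y[tr], groups[tr]),
    )
    search.fit(X[tr], y[tr])
    scores.append(search.score(X[te], y[te]))

print(f"nested holdout mean accuracy: {np.mean(scores):.2f}")
```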

Project Outcomes

Poster 1: SCASB 24 Poster | Poster 2: SPUR 24 Poster | Poster 3: SCASB 25 Poster

Publication: Feasibility of Machine Learning Classification of Depression and Anxiety Symptoms among College Students using 3D Gait and Sit-to-Walk Biomechanics

Publication: Emotion Classification Using Gait Biomechanics and Machine Learning

Looking to discuss further? Contact me at research@mkmaharana.com