Project: Emotion on Motion

Motivation

Emotion shapes the way we move: how we walk, rise from a chair, and initiate movement all carry subtle signatures of our internal state. Yet most emotion recognition research focuses on faces and voices, modalities that are easy to fake, difficult to capture passively, and unusable for people with facial or speech impairments. Gait and functional movement transitions like sit-to-walk are different: they are automatic, difficult to consciously suppress, and directly tied to the motor and postural control systems that mood disorders are known to affect. If we could reliably decode emotional state from movement, we would have a low-burden, manipulation-resistant signal with real clinical potential: monitoring mood disorders like bipolar disorder and depression, tracking rehabilitation progress, or flagging emotional dysregulation before it becomes clinically apparent. Beyond emotion recognition, these same biomechanical shifts have direct implications for fall prevention, since emotions like fear and sadness alter dynamic balance in ways that matter for vulnerable populations.

Our Solution

We built a laboratory pipeline to capture full-body 3D biomechanics across five emotional states (anger, sadness, joy, fear, and neutral), using an autobiographical memory induction paradigm to elicit genuine emotional recall. Participants performed walking and sit-to-walk trials under each condition while instrumented with reflective markers tracked by a Vicon motion capture system and processed through Visual3D. From these recordings we extracted over 150 biomechanical variables spanning spatiotemporal parameters and full-body joint kinematics across all major segments. We then applied machine learning (Random Forest, XGBoost, KNN, Logistic Regression, and MLP) with SMOTE for class balancing and Leave-One-Participant-Out cross-validation, so every model was evaluated on participants it had never seen rather than memorizing individual movement patterns. A parallel study extended the same movement framework to classify depression and anxiety symptoms using clinically validated screening instruments, testing whether the same gait and sit-to-walk pipeline could objectively flag elevated symptom levels.
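
The leakage-sensitive detail in this design is ordering: SMOTE must be fit inside each cross-validation fold, on training data only, or synthetic samples derived from the held-out participant would inflate accuracy. Below is a minimal sketch of that loop, assuming a scikit-learn / imbalanced-learn / xgboost stack; X (trials × features), y (integer-coded emotion labels), and groups (participant IDs) are placeholder inputs, not the lab's actual data interface.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.preprocessing import StandardScaler
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import make_pipeline
from xgboost import XGBClassifier

def lopo_accuracy(X, y, groups):
    """Mean accuracy across folds, each fold holding out one participant."""
    scores = []
    for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
        # Scaling and SMOTE are fit on the training fold only, so nothing
        # from the held-out participant leaks into model fitting.
        model = make_pipeline(
            StandardScaler(),
            SMOTE(random_state=0),
            XGBClassifier(n_estimators=200, max_depth=4, eval_metric="mlogloss"),
        )
        model.fit(X[train_idx], y[train_idx])
        scores.append(model.score(X[test_idx], y[test_idx]))
    return float(np.mean(scores))
```

Using imblearn's pipeline (rather than scikit-learn's) is what makes this safe: the sampler runs only during fit, so the held-out participant's trials are scored on real, unresampled data.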

My Contributions

I contributed across multiple studies at different levels of ownership. As first author on the dynamic balance study, I ran participants through the full walking and sit-to-walk protocol, applied reflective markers, operated the Vicon motion capture system, processed COM-to-ankle angles at heel strike, toe-off, and seat-off events in Visual3D, and performed all statistical analysis. That analysis found that fear produced the greatest deviation during gait initiation, while joy most altered balance during sit-to-walk. I presented these findings at the SCASB Annual Meeting in Fort Worth, TX (2024).
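
For illustration, one common way to express a COM-to-ankle angle as a dynamic balance measure is the inclination of the vector from the ankle joint center to the whole-body COM relative to vertical, evaluated at an event frame. The actual variables were computed in Visual3D; this NumPy sketch, including its function name and default axis convention, is hypothetical.

```python
import numpy as np

def com_ankle_angle_deg(com_xyz, ankle_xyz, vertical=(0.0, 0.0, 1.0)):
    """Angle (degrees) between the ankle-to-COM vector and the vertical axis."""
    v = np.asarray(com_xyz, dtype=float) - np.asarray(ankle_xyz, dtype=float)
    up = np.asarray(vertical, dtype=float)
    cos_theta = np.dot(v, up) / (np.linalg.norm(v) * np.linalg.norm(up))
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))

# Evaluated at event frames indexed into marker-derived trajectories, e.g.:
# angle_hs = com_ankle_angle_deg(com[hs_frame], ankle[hs_frame])   # heel strike
# angle_so = com_ankle_angle_deg(com[so_frame], ankle[so_frame])   # seat-off
```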

Across the broader lab studies I held formal CRediT roles (data curation, formal analysis, and manuscript writing) on two peer-reviewed publications. For the emotion classification study (Gait & Posture 2025), I contributed to feature extraction from 155 biomechanical variables and to model evaluation across KNN, Logistic Regression, Random Forest, MLP, and XGBoost; the best model (XGBoost on the top 20 features) achieved 59% accuracy against 25% chance. For the depression and anxiety classification study (Gait & Posture 2026), I held the same three roles; the best models reached 75% accuracy for gait and 77% for sit-to-walk under nested participant-level holdout validation across 30 participants.
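
As a sketch of how a top-20 subset can be derived from 155 variables, one plausible approach is ranking by XGBoost feature importances and retraining on the reduced set; the published studies' exact selection procedure may differ, and feature_names below is a placeholder.

```python
import numpy as np
from xgboost import XGBClassifier

def top_k_features(X, y, feature_names, k=20):
    """Rank features by XGBoost importance and keep the top k columns."""
    model = XGBClassifier(n_estimators=200, eval_metric="mlogloss")
    model.fit(X, y)
    order = np.argsort(model.feature_importances_)[::-1][:k]  # highest first
    return [feature_names[i] for i in order], X[:, order]
```

In practice this ranking would run inside the training folds only, mirroring the participant-level validation described above, so the selected subset is never informed by held-out data.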

Project Outcomes

Poster 1: SCASB 2024 Poster
Poster 2: SPUR 2024 Poster
Poster 3: SCASB 2025 Poster

Publication: Feasibility of Machine Learning Classification of Depression and Anxiety Symptoms among College Students using 3D Gait and Sit-to-Walk Biomechanics

Publication: Emotion Classification Using Gait Biomechanics and Machine Learning

Looking to discuss further? Contact me at research@mkmaharana.com