Objective Assessment of Karate Kata Performance through Synchronized Multi-View Video Analysis and Deep Learning
Author: Bevin Benet
Abstract: Karate kata play a central role in grading, competition, and the development of core psychomotor skills. Traditional human-judged evaluations can be susceptible to subjectivity and inter-rater variability. This challenge aligns closely with the broader field of Action Quality Assessment (AQA), which focuses on evaluating the quality of human actions in a way that mimics expert judgment. In this work, we introduce a novel regression pipeline that integrates synchronized dual-view video capture (frontal and lateral) with markerless pose estimation and Long Short-Term Memory (LSTM)–based regression to generate objective, continuous quality scores for entire kata performances. As a starting point, we chose the first kata of the Shōrin-ryū style of karate, Taikyoku Shodan. A total of 50 kata executions were recorded simultaneously from two calibrated camera perspectives and annotated by an expert judge. MediaPipe was used to extract and normalize 2D skeletal keypoints, with face orientation used to filter out frames from which joint angles could not be extracted reliably. Frame-wise joint angles were then computed and assembled into time-series feature vectors. To address data scarcity, we augmented the sequences using temporal shifts, Gaussian noise injection, and slight amplitude scaling, expanding the dataset to 2,400 training samples. The vectors were fed into a bidirectional LSTM network that learns temporal dependencies across the full kata sequence. Our supervised regression model achieved a mean squared error (MSE) of 0.141 and a Spearman's rank correlation coefficient of 0.802 against expert scores, demonstrating high fidelity in replicating human judgment.
Keywords: Karate, Kata, Pose Estimation, Computer Vision, Data Augmentation.
Conference Name: International Conference on Science, Engineering & Technology (ICSET-25)
Conference Place: Trivandrum, India
Conference Date: 5th Oct 2025
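The abstract outlines a pipeline of frame-wise joint-angle extraction, sequence augmentation (temporal shifts, Gaussian noise, amplitude scaling), and a bidirectional LSTM regressor. The sketch below is a minimal illustration of those three steps, not the authors' code: the angle triplets, augmentation magnitudes, hidden size, and pooling over the last time step are all assumptions, and the synthetic arrays merely stand in for real kata sequences.

```python
# Illustrative sketch (assumptions noted above), in PyTorch + NumPy.
import numpy as np
import torch
import torch.nn as nn


def joint_angle(a, b, c):
    """Angle (degrees) at 2D keypoint b formed by the segments b->a and b->c."""
    v1, v2 = a - b, c - b
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-8)
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))


def augment(seq, rng, max_shift=5, noise_std=0.01, scale_range=0.05):
    """Temporal shift, Gaussian noise, and slight amplitude scaling (magnitudes assumed)."""
    shift = rng.integers(-max_shift, max_shift + 1)
    seq = np.roll(seq, shift, axis=0)                     # temporal shift
    seq = seq + rng.normal(0.0, noise_std, seq.shape)     # Gaussian noise injection
    seq = seq * rng.uniform(1.0 - scale_range, 1.0 + scale_range)  # amplitude scaling
    return seq.astype(np.float32)


class KataScoreLSTM(nn.Module):
    """Bidirectional LSTM over per-frame angle vectors, regressing one quality score."""

    def __init__(self, n_features, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 1)

    def forward(self, x):                                  # x: (batch, frames, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :]).squeeze(-1)        # last time step -> scalar score


# Minimal usage with synthetic data standing in for real angle sequences.
rng = np.random.default_rng(0)
seqs = np.stack([augment(rng.random((200, 16)), rng) for _ in range(8)])
scores = torch.rand(8)                                     # placeholder expert scores

model = KataScoreLSTM(n_features=16)
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
loss = nn.MSELoss()(model(torch.from_numpy(seqs)), scores)
loss.backward()
optim.step()
```

In practice the input features would come from MediaPipe keypoints passed through `joint_angle` for each relevant joint triplet, and the trained model would be evaluated against expert scores with MSE and Spearman's rank correlation, as reported in the abstract.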