Using AI to Improve Sports Performance & Achieve Better Running Efficiency


Professional sports teams leverage advanced technology to gain a competitive edge, outfitting players with a suite of sensors to measure, quantify, and analyze every aspect of player and team performance. Amateur athletes can seek help from professional trainers, but the associated session and equipment costs can be prohibitive. With the ubiquity of video recorders embedded in today’s phones and modern advancements in computer vision technology, a natural question arises:

Can amateur athletes improve their performance using artificial intelligence and nothing more than a smart phone? As an AI practitioner and a dedicated runner, I decided to find out.

1.25 million. That’s the approximate number of running steps I took to train for and complete my last marathon in 2016. With an impulsive impact at each step, overtraining and poor form can place undue strain on muscles, joints, and tendons, and the cumulative effect can lead to serious injury. I learned this lesson all too well in 2012 when I tore my Achilles tendon during training and had to abandon running for nearly two years as a result. At that time, short sprints to catch a bus or playing with my kids on the soccer field ultimately led to discomfort and pain. In 2014, I finally found relief and recovery through surgery, but was told by my physical therapist, “you probably should never run another marathon again” as my aging body was “not as resilient as it once was.” Not the news I wanted to hear. Certainly, I am not alone in confronting a running injury. In fact, an estimated 40% to 50% of all runners experience at least one injury on an annual basis [1]. The popularity of the sport has also risen dramatically over the last decade, with an estimated 40.43% growth in the number of people participating in marathons worldwide from 2008 to 2018 [2].

Marathon runners, by their nature, are determined folk, and I wasn’t about to let my physical therapist thwart my aspirations to continue running. However, I was mindful of my form, which I theorized was the primary cause of my injury, aside from ignoring some of the aches and pains I experienced during training. I adopted a form of running thought to attenuate the risk of injury while optimizing efficiency [3, 4]. Some basic concepts of this form are:

  • maintain a high cadence of 180+ steps/min

  • limit ground contact time and vertical motion

  • display consistent and symmetric motion with left/right appendages

  • strike the ground with a slightly bent knee, forward of the ankle at impact

  • maintain a forward leaning posture from the feet to the head, forming a “straight” invisible line between the ankle, hip, and shoulder at impact

  • replace dorsiflexing + heel strike with a mid-front foot strike followed by a push backwards and upwards

Many of these aspects are feasible for the average runner but are difficult to accomplish by feel alone. Sensor-laden sports watches coupled with chest straps provide excellent feedback by measuring various performance metrics, but they are limited in their ability to convey motion symmetry and running form. As a complement to the data I receive from my sports watch, I wanted a tool that could provide visual feedback on my running form and facilitate quantification of relevant performance and form metrics. My colleagues and I at Xyonix were exploring some intriguing pose estimation tools that I thought might be able to help.

Body pose estimation is an advanced AI technology that automatically generates a skeleton for each person in each frame of a video. The results can be powerful for sports improvement, as human motion and the related kinematics can be tracked and analyzed. Pose estimation models identify pre-defined points of interest on the human body, such as joints and facial landmarks, which are subsequently linked to form a computer generated “skeleton” of each person in an image. Skeletal motion can be tracked across multiple frames of a video and translated into estimates of body kinematics, which can be used directly to assess running performance. The only hardware required is a mobile phone to record the video.
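
To make the pipeline concrete, here is a minimal sketch of how per-frame keypoints might be extracted from a phone video. The article does not specify which model produced its skeletons (reference [5] describes an OpenPose-style multi-person detector), so this sketch substitutes MediaPipe Pose purely as an illustrative stand-in; the function name and video file below are hypothetical.

```python
# Illustrative sketch only: MediaPipe Pose stands in for the (unspecified)
# model used in the article; "running_clip.mp4" is a hypothetical file.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

def extract_keypoint_series(video_path):
    """Return per-frame nose / left ankle / right ankle pixel coordinates."""
    cap = cv2.VideoCapture(video_path)
    series = []
    with mp_pose.Pose(static_image_mode=False) as pose:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            h, w, _ = frame.shape
            # MediaPipe expects RGB input; OpenCV decodes frames as BGR.
            results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.pose_landmarks is None:
                series.append(None)  # keep frame alignment when detection fails
                continue
            lm = results.pose_landmarks.landmark

            def to_px(idx):
                return (lm[idx].x * w, lm[idx].y * h)  # normalized -> pixels

            series.append({
                "nose": to_px(mp_pose.PoseLandmark.NOSE),
                "left_ankle": to_px(mp_pose.PoseLandmark.LEFT_ANKLE),
                "right_ankle": to_px(mp_pose.PoseLandmark.RIGHT_ANKLE),
            })
    cap.release()
    return series

keypoints = extract_keypoint_series("running_clip.mp4")
```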

VISUAL OBSERVATIONS

Consider the following mobile phone video recordings of my daughter and me running at a local park. Overlaid on each frame of the videos are the following elements; a sketch of how such overlays can be rendered appears after the list:

  • Colored line segments comprising computer generated body skeletons as predicted by a body pose estimation model.

  • Trails of red, blue, and white circles representing the latest history of nose, right ankle, and left ankle positions, respectively.

  • Colored rectangular regions whose vertical span indicates the maximum range of vertical oscillation encountered in the displayed histories.
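
For readers curious how such overlays might be rendered, here is a hedged sketch using OpenCV drawing primitives; the article does not publish its rendering code, so the colors, trail length, and the single rectangle (drawn here only for the nose trail) are illustrative assumptions.

```python
# Sketch of per-frame overlays: keypoint trails plus a box whose vertical span
# covers the recent vertical oscillation of the nose. Colors and trail length
# are illustrative, not the article's actual settings.
import cv2

TRAIL_LEN = 30  # number of recent frames kept in each trail

def draw_overlays(frame, nose_trail, left_ankle_trail, right_ankle_trail):
    """Draw trails (lists of (x, y) pixel tuples) onto a BGR video frame."""
    for trail, color in [(nose_trail, (0, 0, 255)),          # red (BGR)
                         (right_ankle_trail, (255, 0, 0)),   # blue
                         (left_ankle_trail, (255, 255, 255))]:  # white
        for x, y in trail[-TRAIL_LEN:]:
            cv2.circle(frame, (int(x), int(y)), 3, color, -1)
    # Rectangle spanning the min/max vertical positions in the nose history.
    recent = nose_trail[-TRAIL_LEN:]
    if recent:
        xs = [x for x, _ in recent]
        ys = [y for _, y in recent]
        cv2.rectangle(frame,
                      (int(min(xs)) - 10, int(min(ys))),
                      (int(max(xs)) + 10, int(max(ys))),
                      (0, 255, 255), 2)
    return frame
```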

The isolation of a single human body against a backdrop of sticks, branches, trees, and fencing is impressive given that these components could mistakenly be construed as body parts. While the pose estimation model is capable of identifying multiple figures at multiple depths in a scene [5], we recorded only one runner at a time for visual clarity and to facilitate isolated analysis.

Visual inspection of the videos shows a striking difference between my running form and that of my daughter:

  • My daughter’s strides are more consistent and symmetric, as indicated by the left ankle (white dot) and right ankle (blue dot) trails. I tend to raise my right ankle higher than my left in the back stroke, which likely means that I’m not engaging my left gluteus maximus as much as my right. It also means that my right foot has a slightly longer distance to travel to prepare for the next step. This issue may seem trivial but the lack of symmetry can throw off my timing, which can lead to improper foot placement, which can lead to harsher impacts, which can lead to joint, muscle, or tendon damage. Over hundreds of thousands of steps, stride asymmetry can promote injury.

  • The average vertical oscillation of my daughter’s head, indicated by the trail of nose positions (red dots) tracked over time, is seemingly less than mine during a typical stride.

QUANTIFYING PERFORMANCE METRICS

Below is a plot of the vertical motion of various body parts tracked over time throughout the video. Each series is colored based on its height above the ground, which was estimated via a “pixels to inches” conversion behind the scenes.

[Figure: vertical motion of tracked body parts over time]
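
The “pixels to inches” conversion itself is not described in the article; one plausible approach, sketched below under the assumption that the runner’s real-world height is known, scales pixel distances by the runner’s apparent height in a frame. The function names and numbers are hypothetical.

```python
# Hypothetical pixels-to-inches scaling: calibrate against the runner's known
# height, then convert keypoint y-positions to inches above an estimated
# ground line. This is a guess at the idea, not the article's actual method.
def pixels_per_inch(ankle_y_px, head_y_px, runner_height_in):
    """Estimate scale from the runner's apparent height in a single frame.

    Image y grows downward, so the ankle has a larger y value than the head.
    """
    return (ankle_y_px - head_y_px) / runner_height_in

def height_above_ground_in(y_px, ground_y_px, scale_px_per_in):
    """Convert a keypoint's pixel y-position into inches above the ground."""
    return (ground_y_px - y_px) / scale_px_per_in

scale = pixels_per_inch(ankle_y_px=620.0, head_y_px=180.0, runner_height_in=70.0)
print(height_above_ground_in(y_px=600.0, ground_y_px=625.0, scale_px_per_in=scale))
```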

With these data in hand, we are set to perform a more detailed analysis. Our goal is to confirm qualitative observations with quantitative metrics. We follow each estimated metric with a TAKEAWAY section, which identifies practical advice for future training.

Let’s begin with cadence, which is theorized to be positively correlated with running efficiency [6]. Cadence is defined as the average number of steps taken over a given time interval, typically expressed in steps per minute. We estimate cadence as follows; a code sketch appears after the list:

  • Isolate left and right ankle series.

  • Remove outliers and perform local interpolation to refill gaps.

  • Smooth and detrend each series and use the result to estimate local minima, whose locations in time identify ground impacts, i.e., step times.

  • Estimate the cadence as the median number of steps per minute.
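
Below is a minimal sketch of that cadence estimate, assuming per-frame ankle heights (inches above the ground) are already available; the outlier threshold, smoothing windows, and minimum peak spacing are illustrative choices rather than the article’s actual parameters.

```python
# Cadence sketch: clean each ankle series, detrend, treat local minima as
# ground impacts, then convert the median inter-step interval to steps/min.
# Window sizes assume a clip of at least a few seconds at roughly 30 fps.
import numpy as np
import pandas as pd
from scipy.signal import find_peaks, savgol_filter

def step_times(ankle_height_in, fps):
    """Return the times (seconds) at which this foot strikes the ground."""
    s = pd.Series(ankle_height_in, dtype=float)
    # Crude outlier removal, then interpolate across the gaps it leaves.
    z = (s - s.mean()).abs() / s.std()
    s[z > 3] = np.nan
    s = s.interpolate(limit_direction="both")
    # Smooth, then detrend by subtracting a slowly varying baseline.
    smooth = savgol_filter(s.to_numpy(), window_length=9, polyorder=2)
    baseline = savgol_filter(smooth, window_length=61, polyorder=2)
    detrended = smooth - baseline
    # Local minima of ankle height correspond to ground impacts (steps).
    minima, _ = find_peaks(-detrended, distance=int(0.25 * fps))
    return minima / fps

def cadence_steps_per_min(left_ankle_in, right_ankle_in, fps):
    """Median-based cadence, counting steps from both feet."""
    steps = np.sort(np.concatenate([step_times(left_ankle_in, fps),
                                    step_times(right_ankle_in, fps)]))
    return 60.0 / np.median(np.diff(steps))
```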

[Figure: cadence estimation from the detrended ankle series]

My cadence is estimated at 169 steps/min while my daughter’s is 173 steps/min. These results match well with typical cadence estimates as reported independently by GPS running watches.

TAKEAWAY If we are to adhere to the advice to run at a cadence of 180+ steps/min, it looks like we both need to pick up the tempo a bit.

We can also use the ankle series to quantify the median difference in left/right ankle elevations on the backward portion of the running stride, a metric we will call median stride asymmetry; a code sketch again follows the list:

  • Extract left and right detrended ankle series from cadence assessment.

  • Find local maxima.

  • Calculate elevation differences between left-right pairs and report the median of those differences.
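
A sketch of this computation follows, reusing detrended ankle series like those produced in the cadence step; pairing each left-ankle peak with the nearest right-ankle peak in time is an assumption about how the left/right pairs are matched.

```python
# Median stride asymmetry sketch: compare peak back-stroke ankle elevations
# (inches) between the left and right detrended ankle series.
import numpy as np
from scipy.signal import find_peaks

def median_stride_asymmetry(left_detrended_in, right_detrended_in, fps):
    left = np.asarray(left_detrended_in, dtype=float)
    right = np.asarray(right_detrended_in, dtype=float)
    # Each ankle peaks roughly once per full stride (about two steps).
    left_peaks, _ = find_peaks(left, distance=int(0.4 * fps))
    right_peaks, _ = find_peaks(right, distance=int(0.4 * fps))
    diffs = []
    for lp in left_peaks:
        # Pair with the nearest right-ankle peak in time (an assumption).
        rp = right_peaks[np.argmin(np.abs(right_peaks - lp))]
        diffs.append(abs(left[lp] - right[rp]))
    return float(np.median(diffs))
```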

[Figure: median stride asymmetry from left/right ankle peaks]

The results show that I have a whopping 5.29-inch median stride asymmetry, while my daughter’s motion exhibits a more acceptable 2.34-inch value.

TAKEAWAY I need to focus on engaging my left gluteus maximus more and matching the height of my right ankle during the backstroke. Generally, I need to be more aware of my tendency to stride asymmetrically.

Finally, let us quantify and compare the vertical oscillation of the head using the following steps, with a code sketch after the list:

  • Isolate the nose position history.

  • Find local minima.

  • Detrend the original series by subtracting a spline fit through those minima.

  • Find the local maxima of the detrended series.

  • Recenter the series by subtracting the mean (optional, but it makes for a nicer visual display).

  • Calculate the absolute difference between successive minima-maxima pairs and report the median value.
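
Below is a sketch of this metric following the listed steps; the spline order, peak spacing, and the pairing of each maximum with the preceding minimum of the detrended series are illustrative interpretations rather than the article’s exact choices.

```python
# Head vertical-oscillation sketch: detrend the nose series with a spline fit
# through its local minima, then take the median trough-to-peak excursion.
# Assumes a clip containing several strides so enough minima exist for the fit.
import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.signal import find_peaks

def median_vertical_oscillation(nose_height_in, fps):
    """Median peak-to-trough nose height excursion (inches) per stride."""
    y = np.asarray(nose_height_in, dtype=float)
    t = np.arange(len(y))
    # Local minima of the raw series anchor a slowly varying baseline.
    minima, _ = find_peaks(-y, distance=int(0.25 * fps))
    baseline = UnivariateSpline(t[minima], y[minima], k=3, s=len(minima))(t)
    detrended = y - baseline
    detrended -= detrended.mean()  # optional recentering for plotting
    maxima, _ = find_peaks(detrended, distance=int(0.25 * fps))
    troughs, _ = find_peaks(-detrended, distance=int(0.25 * fps))
    # Pair each maximum with the nearest preceding trough of the detrended series.
    excursions = []
    for mx in maxima:
        prior = troughs[troughs < mx]
        if len(prior):
            excursions.append(abs(detrended[mx] - detrended[prior[-1]]))
    return float(np.median(excursions))
```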

[Figure: head vertical oscillation from the detrended nose series]

TAKEAWAY My daughter does a slightly better job of using her energy for forward propulsion, wasting less of it in vertical motion. This is something to continue monitoring over time, but it will likely diminish naturally as I lessen my stride asymmetry, promote a stronger forward lean from my toes, and increase my cadence.

SUMMARY & EXTENSIONS

We have demonstrated the use of pose estimation for quantifying three running performance metrics.

[Figure: summary of estimated metrics for both runners]

These metrics, combined with the visual feedback obtained from the pose estimation videos, have given me something solid to work on in future training sessions. I look forward to recording and processing another video to verify that my efforts to adopt a better running form are paying off; I can validate those improvements quantitatively via the proposed metrics, with the goal of becoming a more efficient runner.

While we have demonstrated the efficacy of using pose estimation to better one’s running, the story doesn’t end there. Pose estimation can be applied in a wide variety of other domains. Here are a few important applications of body pose estimation [7]:

  • Sports

    • kinematic analysis of tackles in American football

    • golf swing dynamics

    • tennis swing form

  • Assisted living

    • help future robots correctly interpret body language

  • Intelligent driver assistance

    • automated estimation of driver inebriation levels

  • Medical applications

    • detection of postural issues such as scoliosis

    • physical therapy

    • study of cognitive brain development in young children by monitoring motor functionality

    • identifying disability onset via correlated motion

    • behavioral understanding

  • Video games

    • avatar movement

    • gesture and motion recognition

[Photo: a post-race moment with one of my children]

I am happy to report that, since adopting a new running form, I have been injury free for years now and feel less tired during training runs, which may be a testament to the efficacy of the technique. Even better, my children have taken an interest in running, achieving some pretty lofty goals at their young ages while spending precious bonding time with their dad. Pictured is one of those happy moments, where one of us won first place in their division. Can you predict which one of us was the victor?

Have sports performance data? Want to automatically recommend performance improvements to athletes? Contact us, we’ve taught machines to understand all kinds of video imagery and other performance improvement data — we might be able to help.


REFERENCES

  1. Fields, K.B., Sykes, J.C., Walker K.M., and Jackson, J.C. (2010). Prevention of running injuries. Current Sports Medicine Reports, May-Jun;9(3):176-82. doi: 10.1249/JSR.0b013e3181de7ec5.

  2. Marathon Statistics Worldwide, https://runrepeat.com/research-marathon-performance-across-nations

  3. Dreyer, D. and Dreyer, K. (2009). ChiRunning: A Revolutionary Approach to Effortless, Injury-Free Running. Atria Books.

  4. https://www.runnersworld.com/training/a20854024/what-makes-a-running-stride-efficient/

  5. Cao, Z., Simon, T., Wei, S.-E., and Sheikh, Y. (2017). Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields. CVPR 2017 (oral). arXiv:1611.08050.

  6. https://www.mcmillanrunning.com/cadence/

  7. https://en.wikipedia.org/wiki/Articulated_body_pose_estimation