People often skip the at-home exercises recommended for physical rehabilitation, but five biomedical engineering students are hoping to change that with an interactive program designed to make the exercises more engaging.
“We have multiple versions of the video game,” says Rahul Yerrabelli, a biomedical engineering student and chief technology officer of MoTrack Therapy (short for motion tracking therapy). “One version prompts people to do an exercise in a certain amount of time and scores how they move. It provides live corrective feedback and even has music to make it fun.”
In 2015, Yerrabelli participated in a Johns Hopkins University student-run hackathon called MedHacks. Working with Benjamin Pikus, Parth Singh, Himanshu Dashora, and Adam Polevoy, he and the group tailored existing plug-and-play computer vision technology that works with the camera on a laptop or desktop computer to read and track hand movements. They focused on physical rehabilitation for patients with hand conditions such as wrist fractures and carpal tunnel syndrome.
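The article does not name the vision library the team adapted, so the following is only a rough sketch of what webcam-based hand tracking of this kind can look like, using the open-source MediaPipe Hands and OpenCV packages as stand-ins and a simple joint angle as a stand-in exercise metric.

```python
import math
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def angle(a, b, c):
    """Angle at point b (degrees) formed by points a-b-c, using x/y only."""
    v1 = (a.x - b.x, a.y - b.y)
    v2 = (c.x - b.x, c.y - b.y)
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

cap = cv2.VideoCapture(0)  # default laptop/desktop webcam
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            lm = results.multi_hand_landmarks[0].landmark
            # Landmarks 0, 9, and 12 are the wrist, the middle-finger MCP
            # joint, and the middle fingertip; the angle at the MCP joint is
            # a rough proxy for how far the fingers curl during a grip
            # exercise (a hypothetical metric, not the team's actual one).
            flex = angle(lm[0], lm[9], lm[12])
            cv2.putText(frame, f"flexion ~{flex:.0f} deg", (10, 30),
                        cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
        cv2.imshow("hand tracking", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # press Esc to quit
            break
cap.release()
cv2.destroyAllWindows()
```

In a game like the one described, per-frame measurements such as this would feed the scoring and live corrective feedback.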
The students then developed a software program and incorporated machine learning to track how well a person can perform the exercises over time, predict future performance, and estimate how long it might take the individual to recover. “Afterwards, it sends that information to the patient’s clinician,” says Yerrabelli. “We will see if the technology can help improve recovery times.”
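The article does not describe the team's model, so as a minimal illustration of the idea of projecting recovery from session data, the sketch below fits an ordinary linear regression to hypothetical per-session scores and extrapolates to an assumed recovery threshold.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical per-session range-of-motion scores (percent of a healthy
# baseline) recorded over the first ten days of at-home exercise.
days = np.arange(1, 11).reshape(-1, 1)
scores = np.array([42, 45, 47, 52, 55, 58, 60, 63, 65, 68], dtype=float)

model = LinearRegression().fit(days, scores)

# Extrapolate the trend to an assumed 90% "recovered" threshold.
slope, intercept = model.coef_[0], model.intercept_
days_to_recovery = (90.0 - intercept) / slope

print(f"improvement rate: {slope:.1f} points/day")
print(f"predicted score on day 14: {model.predict([[14]])[0]:.0f}")
print(f"projected day of reaching threshold: ~{days_to_recovery:.0f}")
```

A summary along these lines, per patient and per exercise, is the kind of information that could be forwarded to the clinician.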
With support and funding from Johns Hopkins University Tech Ventures, Yerrabelli started MoTrack Therapy with his classmates in 2016. The team is currently testing the technology in the clinic and developing a version that runs on a mobile phone.
“Gamification, computer vision, and machine learning technologies have already started transforming fields outside of health care,” says Yerrabelli. “Now they’re making their way into medicine.”