April Flowers[1] for redOrbit.com – Your Universe Online
Most yoga[2] classes depend on participants watching an instructor to learn how to hold each position properly. For the blind[3] or those with low vision, this type of class can be frustrating. Now, a team of University of Washington computer scientists has developed a software program that watches a user’s movements and gives verbal feedback on what to change in order to complete a yoga pose accurately.
“My hope for this technology is for people who are blind or low-vision to be able to try it out, and help give a basic understanding of yoga in a more comfortable setting,” said project lead Kyle Rector[4], a UW doctoral student in computer science and engineering.
Eyes-Free Yoga[5], the new program, uses Microsoft Kinect[6] software to track body movements and offer verbal feedback in real time for six yoga poses, including Warrior I and II, Tree, and Chair.
Rector’s program instructs the Kinect to read the user’s body angles and then gives verbal feedback on how to adjust the arms, legs, neck or back to complete the pose successfully. To help the user, the program might say, “Rotate your shoulders left” or “Lean sideways toward your left.”
[Watch the Video: Yoga For The Blind[7]]
The end result is a video game used for exercise – an “exergame” for yoga – that allows people with little to no sight to interact verbally with a simulated yoga instructor. The research team, which includes Julie Kientz, a UW assistant professor in Human Centered Design & Engineering, and Cynthia Bennett, a research assistant in computer science and engineering, believes the program can transform a typically visual activity into something blind people can also enjoy.
“I see this as a good way of helping people who may not know much about yoga to try something on their own and feel comfortable and confident doing it,” Kientz said. “We hope this acts as a gateway to encouraging people with visual impairments to try exercise on a broader scale.”
The team programmed about 30 different commands for each of the six poses, based on rules deemed essential for each position; several yoga instructors worked with the researchers to develop the criteria for correct alignment in each pose. The program first checks the user’s core and suggests changes to its alignment, then moves to the head and neck, and finally to the arms and legs. It also gives positive feedback when the person is holding a pose correctly.
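To make that sequence concrete, here is a minimal sketch in Python (not the team’s actual Kinect code) of how a rule table and feedback loop along those lines might be organized. Only the core-then-head/neck-then-limbs ordering and the idea of spoken corrections come from the article; the specific joint angles, tolerances, and messages below are invented for illustration.

```python
# Illustrative rule table for a single pose. Each rule pairs a measured body
# angle with a target (degrees), a tolerance, and a spoken correction.
# These particular values and messages are assumptions, not the published rules.
WARRIOR_II_RULES = {
    "core": [
        ("back", 180, 10, "Straighten your back"),
        ("hips", 180, 10, "Square your hips"),
    ],
    "head_neck": [
        ("neck", 180, 15, "Lift your chin and look forward"),
    ],
    "arms_legs": [
        ("front_knee", 90, 10, "Bend your front knee toward a right angle"),
        ("arm_spread", 160, 10, "Raise your arms toward shoulder height"),
    ],
}

def next_feedback(measured_angles, rules=WARRIOR_II_RULES):
    """Return the first verbal correction, checking the core first, then the
    head and neck, then the arms and legs, as the article describes."""
    for region in ("core", "head_neck", "arms_legs"):
        for name, target, tolerance, message in rules[region]:
            if abs(measured_angles[name] - target) > tolerance:
                return message
    return "Nice work, hold the pose"  # positive feedback once every angle is in tolerance

# Example: the core and neck checks pass, so the first correction concerns the knee.
print(next_feedback({"back": 178, "hips": 182, "neck": 175,
                     "front_knee": 140, "arm_spread": 158}))
```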
To develop this technology, Rector practiced a lot of yoga, deliberately making mistakes in order to test and tweak each aspect of the program. The result of this testing is a program she believes is robust and useful for the blind.
“I tested it all on myself so I felt comfortable having someone else try it,” she said.
The researchers recruited 16 blind and low-vision participants from around the state to test the program and give feedback on its effectiveness. Several of the study participants had never tried yoga before, while others had experience with casual or regular yoga classes. A majority – 13 of the 16 – said they would recommend the program to others, and almost the entire group said they would be willing to use it again themselves.
The program uses simple geometry and the law of cosines to calculate the angles the user’s body forms during each pose. In some poses, for example, a bent leg must form a 90-degree angle while the arm spread must form a 160-degree angle. Using the Kinect’s camera and skeletal-tracking technology, the program reads these angles and guides the user toward the desired alignment.
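As a rough illustration of that calculation, the sketch below (again a Python stand-in for the actual Kinect code) computes the angle at a middle joint from three 3D joint positions of the kind a skeletal tracker reports, using the law of cosines. The hip, knee, and ankle coordinates in the example are made up.

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by points a-b-c, from the law of
    cosines: cos(B) = (|ab|^2 + |bc|^2 - |ac|^2) / (2 * |ab| * |bc|)."""
    ab = math.dist(a, b)
    bc = math.dist(b, c)
    ac = math.dist(a, c)
    cos_b = (ab**2 + bc**2 - ac**2) / (2 * ab * bc)
    cos_b = max(-1.0, min(1.0, cos_b))  # guard against floating-point drift
    return math.degrees(math.acos(cos_b))

# Made-up hip, knee, and ankle positions (meters) for a leg bent at a right angle:
hip, knee, ankle = (0.0, 1.0, 0.0), (0.0, 0.5, 0.0), (0.0, 0.5, 0.5)
print(round(joint_angle(hip, knee, ankle), 1))  # 90.0, within tolerance of the 90-degree target
```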
Rector chose the Kinect software because it is open source and easily accessible. She says it does have some limitations, however, in the level of detail with which it tracks movement. She and her team plan to make this software available online so users can download it and easily begin learning yoga. They are also continuing their research with additional projects that help with fitness.
The findings[8] of the study were published in the conference proceedings of the Association for Computing Machinery’s SIGACCESS[9] International Conference on Computers and Accessibility.
Image 2 (below): An incorrect Warrior II yoga pose is outlined showing angles and measurements. Using geometry, the Kinect reads the angles and responds with a verbal command to raise the arms to the proper height. Credit: Kyle Rector, UW
References
- ^ April Flowers (blogs.redorbit.com)
- ^ yoga (www.redorbit.com)
- ^ blind (www.redorbit.com)
- ^ Kyle Rector (homes.cs.washington.edu)
- ^ Eyes-Free Yoga (dub.washington.edu)
- ^ Microsoft Kinect (www.redorbit.com)
- ^ Yoga For The Blind (www.redorbit.com)
- ^ findings (homes.cs.washington.edu)
- ^ SIGACCESS (www.sigaccess.org)