The AssistGlove with the Cmod

For the Digilent Design Contest, one of the teams created an AssistGlove that senses hand movements to enable intuitive sign-language communication. The team's Instructable describes how they used the chipKIT Cmod for this project.

You will need a chipKIT Cmod, a PmodBT2, and a PmodACL2. If you want to expand the project further and add speech output, you can use a ZYBO.

You’ll need to use Bluetooth in conjunction with the Cmod and the Pmod accelerometer. Measurements are taken, and the X-axis reading is sent over Bluetooth to the PC. The received values are displayed, and the Cmod’s LEDs blink at different frequencies depending on whether the glove was moved up or down (i.e., whether X is positive or negative).

(Image: assistglove-blinky — the Cmod’s LEDs blinking during the demo)
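The up-or-down decision described above boils down to mapping the sign of the X reading to a blink rate. Here is a minimal Python sketch of that idea; the dead-band threshold and the blink periods are assumptions for illustration, not the team's actual values.

```python
def blink_period_ms(x_accel):
    """Map an accelerometer X-axis reading to an LED blink period.

    The threshold and periods below are hypothetical; the original
    project's exact values are not given in the post.
    """
    DEADBAND = 0.05   # ignore small readings near zero (assumed)
    FAST_MS = 100     # blink fast when moved up (X positive)
    SLOW_MS = 500     # blink slow when moved down (X negative)
    IDLE_MS = 0       # 0 means the LED stays off

    if x_accel > DEADBAND:
        return FAST_MS
    elif x_accel < -DEADBAND:
        return SLOW_MS
    return IDLE_MS
```

On the Cmod itself, the equivalent logic would live in the sketch's main loop, toggling the LED pins with the chosen period while the X value streams out over the PmodBT2.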

Some of the other endeavors on this yet-to-be-completed project are more ZYBO-focused. These include a debugging project, connecting the ZYBO to the PmodWiFi, writing a Linux application for the ZYBO, and then compiling a speech synthesizer library.

Let us know if you’ve worked on this project!

Author

Amber Mear

I was the Digilent blog editor, and now I'm a contributor. I love learning about wearables and writing about social issues in STEM. Outside of work, I can be found watching Netflix with my cat, working on an art project, or trying to find new, delicious local foods.


