Just because Microsoft’s Kinect doesn’t support sign language anymore doesn’t mean engineers have to forget about the idea entirely. If anything, it leaves room for others to develop new ways for systems to understand it. Courtesy of engineers from the University of Washington comes the MobileASL project, designed so that people who are deaf or hard of hearing can use their cell phones with their primary method of communication: sign language.
While texting may work for some, not everyone enjoys doing it. Shocking, we know. The MobileASL project, which is beginning its testing phase now, seems to work quite well over existing 3G networks; there’s no word on how well it would perform at slower speeds. The system connects to your smartphone and uses standard video capture, but prioritizes image quality around the face and hand gestures while reducing the data rate to only 30 kilobytes per second. Additionally, it uses motion detection to determine whether or not the user in front of the camera is actually signing at any given moment.
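The motion-detection idea described above can be sketched in a few lines: compare successive frames, and only transmit at a high frame rate when enough pixels are changing (i.e., the user appears to be signing). This is a minimal illustration, not MobileASL's actual algorithm; the frame format, thresholds, and function names here are all assumptions.

```python
# Hypothetical sketch of frame-difference activity detection: transmit at a
# high frame rate while the user is signing, drop to a low rate when idle.
# All values (thresholds, frame rates) are illustrative assumptions.

def frame_difference(prev, curr):
    """Fraction of pixels whose intensity changed by more than a small delta."""
    changed = sum(1 for p, c in zip(prev, curr) if abs(p - c) > 10)
    return changed / len(curr)

def choose_frame_rate(prev, curr, signing_threshold=0.05):
    """Return a target frame rate: high while signing, low while idle."""
    return 12 if frame_difference(prev, curr) > signing_threshold else 1

# Toy 8-pixel grayscale "frames": nearly identical (idle) vs. large motion (signing)
idle_prev, idle_curr = [100] * 8, [102] * 8
sign_prev, sign_curr = [100] * 8, [100, 200, 30, 180, 100, 90, 210, 40]

print(choose_frame_rate(idle_prev, idle_curr))  # → 1 (idle: save bandwidth)
print(choose_frame_rate(sign_prev, sign_curr))  # → 12 (signing: full quality)
```

A scheme like this is what makes the low data rate plausible: when nobody is signing, almost no video needs to be sent at all.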
There’s no definitive word on when something like this could see a mainstream release, but at least the idea is there. Turning your device’s camera into a tool that supports American Sign Language is a great idea, and it will give people who need it another, more comfortable option for communicating.