MotionSavvy UNI Portable Sign Language Interpreter System

Summary

A team of deaf technologists and innovators has launched a company called MotionSavvy and developed a product called UNI — a portable sign language interpreter system available for preorder on IndieGogo. The product has been reviewed by MedGadget, TechCrunch, and Time Magazine.

Challenges

Real-time translation systems have been developed for spoken languages, but this is the first portable device designed to assist with sign language communication. Here are some of the challenges the developers will face:

  • Hands-Free Operation. In some of the company videos and literature, the device user is shown holding the device in one hand and signing with the other. However, many signs require both hands.
  • Wide Area Analysis. The UNI is based on LEAP Motion technology, which analyzes hand gestures in a small area above the sensor (see the sketch after this list). Sign language often uses a wider space around the person.
  • Body Analysis. Sign language uses the body. The LEAP Motion technology only tracks hand movement.
  • Placement. The placement in 3D space of a sign relative to the signer also has meaning. The LEAP Motion technology doesn’t seem to account for this.
  • Facial Expressions. A facial expression can significantly change the meaning of a sign. For example, pointing while pursing the lips (as if whistling) indicates that something is close, while pointing with an open mouth and a different facial expression indicates that something is far away.
  • Similar Hand Shapes. Some signs use the same hand shape but differ in hand location and orientation, and that difference changes the meaning of the sign. The LEAP Motion technology will have a difficult time identifying these nuances.
  • Hidden Signs. In some signs, the fingers are hidden from the viewer, but familiarity with sign language and the context of the conversation make the meaning clear. For example, the sign for hungry is made by curving the open hand as if holding an invisible cup, touching the middle of the chest, and moving the hand down. This is done quickly, and any software that relies on seeing finger placement will not be able to track the fingers for such signs.
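
For a sense of how narrow the sensor's view is, here is a minimal sketch of polling per-frame hand data with the LEAP Motion v2 Python SDK. The names used (the Leap module, Leap.Controller, frame.hands, palm_position, is_left) follow that SDK's documented API; this is not MotionSavvy's code, and it does no sign recognition, it only prints where the hands are.

```python
import sys
import time

import Leap  # LEAP Motion v2 Python SDK


def main():
    controller = Leap.Controller()

    # Poll the most recent tracking frame a few times per second and print
    # the palm position (in millimetres relative to the sensor) of each hand.
    while True:
        frame = controller.frame()
        for hand in frame.hands:
            side = "left" if hand.is_left else "right"
            pos = hand.palm_position
            print("%s hand: palm at (%.1f, %.1f, %.1f), %d fingers tracked" % (
                side, pos.x, pos.y, pos.z, len(hand.fingers)))
        time.sleep(0.2)


if __name__ == "__main__":
    try:
        main()
    except KeyboardInterrupt:
        sys.exit(0)
```

Everything the device reports lives in this small volume above the sensor, which is the constraint behind the Wide Area Analysis and Body Analysis points above.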

Despite these limitations, the technology does show promise. As long as the signer limits their language to vocabulary recognizable by the device, it should serve in a simple capacity. Presumably, future versions of this technology, many years from now, will place a camera on the floor or at a sufficient distance to see the upper torso and surrounding area of the signer. Perhaps some kind of x-ray technology could be used for ‘seeing’ hands regardless of any obstruction.
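
To make the limited-vocabulary point concrete, below is a minimal sketch of how a recognizer might match the kind of hand data such a sensor reports against a small set of known signs. The feature layout, the toy template vocabulary, and the nearest-neighbor matching are illustrative assumptions, not MotionSavvy's actual method or any part of the UNI software.

```python
import numpy as np

# Hypothetical feature vector built from the kind of data a hand-tracking
# sensor reports: palm position (x, y, z), palm normal (x, y, z), and five
# fingertip positions relative to the palm (15 values), 21 numbers in total.
FEATURE_SIZE = 21

# Placeholder templates for a tiny vocabulary: one feature vector per known
# sign. In a real system these would be learned from recorded examples.
rng = np.random.default_rng(0)
SIGN_TEMPLATES = {
    "hello":  rng.random(FEATURE_SIZE),
    "thanks": rng.random(FEATURE_SIZE),
    "hungry": rng.random(FEATURE_SIZE),
}


def extract_features(palm_pos, palm_normal, fingertips):
    """Flatten palm position, palm normal, and palm-relative fingertip positions."""
    palm_pos = np.asarray(palm_pos, dtype=float)
    rel_tips = [np.asarray(tip, dtype=float) - palm_pos for tip in fingertips]
    return np.concatenate([palm_pos, np.asarray(palm_normal, dtype=float), np.ravel(rel_tips)])


def classify(features, max_distance=1.5):
    """Nearest-neighbor match against the known vocabulary.

    Returns None when nothing is close enough: signs outside the small
    template set are simply not recognized.
    """
    best_sign, best_dist = None, float("inf")
    for sign, template in SIGN_TEMPLATES.items():
        dist = np.linalg.norm(features - template)
        if dist < best_dist:
            best_sign, best_dist = sign, dist
    return best_sign if best_dist <= max_distance else None


# Example: classify one observed hand pose (values here are arbitrary).
observed = extract_features(
    palm_pos=[0.0, 150.0, 0.0],            # millimetres above the sensor
    palm_normal=[0.0, -1.0, 0.0],
    fingertips=[[i * 10.0, 160.0, 20.0] for i in range(5)],
)
print(classify(observed))                  # a sign name, or None
```

Because the feature vector includes the palm's position and orientation, two signs that share a hand shape but differ in location or orientation produce different vectors, which is the distinction raised under Similar Hand Shapes above; a recognizer built on hand shape alone could not separate them, and anything outside the template set simply goes unrecognized.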

Videos

Below are videos about the product, the team, and their development process.

Below is a video about the technology incubator center called the LEAP-AXLR8R, where MotionSavvy is based.

“The LEAP Axlr8r is a unique program designed for developers, designers and founders interested in reinventing industries through gesture based technology. We will provide design guidance, access to industry expertise, access to LEAP engineers for software and hardware development and business design expertise to help turn your product into a massively scalable business that changes the way people interact with the world forever.” (source)

LEAP Motion is a device that tracks hand motion for use in many different applications as shown below.
