4 February 2013
Finger tracking for hand-held device interface using profile-matching stereo vision
Hundreds of millions of people use hand-held devices frequently and control them by touching the screen with their fingers. When this method of operation is used while driving, the probability of accidents and fatalities increases substantially. With a non-contact control interface, users do not need to touch the screen; as a result, they can pay less attention to their phones and drive more safely than they otherwise would. Such an interface can be achieved with real-time stereo vision. A novel Intensity Profile Shape-Matching Algorithm obtains 3-D information from a pair of stereo images in real time. Although the algorithm trades accuracy for processing speed, the results show that its accuracy is sufficient for the practical recognition of human poses and the tracking of finger movement. By choosing an interval of disparity, an object within a certain distance range can be segmented; in other words, the object is detected by its distance from the cameras. The advantage of this profile shape-matching algorithm is that correspondence detection relies on the shape of the intensity profile rather than on absolute intensity values, which are subject to lighting variations. Based on the resulting 3-D information, the movement of fingers in space at a specific distance can be determined. Finger location and movement can then be analyzed for non-contact control of hand-held devices.
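The abstract does not give the details of the Intensity Profile Shape-Matching Algorithm, so the following Python sketch only illustrates the general ideas stated above: matching mean-removed, normalized 1-D intensity profiles along a scanline (so the match depends on profile shape rather than absolute intensity), segmenting an object by an interval of disparity, and picking a crude fingertip candidate. The window size, disparity search range, cosine-similarity score, and topmost-pixel heuristic are illustrative assumptions, not the authors' method.

```python
import numpy as np

def match_profile_shape(left_row, right_row, x, half_win=8, max_disp=64):
    """Estimate disparity at column x of a rectified scanline by comparing
    the *shape* of small 1-D intensity profiles (mean-removed, normalized),
    so a uniform brightness offset or gain cancels out. Illustrative
    stand-in only, not the paper's exact matcher."""
    def shape(profile):
        p = profile.astype(np.float64)
        p = p - p.mean()                       # remove brightness offset
        n = np.linalg.norm(p)
        return p / n if n > 0 else p           # remove contrast/gain
    ref = shape(left_row[x - half_win:x + half_win + 1])
    best_d, best_score = 0, -np.inf
    for d in range(max_disp + 1):              # search along the epipolar line
        xr = x - d
        if xr - half_win < 0:
            break
        cand = shape(right_row[xr - half_win:xr + half_win + 1])
        score = float(np.dot(ref, cand))       # cosine similarity of shapes
        if score > best_score:
            best_d, best_score = d, score
    return best_d

def segment_by_disparity(disparity, d_min, d_max):
    """Keep only pixels whose disparity lies in [d_min, d_max], i.e. objects
    inside the corresponding distance range from the cameras."""
    return (disparity >= d_min) & (disparity <= d_max)

def fingertip_from_mask(mask):
    """Crude fingertip locator: topmost pixel of the segmented hand region
    (assumes the finger points upward in the image)."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    i = int(np.argmin(ys))
    return int(xs[i]), int(ys[i])              # (column, row) of the fingertip
```

In this sketch, running match_profile_shape over every pixel of a rectified stereo pair would yield a disparity map; segment_by_disparity then isolates the hand at the chosen working distance, and fingertip_from_mask returns a point whose motion over successive frames could drive a non-contact interface.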