The blue-c system can recognize a simple pointing gesture performed with one or both hands and use it as an additional tracking device.
Our pointing gesture is modelled as the line of sight connecting the user's eyes and the fingertip of the pointing hand (see the blue lines in the figures below). The algorithm operates on the silhouettes already extracted for the visual hull computation from three horizontally aligned cameras and one overhead camera. The computational cost is therefore low, and the tracker runs in real time even at high frame rates.
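The line-of-sight model can be sketched as follows: given the 3D world coordinates of the eyes and the fingertip, the pointing ray is simply the line from the eye point through the fingertip. This is a minimal illustration, not the blue-c implementation; the function name and coordinates are assumptions for the example.

```python
import numpy as np

def pointing_ray(eye, fingertip):
    """Model the pointing gesture as the line of sight from the user's
    eye point through the fingertip (both in world coordinates).
    Returns the ray origin and a unit direction vector."""
    eye = np.asarray(eye, dtype=float)
    fingertip = np.asarray(fingertip, dtype=float)
    direction = fingertip - eye
    n = np.linalg.norm(direction)
    if n == 0.0:
        raise ValueError("eye and fingertip coincide")
    return eye, direction / n

# Hypothetical example: eyes at 1.7 m height, fingertip out in front
origin, d = pointing_ray([0.0, 0.0, 1.7], [0.3, 0.4, 1.7])
```

Applications receiving the ray can then intersect it with scene geometry to determine what the user is pointing at.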
Simple assumptions about the visibility of points of interest, such as the head and the outstretched arm, and about how they appear in the silhouettes are used to reconstruct the origin and direction of the pointing gesture in the world frame. Since the cameras are calibrated, epipolar geometry solves the correspondence problem between the views for these points of interest. The world coordinates of the user's eyes and pointing fingertip are then computed and passed to the applications.
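Once a point of interest has been matched across two calibrated views, its world coordinates can be recovered by triangulation. The sketch below uses standard linear (DLT) triangulation with hypothetical projection matrices; it illustrates the general technique, not the specific blue-c code.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation: given the 3x4 projection matrices of
    two calibrated cameras and the pixel coordinates of the same point
    of interest (e.g. the fingertip) in both views, recover the point's
    world coordinates."""
    # Each view contributes two linear constraints on the homogeneous point.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize

# Hypothetical setup: camera 1 at the origin, camera 2 shifted along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.2, 0.1, 2.0])
x1 = X_true[:2] / X_true[2]                  # projection into camera 1
x2 = (X_true[:2] + [-1.0, 0.0]) / X_true[2]  # projection into camera 2
X = triangulate(P1, P2, x1, x2)              # recovers X_true
```

In practice the epipolar constraint restricts the search for the matching silhouette point to a single line in the other view, which keeps the correspondence step cheap.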
Future research will focus on reconstructing the complete body pose in order to detect a richer set of gestures.