Correa Hernandez, Pedro
[UCL]
Czyz, Jacek
[UCL]
Umeda, Toshiyuki
[UCL]
Marques, Ferran
[]
Macq, Benoît
[UCL]
This paper presents a novel technique for 2D human motion estimation using a single non-calibrated camera. The user's five extremities (head, hands and feet) are extracted, labeled and tracked after silhouette segmentation. As they are the minimal number of points that can be used to characterize human gestures [8], we will henceforth refer to these features as crucial points. The crucial point candidates are defined as the local maxima, along the silhouette boundary, of the geodesic distance to the center of gravity of the actor region (or silhouette). To disambiguate the selected crucial points into the head, left foot, right foot, left hand and right hand classes, we propose a Bayesian method that combines a prior human model with the dynamics of the tracked crucial points. The system runs at 50 Hz on standard personal computers and has proved robust in the presence of noisy data and self-occlusions, with an average error rate of 4% under those conditions.
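As an illustration of the crucial-point extraction step, the sketch below (in Python, using NumPy and OpenCV) finds the boundary pixels that are local maxima of the geodesic distance from the silhouette's center of gravity. The function names, the 4-connected breadth-first search and the neighborhood size are illustrative assumptions, not taken from the paper; the Bayesian labeling step is not shown.

import numpy as np
import cv2
from collections import deque


def geodesic_distance(mask, seed):
    """Breadth-first geodesic distance (in pixels) from `seed`,
    constrained to the foreground pixels of the binary `mask`."""
    h, w = mask.shape
    dist = np.full((h, w), np.inf)
    dist[seed] = 0.0
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] \
                    and np.isinf(dist[ny, nx]):
                dist[ny, nx] = dist[y, x] + 1.0
                queue.append((ny, nx))
    return dist


def crucial_point_candidates(mask, window=15):
    """Return (x, y) boundary pixels that are local maxima of the
    geodesic distance from the silhouette's center of gravity."""
    ys, xs = np.nonzero(mask)
    centroid = (int(ys.mean()), int(xs.mean()))
    if not mask[centroid]:
        # Snap the centroid to the nearest foreground pixel (simplification).
        i = np.argmin((ys - centroid[0]) ** 2 + (xs - centroid[1]) ** 2)
        centroid = (int(ys[i]), int(xs[i]))
    dist = geodesic_distance(mask, centroid)

    # Outer contour of the silhouette, ordered along the boundary.
    contours, _ = cv2.findContours(mask.astype(np.uint8),
                                   cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    boundary = max(contours, key=cv2.contourArea).reshape(-1, 2)  # (x, y)
    profile = np.array([dist[y, x] for x, y in boundary])

    # Local maxima of the geodesic-distance profile along the closed contour
    # (plateaus may yield several adjacent candidates).
    n = len(profile)
    candidates = []
    for i in range(n):
        neigh = profile[np.arange(i - window, i + window + 1) % n]
        if np.isfinite(profile[i]) and profile[i] == neigh.max():
            candidates.append(tuple(boundary[i]))
    return candidates

In a full pipeline, the candidates returned for each frame would then be labeled as head, hands and feet and tracked over time, which is the role of the Bayesian model described in the abstract.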


Bibliographic reference
Correa Hernandez, Pedro ; Czyz, Jacek ; Umeda, Toshiyuki ; Marques, Ferran ; Macq, Benoît. SILHOUETTE-BASED 2D MOTION CAPTURE FOR REAL-TIME APPLICATIONS. IEEE-ICIP 2005 (Genoa, Italy, from 11/09/2005 to 14/09/2005). In: Proceedings of the IEEE International Conference on Image Processing 2005 (ICIP), 2005, p. III-836-9
Permanent URL
http://hdl.handle.net/2078.1/92332