DI-UMONS: Institutional Repository of the University of Mons

2017-06-30 - Conference/Paper in peer-reviewed proceedings - English - 8 page(s)

Tits Mickaël, Tilmanne Joëlle, Dutoit Thierry, "Morphology Independent Feature Engineering in Motion Capture Database for Gesture Evaluation" in 4th International Conference on Movement Computing, London, United Kingdom, 2017

  • CREF codes: Engineering sciences (DI2000)
  • UMONS research units: Circuit Theory and Signal Processing (F105)
  • UMONS institutes: NUMEDIART Institute for Digital Art Technologies (Numédiart)

Abstract:

In the recent domain of motion capture and analysis, a new challenge has been the automatic evaluation of skill in gestures. Many methods have been proposed for gesture evaluation based on feature extraction, skill modeling, and gesture comparison. However, movements can be influenced by many factors other than skill, including morphology, and these influences make comparisons between gestures of different people difficult. In this paper, we propose a new method based on constrained linear regression to remove the influence of morphology on motion features. To validate our method, we compare it to a baseline method consisting of a scaling of the skeleton data. Results show that our method outperforms previous work both in removing the influence of morphology on features and in improving the relation between features and skill. For a set of 326 features extracted from two datasets of Taijiquan gestures, we show that our method completely removes the influence of morphology for 100% of the features, whereas the baseline method only achieves a limited reduction of morphology influence for 74% of the features. Our method improves correlation with expert-assessed skill by 0.04 (p<0.0001) on average for 98% of the features, against 0.001 (p=0.68) for 58% of the features with the baseline method. Our method is also more general than previous work, as it could potentially be applied to any feature with any interindividual factor.
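
The abstract does not spell out the regression itself, so the following is only a loose illustration of the general idea of regression-based morphology removal, not the authors' constrained formulation: a motion feature is regressed on a morphology descriptor across performers by ordinary least squares, and the morphology-explained component is subtracted. All variable names and the toy data are assumptions made for the sketch.

```python
# Hypothetical sketch: remove a linear morphology trend from a per-performer
# motion feature via ordinary least squares (the paper uses a constrained
# linear regression, which is not reproduced here).
import numpy as np

def remove_morphology_influence(feature, morphology):
    """Return the feature with its linear morphology component removed.

    feature    : (n_performers,) feature value per performer
    morphology : (n_performers,) morphology descriptor, e.g. body height
    """
    # Design matrix with slope and intercept terms
    X = np.column_stack([morphology, np.ones_like(morphology)])
    coef, *_ = np.linalg.lstsq(X, feature, rcond=None)
    predicted = X @ coef                         # morphology-explained part
    return feature - predicted + feature.mean()  # residual, mean restored

# Toy usage: taller performers show systematically larger joint speeds.
rng = np.random.default_rng(0)
height = rng.uniform(1.5, 1.95, size=50)         # body height in metres
skill = rng.uniform(0.0, 1.0, size=50)           # toy expert rating
speed = 0.8 * height + 0.3 * skill + rng.normal(0, 0.02, 50)

corrected = remove_morphology_influence(speed, height)
print(np.corrcoef(height, speed)[0, 1])      # strong morphology correlation
print(np.corrcoef(height, corrected)[0, 1])  # near zero after correction
```

In this toy setting the residual feature is uncorrelated with height while the skill contribution remains, which mirrors the goal the abstract describes: removing the interindividual morphology factor without destroying the feature's relation with skill.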