A PERCEPTUAL DESYNCHRONIZATION STUDY OF MANUAL AND FACIAL INFORMATION IN FRENCH CUED SPEECH

Emilie Troille, Marie-Agnès Cathiard & Christian Abry
Département ICP-Parole et Cognition de GIPSA-Lab

ID 1237

French Cued Speech, adapted from American Cued Speech, disambiguates lipreading through a manual code of hand keys, allowing deaf perceivers to recover more accurate phoneme identification. Using movement tracking of the manual and facial actions coproduced in CS, Attina et al. evidenced a significant anticipation of the hand relative to the lips. In this study we tested the natural temporal integration of this bimodal hand-face communication system, using a desynchronization paradigm to evaluate the robustness of CS to temporal decoherence. Our results, obtained with 17 deaf subjects, demonstrate that hand gestures can be delayed relative to the lips without consequences for perception, as long as the delay does not push the hand outside the visible articulatory phase of the consonant constriction state. Perceptual coherence, or the recomposition of coherence (recoherence), depends crucially on the compatibility of hand and mouth states, i.e. on the timing patterns evidenced in the preceding production studies.