Recognizing Non-Manual Signs in Turkish Sign Language
Date
2019
Publisher
IEEE
Abstract
Recognition of non-manual components in sign language has been a neglected topic, partly due to the absence of annotated non-manual sign datasets. We have collected a dataset of videos of non-manual signs, displaying facial expressions and head movements, and prepared frame-level annotations. In this paper, we present the Turkish Sign Language (TSL) non-manual signs dataset and provide a baseline system for non-manual sign recognition. A deep learning based recognition system is proposed, in which a pre-trained ResNet Convolutional Neural Network (CNN) is employed to recognize the question, negation (side-to-side), negation (up-down), affirmation, and pain movements and expressions. Our subject-independent method achieves 78.49% overall frame-level accuracy on 483 TSL videos performed by six subjects who are native TSL signers. Predictions of consecutive frames are filtered for the qualitative analysis.
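The abstract mentions filtering the predictions of consecutive frames for the qualitative analysis, but does not specify the filter. A minimal sketch of one plausible choice, a sliding-window majority vote over per-frame class predictions, is shown below; the window size and the label names are illustrative assumptions, not taken from the paper.

```python
from collections import Counter

# Hypothetical labels for the five non-manual sign classes named in the
# abstract (the actual label encoding used by the authors is not given).
LABELS = ["question", "negation_side_to_side", "negation_up_down",
          "affirmation", "pain"]

def smooth_predictions(frame_preds, window=5):
    """Smooth per-frame class predictions with a centered majority vote.

    frame_preds: list of class indices, one per video frame.
    window: odd window size (assumed here; not specified in the paper).
    Returns a list of the same length with isolated flips suppressed.
    """
    half = window // 2
    smoothed = []
    for i in range(len(frame_preds)):
        lo, hi = max(0, i - half), min(len(frame_preds), i + half + 1)
        # Most frequent class in the window wins; ties go to the class
        # that appears first in the window (Counter insertion order).
        smoothed.append(Counter(frame_preds[lo:hi]).most_common(1)[0][0])
    return smoothed
```

For example, a single-frame misclassification inside a stable run is removed: `smooth_predictions([0, 0, 1, 0, 0, 0, 2, 2, 2, 2])` yields `[0, 0, 0, 0, 0, 0, 2, 2, 2, 2]`, while the genuine transition from class 0 to class 2 is preserved.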
Keywords
Facial Expression Recognition, Sign Language Recognition, Non-Manual Sign Analysis
Source
9th International Conference on Image Processing Theory, Tools and Applications (IPTA) -- NOV 06-09, 2019 -- Istanbul, TURKEY
