Detecting Autism From Head Movements Using Kinesics

Date

2024

Publisher

Association for Computing Machinery (ACM)

Green Open Access

Yes

Publicly Funded

No

Impulse

Average

Influence

Average

Popularity

Average

Abstract

Head movements play a crucial role in social interactions. Quantifying communicative movements such as nodding, shaking, orienting, and backchanneling is important for behavioral and mental health research. However, automatically localizing such head movements within videos remains challenging in computer vision because of their arbitrary start and end times, durations, and frequencies. In this work, we introduce a novel and efficient coding system for head movements, grounded in Birdwhistell's kinesics theory, that automatically identifies basic head motion units such as nodding and shaking. Our approach first defines the smallest unit of head movement, termed a kine, based on the anatomical constraints of the neck and head. We then quantify the location, magnitude, and duration of kines within each angular component of head movement. By defining possible combinations of identified kines, we obtain a higher-level construct, the kineme, which corresponds to basic head motion units such as nodding and shaking. We validate the proposed framework by predicting autism spectrum disorder (ASD) diagnosis from video recordings of interacting partners. We show that the multi-scale property of the framework provides a significant advantage: collapsing behavior across temporal scales consistently reduces performance. Finally, we incorporate another fundamental behavioral modality, speech, and show that distinguishing between speaking-time and listening-time head movements significantly improves ASD classification performance.
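
The abstract describes the coding pipeline only at a high level. As a rough illustration of how elementary movement units (kines) might be segmented from a single angular component of head rotation and then grouped into a nod-like kineme, consider the Python sketch below. All function names, thresholds, and parameter values here (extract_kines, label_kinemes, min_amplitude_deg, min_alternations) are illustrative assumptions, not the coding system defined in the paper.

# Illustrative sketch only: the segmentation rule, thresholds, and names below
# are assumptions for demonstration, not the paper's actual coding system.
import numpy as np

def extract_kines(angle_deg, fps, min_amplitude_deg=2.0):
    """Segment one head-rotation angle series (e.g., pitch, in degrees) into
    elementary movement units ("kines"): maximal runs of motion in a single
    direction, described by their location, magnitude, and duration."""
    velocity = np.diff(angle_deg)        # frame-to-frame angular change
    direction = np.sign(velocity)        # +1 upward, -1 downward, 0 still
    kines, start = [], 0
    for t in range(1, len(direction) + 1):
        # close the current run when the direction changes or the series ends
        if t == len(direction) or direction[t] != direction[start]:
            magnitude = abs(angle_deg[t] - angle_deg[start])
            if direction[start] != 0 and magnitude >= min_amplitude_deg:
                kines.append({
                    "start_frame": start,                 # location
                    "direction": int(direction[start]),
                    "magnitude_deg": float(magnitude),
                    "duration_s": (t - start) / fps,
                })
            start = t
    return kines

def label_kinemes(kines, min_alternations=2):
    """Group consecutive kines with alternating direction into a higher-level
    unit ("kineme"); on the pitch axis this roughly corresponds to a nod."""
    kinemes, run = [], kines[:1]
    for prev, cur in zip(kines, kines[1:]):
        if cur["direction"] == -prev["direction"]:
            run.append(cur)
        else:
            if len(run) > min_alternations:
                kinemes.append(run)
            run = [cur]
    if len(run) > min_alternations:
        kinemes.append(run)
    return kinemes

# Toy example: a synthetic 2 Hz pitch oscillation (nod-like movement) at 30 fps.
fps = 30
t = np.arange(0.0, 3.0, 1.0 / fps)
pitch = 8.0 * np.sin(2.0 * np.pi * 2.0 * t)
kines = extract_kines(pitch, fps)
print(len(kines), "kines grouped into", len(label_kinemes(kines)), "kineme(s)")

In the paper's actual framework, kines are defined per angular component (e.g., pitch, yaw, roll) under anatomical constraints and analyzed at multiple temporal scales; the simple alternation rule above stands in for the kineme definitions purely for illustration.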

Description

Yankowitz, Lisa/0000-0003-2604-5840; Gokmen, Muhittin/0000-0001-7290-199X; Schultz, Robert/0000-0001-9817-3425; Zampella, Casey/0000-0002-7973-8520

Keywords

Movements, Kinesics, Computer Vision, Psychology, Autism

WoS Q

N/A

Scopus Q

N/A

OpenCitations Citation Count

N/A

Source

Companion Proceedings of the International Conference on Multimodal Interaction (ICMI Companion 2024), November 4-8, 2024, San Jose, Costa Rica

Start Page

350

End Page

354

PlumX Metrics

Citations

CrossRef: 2

Scopus: 3

PubMed: 1

Captures

Mendeley Readers: 8

SCOPUS™ Citations

3

Web of Science™ Citations

3

Page Views

246

OpenAlex FWCI

2.10840169
