Expressive Crossmodal Affect Integration in Autism
eagle-i ID
http://ohsu.eagle-i.net/i/0000012c-3beb-6be1-cc1a-f59980000000
Resource Type
Properties
Related grant number
NIH 1R21DC010239-01
Resource Description
Children with autism spectrum disorder (ASD) have often been observed to express affect weakly, in only one modality at a time (e.g., choice of words), or in multiple modalities but not in a coordinated fashion. These difficulties in crossmodal integration of affect expression may have roots in global characteristics of brain structure in autism, specifically atypical interconnectivity between brain areas. Poor crossmodal integration of affect expression may also play a critical role in the communication difficulties that are well documented in ASD: not understanding how, for example, facial expression can modify the interpretation of words undermines social reciprocity. Impaired crossmodal integration of affect is thus a potentially powerful explanatory concept in ASD.

Software related to this project addresses the need for data on expressive crossmodal integration impairment in ASD, and its association with receptive crossmodal integration impairment, by using innovative technologies to create stimuli for a judgment procedure that makes independent assessment of the individual modalities possible; these technologies are critical because human observers cannot selectively filter out modalities. In addition, the vocal measures and the audiovisual database lay the essential groundwork for the next step: creation of audiovisual analysis methods for automated assessment of expressive crossmodal integration. These methods are applied to audiovisual recordings of a structured play situation in which the child participates twice, once with a caregiver and once with an examiner. This procedure for measuring expressive crossmodal integration is complemented by a procedure for measuring crossmodal integration of affect processing using dynamic talking-face stimuli in which the audio and video streams are recombined (preserving perfect synchrony of the facial and vocal channels) to create stimuli with congruent vs. incongruent affect expression.
Used by
Center for Spoken Language Understanding
Website(s)
http://projectreporter.nih.gov/project_info_description.cfm?aid=7712656&icde=3630892
http://www.ohsu.edu/xd/education/schools/school-of-medicine/departments/basic-science-departments/biomedical-engineering/center-for-spoken-language-understanding/expressive-crossmodal-affect.cfm?WT_rank=1
Developed by
van Santen, Jan P.H., Ph.D.
Black, Lois M., Ph.D.
Software license
Open source software license