Neural systems underlying spatial language in American Sign Language
A [¹⁵O]water PET experiment was conducted to investigate the neural regions engaged in processing constructions unique to signed languages: classifier predicates, in which the position of the hands in signing space schematically represents spatial relations among objects. Ten deaf native signers viewed line drawings depicting a spatial relation between two objects (e.g., a cup on a table) and were asked either to produce a classifier construction or an American Sign Language (ASL) preposition describing the spatial relation, or to name the figure object (colored red). Compared to naming objects, describing spatial relationships with classifier constructions engaged the supramarginal gyrus (SMG) in both hemispheres, whereas naming spatial relations with ASL prepositions engaged only the right SMG. Previous research indicates that retrieval of English prepositions engages both the right and left SMG, but more inferiorly than ASL classifier constructions do. Compared to ASL prepositions, naming spatial relations with classifier constructions engaged left inferior temporal (IT) cortex, a region also activated when naming concrete objects in either ASL or English. Left IT may be engaged because the handshapes in classifier constructions encode information about object type (e.g., a flat surface). Overall, the results suggest greater right hemisphere involvement when expressing spatial relations in ASL, perhaps because signing space is used to encode the spatial relationship between objects.
Neuroimage. 2002 Oct;17(2):812-24.
Salk Institute for Biological Studies, La Jolla, California 92037, USA. emmorey@salk.edu