
Brain & Development 26 (2004) 146–150

www.elsevier.com/locate/braindev

Review

The neurology of sign language


Neil Gordon*
Huntlywood, 3 Styal Road, Wilmslow SK9 4AE, UK
*Tel.: +44-1625-525437; e-mail: neil-gordon@doctors.org.uk
Received 4 October 2002; received in revised form 24 June 2003; accepted 24 June 2003

Abstract
Forms of sign language have developed in a number of countries. American Sign Language, which originated from French signing, has been most extensively researched. As sign language is based on gestures executed in space and perceived visually, it might be thought that it would mainly be a function of the right cerebral hemisphere when this is the non-dominant one. A number of studies are reviewed showing that sign language is a language in its own right and that, as with spoken language, its primary site of organization is therefore in the dominant hemisphere. This does not mean that there is not a significant contribution from the other hemisphere, with an interplay between the two. Each research project usually contributes some facet of knowledge apart from its main conclusions. These include the importance of distinguishing signs from gestures; the localization of different types of signing within the dominant left cerebral hemisphere; the fact that lesions of the non-dominant right hemisphere, although not causing a loss of signing, will result in dyspraxia; and the finding that aphasic symptoms of signing and speech are not modality dependent but reflect a disruption of language processes common to all languages. Examples are given of discoveries made by the use of the newer neuroradiological techniques such as functional magnetic resonance imaging and positron emission tomography, and no doubt these will lead to further advances in knowledge. The use of sign language in the treatment of patients with verbal aphasia is considered, especially in children with the Landau–Kleffner syndrome, but therapy of this kind can also be used in children with delayed language development, and in other types of acquired aphasia at any age. Treatments other than signing, such as cochlear implants, may be increasingly used in the future, but it seems likely that sign language will continue to be a dominant feature of deaf culture.
© 2003 Elsevier B.V. All rights reserved.
Keywords: Neurology of sign language; Use in treatment

1. Introduction
The history of the development of sign language goes
back for more than 400 years, so it is not surprising that a
complicated culture of the deaf has arisen [1]. What is
surprising is that until a few years ago there were bitter
arguments between those who favoured oral language only
when communicating with those who were severely deaf
and the supporters of signing. Some of the former would not
allow the use of sign language, which never seemed to be
very logical, as surely an input of language, whatever form
it took, should be of benefit. A form of sign language has
developed in at least 13 European countries, but some of
these are mutually unintelligible and may have little or no
similarity to the local spoken language. Most research
has been done on American Sign Language, which
originated from French signing about two hundred years
ago, and this has shown that it is indeed a fully-fledged language that can be used in everyday conversation, intellectual argument, wit and poetry [2]. In the case of wit, punning is a good example: it has been studied among users of American Sign Language, and depends on the similarity between signs which have different meanings. In the case of poetry, the subjects studied were those associated with the National Theater of the Deaf who used the same sign language, and the analysis was based on the flow of movement between signs and the patterns of movement [3].
Also, the study of the neurology of signing can help to
discover the mysteries of language in general [4]. For
instance, Petitto and Marentette [5] showed that deaf infants of deaf parents, whose only language exposure was to American Sign Language, engage in a form of babbling quite like the babbling of hearing infants. At around the age of one the former produced manual activities different from the typical gesturing of normal children at this age, showing that babbling is most
probably related to the abstract linguistic structure of
language rather than the speech modality per se.

2. Neuronal basis for sign language


In most instances there is no doubt that the left cerebral
hemisphere dominates the organization of spoken language,
and that the right hemisphere is mainly concerned with
visual-spatial functions. Therefore, as sign language is based
on gestures executed in space and perceived visually, it
might have been thought that it would be mainly a function
of the right side, but this does not seem to be the case. Also
in studying such functions sign language must be clearly
distinguished from gesture. Metz-Lutz et al. [6] reported a
congenitally deaf child, aged five, with focal epilepsy. He
showed specific impairment in French Sign Language but an
amazing performance in miming, which supports an early
dissociation of the two systems, verbal and non-verbal. This
does not mean that the right cerebral hemisphere does not
have a part to play in the functional organization of sign
language, just as it does with spoken language, for example in
automatic speech such as repetition. Neuroimaging studies
have shown that in the case of sign language the right
cerebral hemisphere is involved in such areas as discourse
comprehension, prosodic functions, and the direct
encoding of objects [7]. Also, in deaf signers damage to
the right hemisphere may not result in aphasia but will cause
gross impairment of non-linguistic visual-spatial abilities,
and as with hearing subjects left hemisphere damage can
cause subtle difficulties of this kind in addition to the
linguistic effects [8]. As stated by Maratsos and Matheny [9]
language function is most unlikely to be a wholly
independent one.
Most right-handed people are left-hemisphere dominant for language, and Vaid et al. [10] tested the performance of the right and left hands of deaf native users of American Sign Language. They found that more signs were produced with the preferred hand when right- and left-handers were tested, but left-handers were more flexible in signing with their non-preferred hand; and more unusual patterns of hand use were found in a right-handed deaf signer after a left-hemisphere lesion, suggesting an increased use of the intact right hemisphere. Also, it has been found that although much of sign language function is organized in the left cerebral hemisphere, even within these areas the exact localization of a lesion will result in different types of dysphasia, affecting, for example, vocabulary or grammar [11].
Damasio et al. [12] studied a hearing, strongly right-handed woman who had learnt American Sign Language at the age of eighteen. Left intracarotid injection of a barbiturate, the Wada test, was used before and after a right temporal lobectomy for intractable epilepsy. Neuropsychological and anatomical asymmetries favoured left cerebral dominance for auditory-based language. Single photon emission tomography showed lateralized activity of the left Broca's and Wernicke's areas for spoken language. During the Wada test these areas would be inactivated, and the patient was rendered aphasic for both English and sign language, both before and after the operation. The results therefore suggest that structures in the left cerebral hemisphere subserve language in a visuo-spatial as well as an auditory system, rather than depending on speech sounds or the form of the signal.
Similar findings were recorded in a hearing 19-year-old
raised by deaf parents who suffered a head injury. There was
severe right hemiplegia. He had no evidence of apraxia or
visual-spatial deficits, but was aphasic for the spoken word
and for sign language [13]. In contrast, an adult patient with a right cerebral vascular lesion had no disturbance of sign language or finger spelling [14].
In a study of language processing in various forms
MacSweeney et al. [15] used neuroimaging with functional
MRI to explore the perception of British Sign Language in
nine hearing and nine congenitally deaf native signers while
they performed a British Sign Language sentence-acceptability task. Eight hearing, non-signing subjects performed
an analogous task that involved audio-visual English
sentences. They found that there were modality-independent patterns, with regions activated by the sign language among the deaf and by spoken English among the hearing.
These included the inferior prefrontal regions bilaterally and
the superior temporal regions bilaterally; and lateralization
patterns were similar for the two languages. There was no
evidence of enhanced right hemisphere recruitment for the
sign language processing in comparison with audio-visual
English. In relation to modality-specific patterns, audio-visual speech in hearing subjects generated greater activation in the primary and secondary auditory cortices than the sign language
in the deaf signers, whereas the sign language generated
enhanced activation in the posterior occipito-temporal
regions, reflecting the greater movement component of
sign language. The influence of hearing status on the recruitment of sign language processing systems was then investigated by comparing deaf and hearing adults who had British Sign Language as their first language (native signers), and the former showed greater activation in the left superior temporal gyrus in response to this language. This suggests that the region may be privileged for processing heard speech in the hearing native signers, and that in the absence of auditory input it can be recruited for visual processing. In fact it shows that these regions constitute the core language system regardless of language modality and hearing status.
Research investigating the neural organization of signed
language and spatial cognition in deaf people by Hickok
et al. [16] has shown the same hemispheric asymmetries
found in those with normal hearing and speech. However,
the hemispheric organization of grammatical aspects of
language seems to be independent of the modality, and to be
unaffected by the significant degree of visual-spatial processing used in sign language. Evidence from cerebral
lesions among deaf signers suggests that there is a high
degree of similarity in the neural organization of signed and
spoken language within the left cerebral hemisphere, and
that any slight differences that may occur are due to
variations in input–output systems. For example, a
congenitally deaf left-handed man suffered a stroke in
later life, resulting from a left middle cerebral artery
thrombosis. Apart from the right hemiplegia, he lost his
ability to communicate with sign language, so that even though he was strongly left-handed it is reasonable to suppose that his language function was located in the left cerebral hemisphere [17].
Chiarello et al. [18] investigated a sixty-five-year-old prelingually deaf woman who was fluent in American Sign Language. She had suffered a left parietal infarct involving the supramarginal and angular gyri, which resulted in aphasia for sign language as well as for written and finger-spelled English. By five to seven weeks afterwards there was fluent but paraphasic signing, anomia, impaired comprehension and repetition, and alexia and agraphia with elements of neologistic jargon. In general the patient's sign errors showed a consistent disruption in the structure of American Sign Language which paralleled the speech errors of oral aphasic patients. It was concluded that most of the aphasic symptoms were not modality dependent, but reflected a disruption of linguistic processes common to all human languages.
Three hearing adults studied by Anderson et al. [19]
suffered from severe aphasia, but became competent in
certain aspects of American Sign Language and finger
spelling, although virtually unable to speak the English
counterparts of the signs. Two had cerebrovascular
accidents and one herpes simplex encephalitis. The former
suffered damage to the left postero-lateral temporal and
inferior parietal cortices, and they mastered the production
and comprehension of simple signs and short meaningful
sequences, but the other, with damage to most of the left
temporal cortex, was less accurate in single sign processing
and was unable to produce sequences at all. The findings
suggested that conceptual knowledge is represented independently of the auditory-vocal records for the corresponding lexical entries, and that the areas damaged in the third patient are part of the neural network supporting the links between conceptual knowledge and linguistic signs, especially as they are used in sequencing. In other words,
dysfunction of the primary auditory-vocal system of
communication does not predict dysfunction in alternative
visual-gestural systems.

3. The therapeutic use of sign language


If a child is unable to communicate with spoken language,
whether the cause is congenital or acquired, the possibility
of granting the opportunity to do this with the use of sign language must be considered. In a survey of up to 40 schools
and units for autistic and aphasic children in the UK,
published in 1984 [20], it was found that the majority used
some form of augmentative system. The use of the British
Sign Language was favoured for autistic children and the
Paget Gorman Sign System for those with aphasia. The
results were very variable, but were better for the latter.
In the case of delayed language development there seems
to be no reason why the use of sign language should not be
introduced at an early age. Von Tetzchner [21] reported a
3-year-old boy who was not talking due to right temporal
agenesis. He was taught sign language, and 6 months later
showed a marked improvement in spoken language.
In patients who lose the ability to speak due to acquired aphasia, for example in childhood epileptic aphasia, or the Landau–Kleffner syndrome, sign language may provide an alternative means of communication. Perez et al. [22] reported such a child. A boy between the ages of 3
and 4 years began to respond less to auditory stimuli, and
deafness was suspected. Then seizures started and this
syndrome was diagnosed, and by the age of 6 the child was
mute. He subsequently developed considerable skills in
using signing, and to evaluate this his performance at the
age of 13 years 6 months was compared with that of a child
with congenital deafness. This showed that he had acquired
the same proficiency as the other child, and that this had not
impeded the return of spoken language, and may have even
enhanced recovery. This must mean that sign language was
processed in spared higher language areas, and supports the
idea that it does not impair the acquisition of spoken speech.
Woll and Sieratzki [23] point out that when sign language, which depends mainly on spatial features, is used in the treatment of a condition such as the Landau–Kleffner syndrome, affected children are not using representations of spoken language but a quite separate language, which is likely to be as accessible and natural to them as to severely deaf children; and this seems to be the case. Sieratzki et al.
[24] maintain, as a result of functional magnetic resonance
imaging studies, that this syndrome is due to a hyper-intense
cortical response to speech input rather than a disconnection
of cortical speech areas due to epileptic activity [25], and
believe that all affected children should be given the
opportunity to learn sign language. If the processing
requirements of language can trigger paroxysmal activity
which blocks the use of spoken language, and the patient
can learn signing, this must surely mean that this form of
communication can be acquired without access to the
primary system in the dominant hemisphere.
In her studies of children with the Landau–Kleffner
syndrome Bishop [26] found that there were strikingly
similar patterns of performance between children who were
profoundly deaf and those with this syndrome, probably due
to both having deviant comprehension because of their
reliance on the visual modality to learn grammar. If, in both
instances, language comprehension is deviant rather than delayed, this must be taken into account in assessing the
progress of these children.
Also, as quoted, people with acquired aphasia can be
helped to communicate with the use of signs when they
cannot use the spoken word [19]. The success of this may
depend on the type of language deficit. For example, Kirshner and Webb [27] reported an adult who, after a bitemporal infarction, showed a severe disturbance of
auditory comprehension with relative sparing of the
comprehension of printed or gestured symbols; and her
ability to communicate was greatly aided by instruction in
Amerind (American Indian Signs) and American Sign
Language. It has been found that the gestures of the former
were easier to imitate than the matched signs of the latter.
The contribution of the right cerebral hemisphere was small
in both instances, and the sign and verbal systems showed
evidence of dissociation [28]. These findings have been confirmed by Guilford et al. [29], although they found no differences between the systems. It was not possible to predict a subject's level of acquisition of sign language from his expressive speech characteristics alone. Also, the more severe the aphasia, the more difficult it will be for the
patient to learn sign language [30,31,32], and success
depends particularly on three factors: the degree to which a sign's physical or structural characteristics make its meaning evident, the motor complexity of the signs, and their
linguistic function (noun, verb or adjective). This has
obvious implications when selecting a sign vocabulary [33].

4. Conclusions
The importance of sign language to the world of the deaf
is shown very clearly in Seeing Voices, a fascinating book
by Oliver Sacks [34]. The complexities of the deaf culture are
described in a way that gives insight to those who have no
concept of what it may be like to be deprived of auditory
input. This culture must be respected but efforts must still be
made to overcome the disability, for example with the use of
more sophisticated digital hearing aids, and operative
treatment.
There has been much criticism of cochlear implantation
from the deaf community, especially in the case of
prelingually deaf children. This is mainly on the grounds
either that it does not work, or that there are severe
complications, or that it impoverishes language development, or that it is destroying a linguistic minority. However,
there is increasing evidence that the assessment of infants'
hearing is becoming more accurate and that the operation is
now safer and more successful. Such success is of particular
importance when there is improved development of speech
reception, language acquisition and reading comprehension,
so that such children can take their place in a hearing
environment on equal terms [35]. However, the results at
this stage are limited. Preisler et al. [36] investigated patterns of communication of 22 deaf children, aged 2–5
years old, who had used implants for between 1 and 3.5
years. The majority of them could take part in simple oral
conversations in a well-known context, but they did not take
an active role in fantasy play with their peers who did not
use sign language, and mostly interacted with signing
adults. In fact they were still considered to be socially deaf
and did not show any improvement in social competence.
Obviously, longer follow-up studies are needed as there will
be many factors determining future development.
The most important findings in support of signing being a
language in its own right are anatomical. A number of these
investigations have been reviewed, and two further studies
by Poizner and his colleagues [37,38] are of special
importance. They assessed four deaf signers, three of
whom had damage to the left cerebral hemisphere and one
to the right. The patients were tested for dyspraxia and pantomime recognition; the latter patient, with right-hemisphere damage, was not apraxic, as might have been expected, but all the others were aphasic for sign language, and strong dissociations emerged between their capacities for sign language and their non-linguistic
motor skills. Therefore the language deficits of these
patients related to the specific language components of
sign language rather than to an underlying motor disorder or
to an inability to express or comprehend symbols. However, it has been argued that there may also be a disordered motor
component when it comes to choosing oral musculature for
speech or hand muscles for signing [39]. There is surely
nothing in this which is incompatible with the ideas on
modularity in a discussion by Marshall [40]. In this he states
that it is not unreasonable to admit that a module can cut
across sensory boundaries.
Studies of congenital and acquired deafness and
dysphasia have already contributed to the knowledge of
cerebral localization and adaptability, and will continue to
do so. Although the results of examinations of patients with
brain lesions may be limited, it is certain that the use of
techniques such as functional magnetic resonance imaging
and positron emission tomography will solve many of the
problems [41]. For example, Corina and McBurney [42]
confirmed, by using cortical stimulation mapping in deaf signers, that Broca's area of the dominant
hemisphere was specialized for sign production. Also, by
means of functional magnetic resonance imaging in deaf
signers there was no doubt about the importance of
dominant-hemisphere language structures, but the right hemisphere made an important contribution to the processing of signing as well. In fact it is easy to
oversimplify hemisphere specialization between language
and visual-spatial functions when in sign language there is
an interplay between them [43].

References
[1] Rée J. I see a voice. A philosophical history of language, deafness and the senses. London: Harper Collins; 1999.
[2] Marshall JC. Signs of language in the brain. Nature 1986;322:307–8.
[3] Klima ES, Bellugi U. Wit and poetry in American Sign Language. Sign Lang Studies 1975;8:203–24.
[4] Tranel D. Neurology of language. Curr Opin Neurol Neurosurg 1992;5:77–82.
[5] Petitto LA, Marentette PF. Babbling in the manual mode: evidence for the ontogeny of language. Science 1991;251:1493–6.
[6] Metz-Lutz MN, de Saint Martin A, Monpiou S, Massa R, Hirsch E, Marescaux C. Early dissociation of verbal and nonverbal gestural ability in an epileptic deaf child. Ann Neurol 1999;46:929–32.
[7] Ronnberg J, Soderfeldt B, Risberg J. The cognitive neuroscience of signed language. Acta Psychol 2000;105:237–54.
[8] Hickok G, Kirk K, Bellugi U. Hemispheric organization of local- and global-level visuospatial processes in deaf signers and its relation to sign language aphasia. Brain Lang 1998;65:276–86.
[9] Maratsos M, Matheny L. Language specificity and elasticity: brain and clinical syndrome studies. Annu Rev Psychol 1994;45:487–516.
[10] Vaid J, Bellugi U, Poizner H. Hand dominance for signing: clues to brain lateralization of language. Neuropsychologia 1989;27:949–60.
[11] Bellugi U, Poizner H, Klima ES. Brain organization for language: clues from sign aphasia. Hum Neurobiol 1983;2:155–70.
[12] Damasio A, Bellugi U, Damasio H, Poizner H, Van Gilder J. Sign language aphasia during left-hemisphere amytal injection. Nature 1986;322:363–5.
[13] Meckler RJ, Mack JL, Bennett R. Sign language aphasia in a non-deaf-mute. Neurology 1979;29:1037–40.
[14] Kimura D, Davidson W, McCormick CW. No impairment in sign language after right-hemisphere stroke. Brain Lang 1982;17:359–62.
[15] MacSweeney M, Woll B, Campbell R, McGuire PK, David AS, Williams SC, et al. Neural systems underlying British sign language and audio-visual English processing in native users. Brain 2002;125:1583–93.
[16] Hickok G, Bellugi U, Klima ES. The basis of the neural organization for language: evidence from sign language aphasia. Rev Neurosci 1997;8:205–22.
[17] Underwood JK, Paulson CJ. Aphasia and congenital deafness: a case study. Brain Lang 1981;12:285–91.
[18] Chiarello C, Knight R, Mandel M. Aphasia in a prelingually deaf woman. Brain 1982;105:29–51.
[19] Anderson SW, Damasio H, Damasio AR, Klima E, Bellugi U, Brandt JP. Acquisition of signs from American sign language in hearing individuals following left hemisphere damage and aphasia. Neuropsychologia 1992;30:329–40.
[20] Kiernan C, Reid B. The use of augmentative communication systems in schools and units for autistic and aphasic children in the United Kingdom. Br J Disord Commun 1984;19:47–61.
[21] von Tetzchner S. Facilitation of early speech development in a dysphasic child by the use of signed Norwegian. Scand J Psychol 1984;25:265–75.
[22] Perez ER, Davidoff V. Sign language in childhood epileptic aphasia (Landau–Kleffner syndrome). Dev Med Child Neurol 2001;43:739–44.
[23] Woll B, Sieratzki JS. Sign language for children with acquired aphasia. J Child Neurol 1996;11:347–9.
[24] Sieratzki JS, Calvert GA, Brammer M, David A, Woll B. Accessibility of spoken, written, and sign language in Landau–Kleffner syndrome: a linguistic and functional MRI study. Epileptic Disord 2001;3:79–89.
[25] Gordon N. The Landau–Kleffner syndrome: increased understanding. Brain Dev 1997;19:311–6.
[26] Bishop DV. Comprehension of spoken, written and signed sentences in childhood language disorders. J Child Psychol Psychiatry 1982;23:1–20.
[27] Kirshner HS, Webb WG. Selective involvement of the auditory-verbal modality in an acquired communication disorder: benefit from sign language therapy. Brain Lang 1981;13:161–70.
[28] Daniloff JK, Fritelli G, Buckingham HW, Hoffman PR, Daniloff RG. Amer-Ind versus ASL: recognition and imitation in aphasic subjects. Brain Lang 1986;28:95–113.
[29] Guilford AM, Scheuerle J, Shirek PG. Manual communication skills in aphasia. Arch Phys Med Rehabil 1982;63:601–4.
[30] Peterson LN, Kirshner HS. Gestural impairment and gestural ability in aphasia: a review. Brain Lang 1981;14:333–48.
[31] Coelho CA, Duffy RJ. The relationship of the acquisition of manual signs to severity of aphasia: a training study. Brain Lang 1987;31:328–45.
[32] Coelho CA. Acquisition and generalization of simple manual sign grammars by aphasic subjects. J Commun Disord 1990;23:383–400.
[33] Coelho CA, Duffy RJ. Effects of iconicity, motoric complexity, and linguistic function on sign acquisition in severe aphasia. Percept Mot Skills 1986;63:519–30.
[34] Sacks O. Seeing voices. London: Picador; 1991.
[35] Rubinstein JT. Paediatric cochlear implantation: prosthetic hearing and language development. Lancet 2002;360:483–5.
[36] Preisler G, Tvingstedt AL, Ahlstrom M. A psychosocial follow-up study of deaf preschool children using cochlear implants. Child Care Health Dev 2002;28:403–18.
[37] Poizner H, Kaplan E, Bellugi U, Padden CA. Visual-spatial processing in deaf brain-damaged signers. Brain Lang 1984;3:281–306.
[38] Poizner H, Bellugi U, Iragui V. Apraxia and aphasia for a visual-gestural language. Am J Physiol 1984;246:R868–83.
[39] Kimura D. Neural mechanisms in manual signing. Sign Lang Studies 1981;33:291–312.
[40] Marshall JC. Multiple perspectives on modality. Cognition 1984;17:209–42.
[41] Hickok G, Bellugi U, Klima ES. Sign language in the brain. Sci Am 2001;284:58–65.
[42] Corina DP, McBurney SL. The neural representation of language in users of American Sign Language. J Commun Disord 2001;34:455–71.
[43] Bellugi U, Klima ES, Poizner H. Sign language and the brain. Res Publ Assoc Res Nerv Ment Dis 1988;66:39–56.
