
Browsing by Subject "kieli" (Finnish: "language")


  • Grünthal, Alva (2019)
    Objectives. Sign languages are perceived and produced through different modalities than spoken languages. This has led to differing viewpoints on the status of sign languages: are they human languages at all, or are they fully comparable to spoken languages? Modality differences have been an interesting topic in research on the brain mechanisms of sign languages. The brain mechanisms of spoken languages are based mainly in the left hemisphere, especially in its perisylvian areas; the roles of the left posterior superior temporal sulcus and the left inferior frontal gyrus in particular have been researched. This review discusses the similarities and differences between the brain mechanisms of sign language and spoken language on the basis of current research.

    Methods. This study was conducted as a literature review. The studies and books were found through Google Scholar and Helka Finna using, for example, the search term "sign language brain". Some references for the introduction were also searched through Google. In addition, the suggestion and citation functions of Google Scholar were used, and further articles were discovered by manually searching the reference lists of the studies found.

    Results and conclusions. Studies on the brain mechanisms of sign language consistently found that, as with spoken language, the processing of sign language is based mainly in the left hemisphere, especially in its perisylvian areas. The left inferior frontal gyrus was engaged in the production of both spoken and signed languages. In comprehension, the left superior temporal sulcus, the left superior temporal gyrus, and the left inferior frontal gyrus were engaged. The results suggested that the auditory cortex in the temporal lobe is a constituent part of these brain mechanisms even though sign languages are not perceived auditorily. On the other hand, the studies noted the relevance of the visuospatial component to the brain mechanisms of sign language, and sign language was observed to be perceived more bilaterally than spoken language: the more spatial processing was required, the more activation increased in the right hemisphere and the parietal lobes. More research is needed on the distinction and overlap between sign language and gesture, and the role of the basal ganglia should be studied further in languages of different modalities. The similarities between the brain mechanisms of sign language and spoken language suggest that language is not merely a set of sensorimotor mechanisms but has a deeper neural function in the brain.