Research into the neurobiological mechanisms of morphosyntactic processing of language has suggested specialised systems for decomposition and storage, which are used flexibly during the processing of complex polymorphemic words (such as those formed through affixation, e.g., boy + s = noun + plural marker, or boy + ish = noun + attenuator). However, the neural underpinnings of the acquisition of novel morphology remain unknown. We implicitly trained participants on new derivational affixes through a word–picture association task and investigated the neural processes underlying the formation of memory traces for the new affixes. The participants' brain activity was recorded using magnetoencephalography (MEG) as they passively listened to the newly trained and untrained suffixes combined with real-word and pseudoword stems. The MEG recording was repeated with the same stimuli after a night's sleep, to test the effects of overnight consolidation. Newly trained suffixes combined with real stems elicited stronger source activity in the left inferior frontal gyrus (LIFG) at ∼50 msec after suffix onset than untrained suffixes, suggesting that memory traces for the newly learned suffixes had formed already on the same day. The following day, the suffix learning effect spread to the left superior temporal gyrus (STG), where it was again manifest as a response enhancement, particularly at ∼200–300 msec after suffix onset, which may reflect an additional effect of overnight consolidation. Overall, the results demonstrate rapid and dynamic processes of both immediate build-up and longer-term consolidation of neocortical memory traces for novel morphology, taking place after a short period of exposure and involving the fronto-temporal perisylvian language circuitry.
In the healthy human brain, the processing of language is strongly lateralised, usually to the left hemisphere, whereas the processing of complex non-linguistic sounds recruits brain regions bilaterally. Here we asked whether the anterior temporal lobes, strongly implicated in semantic processing, are critical to this special treatment of spoken words. Nine patients with semantic dementia (SD) and fourteen age-matched controls underwent magnetoencephalography and structural MRI. Voxel-based morphometry demonstrated the stereotypical pattern of SD: severe grey matter loss restricted to the anterior temporal lobes, with the left side more affected. During magnetoencephalography, participants listened to word sets in which identity and meaning were ambiguous until word completion, for example PLAYED versus PLATE. Whereas left-hemispheric responses were similar across groups, patients demonstrated increased right-hemisphere activity 174–294 msec after stimulus disambiguation. Source reconstructions confirmed recruitment of right-sided analogues of language regions in SD: atrophy of the anterior temporal lobes was associated with increased activity in the right temporal pole, middle temporal gyrus, inferior frontal gyrus and supramarginal gyrus. Overall, the results indicate that the anterior temporal lobes are necessary for normal and efficient lateralised processing of word identity by the language network.
Here we investigate whether the well-known laterality of spoken language to the dominant left hemisphere could be explained by the learning of sensorimotor links between a word's articulatory program and its corresponding sound structure. A human-specific asymmetry of acoustic-articulatory connectivity is evident structurally, at the neuroanatomical level, in the arcuate fascicle, which connects superior-temporal and frontal cortices and is more developed in the left hemisphere. Because these left-lateralised fronto-temporal fibres provide a substrate for auditory-motor associations, we hypothesised that learning of acoustic-articulatory coincidences produces laterality, whereas perceptual learning does not. Twenty subjects studied a large (n = 48) set of novel meaningless syllable combinations, pseudowords, in a perceptual learning condition, where they carefully listened to repeatedly presented novel items, and, crucially, in an articulatory learning condition, where each item had to be repeated immediately, so that articulatory and auditory speech-evoked cortical activations coincided. In the 14 subjects who successfully completed the learning routine and could reliably recognise the learnt items, both perceptual and articulatory learning led to an increase of pseudoword-elicited event-related potentials (ERPs), reflecting the formation of new memory circuits. Importantly, after articulatory learning, pseudoword-elicited ERPs were more strongly left-lateralised than after perceptual learning. Source localisation confirmed that perceptual learning led to increased activation in superior-temporal cortex bilaterally, whereas items learnt in the articulatory condition activated bilateral superior-temporal auditory areas in combination with left precentral motor areas. These results support a new explanation of the laterality of spoken language, based on the neuroanatomy of sensorimotor links and Hebbian learning principles.