Where is rhythm processed in the brain?

Subjects. Participants in the study were five musicians, each with at least an undergraduate university degree in music, and five non-musicians with no music training or performance experience beyond childhood.

Stimuli and Task. The experimental tasks included conditions in which participants discriminated the rhythmic elements of pattern, meter, and tempo (see Figure 1).

Procedure. Each subject subsequently performed nine PET trials: two trials each of the pattern task, meter task, tempo task, and melody task, as well as one rest trial.

Results

Activity for Tempo, Meter, and Pattern Combined. The mean activity for all subjects during performance of the three musical rhythm tasks of meter, tempo, and pattern, minus rest, exhibited a distributed set of locations (see Table 1 and Figure 2).

Table 1. Stereotactic MNI coordinates, Z-score values, and anatomical and Brodmann areas for activations during the Pattern, Meter, and Tempo conditions combined, contrasted to rest.

Activity for Pattern. In an analysis of the pattern task, as compared to rest, activity was observed in right areas of the middle temporal gyrus (BA 22), superior temporal gyrus (BA 22, 42, 41), and transverse temporal gyrus (BA 42) (see Figure 3 and Table 2).

Table 2. Stereotactic MNI coordinates, Z-score values, and anatomical and Brodmann areas for activations during the Pattern condition contrasted to rest.

Activity for Meter. In an analysis of the meter task, as compared to rest, activity was observed in right hemispheric regions of the inferior frontal gyrus (BA 44, 9), precentral gyrus (BA 6, 44), and middle frontal gyrus (BA 46) (see Figure 4 and Table 3).

Table 3. Stereotactic MNI coordinates, Z-score values, and anatomical and Brodmann areas for activations during the Meter condition contrasted to rest.

Table 4. Stereotactic MNI coordinates, Z-score values, and anatomical and Brodmann areas for activations during the Tempo condition contrasted to rest (see Figure 5).

Melody. In an analysis of the melody task, as compared to rest, foci were detected in the right insula, right postcentral gyrus (BA 40), right claustrum, and right superior temporal gyrus (see Figure 6).

Table 5. Stereotactic MNI coordinates, Z-score values, and anatomical and Brodmann areas for activations during the Melody condition contrasted to rest.

Comparing Musicians vs. Non-Musicians. We conducted direct group contrasts between musicians and non-musicians, but no clusters reached significance at our pre-determined statistical criteria (cluster-wise correction; see Experimental Section, Imaging Analysis).

Discussion

Common Mechanisms for Pattern, Meter, and Tempo. Common to the processing required for pattern, meter, and tempo in this study are the elements of (a) performing a sensory analysis of the sounds; and (b) representing the beat structure, and possibly the phrase structure, implicit in those sounds, in either an auditory or motor-sensory form.

Distinct Mechanisms for Pattern, Meter, and Tempo. Distinctly different brain activity was elicited in each of the conditions for pattern, meter, and tempo processing.

Pattern. Processing pattern involves representing temporal intervals at each local time point that vary across segments and that must be linked at a higher level of organisation.

Meter. Meter is an abstract mathematical subdivision of time into events like pulses, without specified musical content or phrased rhythmic content.

Tempo. Processing tempo in this study primarily requires representing the change in rate of sounds per unit of time.

Specific Mechanisms for Melody. Performing our melody task likely involves processing the elements of pitch interval, pitch height, melodic contour, tonal centre, phrase structure, and rhythm, consequently activating emotional and various cognitive mechanisms.

General Discussion. One striking feature of these exploratory findings is that pattern, tempo, and meter elicit different distributed neural mechanisms.

Motor-Sensory Representations. Prior research has demonstrated spontaneous activity in motor-related systems when musicians listen to music and when non-musicians passively listen to musical rhythms [20].

Cerebellum. We observed activity in the right posterior cerebellum (crus I) common to meter, pattern, and tempo.

Conclusions. The findings of this exploratory study outline some of the distinct neural components of rhythmic structure that may be present in complex interactions when those elements are blended in typical music experiences.

Author Contributions: L.

Conflicts of Interest: The authors declare no conflict of interest.

References

Koelsch S. Toward a neural basis of music perception—A review and updated model.
McDermott J. Music Perception.
Desain P. Rhythm: Perception and Production.
Hallam S. Oxford Handbook of Music Psychology.
Alluri V. Large-scale brain networks emerge from dynamic processing of musical timbre, key, and rhythm.
Fujisawa T. The perception of harmonic triads: An fMRI study. Brain Imaging Behav.
Garza Villarreal E. Distinct neural responses to chord violations: A multiple source analysis study. Brain Res.
Klein M. A role for the right superior temporal sulcus in categorical perception of musical chords.
Lee Y. Investigation of melodic contour processing in the brain using multivariate pattern-based fMRI.
Parsons L. Pitch discrimination in cerebellar patients: Evidence for a sensory deficit.
Kohlmetz C. Selective loss of timbre perception for keyboard and percussion instruments following a right temporal lesion.
Peretz I. Brain organisation for music processing.
Stewart L. Music and the brain: Disorders of musical listening.
Grube M. Dissociation of duration-based and beat-based auditory timing in cerebellar degeneration.
Teki S. Distinct neural substrates of duration-based and beat-based auditory timing.
Yeston M. The Stratification of Musical Rhythm.
Fitch W. Rhythmic cognition in humans and animals: Distinguishing meter and pulse perception.
White J. The Analysis of Music.
Platel H. The structural components of music perception. A functional anatomical study.
Chen J.
Bengtsson S. Listening to rhythms activates motor and pre-motor cortex.
Kornysheva K. Brain Mapp.
Grahn J. Neural bases of individual differences in beat perception.
Limb C. Left hemispheric lateralization of brain activity during passive rhythm perception in musicians. A Discov.
Chapin H. Neural responses to complex auditory rhythms: The role of attending.
Rao S. The evolution of brain activation during temporal processing.
Finding and feeling the musical beat: Striatal dissociations between detection and prediction of regularity.
Seger C. Corticostriatal contributions to musical expectancy perception.
Coull J. Dissociating explicit timing from temporal expectation with fMRI.
Neural substrates of mounting temporal expectation. PLoS Biol.
Lewis P. Brain activation patterns during measurement of sub- and supra-second intervals.
Smith A. A right hemispheric frontocerebellar network for time discrimination of several hundreds of milliseconds.
Tecchio F. Conscious and preconscious adaptation to rhythmic auditory stimuli: A magnetoencephalographic study of human brain responses.
Boundaries of separability between melody and rhythm in music discrimination: A neuropsychological perspective.
Foxton J.
Povel D. Perception of temporal patterns. Music Percept.
Processing of local and global musical information by unilateral brain-damaged patients.
Di Pietro M. Receptive amusia: Temporal auditory processing deficit in a professional musician following a left temporo-parietal lesion.

Contribution of different cortical areas in the temporal lobes to music processing.
Geiser E.

The noisy signals of fMRI are even more difficult, compounded by the absence of intuitive or perceptible rhythms in the data. The rhythm analysis follows from the analysis of repetition just described: as the repetitions were counted, the intervals between each occurrence of any repeating sequence (motif) at each length from 4 to 11 images were also recorded.

Where motifs recur more than twice, we compare the intervals between repetitions. If these intervals are equal, then that motif is recurring rhythmically. The presence of these congruent intervals as a proportion of all intervals then serves as one index of rhythmicity.

Note that as certain intervals recur more frequently, other intervals become relatively rarer. This tilt toward rhythmicity is therefore reflected in the standard deviation of the set of counts of occurrences of each interval. This value can then be compared to the standard deviations of random surrogates derived from null network variations generated for each subject.
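A minimal sketch of these two indices, assuming the recurrence times of a single motif are available as an array of image indices and that the null-network surrogates supply matching arrays (all names here are hypothetical, and the exact bookkeeping of the interval counts is an interpretation of the text):

```python
import numpy as np

def rhythmicity_indices(recurrence_times):
    """Congruent-interval proportion and spread of interval counts for one motif.

    recurrence_times: image indices at which the motif occurs.  If a few
    interval lengths dominate, the motif recurs rhythmically: the counts are
    uneven (high standard deviation) and many intervals are congruent
    (equal to at least one other interval).
    """
    intervals = np.diff(np.sort(recurrence_times))       # gaps between successive occurrences
    if intervals.size < 2:
        return 0.0, 0.0
    counts = np.bincount(intervals)                      # occurrences of each interval length
    counts = counts[counts > 0]
    congruent = counts[counts > 1].sum() / counts.sum()  # proportion of congruent intervals
    return congruent, counts.std()

def exceeds_null(observed_std, surrogate_times, q=0.95):
    """Permutation-style comparison against null-network surrogates."""
    null_stds = [rhythmicity_indices(t)[1] for t in surrogate_times]
    return observed_std > np.quantile(null_stds, q)
```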

These values are then compared across the two experimental conditions. The search for harmony rests on a more tentative approach. A harmonic signal comprises a fundamental frequency accompanied by higher frequencies at integer multiples of that fundamental. Together these higher frequencies form the harmonics of the fundamental; they are also called overtones or partials. A signal with this spectral structure, then, is harmonic.

In principle, harmonicity is easy to detect: the peak amplitude frequencies should be separated in frequency by constant differences.

However, with these data neither the fundamental nor the harmonic frequencies are known, and they are certainly not apparent from the noisy Fourier spectrum. Instead, we exploit the rhythm information just collected: the repeating intervals are easily converted into frequencies, and thus the histogram of intervals transforms into a histogram of frequencies. In effect, this is an alternative form of signal spectrum (not based in Fourier analysis), where the number of occurrences of each interval converts to an amplitude at the corresponding frequency.
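As a minimal sketch of that conversion, assuming one image is acquired every tr seconds (the repetition time is an assumption, not stated here), the interval counts can be recast as amplitudes at the corresponding frequencies:

```python
import numpy as np

def intervals_to_spectrum(intervals, tr=1.0):
    """Recast repetition intervals (in images) as a pseudo-spectrum.

    The count of each interval length k becomes the 'amplitude' at the
    frequency 1 / (k * tr); tr (seconds per image) is an assumed parameter.
    Returns a dense amplitude vector indexed by interval length, together
    with the frequency attached to each index.
    """
    intervals = np.asarray(intervals, dtype=int)
    amplitude = np.bincount(intervals)[1:]        # amplitude[k-1] = occurrences of interval k
    lengths = np.arange(1, amplitude.size + 1)
    freqs = 1.0 / (lengths * tr)                  # frequency for each interval length, in Hz
    return freqs, amplitude
```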

Then, we adapted the amplitude spectra to amplify the hidden harmonics. Specifically, the spectrum for each subject was downsampled by factors of 2 through 7, and the resultant vectors added to the original. Downsampling decreases the sampling rate by integer factors. For example, a vector downsampled by a factor of two comprises every second element of the original. The downsampling of a harmonic signal spectrum preserves peaks at the same point in each downsampling.

In effect, each harmonic peak is moved left by an integer factor, so lower and higher harmonics coincide. This is shown schematically in Figure 2. Adding the original and downsampled vectors amplifies the magnitude of the harmonics. Using this method, a maximum value of the summed vectors (original plus downsampled transforms) can be calculated. The position of this maximum in the vector is often interpreted as the fundamental frequency, but this is not necessary for the present analysis.
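A sketch of this downsample-and-sum step (in effect, a harmonic sum spectrum), assuming an amplitude vector whose index is proportional to frequency, as in the schematic of Figure 2; how the interval-derived spectrum is binned onto such a grid is glossed over here:

```python
import numpy as np

def harmonic_sum(amplitude, max_factor=7):
    """Add a spectrum to copies of itself downsampled by factors 2..max_factor.

    In a harmonic spectrum, peaks at f, 2f, 3f, ... land on the same index
    after downsampling by 2, 3, ..., so their contributions pile up in the sum.
    """
    amplitude = np.asarray(amplitude, dtype=float)
    summed = amplitude.copy()
    for factor in range(2, max_factor + 1):
        down = amplitude[::factor]            # keep every factor-th element
        summed[:down.size] += down            # align at the low-frequency end and add
    return summed

# The harmonicity score used below is the maximum of the summed vector; its
# position is often read as the fundamental, though that is not needed here.
score = harmonic_sum(np.random.rand(300)).max()
```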

Here we are interested in the maximum magnitude of this compounded amplitude. The technique is applied to sequences of each tested length (4 through 11 images). In effect, this analysis considers repeating sequences as oscillators, and groups oscillators by the length of the sequences that repeat.

Thus, we cast the net broadly in the hope that harmonic oscillation can emerge from the background of the inharmonic. These values were compared in the two experimental conditions.

Figure 2. Accordingly, if a harmonic signal spectrum is compressed to one half its original length (i.e., downsampled by a factor of two), its first harmonic peak moves to the position of the original fundamental.

Their sum thereby amplifies the presence of the harmonic partial. As this process is repeated for successive downsamples, harmonic partials are increasingly amplified. The presence of amplified peaks is thus the marker of harmonic oscillators. (A) An original spectrum of frequencies and amplitudes from Fourier analysis of choral singing. (B) The same spectrum, downsampled by 2. The 1st harmonic now coincides with the fundamental.

(C) The original spectrum, downsampled by 3. The second harmonic now corresponds with the original fundamental and the 1st harmonic. (D) Summing the original and the two downsamplings. Harmonicity is apparent in the sharp peaks in the summation. Note that the summed spectrum no longer represents the original frequency gradient, being a mix of different frequencies at every point.

The original fundamental frequency cannot be recovered from the summation, but we can determine that the fundamental is some integer multiple of the main summed peak. The same conclusion applies to the other peaks in (D), some of which could be fundamental frequencies, while some might be subharmonics of a higher frequency with a greater amplitude, and some might represent higher harmonics of a lower-frequency partial.

Several standard approaches are the obvious foils: These derive from Fourier analysis and include Wavelet decomposition and measurements of phase synchrony. Fourier analysis construes a signal as a superposition of sinusoids at various frequencies and phases.

Periodic signals are built on that basis, stretched and slipped along the time axis. Phase synchrony between two signals is calculated from instantaneous phase measurements (Lachaux et al.). The methods in this study, in contrast to Fourier and wavelet approaches, make fewer assumptions about the basis function to be tested against the target signal (in this case, a thematic profile).

We use segments of the thematic profile itself as basis functions, and analyze the entire signal against each of those segments. Every short sequence extracted from the signal is thus tested, along a sliding window, as a potential basis function for the whole signal; multiple basis functions are tested, all derived from the data itself. For each sequence, we measure its repetition, and from the timing of repetitions we calculate rhythm and harmony. As in the initial parcellation, significance is tested via permutations derived from the null networks.

In this study, the segments tested ranged from four to eleven images (seconds). This window was selected because, in general, the analysis is computationally intensive, requiring some selectivity in what can be feasibly explored. Sequences shorter than 4 s repeated densely in both the data and the permutations, rendering the comparison moot. At greater than 11 s, repetitions occurred only rarely, attenuating the comparison with the null permutations.
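A minimal sketch of this repetition count for a single window length, assuming the thematic profile is a one-dimensional array of integer theme labels, one per image (the variable names and example sizes are hypothetical):

```python
import numpy as np

def count_repetitions(profile, length):
    """Count repeated occurrences of every motif of a given length in a thematic profile.

    profile: 1-D array of integer theme labels, one per image.
    Returns a dict mapping each motif (as a tuple of labels) to the image
    indices at which it starts, keeping only motifs that occur more than once.
    """
    profile = np.asarray(profile)
    occurrences = {}
    for start in range(len(profile) - length + 1):     # sliding window over the profile
        motif = tuple(profile[start:start + length])   # exact-match key
        occurrences.setdefault(motif, []).append(start)
    return {m: idx for m, idx in occurrences.items() if len(idx) > 1}

# Lengths 4 through 11 images, as in the text; 7 themes and 360 images are illustrative.
profile = np.random.randint(0, 7, size=360)
repeats = {L: count_repetitions(profile, L) for L in range(4, 12)}
```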

The measurement of rhythm follows a similar strategy, namely examining the intervals between repetitions of repeating sequences. That periodicity in turn directly determines the frequency of the revealed rhythms, which can be tested for harmonic relations, as described above. In general, then, the methods here are both open-ended and data-driven, resting on the sequences that actually occur in the data, and therefore afford more opportunities for discovering temporal regularities, even transient ones.

In contrast, both Fourier analysis and Wavelet analysis make assumptions about the basis functions. For the FT this is of course the sine function; Wavelets can have many different shapes, but in all cases the analyst specifies the wavelet prior to the analysis. Arguably these assumptions could miss regularities that the methods here might detect.

On the other hand, both of the standard methods use basis functions stretched to various scales, and so in this sense are more receptive to regularities at multiple scales than the more constrained methods of this study. Phase synchrony is a powerful marker of functional relatedness (Varela et al.). However, it too rests on an a priori decision, namely the band-pass filtering of the target signals, necessary for determining instantaneous phase. The methods in the current study instead identify a range of frequencies involved in multiple rhythms and thus multiple harmonic relationships.

Once again, the data is doing the driving. One final rationale for the methods here is the analogy with music.

The concepts of rhythm and harmony employed here are strict analogs of standard usage in musicology. If there is something to be made of the comparison of brain activation and music, these are among the measures we would hope might apply (Lloyd).

In summary, the stages of the analysis are these: The starting point is the full pattern of all voxels for each of the brain images for each subject, in two experimental conditions.

These are grouped according to a modularity algorithm into themes. The resultant vector, or thematic profile, is the basis for subsequent analysis. To measure repetition, we examine short excerpts, or sequences (motifs), drawn from each thematic profile, separately considering all sequences from 4 to 11 s in length and counting the number of repeated occurrences of each sequence. To measure rhythm, we examine the intervals between repetitions of repeating sequences, counting the number of times specific inter-sequence intervals occur.

To measure harmonicity, we examine the frequencies of rhythmic repetitions, using the downsampling procedure described above to identify integer ratios between frequencies.

In this study, we observe the presence of the temporal counterpart to modularity (see Table 1). Within and across subjects, two widely used measures of modularity agreed that the temporal connectome has a modular architecture.

Following the Louvain group method, the modularity statistic, summarizing the degree to which the network can be subdivided into groups with high ingroup similarity and low outgroup similarity, averaged 0. Randomized null networks averaged 0. Pivoted toward time, these module-analogs are called themes. Analysis identified 7. These fundamental observations indicate that the progression of themes is structured, analogous to the modular architectures discovered when graph theory is applied to spatial networks.
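Such a comparison could be sketched with NetworkX as follows, treating each brain image (time point) as a node whose edges weight pairwise image similarity; this graph construction, the Louvain call, and the degree-preserving rewiring used as a null model are all assumptions here, not the study's exact procedure (louvain_communities requires NetworkX 2.8 or later):

```python
import networkx as nx
from networkx.algorithms import community as nx_comm

def louvain_q(G, seed=0):
    """Louvain partition of an image-similarity graph and its modularity Q."""
    themes = nx_comm.louvain_communities(G, weight="weight", seed=seed)
    return nx_comm.modularity(G, themes, weight="weight"), themes

def null_q_values(G, n_null=100, seed=0):
    """Q values for degree-preserving rewired surrogates (one plausible null model)."""
    values = []
    for i in range(n_null):
        H = G.copy()
        # Note: rewiring drops edge weights; the study's own null networks may differ.
        nx.double_edge_swap(H, nswap=2 * H.number_of_edges(),
                            max_tries=10 * H.number_of_edges(), seed=seed + i)
        values.append(louvain_q(H, seed=seed + i)[0])
    return values
```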

Table 1. Global modularity measures for brain images collected during two experimental conditions.

Subsequent questions, then, probe the origin and structure of the apparent temporal dynamics, collected in Table 2. Repetition was measured by simply comparing all the subsequences of 4–11 images in each thematic profile and counting exact matches throughout the profile.

This analysis found highly significant repetition for at least some subjects at all of these durations. On average, each subject displayed significant repetition for 6 sequence lengths. The sequence length exhibiting repetition in the largest subset of subjects was 6 s. Sequence lengths of 6 or 7 s were the most frequent repeaters. Rhythmicity was measured by examining the intervals between repetitions and counting the number of recurrences of each interval.

As with the repetitions, this was separately examined for sequences of lengths 4 through 11 s. These tabulations were compared with similar tabulations for surrogate data sets, and significant deviations were recorded (as always, correcting for multiple comparisons). Harmonicity was measured by summing the original and six downsampled spectra for each subject. By shrinking the spectra by an integer factor, the downsampling preserves peaks standing in integer-multiple relationships, i.e., harmonics.

Due to the downsampling, the peaks of the summed downsampled spectra cannot be assigned to specific frequencies. This test yielded highly significant harmonics for all subjects in the two conditions.

Sequence lengths of 4—7 s were harmonically organized in nearly all subjects, with all subjects viewing movies displaying harmony for 5 s sequences, and all subjects in the rest condition displaying harmonics for sequences of 5, 6, and 7 s.

In both conditions, over 40 specific peaks exceeded the baseline surrogate measures.

Very generally, animal brains face a dual computational demand: they must sense and interpret what is immediately present, and they must predict what will happen next, over a future extending from milliseconds to hours to years (Friston and Stephan; Clark; Hohwy). In computational terms, all animals need a capacity to continuously maintain representations of past and future environmental and bodily conditions, while at the same time continuously refreshing, updating, and modifying these representations as new information arrives.

This sketch of cognition is subject to three general constraints: First, to be useful, information that is generated at one source must be available to modify information elsewhere.

Global availability leads to the second constraint: For information to be effectively integrated across the brain, it must be transmitted with minimum confusion. In effect, channels must converge and diverge without crosstalk.

Third, all this temporal mixing and matching must work quickly, to keep up with a dynamic world. How might repetition, rhythm, or harmony enable these computational ends? Oscillations are everywhere, at frequencies from less than 1 Hz upward (Biswal et al.). Along with their observation we find a cornucopia of proposals for their function. Many of these posit interactions among oscillations at different frequencies, where frequency bands have distinct functions (Glassman; Onslow et al.).

For example, Friston et al. and Canolty et al. offer proposals of this kind; these are generally derived from EEG data, with the exception of Ville et al. The rhythms described in this study are particularly apt candidates for temporal holding patterns, in that the methods here define the elements in rhythmic repetition as sequences of thematic moments.

What is repeating is itself a temporally extended pattern (4–11 s), drawn from an alphabet of around seven themes. The many rhythms found in both the rest and movie conditions invite a more detailed study of these patterns, in addition to research into their frequency of oscillation (Zuo et al.).

Among the many discussions of oscillations in the brain, treatments of harmonic relationships among frequencies are rare. Yet harmonic partials are rampant in the data of the present study: at least 30 partials were discovered in the frequencies of sequence occurrence, across all sequence lengths. What could be the functional significance of this widespread observation? A speculative argument could begin with functional distinctions between the stages of signal processing in any system, the brain included.

Abundant research explores how neuronal oscillations are generated; a fairly large literature considers how signals propagate over space and time; but there is little consideration of how a received signal is processed. Fourier analyses are computationally intensive, and require many signal samples to be precise — it seems unlikely that the brain computes in this way. In a periodic signal, however, the minimum interpretable packet is one cycle. Accordingly, in principle the fastest processing is most feasible when cycle time, the interval between repetitions of the periodic signal, is shortest.

In general, harmonic signals offer a useful combination of multiple superimposed frequencies and short cycle times. This follows from the definition of harmonics, which are signals whose frequencies are in integer ratios, but might also be illustrated with an example.

Accordingly, the composite signals on the left side of Figure 3 are harmonic signals of various lengths, with the same fundamental frequency.

For each, one cycle is bracketed. The right panel presents examples of inharmonic signals and their cycle times. It is apparent that the harmonic signals have the shortest cycles, as is indeed implied by the integer relationships of their frequencies. Thus, if a signal is composed of multiple frequencies, the smallest package (quickest, easiest, and most efficient to process) that delivers all the frequency information employs frequencies in harmonic relationships.
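To make the cycle-time point concrete, here is a small sketch; the specific frequencies are illustrative only. For rational frequencies, the composite waveform repeats every 1/gcd(frequencies) seconds, so a harmonic set repeats much sooner than a nearly identical inharmonic set:

```python
from fractions import Fraction
from functools import reduce
from math import gcd

def frac_gcd(a: Fraction, b: Fraction) -> Fraction:
    """Greatest common divisor of two rational numbers."""
    return Fraction(gcd(a.numerator * b.denominator, b.numerator * a.denominator),
                    a.denominator * b.denominator)

def cycle_time(freqs_hz):
    """Period (in seconds) of a sum of sinusoids with the given rational frequencies.

    The composite repeats every 1 / gcd(frequencies) seconds, so frequencies
    in integer (harmonic) ratios yield the shortest possible cycle.
    """
    return float(1 / reduce(frac_gcd, [Fraction(str(f)) for f in freqs_hz]))

print(cycle_time([1.0, 2.0, 3.0]))   # fundamental plus two harmonics: cycle = 1.0 s
print(cycle_time([1.0, 2.0, 3.1]))   # nearly the same, but inharmonic: cycle = 10.0 s
```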

The figure's example introduces just two harmonic partials, but this hypothetical computational process can accommodate more complex harmonic signals as well. The presence and absence of harmonics afford the system a binary code, albeit one of modest capacity. Such a system gets off the ground without Fourier analysis, and packages its message in a minimum interval (Glassman).

Figure 3. Harmonic and inharmonic periodic signals. Left panels display a basic sinusoidal signal (A) and two signals with the fundamental frequency of (A) plus one harmonic (B) and two harmonics (C).

The period of each cycle is demarcated with vertical dashed lines.

Caption: A team of neuroscientists has found that people are biased toward hearing and producing rhythms composed of simple integer ratios, for example, a series of four beats separated by equal time intervals.

Caption: A chart shows the differences in the distribution of responses between the Tsimane and U.S. participants.

Researchers found that the Tsimane tended to prefer different ratios than Westerners, and that these ratios corresponded to simple integer ratios found in their music but not in Western music. Credits: Image: Nori Jacoby.

Cultural comparisons

Next, the researchers performed the same study with members of the Tsimane tribe, who live in a remote part of Bolivia and have little exposure to Western music. In future work, the researchers hope to use this technique to study other groups of people.

Boston Globe reporter Andy Rosen writes about a new MIT study examining how the brain perceives rhythm, which finds that people tend to reorganize random series of beats into familiar patterns. The results differ from past work that identified the basal ganglia, a key component of the motor system, as consistently engaged in musical beat perception.

Research has found that music with a strong beat or an auditory metronome can improve gait and other aspects of movement in such patients. The research team is pursuing several follow-up studies, including developing a shorter version of the experiment to test non-musicians and comparing musical and motor abilities in trained dancers versus musicians. They are also starting to look at rhythm and beat perception and production in different types of musicians, including singers and percussionists.

Chen, Robert J. Zatorre, and Virginia B.


