GROUP MEMBER
LECTURER
Problem statement:
When different musicians or groups perform the same song or piece, they may change its
tempo, style, or instrumentation, but they typically retain enough of the melodic and
harmonic content to preserve the identity of the piece. This project concerns the automatic
identification of cover versions through cross-correlation of beat-synchronous chroma
representations.
Methodology
Our representation has two main components: chroma feature analysis and a
cross-correlation matching method. A beat tracker is used to generate a beat-synchronous
representation with one feature vector per beat; variations in tempo are thus largely
normalized, as long as the same number of beats is used in each phrase. The representation
of each beat is a normalized chroma vector, which sums spectral energy into twelve bins
corresponding to the twelve distinct semitones within an octave, removing the distinction
between octaves. Chroma features capture both melodic information (since the melody note
typically dominates the feature) and harmonic information (since the other notes in a chord
produce secondary peaks in the vector). To match two tracks represented by such
beat-vs-chroma matrices, we simply cross-correlate the entire pieces. The identification
system will be implemented in Matlab.
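The pipeline above can be sketched as follows. This is a minimal illustration in Python (the project itself targets Matlab): the function names `chroma_bin`, `beat_chroma`, and `match_score` are hypothetical, the beat boundaries are assumed to come from an external beat tracker, and the cross-correlation is reduced to a plain sweep over beat lags.

```python
import numpy as np

def chroma_bin(freq_hz, ref_hz=440.0):
    """Map a frequency in Hz to one of 12 pitch-class bins (A = bin 0),
    discarding the octave."""
    semitones = 12.0 * np.log2(freq_hz / ref_hz)
    return int(round(semitones)) % 12

def beat_chroma(spec_frames, frame_freqs, beat_bounds):
    """Build the beat-vs-chroma matrix described in the methodology.

    spec_frames : (n_frames, n_bins) magnitude spectra
    frame_freqs : (n_bins,) centre frequency of each spectral bin in Hz
    beat_bounds : frame indices delimiting beats (from a beat tracker),
                  e.g. [0, 10, 21, ...]
    """
    bins = np.array([chroma_bin(f) for f in frame_freqs])
    # Fold the spectral energy of every frame into 12 pitch classes.
    frame_chroma = np.zeros((len(spec_frames), 12))
    for pc in range(12):
        frame_chroma[:, pc] = spec_frames[:, bins == pc].sum(axis=1)
    # One averaged, L2-normalized chroma vector per beat.
    beats = []
    for start, end in zip(beat_bounds[:-1], beat_bounds[1:]):
        v = frame_chroma[start:end].mean(axis=0)
        beats.append(v / (np.linalg.norm(v) + 1e-12))
    return np.array(beats)  # shape (n_beats, 12)

def match_score(A, B):
    """Cross-correlate two beat-vs-chroma matrices over all beat lags;
    a high peak suggests shared melodic/harmonic content."""
    scores = []
    for lag in range(-(len(B) - 1), len(A)):
        a = A[max(0, lag):lag + len(B)]
        b = B[max(0, -lag):max(0, -lag) + len(a)]
        scores.append(float(np.sum(a * b)))
    return max(scores)
```

Because each chroma vector is normalized per beat, the score rewards tracks whose pitch-class patterns line up beat for beat at some offset, regardless of absolute tempo.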
Reference:
1. Daniel P. W. Ellis and Graham E. Poliner. "Identifying 'Cover Songs' with Chroma Features
and Dynamic Programming Beat Tracking." Columbia University, New York, 2007.
2. Daniel P. W. Ellis, Courtenay V. Cotton, and Michael I. Mandel. "Cross-Correlation of
Beat-Synchronous Representations for Music Similarity." Columbia University, New York, 2008.
3. Joan Serrà, Emilia Gómez, and Perfecto Herrera. "Audio Cover Song Identification and
Similarity." Springer, 2010.
4. Dan Ellis. "Beat-Synchronous Chroma Representations for Music Analysis." Columbia
University, New York, 2007.