
FACULTY OF ENGINEERING

DIGITAL SIGNAL PROCESSING


KE46103
AUDIO BASED PROJECT

GROUP MEMBERS:
JESEKA MANAHAN (BK12110122)
NURUL ANIS BINTI AHMAD (BK12110284)

LECTURER:
MR. LIAU CHUNG FAN

Title: Music and Cover Song Identification.


Background Study:
Music and cover song identification has been a very active area of study within the MIR community in recent years, and its relevance can be seen from multiple points of view. From the perspective of audio content processing, cover song identification yields important information on how musical similarity can be measured and modeled. Many studies aim to define and evaluate the concept of music similarity, but the problem involves many factors, some of which are difficult to measure.
The problem of identifying covers is also challenging from the point of view of music cognition, although it has apparently not attracted much attention by itself. To detect a cover, we have to derive some invariant representation of the whole song. An additional issue is the memory representation of the song in humans: either the canonical song acts as a prototype for any possible version, and the similarity of the covers is computed at their encoding step, or all versions are stored in memory and their similarity is computed at the retrieval phase.

Problem statement:
When different musicians or groups perform the same song or piece, they may change its tempo, style, or instrumentation, but they typically retain enough of the melodic and harmonic content to preserve the identity of the piece. This project is concerned with the automatic identification of cover versions through cross-correlation of beat-synchronous chroma representations.

Methodology:

Our representation has two main features: we use chroma feature analysis and synthesis together with a cross-correlation matching method. A beat tracker is used to generate a beat-synchronous representation with one feature vector per beat, so variations in tempo are largely normalized as long as the same number of beats is used in each phrase. The representation of each beat is a normalized chroma vector, which sums spectral energy into twelve bins corresponding to the twelve distinct semitones within an octave, removing the distinction between different octaves. Chroma features capture both melodic information (since the melody note will typically dominate the feature) and harmonic information (since other notes in chords produce secondary peaks in a given vector). To match two tracks represented by such beat-vs-chroma matrices, we simply cross-correlate the entire pieces. The entire identification will be carried out using MATLAB software.
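To make the chroma computation concrete, the following is a minimal MATLAB sketch that folds a single windowed audio frame into a normalized 12-bin chroma vector. The function name frame_to_chroma, the A440 tuning reference, and the simple round-to-nearest-semitone mapping are illustrative assumptions, not the project's final implementation:

% Minimal sketch: fold one audio frame into a normalized 12-bin chroma
% vector. Assumes an A440 tuning reference; the function name and the
% nearest-semitone rounding are illustrative choices.
function chroma = frame_to_chroma(frame, fs)
    N      = length(frame);
    w      = 0.5 - 0.5 * cos(2 * pi * (0:N-1)' / N); % Hann window
    spec   = abs(fft(frame(:) .* w)).^2;             % power spectrum
    spec   = spec(2:floor(N/2));                     % positive bins, drop DC
    freqs  = (1:floor(N/2) - 1)' * fs / N;           % bin frequencies (Hz)
    midi   = 69 + 12 * log2(freqs / 440);            % pitch relative to A440
    class  = mod(round(midi), 12) + 1;               % semitone class, 1..12
    chroma = accumarray(class, spec, [12 1]);        % sum energy per class
    chroma = chroma / max(norm(chroma), eps);        % unit-normalize
end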

Figure 1: Steps in music tracking and identification


Our approach is:
(a) We use chroma feature analysis and synthesis, folding down the musical spectrum at
each instant into 12 values corresponding to the intensity of each semitone of the musical
octave (but removing the distinction between different octaves). This helps normalize for
variations in instruments and for different ways of realizing the same harmonic 'color'.
(b) We track the beats by looking for regularly spaced moments of maximal change in the
sound, then average the 12-dimensional harmonic description over each beat to give a
single chroma vector per beat (a sketch of this averaging appears after this list). Provided
the beats are correctly located, this removes the effect of tempo variations. We consider a
couple of possible tempos to protect against major changes in rhythmic 'feel'.
(c) Finally, a cover song is identified by sliding the entire semitone-by-beat matrices for two
songs over each other and looking for a good match (see the matching sketch after this
list). This includes a search over shifts in time (e.g. to find a section that matches even
though it may occur at a different point in the music) and shifts along the (circular)
semitone axis. A close match, even over only part of the music, is strongly indicative of a
genuine cover version.
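
As referenced in step (b), the per-beat averaging can be sketched in MATLAB as follows, assuming a 12-by-nFrames chroma matrix C and beat boundaries beatFrames already produced by a separate beat tracker (the function name and input format are illustrative assumptions):

% Collapse frame-level chroma to one normalized vector per beat.
% beatFrames(b) is the frame index at which beat b starts; it is assumed
% to come from a separate beat tracker, which this sketch does not cover.
function B = beat_sync_chroma(C, beatFrames)
    nBeats = length(beatFrames) - 1;
    B = zeros(12, nBeats);
    for b = 1:nBeats
        seg     = C(:, beatFrames(b) : beatFrames(b+1) - 1);
        B(:, b) = mean(seg, 2);                      % average within beat
        B(:, b) = B(:, b) / max(norm(B(:, b)), eps); % unit-normalize
    end
end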
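For step (c), a sketch of the matching stage: the two beat-chroma matrices are cross-correlated along the beat axis for each of the 12 circular semitone rotations, and the peak correlation is taken as the match score. Normalizing by the shorter track's length is one simple choice, not necessarily the project's final scoring:

% Score how well two beat-synchronous chroma matrices A and B match,
% searching over all time lags and all 12 circular key (semitone) shifts.
function score = cover_match(A, B)
    score = 0;
    for shift = 0:11
        As = circshift(A, shift, 1);                 % rotate semitone axis
        r  = zeros(1, size(A, 2) + size(B, 2) - 1);
        for c = 1:12                                 % sum row-wise xcorrs
            r = r + conv(As(c, :), fliplr(B(c, :)));
        end
        % best alignment over all time lags for this key shift
        score = max(score, max(r) / min(size(A, 2), size(B, 2)));
    end
end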

References:

1. Daniel P. W. Ellis and Graham E. Poliner. Identifying Cover Songs with Chroma Features
and Dynamic Programming Beat Tracking. Columbia University, New York, 2007.
2. Daniel P. W. Ellis, Courtenay V. Cotton, and Michael I. Mandel. Cross-Correlation of
Beat-Synchronous Representations for Music Similarity. Columbia University, New York, 2008.
3. Joan Serra, Emilia Gomez, and Perfecto Herrera. Audio Cover Song Identification and
Similarity. Springer, 2010.
4. Dan Ellis. Beat-Synchronous Chroma Representations for Music Analysis. Columbia
University, New York, 2007.
