1 Introduction
We hereby report some general considerations on the relationship between science and music that can help the young researchers of the MOSART network understand the philosophy behind this fascinating topic of computer music. This is an excerpt of the introduction written for Interface, Journal of New Music Research, as Guest Editor of the special issue on Man-Machine Interaction [1]. In addition, some technical parts of this paper have already been presented at other conferences, and are here properly updated and extended [2].
The history of mathematics, physics and technology has played a crucial role in the history of music, both in the evolution of the reference syntax of the language (notes and modern harmony) and in the shape and mechanical functionality of musical instruments.
We know that for centuries musical scales were constructed from notes whose pitches are related by simple ratios, and that only at the end of the 17th century was the well-tempered schema proposed, based on the logarithms introduced in the realm of mathematics some decades before (1612) [3].
It may be that the evolution from the plucking system used in the spinetta and the clavicembalo to the striking-hammer system used in the piano should not be considered a true result of progress in science but, rather, a consequence of the smartness and craftsmanship of the Italian musical instrument maker Bartolomeo Cristofori. But it is highly probable that the terminal part of wind
5 Handle
Based on real-time analysis of captured video images, a system for recognizing the shape, position and rotation of the hands has been developed: the performer moves his/her hands in the capture area of a video camera, the camera sends the signal to a video digitizer card, and the computer processes the mapped figures of the performer's hands, producing data concerning the x-y position, shape (that is, posture) and angle of rotation of both hands. The data extracted from the image analysis of every frame are used for controlling real-time interactive computer music and computer graphics performances.
Both hands are taken into consideration. For each hand the following steps are executed:
- the barycenter is computed and then used for constructing a one-period signal using the distances from the barycenter to the points along the contour of the hand.
6 Mapping
The different kinds of gestures, such as continuous or sharp movements, threshold trespassing, rotations and shifting, are used for generating sound events and/or for modifying sound/music parameters.
For classic acoustic instruments the relationship between gesture and sound is the result of the physics and the mechanical arrangement of the instrument itself, and there exists one and only one such relationship.
With computer-based music equipment, it is not so clear what and where the instrument is: from the gesture interfaces, such as the infrared beam controller or Handle, to the loudspeakers which actually produce sound, there exists a quite long chain of elements working under the control of the computer, which performs many tasks simultaneously: management of the data streaming from the gesture interfaces, generation and processing of sound, linkage between data and synthesis algorithms, distribution of sound on different audio channels, etc.
This means that a music composition must be written in terms of a programming language able to describe all these components, including the modalities for associating gesture to sound, that is, how to map gesture to sound. The mapping is therefore part of the composition.
where:
- sig is a local variable;
- ampli, freq and pan are global variables filled in by Score();
- Env(1) and OscSin(1,freq) belong to the DSPlib and are implemented as follows:
float OscSin (int nOsc, float freq)   // table-lookup sine oscillator
{
    float pos;
    pos = oscFase[iN][nOsc] + freq;                      // advance the phase
    if (pos >= tabLenfloat) { pos = pos - tabLenfloat; } // wrap around the table
    if (pos < 0)            { pos = pos + tabLenfloat; }
    oscFase[iN][nOsc] = pos;                             // store the phase back
    return Tabsen[(long)pos];                            // read the sine table
}
float Env (int nEnv)   // one-shot envelope
{
    float vval, pos;
    long  ntabEnv = envNum[iN][nEnv];            // which envelope table
    pos  = envPos[iN][nEnv];
    vval = *(envTable[ntabEnv] + (long)pos);     // read the current value
    if ((long)pos < envLenght[ntabEnv] - 1.0)    // advance until the last
        envPos[iN][nEnv] = pos + 1.0;            // sample, then hold it
    return vval;
}
References
[1] Tarabella, L., Guest Editor of the Special Issue on Man-Machine Interaction in Live Performance, Interface, Journal of New Music Research, Swets & Zeitlinger B.V., Vol. 22, No. 3 (1993).
Giving Expression to Multimedia Performances, ACM Multimedia 2000, Workshop "Bridging the Gap: Bringing Together New Media Artists and Multimedia Technologists", Los Angeles, 2000.
8 Conclusions
In the introduction we reported a small but significant panorama of the state of the art on the different problems, solutions and results of the people and research centers around the world active on the gesture interface topic. In particular, we described our approach and the results we have reached.
The systems and devices we realized have been used several times over the last years for demos, conferences, lectures and concerts, and have always been received with interest and enthusiasm by the audience, from both the scientific and the artistic point of view.
From the experience gained by deeply using these gesture devices and systems during our academic and artistic activity, we realized that it is now time to better formalize mapping strategies and to classify sets of gestures [21].
After all, this research activity now finds its right place in the MOSART project, in particular as concerns the involvement of the young researchers of the network.
9 Acknowledgments
Special thanks are due to Massimo Magrini and Gabriele Boschi who, as professional programmers and electronic designers, greatly contributed to
[10] M. Wanderley, <http://www.ircam.fr/equipes/analysesynthese/wanderle/Gestes/Externe/>
[14] P.