…patterning” (Ferneyhough) rather than precise rhythmic and other detail. Furthermore, as already stated in the Introduction, we hypothesise that the sight-reading equals an implicit annotation of the musical score, representable as explicit annotations detailed in 3.2.

The first author’s performance for this case study took place on 18.04.2014 in the context of his Musical Research Residency at IRCAM. He performed and recorded three takes for each page of Lemma-Icon-Epigram in one day. His sight-reading prioritised ergonomic hand and arm movement in the keyboard space as well as pitch accuracy, while allowing only sporadic and spontaneous response to the parameters of rhythm, articulation and dynamics. A Yamaha upright Disklavier was used for the recording of MIDI information, while multimodal information included audio captured by two microphones, video captured by Kinect and movement data captured by 3D accelerometers and 3-axis gyroscopes worn on both wrists. Fig. 2 shows the first page of the original score and fig. 3 the MAX/MSP patch used for the synchronisation of the data.

[Figure: AUDIO, GESTURE, VIDEO and MIDI data streams]

3.2 Representation of implicit annotation

Given the ambiguity of the term gesture in musical contexts, referring to both musical and physical, compositional and performative properties, the annotation may include two types of information:

• Notated “gestural patterning” elements such as pitch, articulation, rests, dynamics, pedal, beaming (fig. 4b). This information is visible, but also heterogeneous, multi-layered and fused. As an example, we have here only indicated the most salient gesture boundaries, reserving the rest of the gesture patterning elements for the second phase of refinement in the learning process.

• Physical gestural elements, such as fingerings, changes of hand position, arm movements and technical patterns. This sort of information is invisible, concatenated and embodied: it constitutes a hidden layer of the notation, albeit representable as in fig. 4a. In previous work [13] we have suggested a typology of physical gestural elements in relation to pitch, following up on ideas by the pianist György Sándor [14]. We have proposed a hierarchical ordering of notated pitch information in three embodied layers: fingers, hand-grasps and arm movements. The finger layer corresponds to traditional fingering and includes all notated pitch indexed with a number from one to five. Hand-grasps are by default defined as concatenations of pitch contained between fingers one and five. Depending on individual hand span, those pitch sets can be played simultaneously as chords or in succession as melody, potentially involving upper-arm participation and horizontal displacement. Consequently, the grasp layer can be effectively represented by the pitches assigned to fingers one and five, omitting the pitches corresponding to inner fingers. Similarly, hand displacement takes us to the arm layer, which can be defined as a concatenation of grasps. Its boundaries are defined by the succession of fingers one and five (in the case of outwards movements, that is, displacement from the centre to the extremes of the keyboard for both hands) or by the succession of fingers five and one for movements from the extremes to the centre. As a result, the trajectories of hand transpositions, or the arm layer, can be defined as a series of segments defined by digits one and five, depending on their directionality.

Please note that both the grasp and the arm layers may be defined as a succession of two-bit units of information: pairs of fingers one and five. Also, the segmental and hierarchical nature of those layers points directly to the gesture probabilistic models reviewed in section 2.

In fig. 4a the grasp layer is represented for both hands in the form of blue ellipses. There are no hand crossings, thus we keep the same colour for both hands. The highlighted noteheads indicate grasp boundaries: red noteheads are employed for finger 5 and blue ones for finger 1 in both hands.

3.3 Comparison of gestural signals to recorded audio and video

The qualitative analysis of the multimodal data followed two phases. First, we observed the 12-axis gestural signals in relation to the audio signals and the Kinect video. The results of our observations for page 1 are presented in fig. 5 and are detailed as follows:

• Accelerations related to attacks (clearly visible as amplitude peaks in the audio signals) are unequivocally discernible from accelerations related to the horizontal displacement of the hands. The former are marked with red ellipses, the latter with blue ellipses in the gestural signals of fig. 5. Attack accelerations appear as instantaneous high-amplitude peaks of the accelerometer and often the gyroscope signals, while displacement accelerations are mainly captured by the gyroscopes as peaks of low amplitude and frequency. Close comparison to the video reveals patterns related to the direction of the displacement, clearly marked also in fig. 5 (“values reversed”).

• Next to those two distinct types of events, attacks and displacements, we discern two hybrid events: trills (excitation visible in all six axes of the signal) and displacement with simultaneous attacks. Those events are more complex and more equally distributed between the accelerometers and the gyroscopes, and are indicated with purple ellipses.

• Observation of the sequence of the above-mentioned four types of events reveals two types of patterns: i) attacks / trills followed by displacements / displacements & attacks, and ii) succession of attacks without intermediate displacements. Pattern ii) indicates that the events take place inside the boundaries of a single hand-grasp, while pattern i) indicates changes of hand position and thus moving on to the arm layer (a rule-of-thumb version of this classification is sketched below).
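The classification just described lends itself to a simple rule-of-thumb formulation. The following Python sketch is purely illustrative and is not the MAX/MSP analysis used in the study: it assumes pre-extracted peak windows with accelerometer and gyroscope magnitudes, labels them as attack, displacement or hybrid events using assumed thresholds, and groups successive attacks into hand-grasps according to patterns i) and ii).

```python
# Illustrative sketch only (not the study's MAX/MSP patch): label pre-extracted
# peak windows from the wrist sensors as attack, displacement or hybrid events,
# then group them into hand-grasps following patterns i) and ii) of section 3.3.
# The Window fields and all thresholds are assumptions, not measured values.

from dataclasses import dataclass

@dataclass
class Window:
    accel: float      # peak magnitude of the 3D accelerometer envelope
    gyro: float       # peak magnitude of the 3-axis gyroscope envelope
    duration: float   # width of the excitation in seconds

ACCEL_ATTACK = 2.0    # assumed threshold for attack-like accelerometer spikes
GYRO_DISPLACE = 0.8   # assumed threshold for displacement-like gyroscope activity
SHORT = 0.05          # attacks are near-instantaneous; displacements last longer

def classify(w: Window) -> str:
    """Rough heuristic after section 3.3: attacks are short accelerometer spikes,
    displacements are longer, low-frequency gyroscope activity, and trills or
    displacements with simultaneous attacks excite both sensor types."""
    high_accel = w.accel >= ACCEL_ATTACK
    high_gyro = w.gyro >= GYRO_DISPLACE
    if high_accel and high_gyro and w.duration > SHORT:
        return "hybrid"            # trill, or displacement with simultaneous attacks
    if high_accel:
        return "attack"
    if high_gyro and w.duration > SHORT:
        return "displacement"
    return "other"

def group_into_grasps(labels: list[str]) -> list[list[str]]:
    """Pattern ii): successive attacks without an intermediate displacement stay
    inside one hand-grasp. Pattern i): a displacement-type event marks a change of
    hand position, i.e. a boundary on the arm layer, and opens a new group."""
    groups: list[list[str]] = [[]]
    for label in labels:
        if label in ("displacement", "hybrid") and groups[-1]:
            groups.append([])
        groups[-1].append(label)
    return groups

# e.g. two attacks inside one grasp, a displacement, then two more attacks
print(group_into_grasps(["attack", "attack", "displacement", "attack", "attack"]))
```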
Figure 4. Gestural patterning after articulation and rests (left, 4b); grasp layer indicated by blue ellipses for both hands, boundaries indicated with red blobs for finger 5 and blue blobs for finger 1 (right, 4a).
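The grasp and arm layers of section 3.2, illustrated in fig. 4a, can also be expressed as a small data structure. The sketch below is a hypothetical illustration rather than part of the original annotation workflow: a grasp is reduced to the pitches under fingers 1 and 5, and arm-layer segments are delimited wherever the direction of the hand's travel along the keyboard reverses.

```python
# Hypothetical sketch of the grasp and arm layers of section 3.2 for one hand.
# A grasp keeps only the pitches assigned to fingers 1 and 5 (inner fingers
# omitted); the arm layer is a series of grasps, with a segment boundary wherever
# the direction of displacement along the keyboard reverses.
# Pitches are MIDI note numbers; all names here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Grasp:
    pitch_finger1: int   # pitch played by the thumb (finger 1)
    pitch_finger5: int   # pitch played by the little finger (finger 5)

    @property
    def centre(self) -> float:
        return (self.pitch_finger1 + self.pitch_finger5) / 2

def arm_segments(grasps: list[Grasp]) -> list[list[Grasp]]:
    """Split a succession of grasps into arm-layer segments: a new segment starts
    whenever the hand's direction of travel along the keyboard reverses."""
    if not grasps:
        return []
    segments = [[grasps[0]]]
    prev_dir = 0
    for prev, cur in zip(grasps, grasps[1:]):
        step = cur.centre - prev.centre
        direction = (step > 0) - (step < 0)     # +1 up the keyboard, -1 down, 0 static
        if direction != 0 and prev_dir != 0 and direction != prev_dir:
            segments.append([])                 # direction reversed: new arm segment
        if direction != 0:
            prev_dir = direction
        segments[-1].append(cur)
    return segments

# Example: three grasps moving outwards, then one moving back towards the centre
right_hand = [Grasp(60, 67), Grasp(64, 72), Grasp(69, 76), Grasp(65, 72)]
print([[(g.pitch_finger1, g.pitch_finger5) for g in seg] for seg in arm_segments(right_hand)])
```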
Figure 6. Comparison of two annotations: the annotation of the original score in grasps transposed onto the MIDI score, and the annotation of the gestural signal in attacks and displacements. Note the matching blue arrowed patterns.
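The comparison shown in fig. 6 amounts to checking that the grasp boundaries annotated on the score line up, in order, with the displacement events detected in the gestural signal. A minimal, hypothetical sketch of such a matching follows; the tolerance value and the example times are assumptions.

```python
# Illustrative sketch of the comparison in fig. 6: grasp boundaries annotated on
# the (MIDI-aligned) score should coincide, in order, with the displacement-type
# events found in the gestural signal. Tolerance and event times are assumptions.

def match_boundaries(score_times: list[float],
                     signal_times: list[float],
                     tolerance: float = 0.25) -> list[tuple[float, float | None]]:
    """Greedily pair each score-annotated grasp boundary with the nearest
    unclaimed displacement event within `tolerance` seconds; None = no match."""
    pairs, remaining = [], list(signal_times)
    for t in score_times:
        best = min(remaining, key=lambda s: abs(s - t), default=None)
        if best is not None and abs(best - t) <= tolerance:
            remaining.remove(best)
            pairs.append((t, best))
        else:
            pairs.append((t, None))
    return pairs

# e.g. three annotated grasp changes vs. four detected displacement events
print(match_boundaries([1.2, 3.4, 5.0], [1.15, 3.5, 4.2, 5.1]))
```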
Figure 8c. Annotation of fig. 7 keeping one grasp boundary and indicating the gestural pattern
Figure 8d. Arm layer: annotation of fig. 7 keeping only the maxima and minima of the arm trajectories, without intermediate position changes.
Figure 8e. Segmentation and parsing: the relationship of the arm movements is heterodirectional (opposite motion) in the first three symmetrical units and homodirectional (parallel motion) in units 4 to 6.
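The parsing of fig. 8e reduces to comparing the directions of the two arms' trajectories unit by unit. A tiny illustrative sketch, assuming directions are encoded as +1 (towards the treble) and -1 (towards the bass):

```python
# Hypothetical sketch of the parsing in fig. 8e: each unit is labelled by whether
# the two arms move in opposite directions ("heterodirectional", opposite motion)
# or in the same direction ("homodirectional", parallel motion).
# Direction values (+1 towards the treble, -1 towards the bass) are assumptions.

def label_units(left_dirs: list[int], right_dirs: list[int]) -> list[str]:
    return ["heterodirectional" if l * r < 0 else "homodirectional"
            for l, r in zip(left_dirs, right_dirs)]

# first three units in opposite motion, units 4 to 6 in parallel motion
print(label_units([-1, -1, -1, 1, 1, 1], [1, 1, 1, 1, 1, 1]))
```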
Figure 9. From top to bottom: the alignment of gesture patterning, MIDI patterning with pitch reduction, and the original score annotation.
2 For a review of the concept of the embodied navigation of complex notation and its foundation in the field of embodied and extended cognition, see [18].
…automatic transcription of the MIDI piano-roll into reduced proportional representations.
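The fragment above refers to the automatic transcription of the MIDI piano-roll into reduced proportional representations. One possible reading, sketched here purely as an illustration and not as the intended implementation, places note onsets proportionally to their time stamps and keeps only the outer (grasp-boundary) pitches of each onset; the function name and scale factor are assumptions.

```python
# Hypothetical sketch of one possible reading of "reduced proportional
# representation": note onsets are placed proportionally to their time stamps,
# and the texture is reduced to the outer (grasp-boundary) pitches per onset.
# MIDI notes are (onset_seconds, midi_pitch) pairs; names and scale are assumptions.

def proportional_reduction(notes: list[tuple[float, int]],
                           px_per_second: float = 100.0) -> list[tuple[float, int, int]]:
    """Group notes by onset, keep only the lowest and highest pitch of each group,
    and map each onset to a horizontal position proportional to its time."""
    by_onset: dict[float, list[int]] = {}
    for onset, pitch in notes:
        by_onset.setdefault(round(onset, 3), []).append(pitch)
    reduced = []
    for onset in sorted(by_onset):
        pitches = by_onset[onset]
        reduced.append((onset * px_per_second, min(pitches), max(pitches)))
    return reduced

# a dense chord followed by a dyad, rendered as two proportional "grasp" events
print(proportional_reduction([(0.0, 60), (0.0, 64), (0.0, 67), (1.5, 55), (1.5, 71)]))
```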
7. REFERENCES
1. B. Ferneyhough, Lemma-Icon-Epigram for solo piano, “Performance Notes”, Edition Peters No. 7233, 1982.
2. B. Ferneyhough, “Aspects of Notational and Compositional Practice”, in J. Boros & R. Toop (eds.), Brian Ferneyhough: Collected Writings, pp. 2-13, Routledge, 1995.
3. S. Kanach (ed.), Performing Xenakis, The Iannis Xenakis Series vol. 2, pp. 65-129, Hillsdale: Pendragon Press, 2010.
4. I. Pace, “Making Possible the Irrational: Strategies and aesthetics in the music of Stockhausen, Cage, Ligeti, Xenakis, Ferneyhough, Barrett”, in Collected Writings of the Orpheus Institute, Leuven University Press, 2008.
5. M. Leman, Embodied Music Cognition and Mediation Technology, MIT Press, 2008.
6. M. Rowlands, The New Science of the Mind: From Extended Mind to Embodied Phenomenology, MIT Press, 2010.
7. F. Bevilacqua, N. Schnell, N. Rasamimanana, B. Zamborlin, F. Guedy, “Online Gesture Analysis and Control of Audio Processing”, in J. Solis & K. Ng (eds.), Musical Robots and Interactive Multimodal Systems, pp. 127-142, Springer-Verlag, 2011.
8. B. Caramiaux, M. Wanderley, F. Bevilacqua, “Segmenting and Parsing Instrumentalists’ Gestures”, Journal of New Music Research, vol. 41, no. 1, pp. 13-29, 2012.
9. J. Françoise, Realtime Segmentation and Recognition of Gestures using Hierarchical Markov Models, Master’s Thesis, ATIAM 2010-2011, Université Pierre et Marie Curie, Ircam, Telecom Paristech, 2011.
10. J. Françoise, Motion-Sound Mapping by Demonstration, PhD Thesis, Université Pierre et Marie Curie, Ircam, 2015.
11. J. Françoise, B. Caramiaux, F. Bevilacqua, “A Hierarchical Approach for the Design of Gesture-to-Sound Mappings”, in Proceedings of the 9th Sound and Music Computing Conference (SMC), Copenhagen, Denmark, 2012.
12. F. Bevilacqua, N. Rasamimanana, E. Fléty, S. Lemouton, F. Baschet, “The augmented violin project: research, composition and performance report”, in Proceedings of the 6th International Conference on New Interfaces for Musical Expression (NIME 06), Paris, 2006.
13. P. Antoniadis, “Physicality as a performer-specific perspectival point to I. Xenakis’s piano work. Case-study Mists”, in Proceedings of the I. Xenakis International Symposium 2011, Goldsmiths University, London, 2011.
14. G. Sándor, On Piano Playing. Motion, Sound and Expression, New York: Schirmer, 1981.
15. D. Fober, Y. Orlarey, S. Letz, “INScore: An Environment for the Design of Live Music Scores”, in Proceedings of the Linux Audio Conference (LAC), 2012.
16. P. Antoniadis, D. Fober, F. Bevilacqua, “Gesture cutting through textual complexity: Towards a tool for online gestural analysis and control of complex piano notation processing”, in CIM 14 Proceedings, Berlin, 2014.
17. P. Antoniadis, F. Bevilacqua, D. Fober, “Gesture cutting through textual complexity: Towards a tool for online gestural analysis and control of complex piano notation processing”, https://www.youtube.com/watch?v=KV9nQUhhyuI
18. P. Antoniadis, “Corporeal Navigation: Embodied and Extended Cognition as a Model for Discourses and Tools for Complex Piano Music After 1945”, in Pedro Alvarez (ed.), CeReNeM Journal, Issue 4, pp. 6-29, March 2014, http://issuu.com/cerenem/docs/cerenem_journal_issue_4