
Articulated Paint

Expressive Musical Interface for Non-Musicians

We present the concept and prototype of a new musical interface that utilizes the close relationship between gestural expression in the act of painting and that in playing a musical instrument, in order to provide non-musicians with the opportunity to create musical expression. A physical brush on a canvas acts as the instrument. The characteristics of its stroke are intuitively mapped to a conductor program, defining expressive parameters of the tone in real time. Two different interaction modes highlight the importance of bodily expression in making music as well as the value of a metaphorical visual representation.

Notes


Accepted to the New Interfaces for Musical Expression conference in New York, June 6th-9th, 2007. We will present a demo and a poster.

Presentation at the Ars Electronica Campus in Linz, September 5th-11th, 2007. Campus is organized by HyperWerk Basel, who proposed the motto "neo-analog".

There's a sister project to this one, developed at the same time. Have a look at MusicPencil.

Introduction

To non-musicians, the idea of playing a musical instrument is often just related to controlling it: selecting the right note in the right rhythm in order to translate what is written in the score. In fact, most of the practicing in the first years of learning an instrument is concerned with this. Professional musicians invest enormous effort to attain a level of control at which the instrument becomes almost a natural extension of their bodies. As in other art forms, learning the technique is the necessary evil on the way to artistic expression.

With our novel musical interface called Articulated Paint, we want to enable non- and beginning musicians to instantly experience basic aspects of musical expression by eliminating the need to learn how to play the right notes. By giving them a taste of the joy of interpreting music individually, we hope to generate a stronger interest in learning a real musical instrument.

Using Articulated Paint is as intuitive as drawing on a canvas. The paint-brush functions as the instrument. The stroke it leaves behind gives expression to the notes that are automatically served from the computer. The computer reads the notes from a MIDI file and sets the frequency of the notes, one after the other, while the handling of the brush determines expressive parameters such as timing, dynamics, articulation, and vibrato. Using this setup, the user can experiment with different expressions for a music piece and refine it, without being burdened by the insurmountable complexity of a classical instrument.
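
The division of labor described above — the score supplies the pitches, the brush supplies the expression — can be sketched in a few lines. This is an illustrative model only; the class, its names, and the velocity formula are invented for this sketch and do not reflect the project's actual PureData implementation.

```python
# Minimal sketch of the conductor-program idea: pitches come from the score
# in fixed order, while each brush event supplies the expressive parameters.
# All names and the velocity formula are illustrative assumptions.

class Conductor:
    def __init__(self, pitches):
        self.pitches = pitches   # MIDI note numbers read from the score
        self.index = 0

    def next_note(self, pressure, speed):
        """Advance to the next pitch; expression comes from the brush."""
        if self.index >= len(self.pitches):
            return None          # end of the piece
        pitch = self.pitches[self.index]
        self.index += 1
        # Dynamics derived from how intensely the brush is handled
        velocity = min(127, int(64 + 40 * pressure + 20 * speed))
        return {"pitch": pitch, "velocity": velocity}


conductor = Conductor([60, 62, 64])          # e.g. C, D, E from a MIDI file
note = conductor.next_note(pressure=0.5, speed=0.5)
print(note)   # {'pitch': 60, 'velocity': 94}
```

The point of the sketch is that the user never chooses a pitch; every brush event merely advances the score and shapes how the pre-determined note sounds.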

Related Work

Conductor-like musical interfaces Articulated Paint falls into the category of conductor programs, in which the computer handles the playback of the score while the musician conducts the “manner” in which this happens. Malinowski [1] gives a detailed overview of musical interfaces in this class. There are several projects that enable a user to step into the role of a real conductor, mostly using a baton as an interface to direct an electronic orchestra. iSymphony [2] is the most recent incarnation of such a system.

Another interesting mapping of bodily gesture to musical expression comes in the form of a car driving simulation, the Expression Synthesis Project [3]. The Air Worm [4] allows a user to control loudness and tempo of a music piece by playing a digital theremin.

Painterly interfaces for media control The classical interface of brush and canvas is usually highly abstracted in computer interfaces. Rozin's Easel [5] tries to bring the physical aesthetics of real painting to animated images. The I/O Brush [6] is an interface for kids to paint with their environment as a palette. The MotoGlyph project [7] makes use of an alternative classical painting device, the spray can, to generate music out of images. All these interfaces make successful use of the affordance and intuitive expressiveness of real painting tools.

Designing the Interface


Bodily expression The physical form of the interface (brush on canvas) is chosen deliberately. Motivated by a larger research theme concerned with the role of the body in creativity [8], the idea is to create an interface that makes this connection more explicit. As there seems to be a conditioned link between bodily movement and music, the design of a gestural musical instrument seems particularly fruitful. A conventional graphics tablet is thus not adequate. Initial ideas based on tracking the body movement of a musician, similar to Marrin's Conductor's Jacket [9], were set aside in favor of the more intuitive and instantaneous brush interface, which also provides haptic feedback. The importance of bodily expression is preserved: just as an instrumental musician has to move the whole body, including breath, in artistic conjunction with the parts controlling the instrument in order to achieve a convincing aural expression, the artist in our interface needs to engage in the interaction with the whole body. We assume that for a non-musician trying out Articulated Paint, this is a joyful and revealing experience. We are interested in the effects of such an interface on creative exploration and learning.

Playing Modes As there is no innate way of painting musical expression, we conducted an informal user study prior to building the system, in which subjects were asked to draw the expression of the melodies they were hearing. The result was a wild mixture, ranging from rather conventional waveshape-like drawings to freestyle. From this study we condensed two alternative modes of playing the instrument. The basic setup, however, is the same. A session starts with an empty screen. The music is played to the user in advance so that she can memorize the melody. She then applies the brush strokes to play the same melody by herself. Only a small amount of instruction is given in the beginning, increasing as the user learns. The parameter mapping is also the same in both modes.

Bodily Mode The first mode focuses on the movement of the brush rather than on the image it creates. In fact, the user is encouraged to play it blindly. The user is painting on a virtual “conveyor belt” that moves at a steady tempo in the direction of an imagined z-axis. This way, she is actually painting on the same vertical position over and over again and can concentrate on the expressive movement of the brush. Another advantage here is that the resulting image conveys the tempo of the brush movement, which is not possible with a static canvas as in the second mode.

Notational Mode This mode is closer to the conventional notational system. The user paints line by line from left to right. Handling of the instrument becomes more complex, while the room for expression is somewhat diminished, as movement is now restricted to left-to-right. However, this comes with the benefit of a more powerful representation: it is easier to interpret and is well suited to being revised and improved on. Its synaesthetic quality is enhanced by the fact that the gestural mapping we employ offers some freedom for giving each piece of music its characteristic look. Thus, this mode might be used to develop new forms of individual, meaningful notations.

Gesture Mapping The mapping of brush movement to acoustic parameters is very direct, since it happens in real time. Timing is controlled either by removing and reapplying the brush to elicit the next note, or by changing the direction of movement for legato (horizontally in the bodily mode, vertically in the notational mode). The intensity of brush movement, resulting from the amount of pressure and the velocity of movement, is mapped to the dynamics of the sound. Articulation is also related to dynamics: the style of legato depends on how quickly the direction of brush movement is changed, while the style of accentuation results from the distribution of pressure when setting the brush on the canvas. Vibrato can be achieved by moving the brush in a wiggly line.

Brush movement || Audible effect
Contact pressure & speed || Dynamics
Contact / directional change || Next note (non-legato / legato)
Attack of contact / directional change || Articulation of next note / legato
Wiggle (amplitude and frequency) || Vibrato (amplitude and frequency)

Table 1: Gesture Mappings

This mapping has so far been experienced as rather natural, but the optimal mapping is still being explored. The starting coordinates (x- and y-value) of a brush stroke currently have no additional meaning, but it would be interesting to use them to vary the timbre of the tone.
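
The mappings of Table 1 can be expressed as a small classification function. This is a hedged sketch: the thresholds, weights, and event vocabulary below are invented for illustration, whereas the actual analysis is implemented in PureData.

```python
# Illustrative sketch of the Table 1 gesture mappings. Weights and the
# 3 Hz wiggle threshold are assumptions made for this example.

def map_gesture(pressure, speed, direction_changed, wiggle_freq):
    """Map raw brush measurements to expressive sound parameters."""
    event = {}
    # Contact pressure & speed -> dynamics
    event["dynamics"] = min(1.0, 0.6 * pressure + 0.4 * speed)
    # Directional change while in contact -> next note, played legato;
    # a fresh contact would instead trigger a non-legato note
    event["next_note"] = "legato" if direction_changed else "non-legato"
    # Fast wiggling of the brush -> vibrato
    event["vibrato"] = wiggle_freq > 3.0   # Hz, illustrative threshold
    return event


print(map_gesture(pressure=0.8, speed=0.5, direction_changed=True, wiggle_freq=5.0))
```

A real-time system would run such a function on every tracked frame, feeding the resulting parameters directly into the synthesis stage.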

User study and resulting modes

Notational Mode

Interaction with the Prototype


Implementation

The physical part of the interface consists of a custom-built brush and canvas. The flat brush contains two flex sensors amidst its bristles for measuring the amount of bend/contact pressure. Two infrared LEDs attached to the sides of the brush are tracked by a modified webcam sitting behind the canvas in order to determine the brush's position (currently 320 by 240 pixels at 60 Hz). The canvas is a special rear-projection foil mounted on traditional stretcher bars. A projector casts the calculated image onto it from behind.
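
Given the two LED blobs tracked by the camera, recovering the brush pose is straightforward geometry. The following is a sketch under the assumption that the two LED centroids are already known in camera coordinates (the actual blob tracking happens in vvvv); the function name is illustrative.

```python
import math

# Illustrative reconstruction of the brush pose from the two infrared
# LEDs. Assumes blob centroids are given in 320 x 240 camera coordinates.

def brush_pose(led_a, led_b):
    """Return the brush position (midpoint of the LEDs) and its
    orientation angle in degrees relative to the camera's x-axis."""
    (xa, ya), (xb, yb) = led_a, led_b
    x = (xa + xb) / 2.0
    y = (ya + yb) / 2.0
    angle = math.degrees(math.atan2(yb - ya, xb - xa))
    return (x, y), angle


pos, angle = brush_pose((100, 120), (140, 120))
print(pos, angle)   # (120.0, 120.0) 0.0
```

Using two LEDs rather than one is what makes the orientation (and thus directional change, which Table 1 maps to legato) observable per frame.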

The software side combines the individual strengths of several open-source electronic-arts tools. Video tracking is accomplished with vvvv [10]; the visualization is done with Processing [11]. The central part of the application was created in PureData [12]. It analyzes the brush movement and maps it, together with the frequency of the current note obtained from a MIDI file, to the simulation of a physical instrument, provided by the PeRColate [13] library. We currently use the flute and the clarinet simulations.

Building the Prototype

Brush detail
Brush from the front, displaying bend sensor and infrared LED
Behind the projection surface
Application logic and sound synthesis in PureData

Future Work

As this work is still in its infancy, there are several directions which we would like to pursue. First of all, the individual components related to the expressiveness of the interface (input hardware, motion analysis, gesture mapping, sound synthesis) need more refinement for a more powerful effect. For an educational setting, we would like to design and evaluate an instructional program that teaches the different aspects of musical/gestural expression with progressive complexity, from a soloist controlling just the dynamics to a full-featured ensemble situation.

As described, especially the notational mode offers a basis for a revisionary layer of interaction, which is especially important for exploratory and creative learning, according to Schoen's seeing-drawing-seeing cycle [14]. We will implement functions to revise one's performance and provide the ability to correct and annotate it. The improved version can then serve as a score for the next attempt.

Finally, we will investigate the instrument's value as a creative tool for conductors, building on the work of Amitani and Hori on the effects of external representation for composers [15].

Summary

We have presented a novel musical interface in the form of a brush on canvas. Using simple brush strokes, it is possible to add musical expression to a predefined piece of music. It is targeted at beginning musicians, who are given the opportunity to try out different expressive elements of playing an instrument as well as the role of the musician's body within a safe setting. The gestural mapping used for this interface is a promising base for a creative tool.

References

[1] Malinowski, S. The Conductor Program. http://www.musanim.com/tapper/

[2] Lee, E., Kiel, H., Dedenbach, S., Grüll, I., Karrer, T., Wolf, M., and Borchers, J. 2006. iSymphony: an adaptive interactive orchestral conducting system for digital audio and video streams. In CHI '06 Extd. Abstracts. ACM Press, New York, NY, 259-262.

[3] Chew, E., Liu, J., and François, A. R. 2006. ESP: roadmaps as constructed interpretations and guides to expressive performance. In Proc. AMCMM '06. ACM Press, New York, NY, 137-145.

[4] Dixon, S., Goebl, W., and Widmer, G. 2005. The "Air Worm": An Interface for Real-Time Manipulation of Expressive Music Performance. In Proc. ICMC '05, 614-617.

[5] Rozin, D. 1998. Easel. http://smoothware.com/danny/neweasel.html.

[6] Ryokai, K., Marti, S., and Ishii, H. 2004. I/O brush: drawing with everyday objects as ink. In Proc. CHI '04. ACM Press, New York, NY, 303-310.

[7] Digit. 2004. MotoGlyph. http://www.digitlondon.com/motoglyph.

[8] Knörig, A. 2006. Free the body and the mind will follow: An investigation into the role of the human body in creativity, and its application to HCI. Diploma Thesis, Univ. of Applied Sciences Wedel, Germany. http://andreknoerig.de/portfolio/30/30_p01.html

[9] Marrin Nakra, T. 2000. Inside the "Conductor's Jacket": Analysis, Interpretation and Musical Synthesis of Expressive Gesture. Doctoral Thesis, MIT Media Lab.

[10] vvvv Group. vvvv – a toolkit for real time video synthesis. http://www.vvvv.org/

[11] Fry, B., Reas, C. Processing. http://www.processing.org

[12] Puckette, M.S. Pure Data – a real-time music and multimedia environment. http://crca.ucsd.edu/~msp/software.html

[13] Trueman, D., DuBois, R.L. PeRColate - A collection of synthesis, signal processing, and image processing objects. http://www.music.columbia.edu/PeRColate/

[14] Schoen, D. A. 1983. The Reflective Practitioner: How Professionals Think in Action. Basic Books, NY.

[15] Amitani, S. and Hori, K. 2002. Supporting Musical Composition by Externalizing the Composer's Mental Space. In Proc. C&C '02, 165-172.

Subject Group

Interfacedesign

Project Type

Student project in the master's program

Supervision

Prof. Boris Müller, Prof. Reto Wettach

Period of Creation

Summer semester 2006 – winter semester 2006/07