
James Love

2955052

SCMP132 Assessment Task No. 1 (alternative assessment for 300-level)

The virtual instrument rack Reason, by Propellerhead Software, is used extensively in live music performance; its capabilities are rarely exhausted even in the most extreme applications. However, the difficulties faced by a composer of computer music for live performance often originate from a lack of understanding of, or control over, its vast array of parameters. To create coherent musical works, it is necessary to limit the performer's duties so that clear relationships between sound and gesture are defined. An electronic controller designed to affect parameters in Reason can place new sounds in a more familiar context for an audience, especially if they are accompanied by a strong visual element.

In many ways, interactivity in musical performance has decreased due to technological advancements in the nineteenth and twentieth centuries (Jordà 2007, p91). There is a genre known as acousmatic music, which refers to the emission of recorded sounds from a loudspeaker without an accompanying cause or gesture. While this departure from traditional musical performance can be intellectually interesting, it is important to realise that disconnecting cause from effect can also alienate an audience (Arthurs 2000, p152). Jordà (2007, p102) alludes to Turner's (2003) suggestion that visual feedback may be what is necessary to restore the lines of communication between performer and audience in electronic music.

The introduction of some gestural input can make the relationship between cause and effect visually clearer; however, it is inevitable that 'some external, unpredictable forces... [will] affect the system, and the output [will be] the result of this permanent struggle' (Jordà 2007, p95). Arthurs (2000, p93) explains how Greg Schiemer's Spectral Dance (1991) exploits the unpredictable nature of an algorithm to limit the performer's control over a musical outcome. This represents a departure from traditional musical performance for both the performer and the audience, as the instrument is no longer just a passive channel for human expression (Jordà 2007, p105).

Most commercially available controllers are based on traditional instruments like the keyboard, and their widespread use is largely restricted to the imitation of other traditional instruments through synthesisers (Arthurs 2000, p120). Although it is possible to avoid the traditional instrumental sounds, the traditional performance techniques imposed by these controllers make it difficult to confront new music-making paradigms (Jordà 2007, pp97-8).

Jordà (2007, p99) states that 'any input device can become a good or bad choice depending on the context, the parameter to control, or the performer who will be using it... The challenge remains how to integrate and transform the apparatus into coherently designed, meaningful musical experiences with emotional depth'. Arthurs (2000, p72) also affirms that the challenge is to create an instrument that offers musical control over its vast range of parameters.

Composing a work for a controller with a strong visual element can result in greater stimulation for performer and audience; however, it is important to recognise that the gestural component should not necessarily mimic that of traditional musical instruments. Also, while clear relationships between sound and gesture should exist, it is not essential that the performer has complete control over the output.

All sounds can be described as having three basic components: a controller, something that vibrates, and a signal (Vella 2000, p64). Vella (2000, p66) outlines the instrumental classification system of Hornbostel-Sachs (1914), where instruments are classified according to the way they vibrate to produce a signal. An electronic instrument is a subcategory of the Electrophones, which are defined as producing sound from a loudspeaker driven by an amplifier. The subcategories are devised according to the way the input signal is amplified; however, the electronic instrument is a special case, because sound is produced by electronic oscillators and other circuits rather than mechanically vibrating parts (Vella 2000, p71).

Reason is a powerful software package that emulates many industry-standard analogue devices belonging to the aforementioned electronic instrument category of the Electrophones. The controller can also be considered an electronic instrument (Vella 2000, p65), containing its own circuits and actively affecting Reason's output signal, provided that MIDI messages can be transmitted from the controller to Reason.
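
As a concrete illustration of this signal path, the short Python sketch below sends a single MIDI control-change message towards Reason over a MIDI output port. It is a minimal sketch only: the port name 'Reason External MIDI' is an assumed placeholder, the choice of controller number 74 is arbitrary, and the mido library (with a backend such as python-rtmidi) is assumed to be installed; none of these details are prescribed by Reason itself.

    import mido

    # List the available MIDI output ports so the correct one can be chosen.
    print(mido.get_output_names())

    # Open the port that Reason is assumed to be listening on (placeholder name).
    port = mido.open_output('Reason External MIDI')

    # Send control change 74 (an arbitrary choice) at half value on channel 1.
    port.send(mido.Message('control_change', channel=0, control=74, value=64))
    port.close()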

Reason deals with MIDI inputs and outputs through a system called Remote, which enables a performer to interact with a project using multiple control surfaces. Any single controller can be assigned to one of the virtual rack devices employed within a project, to affect variables ranging from knobs and faders on the software synthesisers and effects units, to buttons and switches on the software sequencers. Remote has its own default set of assignments for each device; however, it is possible to change the assignments in a project using Remote Override Edit mode. It is also possible to reassign control to a different device during a performance, as well as assign multiple controllers to a single device (Jones 2008, pp86-8).
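
A hedged sketch of how such a mapping might be prepared from the controller side: each physical control on a custom surface is given its own control-change number, so that a performer can bind each one to a rack device parameter in Remote Override Edit mode simply by moving that control. The port name and CC numbers below are illustrative assumptions, not values fixed by Remote.

    import mido

    # Hypothetical physical controls and the CC numbers they transmit.
    CONTROL_MAP = {
        'filter_knob': 20,   # a knob, continuous 0-127
        'mix_fader':   21,   # a fader, continuous 0-127
        'pattern_btn': 22,   # a button, 0 (off) or 127 (on)
    }

    def send_control(port, name, value):
        """Translate a named physical control into a MIDI control-change message."""
        port.send(mido.Message('control_change', control=CONTROL_MAP[name], value=value))

    with mido.open_output('Controller to Reason') as port:  # assumed port name
        send_control(port, 'filter_knob', 96)
        send_control(port, 'pattern_btn', 127)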

There are many commercially available MIDI controllers in the form of keyboards and mixers that are recognised by Reason as soon as they are plugged into a computer's USB port (Jones 2008, p86). These can be expensive, and they are not suitable for much more than simple hacking if the performance requires more than piano keys, buttons and faders. The use of an analogue-to-MIDI interface allows a performer's interaction with an analogue circuit to be converted into any type of MIDI message recognised by Reason (Jordà 2007, p98). Possible devices include the Theremin, ultrasound, sensors (for any kind of physical gesture or external parameter), joysticks and graphics tablets.
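
The core of such an interface is a simple rescaling step: a raw analogue reading is clamped and mapped onto the 7-bit range of a MIDI control-change value. The sketch below assumes a 10-bit sensor reading (0-1023) and uses a placeholder read_sensor() function in place of the actual circuit; the port name and CC number are likewise illustrative.

    import mido

    def sensor_to_cc(raw, raw_max=1023):
        """Clamp a raw analogue reading and rescale it to the 0-127 MIDI range."""
        raw = max(0, min(raw, raw_max))
        return round(raw * 127 / raw_max)

    def read_sensor():
        # Placeholder: replace with serial/ADC code for the actual analogue circuit.
        return 512

    with mido.open_output('Sensor Interface') as port:  # assumed port name
        port.send(mido.Message('control_change', control=16,
                               value=sensor_to_cc(read_sensor())))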

Reason and its controller(s) can be considered active musical components of the instrument. As there are many possibilities with regard to the actual controller used, it is important to consider the parameters in Reason that can be affected. It is useful to refer to works composed for traditional musical instruments, as similar audience responses can be elicited by parameter changes in the new sounds of computer music.

The extension of traditional instrumental practice by either a composer or performer can be received as a departure by an audience. Brian Wilson's use of the harpsichord in The Beach Boys' 'When I Grow Up (To Be A Man)' (1965) successfully brought new life to the instrument, inspiring future use in the pop music context by Wilson himself and The Beatles (Keely 2009; Vella 2000, pp79-80). Igor Stravinsky extended the bassoon to its top register for the opening solo of Le sacre du printemps (1913), which caused its first audience to riot; since then, however, the orchestral masterpiece has remained part of the standard repertoire (Vella 2000, p81). John Cage changed the traditional role of the piano in Sonatas and Interludes for Prepared Piano (1948) by inserting rubbers, screws and paper clips among the strings to alter its sound (Vella 2000, pp64, 82). To give some idea of the similarities with Reason, the composer may refer to Arthurs (2000, p91) quoting John Cage (1973): 'electrical instruments will make available any and all sounds that can be heard'.

It is possible to create a sense of aural perspective through changes in an instrument's dynamics (Arthurs 2000, p91). Gustav Mahler (1992) opens the second movement of Symphony No. 7 (1904-5) with a call and response between a French horn playing forte and another playing piano, giving the impression that the softer horn is situated in the distance. Varying dynamics also creates a sense of movement, as in Bydlo (The Ox Cart) from Mussorgsky's Pictures at an Exhibition (1874). The image of an ox cart approaching and receding is represented by the respective crescendos and decrescendos of a repetitive piano rhythm (Vella 2000, p127). To achieve similar effects with Reason, a controller could be assigned to adjust the velocity of a sound, enabling simultaneous changes in attack, harmonic content and intensity.
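
The Bydlo effect can be approximated in a few lines: a repeated note is sent with velocities that rise and then fall, so the receiving Reason device responds with the corresponding changes in attack, harmonic content and intensity. The note number, pulse length and port name below are assumptions chosen only for illustration.

    import time
    import mido

    with mido.open_output('Reason External MIDI') as port:  # assumed port name
        # Velocities ramp up from 20 to 120 and back down, imitating a
        # crescendo followed by a decrescendo.
        velocities = list(range(20, 121, 10)) + list(range(120, 19, -10))
        for vel in velocities:
            port.send(mido.Message('note_on', note=48, velocity=vel))
            time.sleep(0.4)                      # a steady, repetitive pulse
            port.send(mido.Message('note_off', note=48))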

The combination of multiple instrumental timbres playing in unison results in new composite timbres, as evidenced in works such as Igor Stravinsky's (2000) Le sacre du printemps (1913) and Gustav Holst's (1996) The Planets (1914-16). Reason's Combinator not only allows several devices to be connected to form layered multi-instruments, but can also be set up so that more than one variable from any number of devices is affected by a single control (Jones 2008, p17). Adjusting distinct parameters on several devices simultaneously can lead to dramatic shifts in the texture and timbre of the layered sound (Vella 2000, pp121, 144).
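
The same one-gesture-to-many-parameters idea can be sketched outside the Combinator as well: a single incoming value is fanned out to several control-change messages, each with its own range and direction, so that one movement shifts several layered parameters at once. The CC numbers, ranges and port name here are illustrative assumptions rather than Reason defaults.

    import mido

    # (cc_number, low, high): each target parameter gets its own scaling; a
    # reversed pair (high < low) inverts the response.
    TARGETS = [(74, 0, 127),    # e.g. a filter cutoff, full range
               (71, 30, 90),    # e.g. a resonance, restricted range
               (91, 127, 0)]    # e.g. a send level, inverted

    def fan_out(port, value):
        """Map one 0-127 gesture onto several differently scaled parameters."""
        for cc, lo, hi in TARGETS:
            port.send(mido.Message('control_change', control=cc,
                                   value=round(lo + (hi - lo) * value / 127)))

    with mido.open_output('Reason External MIDI') as port:  # assumed port name
        fan_out(port, 100)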

When manipulating parameters other than note events in real time, it is more logical to work with pre-defined patterns and sequences in Reason. Pattern-capable devices (such as the Matrix, Redrum and Thor) allow controllers to be mapped to the pattern selection buttons through Remote Override Edit mode, so that different sequences can be triggered in real time (Jones 2008, p88). The DDL-1 Digital Delay Line also works well with sequences and can be used to create the kind of real-time counterpoint that early electronic music composers achieved with tape delay systems (Collins 2007, pp43-4).
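
Pattern switching of this kind reduces, on the controller side, to sending a distinct message for each pattern button. In the sketch below each 'button' emits a momentary control-change pulse which, once bound to a pattern-select button through Remote Override Edit mode, triggers a different pre-defined sequence; the CC numbers and port name are again illustrative assumptions.

    import time
    import mido

    PATTERN_BUTTONS = {'A1': 30, 'A2': 31, 'A3': 32}   # hypothetical CC numbers

    def select_pattern(port, name):
        """Emulate a momentary button press: full value followed by zero."""
        cc = PATTERN_BUTTONS[name]
        port.send(mido.Message('control_change', control=cc, value=127))
        port.send(mido.Message('control_change', control=cc, value=0))

    with mido.open_output('Reason External MIDI') as port:  # assumed port name
        for pattern in ['A1', 'A2', 'A3']:
            select_pattern(port, pattern)
            time.sleep(4)   # let each sequence play before switching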

Considering innovative uses of traditional musical instruments and the modulation of their parameters in previous works gives some perspective on how similar effects can be created with computer music. The audience's response is the result of something tangible in performance, and the same tangibility is easily achieved in the powerful environment created by Reason and a suitable controller.

A composer's use of Reason and a visceral control surface in real-time live performance has the capability to produce coherent musical output. It is necessary to limit the performer's duties with respect to the parameters to be controlled, though it is not essential that the performer be in control of all of them. There are many possibilities for the actual device employed as a controller, and the most appropriate choice should be governed by the parameters it will control, as well as its prospective performer and audience.

References:
Arthurs, A 2000, 'Special Topics', in R Vella, Musical Environments, Currency Press Pty Limited, Strawberry Hills, Australia.
Collins, N 2007, 'Live electronic music', in N Collins & J d'Escriván (eds), The Cambridge Companion to Electronic Music, Cambridge University Press, Cambridge, United Kingdom.
Holst, G 1996, The Planets, musical score, Dover, New York.
Jones, H 2008, 'Playing Reason Live', MusicTech Focus: Reason, January.
Jones, H 2008, 'The power of Reason's Combinator', MusicTech Focus: Reason, January.
Jordà, S 2007, 'Interactivity and live computer music', in N Collins & J d'Escriván (eds), The Cambridge Companion to Electronic Music, Cambridge University Press, Cambridge, United Kingdom.
Keely, K 2009, 'The Beach Boys Today! Album Review', accessed 18/8/2009, http://50s-60spop-music.suite101.com/article.cfm/the_beach_boys_today_album_review
Mahler, G 1992, Symphony No. 7, musical score, Dover, New York.
Stravinsky, I 2000, Le sacre du printemps, musical score, Dover, New York.
Vella, R 2000, Musical Environments, Currency Press Pty Limited, Strawberry Hills, Australia.
