Intro to Music Technology

MIDI Basics
Part V

Intro

MIDI is everywhere around us. It's in musical instruments, computers, cell phones, and many other products from well-known companies like Microsoft, Apple, Nokia, Sony, Yamaha, and hundreds more consumer products companies.
Most popular recorded music is written and performed using MIDI-equipped electronic keyboards (aka "synthesizers"). Much music is also written on computers using "sequencers" and/or "digital audio workstations". Other MIDI-equipped musical instruments may also be used, including digital drums, digital guitars, wind instruments, and more.
Your computer probably has the ability to play MIDI files using either
built-in hardware or a software synthesizer that responds to MIDI messages,
and with an appropriate adapter your computer can be connected to other
MIDI-equipped products so you can use MIDI to help you learn, play, create
and enjoy music.
Even film and TV scores are usually created on MIDI instruments, and with advances in digital sampling and synthesis technologies making digital instruments sound ever more realistic, the orchestra playing behind that big-screen blockbuster is more likely to be the product of a few MIDI devices than dozens of acoustic instruments.
Besides the music creation and playback described above, MIDI has some other interesting and popular uses. MIDI Show Control is a different set of MIDI messages used for controlling rides at theme parks as well as for operating themed events such as those found outside many Las Vegas casinos. And many people have developed unique products that use MIDI.
And if you own a cell phone that has "polyphonic ring-tones" (billions do), it's probably got a MIDI synthesizer built in (or something very much like MIDI, depending on the manufacturer). Ring-tones are a very popular add-on business for cellular providers, and many people use MIDI to make their own ring-tones and put them on their phones.

The Basics

MIDI, pronounced "mid-ee," is an acronym for Musical Instrument Digital Interface. At its most basic, it is a standardized communications protocol that allows musical instruments and computers to talk to each other using a common language.
MIDI is a standard, a protocol, a language, and a list of specifications. It identifies not only how information is transmitted but also what hardware transmits this information. It specifies how cables are connected and what type of cables should be used to transmit MIDI data. MIDI has as many software implementations as it has hardware ones.
At its most basic level, MIDI was conceived as a way to link two or more synthesizers together to layer sounds. It is now used to control studio equipment, light shows, and even factory automation.
Brief History of MIDI

Throughout the history of electronic music, musicians have wanted to connect multiple instruments together to create a whole greater than the sum of its parts. The greatest stumbling block was that instruments from one company were incompatible with instruments from another.
Moog is generally, and appropriately, credited for taking the synthesizer out of the university laboratory and putting it in the hands of musicians. Certainly from the time of Walter Carlos' ground-breaking Switched-On Bach recording (1968) to the release of the MiniMoog (1970), both musicians and the music-buying public became enamored, if not frankly dazzled, by the sonic possibilities now seemingly on the musical horizon.

As it turned out, it was a false dawn. The synthesizers of the 1970s might have been unrestricted sonically, but in terms of playability, stability, polyphony, and compatibility they were still very limited indeed. Early integrated-circuit-based synthesizers from Moog, ARP, and EMS opened the door, but it was the arrival of Japanese companies like Korg, Roland, and Yamaha in the mid-1970s that converted potential into popularity.
Digitally Controlled Synthesizers

The popularity of synthesizers got a major boost in 1978, when microprocessor-based instruments began to appear, spearheaded by a new California company called Sequential Circuits.
The Prophet-5, though still hugely limited by today's standards, offered reasonable levels of playability, stability, and polyphony, albeit at a hefty price at the time (around $4,000). Soon Korg, Roland, and Yamaha's microprocessor-based offerings would slash prices in half, and by the turn of the decade the polyphonic synthesizer was firmly on the map for every self-respecting keyboard player, from hobbyists to touring professionals. The days of the Hammond organ, the Fender Rhodes piano, and the Hohner Clavinet were coming to an end, or so we thought.
Stability, playability, and polyphony continued to evolve in the early 1980s, but compatibility remained a thorn in the side of manufacturers.
MIDI concept is born

Visionaries like Dave Smith of Sequential Circuits and Ikutaro Kakehashi of Roland began to worry that this lack of compatibility between manufacturers would restrict people's use of synthesizers, which would ultimately inhibit sales growth. Talk of a "universal" digital communication system thus began circulating in 1981. Dave Smith and Chet Wood presented a paper that year at the AES (Audio Engineering Society) convention proposing a concept for a Universal Synthesizer Interface.
At this time there were seven major players in the synthesizer manufacturing business: ARP, Moog, Oberheim, and Sequential Circuits (US), and Roland, Korg, and Yamaha (Japan). There were others as well, but these were the big players capturing most of the market share. They were all talking about how great and profitable it would be to get instruments to connect to each other.

At the following NAMM (National Association of Music Merchants) show in January 1982, a meeting took place between the leading American and Japanese synthesizer manufacturers where certain improvements were made to the USI specification and the acronym MIDI was adopted.

The first public presentation of a working MIDI connection took place the following year, at the winter NAMM show in January 1983. A Sequential Circuits Prophet-600 was connected to a Roland Jupiter-6 through each synth's MIDI interface. Connecting synthesizers together certainly was not a new idea, but this means of doing so was new and far more effective than earlier solutions.

In the summer of the same year, Yamaha introduced the DX7 FM synthesizer, with MIDI hardware as a standard component. MIDI rapidly found favor with manufacturers who recognized the advantages of standardizing a basic hardware/software interface for data exchange among different machines. It is no exaggeration to say that MIDI fueled an incredibly active period of hardware synthesis development during the late 1980s and early 1990s.
MIDI and the Personal Computer

Simultaneously, the personal computer (PC) was emerging as a potential tool for musicians because of its programmability. Roland seized that opportunity and began work on a musical interface device for the IBM PC. Roland saw the PC as a digital alternative to its analog sequencers, and since this hardware device would allow musical instruments to communicate with IBM PC computers, it was the perfect interface tool to penetrate new markets.

Roland MPU-401 (MIDI Processing Unit), 1984
By 1985, the Commodore 64, Apple II, and IBM PC could all be adapted for MIDI. The Atari ST even had a MIDI interface built in. In 1986, Apple came out with the very popular Macintosh Plus, which quickly became a favorite with musicians because of its GUI.
Music Software

Now that computers could speak MIDI, many new software companies started creating programs for sequencing. Some of the first:

Steinberg Research GmbH, 1984 - now known as Steinberg (owned by Yamaha); created and developed the first MIDI multitrack sequencer.
    First product: Pro-16 for the Commodore 64
    Current products: DAW products include Cubase and Nuendo.

MOTU (Mark of the Unicorn), 1985
    First product: Performer (Apple Macintosh only)
    Current product: Digital Performer 6 (Mac only)

Twelve Tone Systems, 1987 - now known as Cakewalk.
    First product: Cakewalk version 1 for DOS
    Current products: DAW products include Sonar.

C-Lab, 1988 - later Emagic, now owned by Apple.
    First product: Creator
    Current product: DAW products include Logic Pro 8.

Software houses created and marketed music composition programs and other MIDI software. Equipped with the right hardware and software, a musician could use the computer to control synthesizers, drum machines, mixers, effects units - anything equipped with MIDI connectors. The extent of control varied, but the efficiency of the MIDI studio made a revolutionary impact on music production.

One of the advantages of MIDI's modular concept is that you could now pick and choose the system components that best suit your needs. Your favorite keyboard could be linked to any MIDI instrument you please. To add additional synthesizers, you don't necessarily need more keyboards. The newly developed concept of the synthesizer module would save space and money.

What is MIDI?

The Musical Instrument Digital Interface (MIDI) allows musicians, sound and lighting engineers, computer enthusiasts, or anybody else for that matter to use multimedia computers and electronic musical instruments to create, listen to, and learn about music by offering a common language that is shared between compatible devices and software.
MIDI can be divided into three categories:

1. The Protocol - the language MIDI uses.

2. The Hardware Interface - the connections and cables MIDI uses to transmit and receive its information.

3. The File Formats - how MIDI manages and manipulates the data: Standard MIDI Files (SMF), music editing, and sequencing.

1. Protocol

The Protocol of MIDI is a music description language in binary form, in which binary words describe the events of a musical performance.

MIDI was originally intended for keyboard instruments, so many of its events are keyboard-oriented, where the action of pressing a note is like activating an On switch, and the release of that note is like turning a switch Off.

Status and Data Bytes

There are two types of bytes: Status and Data.

All Status bytes begin with a 1 as the MSB and are the first byte transmitted when sending any MIDI command. They serve to identify the kind of information being sent over MIDI. A Status byte tells the receiving device which MIDI channel the event belongs to and what the event is. For example, an event can be a Note On, Note Off, Pitch Bend, Program Change, etc.

All Data bytes begin with a 0 as the MSB, and usually one or two data bytes follow a status byte. They represent some value associated with the status byte. For example, when you strike a middle C on the transmitting keyboard with fairly heavy force, the Status byte would hold a Note On message and the Data bytes would be a note value number of 60d and a velocity level of maybe about 114d.

Note On command

Transmits 3 message bytes:

1st byte - Status byte - Note On command and MIDI channel
2nd byte - Data byte - MIDI note number (0-127)
3rd byte - Data byte - Velocity (0-127)

90h or 10010000 = Note On command on MIDI channel 1
3Ch or 00111100 = MIDI note number 60d, or middle C
72h or 01110010 = Velocity value of 114d, a fairly high value
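As a concrete illustration of the three bytes above, here is a minimal Python sketch (the `note_on` helper is our own naming for this example, not part of any MIDI library):

```python
# Minimal sketch: pack the three-byte Note On message described above.
# The status byte is 1001cccc: 0x90 (Note On) plus the zero-based channel.

def note_on(channel, note, velocity):
    """channel is 1-16 as musicians count it; note and velocity are 0-127."""
    status = 0x90 | (channel - 1)
    return bytes([status, note, velocity])

msg = note_on(1, 60, 114)        # middle C, struck fairly hard
print(msg.hex())                 # -> '903c72'  (90h 3Ch 72h)
```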

Status bytes use numbers ranging from 128d (80h, 10000000b) to 255d (FFh, 11111111b). Data bytes use numbers ranging from 0d (00h, 00000000b) to 127d (7Fh, 01111111b). Again, the first bit (MSB) determines whether it is a Status or Data byte.

MIDI actually transmits a 10-bit word for each byte: 8 bits carry the information (Status or Data) and the 2 extra bits are the start and stop bits that frame each byte for asynchronous serial transmission.

Since the MSB is only used to designate whether a byte is Status or Data, this leaves just 7 bits of value for the rest of the word. Seven bits give a range of 0-127d. This is why you will notice that values in MIDI are often numbered from 0 to 127; when you adjust the controls on some MIDI instruments you will see values range from 0 to 127.

MIDI messages are often represented in one of three formats: decimal, binary, and hexadecimal.

An entire MIDI message can contain up to 3 bytes of information, depending on the MIDI function. The status byte always begins with a 1 and is like the engine pulling the cars (the data bytes). The next 3 bits of the status byte identify the MIDI function, and the last 4 bits identify the MIDI channel, 1-16. (The exception is System Common messages, which are sent out to all MIDI channels.)
16 MIDI Channels, 1-16 (cccc = 0000-1111)

A MIDI channel is like a television or radio channel. It is a way for MIDI to isolate information so that a receiving instrument set to a certain channel will filter out all the other information in the transmission, and reproduce or process only the information to which it is tuned.
Most synths built today are multi-timbral instruments: they can play more than one sound at a time. Using MIDI channels we can access up to 16 sounds per synth. If your instrument is multi-timbral, it will retain only the MIDI messages that apply to its active channels. Let's say you have a multi-timbral instrument set to play channels 1 through 5; this instrument will ignore all messages sent on MIDI channels 6 through 16.
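The channel-filtering behavior described above can be sketched in a few lines of Python (the function names are ours, chosen for illustration):

```python
# Sketch: pull the 4-bit channel out of a status byte and decide whether a
# multi-timbral synth set to channels 1-5 would respond to the message.

ACTIVE_CHANNELS = {1, 2, 3, 4, 5}   # the instrument's active channels

def channel_of(status_byte):
    """Channels are stored 0-15 in the low nibble; musicians count 1-16."""
    return (status_byte & 0x0F) + 1

def responds_to(status_byte):
    return channel_of(status_byte) in ACTIVE_CHANNELS

print(channel_of(0x90))    # Note On, channel 1 -> 1
print(responds_to(0x95))   # Note On, channel 6 -> False (ignored)
```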

MIDI Message Types (Status bytes)

All aspects of your musical performance can be represented in MIDI messages. The following section will identify these aspects and explain how they work and what values are attached to them. To better understand them, the Status bytes have been divided into five categories.

Channel Voice Messages are the basic MIDI events representing a musical performance. The most common Channel Voice Messages are Note On and Note Off, Pitch Bend, Program Change, and Aftertouch.

Channel Mode Messages tell devices to send or receive information in a certain way, which is defined by the mode being sent. An example would be making a device respond in Omni Off mode instead of Omni On mode. In Omni Off mode the device will only receive information on its basic channel; in Omni On mode it can receive data on all 16 channels.

System Common Messages are, as the name would suggest, common to all instruments, devices, or software in your MIDI setup. Sequencers use System Common Messages for MIDI time references (using MIDI Time Code), song position, song selection, and tuning.

System Real Time Messages are synchronization commands used by MIDI to control sequences, such as Start and Stop commands embedded as MIDI commands, along with the MIDI Timing Clock.

System Exclusive Messages are used to send or receive instrument settings such as patch or performance memories. You can also use System Exclusive (SysEx) messages to transfer sample waveforms via MIDI, along with other non-music-related functions.

Channel Voice Messages

This is the backbone of MIDI. Most of what MIDI sends in a live performance is Channel Voice messages. The following section describes each of those message types so you can understand what is transmitted through MIDI and where to look when editing a sequence recorded in a MIDI sequencer.
Note Off (80-8F, 1000cccc) This message is sent when you release a note after striking it. (Nowadays, most devices use a Note On message with a velocity value of 0 to indicate that the note is off.) If your keyboard supports Release Velocity sensing, which detects how fast you release the key, this can be used to change some aspect of the sound. For example, a slow release could change the release envelope of a patch, making the sound decay slowly instead of quickly.
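A receiver that honors the velocity-0 convention might test for a note release like this (a hypothetical helper, sketched in Python):

```python
# Sketch: treat both a true Note Off and a Note On with velocity 0 as a release.

def is_note_off(status, data1, data2):
    kind = status & 0xF0             # top nibble identifies the message type
    return kind == 0x80 or (kind == 0x90 and data2 == 0)

print(is_note_off(0x80, 60, 64))    # true Note Off        -> True
print(is_note_off(0x90, 60, 0))     # Note On, velocity 0  -> True
print(is_note_off(0x90, 60, 100))   # ordinary Note On     -> False
```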
Note On (90-9F, 1001cccc) Every time you play a note, a Note On message is sent. This Note On message contains two pieces of information (besides the MIDI channel): the key or note number and the velocity at which the note was played. Both of these are data bytes, and their values can be between 0 and 127. Below are charts for MIDI note numbers and the velocity range.
MIDI Note Numbers

Octave    C   C#/Db   D   D#/Eb   E    F   F#/Gb   G   G#/Ab   A   A#/Bb   B
  -1      0     1     2     3     4    5     6     7     8     9    10    11
   0     12    13    14    15    16   17    18    19    20    21    22    23
   1     24    25    26    27    28   29    30    31    32    33    34    35
   2     36    37    38    39    40   41    42    43    44    45    46    47
   3     48    49    50    51    52   53    54    55    56    57    58    59
   4     60    61    62    63    64   65    66    67    68    69    70    71
   5     72    73    74    75    76   77    78    79    80    81    82    83
   6     84    85    86    87    88   89    90    91    92    93    94    95
   7     96    97    98    99   100  101   102   103   104   105   106   107
   8    108   109   110   111   112  113   114   115   116   117   118   119
   9    120   121   122   123   124  125   126   127    -     -     -     -
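The chart above follows a simple rule: note number = (octave + 1) x 12 + pitch class. A small Python helper (our own illustration, not part of the MIDI spec) reproduces it:

```python
# Sketch: convert a MIDI note number (0-127) to the name used in the chart,
# where middle C = 60 = C4 and octave -1 starts at note 0.

NAMES = ["C", "C#/Db", "D", "D#/Eb", "E", "F",
         "F#/Gb", "G", "G#/Ab", "A", "A#/Bb", "B"]

def note_name(number):
    octave = number // 12 - 1
    return f"{NAMES[number % 12]}{octave}"

print(note_name(60))     # -> 'C4'   (middle C)
print(note_name(127))    # -> 'G9'   (highest MIDI note)
```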

Velocity range reference

Musical Expression               Musical Notation   MIDI velocity range
Extremely soft                         ppp               1 - 15
Pianissimo (very soft)                 pp               16 - 31
Piano (soft)                           p                32 - 47
Mezzo Piano (moderately soft)          mp               48 - 63
Mezzo Forte (moderately hard)          mf               64 - 79
Forte (loud or hard)                   f                80 - 95
Fortissimo (very loud or hard)         ff               96 - 111
Extremely loud or aggressive           fff             112 - 127

Polyphonic Aftertouch (A0-AF, 1010cccc) Aftertouch is a modulation source and can be assigned to almost anything in the receiving instrument. Vibrato and filter sweeping are common uses of aftertouch. This function measures the amount of aftertouch pressure you put on each key that is being held down (note on). It sends out continuous bytes of data for each note as you press harder or softer on the keys. This can add up to a lot of MIDI data being transmitted.
Program Change (C0-CF, 1100cccc) Sending this message causes the receiving unit to change its program. Some manufacturers might call programs presets, patches, instruments, or other names.

Channel Aftertouch (D0-DF, 1101cccc) Like Polyphonic Aftertouch, except it takes the value of the key that is being pressed the hardest and sends that data out. All keys that are being played respond to the data sent by the key with the most pressure. The result is less data being transmitted, and all keys being played have the same amount of aftertouch modulation applied to them.
Pitch Wheel (E0-EF, 1110cccc) Pitch bending can make the sound more expressive, somewhat like a wind instrument. Because the human ear is very sensitive to pitch changes, the pitch bend message contains two data bytes to determine the bend value (2^14). This gives a resolution of 16,384 steps, which is split around a center, with 8,191 steps above and 8,192 steps below, 0 being the original pitch. This smooths out the stair-stepping effect that would result with only 128 steps. Pitch bend control is a Continuous Controller type, since it continuously sends MIDI messages to update the receiving MIDI device on the position of its controller.
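A receiver reassembles those two 7-bit data bytes into one 14-bit number; here is a sketch in Python (the helper name is ours):

```python
# Sketch: combine the pitch-bend LSB and MSB (7 bits each) into a 14-bit
# value, then center it so 0 means "no bend".

def pitch_bend_value(lsb, msb):
    raw = (msb << 7) | lsb           # 0 .. 16383
    return raw - 8192                # -8192 .. +8191, 0 = original pitch

print(pitch_bend_value(0x00, 0x40))   # wheel centered   -> 0
print(pitch_bend_value(0x7F, 0x7F))   # wheel fully up   -> 8191
print(pitch_bend_value(0x00, 0x00))   # wheel fully down -> -8192
```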
Control Change and Channel Mode (B0-BF, 1011cccc) The keys are not the only way to control the sound. There are 128 control change numbers, identified by the first data byte (the second data byte carries the controller's value). Devices such as a modulation wheel, sustain pedal, volume control, expression pedal, breath controller, and many more are used to give you more control over the expressive elements of a sound. Some are Continuous Controller types and some are Switch Controller types. Included under this status byte are the Channel Mode messages.
Channel Mode Messages
Local Control On/Off is under this status byte. It enables or disables the keyboard from triggering its own internal sounds. When a keyboard is connected to a DAW's sequencer, the MIDI data is received and transmitted back out to the sending device. This can double the MIDI data in the keyboard, since it is receiving from both the DAW and its own keyboard. By turning off Local Control, the device only responds to the DAW.
Channel Mode: This status byte also directs us to the Channel Mode functions, which determine how a device will respond to MIDI messages in and out. There are four mode settings, combined in four ways:

Omni On mode implies that a device can respond to incoming MIDI data on any channel, regardless of its set channel.

Omni Off mode implies that a device can only respond to its base MIDI channel. For instance, if you set your keyboard to channel 1, it will only receive data that is on channel 1, and transmit data on channel 1.

Poly mode implies that a device is capable of polyphony and will enable polyphonic playing on any given MIDI channel.

Mono mode implies that a device will not play more than one note at a time on any given channel.

Mode 1 - Omni On/Poly: rarely used, since data on any MIDI channel will be played back on the device's base channel.

Mode 2 - Omni On/Mono: rarely used; the same as Mode 1 except it will only play back monophonically (1 note at a time).

Mode 3 - Omni Off/Poly: the most used in today's MIDI world. The device will respond to its MIDI channel data and play back with polyphony.

Mode 4 - Omni Off/Mono: rarely used; the same as above but will play back in mono.
System Common, System Real Time and System Exclusive Messages

System Common (F0-FF, 1111nnnn) System Common messages are intended for all MIDI channels in a system, so the last 4 bits define message types, not MIDI channels. The System Real Time and System Exclusive messages are included under this status byte. Most System Common messages relate to synchronization features and are used with sequencers, since they relate to time positioning, song selection, and features on your MIDI device. Here's a look at these messages, identified by the last 4 bits:
F0 11110000 System Exclusive message status byte
F1 11110001 MIDI Time Code Qtr. Frame status byte
F2 11110010 Song Position Pointer status byte
F3 11110011 Song Select (song#)
F4 11110100 Undefined
F5 11110101 Undefined
F6 11110110 Tune Request
F7 11110111 End of SysEx (EOX)
F8 11111000 Timing Clock
F9 11111001 Undefined
FA 11111010 Start
FB 11111011 Continue
FC 11111100 Stop
FD 11111101 Undefined
FE 11111110 Active Sensing
FF 11111111 System Reset
System Exclusive messages address devices by manufacturer. This allows you to send functions that are only related to that particular device and are not common MIDI messages. For example, custom patches for a particular synthesizer (brand and model) can be saved as SysEx at the beginning of a sequence and loaded back into that same brand and model of synth at another studio. As another example, you could send all the parameter settings of a MIDI device into patch-editing software (an editor/librarian) in order to use the computer's GUI to make changes to these parameters, rather than using the device's front-panel LCD. Each manufacturer has its own SysEx ID number assigned by the MMA (MIDI Manufacturers Association).
This ends the Protocol section of MIDI, and believe it or not, this is just an overview, barely scratching the surface of the MIDI protocol. Now we'll talk about the hardware.

2. Hardware Interface

The MIDI hardware interface is an opto-isolated UART (Universal Asynchronous Receiver/Transmitter) device that transmits data in a serial fashion (1 bit at a time) at a rate of 31.25 kbaud (31,250 bits per second). The UART converts serial to parallel when receiving data, and parallel to serial when transmitting data.

The opto-isolator connected to the UART prevents any electrical connection between units. Data is transferred using an LED (light-emitting diode) and a photo-optic cell built into the opto-isolator. The MIDI connectors are of the 5-pin DIN plug type. You can find the same type of male plug at both ends of the cable. All MIDI-equipped devices use female 5-pin DIN jacks.

The cables are a twisted pair with a shield for noise rejection. The shield is only grounded on one side so as not to create a noisy ground loop between instruments. Only pins 4 and 5 carry data. Pin 2 is the shield and is grounded only at the MIDI Out connection of a unit. Pins 1 and 3 are not used in the MIDI 1.0 spec but may be used at a later date if there is a major revision to the MIDI spec.

MIDI is transmitted in a serial fashion (1 bit at a time) at 31.25 kbaud (31,250 bits per second). Since a MIDI byte is really 10 bits on the wire, it takes 320 microseconds to transmit a 10-bit word (31,250 / 10 = 3,125 words per second; 1 / 3,125 = 0.00032 s). So a typical 3-byte message takes 960 microseconds, or about 1 millisecond, to transmit.
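The arithmetic above can be written out directly (constants as stated in the text; the helper name is ours):

```python
# Sketch: MIDI wire timing. Each byte occupies 10 bits on the wire
# (a start bit, 8 data bits, and a stop bit) at 31,250 bits per second.

BITS_PER_SECOND = 31_250
BITS_PER_BYTE = 10

def transmit_time_us(num_bytes):
    """Time to transmit num_bytes over MIDI, in microseconds."""
    return num_bytes * BITS_PER_BYTE * 1_000_000 / BITS_PER_SECOND

print(transmit_time_us(1))   # one byte         -> 320.0
print(transmit_time_us(3))   # 3-byte Note On   -> 960.0
```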

Typical MIDI Setup Configurations

At its most basic level, MIDI lets the user tie one synthesizer to another so that both can be played from the same keyboard. One is the transmitter, or master, generating information that is understood by the second synth, the receiver, or slave.

Synth A - Master        Synth B - Slave

For instance, when you play Synth A's keyboard, the sound of Synth B can be layered along with Synth A. But when you play Synth B's keyboard, you will only hear Synth B.

Daisy-Chain Network
The MIDI Thru connector receives a copy of any digital message coming into the MIDI In connector and sends a duplicate of this information out of the MIDI Thru port into the MIDI In of a third MIDI device. This allows the user to have more than two MIDI devices connected at once. The MIDI Out port of the second or third device in the diagram below would not work for this purpose, because it only sends MIDI information generated by that particular synthesizer. The MIDI Thru port passes the information arriving at MIDI In on to the next device.

When MIDI devices are linked together by a series of MIDI In and MIDI Thru connections, it is referred to as a Daisy-Chain Network.


In the next example we have added a computer and MIDI interface. The first order of business is to connect the master keyboard to the computer so they can communicate with each other. Next, connect the three tone generators (synthesizers without keyboards). The MIDI Out on the MIDI interface may also act as a MIDI Thru that relays a copy of the MIDI In information. This will allow the keyboard to communicate with the computer and the three tone generators. Use the concept of the daisy-chain network, set up from the MIDI Thru port of the keyboard.

One of MIDI's limitations is that daisy-chaining becomes impractical with more than four instruments. MIDI transmission is pretty fast, 31.25 kbaud (31,250 bits per second), but because it transmits in a serial fashion (1 bit at a time) instead of a parallel fashion (1 byte at a time), it can get bogged down. After 4 connections, a perceptible time delay can occur. To remedy this effect, a star network is used.

Multi-Port Star Network

A multi-port star interface receives MIDI data at its MIDI In ports, then copies the information and sends it out to two or more Thru ports. Each MIDI In port may be assigned to specific MIDI Thru ports. Now we connect the keyboard controller so that it sends information to the MIDI interface. Then connect the MIDI interface back to the keyboard. Finally, connect the three remaining tone generators using a star set-up. Do not use a daisy-chain set-up for these connections.

MIDI Timing Accuracy and Running Status

Since MIDI was designed for musical performance data, it must provide sufficiently accurate timing to preserve the rhythmic integrity of the music. The ear is quite sensitive to small variations in timing that can destroy a musical phrase. This is particularly true for grace notes, strummed chords, clusters of notes, and rhythmically complex and syncopated music.
Latency (the delay between when an event is triggered and when the resulting sound occurs) is also important: musical instruments feel more and more sluggish to play as latency increases. Since sound travels at about 1 ms per foot, a latency of 7 ms is roughly equal to the maximum separation between members of a string quartet. In practice, a latency of 10 ms is generally imperceptible, as long as the variation in latency (due to bottlenecks in the MIDI data stream) is kept small.
With a data transmission rate of 31.25 kbaud, and 10 bits transmitted per byte of MIDI data, a 3-byte Note On or Note Off message takes about 1 ms (960 microseconds) to be sent. Since MIDI data is transmitted serially, a pair of musical events which originally occurred at the same time must be sent one at a time in the MIDI data stream and cannot be reproduced at exactly the same time. Luckily, human performers almost never play two notes at exactly the same time. Notes are generally spaced at least slightly apart. This allows MIDI to reproduce a solo musical part with quite reasonable rhythmic accuracy.
However, MIDI data being sent from a sequencer can include a number of
different parts. On a given beat, there may be a large number of musical events
that should occur virtually simultaneously - especially if the events have been
quantized. In this situation, many events will have to wait their turn to be
transmitted over MIDI. Worse, different events will be delayed by different
amounts of time (depending on how many events are queued up ahead of a given
event). This can produce a kind of progressive rhythmic smearing that may be
quite noticeable. A technique called running status is provided to help reduce
this rhythmic smearing effect by reducing the amount of data actually
transmitted in the MIDI data stream.
Running status is based on the fact that it is very common for a string of
consecutive messages to be of the same message type. For instance, when a chord
is played on a keyboard, ten successive Note On messages may be generated,
followed by ten Note Off messages. When running status is used, a status byte is
sent for a message only when the message is not of the same type as the last
message sent on the same Channel. The status byte for subsequent messages of
the same type may be omitted (only the data bytes are sent for these subsequent
messages).
The effectiveness of running status can be enhanced by sending Note On
messages with a velocity of zero in place of Note Off messages. In this case, long
strings of Note On messages will often occur. Changes in some of the MIDI
controllers or movement of the pitch bend wheel on a musical instrument can
produce a staggering number of MIDI Channel voice messages, and running
status can also help a great deal in these instances.
So most modern MIDI hardware (e.g. synths) and software use running status. It is assumed that, once the expected number of data bytes has been sent or received, if the next byte is *not* a status byte, then the last status byte received should be used to decipher the following data bytes. Typically this results in about a 1/3 reduction in the number of bytes sent.
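Here is a sketch of a running-status encoder in Python (our own illustration, not code from any particular MIDI library): the status byte goes out only when it differs from the previous one.

```python
# Sketch: encode a list of (status, data...) messages using running status.
# A status byte is emitted only when it changes; otherwise only data bytes go out.

def encode_running_status(messages):
    out, last_status = [], None
    for status, *data in messages:
        if status != last_status:
            out.append(status)
            last_status = status
        out.extend(data)
    return bytes(out)

# A three-note chord, all Note On messages on channel 1:
chord = [(0x90, 60, 100), (0x90, 64, 100), (0x90, 67, 100)]
encoded = encode_running_status(chord)
print(len(encoded))          # 7 bytes instead of 9
print(encoded.hex())         # -> '903c6440644364'
```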


3. MIDI File Formats

For pure live real-time performance using MIDI systems there is no need for file formats, since nothing is being saved. However, if the MIDI data is to be stored as a data file, and/or edited using a sequencer, then some form of "time-stamping" for the MIDI messages is required.
When you record MIDI into a sequencer program, it saves the data in a proprietary format that can only be read by the same or a similar program. In other words, if you do a MIDI project in Reason and try to open it on another computer using Reason, it should open without any problems, and all of your settings and sounds should be there as well. But if you try to open it in another application like Cubase or Pro Tools, it won't open, since it wasn't created in that application. There is a way to get around this obstacle, making it possible to save MIDI performance data and have it compatible with practically every sequencing program out there.
The Standard MIDI Files specification provides a standardized method for
handling time-stamped MIDI data. This standardized file format for time-stamped
MIDI data allows different applications, such as sequencers, scoring packages,
and multimedia presentation software, to share MIDI data files.
The specification for Standard MIDI Files defines three formats for MIDI files.
MIDI sequencers can generally manage multiple MIDI data streams, or "tracks".
Standard MIDI Files using Format 0 store all of the MIDI sequence data in a
single track. Format 1 files store MIDI data as a collection of tracks. Format 2
files can store several independent songs as a series of Format 0 sequences;
Format 2 never really caught on and is rarely used. Most sophisticated MIDI
sequencers can read either Format 0 or Format 1 Standard MIDI Files. Format 0
files may be smaller, and thus conserve storage space; they may also transfer
using slightly less bandwidth than Format 1 files. However, Format 1 files may
be viewed and edited more directly, and are therefore generally preferred.
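The format a file uses is declared in its 14-byte header chunk, tagged "MThd". The sketch below reads that chunk using the field layout from the SMF specification (a 4-byte chunk ID, a 4-byte length of 6, then three big-endian 16-bit fields: format, track count, and time division).

```python
import struct

# Minimal sketch of reading an SMF header chunk ("MThd") to discover
# which of the three formats a file uses.

def parse_mthd(data):
    """Return (format, ntracks, division) from the 14-byte MThd chunk."""
    chunk_id, length = struct.unpack(">4sI", data[:8])
    if chunk_id != b"MThd" or length != 6:
        raise ValueError("not a Standard MIDI File header")
    fmt, ntracks, division = struct.unpack(">HHH", data[8:14])
    return fmt, ntracks, division

# A hand-built header: Format 1, 4 tracks, 480 ticks per quarter note.
header = b"MThd" + struct.pack(">IHHH", 6, 1, 4, 480)
assert parse_mthd(header) == (1, 4, 480)
```

A Format 0 file would carry `ntracks == 1` by definition, while a Format 1 file counts its tempo track plus each instrument track.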

Before we go further with file formats, this is a good time to talk about an
addendum to the MIDI protocol that revolutionized the MIDI industry in the early
'90s. During this period, desktop musicians, multimedia producers, and game
developers began clamoring for some level of playback predictability when
exchanging Standard MIDI Files (SMFs). Understandably, composers and arrangers
wanted to ensure that piano parts would be played with piano patches and that
drums wouldn't sound like violins.

General MIDI 1
In the mid '80s, Roland produced a sound module called the MT-32. While its
sound quality was less than stellar, the MT-32 filled a need for an inexpensive
tone module that could be connected via MIDI to a computer to play back
sequences for trade-show presentations and video games. Roland repackaged the
MT-32 on a PC card called the LAPC-1 and sold quite a few. This helped to spawn
the idea of GM.
In the late '80s, MIDI manufacturers saw a huge, untapped market at the
consumer level. Computer games were becoming very popular, and there was a need
to standardize music and sound effects. Business multimedia presentations and
amateur musicians composing with MIDI software all needed a way to standardize
MIDI music so that it would sound practically the same on any instrument on
which it was played back. This led to the idea of General MIDI.
The first GM module was the Roland SC-55 Sound Canvas. It did very well, so
other manufacturers started producing GM-compatible equipment. GM is now the
standard for computer sound cards, and most high-end synths include at least a
bank of GM sounds among their factory presets.
General MIDI (GM) is a specification for synthesizers that imposes several
requirements beyond the MIDI standard. While MIDI itself provides a protocol
that ensures different instruments can interoperate at a fundamental level
(e.g. that pressing keys on a MIDI keyboard will cause an attached MIDI sound
module to play musical notes), General MIDI goes further in two ways: it
requires that all GM-compatible instruments meet a certain minimal set of
features, such as being able to play at least 24 notes simultaneously
(polyphony), and it attaches specific interpretations to many parameters and
control messages that MIDI itself leaves unspecified, such as defining an
instrument sound for each of the 128 program numbers.
General MIDI was first standardized in 1991 by the MIDI Manufacturers Association
(MMA) and the Japan MIDI Standards Committee (JMSC), and has since been adopted
as an addendum to the main MIDI standard.
To be GM1 compatible, a GM1 sound-generating device (keyboard, sound module,
sound card, software program or other product) must meet the General MIDI System
Level 1 performance requirements outlined below, instantaneously upon demand, and
without additional modification or adjustment/configuration by the user.
Voices: A minimum of either 24 fully dynamically allocated voices available
simultaneously for both melodic and percussive sounds, or 16 dynamically
allocated voices for melody plus 8 for percussion. All voices respond to
velocity.

Channels: All 16 MIDI channels are supported. Each channel can play a variable
number of voices (polyphony) and a different instrument (sound/patch/timbre).
Key-based percussion is always on MIDI channel 10.

Instruments: A minimum of 16 simultaneous, different timbres playing various
instruments, with at least 128 preset instruments (MIDI program numbers)
conforming to the GM1 Instrument Patch Map and 47 percussion sounds conforming
to the GM1 Percussion Key Map.

Channel Messages: Support for continuous controllers 1, 7, 10, 11, 64, 121 and
123; RPNs 0, 1 and 2; Channel Pressure; and Pitch Bend.

Other Messages: Respond to the Data Entry controller and the RPNs (Registered
Parameter Numbers) for fine and coarse tuning and pitch-bend range, as well as
all General MIDI Level 1 system messages.
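In practice this standardization means a sequencer can address any GM device with the same raw channel messages. The sketch below builds two such messages by hand: a Program Change to pick a GM instrument, and a Note On for key-based percussion on channel 10 (index 9, since channels are numbered from 0 on the wire).

```python
# Raw channel messages a sequencer might send to set up and play a GM part.

def program_change(channel, program):
    """Program Change: status 0xC0 | channel, one data byte (0-127)."""
    return bytes([0xC0 | (channel & 0x0F), program & 0x7F])

def note_on(channel, key, velocity):
    """Note On: status 0x90 | channel, then key and velocity data bytes."""
    return bytes([0x90 | (channel & 0x0F), key & 0x7F, velocity & 0x7F])

# GM program 0 is Acoustic Grand Piano; select it on channel 1 (index 0).
piano = program_change(0, 0)
# On channel 10 (index 9) each key is a drum; GM key 38 is Acoustic Snare.
snare = note_on(9, 38, 100)
```

Because the GM1 Instrument Patch Map fixes program 0 to a piano and key 38 on channel 10 to a snare, these same bytes produce recognizably the same part on any conforming device.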
General MIDI is limited by the quality of whatever sound source plays it back.
Cheaper sound cards may conform to the GM standard, but their sound quality can
be drastically inferior to that of a higher-end product.
The advent of GM spawned a whole new market of music programmers who record
MIDI sequences of popular music in GM format and sell them to amateur,
semi-pro, and pro musicians for use in live performance.

General MIDI 2
General MIDI 2 was adopted in 1999 and added some significant improvements to
the GM1 standard:

32-note polyphony.

MIDI channels 10 and 11 can simultaneously play percussion sounds.

256 program sounds; essentially, more variations of the original 128.

GM2's introduction of key-based instrument controllers is a major step forward
in drum programming: they let you change the sound, pan and volume of
individual drums instead of just the whole kit.
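Per-drum control is addressed with NRPN messages: CC#99 (NRPN MSB) selects the parameter, CC#98 (NRPN LSB) selects the drum key, and CC#6 (Data Entry) sets the value. The parameter number used below (0x1C for drum pan) follows common GM2 practice but is an assumption here; check your device's implementation chart before relying on it.

```python
# Hedged sketch of a GM2 key-based drum controller via NRPN.
# The parameter number 0x1C (drum pan) is assumed, not verified.

def control_change(channel, controller, value):
    """Control Change: status 0xB0 | channel, controller and value bytes."""
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

def set_drum_pan(channel, key, pan):
    """Pan one drum key via NRPN (assumed parameter MSB 0x1C)."""
    return (control_change(channel, 99, 0x1C) +  # NRPN MSB: drum pan (assumed)
            control_change(channel, 98, key) +   # NRPN LSB: which drum key
            control_change(channel, 6, pan))     # Data Entry: pan value

# Pan the snare (key 38) hard left on channel 10 (index 9).
msg = set_drum_pan(9, 38, 0)
```

Note how the drum key rides in the NRPN LSB: that is what makes the controller "key-based", as opposed to an ordinary CC#10 pan message that would move the whole kit.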

MIDI on the Web


When it comes to putting your music on the Web, there are two options: posting
audio files or posting MIDI files. With MIDI files you don't control the final
output on the user's system, and since each system is different, the sound
quality (as well as the sounds themselves) can vary widely from system to
system. There are options that help limit this uncertainty, among them
DownLoadable Sounds (DLS) and the eXtensible Music Format (XMF).
DLS - DownLoadable Sounds
DLS provides a means for game developers and composers to add their own custom
sounds to the GM sound set stored in a sound card's ROM. DLS-compatible devices
automatically download these custom sounds from the SMF disk or the internet
into system RAM, allowing MIDI music to be freely augmented with new instrument
sounds, dialog, or special effects, thus providing a universal interactive
playback experience along with an unlimited palette of sounds. At the same
time, it enables the wavetable synthesizers in computer sound cards to deliver
improved audio at no additional cost.
DLS enables the author to completely define an instrument by combining a
recorded waveform with articulation information (such as attack transients). An
instrument defined this way can be downloaded into any hardware device that
supports the standard and then played like any standard MIDI synthesizer.
Together with MIDI, DLS delivers a consistent playback experience; an unlimited
sound palette for both instruments and sound effects, unlike GM; and true audio
interactivity, unlike digital audio.
XMF - eXtensible Music Format
XMF is a family of music-related file formats created and administered by the
MIDI Manufacturers Association in conjunction with Beatnik. XMF is based on the
idea of containing one or more existing files, such as Standard MIDI Files, DLS
instrument files, WAV or other digital audio files, to create a collection of
all the resources needed to present a musical piece, an interactive web-page
soundtrack, or any other piece of media built from pre-produced sound elements.
The format is actually a meta-format: a container file that points to other
types of files. It loads the GM SMF and opens an XMF-capable player (such as
QuickTime or Windows Media Player) while downloading any digital audio; as soon
as it has enough information to start playback, it begins to play the content.
In the case of XMF files, the content is usually a MIDI file with its
associated DLS and other digital audio files.
