MIDI: Past, Present & Future
Article from Sound On Sound, June 1992
The Musical Instrument Digital Interface has revolutionised the way we write and record music. Martin Russ delves deep into MIDI's 10-year history, explains its many recent changes and additions, and looks ahead to see what the future will hold.
Depending on when you define the point at which the Musical Instrument Digital Interface first appeared, MIDI will be 10 years old sometime in the next year or so. Electronic musical instruments and equipment have changed enormously in that time — in fact, I find it increasingly hard to remember what things were like 'before MIDI'. So as the first MIDI decade draws to a close, this is a good time to recap on MIDI's origins, look at its current situation, and explore some of its future possibilities.
Voltage controlled synthesizers first appeared in 1964: Bob Moog took some of the ideas used in audio test equipment and analogue computers, but used them to produce a general purpose sound making tool which used voltages as a way to both control and carry the audio signal. The important parts of the sophisticated electronics which made this possible were the transistor and the integrated circuit, both spin-offs from the American Apollo space program. The possibilities offered by Moog's synthesizers were explored by electronic musicians, and Walter Carlos's Switched On Bach album popularised the concept of using electronics to produce virtuoso, one-man performances of classical music.
The disadvantage of using analogue voltages to both control and carry a sound is that there is more than one way to define the relationship between a control signal and the result it produces. Musical pitch, for example, involves a doubling of frequency for an interval of one octave, and one (exponential) voltage control system uses a change of one volt to produce a pitch change of one octave. Alternatively, you could use a linear voltage system where the frequency is in direct proportion to the voltage — double the voltage and you double the frequency. Triggering envelopes merely involves defining 'on' and 'off' voltage levels, but even those trigger levels can be positive, negative or bipolar.
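As a rough illustration of the difference between the two conventions, here is a minimal sketch in Python. The calibration points (8.1758Hz at 0V for the exponential scheme, 1kHz per volt for the linear one) are purely illustrative assumptions, not figures taken from any particular instrument.

```python
import math

def volts_per_octave(freq_hz, ref_freq=8.1758, ref_volts=0.0):
    """Exponential (1V/octave) convention: each added volt doubles the frequency.
    ref_freq/ref_volts is an assumed calibration point, chosen arbitrarily."""
    return ref_volts + math.log2(freq_hz / ref_freq)

def hertz_per_volt(freq_hz, hz_per_volt=1000.0):
    """Linear (Hz/V) convention: frequency is directly proportional to voltage.
    The 1000 Hz-per-volt scale factor is purely illustrative."""
    return freq_hz / hz_per_volt

for note, f in [("A2", 110.0), ("A3", 220.0), ("A4", 440.0)]:
    print(note, round(volts_per_octave(f), 3), "V (exponential)",
          round(hertz_per_volt(f), 3), "V (linear)")
```

Run over a few octaves of A, the exponential column steps by exactly one volt per octave, while the linear column doubles each time: two incompatible answers to the same musical question.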
Since no manufacturer expected you to use anything other than their brand of equipment, no real standard for control voltages or trigger levels emerged. This made interfacing synthesizers difficult, if not virtually impossible. Worse, the arrival of cheap microprocessor chips meant that polyphonic performance synthesizers with patch memories became possible, and a new set of (incompatible) digital interfaces began to appear: Roland's Digital Communication Bus; Sequential Circuits' Digital Interface; and Oberheim's Computer/Synthesizer Interface.
However, the presidents of these companies had already been discussing a common interface, and the first outline specification document was presented at the Autumn 1981 Audio Engineering Society (AES) convention by Sequential Circuits (SCI). The specification, called the Universal Synthesizer Interface, described an interface based around musical events. The USI used a serial data format based on the RS422 standard, using TTL levels at a rate of 19.2kBaud (19,200 bits per second), and used 1/4" jack connectors. The use of a high speed serial transmission system was intended to avoid the problems of unwieldy cables associated with using parallel interfaces (as in the Oberheim system). The president of SCI, Dave Smith, then sent a questionnaire to manufacturers and consultants, and this resulted in a meeting held at the January 1982 NAMM convention in Anaheim, California (the National Association of Music Merchants show is to the US what the Frankfurt Music Messe is to Europe).
The representatives at this meeting were a good cross-section of the more active synthesizer manufacturers at the time, including: SCI; Roland; Oberheim; CBS/Rhodes; Yamaha; Emu; Korg; Kawai; Octave Plateau; Passport Designs; and Alpha Syntauri. Opto-isolation was added to the USI specification to help eliminate ground loops, and the transmission speed was increased to 31.25kBaud to try and reduce the delays inherent in a serial system.
Some of the Japanese companies then proposed an alternative specification which went far beyond the simple note on and off codes in the USI proposal. The Japanese proposal offered a data structure with separate status and data bytes indicated by the most significant bit in the byte, thus simplifying the protocol requirements. More complex operations were suggested to extend the interface well beyond just note information.
After the meeting, SCI collated the two proposals and produced a composite proposal which consolidated the best ideas into a single specification, and Bob Moog publicly announced MIDI in his monthly column in the October 1982 issue of America's Keyboard magazine.
The first commercially available instrument to feature MIDI sockets was the SCI Prophet 600 polysynth, released in December 1982. The first major MIDI document (The Complete SCI MIDI, 1st Edition — and I've got a copy!) was produced after further exchanges between SCI and Roland (who liaised with Yamaha, Korg and Kawai) and was published by SCI in January 1983. In February 1983, Yamaha announced the DX7, although it wasn't until the NAMM show in June 1983 that a DX7 and Prophet 600 could be connected together — and there were some problems arising from interpreting the proposed specification.
This early MIDI proposal had some differences from today's version. There were three MIDI Clock bytes: $F8 was 'MIDI Clock in Play', whilst $FC was 'MIDI Clock in Stop' and $F9 was a Measure End marker sent instead of a MIDI Clock. These were designed to enable PLL-based clock generators to sync up by providing a clock signal at all times, which reveals a bias towards the use of analogue circuitry. Although SCI did not define the $En byte, they used it for the Pitch Bend Wheel in the Prophet 600, but also designated Controller Number 0 as Pitch Bend. The Prophet 600's Mod Wheel was assigned to Controller Number 1, the next one available. Yamaha added an 'active sensing' signal of $F0,$43 which did not have an EOX (End of Exclusive) byte and was subsequently changed to the $FE byte. Yamaha's early DX7s also used Controller Number 3 for aftertouch, and implemented its monophonic mode with 1-note/voice polyphony and one part multi-timbrality, thus wasting the other 15 voices!
With further modifications, the proposal became the MIDI Specification Version 1.0 in Japan on the 5th of August 1983, with SCI, Roland, Yamaha, Korg and Kawai agreeing on the definition of the Musical Instrument Digital Interface. Like many 'de facto' standards, the MIDI specification is the result of voluntary agreements between manufacturers, but it is not a formal 'standard' in the legal sense of the word. Manufacturers use the MIDI specification and the non-proprietary MIDI protocol because it adds value to a product by making it compatible with other MIDI devices.
The Prophet 600 may have been the first commercial synthesizer to have MIDI sockets on its rear panel, but the major influence on the evolution of MIDI over the next couple of years was the Yamaha DX7. Massive sales provided a huge user base of DX7 owners ready to exploit the new possibilities offered by MIDI, but they also helped to make the DX7's MIDI implementation a strong influence on the evolution of MIDI itself. The original MIDI Specification may have defined the important aspects of the MIDI interface like the connectors, baud rate and byte format, but many areas were deliberately left undefined.
One of the major problems with defining something too closely is that there is no room for expansion. If you currently own nine CDs, do you allow storage space for 10, assuming that you will only ever buy one more, or do you assume that you will eventually need space for perhaps 30 or more? The designers of MIDI were careful to fix only the parts of the MIDI specification that were essential, which left things like Controller Numbers and System Exclusive free for manufacturers to use as they wanted. In much the same way that MIDI itself was the product of contributions from many people, so the undefined parts left room for the process of continuous evolution to happen. MIDI is thus not designed to be an interface that is fixed and rigid for all time, but one that develops as the need arises.
Take, for example, the MIDI Controller Number assignments of the DX7:
| Controller Number | Controller |
|---|---|
| 2 | Breath Controller |
| 4 | Foot Controller |
| 5 | Portamento Time |
| 6 | Data Entry Slider |
| 64 | Sustain Switch |
| 65 | Portamento Switch |
| 96 | Data Entry Increment (+1) |
| 97 | Data Entry Decrement (-1) |
Controller Number 3 is missing because it was originally (mis)used on the early DX7 for aftertouch. All the other assignments were so well established because of their use by the many DX7 users that they were incorporated in the MIDI 1.0 Detailed Specification document that was released in October 1985. This document took account of the way that MIDI had developed since 1983, and included many additions and enhancements — whilst still retaining the original format unchanged.
One problem with this evolutionary process, essentially waiting to see what becomes dominant and then adopting it, is that you can get temporary inconsistencies. For example, Casio CZ101 synthesizers used MIDI Controller 6 as their master tuning control, which meant that anyone with a DX7 connected to a CZ101 could accidentally detune the CZ by moving the DX's data slider. To avoid this, two organisations were founded to co-ordinate possible additions to the MIDI specification and then release them in a controlled way. The MIDI Manufacturers Association (MMA: mostly American and European manufacturers) and the Japanese MIDI Standards Committee (JMSC: Japanese manufacturers) were formed to deal with such extensions and further development of MIDI, and they have been especially busy in the last couple of years with many new additions.
Before we look at the latest enhancements it is probably a good idea to look at some of the other ways that the MIDI Specification has developed. Although the fixing of some of the MIDI Controller Numbers was defined in the Detailed Specification, some much more important changes were made to the Controllers. The On/Off switches (Controller Numbers 64 to 95) were converted to 7-bit controllers (continuous switches!) to maximise the potential use of the Controller Numbers. Although at first you might question having something like the Sustain Pedal (Controller Number 64) as anything other than an on/off switch, consider the additional control that you would gain if you could control how long the sustained notes lasted, or at what level they sustained — you would then use a footswitch with several levels instead of a simple on/off action. Some electronic pianos have exactly this sort of extra control, although it is not common in professional hi-tech equipment.
Four 14-bit and four 7-bit Controller Numbers were assigned to General Purpose use, but the main innovation was the definition of Parameter Numbers. These enable the number of available MIDI Controllers to be extended beyond the 128 which are available by using the basic Controller message ('$Bn,ControllerNumber,Value') to over 32,000. This is achieved by using two 14-bit Controller Numbers to 'point' to one of these extra parameters, whose value is then changed by using the normal Data Increment and Data Decrement messages (Controller Numbers 96 and 97). So, the loss of four Controllers opens up two sets of 16,384 Parameter Controller Numbers.
The Registered Parameters are defined for parameters which apply to a wide range of devices, such as Pitch Bend Sensitivity, whilst the Non-Registered Parameters are intended to be used for manufacturer or instrument-specific parameters. In practice, few manufacturers have taken advantage of this recommended way of mapping the editing parameters of their equipment, and most continue to use System Exclusive commands instead. The Reset All Controllers message (Controller Number 121) is used to reset MIDI Controllers and Pressures to their initial states: Pitch Bend to the centre position, Modulation Wheel to zero etc.
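As a hedged sketch of how this pointer mechanism works in practice, the fragment below builds the byte stream for setting Pitch Bend Sensitivity (Registered Parameter 0). The pointer Controller Numbers (101/100) and the Data Entry controller (6) are taken from the standard Controller assignments rather than quoted in this article, and the Data Increment/Decrement messages (96/97) mentioned above could be substituted for the final message.

```python
def cc(channel, controller, value):
    """Build one Control Change message ($Bn, controller, value)."""
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

def set_registered_parameter(channel, param, value_msb):
    """Point at a Registered Parameter with CC 101 (MSB) / CC 100 (LSB),
    then adjust it, here via Data Entry MSB (CC 6)."""
    msb, lsb = (param >> 7) & 0x7F, param & 0x7F
    return (cc(channel, 101, msb) + cc(channel, 100, lsb) +
            cc(channel, 6, value_msb))

# Example: set Pitch Bend Sensitivity (Registered Parameter 0) to +/-2 semitones
# on MIDI channel 1 (channel index 0).
print(set_registered_parameter(0, 0, 2).hex(" "))
```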
MIDI Controller Numbers were not the only parts of MIDI to be modified in the Detailed Specification. The original 1983 MIDI Specification says that instruments should initially power up in Omni mode (Mode 1, in which a synth will respond to data on any MIDI channel) until they receive a Mode message — which makes sense for testing the setup of a small MIDI network but requires lots of external reprogramming before you can use all your gear with a multi-track sequencer. This became a recommendation in 1985, with the comment that a changed mode should persist even after power-down, which reflects the proliferation of battery-backed RAM storage in digital instruments. (The January 1983 proposal document makes it clear that the original intention was for instruments to regularly transmit a Mode message via MIDI, so that a MIDI network could be configured by the controlling 'master' keyboard setting the modes of all the others. This seems to have been lost in the flurry of changes that followed.)
There was still room to get things wrong. The DX7 (with modified ROMs to correct the aftertouch mis-assignment) only operates in Poly mode (Mode 3) — even when it is operating in its own monophonic key assignment mode. When a DX7 is playing a monophonic voice it becomes a polyphonic synthesizer with only one voice! To confuse matters further, the DX7 accepts a Mono all notes off Mode Message (Controller 126), intended only for instruments in Mono mode, although it does not actually change to that MIDI mode. Although the DX7 can receive on any channel, there is no way to set Omni reception from the front panel, and it will only transmit on channel 1. Local Control is permanently on, so using the DX7 as a master keyboard with a sequencer can be tricky. These all reflect the fact that the DX7 was an early MIDI instrument — the DX7 II, released in 1987, had a much more complete and comprehensive MIDI implementation.
The Detailed MIDI spec also took note of some of the terminology that had evolved around the whole subject of MIDI. The original intention of Mode 4 (Mono On, Omni Off) was to assign the polyphony across channels, with each channel only receiving monophonic note messages. This is how a guitar works — although you can play up to six-note chords, each string can only produce one note at a time. In practice, this mode has only really been used for guitar synthesizers, where the individual pitch, velocity and pressure controllers can be fully exploited. Instead, Mode 3 (Omni Off, Poly On) has been extended into what is often called 'Multi' mode, where a single synthesizer can be configured so that it behaves as if it was several different instruments operating in Mode 3, but on different channels.
One of the most recent new messages blurs the boundary between Modes 3 and 4: the Legato Switch (Controller Number 68) temporarily shifts the receiving synthesizer from Mode 1 or 3 into something similar to Mode 4, where new notes only change the pitch and do not re-trigger the envelopes. Release the Legato controller and everything returns to the previous state. In other words, it allows you to turn a 32-note polyphonic synthesizer into an emulation of a monosynth in mid-performance.
Although multi-timbrality and the use of the 'Multi' mode has been very successful as a marketing tool, it removes the one major advantage of the Mono mode. The Mode Message for the MIDI Mono mode indicates how many voices are available — so an 8-note polyphonic instrument will occupy eight MIDI channels, and this can be indicated or set by using Mode Messages. This would avoid some of the current problems of losing notes because of inadequate polyphony — you could determine the polyphony of a MIDI network automatically. Perhaps the emerging multi-port MIDI interfaces with their capacity for dealing with more than the standard 16 MIDI channels may see the re-emergence of Mode 4 as a powerful and expressive tool for instruments other than guitar synthesizers.
The Detailed MIDI Specification also has a few quirks of its own. Mode 2 (Omni On, Mono On) is unusual in that it defines a MIDI instrument which responds monophonically to note messages on any channel. Using this mode effectively reduces a MIDI system to one note at once!
"Both MIDI Time Code and the Sample Dump Standard show how MIDI's 'open architecture' allows major changes to he made whilst still retaining backward capability"
There is also considerable confusion about the All Notes Off (ANO) command (Controller 123 = $7B) and its interaction with a Sustain/Hold pedal (Controller Number 64). Some instruments transmit an ANO whenever you lift your hands from the keyboard or whenever you release the Sustain/Hold Pedal, which then poses the problem of what to do with notes which are still being held down with your fingers. Some instruments put all the notes into the release stage of the envelope, including those you are still holding down. You might reasonably ask questions like: should an ANO message really turn all notes off, or does an ANO message only turn notes off which are not being held by a player? In fact, what should actually happen is that only the notes which are playing when an ANO is received will continue to sound until the sustain pedal is released — although some older instruments do not actually behave in this way. It is quite possible to fill pages and pages with detailed examination of the pros and cons of this argument, but the best way to deal with the whole thing is to always filter out ANO messages at the input of your sequencer.
The recently introduced All Sounds Off (ASO) message (Controller Number 120) aims to reach the places that ANO can't, and is presumably intended as a panic button — the 'traditional' way of attempting to turn off sustaining notes involves either sending ANO messages on all channels, or sending Note Off messages for every note number on each channel. Whereas the ANO message puts notes into the release stage of their envelopes, an ASO message implies that notes should be stopped as quickly as possible.
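A minimal sketch of that 'traditional' panic sequence might look like this in Python. Following each All Notes Off with an All Sounds Off is just one reasonable policy, not a requirement of the spec.

```python
def panic_stream(include_all_sounds_off=True):
    """Build a 'panic' byte stream: All Notes Off (CC 123) on every channel,
    optionally followed by All Sounds Off (CC 120) to silence anything still
    in the release stage of its envelope."""
    out = bytearray()
    for ch in range(16):
        out += bytes([0xB0 | ch, 123, 0])       # All Notes Off on channel ch+1
        if include_all_sounds_off:
            out += bytes([0xB0 | ch, 120, 0])   # All Sounds Off on channel ch+1
    return bytes(out)

print(len(panic_stream()), "bytes to transmit")
```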
Perhaps the most far reaching additions to MIDI were MIDI Time Code and the Sample Dump Standard. Both of these arise from developments in technology, and show the way that MIDI's 'open architecture' organisation allows major changes to be made whilst still retaining backwards compatibility.
MIDI Time Code (MTC) was introduced as a way of linking the professional film and video industry to the MIDI world, by allowing SMPTE times to be used as the basis for an absolute timing reference within MIDI systems. This arises because of the way that MIDI quickly evolved from just a way of connecting a couple of synthesizers together into a sophisticated music creating environment, and the way that video technology has become closely involved with the audio industry. The musical roots of MIDI are revealed by its use of relative timing in the form of the Song Position Pointers — the time at which a musical event happens will change if you alter the tempo, whereas for film and video use you need musical events to happen at specific times.
MTC uses the System Common $F1 Status byte to produce Quarter Frame messages which are used when playing or recording, and the System Exclusive Real Time Universal format ($F0 7F...) to give Full messages which provide complete time in a single message and are used after operations like fast-forward or rewind.
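To make the Quarter Frame mechanism concrete, here is a minimal sketch of how the eight $F1 messages for one SMPTE time might be assembled. The nibble ordering and frame-rate codes follow the published MTC format, and the example time is arbitrary.

```python
def mtc_quarter_frames(hours, minutes, seconds, frames, rate_code=3):
    """Build the eight MTC Quarter Frame messages ($F1 dd) for one SMPTE time.
    Each data byte carries a 4-bit nibble of the time; its top three bits say
    which nibble (0 = frame LSN ... 7 = hour MSN plus the frame-rate code).
    rate_code: 0 = 24fps, 1 = 25fps, 2 = 30 drop-frame, 3 = 30 non-drop."""
    nibbles = [
        frames & 0x0F,  frames >> 4,
        seconds & 0x0F, seconds >> 4,
        minutes & 0x0F, minutes >> 4,
        hours & 0x0F,   ((rate_code & 0x3) << 1) | (hours >> 4),
    ]
    return [bytes([0xF1, (piece << 4) | nib]) for piece, nib in enumerate(nibbles)]

for msg in mtc_quarter_frames(1, 23, 45, 10):
    print(msg.hex(" "))
```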
An oft-overlooked part of MTC is MIDI Cueing, which provides special control facilities for audio/visual and video applications. The MIDI Cueing messages were originally Non-Real Time Universal System Exclusive messages, which suffer from a slightly lower processing priority than Real Time messages, and this has now been solved by duplicating most of the MTC Cueing messages in the Universal Real Time System Exclusive area, but with a different 'sub ID 1' code: 05 instead of 04, and a new name: Real Time MTC Cueing. The ability to control tape decks, video tape recorders, hard disk recorders and many other devices with event lists ought to be immensely useful to anyone who uses cue sheets to match up sound with pictures.
The MIDI Sample Dump Standard (SDS) was also technology-driven. Sampling instruments have shown perhaps the most rapid development since the introduction of MIDI, and the MIDI SDS allows sample data to be exchanged between sampling instruments by providing both a common format and a protocol which allows the large amounts of data involved to be reliably conveyed. All of the commands use Universal Non-Real Time System Exclusive format messages. The sample data is sent in 120 data byte packets with checksums, with optional handshaking to ensure re-transmission of any packets received with errors. The Dump Header message contains additional information about the packetised sample: a 14-bit Sample Number; Resolution; Sample Time; Sample Size; and Loop point information.
The header can be used to alert the receiving instrument that a dump is about to happen, and it is followed by a time delay so that the receiver can decide if it can accept the sample. Further information about extra loop points is available by using Additional Loop Point messages, whilst the identity of the transmitting device can be obtained by using the Inquiry message. A Dump Request message can be used to remotely initiate a sample dump from a sampling instrument. There are four Handshaking messages: ACK (acknowledge packet received OK); NAK (not acknowledged, last packet received incorrectly); CANCEL (terminate the dump); and WAIT (pause and do not transmit the next packet until the next message is received).
If handshaking is not used then the transmitting instrument will wait for 20 milliseconds between each packet and then send the next packet. The main problem with SDS dumps is the length of time it takes to transmit samples between two sampling instruments. A rough rule of thumb is to allow a factor of about 30 to 1: a 1-second 16-bit 44.1kHz sample can take about 30 seconds to transfer.
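As a back-of-the-envelope check on that rule of thumb, the sketch below estimates an open-loop (non-handshaking) dump time. The per-packet overhead, 7-bit word packing and 20ms wait are assumptions drawn from the description above, so the figure it prints should be treated as indicative only; real transfers vary with word size and handshaking behaviour.

```python
import math

def sds_transfer_estimate(seconds, sample_rate=44100, bits=16,
                          inter_packet_wait=0.020):
    """Rough estimate of an open-loop Sample Dump transfer time.
    Assumes each sample word is split into ceil(bits/7) 7-bit MIDI bytes,
    packets carry 120 data bytes plus an assumed 7 bytes of header/checksum/
    framing, and MIDI moves about 3125 bytes per second (31250 baud, 10 bits
    per transmitted byte)."""
    bytes_per_word = math.ceil(bits / 7)
    data_bytes = int(seconds * sample_rate) * bytes_per_word
    packets = math.ceil(data_bytes / 120)
    packet_bytes = 120 + 7
    wire_time = packets * packet_bytes / 3125.0
    return wire_time + packets * inter_packet_wait

print(round(sds_transfer_estimate(1.0), 1), "seconds for a 1-second sample")
```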
With an interchange format for samples defined, the only important area missing from the MIDI Specification was a standard way of storing sequencer data files in a way that could be used to exchange MIDI sequencer information between different sequencers running on different computers. The MIDI File is intended to fill the gap. The way that the sequencer information is stored in a MIDI File is similar to the way in which it would appear at the MIDI Out of the sequencer, but the file format is optimised for small size and easy interpretation, instead of being designed as a replacement for the normal data storage for a sequencer program. The V1.0 MIDI File definition document was initially distributed by the MMA in July 1988, but it was not until 1991 that JMSC members began to use it in products, so the use of MIDI Files has until very recently been predominantly from non-Japanese manufacturers.
Standard MIDI Files contain time-stamped MIDI data, organised into blocks of data called chunks. There are two types of chunk: the Header chunk contains information which applies to the whole MIDI File, whilst the Track chunks which follow contain streams of MIDI data, and allow multiple outputs, patterns, sequences and songs to be stored. There are three types of MIDI File, and the type is indicated by the 'format' field of the header chunk. Type 0 files contain one multi-channel track — a sort of snapshot of the output of a simple sequencer. Type 1 files hold one or more simultaneous tracks — a separate track chunk for each of the outputs or tracks of a linear-oriented sequencer. Type 2 files hold patterns which are not time aligned — as used in pattern-based sequencers.
Each musical or MIDI event in a Standard MIDI File is time-stamped with a delta-time (the time since the last event), measured either in fractions of a beat or fractions of a second. Events can be either MIDI Channel messages (using running status) or meta events. Meta events cover additional information like copyright notices in ASCII text, Sequence, Track and Instrument Names, Lyrics, Markers, Tempo and Time Signature, as well as the obligatory End Of Track event. One interesting and useful aspect of MIDI Files is that they define the first track chunk to be a Tempo Map, which means that it can sometimes be easier to check on tempo changes by examining the MIDI File rather than the sequencer display. In common with many other elements of the MIDI Specification, there were several types of 'MIDI File' being used before the final V1.0 document was released, and you should treat any files created before mid-1988 with suspicion.
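A hedged sketch of two of the lowest-level details, the MThd header chunk and the variable-length delta-times, might look like this. The field layout follows the published MIDI File 1.0 format, and the byte string in the example is constructed on the spot purely for illustration.

```python
import struct

def parse_smf_header(data):
    """Parse the MThd chunk at the start of a Standard MIDI File: a 4-byte
    chunk type, a 4-byte length (always 6), then format (0/1/2), the number
    of track chunks, and the timing division (ticks per quarter note when the
    top bit is clear)."""
    chunk_type, length = data[:4], struct.unpack(">I", data[4:8])[0]
    assert chunk_type == b"MThd" and length == 6
    fmt, ntracks, division = struct.unpack(">HHH", data[8:14])
    return fmt, ntracks, division

def read_delta_time(data, offset):
    """Decode one variable-length delta-time: 7 bits per byte, with the high
    bit set on every byte except the last."""
    value = 0
    while True:
        byte = data[offset]; offset += 1
        value = (value << 7) | (byte & 0x7F)
        if not byte & 0x80:
            return value, offset

# A minimal format-0 header: one track, 96 ticks per quarter note.
print(parse_smf_header(b"MThd" + struct.pack(">IHHH", 6, 0, 1, 96)))
```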
After the release of the Standard MIDI File document, things were relatively quiet for a couple of years. But in mid-1991, two new additions were made to the MIDI Specification. These were Bank Select and Effects Control.
Bank Select was introduced to allow the selection of more patches than were permitted by the original MIDI Program Change message, which can only select from 128 values. Whilst this probably seemed generous in 1983, by the end of the '80s it was beginning to look rather inadequate. Some very unusual methods of selecting banks of sounds were introduced by some manufacturers, but what was really needed was an extension to the basic message. Yamaha's bank changing method as used in the SY77 is a good example of one of the alternative ways to select banks — a Program Change message is used to select a patch from A1 to D16 (values 0 to 63), and this is immediately followed by a second Program Change message (values 117 to 127) which is used to select either a bank or a Multi. This use of two messages as a pair is actually similar to the real Bank Select message.
The Bank Select message uses two previously undefined MIDI Controller Numbers: 0 and 32 ($20). These are used as a 14-bit selector which can access 16,384 different banks. The MSB (Controller Number 0) and LSB (Controller Number 32) Control Messages must both be sent, and then followed by a Program Change message ($Cn) to select the actual voice within the bank. A typical Bank Select message would thus be of the form:
$Bn,$00,$00,$Bn,$20,$0b,$Cn,$0a
This message selects patch number 'a' in bank 'b' on Channel 'n'.
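Put into code, the complete bank-and-patch selection described above might be built like this (a minimal sketch; the channel, bank and program values in the example are arbitrary):

```python
def bank_select_and_program(channel, bank, program):
    """Build the three-message Bank Select sequence: CC 0 (Bank Select MSB),
    CC 32/$20 (Bank Select LSB), then a Program Change for the voice itself."""
    msb, lsb = (bank >> 7) & 0x7F, bank & 0x7F
    return bytes([
        0xB0 | channel, 0x00, msb,       # $Bn $00 <bank MSB>
        0xB0 | channel, 0x20, lsb,       # $Bn $20 <bank LSB>
        0xC0 | channel, program & 0x7F,  # $Cn <program>
    ])

# Select patch 5 in bank 2 on MIDI channel 1 (channel index 0).
print(bank_select_and_program(0, 2, 5).hex(" "))
```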
Effects Control was the second of the 1991 additions. MIDI Controller Numbers 91 to 95 were originally defined as Depth controls for various named effects like Chorus and Phasing. These have now been redefined as Effects Depth 1 to 5, and they are intended to act as the controls for the depth parameter of an effect used in an instrument. This fits in very nicely with the way that many music workstations allow you to control the on-board effects sections with MIDI controllers, and should make effects control using external sequencers much easier. Additionally, Controller Numbers 12 and 13 have been defined as Effect Controls 1 and 2, and should be used in conjunction with the Effects Depth Controllers (91 to 95) to control a related parameter. For example, Reverb Depth and Reverb Time could be assigned to Controller Numbers 91 and 12 respectively.
The last quarter of 1991 and the start of 1992 have seen the introduction of more significant new features within MIDI. One of these takes the idea of the universal Effects controls (Controller Numbers 91 to 95, 12 and 13) and applies it to produce defined real-time controllers for the sort of 'Quick Edit' parameters that have started to appear on synthesizers which are too complex for most users to learn how to program. To save valuable Controller Numbers, some of the parameters are defined differently for instruments and multi-effects units. Controller Numbers 70 to 79 are used for Sound Controllers 1 to 10.
For Instruments:
| Controller Number | Parameter |
|---|---|
| 70 | Sound Variation |
| 71 | Harmonic Content |
| 72 | Release Time |
| 73 | Attack Time |
| 74 | Brightness |
| 75 to 79 | Undefined |
For Multi-effects units:
| Controller Number | Parameter (All On/Off) |
|---|---|
| 70 | Exciter |
| 71 | Compressor |
| 72 | Distortion |
| 73 | Equaliser |
| 74 | Expander/Noise Gate |
| 75 | Reverb |
| 76 | Delay |
| 77 | Pitch Transpose |
| 78 | Flange/Chorus |
| 79 | Special FX |
One of the strangest aspects of MIDI Files has been that there has been no 'authorised' way to send MIDI Files over MIDI cables — the interchange medium has usually been a 3.5" floppy disk. This has been solved with the addition of a new Universal Non-Real Time System Exclusive format: the MIDI File Dump protocol. This is similar to the Sample Dump Standard, but can be used to transfer any type of data file — even a Standard MIDI File — from one computer/device to another. Using a 'sub ID 1' of 07, the File Dump protocol has the same header and packet structure, handshaking and dump request format as the SDS, but with slightly different organisation, and a few extras like file type and name fields. The 8-bit bytes normally encountered in computer data are converted into blocks of eight MIDI-compatible '0xxx xxxx' bytes where eight 'compatible' bytes are used to transmit seven real bytes.
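A hedged sketch of that eight-for-seven conversion is shown below. The exact bit ordering (the top bits of the following seven data bytes gathered, most significant first, into the leading byte) is an assumption based on the common MIDI packing scheme rather than a quotation from the File Dump document.

```python
def pack_7bit(data):
    """Convert 8-bit data into MIDI-compatible '0xxxxxxx' bytes: each group of
    up to seven data bytes is preceded by one byte collecting their top bits,
    so eight transmitted bytes carry seven real bytes."""
    out = bytearray()
    for i in range(0, len(data), 7):
        group = data[i:i + 7]
        msb_byte = 0
        for j, b in enumerate(group):
            msb_byte |= ((b >> 7) & 1) << (6 - j)   # assumed MSB-first ordering
        out.append(msb_byte)
        out.extend(b & 0x7F for b in group)
    return bytes(out)

print(pack_7bit(bytes([0xFF, 0x00, 0x80])).hex(" "))
```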
Transferring MIDI files via MIDI itself, rather than via disks, avoids problems like trying to persuade Apple File Exchange to read an Atari TOS 1.09 disk, and then using ResEdit or DeskZap to change the MIDI files' creators to 'Midi' (sic). It also means that MIDI Files can now be sent via modem, which opens up much more practical musical E-mail possibilities.
Another set of new System Exclusive Real-Time Universal messages are concerned with Notation Information, and use a 'sub ID 1' of 03. Musical information like time signatures and bar markers are covered, and the data format of the Time Signature messages is the same as the MIDI File Time Signature meta event, but with extra bytes to cope with compound time signatures. The Bar Marker message is particularly interesting when you consider that the original proposed use for the currently undefined $F9 System Real-Time byte was as an End of Measure marker, but that it never made it into the MIDI 1.0 Specification.
Yet another System Exclusive Real-Time Universal message is the Single Note Retuning Change message, intended as a 'power-user' performance control — it allows you to retune individual notes in real time. Some people might have preferred a normal MIDI Controller Number for this use, but I suspect that if retuning becomes popular we may see it duplicated as one, in much the same way as MTC Cueing made the transition from Non-Real Time to Real Time. A corresponding pair of Non-Real Time Universal System Exclusive messages allow micro-tuning data to be exchanged between different instruments by using the MIDI Tuning Standard.
"One of the strangest aspects of MIDI Files has been that there has been no 'authorised' way to send MIDI Files over MIDI cables — the interchange medium has usually been a 3.5" floppy disk."
A Bulk Tuning Dump Request will be responded to by a Bulk Tuning Dump which contains tuning data for all 128 MIDI notes, allocated to one of 128 Tuning Program Numbers. A new Registered Parameter Number (03) has been defined to allow easy switching between these Tuning Programs. The frequency parameters are set using three 'MIDI-compatible' byte fields, with 8.1758 Hertz as the minimum frequency (represented by $00 $00 $00) and 13,289.73 Hertz as the maximum (represented by $7F $7F $7E), giving a resolution of 0.0061 cents.
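As a sketch of how a frequency maps onto those three bytes, the function below assumes the layout implied by the figures quoted above: a semitone number relative to 8.1758Hz plus a 14-bit fraction of a semitone, which reproduces the stated minimum and maximum values.

```python
import math

def frequency_to_tuning_bytes(freq_hz):
    """Convert a frequency to an assumed three-byte MIDI Tuning Standard field:
    an equal-tempered semitone number (8.1758 Hz = semitone 0) plus a 14-bit
    fraction of a semitone, i.e. steps of roughly 0.0061 cents."""
    semitones = 12.0 * math.log2(freq_hz / 8.1758)
    note = int(semitones)
    fraction = int(round((semitones - note) * 16384))
    if fraction == 16384:            # handle rounding up into the next semitone
        note, fraction = note + 1, 0
    return bytes([note & 0x7F, (fraction >> 7) & 0x7F, fraction & 0x7F])

print(frequency_to_tuning_bytes(440.0).hex(" "))  # concert A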
January 1992 saw the release of the MIDI Machine Control (MMC) and MIDI Show Control (MSC) protocols. MMC allows control over tape recorder type devices via MIDI, and is based around the Audio Tape Recorder part of the ESbus standard. MMC is also quite similar to the proprietary system used by Fostex.
MIDI Show Control is the long-awaited MIDI lighting control system, and is again based on the commands used in currently existing computer-controlled lighting, sound and show control systems. MSC should have wide application in the rapidly expanding area of multi-media, and will also find uses in the live performance, audio-visual and theatrical environments.
General MIDI (GM) was ratified by the MMA and JMSC early in 1992. It is intended to sort out the problems of assignment of sounds to program numbers and drum sounds to note numbers — exactly the sort of things which make using someone else's MIDI Files a nightmare. General MIDI should provide a basic compatibility for consumer-level MIDI applications, which will mean that a General MIDI Score should play back in almost exactly the same way on any GM playback device: a synthesizer; sound module; computer sound card; or even a computer MIDI File player.
GM Level 1 defines a MIDI instrument with requirements such as a minimum of 24 dynamically allocated voices, response over all 16 MIDI channels, percussion on Channel 10, 16-part multi-timbrality, and a minimum of 128 presets/programs/patches with specific defined sounds.
A new Universal Real Time System Exclusive message has been defined for use with General MIDI systems. Two Device Control 'sub ID 2's have been set: 01 for Master Volume, and 02 for Master Balance. The names suggest that these messages will act like the volume and balance controls on a hi-fi amplifier, but actually this will only apply when the message is a 'broadcast' version with a target device of $7F — for messages addressed to a specific target device (anything other than $7F) these Device Control messages will behave like the individual channel fader and pan controls on a mixing desk.
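A minimal sketch of the broadcast Master Volume message might look like this. The Device Control 'sub ID 1' value of 04 is an assumption from the standard Universal System Exclusive tables, since the article only quotes the 'sub ID 2' values.

```python
def master_volume_sysex(volume_14bit, target_device=0x7F):
    """Build a Universal Real Time Device Control message for Master Volume:
    F0 7F <target device> 04 01 <LSB> <MSB> F7. Using $7F as the target makes
    it a 'broadcast' message, as described above. (Sub-ID 1 = 04 is assumed.)"""
    lsb, msb = volume_14bit & 0x7F, (volume_14bit >> 7) & 0x7F
    return bytes([0xF0, 0x7F, target_device, 0x04, 0x01, lsb, msb, 0xF7])

print(master_volume_sysex(0x3FFF).hex(" "))  # full volume, broadcast to all devices
```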
CD+MIDI is another of the multi-media extras which are all suddenly arriving after a long and delay-prone gestation. CDTV, CDI and CD-Graphics are just some of the new acronyms which will appear on CD players, offering TV pictures, interactive TV, and Teletext-style information in addition to the audio you might expect from the ubiquitous disc of plastic. CD+MIDI adds the faintly curious advantage of providing MIDI output simultaneous with the audio output. This seems to offer anyone with a MIDI scoring/notation program virtually instant access to copyright problems, as well as the opportunity to play via your MIDI instruments what you can already hear from your hi-fi system (which, just to really confuse matters, might be described by its manufacturer as a 'Midi System'). Despite this, I am sure that CD+MIDI will sound much more desirable once the advertising people get their hands on it.
I thought I knew what the letters SMDI stood for when I reviewed the Peavey SP Sample Player, but it appears that SCSI MIDI may actually turn out to be SCSI MIDI Device Interface, or even something else like that but slightly different. Regardless of what it is called, it is still a way of using SCSI to carry MIDI-type messages — so you can achieve wonders like moving MIDI SDS sample dumps around very quickly instead of very slowly. With cheap SCSI hard disk drives readily available for the Macintosh, we may see SMDI becoming an essential part of a serious sample user's acronyms.
The provision of 16 MIDI channels must have seemed generous 10 years ago, but hi-tech music has moved on, and even the 32 channels offered by using two serial ports on a computer can become limiting given the 32-note (or more) polyphony and 16-part multi-timbrality of many devices. One way forward has begun to appear for the Apple Macintosh: multi-port MIDI interfaces. Instead of providing several Thru sockets, these provide eight or 16 independent MIDI outputs, which opens the way to utilising 128 (8 x 16) or 256 channels. The processing power needed to service such devices fully is expensive at the moment, but the trend in computers has always been for more power at less cost, and this should continue in the future.
Multi-port interfaces, combined with high polyphony and multi-timbrality, mean that the Mono Mode (Mode 4) may replace Mode 3 and its 'Multi' extensions as the MIDI mode for the future. Mono Mode offers an escape from the performance limitations of most current polyphonic synthesizers, and opens the way to controlling individual notes rather than blocks of notes. Just as the violin section in an orchestra is made up of many individual violinists, Mode 4 would enable MIDI 'Players' to be used, with all the inherent advantages of playing techniques like precise pitch bend control, player specific dynamics control, true glissandos, real portamento and proper legato playing.
MIDI Local Area Networks (LANs) offer another approach to distributing MIDI over more than 16 channels, in a way which echoes developments in computing. LANs are actually similar to multi-port MIDI interfaces, except that the connection between the computer and the MIDI sockets is distributed around the LAN and handled in software, instead of living in a single hardware box. The advantages of electrical isolation mean that optical fibre should become the preferred carrier rather than the copper cable of current MIDI networks, although developments in wireless LANs using radio communication may offer an intriguing alternative.
The All Sounds Off and MIDI Cueing messages are just two examples that illustrate an interesting trend in the MIDI specification — they effectively correct things that were inadequately specified (and subsequently misinterpreted) by adding a new command rather than correcting the original problem. Unfortunately, I can see that human nature will always conspire to interpret things wrongly. I predict that a manufacturer will confuse the All Sounds Off message with the ANO message, and I await the first instrument that transmits an ASO message whenever you lift your hands from the keyboard. Presumably a new message will be introduced to correct this too!
MIDI's entire history is one of continuous change. The only way to keep up with a moving target like that is to get on board and hang on tight — and the best way to do this is to support the European MIDI Association (EMA), or more specifically, the UK branch of it: the UKMA. Perhaps then we can start to balance the marked lack of European influence over what happens to MIDI. Reading Sound On Sound regularly is essential as well, but then you already know that if you are reading these words.
Further information
UKMA and EMA, (Contact Details)
E-Mail: MIDIHELP @ CIX, UKMA @ PAN
Feature by Martin Russ