Real Time MIDI
Article from Music Technology, May 1989
MIDI is more than a means of sending note data to a couple of synths and changing settings on a reverb unit. Ernie Tello looks into the real-time applications of MIDI.
Although often accused of removing the expression from music, MIDI can actually be used to make your music more expressive.
CERTAIN PEOPLE ARE still having trouble coming to terms with just how far MIDI has progressed - others can't wait for it to progress much further. Not long ago, I suggested a MIDI application to a developer and received the reply "MIDI wasn't designed for that!". No doubt MIDI will always mean different things to different people...
Those potentially affected the most by MIDI control are probably composers and drummers/percussionists. As Bill Bruford is busy proving, drummers today can participate in the melodic and harmonic aspects of music as other players always have. And the composer's job need not end when all the musical parts are written and assigned to instruments.
Real-time MIDI control actually covers a number of different, but related, areas. The basic concept is that of making changes to various aspects of the musical performance while the music is being played live or while a sequence is running. Ideally, manual real-time control would be an extension of how you play your chosen instrument, but this takes us into the area of personal preferences, as performance parameters such as after-touch are not attractive to all players. There is, however, already an impressive number of things you can do just with the fingers you're using to play the music. And the fact that both the notes and their articulation are translated into data that can control a studio full of devices has awesome connotations. On the other hand, there are limitations to MIDI, some well known, others not so well known.
One limitation you may have encountered is that volume is determined by note velocity on many, if not most, instruments. We've come to take this for granted, but it can present a problem in performance where you want to play fast without getting louder as you do so. Realistically, the only way to achieve this is to use a sequencer. In the future, all keyboard controllers will probably replace or complement velocity with real pressure sensitivity to determine loudness.
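To see what decoupling loudness from velocity actually involves, here is a minimal sketch in Python - a modern illustration of the idea, not anything from the hardware of the day. Incoming note-ons have their velocity clamped to a fixed value, and loudness is set separately with controller 7 (channel volume), driven by whatever source you prefer; the fixed value and the loudness source are assumptions made for the example.

```python
FIXED_VELOCITY = 96

def decouple_velocity(msg, loudness):
    """msg: a 3-byte MIDI channel message; loudness: 0-127 from some other
    source (a pedal, pressure or a sequencer track - assumed for illustration)."""
    status, data1, data2 = msg
    if status & 0xF0 == 0x90 and data2 > 0:        # note-on with non-zero velocity
        channel = status & 0x0F
        volume = [0xB0 | channel, 7, loudness]      # controller 7 = channel volume
        note_on = [status, data1, FIXED_VELOCITY]   # same note, fixed velocity
        return [volume, note_on]
    return [msg]

# A hard-hit middle C (velocity 120) still comes out at the chosen loudness.
print(decouple_velocity([0x90, 60, 120], loudness=80))
```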
DEVELOPMENTS IN THE area of digital sound processing units have led to increased sophistication in the MIDI implementation of many digital reverb and effects units. Devices such as the ART MultiVerb II, Korg DRV2000, Lexicon LXP1, Yamaha SPX1000 and Alesis QuadraVerb offer real-time control to pro and budget MIDI studios alike.
The MultiVerb II has the usual complement of signal processing functions as well as a selection of dynamic MIDI functions (similar to those introduced by ART on their DR1 digital reverb). The DRV2000 allows up to two parameters of any sound processing program to be modulated in real time. When the LXP1 is used with Lexicon's MRC (MIDI Remote Controller) unit, up to eight parameters can be remotely controlled in real time. The Yamaha SPX1000 offers similar features to those of the DRV2000, but can also make complete data dumps via MIDI SysEx, so that program setups can be saved to disk on other MIDI devices. The Alesis QuadraVerb allows up to eight effects parameters to be controlled by practically any MIDI messages you like. Hopefully you begin to get the picture...
The real-time MIDI control of sound processing units like these is of two basic types: control that allows outside controllers to manually change parameters at will, and control that allows external sequencers or the music itself to change them.
Making the pedals or sliders on a DX7II control reverberation parameters on a unit (Korg's DRV2000, for example) is pretty straightforward. But, far more importantly, we're beginning to see effects devices that are sensitive to the notes that they see in the MIDI data stream as they're being played. Now that's something to wave a flag about, because it means that an important new type of MIDI device with sensitivity to musical context has quietly been making its appearance. It's worth taking a look at the MIDI implementation of a device like the Korg DRV2000 that operates in this way.
MULTI-MODULATION IS what Korg call it. Other manufacturers have different names, but what we're talking about is placing certain parameters of the sound processing programs under the control of a variety of MIDI sources. This can take place in real time as you or your sequencer play. On the DRV2000, external footpedals, keyboard sliders or aftertouch, MIDI note information, and even audio input level can be used to make these changes. There are basically two different types of MIDI control - on/off and continuous - and these correspond to the two basic types of MIDI controllers.
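As a rough illustration of how this sort of multi-modulation behaves, here's a minimal sketch with invented values - the general idea rather than Korg's actual algorithm. One MIDI source, scaled by a signed "sense" amount, offsets one effect parameter, and the same function covers both continuous controllers and on/off switches.

```python
def modulate(base, sense, source_value, lo, hi):
    """base: programmed parameter value; sense: -1.0 to +1.0 scaling;
    source_value: 0-127 from a pedal, aftertouch, note or audio level."""
    value = base + sense * (source_value / 127.0) * (hi - lo)
    return round(max(lo, min(hi, value)), 2)

# Continuous control: aftertouch (0-127) pushing reverb time up from 1.2 seconds.
print(modulate(base=1.2, sense=0.5, source_value=100, lo=0.1, hi=10.0))
# On/off control: a footswitch sends 0 or 127, giving two discrete settings.
print(modulate(base=1.2, sense=0.5, source_value=0, lo=0.1, hi=10.0))
```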
"The concept of Real-time MIDI is that of making changes to various aspects of the musical performance while the music is being played."
On the whole, the way in which the DRV2000 can be made sensitive to MIDI note and velocity information is rather rudimentary. You set the "Sense" scaling number in the same way as you would any other controller. However, the control of effects parameters by MIDI note data is such an important operation that a far more sophisticated implementation is highly desirable. The fact that a MIDI keyboard is programmable, and allows many different sounds to be assigned to different keys, is reason enough to provide a fine degree of control over how such note information influences effects parameters. Even when only one instrument is assigned to the whole keyboard, there are occasions when you might want to simulate special reverberation environments such as a piano sound board, the resonator of a guitar or violin, or even a human voice. Such acoustic instruments are sensitive to musical contexts in ways that synthesisers and effects boxes are not (so far).
Also, it ought to be possible to apply more than one controller to the same effects parameter. In that way you could use a footswitch to emulate a piano's damper pedal and, at the same time, have the reverb or echo be sensitive to what notes are played.
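To make the kind of implementation being argued for here more concrete, the following sketch is purely hypothetical - the key zones, settings and damper behaviour are invented, not anything the DRV2000 offers. It maps key ranges to different reverberation "environments" and lets a footswitch damper lengthen the decay at the same time, so two controllers act on the same parameter.

```python
# Key zones mapped to invented reverb settings; a damper footswitch stretches
# the decay on top of whatever zone the played note selects.
ZONES = [
    (0,   47, {"environment": "piano soundboard", "decay": 2.5}),
    (48,  71, {"environment": "wooden body",      "decay": 1.2}),
    (72, 127, {"environment": "vocal resonance",  "decay": 0.8}),
]

def settings_for(note, damper_down):
    for low, high, settings in ZONES:
        if low <= note <= high:
            decay = settings["decay"] * (1.8 if damper_down else 1.0)
            return settings["environment"], round(decay, 2)
    return None

print(settings_for(60, damper_down=False))   # middle C: ('wooden body', 1.2)
print(settings_for(60, damper_down=True))    # same note with the damper held
```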
ONE TYPE OF control that is seldom used involves specifying when certain things will not occur. This has some fairly important uses. There are some effects that just don't sound good with a particular piece of music because of the way they react to a few notes in certain parts. At other times, the whole piece may be too up-tempo for the effects to react properly. In such cases it is useful to turn the effect off or diminish it for certain sections of the piece. Footpedal control is one standard way of coping with this, but there are times when you really want the control to be automatic. There are even times when you want an effect parameter to increase considerably, but only after the music has actually stopped. This is very easy to do with a device like the DRV2000, since it can use the negative sense of the input level from its audio jack as a controller. The result can be a very convenient way of making specific effects fade at the end of a song.
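A rough sketch of the arithmetic involved - the numbers are invented, and this is the general idea rather than the DRV2000's exact scaling. With a negative sense value applied to the audio input level, the wet mix stays low while the music plays and swells as the level falls away.

```python
def wet_mix(input_level, base=1.0, sense=-0.8):
    """input_level: 0.0 (silence) to 1.0 (full level). A negative sense keeps
    the mix down while the music plays and lets it bloom once the level drops."""
    return round(max(0.0, min(1.0, base + sense * input_level)), 2)

print(wet_mix(1.0))   # 0.2 while the band is playing
print(wet_mix(0.0))   # 1.0 once the music stops - the effect tail takes over
```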
PROGRAM CHANGE IS a form of automatic control that is more than just a convenient time-saving feature. It tends to make the effects program a more or less permanent part of your sound. Of course, synths like the Korg M1 and Roland D50 provide a built-in solution to this problem. However, when outside effects are used, program changes give the result of combining various programs on different devices into one sound. The act of setting up an effects program for each patch or sample you use forces you to decide which of the existing programs is the best, and sometimes may even lead you to create a new program. In any case, the result is the same: the sound improves dramatically.
For this type of automatic program change control to work, your main MIDI instrument must transmit program change data. It is certainly desirable for program changes to be "sendable" while the music is playing. However, on many instruments this capability is designed for use with only one synthesiser at a time. The Yamaha SPX90 has a provision for use with more instruments. It has four different banks for program change assignments, each of which can be set to a different MIDI channel. If you're alternating lead voices between different synthesisers and sharing the same effects device, the appropriate effect for each lead can be easily selected.
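Here's a hypothetical sketch of that per-channel arrangement - the table contents are invented and this isn't the SPX90's internal format. Incoming program changes on different MIDI channels are looked up in separate tables, so each lead synth can call up its own effect on the shared processor.

```python
PROGRAM_TABLES = {
    0: {0: 11, 1: 12, 2: 30},    # channel 1: lead synth A -> plate, hall, chorus
    1: {0: 5,  1: 5,  2: 19},    # channel 2: lead synth B -> its own choices
}

def effect_program_for(status, program):
    if status & 0xF0 != 0xC0:                 # only act on program change messages
        return None
    channel = status & 0x0F
    return PROGRAM_TABLES.get(channel, {}).get(program)

print(effect_program_for(0xC0, 1))   # patch 2 on channel 1 selects effect 12
```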
Sending program change messages to effects processors is typically used to force programs in the processor to mirror patch changes on a synth or drum machine. If you're careful, you can use it in powerful ways during an arrangement as well. This generally works well in sparse arrangements where there is enough open space to cover the changes. With some units, there is a brief interruption of the output before the new effect takes over when a program is changed. In arrangements where there's a lot going on, a lot of instrument and effects switching in the background is possible. But if you intend to make changes in the treatment of upfront sounds, it is important to select equipment whose program changes are instantaneous and free from any extraneous noises or interruptions of the audio program.
Embedding program change numbers in a sequence sometimes only allows you to change voices, not performances. Other instruments only change their performance setup in response to a program change message. However, if you make the change you want manually on such an instrument while the sequencer is recording, you can usually get the sequencer to perform the change when the sequence is played back. The disadvantage of this method is that your manual change cannot be timed as precisely as an embedded program change number.
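If your sequencer lets you edit events directly, the timing problem disappears. The sketch below assumes an event format of tick plus raw MIDI bytes - an illustration, not any particular sequencer's file format - and drops a program change at an exact position in the track.

```python
import bisect

def insert_program_change(sequence, tick, channel, program):
    """sequence: list of (tick, [MIDI bytes]) tuples kept sorted by tick."""
    event = (tick, [0xC0 | channel, program])
    bisect.insort(sequence, event)
    return sequence

song = [(0, [0x90, 60, 100]), (480, [0x80, 60, 0])]
print(insert_program_change(song, 480, channel=0, program=7))
```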
"Real-time MIDI control can make the process of projecting music to an audience subject to minute control specified by the composer."
Some synths, like the Yamaha DX7II, allow you to specify what the instrument will transmit when a program change is invoked. This is useful if you are stacking voices and the slaved synth, like the Roland D50, does not have a program change table. In that case, you program the DX7II to transmit the data that the slave needs to receive for each program change. The DX7II also has the ability to transmit a program change number selected manually even when no changes are called for in the instrument itself. This is handy when the instrument is being used as a master controller and you want it to keep its current program, but change the program of a slave synth.
With a device like the MIDI Mitigator RFC1 (from American company Lake Butler Sound), stored MIDI messages can be sent by pressing various footpedal combinations. The stored messages can be up to 255 bytes in length and include anything in the MIDI specification. One obvious use for the RFC1 is to call up all the setups for a live show just by pressing a footswitch. However, there are many other uses for such a versatile MIDI foot controller.
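The idea behind such a controller can be sketched out quite simply - the stored setups here are invented examples, not the RFC1's actual memory layout. Each footswitch recalls a stored string of MIDI bytes and transmits it verbatim.

```python
FOOTSWITCH_BANKS = {
    1: bytes([0xC0, 12, 0xB0, 7, 100]),    # song 1: program 13 plus volume 100
    2: bytes([0xC1, 40, 0xC9, 3]),         # song 2: synth and drum machine programs
}

def press(switch, send):
    data = FOOTSWITCH_BANKS.get(switch, b"")
    if len(data) <= 255:                    # the 255-byte limit mentioned above
        send(data)

press(1, send=lambda data: print(data.hex(" ")))
```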
IF YOU HAVE a MIDI processor such as the Forte Mentor, Axxess Mapper or Yamaha MEP4, really advanced (and often esoteric) types of real-time MIDI processing are available to you. One important development in MIDI processing is the appearance of "send an example" programming on devices like the Mapper. This means that you don't have to be familiar with the intricacies of hexadecimal MIDI SysEx codes to get the device to send them. By having the synth send its SysEx messages while you are programming it, the MIDI processor can be taught the messages that are involved by example, even if you have no idea what the message is. A general technique is at work here that has wide areas of application for simplifying MIDI programming.
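As an illustration of the learning-by-example technique - the class and message below are invented, not the Mapper's actual implementation - the processor simply records everything between the SysEx start and end bytes while you work the synth, then replays it on demand.

```python
class SysExLearner:
    def __init__(self):
        self.learned = {}

    def learn(self, name, incoming_bytes):
        """Capture everything between the SysEx start (0xF0) and end (0xF7)."""
        start = incoming_bytes.index(0xF0)
        end = incoming_bytes.index(0xF7, start) + 1
        self.learned[name] = bytes(incoming_bytes[start:end])

    def replay(self, name, send):
        send(self.learned[name])

learner = SysExLearner()
learner.learn("bright piano", [0xF0, 0x43, 0x10, 0x04, 0x07, 0x63, 0xF7])
learner.replay("bright piano", send=lambda data: print(data.hex(" ")))
```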
The MRC controller (from Lexicon) is an interesting device that taps some of the potential for real-time MIDI control. In addition to being a controller for the LXP1 and PCM70 effects processors, the MRC has the ability to act as a patch editor and controller for six-operator FM synths. This brings the old analogue hands-on style of programming to the world of FM synthesis. In addition, the MRC's four soft buttons and four soft sliders can be assigned to send any MIDI controller message.
Since there are two sets of MIDI Ins and Outs on the MRC, one set can be connected to a Lexicon effects processor and the other set to a synthesiser or drum machine. This setup works particularly well when you want to take advantage of the FM control or MIDI controller features.
With a computer, the opportunities for real-time MIDI processing are theoretically enormous, but in practice are limited by the specific software and hardware you use. The Yamaha C1 computer, with its eight MIDI Outs, offers great opportunities here, provided that the software makes full use of its built-in features.
Another computer that has special facilities for real-time control is the Apple Macintosh. With the new MIDI Manager from Apple, more than one MIDI program can be running simultaneously under MultiFinder. The output from each one can be merged into a single stream that is sent to the MIDI Out port. The MIDI Manager also supports other mapping functions such as splitting and routing MIDI data among different applications. A graphic interface called the Patchbay will allow users to take advantage of these capabilities, but these tools must be specifically supported by the programs you use. Most of the Mac developers are enthusiastic about this new capability and intend to provide support for it in upcoming releases of their programs.
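The core of that merging job can be sketched in a few lines - a conceptual illustration, not Apple's MIDI Manager API. Each application's output is treated as a timestamped stream, and the streams are interleaved by time into one stream bound for the MIDI Out port.

```python
import heapq

def merge_streams(*streams):
    """Each stream is a list of (timestamp, midi_bytes) tuples, already sorted."""
    return list(heapq.merge(*streams, key=lambda event: event[0]))

sequencer = [(0, [0x90, 60, 90]), (960, [0x80, 60, 0])]
patch_librarian = [(480, [0xC0, 5])]
print(merge_streams(sequencer, patch_librarian))
```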
Real-time MIDI control can make the process of projecting music to an audience subject to minute control specified by the composer. Some compositions in recent years have taken specific advantage of this. The most publicised example of this technique is probably the composition Repons by Pierre Boulez of IRCAM. However, groups like Pink Floyd have been working in this area for many years. What has changed since their initial attempts is the connection of complex sound projection systems to sequencers and other, more sophisticated controllers.
Traditionally, musicians have used two hands, two feet and their breath to perform music (with occasional recourse to the knees and elbows in the cases of pedal steel guitars and bagpipes). So far, MIDI has only changed the way you can express your music with your body, but eventually it will do much more. It also will extend the distances over which music can be made. I haven't heard of anybody having a MIDI jam by telephone yet, but that's probably on the cards. The full potential of transmitting and manipulating music as digital data streams is still evolving.
Feature by Ernie Tello