Protocol (Part 4)
Paul Overaa continues his series on the MIDI standard with a look at channel messages
Part four of Paul Overaa's series deals with individual voice and note data, or MIDI channel messages as they're more commonly known
Channel messages come in two varieties: Mode messages, which are used for selecting one of the four basic MIDI modes, and Voice messages. As you might expect, voice messages are concerned with sounds... turning notes on and off, voice selection and, in general, controlling the sound-producing circuitry.
For a start, let's see exactly what happens when you press a key on a synthesizer. An audio signal appears at the audio output but it's what happens at the MIDI terminal that we're interested in. You probably know already that a MIDI Note-On message gets transmitted, but let's have a look in detail at what this contains.
Four pieces of information are delivered, and the first two pieces that come down the line are held in the Note-On status byte. This identifies both the message type and the "channel number". MIDI recognizes the existence of 16 separate channels and it's the lower four bits of the status byte which store the necessary channel identification details. If, as an example, we have a look at the Note-On status byte you'll appreciate the overall idea for all channel messages.
In binary form a "Note-On" status byte looks like this: 1001nnnn (the upper four bits, 1001, identify the message type; the lower four bits hold the channel number).
MIDI Channel numbers range from 1 to 16 but the numbers which get stored in the status byte are one less than this, i.e. they range from 0 to 15 with channel 1 being represented by the number 0 etc. Because the lower four bits of a channel status byte can vary it's common to use "n" to represent a variable part when discussing status bytes, i.e. "nnnn" is taken to mean four bits which could vary from 0000 to 1111 binary, i.e. from 0 to 15 decimal.
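To make the "nnnn" idea concrete, here's a minimal modern sketch (not from the original article; the names are my own) showing how the channel number, stored as one less than the human channel number, slots into the lower four bits of a status byte:

```python
# Sketch only: building a channel status byte with bit operations.
NOTE_ON = 0x90   # 1001 0000 binary - the Note-On "type" nibble

def status_byte(message_type, channel):
    """'channel' is the human MIDI channel number (1-16);
    the byte stores channel - 1 in its lower four bits."""
    return message_type | (channel - 1)

print(bin(status_byte(NOTE_ON, 1)))    # channel 1  -> 0b10010000
print(bin(status_byte(NOTE_ON, 16)))   # channel 16 -> 0b10011111
```

So "nnnn" really does run from 0000 to 1111 binary, covering all 16 channels in a single byte.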
Following the status byte comes a key number. Middle C is assigned the value 60 and this changes by plus or minus 1 for every semitone above or below Middle C. On the face of it there's no provision in the MIDI spec for non-semitone based scale arrangements but, because there's no direct link between note numbers and note frequency, there's some experimental leeway here for scale redefinition and no practical reason why the conventional interpretation of the MIDI pitch/note number relationship shouldn't be completely re-mapped if you felt it would serve a useful purpose.
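The conventional interpretation, for what it's worth, is the equal-tempered mapping sketched below (an illustrative addition, not part of the MIDI spec itself, which defines no note-to-frequency link). Key 69, the A above Middle C, is taken as 440 Hz:

```python
# Conventional equal-tempered key-number-to-frequency mapping
# (illustrative sketch; the MIDI spec mandates no such link).
def key_to_frequency(key):
    return 440.0 * 2.0 ** ((key - 69) / 12.0)

print(round(key_to_frequency(69), 2))   # 440.0
print(round(key_to_frequency(60), 2))   # Middle C, about 261.63
```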
The third byte of the message is a number which represents how hard the note was struck. This last item is called the "velocity byte" in MIDI-speak and will, for touch sensitive keyboards, range from 1 to 127. Keyboards without touch sensitivity normally transmit a default value of 64, although synths like Roland's alpha Juno can be set up so that they use the EV-5 footpedal position to create velocity data. It's not the ideal way of doing things but it is better than no velocity control at all!
So, when you hit a note on a MIDI keyboard the end result, as far as the MIDI OUT terminal is concerned, is a packet of information which looks like this: a Note-On status byte (1001nnnn binary), followed by a key number byte, followed by a velocity byte - three bytes in all.
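The whole three-byte packet can be sketched like this (again a modern illustration with invented helper names, not from the article):

```python
# Sketch: assembling the complete three-byte Note-On packet.
MIDDLE_C = 60

def note_on(channel, key, velocity):
    # channel 1-16 is stored as 0-15 in the status byte's low nibble
    return bytes([0x90 | (channel - 1), key, velocity])

msg = note_on(1, MIDDLE_C, 64)
print([hex(b) for b in msg])    # ['0x90', '0x3c', '0x40']
```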
The MIDI communications protocol is based on serial transmission, so these MIDI messages aren't sent in one go - they're not even sent as individual bytes. Each byte has to be broken down into the electronic equivalent of the binary form of the number it represents, and this then has to be transmitted as a stream of individual bits. For the benefit of everyone who wants to know roughly what happens, here's a bottom-line description. The software inside the synthesizer senses the keypress and builds a three byte MIDI message. This message is transmitted internally, one byte at a time, to a device inside the synthesizer called a UART (universal asynchronous receiver transmitter), which takes each byte of the message and converts it to a stream of pulses, adding the start and stop bits that the MIDI specification requires for serial transfer. At the end of the day, then, each byte appearing at the MIDI OUT terminal travels as a start bit, eight data bits and a stop bit.
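The UART's framing job can be sketched in a few lines (purely illustrative; the function name is my own):

```python
# Sketch of serial framing: each byte goes out preceded by a
# start bit (0) and followed by a stop bit (1), data bits LSB
# first - ten bits per byte on the MIDI line.
def frame_byte(value):
    data_bits = [(value >> i) & 1 for i in range(8)]   # LSB first
    return [0] + data_bits + [1]                       # start ... stop

# The Note-On status byte for channel 1 (1001 0000 binary):
print(frame_byte(0x90))   # [0, 0, 0, 0, 0, 1, 0, 0, 1, 1]
```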
Equipment reading such a MIDI data stream uses similar UART devices to translate the stream of pulses back into individual 8 bit "bytes" which the internal software then re-groups into the appropriate MIDI messages.
Anyway, getting back to the messages themselves: sending a Note-On message with a velocity of 0 (zero) is one of the ways in which notes can be turned off. Another way is to send a real "Note-Off" message - the message format is similar to the Note-On arrangement but uses a different status byte (1000nnnn binary). There are advantages and disadvantages to both approaches. Real Note-Off messages allow you to specify a "release velocity" (which allows more expressive playing if your keyboards support it). Turning notes on and off using streams of Note-On messages, on the other hand, allows running status to be used, and this lets the internal software eliminate all duplicate status bytes in a stream of identical messages - thus helping to reduce congestion by cutting the amount of traffic going down the MIDI line.

Pressing harder on the keys after you've started to play the notes can be sensed by some keyboards and translated into pressure or "aftertouch" messages. These come in two versions... an overall "channel pressure" and a polyphonic version where individual keys have their own individual aftertouch data. Good as it may appear, polyphonic aftertouch does have some disadvantages... it is expensive to implement and, because of the large amount of data it can generate, it can put MIDI communications under strain.
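Running status is easier to see with a sketch. Here a stream of Note-On messages (with velocity-0 note-offs) on the same channel has its duplicate status bytes stripped (illustrative code; the helper names are invented):

```python
# Sketch of running status: when consecutive channel messages share
# the same status byte, the repeats are dropped from the stream.
def apply_running_status(messages):
    out, last_status = [], None
    for status, *data in messages:
        if status != last_status:   # only send a changed status byte
            out.append(status)
            last_status = status
        out.extend(data)
    return out

# Two Note-Ons plus a velocity-0 note-off keep the status byte
# identical, so only the first status byte is transmitted:
stream = [(0x90, 60, 64), (0x90, 64, 64), (0x90, 60, 0)]
print(apply_running_status(stream))   # nine bytes shrink to seven
```

Using a real Note-Off (status 1000nnnn) for the third message would change the status byte and break the run, which is exactly the trade-off described above.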
Program change messages allow particular voice or synth settings to be selected by remote control. Each message consists of two bytes - a status byte (1100nnnn binary) followed by a data byte representing the program change number. Program change commands are certainly simple to understand but they have caused problems, simply because there's absolutely NO standardization as far as their meaning goes! So a PG 1 sent to one synth may select a bank 1/voice 1 which, for the sake of argument, might be some violin or string sound. The same command sent to another synth could, and almost definitely would, select a sound which is totally unrelated, e.g. the mating call of a Japanese Yak.
This caused a lot of problems in the early MIDI days when program change numbers on particular synths were "fixed". If you wanted to link two synths together you needed to find some way of getting pairs of compatible sounds for each patch. It was a pain... you either ended up using two channels to carry data which was to all intents and purposes identical, or you had to play around copying the sounds you liked on each synthesizer into memory bank positions which were known to correspond to a particular program change command.
It obviously isn't possible to devise a generalized scheme relating particular sounds to particular program change numbers, but the other problem, the fixed relationship between a program change number and a particular voice (and the related difficulties caused by inconsistencies in the way manufacturers lay out their memory banks), has been tackled. It didn't take long for this problem to be solved, and the best approach is that adopted in units like Yamaha's TX81Z expander module.
There is a user definable program change table available which identifies which voice any given program change message will select. When you're buying MIDI gear nowadays look for flexibility in this area because it's fairly important for two reasons. Firstly, if you need to use more than one unit on any single channel you need this flexibility to avoid the problems we mentioned earlier. Secondly, bear in mind that you'll probably add to or change your expanders/synths as time goes on. Once you've built up a library of songs on your sequencer you will not want to go through every song and sequence editing program change commands to suit new units - it's better (and quicker) to alter the program-change/voice correspondence on the new units to fit in with the data you've already got stored.
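The idea behind such a table is simple enough to sketch (the table contents here are invented for illustration, not taken from the TX81Z):

```python
# Sketch of a user-definable program change table: an incoming
# program change number is looked up and mapped to the unit's own
# internal voice number (entries below are invented examples).
program_table = {0: 12, 1: 40, 2: 7}    # incoming PC -> internal voice

def select_voice(program_change):
    # unmapped numbers fall through unchanged
    return program_table.get(program_change, program_change)

print(select_voice(1))   # remapped to internal voice 40
print(select_voice(5))   # not in the table: passed through as 5
```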
MIDI also supports a range of "Controller" messages. They come in three forms and correspond to switch (on/off type) controllers, data controllers, and continuous controllers. Each group has been assigned a particular range of "controller numbers": 0-63 are the continuous controllers, 64-95 are the switches, and 96-101 are the data controllers. Most MIDI equipment neither transmits nor recognizes the whole range of defined messages, and the purposes of many of the controllers have not been clearly defined, although some de facto standards are emerging... the mod wheel is usually controller number 1, portamento time is 5, volume is 7. If in doubt have a look at the MIDI Implementation Charts for your equipment... you'll see which controllers are being used. Perhaps the best general way to solve this problem would be for all manufacturers to allow users to define their own controller/effect correspondences.
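A controller (Control Change) message has the same three-byte shape as a Note-On, with status byte 1011nnnn, then the controller number, then the value. A quick sketch (illustrative, with an invented helper name), using the volume controller mentioned above:

```python
# Sketch: a Control Change message is 1011nnnn, controller, value.
def control_change(channel, controller, value):
    return bytes([0xB0 | (channel - 1), controller, value])

msg = control_change(1, 7, 100)   # set channel 1 volume to 100
print([hex(b) for b in msg])      # ['0xb0', '0x7', '0x64']
```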
The last type of channel message you'll come across is the Pitch Bend message. These are transmitted when the pitch-wheel/lever changes position and consist of a status byte (1110nnnn binary) followed by two data bytes which identify the position of the wheel.
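The two data bytes together carry a 14-bit wheel position, least significant seven bits first. A sketch of how a receiver puts them back together (illustrative; 8192 is the centre, wheel-at-rest value):

```python
# Sketch: reassembling the 14-bit pitch bend value from its
# two 7-bit data bytes (LSB arrives first).
def decode_pitch_bend(lsb, msb):
    return (msb << 7) | lsb

print(decode_pitch_bend(0x00, 0x40))   # 8192 - wheel at rest
print(decode_pitch_bend(0x7F, 0x7F))   # 16383 - fully up
```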
Excessive use of both pitch bend and controllers can result in a lot of messages going down the MIDI line. Some equipment allows the user to choose whether or not such messages are transmitted (often the default is NOT to send them). Sequencers may also well filter out these types of messages from the data stream unless you specifically ask for them to be recorded. If you hit problems in this area it's usually easy to sort out because there are only two possibilities... either your MIDI gear isn't transmitting the messages in the first place, or your sequencer isn't set up to receive them!