The Performing Art (Part 2)
Part two of this series on combining live instrument performances and MIDI recording looks at a typical session in the studio of Ollie Crooke and Simon Thomas.
IN THE SECOND PART OF THIS SERIES ON USING MIDI TO RECORD MUSICIANS' LIVE PERFORMANCES, THE SPOTLIGHT FALLS ON THE HEART OF THE SYSTEM: THE SEQUENCER.
ON A RECENT session at our studio, Roland Kerridge, a great drummer noted for his disgusting intimacy with the Simmons SDX, and our own Ollie Crooke, 'Master of the MIDI Bass', played a track live into Notator. A short discussion of the practical problems involved in that session should prove illuminating for anyone contemplating any complex live sequencing.
"A LOT OF GLITCHES THAT LOOK APPALLING WHEN YOU SEE THEM SPRAWLING ACROSS YOUR SEQUENCER EDIT SCREEN DON'T ACTUALLY PLAY, SO THERE'S NO NEED TO GET RID OF THEM"
Firstly, the instruments: the Wal MB4 MIDI Bass transmits only on MIDI channels 1-4 (one for each string, to drive four synth or sample voices in Mono Mode 4). Using it with any sequencer that doesn't have a MIDI channel de-mix facility is going to involve you in an awful lot of painstaking editing to separate the drums from the bass. Luckily, Notator has just such a facility. In order to separate the drums in the computer we get the SDX to send them grouped on different MIDI channels; with the bass on channels 1-4 we put the bass drum on channel 5, the snare and rim on 6, the hi-hat on 7, the tom toms on 8, the crash cymbal on 9 and the ride on 10. Keeping the snare and rim on the same channel and grouping the toms is done to make any visual editing a bit easier - another way would be to use your sequencer's drum edit facility if it has one. Pro24 has a very useful drum edit screen for just this kind of thing. Visually it makes sense for us to merge the bass tracks back into a single track after the MIDI channel de-mix, as they'll usually be edited as a single instrument.
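For anyone wondering what a channel de-mix actually does to the data, here's a minimal Python sketch. The event format - (channel, note, velocity, time) tuples - is entirely hypothetical; Notator's own internal representation will of course differ:

```python
def demix_by_channel(events):
    """Split a merged recording into one track per MIDI channel."""
    tracks = {}
    for ev in events:
        tracks.setdefault(ev[0], []).append(ev)  # ev[0] is the channel
    return tracks

def merge_channels(tracks, channels):
    """Re-merge several channels (e.g. the four bass strings on 1-4)
    into a single track, sorted back into time order."""
    merged = [ev for ch in channels for ev in tracks.get(ch, [])]
    return sorted(merged, key=lambda ev: ev[3])  # ev[3] is the time
```

Re-merging channels 1-4 after the de-mix gives you the bass back as one editable track while the drum voices stay separated.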
There are two extra MIDI Ins on Notator's Unitor SMPTE unit, and so merging is not a problem for us. We used to use the Philip Rees 2M MIDI Merge unit which was fine 90% of the time but had a tendency to misbehave at the end of the day when it had heated up. This could be a mite frustrating when someone played a great five-minute rhythm track into the computer only to find that the merge unit had omitted all the Note Offs or given all the notes negative lengths. Mentioning this brings another golden rule to mind: save everything every time you do anything.
So here we are, with our record and playback patches set up on the Sycologic M16 MIDI patchbay - the MIDI bass patched to the Atari and to a Yamaha TX802, and the SDX straight into the Atari.
The soft MIDI Thru is off on the Atari (because the SDX doesn't like it) and all the relevant MIDI channels are set up on the bass and drums. What next?
There are some keyboard lines programmed in the computer for the chaps to play along to, and we decide to add some percussion. We opt for a repeated percussion pattern rather than a strict (and boring) metronomic cow bell because it's more conducive to the creation of that elusive element, feel, so we use a cabasa and cow bell pattern that fits in well with the rest of the track and quantise it to be rigidly on the beat. Playing to a click track is a skill that needs to be worked on, and this is especially true for drummers who, in most cases, are used to setting the tempo. Luckily for us Roland is well versed in the art of being a "Slave to the Rhythm" and copes well with the percussion line.
Although the session is taking place entirely in the studio control room and we've got a great (and very loud) mix coming back over the monitors, we put Roland in a pair of cans. This is because drummers are so used to feeling the sound coming up at them when they hit a drum that listening to their playing coming from a pair of monitors behind them is very disconcerting. Some drummers think that they can hear a delay on the SDX (it's something like two milliseconds so this is fairly unlikely) when they listen over speakers. The other good thing about cans is that they block out the clatter of the stick and pedal noise of the pads so the mix doesn't have to be quite as loud for the rest of us mere mortals. So off we go: men and machines in perfect harmony.
We save three takes to disk and decide (typically) that the first one is the best. The next step is to de-mix all MIDI channels (so we now have ten separate tracks) and then stick channels 1-4 back together again to give us the bass part on one track. Next we must decide on a strategy for editing. Unfortunately most of the parts are a bit too rhythmically varied for any comprehensive quantisation - 16th triplets mixed in with straight 16ths on the hi-hat, fast double stroke rolls on the snare and toms. The song breaks down quite neatly into verse, bridge, chorus and middle sections, and the most sensible strategy seems to be to get a good single basic pattern for each section, arrange them into a song and then graft on a selection of fills, rolls and variations extracted from the live recording. It sounds long and laborious but a lot of the fills are usable as they were played - no, they were brilliant as they were played.
"GETTING TO KNOW A SEQUENCER TAKES TIME - UNLESS YOU'RE GOING TO SPEND A WEEK WITH EVERY SEQUENCER ON THE MARKET, YOU'RE NOT GOING TO BE ABLE TO MAKE A FULLY INFORMED CHOICE."
The MIDI bass tracks beautifully but it also glitches a fair amount - some of the glitching is the instrument's failing, but some just reflects the difference between electronic and acoustic instruments: a lot of passing notes and bumped strings that don't come out very audibly on an acoustic instrument are treated a lot less sympathetically by MIDI - although how "naturally" it responds depends on how well the patch has been programmed (a matter we'll be considering in the next part of this series). A lot of glitches that look appalling when you see them sprawling across your sequencer edit page don't actually play, so there's no need to get rid of them (unless you can't bear to see them ruining the visual effect of your classic performance). Notator has a handy function called Delete Short Notes which can help a lot with double triggers caused by imperfect left and right hand co-ordination. Unfortunately, slides on the MB4 (and other MIDI guitar controllers) come out as lots of short notes, so there are no cut and dried solutions for editing.
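The idea behind a Delete Short Notes function is simple enough to sketch - here assuming notes as dicts with a length in clicks, a hypothetical format rather than Notator's own:

```python
def delete_short_notes(notes, min_length):
    """Drop any note shorter than min_length clicks - typically the
    double triggers and bumped strings that glitch but barely sound."""
    return [n for n in notes if n["length"] >= min_length]
```

The catch, as noted above, is that legitimate slides also arrive as runs of short notes, so the threshold has to be chosen (and sometimes applied selectively) by ear.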
Back to the job in hand. Roland selects a 16th-note pattern on the hi-hat that he likes, and we use that as the Groove quantise template. The next problem is how to use it. The hi-hat track is liberally splattered with triplet fills which seem to preclude any easy quantisation. Simon comes to the rescue with a bit of lateral thinking. He copies the hi-hat track and then uses the Forced Legato function (which extends the length of each note to the beginning of the next note) on both tracks. Then from one track he deletes all the notes over 38 clicks long (16th triplets are 32 clicks long, so this allows some playing feel) and on the other track he deletes all the notes under 38 clicks long. This way he has all the notes a 16th or longer on one track and all the 16th triplets on the other. With two different groove quantisations the hi-hat is happening (as they say in the biz) - well, nearly. For some strange reason the open and closed hi-hat pedal control on the SDX is MIDI Controller number 17 and, quite sensibly, Notator doesn't quantise Controller information along with Note Ons. Our solutions to this depend on how the drummer plays - if he or she is consistently ahead of or behind the beat, then separating out the controller information and using a positive or negative delay usually sorts the problem out. If the drummer is all over the place, then repairs have to be done by hand (or the track played again, or the pedal overdubbed using a Mod Wheel or some other 'trick' solution). Luckily Roland is very accurate and so the delay trick works most of the time. We use the same quantisation tricks on everything else, and the drum track is rocking like it's in line for the Poll Tax. Bass next.
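Simon's Forced Legato trick can be sketched like this (again a hypothetical note format; the 38-click threshold splits straight 16ths from 16th triplets with a little slack for playing feel):

```python
def forced_legato(notes):
    """Extend each note to the start of the next (notes sorted by time)."""
    if not notes:
        return []
    out = [{**cur, "length": nxt["time"] - cur["time"]}
           for cur, nxt in zip(notes, notes[1:])]
    out.append(dict(notes[-1]))  # the last note keeps its played length
    return out

def split_by_length(notes, threshold=38):
    """One track for the 16ths and longer, one for the 16th triplets."""
    longs = [n for n in notes if n["length"] >= threshold]
    shorts = [n for n in notes if n["length"] < threshold]
    return longs, shorts
```

Once the lengths encode the gap to the next hit, a single threshold cleanly sorts duplets from triplets, and each track can then take its own groove quantise.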
Unfortunately, changing the lengths of the bass notes doesn't really sound very good, so we have to experiment. We use a few different tricks in different places and generally do things more sectionally than with the drums. Using a Capture Window just to pull notes on the beat into place, and leaving things in between unquantised, works well, but we have to seek out a few isolated places where it throws the relative timing out. Generally, when quantising melodic and harmonic instruments, it's much more difficult to keep a natural feel and cope with things like controller and pitchbend information than it is with pitchless percussion tracks.
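A Capture Window quantise of the kind we used on the bass might look like this in outline (hypothetical note format; grid and window sizes are in clicks):

```python
def capture_window_quantise(notes, grid=48, window=6):
    """Pull only the notes within +/- window clicks of a grid line onto
    the grid; everything in between is left exactly as played."""
    out = []
    for n in notes:
        offset = n["time"] % grid
        if offset <= window:                      # just late: pull back
            out.append({**n, "time": n["time"] - offset})
        elif grid - offset <= window:             # just early: push forward
            out.append({**n, "time": n["time"] + (grid - offset)})
        else:                                     # nowhere near a beat: leave it
            out.append(dict(n))
    return out
```

The danger the article mentions is visible in the code: a note just inside the window moves while its neighbour just outside it doesn't, so the relative timing between the two can end up wrong and needs checking by ear.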
Overall there isn't a huge amount to do to either the bass or drums considering how much information there is in the track, and most of what we have to do is just making certain beats particularly tight. One difficulty for the person doing the programming in sessions like this is that it does take a relatively long time to do the editing once the track has been played. This is no reflection on anyone's musicianship; it's just a fact that the impossible does take a while, and getting something impossibly tight is no exception. Good judgement comes in when choosing the take that's easiest to edit. It can also be a good idea to do the playing in one session and the editing in another - this gives you a little time to listen to all the different takes and decide what to use.
If you're going to mix sections from different takes together then you should watch out for variations in velocity as well as any differences in the feel. This can be fixed with MIDI compression or by simply adding or subtracting a fixed amount using a logical editor on the imported pattern. Doing the two jobs at different times also means that the programmer can get on with the editing without having a load of bored musicians playing with all the toys in the studio and pouring coffee into the mixing desk.
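Adding or subtracting a fixed velocity amount with a logical editor boils down to something like this (a sketch only, clamped to stay inside the legal MIDI velocity range):

```python
def offset_velocity(notes, amount):
    """Shift every note's velocity by a fixed amount, clamped to 1-127
    (velocity 0 would read as a Note Off in running status)."""
    return [{**n, "velocity": max(1, min(127, n["velocity"] + amount))}
            for n in notes]
```

A flat offset like this preserves the internal dynamics of the imported pattern, which is usually what you want when matching sections from different takes; MIDI compression (scaling rather than shifting) is the alternative when the dynamic range itself differs.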
A couple of tips on MIDI controller data: when you're recording a MIDI controller that also makes acoustic noises and putting the signal to tape whilst the MIDI goes to the sequencer, you have two basic options. The first is to get a near-perfect take on tape, edit the glitches in the MIDI data, but keep the overall timing relatively unquantised - this is fine for lead work but if it's a bassline or a rhythm guitar then you may have to use the rhythm unquantised as your groove template to quantise everything else. When this works it can sound absolutely brilliant, but remember that you can do drop-ins with MIDI as well as onto tape and so it's quite possible to repair "untight" sections. The alternative comes into its own when dealing with basslines: use loops. Look through your sequence for a really tight section, work out where it is on tape, sample it and then create a pattern in the sequence that triggers it off your sampler. You can then get an impossibly tight rhythm section with an enormous synth bass plus a fantastic slap sound.
Another problem is "machine gunning" on fast drum rolls. This is caused by the second trigger of a sound cutting off the first one before it's completed its envelope. With the SDX we can cure this by assigning another voice to the sample and the brain will deal with all the voicing logic. With drum machines or samplers where this isn't possible there is another way around it: make a copy of the sound in question and assign it an adjacent MIDI note number and a different output. The hard part is to then go through the roll in your sequence and change every other note to the second note number. As long as there aren't a huge number of very fast rolls then it shouldn't be too laborious and it does make them sound much more realistic.
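The every-other-note swap for machine-gunned rolls is tedious by hand but trivial to express (hypothetical note format; primary and secondary are the two adjacent note numbers the duplicated sound is mapped to):

```python
def alternate_notes(roll, primary, secondary):
    """Re-point every other hit in a roll at a duplicate of the sound on
    an adjacent note number, so successive hits trigger different voices
    and don't cut each other's envelopes off."""
    return [{**n, "note": primary if i % 2 == 0 else secondary}
            for i, n in enumerate(roll)]
```

The by-hand part is isolating the roll itself in the sequence; once you have it as a selection, the alternation is mechanical.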
A lot of editing of live MIDI performances can be tedious and time consuming, but the logical editors on sequencers are getting more and more sophisticated and offer some help. The best way to minimise the amount of time you spend editing is to maximise the performance of the musicians playing into the computer. The musical possibilities being opened up by this kind of technology are only just beginning to be realised, and at the moment the sound-producing modules seem to be lagging behind the controllers in their adaptability to real-time control. But soon, they'll catch up and we musicians will be doing some more manual reading. All you have to remember is that the machines will never catch up with our ability to conceive music. Will they?