Kendall Wrightson argues the case for a universal editing interface for MIDI keyboards.
Pressing a key on an average synthesizer in 1989, one can confidently expect to hear more or less any sound it is possible to hear. This is all totally cool and happening if the sound you want just happens to be Pygmy Death Piano or Mutant Cowboy & Bicycle. Sometimes, of course, these sounds fit the bill exactly - but perhaps nine times out of ten, they don't. So powerful have synthesizers become, that the sonic possibilities are virtually limitless (which, curiously enough, is what the synth manufacturers have been claiming since the early 70s!).
Why is it then, that after more than 20 years of technological breakthrough and design innovation, the massive creative potential of the contemporary synthesizer is under the control of one solitary, innocuous, data entry slider?
Clearly manufacturers have a problem with musicians, since they seem perfectly able to design a decent interface for everyone else. For example, sound engineers get standard tape transport controls, computer operators get an alphanumeric keyboard and a mouse. So why make the musician settle for one slider and a couple of data increment buttons?
Imagine if a mass market product like the television had such an interface. A lot of people would never realise that there was anything more to be had from a television than a somewhat dark ITV in black and white with no sound.
This would inevitably produce a breed of people called 'TV programmers'. For a modest arm or leg they would arrive, resplendent in Ferguson T-shirts, armed with their computers and editing software, and would speedily edit the tone, brightness and channel of your telly, pausing only to grin and talk about hi-fi.
Musicians also like to edit tone, brightness and channel, but are forced to learn a new method for every synth they encounter - and so most just give up. And does editing software or the add-on programmer actually provide a quick solution in the real and spontaneous world of the recording studio?
Imagine you are working on a commercial for the Vegetable Marketing Board with a producer who gives you two minutes to come up with "a sort of crinkly black sound with a touch of humour and a slight suggestion of potato". You find a preset that nearly fits the bill (well, the potato bit anyway), and all you want to do is decrease the brightness, shorten the release, increase the reverb time and you're there. It's not a synth you've used before, but you've read the review. No problem. "Get into edit mode; umm, turn memory protect off. Right. Find the, umm, volume envelope edit screen - Oh, amazing! Two envelopes per operator/oscillator/partial/doobry - and there's four of them! Wow! Could you just hang on while I set the computer up?" (Ten minutes later) "Umm, nearly there. Fantastic synth isn't it? Got some great presets. Do you mind if I just try preset 3? It's called 'ITV'..."
To be fair, designers have responded well to the need for better music keyboards, but in the rush to produce the biggest, lushest sound, the control interface has been put on the back burner. Presumably they are waiting for technology to come to the rescue? But as the membrane switch demonstrated, new doesn't necessarily mean better. Even a built-in touch screen is not the answer, as virtual mixing desks have proved. Computer screens may be helpful in displaying a lot of information graphically, but they'll never replace controls you can physically touch.
The problem is one of ergonomics. Every year, new synthesizers appear at half the price and twice the ability of their predecessors. The trouble is they are invariably 16 times more difficult to actually program. One wonders what might have been achieved if even half of the R&D invested in sonic ability had gone into ergonomic design.
So, enough moaning, what's the solution? Sliders! Sliders are the solution. In fact, a panel of eight or more sliders (preferably motorised like the 'Dumpy 7' mixer). The usual sort of alphanumeric display would list the parameters currently assigned to each slider and give numerical confirmation of the sliders' values. Also, two switches would be positioned near the sliders for on/off functions, but these would be in addition to the standard data increment yes/no buttons. (The data increment buttons would apply to the last slider used.) There would also be a further button located near the modulation wheels, called the Restore button. That, then, is my hardware specification for a Standardised Electronic Musical Instrument Editor Interface - SEMIE, for want of a better acronym (and there must be one).
The software specification would involve having a standardised group of Control Modes. For example, Tone edit, Patch edit, MIDI edit, Utility, Edit, Load/Save, Sequencer edit, Effects edit, Sample edit, Rhythm edit - depending on the nature of the instrument. Each Control Mode would contain pages of up to eight parameters. (It should be possible to edit eight parameters in real time.) In the case of synth and effects editing, the ability to edit several parameters at once provides immediate feedback as to the parameters' interaction, therefore making editing more intuitive.
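The mode/page arrangement described above might be sketched in present-day Python like this. Every name and parameter here is invented for illustration; the article specifies only that modes contain pages of up to eight parameters, one per slider.

```python
from dataclasses import dataclass, field

SLIDERS_PER_PAGE = 8  # one parameter per physical slider

@dataclass
class Parameter:
    name: str
    value: int = 0   # shown numerically on the alphanumeric display
    lo: int = 0
    hi: int = 127

@dataclass
class Page:
    params: list  # up to SLIDERS_PER_PAGE Parameter objects

    def __post_init__(self):
        assert len(self.params) <= SLIDERS_PER_PAGE

@dataclass
class ControlMode:
    name: str                 # e.g. "Tone edit", "Patch edit"
    pages: list = field(default_factory=list)

# A hypothetical Tone edit mode: one page of familiar analogue controls
tone = ControlMode("Tone edit", pages=[
    Page([Parameter(n) for n in
          ("Attack", "Decay", "Sustain", "Release",
           "Cutoff", "Resonance", "VCA level", "Glide")])
])
```

The point of the structure is simply that any instrument, whatever its synthesis method, presents the same mode/page/slider hierarchy to the player.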
The controls should be active at all times - giving more creativity in performance - as well as making it easier to change, say, the auto-correct quantisation level while a sequence is still running. No doubt everyone can think of ways to organise and standardise the parameters for each mode, but an optimum way surely exists. The Patch edit and Sample edit modes would, of course, depend on the method of synthesis or sampling employed. The Tone edit mode I'll come to in a moment.
Most importantly, all these parameters would be given standard MIDI Controller definitions. This would mean: (1) any one device could edit any other; (2) all Controller changes could be recorded into a sequencer to aid debugging. Thus, it would be possible to replay the sequence to see if the fault is duplicated.
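What "standard Controller definitions" would mean in practice is ordinary three-byte MIDI Control Change messages, which any sequencer can already record. The particular CC numbers below are my own invention for illustration, not part of any standard the article cites:

```python
# Hypothetical SEMIE slider -> MIDI Controller assignments
SEMIE_CC = {
    "Attack": 73, "Release": 72, "Cutoff": 74, "Resonance": 71,
}

def control_change(channel, controller, value):
    """Build a raw 3-byte MIDI Control Change message (status 0xBn)."""
    assert 0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127
    return bytes([0xB0 | channel, controller, value])

# Moving the 'Cutoff' slider to 100 on MIDI channel 1 (0-based channel 0):
msg = control_change(0, SEMIE_CC["Cutoff"], 100)

# Because these are ordinary Controller messages, a sequencer records
# them alongside the notes; replaying the sequence replays the edits,
# which is what makes the fault-finding described above possible.
recorded = [msg]
```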
The method of synthesis of the early analogue machines - ie. a voltage-controlled amplifier (VCA) and voltage-controlled filter (VCF) with an ADSR envelope for each - is well known and easily understood. These functions should be available on all synths, in addition to any envelope/filter type functions that form part of the machine's own method of synthesis. In other words, the VCA and VCF functions should control the entire pre-effects sound. If a synthesizer can have a complete multi-effects section built in, then surely an extra VCF and VCA would be a mere bagatelle. Unlike an instrument's synthesis (Patch editing) parameters, this analogue-style Tone edit section could be standardised. For synths, the Tone edit mode should be the default function of the eight sliders.
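For readers who haven't met it, the ADSR envelope that this Tone edit mode would expose is easily stated. The sketch below assumes the key is held past the end of the decay stage; timings are in arbitrary units:

```python
def adsr(t, attack, decay, sustain, release, gate_time):
    """Envelope level (0..1) at time t, for a key held for gate_time.
    Assumes gate_time >= attack + decay (key held past the decay)."""
    if t < 0:
        return 0.0
    if t < attack:                       # rise from 0 to full level
        return t / attack
    if t < attack + decay:               # fall from full level to sustain
        return 1.0 - (1.0 - sustain) * (t - attack) / decay
    if t < gate_time:                    # hold while the key is down
        return sustain
    rel = t - gate_time                  # fall to silence after key-up
    return max(0.0, sustain * (1.0 - rel / release))
```

The proposal above is that this envelope (plus a cutoff/resonance filter) sit over the *whole* pre-effects output, however exotic the synthesis underneath.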
As mentioned previously, the controls should be active at all times. This is where the Restore switch comes in. Say, for example, that during a performance the volume release was lengthened and the filter opened. Normally, to recover the original patch, it would be necessary to 'dial up' the stored preset again. On the majority of synths this produces an audible glitch and cuts off any notes still sounding. The function of the Restore button then, is to restore the original patch without glitching or altering the tone of any notes still sustaining.
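The glitch-free Restore behaviour amounts to re-sending only the edited parameters as ordinary Controller moves, rather than re-dialling the whole preset. A minimal sketch, in which `send` stands for some hypothetical routine that transmits one Controller change without retriggering voices:

```python
def restore(stored, current, send):
    """Return every edited parameter to its stored value.
    `send(name, value)` is assumed not to retrigger sounding notes."""
    for name, value in stored.items():
        if current.get(name) != value:
            send(name, value)
            current[name] = value

stored  = {"Release": 40, "Cutoff": 64}    # the saved preset
current = {"Release": 90, "Cutoff": 110}   # edits made mid-performance
moves = []
restore(stored, current, lambda n, v: moves.append((n, v)))
# current now matches stored; only the two edited parameters were sent,
# so notes still sustaining carry on sounding
```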
As synthesizers have so many sound generators these days, a 'parameter data copy' function is usually employed. With eight sliders available, it should be possible to subgroup sliders (like on a mixing desk) to alter, say, the attack of four oscillators at once. Furthermore, as it would be possible to modify eight parameters at once, it should also be possible to 'stack' eight different parameters on one slider. Thus, one slider might be used to simultaneously open the filter, decrease oscillator A pitch, change the mix between layer 1 and 2... and so on.
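Stacking is just a table mapping one slider position onto several parameters, each with its own range and direction. The parameter names and scalings below are invented for illustration:

```python
# (name, value at slider position 0, value at slider position 127)
stack = [
    ("Filter cutoff",   0, 127),
    ("Osc A pitch",    64,  32),   # inverted: pitch falls as slider rises
    ("Layer 1/2 mix",   0, 127),
]

def apply_stack(slider_value, stack):
    """Map one 0-127 slider position onto every stacked parameter."""
    out = {}
    for name, lo, hi in stack:
        out[name] = lo + (hi - lo) * slider_value // 127
    return out

apply_stack(127, stack)
# -> {'Filter cutoff': 127, 'Osc A pitch': 32, 'Layer 1/2 mix': 127}
```

Subgrouping is the degenerate case: the same parameter name repeated across several sound generators.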
The cost of implementing something like the motorised SEMIE I'm proposing would be high for any one electronic musical instrument manufacturer's range, but if all instruments were fitted with identical SEMIE devices, the cost would inevitably fall. Synths currently all use data entry sliders, so why not?
Clearly there are many possibilities for improvement, but in addressing the many grievances I've nurtured over the years, I don't want to detract from the main advantage that some kind of data entry standard would offer - easier access to an electronic musical instrument's creative potential.