The Appliance Of Science
Vancouver, Canada was the location for two hi-tech music events last August. Ron Briefel reports on the first of them, Digicon '85, and finds he has plenty to talk about.
Vancouver, Canada played host to two major celebrations of new music technology in August of this year. Next month we'll report on the International Computer Music Conference, the academics' showcase. This month it's the turn of the end-users' extravaganza, Digicon '85.
If you're interested in the fusion of art and technology, Vancouver is the place to be. The biggest city in British Columbia, Canada's westernmost province, has modern theatres, art galleries, laboratories and lecture halls by the dozen, a TV and radio network with an admirable attitude towards hi-tech music, and a magnificent new building called the Expo Centre, architectural centrepiece of an artistic and scientific community that's as forward-looking as any other on the face of the globe.
The Centre will be at the heart of the action when Vancouver plays host to the World Exposition (the largest of its kind for well over a decade) in 1986. But in August of this year, it played host to a remarkable festival by the name of Digicon '85.
This enthusiastic celebration of things hi-tech coincided with the staging of the International Computer Music Conference in the same city. Which meant that for just over a week, Vancouver was full of people from all over the world, who'd all come to the same place out of a common interest in the use of digital technology in music and the visual arts. While Digicon's itinerary put the emphasis firmly on end users, ICMC had a more academic, more theoretical bias that meant much of its programme was aimed at designers and engineers — though there were still plenty of avant garde composers in evidence at the latter event.
Various pre-conference announcements for Digicon promised the participation of such luminaries as John Chowning, Patrick Moraz, Bill Bruford, Allan Holdsworth, Todd Rundgren, and Wendy Carlos. Needless to say, none of these folk could actually make it to the event as such, but unless you were a fully paid-up member of the Autograph Hunters' Club, their absence didn't detract from Digicon's attractiveness.
One of those announcements also made great play of a planned transcontinental recording session called Overture 2000.
The performance was to employ a MIDI, PCM and video link by satellite between Los Angeles and Vancouver, with a rhythm section in LA laying down MIDI backing tracks for a group of Vancouver-based musicians to play over just half-a-second later. But owing to some technical problems (the exact nature of which I'll go into later), the Overture was never actually performed. And that did annoy some people...
The bulk of the conference programme was dedicated to the visual arts, with a vast range of talks and presentations by artists in the computer graphics and video fields. The music fans among us had to make do with Bob Moog and Roger Linn as 'star' presenters.
As developments in the computer graphics arena occur with increasing speed as a result of pressure from commercial forces, so those working on the hi-tech side of the visual arts are polarising into two groups: those that accept the process of commercialisation, and those that do not.
The 'real' artists feel that their colleagues in the commercial orbit are too concerned with technique and form for their own sake, and used Digicon as a platform from which to launch their attack. The commercial side responded admirably, but the whole debate (which isn't a million miles removed from the one in the academic music world, where composers who accept commercial instruments do battle with those that do not) soon became rather facile. After all, it's now fairly clear that without the commercial world's continuous demand for new visual imagery, most of the latest imagemaking technology would never have come into existence at all.
For proof of that, you need look no further than the graphics capabilities of home computers, which are becoming more sophisticated as each week goes by. One 'real' artist who's taken advantage of some of these techniques while retaining academic credibility is David Em, whose graphics images are what grace the page now in front of you.
Em's talk was one of Digicon's highlights, not least because, even though his work currently depends on some of the most sophisticated hardware and software available, the sort of technology he's using will soon be accessible to the mass market, thanks to the new generation of 16-bit home micros.
Em has succeeded in becoming official Artist in Residence at the California Institute of Technology, where he has access to software written for NASA jet propulsion research projects, amongst other things. Apparently, the Institute's scientists and engineers find it useful to have an artist's feedback on their graphics system's capabilities. They also find Em's imaginative images stimulating and intriguing, as indeed did all those who witnessed his work at Digicon.
Similarly striking were the graphics images produced by one Alvy Ray Smith, project leader of George 'Star Wars' Lucas' computer graphics team. The team have access to just about the most powerful computer graphics systems currently in existence, and as an example of their work, Smith presented an image of a billiard table top, complete with several balls in motion. I for one was certain I was looking at a photograph, but the image turned out to be entirely computer-generated, right down to the finest details like the reflections of the surroundings in the balls, and the motion blur of those that were moving. Very impressive indeed.
And so to Overture 2000, or rather the lack of it. The prime technical reason for its non-appearance (and hence the non-appearance of the big names I mentioned at the start) was the late availability of the multiplex/demultiplex system needed to handle eight separate asynchronous MIDI signals. Had things gone according to plan, the multiplexer would have sliced the MIDI input into a 1.544Mbit/s datastream, which would then have been transmitted via satellite to a demultiplexer at the other end of the 'DIN cable in the sky' for separation back into separate MIDI signals and routing to individual instruments.
As things turned out, none of the invited artists wanted to take the risk of participating without exhaustive tests and trial runs, but we were at least treated to a scaled-down demonstration of the process by the man responsible for its technical development, Ralph Dyck. Within the confines of the conference room, the satellite was replaced by nothing more elaborate than a piece of wire, but the conversion from the standard asynchronous MIDI transmission rate of 31.25kBaud to the synchronous 1.544Mbit/s datastream did take place, so we know it works.
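For the technically-minded, a quick back-of-the-envelope calculation shows why eight MIDI streams sit so comfortably inside that datastream. The figures come from the text above; the sketch itself is mine, not Dyck's design:

```python
# Back-of-envelope check (illustrative, not Dyck's actual multiplexer):
# eight asynchronous MIDI streams time-division-multiplexed into a
# single satellite channel running at the standard T1 rate.
MIDI_BAUD = 31_250          # standard MIDI transmission rate, bits per second
NUM_STREAMS = 8
CHANNEL_RATE = 1_544_000    # 1.544 Mbit/s synchronous datastream

aggregate = MIDI_BAUD * NUM_STREAMS      # total MIDI payload: 250,000 bit/s
headroom = CHANNEL_RATE - aggregate      # spare capacity for framing etc.

print(f"Aggregate MIDI payload: {aggregate} bit/s")
print(f"Channel headroom:       {headroom} bit/s")
assert aggregate < CHANNEL_RATE  # the eight streams fit with room to spare
```

In other words, the eight MIDI signals together occupy barely a sixth of the channel, leaving generous room for synchronisation and framing data.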
With luck, the system will have been perfected by the time we get to Digicon '87, and Allan Holdsworth really will play a jam session with Todd Rundgren via a satellite link-up...
Of the remaining presentations at this year's extravaganza, two stand out as being exceptional.
The first was a performance by computer wizard Ed Tannenbaum and dancer/choreographer Karen Koyanagi. Their display involved Tannenbaum's computer hardware and software actually producing modified images of Koyanagi's movement, and projecting them onto a screen to one side of the stage. Stimulating, inventive, and still with great scope for future experimentation.
The second significant event was the showing of an Omnimax-format computer graphics film titled The Magic Egg. Now, Omnimax is a system of projecting a film onto a hemispherical dome not entirely unlike a planetarium, the difference being that the audience sits right up inside the dome. If you want to experience what I can only describe as a sensational 'surround-vision' effect, it's the only place to be.
The visuals for The Magic Egg were a tantalising mixture of terrifyingly realistic 'dangerous flying' effects (swooping over mountains and cities, even crashing on the odd occasion) and some weird and wonderful abstract imagery (huge sunbursts, giant Slinkies turning into butterflies, and other strange spatial transformations). Subtle it was not, but there was no escaping its power, or forgetting any detail of it afterwards.
A surround-sound system took care of the music, a specially-composed set of pieces by US synth magician Michael Boddicker. The morning after the performance, Boddicker brought everything back down to earth again with a detailed talk on the techniques he used in composing and recording the music.
He has a well-equipped keyboard studio that numbers a PPG, a DX7, an Emulator, a Jupiter 6, and several sequencers (including a Linn 9000) among its arsenal of hardware.
Being involved in writing film soundtracks has made Boddicker acutely aware of the importance of timing, and this in turn has forced him to deal with MIDI timing difficulties in a forceful, no-nonsense fashion. He's succeeded in overcoming the delay between MIDI channels so many musicians come up against during sequencing, by the simple expedient of never using more than one MIDI channel in the first place.
More cunning is the method he's devised of getting round the problem of different MIDI instruments taking different lengths of time to process incoming MIDI data from sequencers and the like. According to Boddicker, a DX7 takes around 7-8 milliseconds, an Emulator around 12, and a PPG a massive 24 milliseconds. Naturally, these delays can drastically weaken the impact of what should be a synchronised attack. But Boddicker has devised a method of delaying the click-track of each instrument before its output is laid down on tape.
This works fine so long as he can find out the MIDI implementation delays for each instrument. If he can't, Boddicker resorts to measuring delays with a pair of callipers on a multitrack tape containing a MIDI-implemented, multi-keyboard attack, arranged so that it's perfectly in sync according to the manufacturers' official specifications...
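The logic of the trick is easy enough to sketch: delay each instrument's click by the difference between the slowest machine's processing time and its own, so every attack lands on tape together. The delay figures are the ones Boddicker quoted (I've taken 7.5ms as the midpoint of his DX7 estimate); the code is my own illustration, not his studio setup:

```python
# Sketch of Boddicker-style click-track compensation (illustrative).
# Per-instrument MIDI processing delays, in milliseconds, as quoted
# in his talk (7.5 ms is the midpoint of the "7-8 ms" DX7 figure).
delays_ms = {"DX7": 7.5, "Emulator": 12.0, "PPG": 24.0}

# Delay each click by (slowest - own) so all attacks coincide:
# the sluggish PPG gets no extra delay, faster machines wait for it.
slowest = max(delays_ms.values())
click_offsets = {name: slowest - d for name, d in delays_ms.items()}

for name in sorted(click_offsets):
    print(f"{name}: delay click-track by {click_offsets[name]:.1f} ms")
```

Run that and the DX7's click comes out delayed by 16.5ms, the Emulator's by 12ms, and the PPG's not at all — which is exactly the relationship Boddicker engineers onto tape.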
What Boddicker really needs is some sort of programmable MIDI delay compensator, a subject that brings us nicely on to the next musical speaker of note, Roger Linn.
The inventor of the industry standard LinnDrum spends most of his time these days trying to convince people of the worth of his company's Linn 9000 MIDI drum machine/recorder. His talk concentrated on the machine's history, its technical development, its capabilities, and why everybody should have one.
Linn has an easier time selling the 9000 across the pond than he does in the UK, where its dollar-inflated price tag makes it a wildly uncompetitive device desperately in search of some software updates to bring it into line with machines from other manufacturers. As things turned out, Linn used Digicon as the launch pad for some of that new software, which will provide for user-sampling and full step-time writing facilities.
In answer to Michael Boddicker's prayers, Roger Linn went on to confirm that he's working on a 32-track sequencer that will have the ability to shift individual tracks backwards or forwards in time, programmable in milliseconds. As well as being musically useful as an editing system in its own right, such a facility would largely overcome the problems of MIDI processing delays experienced by Boddicker and so many other musicians.
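The facility Linn described amounts to sliding every event on a track by a signed number of milliseconds. A minimal sketch of the idea (the function name and data layout are my own invention, not Linn's software):

```python
# Hypothetical sketch of the track-shift facility Linn described:
# slide every event on a track forwards or backwards in time by a
# signed number of milliseconds. Names and structure are illustrative.
def shift_track(events, offset_ms):
    """events: list of (time_ms, midi_message) tuples, in time order.

    Negative offsets pull the track earlier; events are clamped so
    nothing ends up before the start of the song.
    """
    return [(max(0, t + offset_ms), msg) for t, msg in events]

track = [(0, "note_on C3"), (500, "note_off C3"), (1000, "note_on E3")]
# Pull a sluggish PPG's track 24 ms early to cancel its processing delay:
print(shift_track(track, -24))
```

Applied per track, a shift like this cancels each instrument's MIDI processing delay at the editing stage rather than on tape.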
And Linn has extended his MIDI crystal ball-gazing to the more distant future, in which he sees the development of such marvels as an intelligent MIDI data processing computer system. The system would include what Linn terms 'full MIDI database librarian capabilities', which means it would be capable of supporting patch dumps for specified MIDI keyboard setups, configuration programming, MIDI delay analysis and automatic delay compensation.
In the discussions that followed Linn's talk, it became clear that the possibilities for MIDI code destructuring and reprogramming are as endless as the variety of people using the interface. Assorted members of the audience ventured ideas such as manipulating MIDI data to obtain 'reverse pitching', so that going up in pitch on one keyboard results in corresponding descending pitches on another connected via Linn's all-embracing computer system. And how about a pitch-to-patch converter program...?
Next came Bob Moog, grandaddy of them all. Like Linn's, his talk centred around the piece of technology he's currently spending most of his time working with: in this case the Kurzweil 250. Moog describes the Kurzweil as 'the most complex digital instrument that's ever been manufactured for consumers', which probably isn't too far from the truth.
Drawing comparisons between hi-tech instrument design in the mid-80s and the same job 20 years ago, Moog contrasted the vast team of hardware and software engineers which Ray Kurzweil assembled around him to design the 250 with his own situation, two decades earlier, assembling the first Moog analogue synthesiser, without any help from outside, on his kitchen table. We all know Moog is a thorough worker, but you can't get much more detailed than the approach adopted by Kurzweil's engineers, who spend weeks minutely shaping the spectral content of each of their samples, only to subject them to merciless scrutiny by accomplished musicians whose reputation has been made on the acoustic instrument being sampled.
Less well-known than Linn or Moog is the name of Roger Nichols — though if you do ever look at the back of LP covers as well as the front, it could ring a bell. Nichols is a one-time nuclear engineer who has since become a three-time Grammy award winner as producer of albums by the likes of Steely Dan, Donald Fagen, John Denver and Stevie Wonder.
He's also a champion of digital recording technology, perhaps because, in the days before MIDI, he spent months working on ways of getting digitally-recorded instruments perfectly in sync with each other. He managed to devise a means of transferring the codes derived from his recordings directly into a computer, from which he made minute time and frequency adjustments through specially-written software. He then returned the shifted and/or processed sounds to digital tape and played them back in the studio. After doing this for a while, he discovered that shifts of as little as 40 milliseconds could have a marked effect on the overall sound of the interaction between two instruments. The synchronised, repeating piano and bass attacks on Donald Fagen's 'Ruby Baby' were achieved by this method.
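To put the scale of those adjustments in perspective, here's what a 40-millisecond shift looks like in raw samples. The 44.1kHz rate is my assumption for illustration, not a figure Nichols quoted:

```python
# Illustration of the scale Nichols was working at: a time shift
# expressed in samples, assuming a 44.1 kHz digital recording rate
# (the rate is an illustrative assumption, not from Nichols' talk).
SAMPLE_RATE = 44_100   # samples per second

def ms_to_samples(ms):
    """Convert a shift in milliseconds to a whole number of samples."""
    return round(ms * SAMPLE_RATE / 1000)

print(ms_to_samples(40))   # the 40 ms shift Nichols found audible
```

A 40ms shift works out at well over a thousand individual samples, so the software has plenty of resolution to play with at much finer settings.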
By way of a diversion, Nichols has carried out investigations into the way musicians introduce delays of their own when playing 'in time' with music. Thus, few people know more than Nichols about the way musicians respond to things like the tempo of the music they're playing to, the position in the bar at which they begin playing, where that bar is situated in the phrase, and the feel of the piece as a whole.
So Nichols has considerable skill and expertise at manipulating temporal musical events — but it does take him an awfully long time to produce albums. Personally, I suspect the advent of new technology will do little to speed his work. More likely, it'll delay the process by introducing yet more avenues for Nichols' tireless research...
Still, the producer had some relatively uninvolved things to say about MIDI. His main field of interest here is the development of a low-cost MIDI mixing desk. This would enable tasks such as fading, panning and EQ to be included in the MIDI datastream, and could therefore lead the way towards affordable studio automation. Don't laugh. It could be nearer than you think.
Politics reared its ugly head again as Digicon went into its later stages, but this time, the visitors were entreated to settle their theological differences and settle down to the business of getting the best out of new technology. The man doing the talking was one Bill Buxton, a sort of hi-tech Gandhi who spends his time preaching the vices of confrontation and the virtues of positive dialogue.
Thus, Buxton's Digicon sermon called for more artists in the lab, more scientists in the concert halls, more academics in the marketplace, more studio musicians and engineers in live performance, and so on until the world is a wonderful place to live in and everybody is blissfully happy.
All very praiseworthy, of course, and Buxton is more than just an idealist with nothing better to do than advocate peace, love and understanding. He also happens to be closely involved with a couple of recent hi-tech music developments, one of which is a general-purpose signal processing system accessed through a device called the Touch Tablet.
Basically, the Tablet is a highly touch-sensitive surface that works by sensing the capacitance of the player's fingers as they come into contact with it. It can process both pressure and area information very accurately, and sends the resulting data down an RS232 interface. MIDI data transfer is also available to and from the system.
The Touch Tablet's surface can be configured in a variety of different ways through software, and can be programmed to emulate or execute user-defined devices and operations. When you've decided what you want the Tablet to do and got hold of the necessary software, you place templates on the surface of the device to show each program and the area of the tablet that relates to it.
To give you some idea of what can be done, one software package and template provides five-voice FM synthesis capabilities, with specific areas of the Tablet providing control over volume, patch selection and timbre for each voice. To alter one of those parameters, all you do is brush your fingers across the relevant area on the Tablet; the harder you press, the quicker the change in parameter value takes place.
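That pressure-controls-the-rate-of-change behaviour is worth spelling out, because it's quite different from an ordinary fader. A minimal model of the idea (my own illustration of the behaviour described above, not Buxton's software):

```python
# Illustrative model (not Buxton's code) of the Tablet behaviour:
# finger pressure controls the *rate* of parameter change, so the
# harder you press, the quicker the value moves.
def update_parameter(value, pressure, dt, rate_per_unit=10.0):
    """Advance a MIDI-style parameter (0..127) by one control scan.

    pressure: normalised finger pressure, 0.0 (off) to 1.0 (hard)
    dt: time since the last scan, in seconds
    rate_per_unit: full-scale sweeps per second at maximum pressure
    """
    value += pressure * rate_per_unit * dt * 127
    return min(127.0, max(0.0, value))   # clamp to MIDI range

# Ten 10 ms scans at half pressure nudge the value from 0 to 63.5:
v = 0.0
for _ in range(10):
    v = update_parameter(v, pressure=0.5, dt=0.01)
print(round(v, 2))
```

A light brush therefore gives fine adjustment while a firm press sweeps the parameter quickly — one surface, two resolutions.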
Drum software and a computer graphics 'paintbox' package have also been developed for the Touch Tablet, while an extremely elaborate digital interconnection console for MIDI instruments has also been considered by Buxton. He envisages template areas representing actual musical instruments and outboard effects, so that the user could make or break connections between machines simply by a brush of the hand, or increase the output level of instruments by the same method.
Buxton has also been a great supporter of the IVL Pitchrider (itself a Canadian design), a pitch-to-MIDI control instrument that's been briefly described in these pages before and which was being demonstrated by its manufacturers at Digicon.
Buxton has been using the Pitchrider 2000 to access MIDI from his saxophone, and claims to find the results unbelievable — even though the model has yet to arrive in the UK pending some software-rewriting by IVL's design team. When it does come, the 2000 will be able to recognise the pitch of any note from a microphone source and convert it into a corresponding MIDI data value. It will also be able to output MIDI data, and provide some simple controls to allow you to set and adjust MIDI parameters such as MIDI channel number, pitchbend range, volume dynamics, operating mode, and so on. You'll also be able to adjust the machine's response time and instruct the Pitchrider to transpose its output.
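The heart of any pitch-to-MIDI converter is the mapping from a detected frequency to the nearest MIDI note number. This is the standard equal-temperament formula rather than anything IVL have published about the Pitchrider's internals:

```python
import math

# Generic frequency-to-MIDI-note conversion (standard equal temperament,
# A4 = 440 Hz = MIDI note 69) — the core job a pitch-to-MIDI box like
# the Pitchrider performs, though IVL's actual algorithm isn't public.
def freq_to_midi_note(freq_hz):
    """Round a detected pitch to the nearest MIDI note number."""
    return round(69 + 12 * math.log2(freq_hz / 440.0))

print(freq_to_midi_note(440.0))    # 69: A above middle C
print(freq_to_midi_note(261.63))   # 60: middle C
```

The hard part, of course, isn't this sum but detecting the pitch reliably in the first place — which is where IVL's tracking speed earns its keep.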
It shouldn't cost too much, either, but it's only a part of what IVL are capable of achieving. The Pitchrider 7000, a more complex machine based on similar principles, is a fully polyphonic guitar-to-MIDI interface that should enable you to play any MIDI synth or drum machine from a guitar. The system employs a pickup fitted close to the bridge of the guitar, and can track lead lines very quickly, reproduce chords faithfully, and even follow string-bending. You can assign each string a different MIDI channel so that each can be used to manipulate a different sound source, and in the same way that the 2000 has already forced several brass and wind players (Buxton included) and even singers to rethink their playing technique, so the Pitchrider 7000 should do the same for guitarists, a breed of musicians that is notoriously set in its ways.
Show Report by Ron Briefel