Where MIDI meets Video...
Two ways to control video images using the existing musical interface.
Although presently consigned to the realms of individual experimentation, MIDI control of visuals is a reality. Simon Trask investigates two systems which are putting MIDI in the picture...
In a hi-tech world where 'standards' all too often turn out to be anything but, the Musical Instrument Digital Interface - MIDI - has been a major success story. Over the years it has steadily consolidated its presence, expanding into just about every area of studio recording technology, from effects processors to mixing desks to patchbays to tape machines. Today, MIDI is used daily in thousands of studios around the world. Not only has it proved to be robust, reliable and straightforward, but it also has no competing 'standard' to weaken its authority.
Now MIDI is showing signs of developing into a de facto multimedia control and interfacing standard, as developers working in video and lighting begin to adopt it. In all likelihood, the artist of the future will be someone who has both the technical and the aesthetic ability to work simultaneously in several different media, combining music, video, stills, graphics and lighting in an integrated artistic experience. With MIDI's growing significance as a unifying standard, musicians already familiar with MIDI technology are well placed to start exploring the creative possibilities of a mixed-media setup.
We'll be looking at MIDI-controlled lighting in a future issue. This month, however, we're going to focus on MIDI and video - specifically, on two very different systems currently under development, both of which can be controlled via MIDI.
Working from his flat in north London, David James has written custom software on his Atari 520STE which puts not only live or pre-recorded video but also computer-generated graphics and scanned images under MIDI control. With a keyboard, drum machine or other MIDI source connected to the STE's MIDI In socket, MIDI notes can be used to freeze incoming real-time video frames or to call images and graphics up on screen.
It's also possible to change the colours for an image or progressively inject a particular colour into it, to move sprites (graphic objects) around the screen, and to trigger visual FX such as scrolling and pulsating images - again using MIDI note information.
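David's actual software is written in STOS BASIC and machine code, but the core idea - dispatching incoming MIDI notes to visual actions - can be sketched in a few lines. Everything below (function names, note assignments, the return strings) is an invented illustration of the technique, not his code.

```python
# Hypothetical sketch of a MIDI note-to-visual-action dispatcher,
# in the spirit of David James's software (all names invented).

def freeze_frame():
    return "freeze"

def show_image(name):
    return f"show:{name}"

# Map MIDI note numbers (0-127) to visual actions.
NOTE_ACTIONS = {
    60: freeze_frame,                     # middle C freezes the live video
    62: lambda: show_image("waterfall"),  # D calls up a stored picture
    64: lambda: show_image("clouds"),
}

def handle_midi_message(status, data1, data2):
    """Act only on Note On messages (status 0x9n) with non-zero velocity."""
    is_note_on = (status & 0xF0) == 0x90 and data2 > 0
    if is_note_on and data1 in NOTE_ACTIONS:
        return NOTE_ACTIONS[data1]()
    return None
```

Note that a Note On with velocity zero is treated as a Note Off, as the MIDI specification allows, so releasing a key does not retrigger the effect.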
David's software has already been used live on tour by The Grid; in this instance a video projector and a bank of 20 TV screens were used to display the visuals, which were controlled from a MIDI keyboard by Richard Norris.
One of the programs David provided put a collection of individual words under MIDI note control, allowing messages to be 'played' onscreen live from the keyboard. In fact, he is more interested in pursuing this sort of collaborative effort than in marketing his software commercially...
"The ideas behind the software date back to the early 1980s, when I began experimenting with ways of linking music to colour," David explains. "I'd connect the audio outputs from an amplifier to the beam deflector coils on a colour TV and get these beautiful patterns occurring on the screen in time to the music. And I also got the screen to change colour as the frequency balance of the music changed, by routing the amplifier through a 3-way frequency splitter which I connected to the red, green and blue colour guns on the TV."
These experiments led to an art exhibition in Greenwich Village, New York, in 1982. David's attempts to interest clubs in his hybrid amplifier/TV systems met with poor response when they discovered that the TVs would have to be modified. He subsequently developed a system which put all the modifying electronics in an external box which sat between amplifier audio output and TV aerial input; this system, too, went on show in a New York art gallery for a few months.
David began experimenting with MIDI and computers in 1990, when he developed software which allowed him to use his Roland GR-50 guitar synth to trigger onscreen colours, sprites and shapes. A subsequent program converted notes from the guitar (transmitting in MIDI-mono mode) into finger positions on an onscreen fretboard, displaying each pitch in a different colour.
From there he gradually developed the library of routines which he uses today...
"The software's written in a mixture of STOS BASIC and machine code, with special MIDI and video extensions," he explains. "For live video input to the computer from a camcorder or VCR I use a Rombo VIDI-ST monochrome digitiser, which comes with colourising software; you can get that for about £80.
"I had to develop my own connection to the STE's cartridge port, though, because if you plug the digitiser in directly it blocks the MIDI ports. The pre-scanned colour video images were produced using an RGB colour splitter box, also from Rombo, which costs about £60. My STE is up-graded to 4Mb RAM so as to hold up to 120 colour pictures in memory. For creating artwork I use the Neochrome and Degas Elite art packages; the MIDI-linked words for The Grid were created with Calamus DTP software."
David is continuing to add new visual effects to his software library. One such effect randomly intercuts live video with short video loops 'sampled' from the digitiser input; another, which he is currently working on, puts a computer-generated landscape under MIDI control, with pitchbend manipulating picture orientation. He also envisages creating a more sophisticated setup with a genlock unit to mix video and graphics and a MIDI-controlled video switcher box to allow rapid selection of different video sources.
Meanwhile, on an industrial estate in Edmonton, north London, a more ambitious - and significantly more expensive - experiment in MIDI-controlled video is taking place. Virtual Vision is a multitrack digital video edit/playback system which utilises five laserdisc players under computer control, playing back on a row of five video monitors. Virtual Zone, the company responsible for developing Virtual Vision, are based in the seemingly unlikely setting of a roller-skating rink; however, the rink also doubles as a rave venue, providing the company with the perfect 'test bed' for their system.
"Hi-tech musicians are well placed to take advantage of the sort of MIDI-connected mixed-media systems which are starting to emerge."
The guiding force behind Virtual Vision is Patrick D Martin, who, long-time readers may recall, appeared previously in MT (December '88 issue) in an article on the Psychomobile - an early incarnation of the Virtual Vision technology and concepts.
Virtual Zone are, says Patrick, "interested in the complete integration of the musical structure into the video editing structure. Virtual Vision is really more of a compositional medium, where the pictures can influence the composition of the sound and the composition of the sound can influence the sequencing and content of the pictures. You really have to do both at the same time in order to come up with good stuff.
"The time has come where it's no longer good enough just to be a band in the musical sense. What is really needed in a band these days is technically competent and visually literate people who are able to compose visually and aurally at the same time."
Virtual Vision requires video material to be originated externally. Video rushes can then be compiled on a laserdisc recorder, which is a write-once multisession system; a database of takes can then be put together within Virtual Zone's custom software, which runs on an Atari TT.
Takes are defined in terms of SMPTE time locations, and can be given descriptions which you can subsequently search on - so, for instance, you could easily call up any takes containing shots of waterfalls or clouds if they were identified as such. Once you've identified a take that you want to use, you can slot it into one of five video tracks, which are presented on-screen in a 'grid edit' type of format which many MIDI sequencer users will be familiar with. Virtual Zone's software also allows you to decide which of the five available screens you want a take to appear on - any videodisc player can be routed to any screen by means of a video switching matrix which is under software command.
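The take database amounts to a list of timecoded, described clips that can be searched by keyword. The sketch below is a guess at that structure in miniature - the field names, timecodes and sample takes are all invented for illustration, not taken from Virtual Zone's software.

```python
# Hypothetical sketch of a Virtual Vision-style take database: each take
# is a span of SMPTE timecode on a laserdisc, tagged with a free-text
# description that can later be searched (all details invented).
from dataclasses import dataclass

@dataclass
class Take:
    disc: int          # which of the five laserdisc players holds it
    start: str         # SMPTE in point, e.g. "00:01:23:10"
    end: str           # SMPTE out point
    description: str   # free text used for searching

def find_takes(takes, keyword):
    """Return all takes whose description mentions the keyword."""
    return [t for t in takes if keyword.lower() in t.description.lower()]

library = [
    Take(1, "00:01:23:10", "00:01:30:00", "waterfall, wide shot"),
    Take(2, "00:04:10:05", "00:04:18:12", "clouds over city"),
    Take(1, "00:07:02:00", "00:07:09:24", "waterfall close-up"),
]
```

Searching the library for "waterfall" would return both waterfall takes, ready to be slotted into one of the five video tracks.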
The Virtual Vision setup, which includes Steinberg's Cubase sequencing software running on an Atari ST computer, is synchronised from a master SMPTE timecode generator; in this way, music can be developed in tandem with multitrack video. Patrick would like to take this audio/video integration a step further by allowing cut-and-paste editing on Cubase to be 'mirrored' in their video editing software - a step which would require the co-operation of Steinberg.
Virtual Zone have also built MIDI control of video playback into their system. Notes played into Cubase from a MIDI source, such as a keyboard or a percussion controller, can be routed via MIDI to the company's custom software on the TT. This software converts selected MIDI notes to commands recognised by the laserdisc players and the switching matrix, and then transmits these commands to the video equipment via an RS422 link.
As well as putting the routing of laserdisc players to screens under MIDI control, this setup allows laserdisc playback functions such as freeze frame and play forward/reverse to be triggered from MIDI notes. The players respond instantly to MIDI commands, allowing very precise rhythmic video effects to be created live from a 'front end' which will be familiar to all hi-tech musicians. These effects can, of course, be recorded as an integral part of a Cubase sequence.
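The TT software's translation role - selected MIDI notes in, laserdisc and matrix commands out over RS422 - can be pictured as a simple lookup table. The command mnemonics, note assignments and tuple layout below are invented for illustration; the real player protocol is not documented in the article.

```python
# Hypothetical sketch of the note-to-command translation performed by
# Virtual Zone's TT software before transmission over the RS422 link
# (device names, mnemonics and note mappings all invented).

MIDI_COMMANDS = {
    48: ("player", 1, "STILL"),      # C2 freezes laserdisc player 1
    49: ("player", 1, "PLAY_FWD"),   # C#2 plays it forward
    50: ("player", 1, "PLAY_REV"),   # D2 plays it in reverse
    72: ("matrix", 3, "SCREEN_2"),   # C5 routes player 3 to screen 2
}

def midi_note_to_command(note):
    """Return (device, unit, command) for a mapped note, else None."""
    return MIDI_COMMANDS.get(note)
```

Because each mapped note resolves directly to one transport or routing command, a rhythmic phrase played on a keyboard becomes an equally rhythmic sequence of video events - and, recorded into Cubase, it replays identically every time.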
"Cubase users will take to this system like a duck to water," comments Patrick, highlighting the fact that hi-tech musicians are well placed to deal with MIDI-orientated audio/video systems like Virtual Vision. "One of the marvellous things about MIDI is that everybody agrees it is a standard, which of course is not the case with so-called computer communications 'standards' in IBM-land."
Patrick envisages a broader Virtual Vision system which would integrate automated lighting control into their existing audio and video setup, again using MIDI as the binding force. Other plans for Virtual Vision include using 3D sound and integrating real-time DVEs (Digital Video Effects) into the setup by placing five effects processors in between the switching matrix and the five monitors. Virtual Zone have already been experimenting with a 3D sound system which can position sound anywhere in a horizontal plane using a 4-speaker setup. At present the only way they can position individual sounds in different locations is to build up a multitrack recording, processing one sound at a time; on the basis of a tape I heard played back on the company's 4-speaker system, the results are certainly impressive.
Virtual Zone's financial backers, a company called PSI, are involved in maintaining international satellite networks, and in transferring highly sensitive information such as bank data via satellite. PSI have already experimented with relaying artistic events to venues around the world via satellite uplinks. Patrick envisages Virtual Zone plugging into this network, and talks enthusiastically of completely by-passing the established chains of distribution. As he explains it, multichannel artistic works created using Virtual Vision could be uplinked from any location - even a roller-skating rink in Edmonton - to subscribing venues around the world, who of course would need the necessary decoder and playback equipment.
At present the available technology allows only three synchronised video channels to be transmitted live via satellite link, but Patrick confidently predicts that it will soon be possible to deliver all five channels. At the same time, he acknowledges that what technology makes possible and what people actually want are not necessarily one and the same thing.
At present, Virtual Vision is a system in search of artistically convincing applications. Virtual Zone don't plan to sell it commercially; instead, their aim is to bring artists in and let them experiment with it, with a view to stimulating the development of a new artform based around multichannel video. In the longer term, they envisage generating money from the global distribution of artworks in this medium via the satellite network. More prosaically, Virtual Vision could become a new presentation medium for big-business product launches.
The challenge for forward-thinking artists today is to take the new media technologies and develop new forms of art which go beyond the tired promo music video format and the flashy but often superficial rave-type videos. Working with different media interactively is surely the way forward, and in their own ways David James and Virtual Zone are both opening up new possibilities for musicians and other artists who want to work in this way.
What is becoming clear is that MIDI has a crucial unifying role to play here, and that hi-tech musicians are well placed to take advantage of the sort of MIDI-connected mixed-media systems which are starting to emerge. If sequencer manufacturers start to think beyond the purely musical market and develop 'multimedia MIDI sequencers' with graphic front-ends optimised to sequence not only music but video and lighting, perhaps then the era of the true multimedia artist will start to dawn...
Feature by Simon Trask