Ever since the invention of the wax cylinder, listening to recorded music has been a passive experience; now CDs are capable of holding MIDI and other performance data. Bob O'Donnell looks at the changing face of music.
New forms of media, such as CD-I and CD+MIDI, offer revolutionary methods of presenting music that may have significant effects on the future of music.
NO PRIZES FOR recognising that synths, samplers and sequencers have had tremendous impact on how music is composed and produced. But how much have they affected the musical output of composers and players who use them? Yes, the sounds are different, the notes are more precise and the speed at which musicians are able to work has been greatly increased, but has any really new music resulted from these developments? Is there anything being created now that wasn't being created prior to the development and commercialisation of synthesisers, sequencers, drum machines and other MIDI equipment?
And what effects are new technical developments revolving around the CD, such as Compact Disc-Interactive and Compact Discs encoded with MIDI, likely to have? Here, it seems, opportunities for creating entirely new genres of music are opening up - as is the potential to completely redefine the listening process. By forcing composers to rethink their conceptual notions of music, and by giving listeners the opportunity of involving themselves (for example, deciding how a piece should proceed over time), these new "media" could add a new dimension to music - that of interactivity.
INTERACTIVITY HAS BECOME another buzzword in the computer world. Interactive devices like Apple's Hypercard and other hypertext programs are causing people to reconsider how large amounts of information should be presented, stored and retrieved. Hypertext is defined as inter-related text information which you can read or access in any order - it need not be from beginning to end. Systems which use hypertext allow you to find stored information quickly, in a way that mirrors how your mind works.
A hypertext program like Hypercard doesn't create anything new, but it allows you to browse through the information that you're interested in and ignore that which you're not. Depending on the specific application, it also allows you to delve as deeply into a topic as you want. Finally, and perhaps most significantly, it gives you control over when and in what order you want to access that information - which is why it's referred to as interactive.
Now here's where the new versions of CDs come into play. In theory, both CD+MIDI and CD-I discs will offer you the option, as a listener, to involve yourself with how the "musical information" on the CDs is presented. In the case of CD+MIDI, it may only be choosing which timbres are being played by the synth you've got plugged into the MIDI ports on your new multi-purpose CD player. But the possibilities for CD+MIDI and CD-I go way beyond this.
Imagine, for example, having control over whether or not a particular musical line is played throughout a piece of music as well as determining which of your connected MIDI synths will act as the sound source for it. The possibilities are staggering.
I suspect that within a few years, CD-I and other hypertext developments will change the way we as a society think about information (musical or otherwise) and its presentation. The Age of Information is about to mature. If you're in any doubt, consider that the latest IMA Bulletin (the official newsletter of the International MIDI Association) devotes its front page to CD+MIDI.
Before we can start predicting where music may be going, we need to try and put the march of music technology into perspective. First, the tools that you and I now have to work with to create our music have had a profound impact on how every style of music is composed and recorded. The quality of electronic instruments has improved immensely over the last few years, as has the degree of control offered by sequencers and other processing devices over them. Musicians now ignore the technology at their own risk.
"The only way new genres of music are developed is when musicians start thinking about music in a new way."
On the other hand, it seems that, despite the many creative possibilities offered by features like real-time System Exclusive control, most people are using their instruments for the simplest of applications. In the case of synths and samplers, many musicians are still buying new instruments just to gain access to the latest sounds. Developments in MIDI software seem to be a bit more encouraging; some fascinating new programs have been produced which are giving people a chance to view the music-making process from a new perspective - algorithmic composers (like Intelligent Music's M), for example. There's still a great deal of work to be done, however, and I suspect only a dedicated few have investigated this kind of development.
The bottom line seems to be that most people are satisfied with using new equipment to make the composition and recording of existing styles of music easier. Now, that's important - even revolutionary in its own way - but it's time to move beyond these basic applications.
The instruments and software packages themselves don't have inherent capabilities to produce new types of music - they're only tools - but what the gear can do is inspire us to think in different ways. That's the real beauty of sophisticated new instruments and new software.
THE ONLY way new musical avenues can be developed is when musicians start thinking about music in a new way. To begin with, we need to throw aside old "linear" notions of how music should proceed and think about music as malleable, evolving material. A piece of music need not be a fixed entity; it can change and be transformed on different listenings. Over the last few decades, composers such as John Cage, Karlheinz Stockhausen and Brian Eno have recognised this and have incorporated elements of chance and personal decision into their compositions, so that each time the pieces are performed, they are different - in these cases, the performers are the ones who have had the control over how a piece of music can evolve over time. With the CD-I and CD+MIDI formats, that control could be transferred to the listener.
The concept may sound a little far fetched at first, but as with other developments in the "arts", what begins as an avant-garde development often later becomes part of the mainstream.
To offer this kind of control over recorded music, however, there needs to be a way to store, retrieve and process the various musical options. Because CDs store all the music as digital audio (in other words, as numbers), the answer is simple - consider the musical composition as a program that contains numerical data, and remember that data can be manipulated in a variety of ways given the proper tools. If you work with MIDI, this analogy shouldn't be too far-fetched, because it's exactly what MIDI does - it converts musical performance into a language of computer data that can be transmitted to and understood by a variety of different types of machine. With sequencers you can record that information for future use, and with notation programs you can translate that data into a form that is intuitive and readable by most musicians.
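To see how literally "performance as numbers" should be taken, here is a small modern sketch in Python. The message layout (a three-byte note-on) comes from the MIDI 1.0 specification; the transposition function is an invented example of the kind of manipulation the text describes, not part of any standard.

```python
# A MIDI note-on message is just three numbers: a status byte carrying
# the channel, a note number, and a velocity. Because it's only data,
# it can be stored, copied and reworked like any other data.

def note_on(channel, pitch, velocity):
    """Build a 3-byte MIDI note-on message for the given channel (0-15)."""
    return bytes([0x90 | channel, pitch, velocity])

def transpose(message, semitones):
    """Return a copy of a note-on message shifted by some semitones."""
    status, pitch, velocity = message
    return bytes([status, pitch + semitones, velocity])

middle_c = note_on(0, 60, 100)       # note number 60 is middle C
up_a_fifth = transpose(middle_c, 7)  # same performance data, reworked
print(list(middle_c), list(up_a_fifth))
```

A sequencer, in this view, is simply a program that records lists of such messages with timestamps and plays them back.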
"It's time to put the march of technology into perspective and see exactly where it's taking us."
Until recently, most of the manipulations of this musical data - such as digital signal processing of audio material and sequencing of MIDI material - have been based on predetermined instructions that you tell the instrument or computer in question to perform. Some of the randomising features found in patch editors, and many of the functions offered by algorithmic composition programs have brought in a sense of interactive creation - you select what parameters you want to randomise or what kind of randomisation you want to perform on a given group of notes - but the new CDs should allow interaction on a more global level.
A CD CAN BE considered to be like a high-density floppy disk - it holds up to 660 Megabytes of data. The specific type of data it can store is not restricted to digital audio, however; digital video information, computer software, MIDI data and "ordinary" computer data can all be written onto a CD, alone or in various combinations. As a result, we have CD-ROM (which stores large chunks of computer data), CD-V (where "V" stands for full-motion video, like Laserdiscs) and CD+G (where "G" represents video graphics, though this is more like a slide show than full-motion video), which are all variations on the technology. Of course, you need the appropriate hardware to take advantage of these different types of data - an ordinary CD audio player won't be able to make use of anything but digitally-encoded audio - but the potential for numerous applications is there.
CD+G and CD+MIDI, both of which were developed by Warner New Media in America, are actually very closely related to CD audio. When the specification for CD audio discs was developed, 5% of the storage capacity was set aside for what was termed "subcode". (In fact, some CD players have subcode outputs.) Up until now this subcode storage hasn't really been used for anything, so this is where Warner New Media chose to locate the graphics or the MIDI data. Up to 16 channels of MIDI information can be stored on a CD+MIDI disc, and they can either be played in conjunction with or independently of the digital audio tracks stored on the disc.
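The article doesn't specify how a CD+MIDI player would let you play those 16 channels selectively, so the following is only a hypothetical sketch. It relies on one real property of MIDI: the channel number of a channel message lives in the low four bits of its status byte, so a stored stream can be filtered by channel before it reaches your synths.

```python
# Hypothetical sketch: pulling selected channels out of a stored MIDI
# stream, the way a CD+MIDI player might let a listener solo or mute
# the lines encoded in the disc's subcode.

def channel_of(message):
    """Extract the 0-15 channel number from a MIDI channel message."""
    return message[0] & 0x0F

def filter_channels(stream, wanted):
    """Keep only the messages on the channels the listener selected."""
    return [msg for msg in stream if channel_of(msg) in wanted]

stream = [bytes([0x90, 60, 100]),   # note-on, channel 0
          bytes([0x91, 64, 100]),   # note-on, channel 1
          bytes([0x92, 67, 100])]   # note-on, channel 2
solo_line = filter_channels(stream, {1})
print([channel_of(m) for m in solo_line])
```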
CD-ROM, CD-V and CD-I, on the other hand, have entirely different specifications for storing the information on the disc (including several levels of audio quality standards) and consequently, require new types of CD players. CD-I combines digital audio, video and computer code onto a single disc and permits you to interactively choose which portions of the information you want to access at a particular time. As a result, a CD-I player must incorporate a CPU (Central Processing Unit or computer "brain"), as well as the appropriate audio and video hardware.
Unlike the CD+MIDI standard, which has not yet been finalised, the CD-I specification has been written. A solid base of CD-I products may not appear, however, until late this year, while CD+MIDI hardware and software is expected to surface more quickly.
IMAGINE A FORM of music where you have control over certain of a piece's most important characteristics: the instrumentation, arrangement, structure...
"CD-I and CD+MIDI discs offer control over instrumentation and structure of a piece of music."
Both CD-I discs and CD+MIDI discs will (theoretically at least) give you this kind of power. In the case of CD-I, the material you could control would be the recorded digital audio tracks, while the MIDI data would be the only thing coming from a CD+MIDI disc that you could alter. In either case, you couldn't create new musical options - neither type of disc will create something from nothing - but you could create a new version of the piece, depending on how you combine or change the existing materials. In the case of CD+MIDI discs the processing would occur outside the player, because you cannot change the MIDI data that's encoded onto the CD - it would essentially work as a playback-only sequencer. CD-I players, on the other hand, will incorporate a CPU for processing functions, but again, the options would have to be recorded somewhere on the disc.
After giving the subject just a little bit of thought, I came up with several different levels of interactivity which these discs might provide. First, on a very basic level, imagine a jazz CD that would give you the option of listening to any one of ten different improvised solos for each cut. Each time you listened to that piece of music it would be slightly different, or if there was a particular solo you liked, you could program the CD player to always play your choice. On a slightly more involved level, a CD+MIDI disc would allow you to adjust the mix of a particular cut, or perhaps its instrumentation (by externally changing MIDI volume levels and patches). Finally, for the adventurous, it would be possible to go even further. You could have various sections of music, or a number of musical lines which could be juxtaposed in a variety of different ways by intelligent algorithms under your own control. (You could think of it as Algorithmic Playback software.) Of course, there are plenty more potential applications, but these should give you food for thought.
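The first level of interactivity described above - ten solos per cut, with the option of fixing a favourite - can be sketched in a few lines. Everything here is invented for illustration (the solo list, the function, its parameters); it simply shows the shape of such "Algorithmic Playback" logic.

```python
import random

# Toy model of an interactive disc: each cut carries several recorded
# solos, and the player either picks one afresh per listen or always
# plays the one the listener has programmed in.

SOLOS = [f"solo_{n}" for n in range(10)]  # ten improvised takes of one cut

def play_cut(favourite=None, rng=random):
    """Return the solo to play: the fixed favourite, or a fresh choice."""
    if favourite is not None:
        return SOLOS[favourite]
    return rng.choice(SOLOS)

print(play_cut())             # different listens may differ
print(play_cut(favourite=3))  # programmed to always play your choice
```

The more adventurous level - juxtaposing sections under intelligent algorithms - would replace the simple `rng.choice` with rules about which sections may follow which, but the principle of recorded options plus a selection mechanism is the same.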
For the composer, interactive music would present a number of new challenges. First, its acceptance would mean giving up complete control over the end result - the listener could have almost as much say in the final output of music as the composer. In fact, the line between composer and listener would become blurred as the listener started to take part in the creative process. Second, it would require the conception and development of new forms with which to organise your musical ideas. As composers we would have to start thinking on a more abstract level. Finally, the new music would require new skills for us to master. In addition to creating the musical content, we would need to understand and control the various permutations that the media offers.
In return for our efforts, however, we would gain the satisfaction of knowing that we had created an evolving work of art. The new dimension of interactive music could be akin to sculpture, where your viewing perspective alters the appearance of the work.
Interestingly enough, an example of interactive music already exists - though it's not available on a compact disc. Laurie Spiegel's Music Mouse program for the Macintosh and Amiga does provide some interactive capabilities. When you move the mouse, the program will play musical notes (either on the computer or over MIDI) that are treated by a harmony-generating algorithm. Control parameters of the algorithm, which provide different types of harmony, counterpoint and orchestration, are accessible in real time from the computer keyboard. According to its creator, it can be thought of as a new type of musical creation that lies somewhere between a composition, a compositional method and a musical instrument.
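Music Mouse's actual algorithms aren't described in the article, so the following is only a guess at the flavour of the idea: player input (here, a horizontal mouse position) is mapped onto pitches, and a constraint keeps every note inside a chosen harmony - here, C major - no matter where the player moves.

```python
# Illustrative sketch (not Spiegel's algorithm): quantise a continuous
# position onto the steps of a scale, so free gestures always land on
# harmonically "legal" notes.

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # scale degrees as semitone offsets

def pitch_for(x, low=48, span=4 * 7):
    """Map a horizontal position (0.0-1.0) to a MIDI note in C major."""
    step = int(x * span)                         # which scale step we're on
    octave, degree = divmod(step, len(C_MAJOR))  # wrap steps into octaves
    return low + 12 * octave + C_MAJOR[degree]

# Dragging the mouse left to right walks up the scale, never off-key:
print([pitch_for(x / 10) for x in range(10)])
```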
I HAVE INTENTIONALLY skimmed over a number of different topics that eventually need to be addressed when talking about these developments: it may be quite a while before these new formats become viable; they may demand too much of the general consumer to ever become popular; and the possibility of tying this all together with video images adds yet another dimension. The point, however, is that we can and should start thinking about how technology affects music. These new types of media could affect what we play and listen to in a very profound and positive way.
It's time to move beyond thinking about developments in technology as the latest and greatest sounds, or the trickiest new sequencer editing feature. New sounds and features are necessary and they will continue to inspire us, but I think we're reaching (or have reached) saturation point. What ought to happen now is that developments should encourage new ideas about music itself. This is much harder to do, of course, and may take quite a while to reach fruition, but the time for interactive music, it seems, is at hand.
Feature by Bob O'Donnell