
The Prophet And The Rising Sun (Part 2)

Dave Smith

In the second part of this exclusive MT interview, the man behind the Prophet 5 and MIDI discusses MIDI, user interfaces and programming. Simon Trask listens in on the thoughts of the vice-president of Korg R&D.


IN THE CONCLUSION OF THIS TWO-PART INTERVIEW, SEQUENTIAL CIRCUITS FOUNDER AND CURRENT VICE-PRESIDENT OF KORG R&D DAVE SMITH TALKS ABOUT MIDI, POLYPHONY, USER INTERFACES AND PROGRAMMING.


DAVE SMITH GRADUATED from the University of California, Berkeley, in 1971 with a degree in electronic engineering and computer science, and started working in the electronics industry in Silicon Valley. Shortly afterwards the Minimoog was released, and seeing in it an opportunity to combine his interests in electronics and music, he bought one and began designing add-ons for it - purely for his own use, in his spare time.

Smith founded Sequential Circuits in 1974 as a part-time, one-man operation working out of his home. His first development was the Model 600 analogue sequencer, a CV-based unit for use with the Minimoog, which offered three banks of 16 notes. The Model 800 sequencer followed around two years later, offering 256 notes organised as 16 x 16. Where Smith only sold four or five Model 600 units, the Model 800 sold around 200. He also produced the Model 70 Programmer for the Minimoog and ARP 2600, again selling a couple of hundred units.

However, it wasn't until April '77 that he was able to quit working as an engineer and treat Sequential as a full-time concern, moving into modest business premises in San Jose and taking on a couple of part-time staff. It was at this point that he started thinking about the synth that was to become the Prophet 5.

"It actually worked out nice", Smith recalls, "I was able to take my time and learn how to do things in the beginning, because I wasn't counting on it to make money since I had a regular day job."

Smith struck it big with the Prophet 5, but could such a story happen today? With the ever-greater R&D investment needed to generate technological breakthroughs these days, isn't it inevitable that the burden of significant technological development will increasingly fall on the shoulders of the big companies, and that consequently such development will become more centralised?

"There's still always the niche markets", Smith contends. "Probably one of MIDI's biggest contributions has been that it opened up that whole cottage industry where one or two people could get together and do this little piece of software or this little hardware box. But to do a real full-blown instrument is pretty much out of the question these days."

Talking of boxes, isn't there a certain irony in the fact that MIDI was supposed to do away with them, yet in practice it's spawned a whole new generation of the things: MIDI Thru boxes, MIDI patchbays, MIDI mergers, MIDI filters, MIDI-to-CV converters...

"We kind of envisioned that", Smith replies. "MIDI was supposed to be real simple to do on straightforward topologies where you only had two or three things going on, but obviously if you're going to do a lot of things... In fact, at Sequential we thought MIDI connections should always be in a star network. It was the Japanese who were big on daisy-chaining - that's where the MIDI Thru came from. We always wanted it to be one instrument on one cable. If people did that and the processors in the synthesisers were fast enough to do even the simplest filtering on continuous controllers, there would really essentially be no MIDI timing problems. Some people still think there are, but really that isn't a limitation of MIDI, at least not the way we were envisioning it."
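The controller filtering Smith mentions can be sketched as "thinning" a dense stream of MIDI Continuous Controller messages so that only meaningful value changes are forwarded. A minimal illustration, assuming a simple (channel, controller, value) tuple format and an arbitrary threshold - this is not any particular instrument's implementation:

```python
def thin_controllers(messages, threshold=2):
    """Drop CC messages whose value is within `threshold` of the last
    value forwarded for the same (channel, controller) pair."""
    last = {}   # (channel, controller) -> last forwarded value
    kept = []
    for channel, controller, value in messages:
        key = (channel, controller)
        if key not in last or abs(value - last[key]) >= threshold:
            last[key] = value
            kept.append((channel, controller, value))
    return kept

# A mod-wheel sweep (CC 1 on channel 0) sending every intermediate value:
stream = [(0, 1, v) for v in range(0, 20)]
print(len(thin_controllers(stream)))  # -> 10, half the original traffic
```

Even this crude filter halves the bandwidth a controller sweep consumes, which is the kind of load reduction Smith argues would have made MIDI timing a non-issue on a one-instrument-per-cable star network.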

But could the MIDI pioneers possibly have foreseen the explosion of applications for the Musical Instrument Digital Interface? The very name suggests that the original concept has been somewhat superseded.

"We had a lot of it in mind", Smith replies. "Part of the reason for MIDI was the home computers coming in, so we knew there would be a lot of things you could do with it. We may not have actually envisioned something like MIDI Time Code being used, but we certainly knew that MIDI would be used as a clocking device as well as for notes. And, of course, we knew it would be used for sequencing and for patch saving. I wouldn't have guessed that MIDI would take over the whole studio quite as much as it has, that it would get built into tape decks and mixers, nor that it would be included in some of the new CDI stuff or on lighting controllers."

The omnipresence of MIDI as a simultaneously unifying and diversifying force in today's hi-tech musical world is a validation of the perception which Smith and his colleagues had in the early '80s.

"We all knew that if we really wanted the market to go somewhere we could help it along quite a bit by having something where everything could talk together. At that time we all had our own interfaces, but we were smart enough to say it was silly, that there was no need for it to be that way.

"It was mostly the big companies that were thinking this at the time, whereas a lot of the small companies were saying 'no, we're not going to do that, it's not good enough, it's not fast enough'. The big companies were all able to see that if they compromised they'd be able to make something reasonable, and if we wanted it to take off then it really was necessary to do that.

"The industry was small enough at the time that we were able to do it. Even now, let's face it, this isn't a huge conglomerate marketplace, it's a really small, specialised marketplace. On that level it's a lot easier to do things than if you're building three million compact disc players a week or whatever, which is a whole different ballgame. See, we were lucky. There were only five companies involved in the development of MIDI, and out of those companies it was Roland and Sequential who did much of the work."

Such co-operation wasn't without its difficulties, however. Smith found that he had to push for the inclusion of what became MIDI mode 4, or multitimbral mono mode, without letting on why he wanted it included.

"At Sequential we were envisioning the day when all synthesisers would be multitimbral and you could do a bunch of stuff on one instrument, but we didn't want to lecture the Japanese too much on it because then we'd be saying 'what you really should be doing is designing a box like this'. So it was a little tricky! In fact, there was some confusion about what mono mode was supposed to be."

Over the years, mode four has become the MIDI guitar mode by default, but Smith, who started his musical life playing guitar and bass, claims that that wasn't its original purpose. He also confirms the old story about how MIDI mode two came into being:

"Yamaha completely misinterpreted mono mode. They had it as a mono mode that was really monophonic, on one MIDI channel."
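The four reception modes under discussion are selected with MIDI Channel Mode messages, sent as controllers 124-127; mode 4 is the combination Omni Off + Mono On. A sketch of building those raw byte sequences - the helper function is ours for illustration, but the controller numbers come from the MIDI 1.0 specification:

```python
# MIDI Channel Mode controller numbers (per the MIDI 1.0 spec):
OMNI_OFF, OMNI_ON, MONO_ON, POLY_ON = 124, 125, 126, 127

def mode_message(channel, controller, value=0):
    """Return the three raw bytes of a Channel Mode message
    (0xBn status byte, controller number, value)."""
    return bytes([0xB0 | (channel & 0x0F), controller, value & 0x7F])

# Put a receiver on channel 0 into mode 4: Omni Off, then Mono On.
# For Mono On the value is the number of channels to use (0 = one per voice).
mode4 = mode_message(0, OMNI_OFF) + mode_message(0, MONO_ON, value=0)
print(mode4.hex())  # -> b07c00b07e00
```

Yamaha's misreading, as Smith describes it, was to treat Mono On as making the whole instrument monophonic on a single channel, rather than as one monophonic voice per channel across several channels.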

If such confusion could exist among a relatively small number of manufacturers, what chance would a MIDI II stand today?

"I wouldn't even want to start a MIDI II now", Smith says. "I suppose someday somebody's going to have to do it, and good luck to them. It's going to take a lot of work, and because there's so many more people involved now it's going to be real hard. You have all these small companies, and I'm not saying this is bad, but all of them are going to want to have input, which is going to make it hard for everybody to agree on anything. Personally I don't think the current version is all that bad. It works and it will continue to work for a long time, and of course it's cost-effective. Looking back, there were some rough spots and there continue to be some rough spots, but I think what people have to do is stand back and compare it to any other industry, then they'll realise how remarkable it is. Ultimately MIDI can never be universal, because everybody designs things differently, but I think it does pretty well."



"PART OF THE REASON FOR MIDI WAS THE HOME COMPUTERS COMING IN, SO WE KNEW THERE WOULD BE A LOT OF THINGS YOU COULD DO WITH IT."


And who did actually come up with the acronym MIDI?

"I did", Smith replies. "I remember the meeting. It was at our factory, and Kakehashi from Roland was there. The Japanese had presented the name UMI, for Universal Musical Interface, and they thought it was cute because of the double meaning with 'you-me'. We kind of cringed at that, so we sat there and bounced a couple of things around, and all of a sudden Musical Instrument Digital Interface came to me. 'MIDI' had a nice ring to it, and Musical Instrument Digital Interface was specific and yet general-purpose at the same time."

Turning to the subject of polyphony, the textural sophistication available on synths and samplers has seemingly always run ahead of the number of voices available. A good current example of this would be Korg's Wavestation. Does Smith see the number of voices increasing still further in line with the multitimbral and layering possibilities of today's digital synths?

"It's all a matter of time", comes the reply, "but for most keyboard playing I'd argue that you don't really need that much more than what you have now. It's only when you're trying to do everything in one box, if you're trying to drive it from a sequencer and you're trying to do 14-part multitimbrality then yes, you need more voices. But if you're using it as a sit-down-and-play type keyboard, it's plenty, 'cos even when you stack things, the more you stack 'em the less notes you have to play, unless you're really into thickening up the mix. Then you end up with the type of sounds that sound great when you play them in a music store or by yourself, but when you go in and try to lay down tracks they don't fit because there's too much there.

"In real use on a real record it's too much. But if you just have a home studio and you want one box where you can do everything by yourself and drive it from a sequencer, then yeah, 64 voices or 128 voices... The limit there is going to be what it is now, and that's with the microprocessors keeping up with it all. It'll probably make more sense to go out and buy two M3Rs and a Wavestation, buy your individual boxes rather than have it all in one box.

"Besides, most people would rather have the variety in sound, because there's always going to be some sort of a signature sound to any unit out there, and rather than have all 100 voices coming from that one box it might make more sense to separate it. A lot of us get pretty jaded with the voice-count, how many megabytes, how many megahertz, how many patches, how many sounds... A lot of people tend to use those things for gauges, but we feel it doesn't really matter how many ROM waves you have, what really matters is what the instrument sounds like. But we have to play that game to a certain degree, so we have to list all the numbers just to keep up with everybody else. It gets a little crazy."

A lot of people still pine for the old analogue front panels with their sliders. Does Smith feel that a reversion to this kind of approach will happen?

"It's the parameter problem", he contends. "These same people probably want all the control of all the MIDI stuff, and how are you going to do that with knobs? It really isn't conducive to single front-panels. You could do something like Roland have done on the D70 where they have four assignable sliders - that sort of thing makes sense to make it easier to get to things. Obviously there could be bigger screens, more knobs which could be software-programmed, that sort of thing.

"I tend to think a lot of that's a minor part of the equation. I'm looking more for what generates the sound, 'cos if you look back historically speaking at the instruments that have done the best, it's because of the sound. It has nothing to do with the user interface, it has nothing to do with what it looks like, it has nothing to do with any of those things, what the manual's like, how much it costs. None of that counts, the only thing that counts is what it sounds like.

"Obviously we could use that as a way out more than we do, I'm not saying that we don't try to have a real good user-interface, because we do. But the only thing that really matters and that really makes an instrument sell is the sound. I get into trouble for this a lot at work, but I tend to put less emphasis on the user interface, because a lot of it to me is just details.

"What it comes down to is that one way might be a little faster than another. But either way you're not going to be able to program the instrument and make significant changes unless you really understand it. You can have a control to make a sound more or less bright, but what's that really going to buy you except real simple changes? If you came up with a simpler user interface it may help beginners more, and it may get more people deeper because they'd be less scared of it. But if you really want to do something serious, you have to understand the instrument to do it, you have to get down deeper.

"Again, I'm personally not real big on user interfaces. We have some people at work that are real good at it, fortunately. User interfaces always improve, because at each level of technology you get a bigger screen and you get more 'bang for your buck' out of the processor. Personally I really like the user interface we had on the Prophet 3000, where everything's softkey driven - similar to the approach we have on the Wavestation, where it's real concise and it leads you because you can see what your choices are, it's kind of self-teaching. Once you get the basic tree structure down, it's pretty simple."

All the same, one thing that can baffle people with today's instruments is the sheer number of parameters. Perhaps what we need is a way of synthesising sound which doesn't have a great number of parameters but which allows you to create sounds in a more directly musical way.

"There's a lot of talk about doing that", Smith comments, "and I guess we already do a little bit of that on the macro level of things, where you can say 'I want this type of an envelope' and not have to go in and set breakpoints, and the same for filter settings. So we do that to a certain degree.

"The biggest problem is that at some point if you have fewer parameters you have fewer possibilities, that's the bottom line no matter how you look at it. So, the instrument that has fewer parameters is going to be more limited in what you can do with it. I think the idea is to continue what we've been doing - which is to have different levels of programming. On the one hand there's a higher level where it's easier to change things but because there aren't as many possibilities you're not going to get the full range of things, but there's also a deeper level where if you really want to go in and twist things up you can, but you have to know what you're doing and you have to keep track of a lot of things. Something for nothing is a tricky one to do."
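The "different levels of programming" Smith describes can be sketched as a macro layer sitting on top of the full parameter set: one high-level control fans out to several low-level parameters, trading range for ease of use. All parameter names and scalings here are illustrative assumptions, not any real instrument's parameter map:

```python
def apply_brightness(patch, amount):
    """Map a single 0.0-1.0 'brightness' macro onto deeper parameters.
    The underlying parameters remain directly editable for anyone who
    wants to 'go in and twist things up'."""
    edited = dict(patch)
    edited["filter_cutoff"] = int(20 + amount * 107)   # MIDI-style 20..127
    edited["filter_env_amount"] = int(amount * 64)     # 0..64
    edited["osc_harmonics"] = round(1 + amount * 7)    # 1..8
    return edited

patch = {"filter_cutoff": 64, "filter_env_amount": 32, "osc_harmonics": 4}
bright = apply_brightness(patch, 0.9)
```

The trade-off Smith names is visible in the sketch: the macro can only reach the sounds its fixed mapping allows, while editing the three underlying parameters directly covers the instrument's full range.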

The difficulties of programming digital synths from the time of the DX7 on have led to a preset culture, where most people want ready-made sounds on cards and disks. But when everyone has access to the same sounds, you can get a situation where you keep hearing certain sounds coming up on records and in commercials.

"I think it's an unfortunate side-product of the market", admits Smith. "People, understandably, don't have the time to sit there and learn an instrument, because it does take time. I find that myself, now that I've built a little studio in my house and I've been spending more time in it, I'm finding that I do the same thing. The first thing I do is go through factory presets and find something that's close enough, then maybe edit it a little, but it's rare that I say 'Oh, I've got to have a sound like this' and sit down and create it from scratch, because I'm lazy and it's not worth the time, usually. Which I think is another reason why the factory presets are so important now, because as a practical matter that's how it's going to be used most of the time. Coming out with a bad set of sounds can almost kill a new synthesiser these days."

Finally, something a lot of people like is the noise inherent in older technology. Yet digital instruments seem to be getting more and more perfect, with more emphasis being placed on quality. What about an option to 'dirty the sound up a bit'? Smith feels that this is an ongoing trend:

"Each level of digital technology gets a little bit better at what the manufacturers like to call 'analogue' sounds. We're guilty of that, too: here's an analogue brass sound. Everybody does that, but it's true, digital synths are getting more analogue, each generation is a little bit more. Some day there'll probably be a grunge parameter that you can turn to 11 if you want to!"



Music Technology - Copyright: Music Maker Publications (UK), Future Publishing.

 

Music Technology - Dec 1990

Artist:

Dave Smith


Role:

Electrical Engineer
Company Founder
Designer

Series:

Dave Smith

Part 1 | Part 2


Interview by Simon Trask


