Mind Over Music
Article from Electronics & Music Maker, April 1984
The psychology of piano-playing.
Andrew Morris on the similarities between piano-playing and typing, plus a brief insight into an experiment aimed at quantifying pianists' sight-reading techniques.
Ever since Seashore (1938), one of our most distinguished musicologists, developed a method of converting the time and force of key presses into a mechanical record, psychologists have endeavoured to quantify musical performance. Technically, it shouldn't be too difficult to study piano-playing as a discrete time-series of responses. But, as Seashore discovered, there are inherent problems, not least the vast quantities of data produced.
In the early days, the records obtained were entirely mechanical, punched or printed on copious volumes of paper. More recently, methods have been developed of transferring performance details directly to computer, facilitating complex analysis. Nevertheless, the impact of computer science and electronic technology has been felt largely in the field of typewriting. So, for a decade or more since the advent of the electric typewriter, psychologists have chosen to study typists rather than pianists, on the rationale that typing and piano-playing are members of the same family of skills and share the same phenomena. Since the technological tools necessary to study piano-playing have only recently been developed, we have had to be content with typing skills as models of piano skills. One of the most notable investigators to capitalise on recent advances is J.A. Sloboda at Exeter University. The present study is based largely on a series of experiments performed by Sloboda in 1977 (cf. Shaffer, Deutsch).
The equipment set-up can be seen in the photograph (note that the tuner was not part of the experiment but it did help to while away the protracted hours spent programming!). The instrument on which the subjects played was a Yamaha CP30 electric piano with touch-sensitive action. The monitor in the centre displayed a single stave of music which was controlled by the Apple II micro.
Inside the piano casing, a set of microswitches registered each key press. The leading edge of the resultant voltage change triggered a purpose-built solid-state monostable circuit, which in turn activated one of the one-bit pushbutton inputs on the Apple's games port. The software was arranged so that each key press started a millisecond timer, which counted the time up to the next key press. These inter-note times (known as Inter-Reaction-Times, or IRTs) were stored continuously in text files for later analysis. The software had several aims: to display a single treble stave across the monitor screen; to present a single line of notes along the stave; and, on each key press, to shift the line one note to the left, so that new notes appeared on the right and old notes moved off the screen. The whole process of renewing the music text had to be extremely fast, otherwise each reaction time would have been an artifact of the software. Consequently, the Applesoft programs were compiled into machine code.
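The timing and scrolling logic described above can be sketched in a few lines. This is a modern illustration only, not the original Applesoft/machine-code program: the function names and note representation are hypothetical, and the millisecond timestamps stand in for the hardware timer started on each pushbutton transition.

```python
def compute_irts(press_times_ms):
    """Given a list of key-press timestamps in milliseconds, return
    the Inter-Reaction-Times (IRTs): the interval from each key
    press to the next, as logged for later analysis."""
    return [later - earlier
            for earlier, later in zip(press_times_ms, press_times_ms[1:])]

def scroll_stave(displayed_notes, next_note):
    """On each key press, shift the displayed line one note to the
    left: the leftmost note falls off the screen and a new note is
    appended on the right."""
    return displayed_notes[1:] + [next_note]
```

For example, key presses at 0, 480, 910 and 1500 ms yield the IRT sequence 480, 430, 590 ms; in the experiment, such sequences were what the text files accumulated for each trial.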
One cannot help but wonder how much simpler and more elegant it would have been to use a Chroma/Apple setup but, alas, a Chroma was not forthcoming. However, the system worked remarkably well, and the performers found it surprisingly easy to adjust to sight-reading from a TV screen rather than manuscript. They were required to play at sight two types of melody, one consisting of random notes and the other of notes in a Baroque style. The hypothesis was that the Baroque-style melody would be more 'meaningful' than the random melody and so would be played faster and with greater fluency. The explanation is simple: if a person is uncertain about an event, he will find it difficult to predict that event. In other words, the more random or meaningless the melody, the more choices there are for the next note, and the harder prediction becomes. Secondly, the two melody types were displayed so that the sight-reader's look-ahead was limited to one, two, four or eight notes, depending on the experimental condition. This is shown in Figure 1. By doing this, it should be possible to tell how far ahead a sight-reader looks in order to play fluently. The number of notes in this look-ahead is called the Eye-Hand-Span (EHS). This phenomenon has been demonstrated in typing studies but far less often in those concerned with piano-playing.
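The look-ahead restriction amounts to masking everything beyond a fixed window of upcoming notes. A minimal sketch, with a hypothetical function name and note list, might read:

```python
def visible_notes(melody, position, span):
    """Return the notes the sight-reader is allowed to see: the note
    currently to be played plus the restricted look-ahead. `span` is
    the window size (1, 2, 4 or 8 in the experiment); notes beyond
    the window remain hidden until the player advances."""
    return melody[position : position + span]
```

So with a span of two, a player at the third note of a melody sees only that note and the one after it; comparing fluency across spans indicates the EHS needed for fluent performance.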
The results of this experiment confirm that piano-playing belongs to the same family of skills as typing, but that it is considerably more complex to study. The elucidation of these processes is invaluable to a complete theory of music, but they represent only one level from which it may be tackled, as with many other human information-processing mechanisms. It can be studied from the fundamental perceptual processes involved in pitch recognition, for instance, or from the real-world perspective on which the present study is based.
Feature by Andrew Morris