COMPARING HUMAN AND COMPUTATIONAL MODELS OF MUSIC PREDICTION
Date
1992-05-01
Abstract
The information content of each successive note in a
piece of music is not an intrinsic musical property but depends
on the listener's own model of a genre of music. Human listeners'
models can be elicited by having them guess successive notes and
assign probabilities to their guesses by gambling. Computational
models can be constructed by developing a structural framework for
prediction, and "training" the system by having it assimilate a
corpus of sample compositions and adjust its internal probability
estimates accordingly. These two modeling techniques turn out to
yield remarkably similar values for the information content, or
"entropy," of the Bach chorale melodies.
While previous research has concentrated on the overall
information content of whole pieces of music, the present study
evaluates and compares the two kinds of model in fine detail. Their
predictions for two particular chorale melodies are analyzed on a
note-by-note basis, and the smoothed information profiles of the
chorales are examined and compared. Apart from the intrinsic interest
of comparing human with computational models of music, several
conclusions are drawn for the improvement of computational models.
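The per-note information content described above can be illustrated with a toy sketch. The code below is a hypothetical stand-in, not the paper's actual predictive framework: it trains a first-order (bigram) note model with add-one smoothing on a tiny made-up corpus, then computes each note's information content in bits as -log2 p(note | previous note). The averaged profile is a crude estimate of the melody's entropy under that model.

```python
import math
from collections import defaultdict

# Hypothetical illustration: a first-order (bigram) note model with
# add-one smoothing, standing in for the paper's much richer model.
ALPHABET = ["C", "D", "E", "F", "G", "A", "B"]

def train(corpus):
    """Count bigram transitions over a corpus of note sequences."""
    counts = defaultdict(lambda: defaultdict(int))
    for melody in corpus:
        for prev, note in zip(melody, melody[1:]):
            counts[prev][note] += 1
    return counts

def prob(counts, prev, note):
    """Add-one smoothed conditional probability p(note | prev)."""
    total = sum(counts[prev].values()) + len(ALPHABET)
    return (counts[prev][note] + 1) / total

def information_profile(counts, melody):
    """Per-note information content in bits: -log2 p(note | prev)."""
    return [-math.log2(prob(counts, p, n))
            for p, n in zip(melody, melody[1:])]

# Toy training corpus and test melody (invented for illustration).
corpus = [["C", "D", "E", "F", "G"],
          ["C", "E", "G", "E", "C"],
          ["G", "F", "E", "D", "C"]]
model = train(corpus)
profile = information_profile(model, ["C", "D", "E", "F", "G"])
entropy = sum(profile) / len(profile)  # mean bits per note
```

Notes that the model has seen often in a given context cost few bits; surprising continuations cost many, which is what makes note-by-note comparison of human and machine predictions possible.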
Keywords
Computer Science