Musical expression in automated composition of melodies
dc.contributor.author | McLintock, Brian Thomas | en |
dc.contributor.committeechair | Roach, John W. | en |
dc.contributor.committeemember | Miller, David P. | en |
dc.contributor.committeemember | Sochinski, James R. | en |
dc.contributor.department | Computer Science | en |
dc.date.accessioned | 2014-03-14T21:49:58Z | en |
dc.date.adate | 2012-11-17 | en |
dc.date.available | 2014-03-14T21:49:58Z | en |
dc.date.issued | 1987-11-20 | en |
dc.date.rdate | 2012-11-17 | en |
dc.date.sdate | 2012-11-17 | en |
dc.description.abstract | Music composed by computers has long lacked "musical" qualities: mood, emotional expression, and a sense of purposefulness or goal. A musical expert system, EMOTER, is the first attempt to address these important musical aspects. EMOTER receives as input a list of moods (e.g., happy, lively) and generates melodic passages intended to evoke those moods in an organized, coherent fashion. EMOTER composes the basic units of music, called phrases. From the mood specification, the program applies a theory due to Deryck Cooke to derive a few motifs (very primitive melodic material) that exemplify the moods, and it computes a number of musical attributes to guide its compositional choices. A theory of emotion due to Leonard Meyer further helps plan the phrase. That theory states that an emotional response is stimulated in a listener when expectations about the progression of the music are first established and then inhibited (with the understanding that the expectations will eventually be fulfilled). A melodic passage is composed using the selected motifs, attributes, and emotional theory to create a "skeletal" phrase. This skeleton is then embellished and developed (also guided by the attributes and theory) to flesh out the bare melodic material into a passage that embodies the musical characteristics of the mood specification. Results with EMOTER are excellent. Many musical phrases comparable to those of human composers are generated from a single mood specification. More theory is needed, however, before the full complexities of human-composed music are sufficiently captured in code for EMOTER to pass a Turing test in music composition. | en |
dc.description.degree | Master of Science | en |
dc.format.extent | ix, 141 leaves | en |
dc.format.medium | BTD | en |
dc.format.mimetype | application/pdf | en |
dc.identifier.other | etd-11172012-040259 | en |
dc.identifier.sourceurl | http://scholar.lib.vt.edu/theses/available/etd-11172012-040259/ | en |
dc.identifier.uri | http://hdl.handle.net/10919/45809 | en |
dc.publisher | Virginia Tech | en |
dc.relation.haspart | LD5655.V855_1987.M4223.pdf | en |
dc.relation.isformatof | OCLC# 17600347 | en |
dc.rights | In Copyright | en |
dc.rights.uri | http://rightsstatements.org/vocab/InC/1.0/ | en |
dc.subject.lcc | LD5655.V855 1987.M4223 | en |
dc.subject.lcsh | Computer programming | en |
dc.subject.lcsh | Music | en |
dc.title | Musical expression in automated composition of melodies | en |
dc.type | Thesis | en |
dc.type.dcmitype | Text | en |
thesis.degree.discipline | Computer Science | en |
thesis.degree.grantor | Virginia Polytechnic Institute and State University | en |
thesis.degree.level | masters | en |
thesis.degree.name | Master of Science | en |
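
The abstract above outlines EMOTER's pipeline: a mood specification is mapped to Cooke-style motifs and musical attributes, a skeletal phrase is planned using Meyer's expectation/inhibition/fulfilment idea, and the skeleton is then embellished. The sketch below is only an illustration of that pipeline under assumed representations; the mood-to-motif table, attribute values, function names, and MIDI-pitch encoding are all hypothetical stand-ins, and the thesis's actual implementation is not reproduced in this record.

```python
"""Illustrative sketch of the pipeline described in the abstract.
All mappings and names here are assumptions, not the thesis's code."""

# Hypothetical Cooke-style table: each mood maps to a primitive motif,
# expressed as semitone intervals relative to the tonic.
COOKE_MOTIFS = {
    "happy": [0, 4, 7],      # rising major triad (assumed association)
    "lively": [0, 2, 4, 7],  # stepwise rise to the fifth (assumed)
    "sad": [0, -1, -3],      # falling minor-inflected line (assumed)
}

# Hypothetical musical attributes derived from the mood specification.
MOOD_ATTRIBUTES = {
    "happy": {"tempo": 120, "register": 72},   # register as a MIDI pitch
    "lively": {"tempo": 140, "register": 74},
    "sad": {"tempo": 60, "register": 65},
}


def derive_motifs(moods):
    """Pick primitive melodic material exemplifying the requested moods."""
    return [COOKE_MOTIFS[m] for m in moods if m in COOKE_MOTIFS]


def compute_attributes(moods):
    """Average the per-mood attributes to guide compositional choices."""
    chosen = [MOOD_ATTRIBUTES[m] for m in moods if m in MOOD_ATTRIBUTES]
    n = len(chosen)
    return {
        "tempo": sum(a["tempo"] for a in chosen) // n,
        "register": sum(a["register"] for a in chosen) // n,
    }


def skeletal_phrase(motifs, attrs):
    """Meyer-style plan: state a motif (establish an expectation),
    deflect it (inhibit), then restate it transposed up (fulfil)."""
    tonic = attrs["register"]
    statement = [tonic + i for i in motifs[0]]
    inhibition = [p - 1 for p in reversed(statement)]   # deflected restatement
    fulfilment = [p + 2 for p in statement]             # eventual resolution
    return statement + inhibition + fulfilment


def embellish(skeleton):
    """Flesh out the skeleton by filling large leaps with passing tones."""
    phrase = []
    for a, b in zip(skeleton, skeleton[1:]):
        phrase.append(a)
        if abs(b - a) > 2:                 # large leap: insert a passing tone
            phrase.append((a + b) // 2)
    phrase.append(skeleton[-1])
    return phrase


if __name__ == "__main__":
    moods = ["happy", "lively"]            # example mood specification
    motifs = derive_motifs(moods)
    attrs = compute_attributes(moods)
    print("attributes:", attrs)
    print("phrase (MIDI pitches):", embellish(skeletal_phrase(motifs, attrs)))
```
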
Files
Original bundle
- Name: LD5655.V855_1987.M4223.pdf
- Size: 7.17 MB
- Format: Adobe Portable Document Format