This chapter is a literature survey of systems for real-time interactive control of automatic expressive music performance. A classification is proposed based on two initial design choices: the music material to interact with (i.e., MIDI or audio recordings) and the type of control (i.e., direct control of low-level parameters such as tempo, intensity, and instrument balance, or mapping from high-level parameters, such as emotions, to low-level parameters). Their pros and cons are briefly discussed. Then, a generic approach to interactive control is presented, comprising four steps: control data collection and analysis, mapping from control data to performance parameters, modification of the music material, and audiovisual feedback synthesis. Several systems are then described, focusing on different technical and expressive aspects. For many of the surveyed systems, a formal evaluation is missing. Possible methods for the evaluation of such systems are finally discussed.
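The four-step control loop the chapter outlines (control data collection and analysis, mapping to performance parameters, modification of the music material, and feedback synthesis) can be sketched in miniature. The sketch below is purely illustrative and drawn from none of the surveyed systems: the valence/arousal control space, the linear mapping coefficients, and the MIDI-like note representation are all assumptions chosen for the example.

```python
# Illustrative sketch of the four-step interactive-control loop:
# (1) control data collection/analysis, (2) mapping from control data to
# performance parameters, (3) modification of the music material,
# (4) feedback synthesis. All names, ranges, and coefficients are
# assumptions for this sketch, not taken from any surveyed system.

def collect_control_data(raw_gesture):
    """Step 1: reduce raw controller input to high-level parameters.
    Here 'analysis' is just clamping to the valence/arousal plane."""
    clamp = lambda x: max(-1.0, min(1.0, x))
    return {"valence": clamp(raw_gesture["x"]),
            "arousal": clamp(raw_gesture["y"])}

def map_to_performance(control):
    """Step 2: map high-level (emotion) parameters to low-level ones."""
    return {
        "tempo_scale": 1.0 + 0.3 * control["arousal"],   # faster when aroused
        "level_db": 6.0 * control["arousal"],            # louder when aroused
        "articulation": 0.5 - 0.4 * control["valence"],  # 0=legato, 1=staccato
    }

def modify_material(notes, perf):
    """Step 3: apply the parameters to MIDI-like note events
    (onset in beats, duration in beats, velocity 0-127)."""
    out = []
    for onset, dur, vel in notes:
        out.append((
            onset / perf["tempo_scale"],                 # tempo change
            (dur / perf["tempo_scale"]) * (1.0 - 0.5 * perf["articulation"]),
            max(0, min(127, round(vel * 10 ** (perf["level_db"] / 40)))),
        ))
    return out

def render_feedback(notes):
    """Step 4: stand-in for audiovisual synthesis (here: text)."""
    return [f"note @{o:.2f} len {d:.2f} vel {v}" for o, d, v in notes]

# One pass through the loop for a "happy" gesture on a two-note phrase.
control = collect_control_data({"x": 0.8, "y": 0.6})
perf = map_to_performance(control)
print(render_feedback(modify_material([(0.0, 1.0, 64), (1.0, 0.5, 64)], perf)))
```

A real system would replace step 1 with sensor or audio analysis and step 4 with sound synthesis or playback; the point of the sketch is only the data flow between the four stages and the high-level-to-low-level mapping that distinguishes the two control types in the classification above.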