AIYA is a sound analysis and improvisation companion, originally designed for the Chaosflöte and MIDI instruments, and recently adapted for installation settings. Max serves as the starting point for the machine's logic operations, but AIYA also connects to any other program capable of receiving OSC messages, with the intention of enabling flexible performance setups (e.g. with TouchDesigner, Ableton, SuperCollider, etc.). It spins textures and melodies derived from the sound and motion behavior of the human performer. Interaction rules for the machine are derived from an analysis of performance attributes such as frequency, amplitude, density, spectral centroid, spectral flatness, body position, and joint rotation. By developing it, I hope to dissect the often intuitive and somewhat opaque nature of decision-making in improvised music. Does the improvising musician also operate on sets of rules, validations, and discouragements, even if, consciously, her behavior appears to be natural, intuitive, and human? What kinds of conscious and unconscious decisions does the musician make while improvising? Perhaps, if we attempt to replicate this complex system in electronics, we can come closer to understanding and translating the aesthetic of human improvisation.
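To make the OSC bridge concrete, the sketch below (Python, using the python-osc package) forwards one frame of performance attributes to a downstream program such as TouchDesigner or SuperCollider. It is a minimal illustration, not AIYA's actual implementation: the address patterns under /aiya, the port 9000, and the toy threshold rule are assumptions made for this example, since the real decision logic lives in Max.

```python
# Minimal sketch: forwarding analysis features over OSC to a downstream
# program (e.g. TouchDesigner, SuperCollider). The addresses, port, and
# the toy "rule" are hypothetical -- AIYA's real logic is patched in Max.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # assumed receiver address/port


def forward_features(frequency, amplitude, centroid, flatness):
    """Send one frame of performance attributes as OSC messages."""
    client.send_message("/aiya/frequency", frequency)
    client.send_message("/aiya/amplitude", amplitude)
    client.send_message("/aiya/centroid", centroid)
    client.send_message("/aiya/flatness", flatness)

    # Toy interaction rule (illustrative only): a noisy, quiet input
    # nudges the machine toward a sparse textural response.
    if flatness > 0.6 and amplitude < 0.2:
        client.send_message("/aiya/response/mode", "sparse_texture")


if __name__ == "__main__":
    # Example frame: 440 Hz tone, moderate level, mid centroid, low flatness.
    forward_features(440.0, 0.35, 1800.0, 0.12)
```

Because any OSC-capable receiver can subscribe to these messages, the same analysis stream can drive visuals, sound synthesis, or lighting without changing the core patch.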
Other notes: AIYA is derived from the Chinese expression "aiya..!", which expresses a mixture of frustration and surprise. Such has been my experience of creating this machine, where unexpected behaviors of the machine continually challenge, and therefore develop, my improvisation practice.