AIYA is an algorithmic and AI-driven improvisation machine, originally designed for the Chaosflöte and MIDI instruments, and recently adapted for installation settings.
Using MaxMSP as the starting point for the machine's logic operations, AIYA also connects to any other program capable of receiving OSC messages, allowing for flexible performance setups (e.g. with TouchDesigner, Ableton, SuperCollider, etc.). It spins textures and melodies composed of both purely synthesized sounds and live recordings of the human performer's sound and motion behavior captured during the performance. Through an analysis of performance attributes such as frequency, amplitude, density, spectral centroid, spectral flatness, body position, and joint rotation, interaction and behavioral rules for the machine are derived. These explicit rules are combined with "black box" AI processes that generate behaviors that may be less predictable, but still intuitive. Central to AIYA's design is a balance between familiarity and surprise, encouraging experimentation and an expanded improvisational palette.
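As a rough illustration of the OSC-based setup described above (not AIYA's actual code), the following Python sketch forwards a frame of analyzed performance features to any OSC-capable receiver such as a Max or TouchDesigner patch; the address patterns, port, and feature values are hypothetical placeholders, and the python-osc library is assumed.

```python
# Minimal sketch: sending analyzed performance features over OSC.
# Addresses, host, and port are hypothetical, chosen for illustration only.
from pythonosc.udp_client import SimpleUDPClient

# e.g. a Max or TouchDesigner patch listening on port 7400 of the same machine
client = SimpleUDPClient("127.0.0.1", 7400)

# One hypothetical frame of features from a live analysis stage
features = {
    "/aiya/frequency": 440.0,          # Hz
    "/aiya/amplitude": 0.62,           # normalized 0..1
    "/aiya/density": 3.5,              # onsets per second
    "/aiya/spectral_centroid": 1850.0, # Hz
    "/aiya/spectral_flatness": 0.12,   # 0 = tonal, 1 = noisy
}

# Forward each feature as its own OSC message
for address, value in features.items():
    client.send_message(address, value)
```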
AIYA is a fundamentally flexible system. Many adaptations have been made, including versions for MIDI keyboards, a PoseNet-tracked performer, sensor-augmented flute, a human-machine quartet (two humans, two improvisation machines), vocalist, and percussionist. The spirit of the project is such that each version of AIYA constitutes a completely new work, where the design of the non-linear performance system is itself considered a composition.
AIYA is a central component of my doctoral research project at the Kunstuniversität Graz and Zurich University of the Arts.
On this page are projects and samples featuring AIYA.
Research Title:
Developing a Visual Framework for Musical Improvisation Machines: Investigating Impacts on Human-Machine Interaction, Posthuman Identity, and Performance Narrative
The practice of creating and performing with improvisation machines spans several decades, most notably through the works of George Lewis, Benjamin Carey, Guy Hoffman, Gil Weinberg, and IRCAM, among others. Arguably, these developments have focused largely on the sonic sophistication of the improvisation machine, leaving a research gap in the development of more sophisticated visual representations of the machine. My dissertation research aims to address this gap by developing a visual framework for improvisation machines that serves as a defined yet modular starting point for other creators of improvisation machines to build their own visual representations.
The development of such a framework prompts reflection on the visual representation's effects on the performance setting, specifically addressing the following questions:
- How might a visual representation for a musical improvisation machine be developed that facilitates bidirectional human-machine communication and blurs the perceived borders between human and machine?
- How does the visual representation contribute to shifts in perception between the two identities of the improvisation machine, when it appears at times as an extension of the mindbody and at other times as a distinct improvisation agent?
- What effects does this have on the perceived narrative of the improvisation?
The research will culminate in three distinct components: a suite of performances with the developed visual framework applied to the improvisation machine, a written dissertation reflecting on the issues raised by the research topic, and the software of the improvisation machine with the applied visual framework, released as an open-source project for other creators to use and modify.
Other notes: the name AIYA is derived from the Chinese expression "aiya!", which expresses a mixture of frustration and surprise. Such is my experience of creating this machine, where its unexpected behaviors continually challenge - and therefore develop - my improvisation practice.