System Design:
Balance of control vs. automation:
"how much should I be able to steer/influence the machine?" >> the improvisation machine as an autonomous entity vs. an instrument
“how does the improvisation machine know when the performance is ‘over’?” >> how do we translate "typical" endings in improvisation into code?
Sonic identity:
distinct identity vs. sonic extension of human performer
>>greater similarity in sonic identity contributes to the perception of the machine as an extension of oneself
>>faster responsiveness & more transparent interaction rules also contribute to this perception
how distinct/similar should the sounds produced by human vs. machine be?
Reinforcing stereotypes?
Many aspects of machine improvisation reinforce Western performance practice (George Lewis' Voyager is perhaps an exception).
How can machine improvisation yield different ways of approaching improvisation, and how can it be used to emancipate ourselves from traditional performance stereotypes?
Analysis:
Musical elements in improvisation practice, translated into computer-actionable sonic transformations (real-time):
…of sonic entrance/exit of instrument (is someone playing or not?)
Set an audio gate; when the signal level exceeds the threshold, something is considered to be “playing.”
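A minimal sketch of such a gate, assuming block-based audio input; the threshold and hold-time values below are placeholder assumptions, not tuned settings. A lower close threshold gives simple hysteresis, so brief dips do not register as an exit.

```python
import numpy as np

OPEN_THRESHOLD = 0.02    # RMS level above which someone counts as "playing"
CLOSE_THRESHOLD = 0.01   # lower close threshold gives simple hysteresis
HOLD_BLOCKS = 20         # quiet blocks required before declaring an "exit"

class EntranceGate:
    def __init__(self):
        self.playing = False
        self.quiet_blocks = 0

    def process(self, block: np.ndarray) -> bool:
        rms = float(np.sqrt(np.mean(block ** 2)))
        if rms > OPEN_THRESHOLD:
            self.playing = True
            self.quiet_blocks = 0
        elif self.playing and rms < CLOSE_THRESHOLD:
            self.quiet_blocks += 1
            if self.quiet_blocks >= HOLD_BLOCKS:
                self.playing = False      # sonic exit detected
        return self.playing
```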
…of current playing dynamic level / of average playing dynamic level
Amplitude measurement of the audio signal
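A sketch of tracking both the current and the average dynamic level, assuming RMS as the amplitude measure; the smoothing coefficient is an assumed placeholder to be tuned to the analysis block rate.

```python
import numpy as np

class DynamicsTracker:
    def __init__(self, smoothing: float = 0.995):
        self.smoothing = smoothing    # assumed; tune to the analysis block rate
        self.average = 0.0

    def process(self, block: np.ndarray) -> tuple[float, float]:
        current = float(np.sqrt(np.mean(block ** 2)))   # current RMS level
        # exponential moving average stands in for "average playing dynamic"
        self.average = (self.smoothing * self.average
                        + (1.0 - self.smoothing) * current)
        return current, self.average
```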
…of current pitch content / of pitch content over time (e.g., “what key are we playing in, and does a key even apply?”)
Frequency measurement of the audio signal; played pitches can be accumulated into a buffer which can be analyzed to determine the “key” [1] in which one is playing.
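One standard way to analyze the accumulated pitch buffer for a key is Krumhansl-Schmuckler profile correlation; the sketch below assumes pitch tracking has already produced MIDI note numbers, and uses the published Krumhansl-Kessler profiles.

```python
import numpy as np

# Krumhansl-Kessler key profiles (major and minor).
MAJOR = np.array([6.35, 2.23, 3.48, 2.33, 4.38, 4.09,
                  2.52, 5.19, 2.39, 3.66, 2.29, 2.88])
MINOR = np.array([6.33, 2.68, 3.52, 5.38, 2.60, 3.53,
                  2.54, 4.75, 3.98, 2.69, 3.34, 3.17])
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
              "F#", "G", "G#", "A", "A#", "B"]

def estimate_key(pitch_buffer: list[int]) -> str:
    """Estimate a key from accumulated MIDI pitches by correlating the
    pitch-class histogram against each rotation of the key profiles."""
    if not pitch_buffer:
        return "unknown"
    histogram = np.zeros(12)
    for pitch in pitch_buffer:
        histogram[pitch % 12] += 1
    best_key, best_score = "unknown", -np.inf
    for tonic in range(12):
        rotated = np.roll(histogram, -tonic)   # align candidate tonic to index 0
        for profile, mode in ((MAJOR, "major"), (MINOR, "minor")):
            score = np.corrcoef(rotated, profile)[0, 1]
            if score > best_score:
                best_key, best_score = f"{NOTE_NAMES[tonic]} {mode}", score
    return best_key
```

A low best-correlation score could also answer the “is a key applicable?” question, by reporting “no clear key” below some cutoff.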
…of current “energy” of improvisation / of average “energy” of improvisation
Combined calculation of acceleration/velocity values from the motion sensor, the velocity of changes from note to note, and the velocity of changes in timbre
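A sketch of one way to fold the three streams into a single “energy” value; the normalization ranges and weights are assumptions that would need tuning per performer and instrument.

```python
import numpy as np

def energy_estimate(accel: np.ndarray, notes_per_second: float,
                    spectral_flux: float, weights=(0.4, 0.3, 0.3)) -> float:
    """Fold motion, note density, and timbral change into one 0..1 value."""
    motion = np.clip(np.mean(np.abs(accel)) / 2.0, 0.0, 1.0)   # assume ~2 g max
    density = np.clip(notes_per_second / 10.0, 0.0, 1.0)       # assume ~10 n/s max
    timbre = np.clip(spectral_flux, 0.0, 1.0)                  # assume prescaled
    return float(np.dot(weights, (motion, density, timbre)))
```

The same exponential-moving-average idea used for dynamics above can smooth this into an “average energy.”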
…of sound envelopes (soft entrances? sharp entrances? is one playing staccato?)
Amplitude envelope tracking
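A sketch of envelope tracking plus a crude attack-sharpness measure (peak envelope slope), which could separate soft entrances from sharp or staccato playing; the attack/release coefficients are assumed starting points.

```python
import numpy as np

def envelope(block: np.ndarray, state: float = 0.0,
             attack: float = 0.9, release: float = 0.999):
    """One-pole envelope follower with separate attack/release smoothing;
    returns the envelope and the carried-over state for block-by-block use."""
    out = np.empty(len(block))
    for i, x in enumerate(np.abs(block)):
        coeff = attack if x > state else release
        state = coeff * state + (1.0 - coeff) * x
        out[i] = state
    return out, state

def attack_sharpness(env: np.ndarray, sample_rate: float) -> float:
    """Peak slope of the envelope in amplitude units per second: high values
    suggest sharp/staccato entrances, low values suggest soft swells."""
    return float(np.max(np.diff(env)) * sample_rate)
```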
…of physical bodily gestures
Measurement of acceleration/velocity values of motion sensor
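A sketch of deriving gesture magnitude and a velocity estimate from accelerometer blocks; the leaky integration (leak factor assumed) limits the drift that comes from integrating noisy acceleration.

```python
import numpy as np

def gesture_features(accel: np.ndarray, dt: float, velocity: float = 0.0,
                     leak: float = 0.99) -> tuple[float, float]:
    """accel: block of (n, 3) accelerometer samples; dt: sample interval.
    Returns mean gesture magnitude and an updated leaky velocity estimate."""
    magnitudes = np.linalg.norm(accel, axis=1)
    for m in magnitudes:
        velocity = leak * velocity + m * dt    # leaky integration limits drift
    return float(np.mean(magnitudes)), velocity
```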
Behavior:
Musical elements in improvisation practice, translated into computer-actionable sonic transformations (real-time):
Call and response
Buffer recall, buffer recall with transformation, complex delay lines
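A sketch of call-and-response via a circular buffer: record the human's phrase, then recall it, optionally resampled. The resampling doubles as the crude transformation mentioned above, and as one building block for the “motivic transformation” row below.

```python
import numpy as np

class RecallBuffer:
    """Circular buffer: record the human's phrase, recall it later."""

    def __init__(self, seconds: float, sample_rate: int = 44100):
        self.sample_rate = sample_rate
        self.buffer = np.zeros(int(seconds * sample_rate))
        self.write_pos = 0

    def record(self, block: np.ndarray) -> None:
        idx = (self.write_pos + np.arange(len(block))) % len(self.buffer)
        self.buffer[idx] = block
        self.write_pos = (self.write_pos + len(block)) % len(self.buffer)

    def recall(self, seconds: float, rate: float = 1.0) -> np.ndarray:
        """Return the last `seconds` of audio, resampled by `rate`;
        rate != 1.0 gives a crude combined pitch/time transformation."""
        n = int(seconds * self.sample_rate)
        idx = (self.write_pos - n + np.arange(n)) % len(self.buffer)
        phrase = self.buffer[idx]
        return np.interp(np.arange(0.0, n - 1, rate), np.arange(n), phrase)
```

In practice the machine would record while the entrance gate (above) reports “playing,” then recall when the human pauses.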
Harmonizing lines
Pitch shifting, or pitch shifting and time transformations
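A sketch of both routes: note-level harmonization on detected pitches, and the playback-rate factor for audio-domain shifting by resampling (which, without a separate time transformation, also changes duration). The chosen intervals are an assumed musical choice.

```python
def harmonize(midi_pitch: int, intervals=(4, 7)) -> list[int]:
    """Note-level harmonization: fixed intervals above the detected pitch
    (here a major-triad voicing; the intervals are an assumed choice)."""
    return [midi_pitch + i for i in intervals]

def semitones_to_rate(semitones: float) -> float:
    """Playback-rate factor for pitch shifting by resampling; without a
    separate time transformation this also stretches/compresses duration."""
    return 2.0 ** (semitones / 12.0)
```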
Creating foreground and background roles
Amplitude modulation, filtering, playing with repetition or holding of musical gestures/notes (e.g., to indicate background textures)
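A sketch of pushing machine material into the background with slow amplitude modulation and a one-pole lowpass; the modulation rate, depth, and cutoff are assumed starting points.

```python
import numpy as np

def to_background(block: np.ndarray, sample_rate: int, phase: float = 0.0,
                  mod_hz: float = 0.5, depth: float = 0.5,
                  cutoff_hz: float = 800.0, lp_state: float = 0.0):
    """Slow tremolo plus one-pole lowpass; returns processed audio and the
    carried-over phase/filter state for block-by-block use."""
    t = phase + np.arange(len(block)) / sample_rate
    tremolo = 1.0 - depth * 0.5 * (1.0 + np.sin(2 * np.pi * mod_hz * t))
    a = np.exp(-2 * np.pi * cutoff_hz / sample_rate)   # one-pole coefficient
    out = np.empty(len(block))
    y = lp_state
    for i, x in enumerate(block * tremolo):
        y = (1 - a) * x + a * y
        out[i] = y
    return out, t[-1] + 1.0 / sample_rate, y
```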
Motivic transformation
Buffer recall combined with pitch shifting, time shifting, and/or timbral alteration of original source sound
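For the note-level side of motivic transformation, a sketch of the classic operations (transposition, inversion, retrograde) on a buffered motif; audio-domain variants can reuse the recall-with-resampling from the RecallBuffer above.

```python
def transform_motif(motif: list[int], transpose: int = 0,
                    invert: bool = False, retrograde: bool = False) -> list[int]:
    """Classic motivic operations on a list of MIDI pitches."""
    out = list(motif)
    if retrograde:
        out = out[::-1]               # reverse the motif in time
    if invert:
        axis = out[0]                 # mirror intervals around the first note
        out = [axis - (p - axis) for p in out]
    return [p + transpose for p in out]
```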
“Counter” behavior
Analyze the data parameters generated in real-time by the human (pitch, amplitude, motion, etc.), and generate material with qualities opposing these. Additionally, a timer and/or trigger could be set to activate/deactivate the “counter” behavior.
“Follow” behavior
Analyze the data parameters generated in real-time by the human (pitch, amplitude, motion, etc.), and generate material with qualities resembling these. Additionally, a timer and/or trigger could be set to activate/deactivate the “follow” behavior (a sketch follows at the end of this section).
Anticipation/foreshadowing of next musical action
Create a consistent [visual] gesture that precedes the main action (e.g., flashing the screen once to indicate a change in sound material)
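A sketch of the “counter”/“follow” pair referenced above, assuming the analysis stage delivers normalized 0..1 features (dynamics, energy, density, and so on; the feature names and response mapping are illustrative assumptions). A timer toggles the mode, as described; an external trigger could set the mode directly in the same way.

```python
import random
import time

class BehaviorEngine:
    """Toggles between "follow" and "counter" responses to the analysis data."""

    def __init__(self, switch_every: float = 30.0):
        self.mode = "follow"
        self.switch_every = switch_every          # timer-based toggle, in seconds
        self.last_switch = time.monotonic()

    def update_mode(self) -> None:
        # The timer described above; a trigger could also set self.mode directly.
        if time.monotonic() - self.last_switch > self.switch_every:
            self.mode = "counter" if self.mode == "follow" else "follow"
            self.last_switch = time.monotonic()

    def respond(self, features: dict[str, float]) -> dict[str, float]:
        """Map the human's features (0..1) to target features for the machine."""
        self.update_mode()
        if self.mode == "follow":
            # resemble the human: track their values with a light perturbation
            return {k: min(1.0, max(0.0, v + random.uniform(-0.1, 0.1)))
                    for k, v in features.items()}
        # counter: oppose the human's values
        return {k: 1.0 - v for k, v in features.items()}
```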