Carey, B. (2019). Musical interpretation in improvised human-machine performance. Sound Scripts, 6(1).
Clark, A., & Chalmers, D. (1998). The extended mind. Analysis, 58(1), 7–19. https://doi.org/10.1111/1467-8284.00096
Collins, N. (2006). Towards autonomous agents for live computer music: Realtime machine listening and interactive music systems [Doctoral dissertation, University of Cambridge]. http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.65.2661
Colton, S. (2008). Creativity versus the perception of creativity in computational systems. AAAI Spring Symposium - Technical Report.
François, A. R. J., Chew, E., & Thurmond, D. (2007). Visual feedback in performer-machine interaction for musical improvisation. Proceedings of the 7th International Conference on New Interfaces for Musical Expression, NIME ’07, 277–280. https://doi.org/10.1145/1279740.1279798
Franklin, S., & Graesser, A. (1997). Is it an agent, or just a program? A taxonomy for autonomous agents. In Intelligent Agents III: Agent Theories, Architectures, and Languages (Lecture Notes in Computer Science, Vol. 1193, pp. 21–35). Springer. https://doi.org/10.1007/bfb0013570
Haraway, D. J. (2017). A cyborg manifesto. In Manifestly Haraway. University of Minnesota Press.
Hayles, N. K. (2002). Flesh and metal: Reconfiguring the mindbody in virtual environments. Configurations. https://doi.org/10.1353/con.2003.0015
Hoffman, G., & Weinberg, G. (2011). Interactive improvisation with a robotic marimba player. Autonomous Robots, 31(2–3), 133–153. https://doi.org/10.1007/s10514-011-9237-0
Hong, J. W. (2018). Bias in perception of art produced by artificial intelligence. In Lecture Notes in Computer Science. Springer. https://doi.org/10.1007/978-3-319-91244-8_24
Lewis, G. E. (2003). Too many notes: Computers, complexity, and culture in Voyager. In New Media: Theories and Practices of Digitextuality (pp. 93–106). Routledge. https://doi.org/10.4324/9780203953853-16
Lewis, G. E. (2018). Why do we want our computers to improvise? In The Oxford Handbook of Algorithmic Music. Oxford University Press. https://doi.org/10.1093/oxfordhb/9780190226992.013.29
Lindborg, P. (2013). Skalldans, an audiovisual improvisation framework. Proceedings of the Sound and Music Computing Conference 2013 (SMC 2013), 415–418. http://www.logos-verlag.de/cgi-bin/buch/isbn/3472
Lorway, N., Jarvis, M., Wilson, A., Powley, E. J., & Speakman, J. A. (2019). Autopia: An AI collaborator for gamified live coding music performances. Proceedings of the 2019 AISB Convention.
Lösel, G. (2018). Can robots improvise? Liminalities, 14(1), 185. https://doi.org/10.5281/zenodo.3469861
Russell, S. J., & Norvig, P. (2020). Artificial intelligence: A modern approach (4th ed.). Pearson.
Sturm, B. L., Ben-Tal, O., Monaghan, Ú., Collins, N., Herremans, D., Chew, E., Hadjeres, G., Deruty, E., & Pachet, F. (2019). Machine learning research that matters for music creation: A case study. Journal of New Music Research. https://doi.org/10.1080/09298215.2018.1515233
Wooldridge, M., & Jennings, N. (1995). Intelligent agents: Theory and practice. The Knowledge Engineering Review, 10(2), 115–152. https://doi.org/10.1017/S0269888900008122