Towards Assisted Interactive Machine Learning

In a sentence: Assisted Interactive Machine Learning (AIML) is an interaction design method based on deep reinforcement learning that I started developing to explore the vast space of possible mappings between gesture and sound synthesis.

I am presenting a research paper and a live multimedia performance on AIML at ICLI 2020 – the fifth International Conference on Live Interfaces taking place at the Norwegian University of Science and Technology in Trondheim, Norway.

The paper (PDF)

We present a sonic interaction design approach that uses deep reinforcement learning to explore the many possible mappings between input sensor data streams and sound synthesis parameters. While playing the synthesiser, the user can give the artificial agent feedback on the mappings it proposes, trying each new mapping on the fly. The design approach we adopted is inspired by the ideas established by the interactive machine learning paradigm, as well as by the use of artificial agents in computer music to explore complex parameter spaces.
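To make the interaction loop concrete, here is a minimal sketch of the core idea: an agent proposes gesture-to-sound mappings and refines them from the performer's feedback. This is a toy stand-in, not the paper's implementation; the dimensions, the linear mapping, and the hill-climbing update are all illustrative assumptions in place of the deep reinforcement learning the paper describes.

```python
import numpy as np

# Hypothetical dimensions: 4 sensor streams mapped to 6 synthesis parameters.
N_SENSORS, N_PARAMS = 4, 6

class MappingAgent:
    """Toy agent that proposes linear gesture-to-sound mappings and
    adapts them from user feedback (a stand-in for the paper's deep RL)."""

    def __init__(self, step=0.3):
        self.mean = np.zeros((N_PARAMS, N_SENSORS))  # current best mapping
        self.step = step                             # exploration magnitude
        self.proposal = None

    def propose(self):
        # Perturb the current mapping to explore nearby alternatives.
        self.proposal = self.mean + self.step * np.random.randn(N_PARAMS, N_SENSORS)
        return self.proposal

    def feedback(self, reward):
        # Positive feedback pulls the mapping toward the proposal;
        # negative feedback pushes it away (simple hill climbing).
        self.mean += 0.5 * reward * (self.proposal - self.mean)

def apply_mapping(mapping, sensors):
    # Map live sensor data to synth parameters, squashed into [0, 1].
    return 1.0 / (1.0 + np.exp(-mapping @ sensors))

agent = MappingAgent()
for _ in range(10):                      # interaction episodes
    mapping = agent.propose()
    sensors = np.random.rand(N_SENSORS)  # stand-in for live sensor input
    params = apply_mapping(mapping, sensors)
    # In the real system the user plays the synthesiser here and judges
    # the mapping; we simulate their +1 / -1 feedback at random.
    agent.feedback(np.random.choice([1.0, -1.0]))
```

In the actual system the reward comes from the performer's on-the-fly judgement of each proposed mapping, which is what distinguishes this from fully automated parameter search.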

About the performance (PDF)

“My phone beeps. A notification on the home screen says ‘You have a new memory’. It happens at times: unsupervised learning algorithms scan your photos and videos, look at their features and metadata, and then you get a nice slideshow of that trip to South America, or those shows you went to while you were in Hamburg or London. There is something ridiculous about this (the music they put on the slideshows, for example) as well as something eerie, something even slightly distressing perhaps.”

“You Have a New Memory” (2020) uses the AIML interaction paradigm to navigate a vast corpus of audio material harvested from messaging applications, videos, and audio journals recorded on the author’s mobile phone. This corpus of sonic memories is organised using audio descriptors and navigated with the aid of an artificial agent and reinforcement learning (a sketch of the descriptor step follows below).
The title of the piece refers to the notifications that a popular photo library application occasionally sends to mobile devices to prompt their users to check an algorithmically generated photo gallery that collects images and videos related to a particular event or series of events in their lives.
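As a rough illustration of how such a corpus might be organised, the sketch below summarises each audio file as a descriptor vector and indexes the resulting space so that an agent (or performer) can step from one sonic memory to its nearest acoustic neighbours. The file names are placeholders, and mean MFCCs are an assumed descriptor choice; the post does not specify which descriptors the piece actually uses.

```python
import numpy as np
import librosa
from sklearn.neighbors import NearestNeighbors

def describe(path):
    """Summarise an audio file as a small descriptor vector
    (mean MFCCs here; the piece's actual descriptor set is not specified)."""
    y, sr = librosa.load(path, sr=22050, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    return mfcc.mean(axis=1)

# Hypothetical corpus of sonic memories exported from a phone.
paths = ["memo_001.wav", "memo_002.wav", "memo_003.wav"]
descriptors = np.stack([describe(p) for p in paths])

# Index the descriptor space so navigation can move between
# acoustically similar memories.
index = NearestNeighbors(n_neighbors=2).fit(descriptors)
_, neighbours = index.kneighbors(descriptors[:1])
print([paths[i] for i in neighbours[0]])
```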

I started developing these concepts in Summer 2019 in Berlin after a few informal meetings with Atau Tanaka, then Edgard-Varèse guest professor at TU Berlin. Development took place during a 1-month postdoc at Goldsmiths, University of London, in September 2019, and continued with Stefan Östersjö and the GEMM))) Gesture Embodiment and Machines in Music research cluster at the School of Music in Piteå, Luleå University of Technology, Sweden.

Paper presentation at ICLI 2020, Trondheim, Norway.
