This piece was performed at NIME 2018 (both at Virginia Tech’s Moss Arts Center and at the NIME performance night organised by the University of Virginia in Charlottesville) and at MOCO 2018, held at InfoMus – Casa Paganini. I composed…
New modosc objects for EMG & MoCap processing in Max
During November and December 2018, I had the opportunity to spend 5 weeks as a visiting researcher at the RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, an amazing centre of excellence recently inaugurated at the University of Oslo…
modosc: Mocap & Max video tutorials
These are some introductory video tutorials about processing motion capture data in real time in Max using the modosc library. Modosc is a set of Max abstractions designed for computing motion descriptors from raw motion capture data in real time…
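Since modosc runs as a set of Max abstractions rather than as text code, there is no Python API to show; purely as an illustration of the kind of per-frame descriptor it computes, here is a minimal NumPy sketch of a simple quantity-of-motion measure derived from raw 3D marker positions. The marker count, frame rate and descriptor choice are assumptions made for the example, not details taken from the tutorials.

```python
# Illustrative sketch only: modosc itself is a set of Max abstractions, not Python code.
# This shows the general kind of frame-by-frame descriptor computed from raw mocap data,
# here a simple quantity-of-motion measure (summed marker speed).
import numpy as np

def quantity_of_motion(prev_frame: np.ndarray, curr_frame: np.ndarray, dt: float) -> float:
    """prev_frame, curr_frame: (n_markers, 3) arrays of marker positions in metres."""
    velocities = (curr_frame - prev_frame) / dt      # per-marker velocity vectors
    speeds = np.linalg.norm(velocities, axis=1)      # per-marker speed in m/s
    return float(speeds.sum())                       # one scalar descriptor per frame

# Example: two consecutive frames of 20 markers from a 100 Hz mocap stream
prev = np.zeros((20, 3))
curr = prev + np.random.normal(scale=0.001, size=(20, 3))
print(quantity_of_motion(prev, curr, dt=0.01))
```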
Building a swarm poly synth using Max 8’s new MC objects
I just downloaded the new Max 8 and here is a simple synth I built using the new MC (multichannel) objects. Each voice has 32 sawtooth oscillators, so with 6-voice polyphony you can get up to 192 oscillators playing at…
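The synth itself is a Max 8 patch built with MC objects, so there is no text code to quote; as a rough numerical sketch of the same idea, here is one "swarm" voice made of 32 slightly detuned sawtooth oscillators rendered offline in NumPy. The detune spread, chord and naive (non-bandlimited) sawtooth are illustrative choices, not the settings from the patch.

```python
# Not the Max patch from the post, just a NumPy sketch of the same idea: one "swarm"
# voice that sums 32 slightly detuned sawtooth oscillators.
import numpy as np

SR = 44100  # sample rate in Hz

def saw_swarm_voice(freq: float, dur: float, n_osc: int = 32, detune_cents: float = 15.0) -> np.ndarray:
    t = np.arange(int(SR * dur)) / SR
    cents = np.linspace(-detune_cents, detune_cents, n_osc)   # spread within +/- detune_cents
    freqs = freq * 2.0 ** (cents / 1200.0)                    # detuned frequencies in Hz
    phases = np.random.uniform(0.0, 1.0, n_osc)               # random start phases soften the attack
    saws = 2.0 * np.mod(np.outer(freqs, t) + phases[:, None], 1.0) - 1.0   # naive sawtooth waves
    return saws.mean(axis=0)                                  # mix the 32 oscillators down to mono

# Six voices of 32 oscillators each = 192 oscillators in total, as mentioned in the post
chord = sum(saw_swarm_voice(f, 2.0) for f in [110.0, 138.6, 164.8, 220.0, 277.2, 329.6]) / 6.0
```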
Workshop and Performance at Harvestworks, New York City
I recently ran a workshop and performed at Harvestworks in New York City. The workshop was held in collaboration with Andrew Telichan Phillips from the Music and Audio Research Laboratory at NYU Steinhardt. The amazing Ana García Caraballos performed with me…
Testing the XTH Sense with Physical Models and Machine Learning
I recently had the chance to play with a prototype version of the new XTH Sense. I met up with Marco Donnarumma and Balandino Di Donato at Integra Lab in Birmingham and we spent a couple of days experimenting with this interesting and…
Performances at Peninsula Arts Contemporary Music Festival 2016
Very excited to be performing two pieces at this year’s Peninsula Arts Contemporary Music Festival. The super talented Esther Coorevits will once again join me for an updated version of Kineslimina, which will be performed at the Gala Concert on…
At New York University to work on sensors for music performance – pt. 5: tests with musicians
Some experiments I did together with Andrew Telichan Phillips and some very nice and talented musicians at NYU Steinhardt and at The Sweatshop. We used Myo sensor armbands and Machine Learning to adapt control parameters to the movements of musicians playing different musical instruments…
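The excerpt does not specify which tools or features we used, so the following is only a hedged sketch of the general workflow: record examples of movement features paired with the desired control values, train a small regression model on them, then stream live features through the trained mapping during performance. The feature layout, model choice (scikit-learn's MLPRegressor) and data here are hypothetical.

```python
# Hedged sketch of a movement-to-parameter mapping; the actual tools and features used
# in the experiments are not specified in the post.
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical training data: each row = 8 EMG envelope values + 3 orientation angles,
# each target = the control value wanted for that pose / effort level.
rng = np.random.default_rng(0)
X_train = rng.random((200, 11))
y_train = rng.random(200)

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

# At performance time, live sensor features are streamed through the trained mapping.
live_features = rng.random((1, 11))
control_value = float(model.predict(live_features)[0])   # e.g. scaled and sent to a synth
print(control_value)
```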
At New York University to work on sensors for music performance – pt. 4: Talk at NYU Steinhardt
Tomorrow I am going to deliver a talk at the NYU Music and Audio Research Laboratory about my research at the Interdisciplinary Centre for Computer Music Research (ICCMR) in Plymouth. Click on the poster below to learn more…
At New York University to work on sensors for music performance – pt. 3: Machine Learning
In the past couple of weeks we used two Myos at the same time to evaluate higher-level movement features such as Symmetry, Contraction and even full-body weight shifting, which worked surprisingly well when combining and comparing the orientation data…
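The excerpt does not define how Symmetry is computed, so the following is only a speculative sketch of one way a simple symmetry index could be derived from the orientation data of two armbands: it compares the two orientation quaternions directly and maps their angular difference to a value between 0 and 1, ignoring the left/right mirroring a real implementation would probably need.

```python
# Speculative sketch only: the post does not define its Symmetry feature. This compares
# the orientation quaternions of two Myo armbands and maps their angular difference to
# a 0..1 value (1 = identical orientation, 0 = opposite).
import numpy as np

def quat_angle(q1: np.ndarray, q2: np.ndarray) -> float:
    """Rotation angle (radians) between two unit quaternions."""
    dot = abs(float(np.dot(q1, q2)))
    return 2.0 * np.arccos(np.clip(dot, 0.0, 1.0))

def symmetry_index(q_left: np.ndarray, q_right: np.ndarray) -> float:
    """1.0 when both armbands report the same orientation, 0.0 when they are opposite."""
    return 1.0 - quat_angle(q_left, q_right) / np.pi

# Example with two made-up unit quaternions in (w, x, y, z) order
q_l = np.array([0.92, 0.05, 0.38, 0.0]); q_l /= np.linalg.norm(q_l)
q_r = np.array([0.95, 0.02, 0.30, 0.0]); q_r /= np.linalg.norm(q_r)
print(symmetry_index(q_l, q_r))
```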