At New York University to work on sensors for music performance – pt. 2: Making sense of IMU Motion Data

Posted on Aug 24, 2015 in NYC

I’m currently in New York and in the past weeks I have designed a set of Max objects that make use of the motion data obtained from 9DoF IMUs for musical purposes. Antonio Camurri and his colleagues at InfoMus – Casa Paganini have made extensive use of various motion descriptors throughout the years. I tried to adapt their concepts to the data obtained from the IMUs. In this paper you can find an interesting overview of some of the techniques they employ for analysing movement expressivity.

Since at the moment I’m mostly using Thalmic Labs’ Myo, I also further developed part of the MuMyo Max patch that Kristian Nymoen, Mari Romarheim Haugen, and Alexander Refsum Jensenius from fourMs (University of Oslo) presented at NIME this year. For example, I added a way to centre the yaw orientation value in Max, as shown in the video below. Easily centring the yaw value is also useful because the Myo’s orientation data is affected by yaw drift. I haven’t experienced a massive amount of drift when using the device, so periodically re-centring seems like an acceptable solution in my case; however, it might be worth implementing algorithms that dynamically compensate for yaw drift, such as Madgwick’s filter.
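The re-centring logic itself is simple. Here is a rough sketch in Python of what the Max patch does (the class and function names are my own, not from MuMyo): store the current raw yaw as a zero reference when the performer triggers a “centre” message, then subtract it from subsequent readings and wrap back into [-180, 180) degrees.

```python
def wrap_degrees(angle):
    """Wrap an angle in degrees to the interval [-180, 180)."""
    return (angle + 180.0) % 360.0 - 180.0

class YawCentring:
    """Re-zero the yaw reading on demand, e.g. when the performer
    faces the audience and triggers a 'centre' message."""

    def __init__(self):
        self.offset = 0.0

    def centre(self, current_yaw):
        # Remember the current raw yaw as the new zero reference.
        self.offset = current_yaw

    def centred(self, raw_yaw):
        # Subtract the stored offset and wrap back into [-180, 180).
        return wrap_degrees(raw_yaw - self.offset)
```

For example, after centring at a raw yaw of 150°, a subsequent reading of 160° becomes 10°. The wrapping step matters: without it, readings on the far side of the circle would jump by hundreds of degrees.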

Andrew (my collaborator here at NYU) is working on a real-time DSP/synthesis engine that we will control through musicians’ movements sensed by the Myo. I look forward to trying it out myself and with other musicians I’ve met here at NYU. I also involved Rodrigo Schramm, with whom I have had the pleasure of working several times before. He has recently completed his brilliant PhD thesis on computational analysis of music-related movements and I’m very happy to collaborate with him again.

I used some simple maths to convert the orientation data to an XY position in a 2D space, which comes in handy when using some sort of XY pad, like Max’s [nodes] object, to control musical parameters. In addition to the orientation, I also mapped two subsets of the EMG data to the size of the nodes, which creates some interesting global effects when the effort in a movement increases.
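The maths boils down to two linear scalings. As an illustrative sketch (not the actual patch; the angle ranges and node-size gains here are arbitrary choices of mine), centred yaw and pitch can be clamped and scaled into a unit square for an XY pad, and the mean absolute amplitude of a subset of EMG channels mapped to a node radius:

```python
def scale_clamped(value, in_min, in_max, out_min=0.0, out_max=1.0):
    """Linearly map value from [in_min, in_max] to [out_min, out_max],
    clamping to the output range."""
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))
    return out_min + t * (out_max - out_min)

def orientation_to_xy(yaw_deg, pitch_deg, yaw_range=60.0, pitch_range=45.0):
    """Map centred yaw/pitch angles onto a unit XY square.

    The +/-60 and +/-45 degree spans are hypothetical values chosen
    for a comfortable arm movement range, not taken from the patch."""
    x = scale_clamped(yaw_deg, -yaw_range, yaw_range)
    y = scale_clamped(pitch_deg, -pitch_range, pitch_range)
    return x, y

def emg_to_node_size(emg_channels, min_size=0.05, max_size=0.5):
    """Map the mean absolute amplitude of a subset of EMG channels
    (assumed normalised to [0, 1]) to a node radius."""
    level = sum(abs(v) for v in emg_channels) / len(emg_channels)
    return scale_clamped(level, 0.0, 1.0, min_size, max_size)
```

With this mapping, a resting pose (yaw and pitch near zero) sits at the centre of the pad, and increased muscular effort grows the nodes, which is what produces the global effects mentioned above.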

I also built a patch dedicated to recording the sensor data in sync with audio sources, which will be very useful for research and analysis. The recorder will also come in handy when using various machine learning techniques to recognise certain movements. I’m particularly interested in recording whole performances and comparing a recording with the real-time data stream during a live performance. To do so I’ll use Baptiste Caramiaux’s Gesture Variation Follower, which is available both for Max and in C++.

Enough with the technicalities for this post, let’s talk about music. I’ve been going to The Stone every week since I arrived here, a unique venue for amazing, mind-blowing, genre-defying music and a constant source of inspiration for what I’m doing. Every Sunday at 3pm, different musicians perform a selection of new compositions by John Zorn called “The Bagatelles”. In small venues such as The Stone it is possible to hear (or should I say “feel”) every single detail of the performance and appreciate the texture of the sound, the presence of the performers, their movements and their interplay. Highly recommended.

I will soon post some more “musical” tests, also involving other musicians!

This project is supported by Santander Universities and it’s a collaboration between Federico Visi, who is currently carrying out his doctoral research at the Interdisciplinary Centre for Computer Music Research (ICCMR), Plymouth University (UK) under the supervision of Prof Eduardo Reck Miranda, and Andrew Telichan Phillips, who is currently carrying out his doctoral research at NYU under the supervision of Dr. Tae Hong Park.

Video: Kineslimina performed at CMMR 2015

Posted on Aug 16, 2015 in Photo/Video, Works

While I’m in New York working on motion sensors for music performance, here is a video of my piece Kineslimina performed last June at the 11th International Symposium on Computer Music Multidisciplinary Research (CMMR) in Plymouth, UK. I’m not 100% happy with the sound quality, and I couldn’t access the raw footage to edit it myself, but the people filming the CMMR performances did a great job nevertheless.

The piece was performed by Esther Coorevits and me, and I can’t stress enough how important Esther’s contribution to this piece was. Her feedback during rehearsals was vital for the development of the piece, and her performance was superlative. The piece will be performed again at the Peninsula Arts Contemporary Music Festival in February 2016.

At New York University to work on sensors for music performance – pt. 1

Posted on Aug 07, 2015 in NYC

For the next few weeks I will be in New York working on a collaborative project with the Music and Audio Research Laboratory (MARL) at NYU Steinhardt School of Music and Performing Arts Practice.

The goal of this project is to develop software tools that harness wearable sensor technologies for body movement research and interactive music performance. In particular, the project will focus on the use of 9 Degrees of Freedom Inertial Measurement Units (9DoF IMUs) coupled with a form of muscle sensing, such as electromyography (EMG) or mechanomyography (MMG).

The project has a twofold purpose. The first is to develop dedicated applications that process the motion data from the sensors in real time, allowing performers to interact with music and with each other through their movements and extending the possibilities of their musical instruments. The second is to provide researchers with a useful tool for studying body motion and collecting sensor data for analysis.

During the first week we focused on obtaining a stable stream of data from the Myo armband. The Myo features a 9DoF IMU which provides three-dimensional acceleration and angular velocity, in addition to orientation data obtained by sensor fusion, in both Euler angle and quaternion formats. Along with the IMU data, the Myo provides 8-channel EMG data, which is a unique feature of the device. I will work with other sensors in the near future, since I don’t want to limit the software I’m working on to the Myo. I’ve already tried other IMUs, and I look forward to working with Marco Donnarumma’s new version of the Xth Sense, which is currently being tested and will soon be available through xth.io. For the moment, however, the Myo provides a good hardware platform for prototyping algorithms and trying out ideas, since it is a fairly well-engineered and compact device.
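Having the orientation in both formats is convenient because the two serve different purposes: quaternions are better for interpolation and avoid gimbal lock, while Euler angles are easier to map to musical parameters. Converting between them is standard maths; here is a sketch of the quaternion-to-Euler direction using the common Z-Y-X (yaw-pitch-roll) convention. Note that the axis conventions of a specific device should always be checked, so treat this as a generic formula rather than the Myo’s exact behaviour:

```python
import math

def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion to (roll, pitch, yaw) in degrees,
    using the Z-Y-X (yaw-pitch-roll) convention."""
    # Roll: rotation about the x-axis.
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # Pitch: rotation about the y-axis; clamp the argument so
    # floating-point noise never pushes asin out of its domain.
    s = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))
    pitch = math.asin(s)
    # Yaw: rotation about the z-axis.
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return tuple(math.degrees(a) for a in (roll, pitch, yaw))
```

As a sanity check, the identity quaternion (1, 0, 0, 0) gives all zeros, and a rotation of 90° about the z-axis gives a yaw of 90° with zero roll and pitch.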

I started working on real-time implementations of movement descriptors traditionally used with optical motion capture systems, such as Quantity of Motion and Smoothness. These descriptors make it possible to extract expressive features from the movement data, which are useful for interactive music applications and movement analysis. The main challenge here is to adapt the ideas behind these descriptors to the data provided by the wearable sensors, which is completely different from the data obtained by optical devices such as the Kinect and marker-based MoCap systems.
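To make the adaptation concrete: optical Quantity of Motion is usually computed from silhouette or marker displacement, which an IMU cannot observe directly. One plausible stand-in, sketched below as an illustration rather than the descriptor from the literature, is a rolling mean of the acceleration magnitude over a short window, which rises with overall movement activity and smooths out momentary spikes:

```python
import math
from collections import deque

class QuantityOfMotion:
    """Rolling estimate of movement activity from 3-axis acceleration.

    An illustrative IMU-based adaptation: a windowed mean of the
    acceleration magnitude, standing in for the displacement-based
    QoM used with optical systems."""

    def __init__(self, window=50):
        # At a 50 Hz sensor rate, a 50-sample window spans one second.
        self.samples = deque(maxlen=window)

    def update(self, ax, ay, az):
        """Feed one acceleration frame; return the current estimate."""
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        self.samples.append(magnitude)
        return sum(self.samples) / len(self.samples)
```

A smoothness descriptor could be approximated along the same lines by differentiating the acceleration to estimate jerk, since smoother movements produce lower jerk; either way, the window length becomes a musically meaningful parameter controlling how quickly the descriptor responds.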

In addition to this rather technical work, I will test the software in actual music performances, collaborating with other musicians. I believe this is a vital part of the research, without which the project might steer too far away from what it is actually all about: music. While in New York, I will also try to take advantage of the inexhaustible supply of live music this city has always had, which is a great source of inspiration for what I’m doing. I’ve already been to a few excellent concerts and performances of various kinds, and observing the behaviour of the performing musicians has already led to some ideas I want to try in the coming days.

I will try to write more posts about our progress if time allows, possibly including videos and pictures. Alright, now back to work.

Read pt.2 here.


Kineslimina performed at CMMR and MuSA

Posted on Jul 21, 2015 in Photo/Video

Kineslimina was performed at the 11th International Symposium on Computer Music Multidisciplinary Research (CMMR) in Plymouth (UK) and at the Sixth International Symposium on Music/Sonic Art in Karlsruhe (Germany).

The performers are Esther Coorevits on viola and motion sensors and yours truly on electric guitar and motion sensors.

Here are some pictures from the June 16th performance at the CMMR gala concert.

Paper presentation at CMMR 2015

Posted on Jun 24, 2015 in Events

Here you can download the paper I presented at CMMR 2015 in collaboration with Esther Coorevits from IPEM, Ghent University, and Rodrigo Schramm from Federal University of Rio Grande do Sul.

It’s titled Instrumental Movements of Neophytes: Analysis of Movement Periodicities, Commonalities and Individualities in Mimed Violin Performance. Here is the abstract:

Body movement and embodied knowledge play an important part in how we express and understand music. The gestures of a musician playing an instrument are part of a shared knowledge that contributes to musical expressivity by building expectations and influencing perception. In this study, we investigate the extent to which the movement vocabulary of violin performance is part of the embodied knowledge of individuals with no experience in playing the instrument. We asked people who cannot play the violin to mime a performance along to an audio excerpt recorded by an expert. They do so using a silent violin, specifically modified to be more accessible to neophytes. Preliminary motion data analyses suggest that, despite the individuality of each performance, there is a certain consistency among participants in terms of overall rhythmic resonance with the music and movement in response to melodic phrasing. Individualities and commonalities are then analysed using Functional Principal Component Analysis.


Kineslimina: a study for guitar, viola and motion sensors

Posted on Jun 08, 2015 in Events

I’m giving the final touches to a piece for viola, guitar, motion sensors and live electronics that I have been working on as part of my PhD research project. It will be premiered during the Gala Concert of the 11th International Symposium on Computer Music Multidisciplinary Research (CMMR) on Tuesday, 16th June 2015. It will be performed by Esther Coorevits and me.

Here’s an excerpt from the programme notes:

Kineslimina is a piece for viola, electric guitar and live electronics that explores the use of the musicians’ instrumental gestures and movements as an expressive medium. Such gestures merge with the other musical features and become an integral part of the score. While playing their instruments, the musicians wear an armband fitted with motion sensors, which tracks their movements and sends the motion data to a computer. The computer then processes the movement data and sound, responding with a wide range of dynamics: from subtle timbral alterations that follow the movements of the bow during string changes to deeper resonances when more overt gestures are performed by the musicians.

Inspired by the studies of musical gestures and embodied music cognition, the piece requires the performers to exceed the usual boundaries of their instrumental gestures, thus creating new challenges as well as new possibilities of expression and interplay.

Motion and Music Workshop at CMMR15

Posted on Jun 02, 2015 in Events

I’m co-organising the Motion and Music Workshop that will take place at Plymouth University, Plymouth, UK, on 15 June 2015. It will be a satellite event of the 11th International Symposium on Computer Music Multidisciplinary Research – CMMR 2015: Music, Mind, and Embodiment, which will be held at Plymouth University on 16–19 June 2015.

More info on the workshop webpage.

 

Unfolding | Clusters presented at the Peninsula Contemporary Music Festival 2015

Posted on Feb 26, 2015 in Events

Unfolding | Clusters will be presented at the Peninsula Arts Contemporary Music Festival this weekend at the Immersive Vision Theatre, Plymouth University.

The Motor Neurone Disease Association (MNDA) will be there to present their work and collect donations (be generous!), and will introduce the piece with me, Duncan Williams and Giovanni Dothel during the presentation on Friday at 7pm.

Here is the full schedule; check also the full festival programme.

Friday 27 February

19:00 (introduction)

19:30 (performance)

Saturday 28 February

17:00 (performance)

Sunday 1 March

14:00 (performance)

Presentation at CIM 14 Conference of Interdisciplinary Musicology, Berlin, DE

Posted on Dec 04, 2014 in Events

This week I’ll be in Berlin attending CIM 14, the 9th Conference of Interdisciplinary Musicology. 
I’ll be presenting a paper on new developments of Unfolding | Clusters, a music and visual media installation about ALS (Amyotrophic Lateral Sclerosis).

Unfolding | Clusters was made in collaboration with Duncan Williams and Giovanni Dothel. It was first presented at the UCLA Art|Sci Center in Los Angeles in June 2014 and will be presented at the Peninsula Arts Contemporary Music Festival in Plymouth in February 2015.

My presentation is scheduled for Saturday 6 December at 14.30 in the Curt-Sachs-Saal at the Staatliches Institut für Musikforschung. The paper can be downloaded here.

https://www.dropbox.com/s/qkovmkclda341gh/cim14_submission_72.pdf?dl=0

Paper presentation at ICMC-SMC Athens, Greece

Posted on Sep 23, 2014 in Events

Our paper “Effects of different bow stroke styles on body movements of a viola player: an exploratory study” was presented at the joint ICMC|SMC|2014 conference in Athens, Greece, on September 18th 2014. Download the full PDF file.

Abstract

This paper describes an exploratory study of different gestures and body movements of a viola player resulting from the variation of bow stroke length and quantity. Within the theoretical framework of embodied music cognition and the study of musical gestures, we aim to observe how the variation of a musical feature within the piece affects the body movements of the performer. Two brief pieces were performed in four different versions, each one with different directions regarding the bow strokes. The performances were recorded using a multimodal recording platform that included audio, video and motion capture data obtained from high-speed tracking of reflective markers placed on the body of the performer and on the instrument. We extracted measurements of quantity of motion and velocity of different parts of the body, the bow and the viola. Results indicate that an increased activity in sound-producing and instrumental gestures does not always resonate proportionally in the rest of the body, and the outcome in terms of ancillary gestures may vary across the upper body and lower body.