Gesture-Controlled Musical Conducting

Kieran Flay


Supervised by Dave Marshall; Moderated by Alia I Abdelmoty

The idea is to use a Kinect and/or Leap Motion gestural input device to devise methods for musical conducting. In essence, this means recognising appropriate bodily gestures and mapping them into MIDI messages that can control a digital audio workstation.
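As a rough illustration of the gesture-to-MIDI mapping, a continuous parameter such as hand height could be scaled into a MIDI control-change message. The sketch below is a minimal assumption-laden example: the normalised coordinate range, the choice of CC 7 (channel volume), and the function name are all illustrative, not part of the project specification.

```python
# Sketch: map a normalised hand height (0.0-1.0) from a gestural input
# device into a raw 3-byte MIDI control-change message.
# Assumptions: height is pre-normalised; CC 7 (channel volume) is the
# target controller. Both choices are illustrative only.

def hand_height_to_cc(height, channel=0, cc=7):
    """Convert a normalised hand height into a 3-byte MIDI CC message."""
    # Clamp to the valid range, then scale into MIDI's 7-bit value range.
    height = min(max(height, 0.0), 1.0)
    value = round(height * 127)
    status = 0xB0 | (channel & 0x0F)  # control-change status byte
    return bytes([status, cc, value])

msg = hand_height_to_cc(0.5)  # hand at half height -> mid-range volume
```

In practice the bytes would be sent to the digital audio workstation through a MIDI output port (e.g. via a library such as mido or rtmidi) rather than constructed by hand, but the mapping logic is the same.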

Machine Learning methods will be used to recognise the gestures.
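One of the simplest machine-learning approaches that could serve as a baseline here is nearest-neighbour matching against recorded gesture templates. The sketch below assumes each gesture is reduced to a fixed-length feature vector (for example, flattened joint positions from the Kinect skeleton); the labels and vectors are made-up placeholders, not data from the project.

```python
# Sketch: a minimal 1-nearest-neighbour gesture recogniser.
# Assumption: each gesture is a fixed-length feature vector, e.g.
# flattened skeleton joint coordinates. Templates below are invented.

import math

def classify(sample, templates):
    """Return the label of the training template closest to `sample`."""
    def dist(a, b):
        # Euclidean distance between two equal-length feature vectors.
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda t: dist(sample, t[1]))[0]

templates = [
    ("downbeat", [0.0, -1.0, 0.0]),
    ("upbeat",   [0.0,  1.0, 0.0]),
    ("cutoff",   [1.0,  0.0, 0.0]),
]

label = classify([0.1, -0.9, 0.0], templates)  # closest to "downbeat"
```

A real system would likely use a time-aware model (e.g. dynamic time warping or an HMM) since conducting gestures are trajectories rather than single poses, but nearest-neighbour matching makes the recognition idea concrete.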

Initial Plan (30/01/2016) [Zip Archive]

Final Report (04/05/2016) [Zip Archive]

Publication Form