My Myo gesture control armband arrived last week, and I’ve been working hard to integrate it into mmi. I’ve made a Max object for it and an mmi controller patch, so it can now be used to control instruments in mmi.
The orientation data seems very good and responsive, although the Myo appears to have no sensor for correcting yaw drift (such as a magnetometer, or an optical system like the Wii remote’s), so this has to be corrected manually when needed. The gesture recognition works quite well for me but still seems a bit erratic, even after creating a custom profile. No doubt this will improve over time with software updates, and it may also get better with more experimentation with arm positioning. Since the EMG data can be streamed directly, it might be useful to try some custom gesture recognition of my own to see what can be done.
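To give a flavour of what a custom recogniser might look like, here is a minimal sketch in Python: it windows the eight EMG channels, reduces each window to per-channel RMS energy, and classifies against recorded gesture templates with a nearest-centroid rule. The training examples and gesture labels are hypothetical stand-ins – this is not the Myo SDK’s own recognition, just a simple baseline to experiment from.

```python
import numpy as np

# The Myo streams 8 EMG channels at 200 Hz; 40 samples is a 200 ms window.
WINDOW = 40

def rms_features(window):
    """Per-channel root-mean-square energy: a simple, robust EMG feature."""
    return np.sqrt(np.mean(np.square(window), axis=0))

class NearestCentroid:
    """Classify an EMG window against per-gesture template centroids."""

    def __init__(self):
        self.centroids = {}  # gesture label -> mean feature vector

    def train(self, label, examples):
        # examples: list of (WINDOW x 8) arrays recorded while holding the gesture
        feats = np.array([rms_features(w) for w in examples])
        self.centroids[label] = feats.mean(axis=0)

    def classify(self, window):
        feats = rms_features(window)
        return min(self.centroids,
                   key=lambda label: np.linalg.norm(feats - self.centroids[label]))
```

In practice the windows would be streamed in from the armband (via the Max object, say) and the output smoothed over several frames before committing to a gesture.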
I’ve uploaded an early exploration with it to my YouTube channel.
We took part in Science and Engineering Day at the University of Southampton on 14th March, with some new demos. Some related to the Hands On Sound project, examining motion capture and sound processing, while others were revised mmi (multi-modal instrument) demos comparing different types of feedback.
mmi: visitors using Novint Falcons for force feedback in physical model synthesis at Science and Engineering Day.
mmi: visitors using a Leap Motion controller to strike a circular plate at Science and Engineering Day.
We had lots of visitors all day, who certainly seemed to enjoy the demos. My thanks to Chris Lucas and Dan Halford for their help on the day, and to Dan for preparing some of the demos as well. It was lots of fun!
Hands on Sound: Chris Lucas helping visitors to Science and Engineering Day with a granular synthesis demo using a Leap Motion controller.
Hands on Sound: Dan Halford discussing sound processing and motion capture with a visitor to Science and Engineering Day.
I’ve uploaded a brief demo – plucking a string (again!) – combining Oculus, Leap and mmi. If you view it fullscreen on an Oculus, the animation should appear in 3D. The objects are not strictly tied to the hand positions yet, but it demonstrates the idea.
My Oculus Rift arrived this week and I’ve been getting it working with mmi and Leap. I’ve updated Graham Wakefield’s oculus object for the DK2 and SDK 0.4, and it is working pretty well on my MacBook Pro with a GeForce GT 650M 1GB. I’m getting around 63 FPS just animating the Leap, using the V2 SDK. I’m looking to optimise further so scenes can remain at 60 FPS (to avoid sea-sickness) when there’s additional scene content.
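As a rough way of checking whether a scene is staying inside the 60 FPS budget, a per-frame timer like the Python sketch below can flag drops; the render callback here is a hypothetical stand-in for whatever actually draws the scene.

```python
import time

BUDGET = 1.0 / 60.0  # seconds available per frame at 60 FPS

def report_drops(render_frame):
    """Wrap a per-frame callback and log any frame that exceeds the budget."""
    last = time.perf_counter()

    def wrapper(*args, **kwargs):
        nonlocal last
        result = render_frame(*args, **kwargs)
        now = time.perf_counter()
        if now - last > BUDGET:
            print("slow frame: %.1f ms" % ((now - last) * 1000))
        last = now
        return result

    return wrapper
```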
I hope to get some movies completed soon, but for now here are just screenshots. If you have an Oculus, fullscreen the image on the Oculus display and it should appear in 3D.
Oculus Rift DK2 with Leap Motion V2, mmi.
mmi: plucked rectangular membrane, as displayed on the Oculus Rift DK2.
A Max binary for the updated Max Oculus object is available here.
My Leap Motion controller (www.leapmotion.com) turned up this week and is a lot of fun. Masayuki Akamatsu has already created a Max object for the device – here – so it was a pretty quick task to try it out in mmi. I’ve uploaded a quick video showing some plucking with the Leap Motion – the frame rates suffer a bit when using the video capture software, but you get the idea.
This basic example only uses the palm positions and orientations to control the instrument – there is clearly a lot more to explore using finger data.
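As a starting point for exploring that finger data, here is a minimal sketch using the Leap Python bindings (assuming the V2 SDK is installed; the mapping suggestion in the comments is my own idea, not part of mmi):

```python
import time
import Leap

def poll_hands(controller):
    """Read one Leap frame and return palm and fingertip positions (in mm)."""
    frame = controller.frame()
    hands = []
    for hand in frame.hands:
        palm = hand.palm_position
        tips = [(f.tip_position.x, f.tip_position.y, f.tip_position.z)
                for f in hand.fingers]
        hands.append({"palm": (palm.x, palm.y, palm.z), "tips": tips})
    return hands

controller = Leap.Controller()
# Poll in a loop; in an instrument this data would be forwarded on (e.g. over
# OSC), perhaps mapping each fingertip's height to a per-finger pluck.
while True:
    for hand in poll_hands(controller):
        print("palm %s, %d fingertips" % (hand["palm"], len(hand["tips"])))
    time.sleep(0.01)
```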
Here is an example using Control with a bowed string. The GUI on the iPad is constructed by sending messages from mmi over OSC, and can then be used to play the instrument in mmi.
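For those curious about the mechanics: Control accepts widget definitions as JSON sent over OSC, so an interface can be pushed to the iPad from the host. A minimal Python sketch of the idea is below, using the python-osc library; the widget fields and the /mmi parameter address are illustrative, and the exact OSC addresses may differ between versions of Control.

```python
import json
from pythonosc.udp_client import SimpleUDPClient

# Address and port of the iPad running Control; adjust for your network.
client = SimpleUDPClient("192.168.0.20", 8080)

# A slider widget described in Control's JSON format (illustrative values).
slider = {
    "name": "bowPressure",
    "type": "Slider",
    "x": 0.1, "y": 0.1, "width": 0.8, "height": 0.2,
    "address": "/mmi/bow/pressure",  # hypothetical mmi parameter address
}
client.send_message("/addWidget", json.dumps(slider))
```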
I’ve tried out a number of OSC apps for iOS with mmi and will be posting some videos of them in action. A couple of interesting ones are Control and Fantastick, both of which allow the OSC host to construct the interface on the iOS device via messages. Indeed with Fantastick, there is no graphical interface other than what is built via messaging.
So here’s a video of using Fantastick with a plucked string, animating the instrument on the iPad so that the performer can pluck the string by dragging the plectrum across the screen. It is a bit like GarageBand instruments on iOS, except here it is a physical model, so the variation in the plucking is potentially more dynamic and “realistic”.
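The pluck itself reduces to a simple crossing test: track the plectrum position coming back from the iPad and trigger an excitation when it crosses the string line, scaling the amplitude by how far past the string the drag went. A minimal Python sketch using python-osc follows; the /fantastick/plectrum address and the amplitude scaling are my own hypothetical choices.

```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

STRING_Y = 0.5  # normalised vertical position of the string on screen
last_y = None

def on_plectrum(address, x, y):
    """Trigger a pluck when the dragged plectrum crosses the string line."""
    global last_y
    if last_y is not None and (last_y - STRING_Y) * (y - STRING_Y) < 0:
        # Crossing detected: use the overshoot past the string as amplitude.
        amplitude = min(abs(y - STRING_Y) * 4.0, 1.0)
        print("pluck at x=%.2f, amplitude=%.2f" % (x, amplitude))
    last_y = y

dispatcher = Dispatcher()
dispatcher.map("/fantastick/plectrum", on_plectrum)  # hypothetical address
BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher).serve_forever()
```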