My Myo gesture control armband arrived last week, and I’ve been working hard to integrate it into mmi. I’ve made a Max object for it and an mmi controller patch, so it can now be used to control instruments in mmi.
The orientation data seems very good and responsive, although the Myo appears to have no sensor for correcting yaw drift (such as a magnetometer, or an optical system like the Wii Remote’s), so this has to be corrected manually when needed. The gesture recognition works quite well for me but still seems a bit erratic, even after creating a custom profile. No doubt this will improve over time with software updates, and it may also get better with more experimentation with arm positioning. Since the EMG data can be streamed directly, it might be useful to try some custom gesture recognition of my own to see what can be done.
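The manual yaw correction amounts to storing an offset and re-zeroing it on demand. A minimal sketch in plain Python (the `recenter` trigger is hypothetical – in practice you might bind it to a keystroke or a gesture, and in mmi this would live in the Max patch rather than Python):

```python
import math

class YawCorrector:
    """Keeps a manual yaw offset so a drifted heading can be re-zeroed."""

    def __init__(self):
        self.offset = 0.0

    def recenter(self, current_yaw):
        # Treat the current heading as "straight ahead" from now on.
        self.offset = current_yaw

    def corrected(self, raw_yaw):
        # Remove the offset and wrap the result back into [-pi, pi).
        y = raw_yaw - self.offset
        return (y + math.pi) % (2.0 * math.pi) - math.pi
```

The wrap step matters because subtracting the offset can push the angle outside the usual range; without it, instruments mapped to yaw would jump at the boundary.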
I’ve uploaded an early exploration with it to my YouTube channel.
We took part in Science and Engineering Day at the University of Southampton on 14th March, with some new demos. Some relate to the Hands On Sound project, examining motion capture and sound processing, and some are revised mmi (multi-modal instrument) demos comparing different types of feedback.
mmi: visitors using Novint Falcons for force feedback in physical model synthesis at Science and Engineering Day
mmi: visitors using a Leap Motion controller to strike a circular plate at Science and Engineering Day
We had lots of visitors all day, who certainly seemed to enjoy the demos. My thanks to Chris Lucas and Dan Halford for their help on the day, and to Dan for also preparing some of the demos. It was lots of fun!
Hands on Sound: Chris Lucas helping visitors to Science and Engineering Day with a granular synthesis demo using a Leap Motion controller.
Hands on Sound: Dan Halford discussing sound processing and motion capture with a visitor to Science and Engineering Day.
I’ve uploaded a brief demo – plucking a string (again!) – combining Oculus, Leap and mmi. If you view it fullscreen on an Oculus, the animation should appear in 3D. The objects are not strictly tied to the hand positions yet, but it demonstrates the idea.
My Oculus Rift arrived this week and I’ve been getting it working with mmi and Leap. I’ve updated Graham Wakefield’s oculus object for the DK2 and SDK 0.4, and it is working pretty well on my MacBook Pro with a GeForce GT650 1GB. I’m getting around 63FPS just animating the Leap, using the V2 SDK. I’m looking to optimise further so scenes can stay at 60FPS (to avoid sea-sickness) when there’s additional scene content.
I hope to get some movies completed soon, but here are just screenshots for now. If you have an Oculus, view the image fullscreen on the Oculus display and it should appear in 3D.
Oculus Rift DK2 with Leap Motion V2, mmi.
mmi: plucked rectangular membrane, as displayed on Oculus Rift DK2
Max binary for the updated Max Oculus object here
My Leap Motion controller (www.leapmotion.com) turned up this week and is a lot of fun. Masayuki Akamatsu has already created a Max object for the device – here – so it was a pretty quick task to try it out in mmi. I’ve uploaded a quick video showing some plucking with the Leap Motion – the frame rate suffers a bit when using the video capture software, but you get the idea.
This basic example only uses the palm positions and orientations to control the instrument – there is clearly a lot more to explore using finger data.
Here is an example using Control with a bowed string. The GUI on the iPad is constructed by sending messages from mmi over OSC, and then can be used to play the instrument in mmi.
I’ve tried out a number of OSC apps for iOS with mmi and will be posting some videos of them in action. A couple of interesting ones are Control and Fantastick, both of which allow the OSC host to construct the interface on the iOS device via messages. Indeed with Fantastick, there is no graphical interface other than what is built via messaging.
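Building the interface via messaging just means sending well-formed OSC packets from the host. As a rough illustration, here is a minimal OSC message encoder in plain Python (stdlib only). The `/control/addWidget` address and its arguments are hypothetical – Control and Fantastick each define their own message schemas, so consult their documentation for the real addresses:

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Pad to a multiple of 4 bytes with NULs, as OSC string encoding requires."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode a simple OSC message supporting int, float and string arguments."""
    msg = osc_pad(address.encode())
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # 32-bit big-endian float
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # 32-bit big-endian int
        else:
            tags += "s"
            payload += osc_pad(str(a).encode())
    return msg + osc_pad(tags.encode()) + payload

# Hypothetical widget-creation message; replace the address, arguments,
# and destination with whatever the app on the iPad actually expects.
packet = osc_message("/control/addWidget", "slider1", 0.0, 1.0)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 8000))  # stand-in for the iPad's IP/port
sock.close()
```

In mmi itself this is handled by Max’s OSC objects rather than hand-rolled encoding, but the packet layout on the wire is the same.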
So here’s a video of Fantastick with a plucked string, animating the instrument on the iPad so that the performer can pluck the string by dragging the plectrum across the iPad screen. It is a bit like the GarageBand instruments on iOS, except that here it is a physical model, so the variation in the plucking is potentially more dynamic and “realistic”.
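mmi uses its own physical models, but to illustrate in general why a physically modelled pluck responds more dynamically than a triggered sample, here is the classic Karplus-Strong plucked string in a few lines of Python: every pluck starts from a fresh noise burst, so no two plucks sound identical, and the decay emerges from the model rather than an envelope:

```python
import random

def karplus_strong(frequency, duration, sample_rate=44100):
    """Minimal Karplus-Strong plucked string: a noise burst circulating
    in a delay line through a two-point averaging (lowpass) filter."""
    n = max(2, int(sample_rate / frequency))             # delay-line length
    buf = [random.uniform(-1.0, 1.0) for _ in range(n)]  # the "pluck"
    out = []
    for i in range(int(duration * sample_rate)):
        s = buf[i % n]
        nxt = buf[(i + 1) % n]
        buf[i % n] = 0.5 * (s + nxt)   # averaging bleeds energy away each pass
        out.append(s)
    return out

samples = karplus_strong(220.0, 1.0)   # one second of a 220 Hz pluck
```

Writing `samples` out as audio (e.g. via the stdlib `wave` module) gives a recognisably string-like tone; the high frequencies die away first, just as on a real string.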
I visited the Microsoft Technology Centre in Reading on Friday with a group of staff and students from Southampton University to have a look at their current interactive technologies – PixelSense (renamed from Surface the day after our visit!), Kinect and Windows 8 running on a Samsung slate. Dave Brown gave us a very interesting talk and helpful demos – a couple of pics below. There’s more info about the visit on the CITE blog.
Dave Brown demoing the PixelSense
MTC Reading Envisioning Room
I was at the Cheltenham Science Festival yesterday in the Discover Zone with the Bringing Research to Life Roadshow. It is a great event with lots of interesting stands from all kinds of organisations and some great speakers – I would have liked to spend more time looking around! We had loads of visitors trying out the demos – from 4-year-olds to 80-year-olds. I was impressed that the Kinect tracked everyone well, from the very smallest kids up to very tall adults. Iyad is running the demos again today, so if you’re attending the Festival, do drop in to the Discover Zone (which has lots of other fun things to try) and try out the music tech demos.
I recently joined the University of Southampton Bringing Research to Life Roadshow – a roving set of demonstrations and activities bringing the University’s research to public audiences across the Southwest. My first event was at the INTECH Science Centre and Planetarium near Winchester, where I demoed mmi with a Kinect and with FaceOSC (which, incidentally, works very well with the FaceTime camera in newer MacBook Pro models), as well as some Wii Remote-controlled audio processing demos (“Sound Explorer” – see screenshot below). The kids seemed to have fun playing – “sick” was one comment on the Kinect interface!
Iyad with the demos at INTECH
The next event is at the Cheltenham Science Festival this week. The music tech demos will be there Thursday 14th/Friday 15th June, but the Roadshow is there the whole week. Many thanks to Iyad Assaf for helping me out with the music demos at these events.
btw. INTECH has a Reactable, which was great fun to play with in the breaks! (And yes, I have mmi working with reacTIVision, as shown on the mmi page.)