My Myo gesture-control armband arrived last week, and I’ve been working to integrate it into mmi. I’ve made a Max object for it and an mmi controller patch, so it can now be used to control instruments in mmi.
The orientation data seems very good and responsive, although the Myo appears to have no sensor for correcting yaw drift (such as a magnetometer, or an optical system like the mmi remote’s), so this has to be corrected manually when needed. The gesture recognition works quite well for me, but still seems a bit erratic, even after creating a custom profile. No doubt this will improve over time with software updates, and it may also get better with more experimentation with arm positioning. Since the EMG data can be streamed directly, it might be useful to try some custom gesture recognition of my own to see what can be done.
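To give an idea of what custom recognition on the raw EMG stream might look like, here is a minimal sketch: mean-absolute-value features per channel over a short window, fed to a nearest-centroid classifier. The 8-channel layout matches the Myo’s EMG array, but the window size, feature choice, and classifier here are purely illustrative assumptions, not anything from the Myo SDK or mmi.

```python
import numpy as np

def emg_features(window):
    """Mean absolute value per channel over one window of EMG samples.

    `window` has shape (n_samples, 8) — the Myo streams 8 EMG channels.
    """
    return np.mean(np.abs(window), axis=0)

class NearestCentroidGestures:
    """Toy gesture recognizer: one feature centroid per labelled gesture."""

    def __init__(self):
        self.centroids = {}

    def train(self, label, windows):
        # Average the feature vectors of the training windows for this gesture.
        feats = np.array([emg_features(w) for w in windows])
        self.centroids[label] = feats.mean(axis=0)

    def classify(self, window):
        # Pick the gesture whose centroid is closest in feature space.
        f = emg_features(window)
        return min(self.centroids,
                   key=lambda lbl: np.linalg.norm(f - self.centroids[lbl]))
```

In practice one would train it on a few windows per gesture recorded from the armband; a heavier-handed approach (proper filtering, more features, a real classifier) would likely be needed to beat the built-in recognition, but this is enough to start experimenting.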
I’ve uploaded an early exploration with it to my YouTube channel.