I visited the Microsoft Technology Centre in Reading on Friday with a group of staff and students from Southampton University to have a look at their current interactive technologies – PixelSense (renamed from Surface the day after our visit!), Kinect and Windows 8 running on a Samsung slate. Dave Brown gave us a very interesting talk and helpful demos – a couple of pics below. There’s more info about the visit on the CITE blog.
Dave Brown demoing the PixelSense
MTC Reading Envisioning Room
I was at the Cheltenham Science Festival yesterday in the Discover Zone with the Bringing Research to Life Roadshow. It is a great event with lots of interesting stands from all kinds of organisations and some great speakers – I would have liked more time to look around! We had loads of visitors trying out the demos – from 4-year-olds to 80-year-olds. I was impressed that the Kinect tracked well, from the very smallest kids right up to very tall adults. Iyad is running the demos again today, so if you’re attending the Festival, do drop in to the Discover Zone (which has lots of other fun things to try) and have a go with the music tech demos.
I recently joined the University of Southampton Bringing Research to Life Roadshow – a roving set of demonstrations and activities bringing the University’s research to public audiences across the Southwest. My first event was at the INTECH Science Centre and Planetarium near Winchester, where I demoed mmi with a Kinect and with FaceOSC (which incidentally works very well with the FaceTime camera in newer MacBook Pro models), as well as some wiimote-controlled audio processing demos (“Sound Explorer” – see screenshot below). The kids seemed to have fun playing – “sick” was one comment on the Kinect interface!
Iyad with the demos at INTECH
The next event is at the Cheltenham Science Festival this week. The music tech demos will be there Thursday 14th/Friday 15th June, but the Roadshow is there the whole week. Many thanks to Iyad Assaf for helping me out with the music demos at these events.
btw. INTECH has a Reactable, which was great fun to play with in the breaks! (and yes, I have mmi working with Reactivision, as shown on the mmi page).
I found some interesting software by Kyle McDonald called FaceOSC (https://github.com/kylemcdonald) which tracks head position and facial expression and outputs OSC messages. Here’s a YouTube clip of a simple attempt at using it to control a single reed model in mmi:
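FaceOSC streams its tracking data as OSC messages over UDP, so any environment that can decode OSC can use it as a controller. As a rough illustration of what’s involved, here is a minimal Python sketch of decoding a float-only OSC message from raw bytes – a simplified parser for illustration only, not a full OSC implementation, and the `/pose/scale` address used in the example is just a plausible placeholder:

```python
import struct

def parse_osc_message(data):
    """Decode a simple OSC message (address + float args) from raw bytes.

    OSC packs the address and type-tag strings null-terminated and padded
    to 4-byte boundaries, followed by big-endian 32-bit float arguments.
    This covers float-only messages of the kind a face tracker might send.
    """
    def read_string(buf, offset):
        end = buf.index(b"\x00", offset)
        s = buf[offset:end].decode("ascii")
        offset = (end + 4) & ~3     # skip null terminator and padding
        return s, offset

    address, offset = read_string(data, 0)
    tags, offset = read_string(data, offset)   # e.g. ",ff"
    args = []
    for tag in tags[1:]:                       # skip the leading ','
        if tag == "f":
            args.append(struct.unpack(">f", data[offset:offset + 4])[0])
            offset += 4
    return address, args

# Example: a hypothetical message carrying a single float
msg = b"/pose/scale\x00" + b",f\x00\x00" + struct.pack(">f", 3.5)
address, args = parse_osc_message(msg)
```

In practice you would bind a UDP socket to whatever port FaceOSC is configured to send on and feed each received datagram through a parser like this (or, more sensibly, use an existing OSC library or Max’s `udpreceive`/OSC objects directly).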
I’ve posted a demo on YouTube using Reactivision to control a plucked string. With a PS3 Eye it’s giving 40 FPS, which is quite responsive, but it should be possible to get higher frame rates than that.
The wiimote video from my last post is also now up on my YouTube channel.
My channel: http://www.youtube.com/rpolfreman
I’ve put a new video on YouTube showing use of Kinect to pluck a string with a number of parameters being controlled by hand positions.
Non-linear mapping has been added to mmi using linear interpolation and table objects in Max.
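For readers who don’t use Max: the idea is that a table holds sampled values of an arbitrary transfer curve, and inputs falling between entries are linearly interpolated. A minimal Python sketch of that lookup (function and variable names are mine, not from mmi):

```python
def table_lookup(table, x):
    """Map x in [0, 1] through a breakpoint table with linear interpolation.

    `table` holds sampled values of the transfer function; inputs between
    entries are interpolated between the two neighbouring values, which is
    roughly what an interpolated table read in Max does.
    """
    if not 0.0 <= x <= 1.0:
        raise ValueError("input must be normalised to [0, 1]")
    pos = x * (len(table) - 1)        # fractional position in the table
    i = int(pos)
    if i >= len(table) - 1:           # x == 1.0 lands exactly on the end
        return table[-1]
    frac = pos - i
    return table[i] + frac * (table[i + 1] - table[i])

# Example: a gentle exponential-style curve sampled at 5 points
curve = [0.0, 0.1, 0.3, 0.6, 1.0]
mid = table_lookup(curve, 0.375)      # interpolates between 0.1 and 0.3
```

The point of doing this with a table rather than a formula is that any curve you can draw becomes a usable mapping, which is handy when tuning how controller gestures feel against a synthesis parameter.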
The mmi wii controller patch has been updated to include Motion Plus (i.e. gyros) when available. This gives the angular velocity and orientation of the wiimote, though the orientation estimate is subject to drift. A basic correction of roll and pitch using accelerometer data has been included, but needs refinement. I’ll post a video demo soon.
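A common way to do this kind of accelerometer-based correction is a complementary filter: integrate the gyro for smooth short-term response, and pull the estimate towards the (noisy but drift-free) gravity direction from the accelerometer. This Python sketch shows the idea for pitch only – it’s a generic illustration of the technique, not the actual correction in the Max patch:

```python
import math

def complementary_pitch(pitch, gyro_rate, ax, ay, az, dt, alpha=0.98):
    """One update step of a complementary filter for pitch (radians).

    gyro_rate:  pitch angular velocity in rad/s (from the gyros)
    ax, ay, az: accelerometer reading in g (gravity only when still)
    The gyro integral is smooth but drifts; the accelerometer estimate
    is noisy but drift-free, so the two are blended with weight alpha.
    """
    gyro_pitch = pitch + gyro_rate * dt                       # integrate gyro
    accel_pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))  # tilt from gravity
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# Stationary wiimote with a small gyro bias of 0.01 rad/s:
# pure integration would drift to 0.1 rad over these 10 seconds,
# but the filter settles near a small bounded offset instead.
pitch = 0.0
for _ in range(1000):
    pitch = complementary_pitch(pitch, gyro_rate=0.01,
                                ax=0.0, ay=0.0, az=1.0, dt=0.01)
```

With alpha = 0.98 the steady-state error from a constant bias b is alpha·b·dt / (1 − alpha), i.e. about 0.005 rad here – bounded, rather than growing without limit as it would with the gyros alone.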
Added a new page about some software I’m developing called MMI (Multi Modal Instrument), with some screenshots and a short YouTube example using a Kinect to control physical modelling synthesis.