Shimi (a.k.a. Travis) is a robotic musical companion. The idea is to take the desktop experience of listening to music and interacting with a playlist (iTunes, Spotify, etc.) and move it into a human-robot interaction, while also leveraging the robotic platform for new music-based interactions. I have developed several applications for Shimi focused on musical query methods, physical gesture and expression, and natural language processing.
1) Query by Natural Language
2) Query by Tapping
3) Shimi Band
This is a piece I wrote for multiple Shimi robots. Sounds are mapped to different positions and gestures. Each Shimi communicates over a wireless network, and their moves are synchronized into a choreography that also allows interesting music to emerge. A rough sketch of the idea follows.
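To illustrate the kind of gesture-to-sound mapping and network synchronization involved, here is a minimal Python sketch. The names (GESTURE_SOUNDS, broadcast_cue, the port number, the sample files) are hypothetical placeholders for the example, not the actual Shimi code.

```python
# Hypothetical sketch: map gestures to sounds and broadcast beat-aligned cues
# so several robots stay in sync. Not the actual Shimi implementation.
import json
import socket
import time

# Illustrative mapping from robot gestures/positions to sound samples.
GESTURE_SOUNDS = {
    "head_nod": "kick.wav",
    "neck_tilt_left": "snare.wav",
    "neck_tilt_right": "hihat.wav",
}

BROADCAST_ADDR = ("255.255.255.255", 9000)  # assumed port for this demo


def broadcast_cue(sock: socket.socket, gesture: str, beat: int) -> None:
    """Send a timestamped cue so every robot triggers the same move on the same beat."""
    msg = {
        "gesture": gesture,
        "sound": GESTURE_SOUNDS[gesture],
        "beat": beat,
        "sent_at": time.time(),
    }
    sock.sendto(json.dumps(msg).encode(), BROADCAST_ADDR)


if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    choreography = ["head_nod", "neck_tilt_left", "head_nod", "neck_tilt_right"]
    for beat, gesture in enumerate(choreography):
        broadcast_cue(sock, gesture, beat)
        time.sleep(0.5)  # 0.5 s per beat = 120 BPM
```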
4) Jamming With Shimi
Here Shimi listens in real time to a live performance and generates different dance styles in response, as sketched below.
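As a rough illustration of how such a listen-and-respond loop might be structured, the sketch below maps crude tempo and loudness features to a dance style. The feature extraction, style names, and thresholds are all assumptions made for the example, not Shimi's actual analysis pipeline.

```python
# Illustrative listen-and-respond loop: compute simple audio features per frame
# and pick a dance style. Placeholder logic only.
import numpy as np


def frame_energy(frame: np.ndarray) -> float:
    """Root-mean-square loudness of one audio frame."""
    return float(np.sqrt(np.mean(frame ** 2)))


def choose_dance_style(tempo_bpm: float, energy: float) -> str:
    """Map tempo and loudness to a dance style (thresholds are illustrative)."""
    if tempo_bpm > 140 and energy > 0.3:
        return "headbang"
    if tempo_bpm > 100:
        return "groove"
    return "sway"


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulated 100 ms frames at 44.1 kHz standing in for a live microphone feed;
    # a real system would also run onset/beat tracking to estimate the tempo.
    for tempo in (90, 120, 160):
        frame = rng.normal(0.0, 0.4, 4410)
        print(tempo, "BPM ->", choose_dance_style(tempo, frame_energy(frame)))
```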
5) Emotion recognition in language