Engineers at Design I/O, a creative studio specialising in immersive interactive installations, have used Google’s Project Soli sensor kit and the machine learning tool Wekinator to detect small finger movements, allowing anyone to play a tiny violin solo.
It works when a user rubs their fingers together over the sensor: Google’s Project Soli, a tiny radar on a chip, senses the motion and plays a violin sound.
If the joke is lost on you, mock sympathy can often be expressed by rubbing your thumb and index finger together, and saying “Can you hear that – it must be the world’s smallest violin”.
Project Soli’s main goal is to make touchless gesture control easily accessible to developers by using a miniature radar.
This could be useful for interacting with ‘imaginary’ controls, for example adjusting a car’s air conditioning by sliding your thumb along the side of your outstretched index finger, as if moving an invisible slider.
While Soli provides the input, Design I/O used Wekinator, an open source machine learning tool, to train its model. The input is whatever the radar detects; the output, in this case, is violin music. Wekinator does the heavy lifting in between: it learns the monitored finger movements the radar detects and matches those chosen gestures to the appropriately sad violin output.
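To give a flavour of the kind of mapping Wekinator learns, here is a minimal sketch of gesture classification by nearest neighbour. The feature vectors, labels, and the two-feature simplification are all hypothetical; a real Soli pipeline emits many radar features per frame and Wekinator offers several model types, not just nearest neighbour.

```python
import math

# Hypothetical training examples: (radar feature vector, gesture label).
# A real Soli frame carries many features; two are used here for brevity.
training = [
    ((0.9, 0.1), "rub"),   # fast, fine motion near the sensor
    ((0.8, 0.2), "rub"),
    ((0.1, 0.7), "idle"),  # little motion detected
    ((0.2, 0.8), "idle"),
]

def classify(features):
    """1-nearest-neighbour: return the label of the closest training example."""
    return min(training, key=lambda ex: math.dist(ex[0], features))[1]

# A frame resembling the rubbing gesture would trigger the violin sound.
print(classify((0.85, 0.15)))  # → "rub"
```

The point of a trained mapping like this is that the developer never hand-codes thresholds for the radar signal; they simply record examples of the gesture and let the model generalise.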
Alongside Wekinator, Design I/O used the Open Sound Control (OSC) protocol to broadcast its inputs and outputs over a network.
Design I/O cited this XKCD cartoon as their inspiration.