In the search for more intuitive ways to control sound synthesis, the hands can serve as the direct source of control.
Hardware + Software
Signal is a free-hand, gesture-based instrument for sound manipulation and performance that uses two networked Leap Motion controllers. It allows direct control over sound parameters with the hands in 3D space.
Skeletal hand data is captured in real time by two Leap Motion controllers. openFrameworks converts the hand data into OSC messages, which can be sent to any application that receives OSC; in this case they go to Wekinator (a machine-learning intermediary), which processes the hand data and sends the results to Max/MSP for sonification.
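As a rough sketch of the first hop in that chain: an OSC 1.0 message is just a null-padded address string, a type-tag string, and big-endian float32 arguments, typically sent over UDP. The `/wek/inputs` address and port 6448 are Wekinator's usual defaults; the feature values and their meaning are made up for illustration.

```python
import socket
import struct

def osc_message(address: str, *floats: float) -> bytes:
    """Encode a minimal OSC 1.0 message carrying float32 arguments."""
    def pad_str(s: str) -> bytes:
        # OSC strings are null-terminated and padded to a multiple of 4 bytes.
        b = s.encode("ascii")
        return b + b"\x00" * (4 - len(b) % 4)

    msg = pad_str(address)
    msg += pad_str("," + "f" * len(floats))   # type tags, e.g. ",fff"
    for f in floats:
        msg += struct.pack(">f", f)            # big-endian float32
    return msg

# Illustrative hand features (e.g. normalised palm x, y, z), not real data.
features = [0.42, 0.13, 0.88]

# Fire one UDP datagram at Wekinator's default input port on localhost.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/wek/inputs", *features), ("127.0.0.1", 6448))
```

In the actual pipeline openFrameworks' ofxOsc addon does this encoding and sending; the sketch just shows what travels over the wire.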
Intuitive Gesture Design
Taking cues from traditional musical interfaces, Signal divides synthesis functions between the hands, giving each hand a distinct role: precise gestures for the dominant hand and expressive gestures for the supporting hand.
Training neural networks, a form of machine learning, establishes non-linear relationships between the gestures and the desired outputs. Delegating these technical details to the algorithm frees the designer to focus on finding appropriate gestural metaphors for specific synthesis processes.
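The kind of learned mapping Wekinator provides can be sketched as a small regression network: a few hand features go in, a continuous synthesis parameter comes out. The weights below are random placeholders; in practice they are fitted from recorded gesture/parameter example pairs, and the feature names are assumptions.

```python
import math
import random

def mlp_forward(x, w1, b1, w2, b2):
    """Forward pass of a one-hidden-layer network with tanh units:
    a minimal stand-in for the regression models Wekinator trains."""
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + b)
              for row, b in zip(w1, b1)]
    return [sum(wi * hi for wi, hi in zip(row, hidden)) + b
            for row, b in zip(w2, b2)]

# Illustrative weights only: 3 inputs -> 4 hidden units -> 1 output.
random.seed(0)
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(4)]
b1 = [0.0] * 4
w2 = [[random.uniform(-1, 1) for _ in range(4)]]
b2 = [0.0]

hand_features = [0.5, 0.2, 0.9]   # e.g. aperture, rotation, depth (assumed)
params = mlp_forward(hand_features, w1, b1, w2, b2)
```

The point is the shape of the mapping, not the numbers: the non-linearity lets one gesture drive a parameter along a curve the designer never has to specify by hand.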
The left hand controls several parameters simultaneously, opening up expressive possibilities. Held in an elevated position, it can perform free-hand gestures that combine multiple movements; several gestures are available.
Once in the active space, the left hand can open and close, separating the fingertips from the thumb. This controls the Grain Birth Rate, the time between the start of one grain and the next: the wider the hand opens, the greater the interval between grains.
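A minimal sketch of that mapping, assuming hand aperture is measured as thumb-to-fingertip distance in millimetres; the ranges are illustrative, not Signal's actual calibration.

```python
def grain_birth_interval(aperture_mm, min_ms=5.0, max_ms=200.0,
                         aperture_max_mm=120.0):
    """Map hand aperture to the interval between grain onsets:
    a wider hand yields a longer interval, i.e. sparser grains."""
    t = max(0.0, min(1.0, aperture_mm / aperture_max_mm))  # clamp to [0, 1]
    return min_ms + t * (max_ms - min_ms)
```

A closed hand (aperture 0) gives the densest grain stream; a fully open hand the sparsest.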
Rotating the forearm controls the Grain Size, the time it takes a grain to play back from beginning to end. Outward rotation decreases the grain size, while inward rotation increases it.
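One way to sketch this, assuming forearm roll arrives as a signed angle in radians with positive taken as inward rotation (an assumption, as is every range below):

```python
import math

def grain_size_ms(roll_rad, base_ms=80.0, span_ms=60.0,
                  max_roll=math.pi / 2):
    """Map forearm roll to grain duration: inward (positive) roll
    lengthens grains, outward (negative) roll shortens them."""
    t = max(-1.0, min(1.0, roll_rad / max_roll))  # clamp to [-1, 1]
    return base_ms + t * span_ms
```

With these placeholder values, a neutral forearm gives 80 ms grains, and a quarter-turn in either direction swings the size between 20 ms and 140 ms.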
Extending the arm forward decreases the Pitch: the deeper the hand reaches into the sensing space, the lower the pitch.
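A granular engine typically lowers pitch by slowing grain playback, so the depth-to-pitch gesture can be sketched as a depth-to-playback-rate curve. The depth range and the one-octave maximum drop are illustrative assumptions.

```python
def pitch_ratio(depth_mm, depth_range_mm=250.0, max_drop_semitones=12.0):
    """Map forward reach to a playback-rate ratio: arm at rest plays
    at original pitch, full extension drops it by max_drop_semitones."""
    t = max(0.0, min(1.0, depth_mm / depth_range_mm))  # clamp to [0, 1]
    return 2.0 ** (-t * max_drop_semitones / 12.0)     # equal-tempered drop
```

A ratio of 1.0 means unchanged pitch; 0.5 is one octave down at full extension.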
Using the right hand, the distance between the thumb and index finger determines the size of the subsample window, while the position of the thumb determines the window's position within the larger sound file. I chose this gesture because the distance between the index finger and thumb is commonly used to represent size, and within the audio community it is common to scrub through recordings on a horizontal timeline using a window much like this one.
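The window gesture can be sketched as two normalised controls over a sample buffer: pinch distance sets the window length, thumb x scrubs its start point. The millimetre ranges and coordinate conventions are assumptions, not Signal's measured values.

```python
def subsample_window(thumb_x_mm, pinch_mm, file_len_samples,
                     x_range=(-150.0, 150.0), pinch_max_mm=100.0):
    """Return (start, length) of the playback window in samples.
    Pinch distance sets the window length; thumb x position scrubs
    the window across the sound file."""
    lo, hi = x_range
    pos = max(0.0, min(1.0, (thumb_x_mm - lo) / (hi - lo)))       # 0..1
    length = int(max(0.0, min(1.0, pinch_mm / pinch_max_mm))
                 * file_len_samples)
    start = int(pos * (file_len_samples - length))  # keep window in bounds
    return start, length
```

Keeping the start offset scaled by the remaining room means the window never runs past the end of the file, however wide the pinch.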