You’ve probably seen ultrasonic sensors used as a simple non-contact method for detecting objects in a robot’s path, but interestingly, they are rarely used as human-machine interface devices. Perhaps that’s because, as project creator Andy Grove puts it, these sensors “aren’t super reliable.” Still, his project, which signals a Raspberry Pi with an array of these sensors, presents some interesting possibilities.
In his demonstration, he uses eight of these sensors hooked up to an Octasonic breakout board, each generating a single bit of data depending on whether that sensor is covered. Together these bits form one byte with a value between 0 and 255, which software on the Pi then samples to determine what action to take, in this case what note or notes to play. Gestures can also be used to change instruments or even shut the Raspberry Pi down.
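The bitmask scheme might look something like the following sketch. This is not Grove's actual code; the note table and function names are illustrative, assuming each sensor maps to one bit of a 0–255 state byte and each set bit triggers a note.

```python
# One hypothetical MIDI note per sensor bit (a C major scale, chosen
# arbitrarily here; Grove's project may use a different mapping).
NOTES = [60, 62, 64, 65, 67, 69, 71, 72]

def notes_for_byte(state: int) -> list[int]:
    """Return the notes whose corresponding sensor bit is set.

    `state` is the combined byte from the eight sensors (0-255),
    where bit i is 1 when sensor i is covered.
    """
    return [NOTES[i] for i in range(8) if state & (1 << i)]
```

Covering sensors 0 and 2 at once, for example, yields the byte `0b00000101` and would sound two notes together, which is how chords fall out of this scheme for free.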
Though Grove did not implement this, and it might be difficult to do well given their inconsistencies, in theory each sensor could produce a range of values rather than a single bit. This would mean more control possibilities for electronic musical instruments, or whatever other device you needed to control. Taking things in a different direction, as commenter “goacego” noted, something similar could also be done with photoresistors, though that wouldn’t allow you to play music in the dark if you so desired.