I’m going to start documenting each Sonic Interactions experiment to mark where I am in the process. Each one of these is merely a rough sketch to build upon and is by no means finished. My first experiment takes data from the accelerometer of a Sense HAT and uses it to change the parameters of a simple synth.
Goal: use an accelerometer to control the frequencies of a synth, and experiment with gestural interfaces for music.
Questions:
How do we tame the wild data coming out of the accelerometer so it can be used musically in a synth? (One common approach is sketched just below.)
How do we use the joystick and middle click to add to the interaction?
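One common way to tame jittery sensor readings is an exponential moving average, i.e. a simple low-pass filter. This is a minimal sketch of the idea, not the experiment’s actual code; the smoothing factor `alpha` is an assumed starting point to tune by ear:

```python
# Minimal sketch: smooth noisy accelerometer readings with an
# exponential moving average (EMA). alpha = 0.2 is a made-up
# starting point; smaller values are smoother but laggier.

def make_smoother(alpha=0.2):
    """Return a function that low-pass filters successive samples."""
    state = {"value": None}

    def smooth(sample):
        if state["value"] is None:
            state["value"] = sample  # initialize on the first reading
        else:
            state["value"] = alpha * sample + (1 - alpha) * state["value"]
        return state["value"]

    return smooth

# Usage: feed each raw reading through its own filter.
smooth_pitch = make_smoother(alpha=0.2)
# smoothed = smooth_pitch(raw_pitch)
```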
Process:
- Write a Python script to retrieve data from the Sense HAT and send it to Pd (a sketch of the sending script appears after this list)
- Use the data from Python in Pd to alter the frequencies of oscillators
- Determine the mapping of data to synth parameters; I started with this:
  - Pitch (x axis) from the accelerometer was mapped to OSC 1 (oscillator 1’s frequency)
  - Roll (y axis) was mapped to OSC 2
  - Yaw (z axis) was mapped to OSC 3
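The actual script lives in the repo linked below; as a rough illustration, here is a minimal sketch of the Python side, assuming the Pd patch listens for UDP with a [netreceive] object. The port 3000 and the message name "orientation" are arbitrary choices for this sketch, not necessarily what the repo uses:

```python
# Minimal sketch (not the repo script): read pitch/roll/yaw from the
# Sense HAT and send them to Pd as FUDI messages over UDP.
# Assumes the patch has a UDP [netreceive] on port 3000; both the
# port and the "orientation" selector are arbitrary assumptions.
import socket
import time

from sense_hat import SenseHat

PD_HOST, PD_PORT = "127.0.0.1", 3000

sense = SenseHat()
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

while True:
    # get_orientation() returns pitch, roll, yaw in degrees (0-360)
    o = sense.get_orientation()
    # FUDI messages are space-separated atoms terminated by ";"
    msg = "orientation {:.2f} {:.2f} {:.2f};\n".format(
        o["pitch"], o["roll"], o["yaw"]
    )
    sock.sendto(msg.encode("ascii"), (PD_HOST, PD_PORT))
    time.sleep(0.05)  # roughly 20 messages per second
```

On the Pd side, the three values can then be split out with, for example, [route orientation] followed by [unpack f f f] before scaling them for each oscillator.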
All the code from this experiment can be found in the Sonic Interactions GitHub project; the Python script is here and the Pd file is here.
Let me know what you’d like to see done with this experiment next.
To make it more musical or more expressive, would you add a finer scale to the sensitivity of the accelerometer data so that you could, for example, play scales more easily? (One way to do that, quantizing tilt to scale degrees, is sketched below.)
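As a hypothetical sketch of that idea: quantize the pitch angle to the nearest degree of a fixed scale, then convert the resulting MIDI note to a frequency for an oscillator. The angle range, the C major scale, and the octave span here are all assumptions for illustration, not anything from the repo:

```python
# Hypothetical sketch: quantize a tilt angle to a C major scale so
# tilting the board steps through discrete notes instead of sweeping
# continuously. All ranges and scale choices are assumptions.

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets within an octave

def angle_to_freq(angle_deg, lo=-45.0, hi=45.0, base_midi=48, octaves=2):
    """Map a tilt angle (degrees) onto a quantized scale frequency."""
    # clamp the angle and normalize it to 0..1
    t = min(max(angle_deg, lo), hi)
    t = (t - lo) / (hi - lo)
    # pick a step in the scale across the requested octave span
    steps = len(C_MAJOR) * octaves
    i = min(int(t * steps), steps - 1)
    midi = base_midi + 12 * (i // len(C_MAJOR)) + C_MAJOR[i % len(C_MAJOR)]
    # standard MIDI-to-frequency conversion (A4 = 440 Hz = MIDI 69)
    return 440.0 * 2 ** ((midi - 69) / 12)

# Example: a level board lands in the middle of the range (~middle C).
print(angle_to_freq(0.0))
```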