The first problem I needed to overcome was how to create sound using Processing. I initially used the Beads library, a software library written in Java for real-time audio, and followed a tutorial on generating sound with it. I started with a basic synth sound in a sketch where moving the mouse increases the gain.
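The core of that first sketch is just a sine oscillator whose gain tracks the mouse position. As a rough illustration of that idea, here is a plain-Java sketch of the same logic with no Processing or Beads dependency (the function names and the 480px sketch height are my own, not from the tutorial):

```java
// Plain-Java sketch of the mouse-controlled synth idea:
// a sine oscillator whose gain is mapped from a mouse y-coordinate.
public class MouseGainSynth {
    static final int SAMPLE_RATE = 44100;

    // Map a mouse y-coordinate to a gain in [0, 1],
    // similar to what Processing's map() would do in the real sketch.
    static double mouseToGain(int mouseY, int height) {
        return Math.max(0.0, Math.min(1.0, (double) mouseY / height));
    }

    // Render one buffer of a sine tone at the given frequency and gain.
    static double[] renderSine(double freq, double gain, int n) {
        double[] out = new double[n];
        for (int i = 0; i < n; i++) {
            out[i] = gain * Math.sin(2 * Math.PI * freq * i / SAMPLE_RATE);
        }
        return out;
    }

    public static void main(String[] args) {
        // Mouse halfway down a hypothetical 480px-tall sketch -> gain 0.5.
        double gain = mouseToGain(240, 480);
        double[] buf = renderSine(440.0, gain, 1024);
        double peak = 0;
        for (double s : buf) peak = Math.max(peak, Math.abs(s));
        System.out.println("gain=" + gain + " peak=" + peak);
    }
}
```

In the actual Beads version, the oscillator and gain would be audio-graph objects and the mapping would run every frame in `draw()`; the arithmetic is the same.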
I then experimented with different effects the library offers. These are examples of a delay and a filter sweep applied to a synth sound.
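A delay effect is conceptually simple: the signal is mixed with a delayed copy of itself, and feeding some of the output back into the delay buffer produces repeating echoes. This is a minimal plain-Java version of that structure (a sketch of the general technique, not the Beads implementation; a filter sweep would instead move a filter's cutoff over time):

```java
// A minimal feedback delay line: output = input + delayed copy,
// with part of the delayed signal fed back for repeating echoes.
public class SimpleDelay {
    private final double[] buffer; // circular buffer holding past samples
    private int pos = 0;
    private final double feedback; // 0..1, how much echo feeds back

    SimpleDelay(int delaySamples, double feedback) {
        this.buffer = new double[delaySamples];
        this.feedback = feedback;
    }

    // Process one sample through the delay.
    double process(double in) {
        double delayed = buffer[pos];
        buffer[pos] = in + delayed * feedback; // write input + feedback
        pos = (pos + 1) % buffer.length;       // advance circular buffer
        return in + delayed;                   // dry signal + echo
    }

    public static void main(String[] args) {
        // Send an impulse through a 4-sample delay with 50% feedback:
        // echoes appear at samples 4, 8, ... at decreasing volume.
        SimpleDelay d = new SimpleDelay(4, 0.5);
        for (int t = 0; t < 12; t++) {
            System.out.println(t + ": " + d.process(t == 0 ? 1.0 : 0.0));
        }
    }
}
```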
I then moved on to importing audio, starting with a simple drum beat that is passed through a filter when the left mouse button is clicked, creating a muffled, quieter version of the original.
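The "muffled" sound comes from low-pass filtering: high frequencies are attenuated while the low end passes through. A one-pole low-pass is the simplest version of this, sketched below in plain Java (the real sketch would run samples from the imported drum loop through a Beads filter object while the mouse is pressed):

```java
// One-pole low-pass filter: smooths the signal sample by sample,
// cutting high frequencies and leaving a muffled, quieter sound.
public class OnePoleLowPass {
    private double state = 0;
    private final double alpha; // 0..1, smaller = darker/more muffled

    OnePoleLowPass(double alpha) { this.alpha = alpha; }

    // Process one sample: move the filter state toward the input.
    double process(double in) {
        state += alpha * (in - state);
        return state;
    }

    public static void main(String[] args) {
        OnePoleLowPass lp = new OnePoleLowPass(0.2);
        // A rapidly alternating (high-frequency) signal comes out much
        // quieter, while a steady (low-frequency) signal passes through.
        for (int i = 0; i < 8; i++) {
            System.out.println(lp.process(i % 2 == 0 ? 1.0 : -1.0));
        }
    }
}
```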
This then moved on to adding drum samples and assigning them to keys. This could be used as an interactive element for music composition as described in my Design Iterations – Concept Idea, which involved using three interactive screens to produce sounds based on hand movements picked up by the camera. To take this example further, I would need to map the kick/snare sound to an area of the screen so that when the camera senses movement there, it triggers the sound.
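Both mappings described above are lookup problems: a key maps to a sample, and a screen region maps to a sample. A rough plain-Java sketch of that structure (the sample file names and three-zone layout are hypothetical placeholders, not from my actual sketch):

```java
import java.util.Map;

// Sketch of the two trigger mappings: keyboard keys to drum samples,
// and screen zones to samples so camera motion at a location
// could trigger the same sounds.
public class TriggerMap {
    // Hypothetical key-to-sample assignment, as in the keyboard version.
    static final Map<Character, String> KEY_SAMPLES = Map.of(
        'a', "kick.wav",
        's', "snare.wav",
        'd', "hat.wav"
    );

    // Divide the screen into equal vertical zones, one sample per zone;
    // motion detected at x would trigger that zone's sample.
    static String sampleForMotion(int x, int width) {
        String[] zones = { "kick.wav", "snare.wav", "hat.wav" };
        int zone = Math.min(zones.length - 1, x * zones.length / width);
        return zones[zone];
    }

    public static void main(String[] args) {
        System.out.println(KEY_SAMPLES.get('a'));       // key trigger
        System.out.println(sampleForMotion(500, 640));  // motion trigger
    }
}
```

In the camera version, `sampleForMotion` would be fed the centroid of detected frame-to-frame movement rather than a mouse or key event.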
I also used the tutorial to create this example of a ball moving around the screen, changing the gain of a synth sound depending on where it moves to. This is similar to the idea I described in Design Iterations – Concept Idea 2, where dots represent people walking past, each following a path and interacting with other dots/people, with sounds representing communication or events happening. For this I would need to solve the problem of how to map the sound to the dot, and then create a random noise generator for the other dots that appear. I would also need to figure out how to make a dot appear based on camera movement.
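The position-to-sound mapping here generalises the mouse-gain example: instead of mouse coordinates, the dot's own coordinates drive the gain, and each extra dot gets its own noise source. A plain-Java sketch of those two pieces, with names and the distance-from-centre mapping as my own illustrative choices:

```java
import java.util.Random;

// Sketch of mapping a moving dot's screen position to a sound level,
// plus a simple random-noise source for the other dots.
public class DotSound {
    // Gain falls off with distance from the screen centre:
    // 1.0 at the centre, 0.0 at the corners.
    static double gainAt(double x, double y, double w, double h) {
        double dx = x - w / 2, dy = y - h / 2;
        double dist = Math.sqrt(dx * dx + dy * dy);
        double maxDist = Math.sqrt(w * w + h * h) / 2;
        return 1.0 - Math.min(1.0, dist / maxDist);
    }

    // One sample of white noise for another dot, scaled by its gain.
    static double noiseSample(Random rng, double gain) {
        return gain * (rng.nextDouble() * 2 - 1);
    }

    public static void main(String[] args) {
        // A dot at the centre of a 640x480 sketch is at full volume;
        // one in the corner is silent.
        System.out.println(gainAt(320, 240, 640, 480));
        System.out.println(gainAt(0, 0, 640, 480));
        Random rng = new Random();
        System.out.println(noiseSample(rng, 0.5));
    }
}
```

Each frame, the sketch would recompute `gainAt` from the dot's updated position and feed it to that dot's voice; dots triggered by camera movement would simply be created with a noise source instead of a synth tone.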