Charlotte Strethill-Smith

Digital Media Design Student

Design Iterations – Finalising Imagery — January 19, 2015


My next stage after the sound experiments was to create what I wanted the sketch to look like. I based the process on a concept idea I noted previously: a ball that makes its own path around the screen every time someone walks past. The effect would be multiple balls making their own paths around the screen to represent the paths we each make, intersecting and interacting with each other.

My first problem to overcome was to create a ball that moved around and stayed inside the screen. I did this using an array to avoid typing the same code repeatedly. After creating this sketch I came across two more problems: making each ball's starting point random, and pairing this sketch with an earlier camera sketch.
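The core of this step can be sketched in plain Java (Processing's host language). The class and method names below are my own illustration, not the original sketch's; the array avoids repeating the same ball code, and each element can be given a random starting position.

```java
// Hedged sketch of the bounce-and-array idea; names are illustrative.
public class Ball {
    double x, y, vx, vy;   // position and velocity
    final double w, h;     // screen bounds

    Ball(double x, double y, double vx, double vy, double w, double h) {
        this.x = x; this.y = y; this.vx = vx; this.vy = vy; this.w = w; this.h = h;
    }

    // One animation frame: move, then reflect velocity at the edges
    // so the ball always stays inside the screen.
    void step() {
        x += vx; y += vy;
        if (x < 0 || x > w) { vx = -vx; x = Math.max(0, Math.min(w, x)); }
        if (y < 0 || y > h) { vy = -vy; y = Math.max(0, Math.min(h, y)); }
    }

    public static void main(String[] args) {
        // The array avoids writing the same ball code repeatedly;
        // each element starts at a random position.
        Ball[] balls = new Ball[10];
        for (int i = 0; i < balls.length; i++)
            balls[i] = new Ball(Math.random() * 640, Math.random() * 480, 2, 3, 640, 480);
        for (int frame = 0; frame < 1000; frame++)
            for (Ball b : balls) b.step();
    }
}
```

In Processing the `step()` logic would live inside `draw()`, with an `ellipse()` call to render each ball.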

[Screenshots of the sketch code]


I have now created this sketch, which produces balls whenever a movement is made in front of the camera, and I have randomised the starting location of each one. However, every time a movement is made, multiple balls are created rather than just one.
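One possible fix for the multiple-balls problem, sketched in plain Java: only spawn a ball on the frame where motion *starts* (a false-to-true edge), not on every frame that contains motion. The names here are my own illustration, not from the sketch.

```java
// Hypothetical edge-detector: spawn one ball per motion event, not per frame.
public class MotionTrigger {
    private boolean wasMoving = false;

    // Returns true only on the first frame of a new movement.
    public boolean shouldSpawn(boolean movingNow) {
        boolean spawn = movingNow && !wasMoving;
        wasMoving = movingNow;
        return spawn;
    }
}
```

Each frame, the camera sketch would compute a `movingNow` flag and create a ball only when `shouldSpawn` returns true.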

I am also unsure whether to incorporate the sound experiments, as they were not entirely successful.

[Screenshots of the sketch]

Processing Sound Experiments — January 5, 2015


The first problem I needed to overcome was how to create sound using Processing. I initially used the Beads library, a software library written in Java for real-time audio, and followed a tutorial on generating sound with it. I started with a basic synth sound and put it in a sketch where you can increase the gain by moving the mouse.
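The mouse-to-gain step likely relies on Processing's `map()` function, which linearly rescales a value from one range to another; re-expressed in plain Java (the exact ranges below are illustrative):

```java
// Processing's map() re-expressed in plain Java: scale mouseY into a 0..1
// range suitable for a gain value. The ranges here are assumed, not the tutorial's.
public class MapDemo {
    // Linearly rescale value from [inLo, inHi] to [outLo, outHi].
    public static double map(double value, double inLo, double inHi,
                             double outLo, double outHi) {
        return outLo + (value - inLo) * (outHi - outLo) / (inHi - inLo);
    }

    public static void main(String[] args) {
        double mouseY = 120, height = 480;
        double gain = map(mouseY, 0, height, 0, 1);
        System.out.println(gain); // 0.25
    }
}
```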

[Screenshot of the gain sketch]

I then experimented with different effects you can use. These are examples of delay and a filter sweep over a synth sound.

[Screenshots of the delay example]

[Screenshots of the filter sweep example]

I then moved on to importing audio, starting with a simple drum beat which is passed through a filter when the left mouse button is clicked, creating a muffled, quieter version of the original.

[Screenshots of the drum filter sketch]

This then moved on to adding drum samples and assigning them to keys. This could be used as an interactive element for music composition, as described in my Design Iterations – Concept Idea. That idea involved using three interactive screens to produce sounds based on hand movements picked up by the camera. To take this example further, I would need to map the kick/snare sounds to areas of the screen, so that when the camera senses movement in an area it triggers the corresponding sound.
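The screen-to-sample mapping described above could work by dividing the screen into vertical zones, one per sample. A minimal sketch of that lookup, assuming three sample names of my own choosing:

```java
// Hypothetical zone lookup: which drum sample does motion at x trigger?
// The sample names and zone layout are assumptions, not the sketch's.
public class DrumZones {
    static final String[] SAMPLES = {"kick", "snare", "hat"};

    // Divide a screen `width` pixels wide into equal vertical zones.
    public static String sampleFor(int x, int width) {
        int zone = Math.min(x * SAMPLES.length / width, SAMPLES.length - 1);
        return SAMPLES[zone];
    }
}
```

When the camera sketch detects motion at some x coordinate, `sampleFor(x, width)` names the sample to play through Beads.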

[Screenshot of the drum sampler sketch]

I also used the tutorial to create this example of a ball moving around the screen and changing the gain of a synth sound depending on where it moves to. This is similar to the idea I described in Design Iterations – Concept Idea 2, where dots represent people who walk past, each following a path that interacts with other dots/people, using sounds to represent communication or events happening. For this I would need to solve the problem of how to map the sound to the dot, and then create a random noise generator for the other dots that appear. I would also need to figure out how to make a dot appear based on camera movement.
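One way to tie a dot's position to a gain value (my own scheme, not necessarily the tutorial's) is to make volume fall off with distance from the centre of the screen:

```java
// Illustrative position-to-gain mapping: full volume at the centre,
// silent at the corners. This scheme is an assumption for the sketch.
public class SpatialGain {
    public static double gainAt(double x, double y, double w, double h) {
        double dx = x - w / 2, dy = y - h / 2;
        double dist = Math.sqrt(dx * dx + dy * dy);
        double maxDist = Math.sqrt(w * w + h * h) / 2; // centre-to-corner distance
        return 1.0 - Math.min(dist / maxDist, 1.0);
    }
}
```

Each frame, the dot's current position would be fed through `gainAt` to set the Beads gain for its synth voice.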

[Screenshots of the moving-ball gain sketch]

Concept Theory — January 2, 2015


My concept ideas look into media theories of interaction. According to John Thompson, who draws on the work of Habermas in developing ideas from the Frankfurt School, there are three distinctive types of interaction: face-to-face interaction; mediated interaction, for instance people talking on the phone; and mediated quasi-interaction, which refers to the social relations created by modern-day mass media. This last type stretches across space and time but does not link individuals directly. Thompson makes the point that all three types intermingle in our lives today.

Functionalist theorists such as Charles Wright (1960) and Denis McQuail (2000) focus on how media integrates and binds societies together. Those with a positive view of the internet see it as a valuable addition to human interaction and argue that it enriches and expands individual people's social networks. Separation and distance from friends and family become more tolerable thanks to internet communication (Giddens, 2009).

One theory I would like to represent in my work is from Howard Rheingold's 'The Virtual Community' (2000). He looks into 'virtual communities', which he defines as "social aggregations that emerge from the Net when enough people carry on…public discussions long enough, with sufficient human feeling, to form webs of personal relationships in cyberspace". He goes on to say that being in this world is a lot like being part of a physical, real world, but in a disembodied form. I want to interpret this in this project by representing us in a disembodied form, while showing the interactions we make in real and virtual life. Some scholars argue that we spend less time interacting in the physical world (Giddens, 2009). However, it could be that we are constantly surrounded by, and taking part in, both virtual and real-life interactions.

In my concept ideas I have incorporated the use of sound. I want to reflect on the ideas of Theodor Adorno (1976) of the Frankfurt School of critical theory. He argues that "musical forms tend to reflect the society within which they exist" (Giddens, 2009): in capitalist society, for instance, musical forms take on predictable structures to offer easy gratification. These forms train people to expect repetition and uniformity.

Design Iterations – Concept Idea 2 — December 29, 2014


I wanted to experiment with creating sound via Processing, and I thought of an idea that also incorporates the concept of demonstrating human interaction. My idea involves the camera registering when someone walks past by taking one pixel from the middle of the screen and comparing its brightness against the previous capture. If it is darker, it means someone has walked past, and a dot will appear representing that person. This dot will move around the screen, making its own path and noise. When another person walks past, another dot will appear and make its own path and a different pitch of noise. If the dots collide, they bounce off each other, and the impact makes another noise. This project represents how we are constantly surrounded by people every day: the interactions we have can be minimal, and yet our paths can cross at any time. It could also be a comment on how our paths can always cross but never interact.
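The detection step can be sketched in plain Java: compute the brightness of the centre pixel and compare it with the previous frame's value; a sufficiently large drop means someone walked past. The threshold and brightness formula below are assumptions of mine for illustration.

```java
// Hedged sketch of the walk-past detector. THRESHOLD is an assumed tuning value.
public class PassDetector {
    private double prev = -1;             // brightness of the previous frame
    static final double THRESHOLD = 20;   // how big a drop counts as a person

    // Brightness of a packed 0xRRGGBB pixel, as a simple RGB average.
    public static double brightness(int rgb) {
        return (((rgb >> 16) & 0xFF) + ((rgb >> 8) & 0xFF) + (rgb & 0xFF)) / 3.0;
    }

    // Call once per frame with the centre pixel of the camera image.
    public boolean personPassed(int centrePixel) {
        double b = brightness(centrePixel);
        boolean passed = prev >= 0 && (prev - b) > THRESHOLD;
        prev = b;
        return passed;
    }
}
```

In a Processing sketch the centre pixel would come from `video.pixels[...]` after reading a camera frame.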


Processing – Week 8 – Reading and Mapping Values — December 28, 2014


For this Processing workshop, we started by looking at converting a colour value to a height location on screen. We created the function testColor to map mouse coordinates to the image we had loaded. We did this by creating x and y integers and mapping the mouse position to the image's width and height. This returns the image's pixel depending on where the mouse is. In this example it creates rectangles which are the colour of whichever pixel the mouse is on.
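The coordinate mapping above can be sketched in plain Java: scale the mouse position (in window space) into an index of the image's row-major pixel array. The names here are my own, not the workshop's.

```java
// Illustrative mouse-to-pixel lookup, mirroring the testColor idea.
public class PixelLookup {
    // Index into a row-major pixels[] array of an imgW x imgH image,
    // for a mouse at (mx, my) in a winW x winH window.
    public static int pixelIndex(int mx, int my, int winW, int winH,
                                 int imgW, int imgH) {
        int ix = mx * imgW / winW;   // scale x into image space
        int iy = my * imgH / winH;   // scale y into image space
        return iy * imgW + ix;
    }
}
```

In Processing, the returned index would be used as `img.pixels[index]` to read the colour under the mouse.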

[Screenshot of the testColor sketch]


The next step was colour-to-height reading of multiple values. In this example we used arrays to store our colour values. We then created three custom methods: one to draw all the blocks to screen, one to build an array of colours, and one to map mouse coordinates to image coordinates to read a colour value.

[Screenshots of the colour blocks sketch]

Design Iterations – Concept Idea — December 18, 2014


I recently downloaded an app called Propellerhead Figure. This is a simple app with three main screens into which the user can input Drums, Bass and Synth by dragging a finger to create different sounds and volumes. These are all recorded together to produce looped music shorts which can be downloaded and/or exported to different programs.


I really liked the main concept of this idea and thought about how to execute it in a larger space. By creating sketches I could use three different screens. These would use the camera input to track hand movements and create different sounds based on the hand's coordinates on the screen. One screen would be drums, one synth bass and one synth lead.


This interactive experiment would attempt to engage users in the area to interact with the screen, and thus each other, by influencing the sounds and communicating with one another using sound as another language. This would be a comment on how little we engage with the people around us unless it is through a screen, with a sense of anonymity. The users never have to look at each other and yet are communicating with one another through sounds and body movement. It could also look at how we can meet and interact with people on a global level, even if we cannot understand their language, through forums, games and other social mediums. This may also be a good observation of body language: some may be shy and timid with their body language and create only a few sounds, whereas others may express themselves more fluidly and confidently and play around with the experiment more.

Processing – Week 7 – Pixels Falling — November 22, 2014


This week we converted an image's pixels into particles, so that when any key on the keyboard is pressed, the image explodes into particles and falls down.

[Screenshots of the falling-pixels sketch]

We did this by creating an ArrayList, which is a store that holds all the objects we add to it. By creating a separate class we can use that functionality multiple times without having to type it all out repeatedly, which is useful for things like particles. When the particles are added to a raw ArrayList they are stored as plain Objects, so we use the class name in parentheses (a cast) to get back the object's data with its proper type. When the mouse button is pressed we call update on the particle system to reset the image, separate the image into pixels, and draw each particle to screen using p.draw.
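A minimal sketch of this particle store in plain Java (my own names, and using a generic `ArrayList<Particle>`, which keeps the type so no cast is needed):

```java
import java.util.ArrayList;

// Hedged sketch of the particle system: one Particle per pixel,
// all held in an ArrayList and advanced each frame.
public class ParticleSystem {
    static class Particle {
        double x, y, vy;
        Particle(double x, double y) { this.x = x; this.y = y; this.vy = 0; }
        // Gravity accelerates the particle downward each frame.
        void fall(double gravity) { vy += gravity; y += vy; }
    }

    final ArrayList<Particle> particles = new ArrayList<>();

    // One particle is created per image pixel position.
    void addFromPixel(double x, double y) { particles.add(new Particle(x, y)); }

    // Advance every particle one frame.
    void update(double gravity) {
        for (Particle p : particles) p.fall(gravity);
    }
}
```

With a raw (non-generic) ArrayList, as in older Processing examples, each element would come back as `Object` and need the `(Particle)` cast the workshop described.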

[Screenshot of the particle class code]

[Screenshots of the sketch code]

Science Museum London Visit — November 21, 2014



Our course went on a visit to the Science Museum in London. There was a floor full of interactive exhibits called Who am I?, exploring every part of the human body and mind. A few of the exhibits related to my current unit of creating an interactive feature at uni. One used a camera to map the movements of the people in front of it and project them on a screen, built up as a large group of balls. These would then float down across a long strip underfoot whilst questions came up to make the user think.



The largest installation was a big table in the middle of the room which encouraged users to play short games about their favourite food, what languages they could speak and other aspects. The results would then be projected and compared on a large screen on the wall.


The surrounding pods all held games which would test things such as whether you were more of a creative or a logical thinker, and then compare you to statistics.


In the main entrance there was a huge display with blocks of light appearing as if they were being affected by gravity and the force of the other light blocks. This directly relates to how we recently used Processing to create different forces, such as gravity and friction, to affect a dot on the screen.


Another interesting exhibition was the 3D printing space, which showcased a variety of 3D-printed objects as well as how the process works. I also looked at an exhibit which showed a variety of 3D shapes and patterns, which gave me a good idea of the kind of result I want to produce visually for the interactive project.


[Photos from the visit]

Processing – Week 6 – Velocity Ball — November 15, 2014


This week we started off by learning about vectors. We did this by inserting vectors manually and seeing their individual components, rather than just calling the PVector function.

[Screenshot of the vector sketch]

We then moved on to producing a visual representation using what we had learnt about vectors in the first half of the workshop. We created three forces of physics: gravity, friction, and a force pushing the ball based on the mouse's position relative to the centre.
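The three forces can be sketched in plain Java: per frame, add gravity to the velocity, damp the velocity by a friction factor, and add a force along the direction from the ball to the mouse. The names and force model below are my own illustration of the idea, not the workshop's exact code.

```java
// Hedged sketch of the velocity-ball forces; parameters are illustrative.
public class ForceBall {
    double x, y, vx, vy;

    void applyForces(double gravity, double friction,
                     double mx, double my, double strength) {
        // Gravity pulls the ball down.
        vy += gravity;
        // Friction damps the velocity (friction < 1 slows the ball).
        vx *= friction; vy *= friction;
        // A force of fixed strength along the direction toward the mouse.
        double dx = mx - x, dy = my - y;
        double d = Math.sqrt(dx * dx + dy * dy);
        if (d > 0) { vx += strength * dx / d; vy += strength * dy / d; }
        x += vx; y += vy;
    }
}
```

In Processing this arithmetic is what `PVector.add()` and `PVector.mult()` do under the hood when accumulating forces each frame.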

[Screenshots of the velocity ball sketch]

This resulted in the user being able to push the ball around the screen using these forces.


Processing Experiment 2 — November 10, 2014