Charlotte Strethill-Smith

Digital Media Design Student

Graduate Project Exhibition — June 28, 2016

An interactive installation using projection mapping and visual coding to stimulate discussion about urban renewal and design in Southampton. The audience moves their hands above a city model of various areas of Southampton to trigger animated projections onto the model. The projected images invite reflection and discussion about areas of the city that could benefit from investment and renewal, showcasing ideas and scenarios put forward by the community as a form of visual, interactive community appraisal. This project stemmed from personal experiences of communal spaces and frustration with some aspects of urban design.


Graduate Project Trailer for the Exhibition — June 7, 2016

As part of our graduate project we are required to create a 10-second trailer to convey the project's concept. Fitting the concept into 10 seconds was challenging, so I formed three short sentences describing the concept, with an informative animation playing behind them. I chose to animate the trailer in After Effects as it fit the design theme, and it would have been difficult to shoot and edit live footage in time while setting up the exhibition next week.

Finishing the Graduate Project — May 31, 2016

The MaxPatch for the graduate project is now finished. I added a counter to the patch, triggered by the movement bang. The counter counts the frames of the movie file input and stops the video once it reaches a set number, timed to land on a blank frame after a given animation has played. I had previously tried loop points, and a counter with a clocker: the loop points did not work when I duplicated them, and the counter with a clocker needed to be triggered twice before resetting. I also changed the time message to frame, which works on all videos inputted. Finally, I adjusted the animation video to fully cover the middle model tile.
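The stop-on-a-blank-frame behaviour can be sketched outside Max. Below is a minimal Python sketch (not the actual MaxPatch): a movement bang jumps playback to a clip's start frame, and a counter halts the video once it reaches a target blank frame. The class and frame numbers are illustrative assumptions, not taken from the patch.

```python
class FramePlayer:
    """Toy model of the Max counter: a movement bang starts playback,
    and the video halts once the counter reaches a target (blank) frame."""

    def __init__(self, stop_frame):
        self.stop_frame = stop_frame  # blank frame just after the animation
        self.frame = 0
        self.playing = False

    def movement_bang(self, start_frame=0):
        # Jump to the clip's start frame (like the 'frame' message in Max).
        self.frame = start_frame
        self.playing = True

    def tick(self):
        # Called once per video frame; advance until the stop frame.
        if self.playing:
            self.frame += 1
            if self.frame >= self.stop_frame:
                self.playing = False  # pause on the blank frame

player = FramePlayer(stop_frame=720)
player.movement_bang(start_frame=700)
for _ in range(40):
    player.tick()
print(player.frame, player.playing)  # halts at 720, stopped
```

Because the counter freezes playback on a blank frame, the projection simply goes dark until the next bang arrives, which matches the behaviour described above.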

I have been in contact with Culture Southampton and intend to set up a meeting with them in the next week or so, both to gain advice on current community views on developing city areas and to invite them to my graduate project exhibition.

Graduate Project Test — May 28, 2016

I have now tested all parts of the graduate project and fixed any code issues that arose. This included changing the MaxPatch message from Time 720 to Frame 720, as the time message only jumped to the correct point on some videos. I also need to extend the dark animation area of the train station model to cover the entire space. However, the test ran smoothly, without my computer lagging and without the animations themselves triggering the movement detection. I now feel ready to present the project at the exhibition.

Background Animations — May 20, 2016

I added simple animations on top of the model backgrounds. These will attract the audience towards the models when they are not in use. I used the train image I created previously and changed its style to match the backgrounds. I also increased the time before the animation resets, to avoid a constantly moving train. Finally, I added the train passing through another model tile, tying the models together and conveying that the tiles are located in different areas of the same city.


Background Design Progress — May 19, 2016

I was unhappy with the previous background design and changed the style to resemble an architectural site plan. I am now more satisfied with the overall look, as it is presented professionally and portrays the project concept accurately. However, the projections are still slightly pixelated, so I will check whether saving in a different file format resolves this.

Design Progress — May 17, 2016

I created three background images for the models in Illustrator. I created separate PNG files for the buildings with 3D models in order to map the images onto the models more accurately. However, I was still unsure about the design, so I researched architectural site plan design and incorporated it into one of the backgrounds. I prefer this style to the previous one, as outlining the areas and shading the buildings creates a more professional architectural look. This aesthetic suits the project's theme of merging design ideas with community appraisal. I also intend to add small background animations to create a less static look while the models are not being interacted with.


MaxPatch Progress —

I have now created a basic working patch for the interactive element. This will be used in my graduate project as the interactive link between the audience and the city models. I take a live camera feed and split it into three hotspots in which to detect movement. Each hotspot now has its own individual threshold slider to calibrate its sensitivity. When movement is detected, a bang is sent out, which sets the playback time of a video to one of several points depending on which hotspot was triggered. The video is then output via Syphon to MadMapper and projected onto the models.
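The hotspot logic amounts to frame differencing per zone: split each frame into three regions, compare against the previous frame, and fire a "bang" when a zone's change exceeds its own threshold. A rough Python sketch of that idea (the real patch does this with Max objects; the function names and pixel values here are illustrative):

```python
def split_zones(frame, n=3):
    """Split a 1-D list of grayscale pixel values into n equal hotspots."""
    width = len(frame) // n
    return [frame[i * width:(i + 1) * width] for i in range(n)]

def detect_bangs(prev, curr, thresholds):
    """Return one bool per hotspot: True (a 'bang') when the summed
    pixel change in that zone exceeds its individual threshold."""
    bangs = []
    for zone_prev, zone_curr, thresh in zip(
            split_zones(prev), split_zones(curr), thresholds):
        change = sum(abs(a - b) for a, b in zip(zone_prev, zone_curr))
        bangs.append(change > thresh)
    return bangs

prev = [0] * 9
curr = [0, 0, 0, 50, 60, 0, 0, 0, 0]  # movement in the middle zone only
print(detect_bangs(prev, curr, thresholds=[30, 30, 30]))  # [False, True, False]
```

The per-zone `thresholds` list plays the same role as the individual sensitivity sliders in the patch: each model tile can be calibrated separately for its lighting conditions.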

This patch is still a little glitchy, and I need to add a function to stop the video after a clip has played. I intend to look into the loop points function to resolve this.

Using Camera zones to detect movement in Max7 — April 23, 2016

Previously, the patch I was working with used a live video feed to detect movement; when motion was detected, it output a video to MadMapper. I based it on an example from a forum post on motion detection, then used the Syphon add-on and its example patcher to output the video to MadMapper.

After creating this, I needed to separate the live camera feed into three zones that detect movement individually. I used a Camera and Zones example, which filters the video feed with a threshold, making it black and white and therefore easier to detect movement in. This feed was then divided into three camera zones, or hotspots. The result is the ability to detect and isolate interaction with each of the three models.
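The thresholding step the Camera and Zones example performs can be sketched in a few lines: each grayscale pixel is forced to pure black or pure white depending on whether it exceeds the threshold, so frame-to-frame changes become unambiguous. A minimal sketch with illustrative names and values:

```python
def binarize(frame, threshold=128):
    """Force each grayscale pixel (0-255) to 0 (black) or 255 (white),
    mirroring the threshold filter in the Camera and Zones example."""
    return [255 if px > threshold else 0 for px in frame]

print(binarize([10, 130, 200, 90], threshold=128))  # [0, 255, 255, 0]
```

After binarization, any pixel that flips between black and white across frames is a clear movement signal, which is what makes the per-zone detection reliable.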

This Week's Progress — April 22, 2016

Interactive Function

I asked my lecturers for help with the interactive function of my project. I had originally found a MaxMSP patch, video triggers video by Zach Poff. This patch detects motion via a live video feed, which triggers and outputs a video, and it allowed users to draw specific areas on the live feed in which to detect movement, instead of using the entire frame. My lecturers pointed out that, as it had been made with a previous version of Max, it no longer functioned correctly, and it would be easier to build the patch from scratch.

They advised that the easiest way to create the patch would be to start from a motion detection example and use it to output a video to MadMapper via Syphon; then put all the animations I want to project into one video at different points; then track movement at specific coordinates, moving the timehead of the video to the point corresponding to whichever model has been interacted with. So far I have used the motion tracking example and output the triggered video to MadMapper via Syphon.
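The "one video, several timehead points" approach my lecturers suggested boils down to a lookup: each hotspot maps to the frame where its animation starts in the combined video. A tiny Python sketch of that idea; the hotspot indices and frame offsets below are hypothetical, not the project's actual values:

```python
# Hypothetical start frames for each model's animation
# within the single combined video.
CLIP_START = {0: 0, 1: 360, 2: 720}

def on_bang(hotspot):
    """Return the frame to jump the video's timehead to
    when the given hotspot detects movement."""
    return CLIP_START[hotspot]

print(on_bang(1))  # 360
```

Keeping all animations in one video means only one movie file and one Syphon output need to be managed, with the bang simply repositioning the timehead.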


I have also continued testing the design aspect of my project. I isolated the buildings from the aerial-view image and mapped them onto the model buildings I made, then placed the rest of the image underneath. The aim was precisely mapped images; however, the original buildings are still visible underneath.


I still wanted to see whether the contrasting styles of realistic and infographic projections worked together. I created simple vector images in Illustrator and animated them in After Effects. After projecting the images onto the model, I am not entirely happy with the outcome: the styles clash, and the real map image overlaps the models in some areas. To develop the design I will research further ways of achieving the desired look. I also intend to draw one of the models entirely in vector graphics to see the overall effect.