Charlotte Strethill-Smith

Digital Media Design Student

Model Making — March 29, 2016


To start model making I looked into areas around Southampton with large public spaces that were not in use or could be further developed. These spaces included:

  • A small green space opposite the train station, used by pedestrians as a walkway. This could be developed with lighting or a brightly lit artwork for safer passage at night.
  • Two unused lots next to the northern bridge. One is where the old ITV building used to stand.
  • An unused lot next to the Paris Smith Building.
  • An unused lot behind Debenhams, where an old shopping centre used to be.
  • The large concrete space outside the Guildhall, which could be put to further use, for example as a space for pop-up exhibitions.

 

After picking the areas to build, I printed the aerial views onto grid paper and cut out the building shapes. These were then scanned into the computer and enlarged three times to reach the actual scale of the models. The grid gave the measurements needed to cut the shapes out of styrofoam. This was done by heating a metal knife over a flame, which cut through the foam easily and left a straight edge. The models were then sanded down to remove rough surfaces and to carve extra detail, such as roof edges, into the buildings.

The next step will be to test how well the models take projection, and eventually to complete at least four models that will sit on individual podiums.

 

Visual Coding Progress! — March 22, 2016


This week I looked into various Max 7 tutorials and patcher help files, including the cv.jit object guide. The guide shows various uses for computer vision in Max, ranging from pattern recognition and pixel tracking to blob detection. Looking through tutorials on interactive Max projects, I found a patcher created by a user (YANIKI) in reply to a post asking how to trigger zones using a camera. This patcher fires a trigger when movement is detected in a defined area of the camera image.

I linked this to a previous patcher which uses Syphon to send video frames to MadMapper and project an image. This creates the basic interactive element of my project, where hand movement picked up by a camera triggers a video to be projected. However, three issues have presented themselves with this patch:

  • Unless there is constant movement in the camera zone, the video pauses and resets itself.
  • MadMapper connects to the patch and finds the video, but will not play it.
  • Max 7 cannot find the external PS2 EyeToy, so the patch currently works only with the internal camera.

In order to resolve these problems I will look into jitter tutorials on video and other relevant help articles. I will also look into buying an external webcam compatible with Max 7.
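The zone-trigger idea in YANIKI's patcher can be approximated outside Max as frame differencing over a region: if pixels inside the zone change enough between frames, movement is assumed. This is a minimal Python sketch of that idea, not the actual patcher; the frame size, zone coordinates and threshold are invented for illustration.

```python
def zone_triggered(prev_frame, frame, zone, threshold=25.0):
    """Return True if the average pixel change inside `zone` exceeds `threshold`.

    Frames are greyscale images as lists of rows (values 0-255);
    `zone` is (top, bottom, left, right) in pixel coordinates.
    """
    top, bottom, left, right = zone
    total = 0
    count = (bottom - top) * (right - left)
    for y in range(top, bottom):
        for x in range(left, right):
            total += abs(frame[y][x] - prev_frame[y][x])
    return total / count > threshold

# Two synthetic 120x160 frames: a bright "hand" appears inside the zone.
prev = [[0] * 160 for _ in range(120)]
curr = [row[:] for row in prev]
for y in range(40, 80):
    for x in range(60, 100):
        curr[y][x] = 255          # movement inside the zone

zone = (30, 90, 50, 110)
print(zone_triggered(prev, curr, zone))   # movement detected
print(zone_triggered(prev, prev, zone))   # no movement
```

A patch like this would also explain the first issue above: once the hand stops moving, consecutive frames match again and the trigger drops out.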

 


Interactivity Progress! — March 15, 2016


MaxMSP is a visual programming language that helps you build complex, interactive programs. It is especially useful for building audio, MIDI, video, and graphics applications where user interaction is needed. I followed a few of the tutorials provided to grasp the basics of the program, then created a patch which takes video input (either live or pre-recorded), extracts the RGB colour data from each frame, and converts it into MIDI notes. The sound can then be adjusted accordingly.
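The colour-to-note step can be sketched outside Max as a simple scaling from an 8-bit colour value to MIDI's 0-127 note range. This mapping is hypothetical, chosen for illustration rather than taken from the patch:

```python
def rgb_to_midi(r, g, b):
    """Map an average frame colour to a MIDI note number (0-127).

    Illustrative mapping: the brightness (mean of the three 0-255
    channels) is scaled into MIDI's 0-127 range.
    """
    brightness = (r + g + b) / 3          # 0..255
    return round(brightness * 127 / 255)  # 0..127

print(rgb_to_midi(0, 0, 0))        # black -> note 0
print(rgb_to_midi(255, 255, 255))  # white -> note 127
print(rgb_to_midi(255, 0, 0))      # red   -> note 42
```

A real patch might instead map each channel to a separate parameter (pitch, velocity, duration), but the principle is the same scaling.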


I furthered this by connecting Max 7 to MadMapper (software used for projection mapping) via a Syphon package. This means I can edit live or pre-recorded video in Max and have the output updated live in MadMapper.

 

I also used a TUIO patch. TUIO is an open framework that allows the transmission of an abstract description of interactive surfaces, including touch events and tangible object states. With this patch I connected Max 7 to CCV (an open-source, cross-platform solution for blob tracking with computer vision). In theory, this means I can use CCV to track movement via a camera and turn that movement into blobs; the blobs can then be sent to Max 7, which responds by projecting video on top of them. This output can then be passed through MadMapper and projected onto the city model I am building.
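One small piece of this pipeline is converting the blob positions CCV reports (normalised to the 0..1 range, as TUIO sends them) into pixel coordinates in the projector's output. A minimal sketch, with the 1280x800 resolution assumed for illustration:

```python
def blob_to_pixels(nx, ny, width, height):
    """Convert a normalised blob position (0..1 in each axis) to pixel
    coordinates within the projector's output resolution."""
    return (round(nx * (width - 1)), round(ny * (height - 1)))

# A blob at the centre of the tracked surface, projected at 1280x800:
print(blob_to_pixels(0.5, 0.5, 1280, 800))  # -> (640, 400)
print(blob_to_pixels(0.0, 0.0, 1280, 800))  # -> (0, 0)
```

In practice MadMapper's mapping onto the physical model adds a further warp on top of this, but the normalised-to-pixel step stays the same.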


 

Group Meeting — March 10, 2016


This week I had a group meeting with my community of practice supervisor and group. Each group member presented their project concept and progress so far, and feedback was then given by the group and supervisor. I find this method helpful for gaining experience in presenting and pitching my ideas, as well as for gathering useful feedback to progress my project further.

The main points brought up by my group and supervisor were questions on:

  • Whether I am keeping the city model specific to Southampton or making it a generic city
  • Whether I am going to collaborate with any architecture students or residents of Southampton
  • Whether I am going to contact Southampton Council or the art gallery about my project

The suggestion of using a map instead of a city model was brought up; the map could be used to colour the various sections of Southampton, and a separate projector screen could show a gallery of design ideas for the local area drawn by residents. From this meeting I have decided to contact Southampton Council and local art galleries to pitch my project and ask if they would contribute.

Isadora, CCV and Quartz Composer Workshop — March 9, 2016


On Tuesday I had my second projection mapping workshop, looking at programs that interact with projection mapping software. We looked at Community Core Vision (CCV), a simple open-source interface for advanced visual tracking. CCV takes a video stream as input and outputs tracking data (e.g. coordinates and blob size) and events (e.g. finger down, moved and released) that are used in building multi-touch applications. This may be the answer to the interactivity problem of my project.

This shows the basic functionality of CCV, with the projector and camera being calibrated so that CCV performs the right blob tracking. In this example the sensitivity settings are too high, so CCV is detecting movement that is not there.
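The calibration problem comes down to choosing a detection threshold: set it too low (too sensitive) and ordinary camera noise registers as movement. A small sketch with invented noise values, just to show the effect:

```python
import random

random.seed(1)

# Simulated per-pixel frame differences from pure camera noise (0-10 range).
noise = [random.uniform(0, 10) for _ in range(1000)]

def movement_detected(diff_values, threshold):
    """Flag movement when the mean frame difference exceeds the threshold."""
    return sum(diff_values) / len(diff_values) > threshold

print(movement_detected(noise, 2))   # too sensitive: noise alone triggers
print(movement_detected(noise, 20))  # higher threshold ignores the noise
```

Calibrating CCV is essentially this trade-off: high enough to ignore sensor noise, low enough to still catch a hand.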

 

We were then shown Quartz Composer, which is similar to Isadora but free and less powerful. Both of these programs can feed into other programs and implement many features, from simply playing a video to creating a unique interactive system that responds to a live performer.

TUIO was also recommended as an open framework that defines a common protocol and API for tangible multitouch surfaces. The TUIO protocol allows the transmission of an abstract description of interactive surfaces, including touch events and tangible object states. This protocol encodes control data from a tracker application (for example CCV and its blob tracker) and sends it to any client application that is capable of decoding the protocol.
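For the cursor (blob) case, the TUIO 1.1 spec defines a /tuio/2Dcur profile: each frame the tracker sends an "alive" message listing active session ids, a "set" message per blob (position, velocity, acceleration), and an "fseq" frame counter. The sketch below builds these messages as plain tuples for illustration; a real tracker like CCV sends them as OSC bundles over UDP via an OSC library.

```python
def tuio_cursor_bundle(frame_id, cursors):
    """Build the /tuio/2Dcur message list for one frame.

    `cursors` maps session id -> (x, y, vel_x, vel_y, accel),
    with x and y normalised to the 0..1 range.
    """
    messages = [("/tuio/2Dcur", "alive", *sorted(cursors))]
    for sid, (x, y, vx, vy, m) in sorted(cursors.items()):
        messages.append(("/tuio/2Dcur", "set", sid, x, y, vx, vy, m))
    messages.append(("/tuio/2Dcur", "fseq", frame_id))
    return messages

# One tracked blob (session id 4) at the centre of the surface:
for msg in tuio_cursor_bundle(128, {4: (0.5, 0.5, 0.0, 0.0, 0.0)}):
    print(msg)
```

A client such as a Max TUIO patch decodes exactly these messages to know where each blob is from frame to frame.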

After the workshop, the workshop leader explained that she had looked into using the Kinect as an interactive element of projection. She had experienced the same problems I have and showed me the processes she went through. I will go through the documentation and plugins she gave me to see if I can get the Kinect working. She also suggested I could use a webcam with CCV to achieve the same result. I will follow her advice and use CCV with Max/MSP Jitter and a webcam to see if this solves the interactivity function of my project.

K6 Gallery Exhibition – Brutalist — March 4, 2016


This week I visited the K6 Gallery (an exhibition space in two phone boxes) to view their Brutalist exhibition celebrating Southampton’s post-war architecture. This is relevant to my graduate project, which takes a critical stance on this architectural aesthetic, whereas the exhibition celebrates the style. Engaging with the opposite viewpoint could lessen the subjective nature of my project.

Photographers Greg Moss, James Newell and Daniel Cane capture Southampton’s best post-war buildings, as chosen by the writer Owen Hatherley. The pieces contrast the medieval-style ruins and modernist concrete buildings that populate Southampton, whilst praising the ambitious in-house architecture programme as a success.