On Tuesday I had my second projection mapping workshop, looking at programs that interact with projection mapping software. We looked at Community Core Vision (CCV), a simple open-source interface for advanced visual tracking. CCV takes in a video stream and outputs tracking data (e.g. coordinates and blob size) and events (e.g. finger down, moved and released) that are used to build multi-touch applications. This may be the answer to my project's interactivity problem.

This shows the basic functionality of CCV, with the projector and camera being calibrated so that CCV can track the right blobs. In this example the sensitivity settings are too high, so it is detecting movement that is not there.
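To illustrate why over-sensitive settings detect movement that is not there, here is a minimal sketch (not CCV's actual code) of frame-difference motion detection in NumPy. The threshold values are made up for illustration: set too low, ordinary camera noise registers as movement.

```python
import numpy as np

def moving_pixels(prev, curr, threshold):
    """Count pixels whose brightness change between two frames exceeds
    the threshold. A lower threshold means higher sensitivity: more
    pixels register as 'movement', including pure sensor noise."""
    diff = np.abs(curr.astype(int) - prev.astype(int))
    return int((diff > threshold).sum())

# Two frames of a completely static scene, differing only by random
# camera noise of up to +/-10 brightness levels.
rng = np.random.default_rng(0)
scene = rng.integers(0, 256, size=(120, 160), dtype=np.uint8)
noise = rng.integers(-10, 11, size=scene.shape)
noisy = np.clip(scene.astype(int) + noise, 0, 255).astype(np.uint8)

print(moving_pixels(scene, noisy, threshold=5))   # over-sensitive: many false detections
print(moving_pixels(scene, noisy, threshold=40))  # noise stays below threshold: zero detections
```

Calibration in CCV amounts to tuning this kind of sensitivity (along with background subtraction) until real fingers register but ambient noise does not.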


We were then shown Quartz Composer, which is similar to Isadora but free and less powerful. Both of these programs can feed into other programs and implement many features, from simply playing a video to creating a unique interactive system that responds to a live performer.

TUIO was also recommended: an open framework that defines a common protocol and API for tangible multi-touch surfaces. The TUIO protocol allows the transmission of an abstract description of interactive surfaces, including touch events and tangible object states. It encodes control data from a tracker application (for example CCV and its blob tracker) and sends it to any client application capable of decoding the protocol.
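For a sense of what a client actually decodes: a TUIO cursor update arrives as a `/tuio/2Dcur` "set" message carrying a session ID, a normalised position, a velocity vector and a motion acceleration (field order per the TUIO 1.1 specification). The decoder function below is a hypothetical sketch of my own, not part of any TUIO library:

```python
def decode_2dcur(address, args):
    """Decode a TUIO 1.1 /tuio/2Dcur message into a small dict.
    'set' carries cursor state; 'alive' lists the live session IDs;
    'fseq' carries the frame sequence number."""
    if address != "/tuio/2Dcur":
        raise ValueError("not a 2Dcur message")
    command = args[0]
    if command == "set":
        s, x, y, vx, vy, accel = args[1:7]
        return {"cmd": "set", "session": s,
                "x": x, "y": y,          # normalised position, 0..1
                "vx": vx, "vy": vy,      # velocity vector
                "accel": accel}          # motion acceleration
    if command == "alive":
        return {"cmd": "alive", "sessions": list(args[1:])}
    if command == "fseq":
        return {"cmd": "fseq", "frame": args[1]}
    raise ValueError("unknown TUIO command: " + str(command))

# A cursor at the centre of the surface, drifting slowly right.
msg = decode_2dcur("/tuio/2Dcur", ["set", 12, 0.5, 0.5, 0.02, 0.0, 0.0])
print(msg["session"], msg["x"], msg["y"])
```

In practice the tracker (e.g. CCV) sends these messages as OSC packets over UDP, by default to port 3333, and a client library handles the transport and hands the decoded events to the application.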

After the workshop, the workshop leader explained that she had looked into using the Kinect as an interactive element of projection. She had run into the same problems I did and showed me the processes she went through. I will go through the documentation and plugins she gave me to see if I can get the Kinect working again. She also suggested that I could use a webcam with CCV to achieve the same result. I will follow her advice and use CCV with Max/MSP Jitter and a webcam to see if this solves the interactivity function for my project.