To start testing whether the Kinect sensor could be used as a way of interacting with the projection, I first borrowed a Kinect model 1414. Kinect sensors have an infrared projector, an infrared camera that reads the projected pattern, and an RGB camera; together, the infrared pair gives the sensor depth perception. I also bought an adapter with a USB connector and power supply in order to use the Kinect with my Mac.

I looked for tutorials on how to use the Kinect and projector together to create an interactive element and found Open Kinect for Processing by Daniel Shiffman (also available as a YouTube playlist). While I followed the tutorial, Processing came up with the error "There are no kinects, returning null", and after trying a multitude of ways to amend this, I could not fix the problem. I also looked into OpenNI, an open-source framework for depth sensors such as the Structure Sensor; however, its developer was bought by Apple, so it is no longer available.
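For reference, this is roughly the kind of minimal sketch the tutorial builds (a sketch based on the Open Kinect for Processing examples, not my exact code); the "There are no kinects, returning null" message is what the library prints when it cannot find a connected device, so a sketch like this runs but shows nothing:

```java
// Processing sketch using the Open Kinect for Processing library
// (installed via Sketch > Import Library in the Processing IDE).
import org.openkinect.processing.*;

Kinect kinect; // the Kinect class targets the model 1414 (v1) sensor

void setup() {
  size(640, 480);
  kinect = new Kinect(this); // "There are no kinects, returning null"
                             // is printed here if no device is detected
  kinect.initDepth();        // start the depth stream
}

void draw() {
  // Draw the raw depth image; brightness corresponds to distance
  image(kinect.getDepthImage(), 0, 0);
}
```

This sketch needs the Kinect hardware and library to do anything useful, which is why the missing-device error stopped me at the first step.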

As the Kinect sensor was suggested for its depth-sensing capability, and I only want to detect whether a person is in an area and project in that place, it's possible I may not need this method at all. For this reason I will look into other approaches to projection interaction that may be simpler and more effective.