Jamie Thomas – The Karman Cube

Experimentation

This chapter explains the developmental stages of implementing a suitable tracking method for the project. Downloadable applications and Max/MSP patches will be added in the near future to further illustrate the video documentation. Any comments and suggestions would be greatly appreciated on the comments page to the right-hand side.

Early experimentation was carried out using physical objects in a space, with a camera recognizing and tracking the colour of an object in the Max/MSP programming environment. The object could then be assigned x and y coordinates and tracked in ‘real time’. When moved from left to right, the object would produce an ascending keyboard scale, with the y axis controlling the velocity of each note. The problem with this form of tracking is that the lighting conditions must be very specific for the tracking to perform well.
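The coordinate-to-music mapping described above can be sketched outside Max/MSP. This is only an illustration, assuming a normalized camera frame, a fixed C major scale, and a 320×240 input; none of these specifics come from the original patch.

```python
# Hypothetical mapping: the tracked object's x position selects a note in an
# ascending scale, while its y position sets the note's velocity (0-127).
C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # MIDI note numbers, C4-C5

def coords_to_midi(x, y, width=320, height=240):
    """Map pixel coordinates to a (note, velocity) pair."""
    step = min(int(x / width * len(C_MAJOR)), len(C_MAJOR) - 1)
    velocity = min(int(y / height * 128), 127)
    return C_MAJOR[step], velocity
```

Moving from the left edge to the right edge steps through the scale, while raising the tracked point increases the velocity, mirroring the behaviour of the first patch.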

In the next development, the focus was on removing the physical object from the interaction. This was made possible by using the same colour-recognition process as the first patch. A decision was made to focus on the hands for interaction, as a way to simplify initial development for quick prototyping. To adapt the earlier patch, the camera input needed to be converted into a binary image so as to separate the light parts of the input from the dark. The luminance of the computer screen was used to reflect light from the hand back to the camera, creating a white silhouette of the movement. The same musical parameters were applied.
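The binarization step amounts to a simple threshold on pixel brightness. A minimal NumPy sketch, assuming a greyscale frame and an illustrative threshold value (the actual patch performed this inside Max/MSP's Jitter):

```python
import numpy as np

def binarize(frame, threshold=128):
    """Return a binary image: 1 where a pixel is brighter than the threshold,
    0 otherwise, separating the lit hand silhouette from the dark background."""
    return (frame > threshold).astype(np.uint8)

# A tiny 2x2 example frame: bright pixels become 1, dark pixels become 0.
frame = np.array([[10, 200], [130, 40]], dtype=np.uint8)
silhouette = binarize(frame)
```

The resulting white-on-black silhouette is what the colour-recognition stage of the earlier patch can then track.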

Whilst both patches were successful in terms of initial tracking development, the strict lighting conditions limited their use to darkly lit areas, and they would therefore not be suitable for continued development. The sensitivity and the small number of controllable parameters would also hinder development. A different approach, with fewer limitations and more control, was required.

The cv.jit library for computer vision, a set of Jitter objects that work inside Max/MSP, offered a new platform for image analysis with many different tracking methods. A number of patches were made exploring the use of the cv.jit.track object, which enables any object, once clicked on, to be tracked on a two-dimensional matrix, outputting x and y values for its position. These were mostly unsuccessful because of the object's unreliability under environmental limitations: a movement in the background, for example, would interfere with the tracking point.

Some research was then conducted into how the cv.jit.track object was implemented, and it was found that it uses the Horn-Schunck method of optical flow, in which the pixel displacement between frames of video is calculated in order to detect the direction and velocity of movement within the video. The cv.jit.hsflow object, which provides the basis of the following examples, allowed the input to be analysed using the Horn-Schunck method, outputting a higher level of information for further analysis.
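The Horn-Schunck iteration itself can be sketched in a few lines of NumPy. This stand-alone version only illustrates the maths that cv.jit.hsflow performs inside Max/MSP; the derivative scheme, smoothness weight, and iteration count here are illustrative assumptions.

```python
import numpy as np

def horn_schunck(frame1, frame2, alpha=1.0, iterations=50):
    """Estimate a dense optical flow field (u, v) between two greyscale frames
    using the iterative Horn-Schunck scheme."""
    I1, I2 = frame1.astype(float), frame2.astype(float)
    Ix = np.gradient(I1, axis=1)   # spatial image gradients
    Iy = np.gradient(I1, axis=0)
    It = I2 - I1                   # temporal derivative between frames
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)
    for _ in range(iterations):
        # Average each flow estimate over its four neighbours (wrapping edges).
        u_avg = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                 np.roll(u, 1, 1) + np.roll(u, -1, 1)) / 4
        v_avg = (np.roll(v, 1, 0) + np.roll(v, -1, 0) +
                 np.roll(v, 1, 1) + np.roll(v, -1, 1)) / 4
        # Horn-Schunck update: correct the averaged flow against the
        # brightness-constancy constraint, weighted by the smoothness term alpha.
        correction = (Ix * u_avg + Iy * v_avg + It) / (alpha**2 + Ix**2 + Iy**2)
        u = u_avg - Ix * correction
        v = v_avg - Iy * correction
    return u, v
```

The per-pixel (u, v) vectors give the direction and velocity of movement that the later patches use as their control data.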

Although it achieved some interesting results, the tracking method was still limited to an x and y output. The next patch demonstrates a method of achieving a higher number of controllable parameters using only the x axis. This was made possible by creating a virtual cube with which to interact. Each side of the cube contains controller messages of 0 to 127, with the resulting output dependent on the orientation of the cube.
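One way to read the cube idea is that rotating it selects which controller is addressed, while the position within a face sets that controller's 0–127 value. The sketch below is a hypothetical reconstruction: the CC numbers, the 90-degrees-per-face geometry, and the rotation-driven mapping are all assumptions, not taken from the original patch.

```python
# Hypothetical face-to-controller assignments for the four side faces.
FACE_CCS = [1, 7, 10, 11]

def cube_control(rotation_deg):
    """Map a cube rotation (degrees about the vertical axis) to a
    (controller, value) pair: the facing side picks the controller,
    the rotation within that face picks the 0-127 value."""
    rotation = rotation_deg % 360
    face = int(rotation // 90)                # which side faces the viewer
    value = min(int((rotation % 90) / 90 * 128), 127)
    return FACE_CCS[face], value
```

Driving `rotation_deg` from the x axis alone thus yields several controllable parameters from a single tracked dimension, which is the gain the cube was designed for.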

The implementation of this visual representation caused complications within the interaction because of the amount of attention drawn towards it. It created a sense that the cube was in control of the audio, and the user in control of the cube; this placed a barrier between user and outcome similar to that of a physical object. As stated in the previous chapter, the process was valuable and resulted in the realization that further research into the creation of virtual entities is needed in order to make some crucial decisions regarding the interface implementation. This ongoing process of creating tools and experimenting with their interfaces has proved extremely worthwhile, and this form of exploratory development will continue throughout the project.
