Jamie Thomas – The Karman Cube


This section introduces the research undertaken into the technology that is applicable and appropriate to the project, and also looks at established works and performances that have become of interest in this field. The main purpose of this chapter, however, is to discuss the theoretical implications that have arisen from this research in relation to the work.

In order to create an intangible system for interaction with computer software, a mode of recognizing movement within a given environment must be established. Many input devices are available, including ultrasonic sensors, infrared transmitters and receivers, sonar and radar. Preliminary research led to the decision to use video camera input as the mode for tracking movement, as this proved the most robust and responsive way of analyzing a large amount of input data. Research was then undertaken into new and in-development systems that utilize infrared technologies and stereoscopic lenses. These were interesting because many of them were able to perceive depth, and would therefore be able to track movement in three dimensions. This is appealing because, unlike traditional interfacing, the body can play a larger role in the space it inhabits.

Due to the widespread use of inexpensive home web cameras, a decision was made to develop a system that could utilize this basic technology, so as not to limit the final outcome to specific hardware. This way, anybody owning a simple computer camera would be able to use the software with good results. The programming environment Max/MSP, together with Jitter, provides all of the tools needed to prototype the system, although further research at a later date may open up new possibilities and directions for the project.
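Jitter handles the camera capture and matrix processing inside Max itself, but the underlying idea of camera-based movement tracking can be sketched independently. The following is a minimal, hypothetical Python/NumPy illustration of frame differencing, the simplest form of the technique: successive greyscale frames are compared pixel by pixel, and pixels that change beyond a threshold are treated as movement. The frame sizes and `threshold` value are arbitrary assumptions for the sketch, not part of the actual Max/MSP prototype.

```python
import numpy as np

def motion_map(prev_frame, frame, threshold=30):
    """Binary mask of pixels that changed noticeably between two frames."""
    # Cast to a signed type so the subtraction cannot wrap around.
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold

def motion_amount(prev_frame, frame, threshold=30):
    """Fraction of the image in motion, from 0.0 (still) to 1.0."""
    return motion_map(prev_frame, frame, threshold).mean()

# Synthetic greyscale frames: a bright square "moves" between them.
a = np.zeros((120, 160), dtype=np.uint8)
b = np.zeros((120, 160), dtype=np.uint8)
a[20:40, 20:40] = 255
b[25:45, 25:45] = 255

print(motion_amount(a, a))      # identical frames -> 0.0
print(motion_amount(a, b) > 0)  # square has moved -> True
```

A real webcam pipeline would feed live frames into `motion_amount` in a loop; in Jitter the equivalent is built from matrix operators rather than written as code.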

Max/MSP and Jitter interface


In order to discuss the works of others, an introduction to the theoretical field of phenomenology proves helpful when applying philosophy to HCI. Computers have been assimilated into society in a wide range of products and applications; we interact with them on a day-to-day basis. Michael Polanyi writes of 'tools': 'We pour ourselves into them and assimilate them as parts of our own existence.' (Polanyi, 1958, p.59) Computers become an extension of self: we operate through them, focusing on the job at hand and not the tool itself. Martin Heidegger's 'ready-to-hand' theory, presented in his seminal work 'Being and Time', theorizes this concept.

Click on the picture to view an extract of the book on Amazon


“The technology disappears in one’s hand as the user focuses on the immediate performance of the tool. Heidegger terms this condition of the tool as ‘ready-to-hand’ because the tool, through the experience of the user, is fused with the body.” (Zics, 2008, p.4)

Tools become ‘invisible’ upon interaction with them, not in an intangible sense, but in a distinction between what the user does with the tool and the way the user thinks about the tool.

Don Ihde states that 'the better the machine, the more transparency there is' (Ihde, 1986, p.141), and this is central to Heidegger's philosophy of Dasein (being-in-the-world), as this is the way that the world makes itself available to us as 'an unconscious but accessible background to our activity.' (Dourish, 2001, p.110)
There are a number of multimedia artists working in the field of computer vision for artificial perception, and many have drawn on these theoretical positions in relation to the technology they use. The pieces of most interest in this research engage the body within the interaction, forming complex relationships between user and system. David Rokeby's 1986-90 project 'Very Nervous System' encapsulates these ideas.


“Because the computer is purely logical, the language of interaction should strive to be intuitive. Because the computer removes you from your body, the body should be strongly engaged. Because the computer’s activity takes place on the tiny playing fields of integrated circuits, the encounter with the computer should take place in a human-scaled physical space.” (Rokeby)

This project is interesting because, upon interaction, the interface becomes ready-to-hand: it is possible not to think of the system itself, and instead to concentrate on the output and how the movement is affecting it, and vice versa. It is also evident that Rokeby has created an intuitive system by tapping into existing metaphors, requiring the recollection of experiential knowledge of musical interaction. For example, quick, subtle movements of the fingers create gentle, rapid string sounds, whilst more pronounced hitting movements create drum patterns. This embodiment draws focus to the interaction itself as the content, and therefore to the engaged action of the body. Rokeby has also developed a set of objects that operate inside Max/MSP to facilitate artists working with camera tracking; these can be found here. Unfortunately, due to the cost of his software, its implementation within this project is not possible.
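The mapping described above, where the character of a movement selects the character of the sound, can be caricatured as a simple classification of overall motion level. The sketch below is purely illustrative: the thresholds and category names are my own assumptions, not Rokeby's actual mappings, and a real system would also consider the speed, location and shape of the movement.

```python
def classify_gesture(motion_level):
    """Map a normalized motion level (0.0-1.0) to a sound layer.

    Hypothetical thresholds, loosely in the spirit of the
    strings-vs-drums distinction described in the text.
    """
    if motion_level < 0.02:
        return "silence"   # effectively still
    elif motion_level < 0.15:
        return "strings"   # quick, subtle movements
    else:
        return "drums"     # pronounced hitting movements

print(classify_gesture(0.01))  # -> silence
print(classify_gesture(0.10))  # -> strings
print(classify_gesture(0.40))  # -> drums
```

In practice such a classifier would be driven by the motion measure produced by the camera-tracking stage, updated many times per second.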

Providing a visual aspect to work of this nature is difficult, especially when dealing with a visual representation of the interaction within the interface. Such a representation could be distracting and limiting, ending with the dominant eye leading the interaction. For everything to occur simultaneously, a more creation-based visual representation, a 'painting with sound' approach, must be taken, and more research into this aspect of the design is required. This visual distraction is demonstrated in the next chapter.
