Sunday, 17 June 2012

Centre for Intelligent Systems Research.

On day five we journeyed to the Centre for Intelligent Systems Research labs, located at the Deakin University Waurn Ponds campus in Geelong, where robots and avatars are a chief subject of investigation.

We were able to consider what a realistic expectation of a robot or avatar actor might be. The engineers and scientists we met strongly advised us to use semi-autonomous robots, describing their machinic friends (none of which are humanoid, and all of which are developed for industrial use) as unpredictable and dangerous. They re-articulated our fundamental question: ‘Does it matter if a robot is actually autonomous, or can we just give the impression of autonomy?’ A puppeteered robot or avatar would be far more reliable than an autonomous one, they told us, and an audience would not be able to tell the difference.

For us, though, this would not satisfy our criteria for a media actor. Their question challenged our reason for conducting this investigation, and left us with a significant question of our own:

Are we using technology and media to add something to a piece of dramatic theatre that actors cannot do, or are we trying to get the technology and media to do what actors already do? The answer would be that we are simply trying to understand when a technology or medium can be considered an actor, if indeed it ever can.

After our meeting with Professor Saeid Nahavandi and his team of engineers and scientists, each of the following activities has become a mini-project. The programming and creation of the necessary tools will take place before the ARTLAB theatre team arrives in December.


1. REAL VERSUS ROBOT

Creating a humanoid robot by mounting a mannequin’s upper torso on a mobile robot. The robot could be programmed to move through space and speak with pre-recorded dialogue, and a series of responses could be triggered by interaction with an actor. We could also test lighting various limbs of the robot to give a sense of life, and the mobile bases could be fitted with non-human figures, such as a giant eye.
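
A minimal sketch of the trigger-and-respond idea, assuming a hypothetical RobotBase interface standing in for whatever drive, audio and lighting controls the actual mobile platform exposes: when a sensor reports an actor nearby, the robot fires one of a small set of pre-programmed responses.

```python
# Sketch only: RobotBase and its methods are hypothetical placeholders,
# not the real CISR platform's API.

import time
import random

class RobotBase:
    """Hypothetical stand-in for the mobile robot's drive/audio/lighting interface."""
    def drive(self, distance_m: float) -> None:
        print(f"[drive] moving {distance_m:.1f} m")
    def play_clip(self, clip_name: str) -> None:
        print(f"[audio] playing pre-recorded clip '{clip_name}'")
    def light_limb(self, limb: str) -> None:
        print(f"[light] illuminating {limb}")
    def actor_is_near(self) -> bool:
        # Placeholder for a real proximity or vision sensor reading.
        return random.random() < 0.3

# Each response bundles a movement, a dialogue clip and a lit limb,
# so only the *trigger* is left to the interaction with the actor.
RESPONSES = [
    {"move": 0.5, "clip": "greeting.wav", "limb": "right arm"},
    {"move": -0.3, "clip": "question.wav", "limb": "head"},
]

def run(robot: RobotBase, duration_s: float = 10.0) -> None:
    start = time.time()
    while time.time() - start < duration_s:
        if robot.actor_is_near():
            response = random.choice(RESPONSES)
            robot.drive(response["move"])
            robot.play_clip(response["clip"])
            robot.light_limb(response["limb"])
        time.sleep(0.5)

if __name__ == "__main__":
    run(RobotBase())
```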

2. REAL VERSUS VIRTUAL

Using the Microsoft Kinect and a 3D screen, create an avatar that can be programmed to express gestures triggered by an actor, so that it appears as if the avatar becomes aware and interacts. This could also include a pre-recorded voice triggered by the actor’s motion.
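
A minimal sketch of that trigger logic, assuming hypothetical get_skeleton() and Avatar placeholders in place of the actual Kinect tracking and 3D rendering stack: when the actor raises a hand above head height, the avatar plays a gesture and a pre-recorded voice clip.

```python
# Sketch only: get_skeleton() and Avatar are hypothetical stand-ins,
# not the real tracking or rendering API used with the 3D screen.

import random

def get_skeleton() -> dict:
    """Hypothetical stand-in for one Kinect skeleton frame (y grows upward)."""
    return {
        "head": {"y": 1.7},
        "right_hand": {"y": random.uniform(0.8, 2.0)},
    }

class Avatar:
    """Hypothetical stand-in for the avatar rendered on the 3D screen."""
    def play_gesture(self, name: str) -> None:
        print(f"[avatar] playing gesture '{name}'")
    def play_voice(self, clip: str) -> None:
        print(f"[avatar] playing pre-recorded voice clip '{clip}'")

def on_frame(avatar: Avatar) -> None:
    """If the actor raises a hand above head height, the avatar 'notices'."""
    skeleton = get_skeleton()
    if skeleton["right_hand"]["y"] > skeleton["head"]["y"]:
        avatar.play_gesture("turn_towards_actor")
        avatar.play_voice("i_see_you.wav")

if __name__ == "__main__":
    avatar = Avatar()
    for _ in range(20):  # pretend we receive 20 tracking frames
        on_frame(avatar)
```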

3. TELE-OPERATED OZBOTS

These can be operated remotely. They can project sound, throw light and ‘see’.

4. ACTIVE SHUTTER GOGGLES

Using the dome 3D screen and active shutter goggles, we can test how an audience perceives 3D projections with live actors in front of the screen.

5. HUMAN FACE AS IPAD

An iPad could be attached to a person’s face so that it becomes like a mirror to the onlooker.

6. SIMULATION OF AN OUT-OF-BODY EXPERIENCE

An avatar that looks like the ‘participant’ can move and feel independently of the person.

7. ROBOTIC ARMS

Testing the use of robotic arms in scenes to be filmed.
