The main user interaction approach of living-room2 is free exploration: the user can move around and experience the ambient space intuitively. For this purpose, the project includes examinations of several graphical user interfaces (GUIs). Using a handheld device (the miniTrax Wand from Intersense), the user can interact with information panels placed and animated in 3D space. The investigated applications are described below.
In the movie 'Terminator', Arnold Schwarzenegger's character sees additional semitransparent information overlaid on the 'real life' stream of his robotic view. Using this type of GUI means fixing a transparent 2D screen in front of the eyes. Displaying mainly text information and outline shapes with a high degree of transparency keeps the real space visible and helps the user not to lose orientation.
The graphical interface of the scenario 'Design Vocabulary' is constructed in this way. The interface offers a menu that can be accessed, hidden and browsed with the joystick. Additional virtual content referring to the living-room2 furniture seen in the live video stream is implemented to select the student works.
Instead of using only one semitransparent 2D plane in front of the eyes, several independent planes can be implemented and placed in space, offering more layout options and more room for links. This allows a more visual interface instead of a purely textual one.
The scenario 'AR-Décors' uses mobile planes that visually approach the viewer after selection. They are displayed in front of a semitransparent coloured background, which pushes the image of reality onto a layer of lesser importance during the moment of selection.
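The approach animation described above can be realised as a simple per-frame interpolation of the panel position toward the viewer. The following is a minimal sketch, not the project's actual implementation; the function name, coordinate convention and step fraction are assumptions for illustration.

```python
def step_toward_viewer(panel_pos, viewer_pos, t):
    """Move the panel a fraction t of the way toward the viewer.

    Positions are (x, y, z) tuples; t is in [0, 1]. Calling this once
    per frame produces an ease-out motion that converges on the viewer.
    """
    return tuple(p + (v - p) * t for p, v in zip(panel_pos, viewer_pos))

# Example: a selected panel 3 m in front of the viewer drifts closer
# over several frames (illustrative values).
pos = (0.0, 1.5, 3.0)
viewer = (0.0, 1.6, 0.0)
for _ in range(5):
    pos = step_toward_viewer(pos, viewer, 0.3)
```

Repeated fractional steps (rather than a fixed linear path) give the panel a smooth deceleration as it nears the viewer.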
The exterior surface of a virtual body placed in space can serve as a display. The user can, for example, turn the body with a tracked handheld device and select from images placed on its surface planes.
This GUI will be explored, but not implemented in living-room2.
HUMAN CENTRED OBJECT
The living-room2 main menu for scenario selection explores the possibilities of placing an interactive virtual object with information panels around the user's head. The user can turn and select from a ring (stripe, globe, loose composition) of image and text panels positioned around his or her head. Several aesthetics and functionalities have been designed and tested.
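The ring variant of such a head-centred menu amounts to distributing panels evenly on a horizontal circle around the tracked head position and orienting each panel toward the centre. The sketch below is an illustrative geometry helper under assumed conventions (y is the vertical axis; yaw in degrees), not the project's code.

```python
import math

def ring_panel_poses(head_pos, radius, n_panels):
    """Place n_panels evenly on a horizontal ring around the head.

    head_pos is an (x, y, z) tuple. Returns (position, yaw_degrees)
    pairs, where each panel's yaw turns its front toward the head.
    """
    poses = []
    for i in range(n_panels):
        angle = 2.0 * math.pi * i / n_panels
        x = head_pos[0] + radius * math.cos(angle)
        z = head_pos[2] + radius * math.sin(angle)
        # Orient the panel so it faces inward, toward the head centre.
        yaw = math.degrees(math.atan2(head_pos[0] - x, head_pos[2] - z))
        poses.append(((x, head_pos[1], z), yaw))
    return poses

# Six panels on a 0.8 m ring at eye height (illustrative values).
poses = ring_panel_poses((0.0, 1.6, 0.0), radius=0.8, n_panels=6)
```

Rotating the whole ring by the user's head yaw, rather than moving the panels individually, would let the user browse the menu simply by turning.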
An important approach to blurring the borders between the virtual and the real, and to 'synchronizing' both spaces, is to use interfaces in both worlds. The furniture of the living-room2 box has been equipped with sensors and can therefore be used as an interface. Distance, light, pressure and other sensors were evaluated. Taking a piece of fruit from the bowl or sitting down on the sofa are options that are currently being researched. At this moment the user can trigger virtually contextualized elements by opening the drawers of the cupboard.
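Mapping such sensor readings to virtual content can be organised as a small event dispatcher: each physical sensor event is bound to one or more handlers that activate contextual content. The class and event names below are hypothetical, chosen only to illustrate the pattern.

```python
class FurnitureInterface:
    """Illustrative dispatcher from furniture sensor events to handlers."""

    def __init__(self):
        self._handlers = {}

    def on(self, sensor_event, handler):
        """Register a handler for a named sensor event."""
        self._handlers.setdefault(sensor_event, []).append(handler)

    def trigger(self, sensor_event):
        """Fire all handlers for the event; return their results."""
        return [h() for h in self._handlers.get(sensor_event, [])]

furniture = FurnitureInterface()
# Hypothetical binding: opening the cupboard's top drawer reveals
# virtual content placed at that drawer.
furniture.on("drawer_top_open", lambda: "show_virtual_content:drawer_top")
results = furniture.trigger("drawer_top_open")
```

A pressure sensor in the sofa or a light sensor in the fruit bowl would register under the same mechanism, keeping the physical-to-virtual coupling in one place.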
Apart from sending trigger signals, the furniture can also act as an output device: all lamps and the ventilator can be actuated and dimmed according to virtual and real-world circumstances.
LOCATION SPECIFIC POPUPS
For the integration of student works, location-specific references have been implemented. When the user looks at a piece of furniture or at a specific position in space that plays a major role in a student's scenario, a virtual indicator displays the entrance to this scenario proposal. The 3D icon can be activated to show the selected scenario proposal on a pop-up virtual video screen.
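Detecting that the user "looks at" a tracked position typically reduces to an angular test between the gaze direction and the vector toward the target. The following is a minimal sketch of such a test under assumed conventions (3-tuples for vectors, a hypothetical threshold of 10 degrees), not the project's implementation.

```python
import math

def is_gazed_at(head_pos, gaze_dir, target_pos, threshold_deg=10.0):
    """Return True if the gaze direction points within threshold_deg
    of the target position, as seen from the head position.

    All arguments are (x, y, z) tuples; gaze_dir need not be normalised.
    """
    to_target = tuple(t - h for t, h in zip(target_pos, head_pos))
    dot = sum(a * b for a, b in zip(gaze_dir, to_target))
    denom = (math.sqrt(sum(c * c for c in gaze_dir)) *
             math.sqrt(sum(c * c for c in to_target)))
    if denom == 0.0:
        return False  # degenerate: zero-length direction or at target
    cos_angle = max(-1.0, min(1.0, dot / denom))
    return math.degrees(math.acos(cos_angle)) <= threshold_deg

# Looking straight ahead (+z) at a popup anchor 3 m in front:
hit = is_gazed_at((0.0, 1.6, 0.0), (0.0, 0.0, 1.0), (0.0, 1.6, 3.0))
```

Holding the condition true for a short dwell time before showing the 3D icon would avoid popups flickering in as the user's gaze sweeps the room.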