In its functionality tailored to teaching in design degree programs... an interface for modern teaching
We are supposed to live in a "digitized world". Why, then, are we still interacting with digital data on screens, drowning in menus, cryptic graphical user-interface elements, or unlearnable gesture sets? This demands much of our attention and keeps us separated from the ordinary physical environment in which we live and interact, instead of enhancing it.
This master's thesis addresses this problem and presents a set of interaction paradigms based on the augmentation of sticky notes, paper maps, and sketches, working towards a closer integration of the digital world and our daily life.
"The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it." – Mark Weiser, 1991
With this statement Mark Weiser opened his 1991 article "The Computer for the 21st Century", which has influenced human-computer interaction research over the last decades.
Since then, the way people interact with computers and interfaces has changed continuously, and it has changed our lives in many ways. We now have more devices, more powerful and better connected, to access digital data anywhere and anytime. However, this demands much of our attention; we often find ourselves drowning in complex interfaces, which separates us further from our environment instead of helping us be more aware of it.
We need to find ways towards more invisible interfaces, towards a more seamless interaction with them, and definitely, towards interfaces that fit in our world instead of forcing us to enter theirs.
This calls for a closer integration of the real and the digital world, and for more natural ways of interaction.
With diverse prototypes applied to two use cases, this work presents the design of a set of novel interaction paradigms to augment some of the most basic but also most versatile everyday practices: the use of paper, sticky notes, and sketches. We argue that this allows us to interact implicitly with digital information while acting as we are used to in real life, empowering tangible interaction and enhancing situation awareness and collaboration.
Our approach is as follows: if we consider the physical and digital worlds as two different, independent layers, their integration can be approached from two directions, depending on which layer is in focus.
From digital to real: focusing on the digital world by bringing screen-based interfaces out of the screen, pushing their elements onto a physical layer, or making digital data graspable.
From real to digital: focusing on the real world by augmenting everyday objects, using them directly as input/output (I/O) devices, or by augmenting everyday practices and processes, tracking them and complementing them with additional digital content.
These directions feed back into each other, like the poles of a magnet, forming an iterative process.
In this thesis, these two perspectives have been explored with two use cases from different contexts:
Working on different cases with different needs and goals produces not only individual solutions for each of them, but also general insights and interaction paradigms that can be translated to other scenarios and cases.
This thesis is based on the following research questions:
This project was developed in cooperation with Patrick Oswald.
Traditionally, video games are screen-bound. The ways in which people interact with them have evolved over the years: devices like the Wii Remote or the Kinect have added more embodiment, and Augmented Reality (AR) lets users immerse themselves in entirely new virtual worlds. But the world we explore in a game is still a virtual one, separated from the physical environment in which we live and interact, and still confined to a screen.
We propose to bring the game out of the screen, integrating the virtual world users play in into their daily environment. This introduces new concepts in the fields of tangible interaction, collaboration, and game design.
Enhance Creativity
By pushing the video game stage onto a physical layer, we use our environment as the game level. As a consequence, users can create their own level content using everyday physical objects. This approach aims to bridge the gap between playing and creating by making both possible at the same time, enhancing users' awareness of their own environment and their creativity.
Enhance Collaboration
Multi-player modes in traditional video games have not changed much. The players' roles are usually similar, varying between playing towards a common goal and playing as opponents. This case aims to enhance collaboration between players by exploring new roles and collaboration settings.
In disaster situations, the process of managing and making decisions is still mostly analog. Disaster managers still use paper maps and other analog and paper-based tools.
While paper offers flexibility and efficiency and supports fluid collaboration, uncertain and constantly changing disaster situations require up-to-date, reliable data to maintain situation awareness and enable rapid decision-making. Analog tools by themselves do not provide this.
While recent research proposes novel systems based on multi-touch and tangible interaction on tabletops, these prototypes presuppose a complete replacement of the traditional workflow, which can imply a loss of confidence and situation awareness.
The approach presented here proposes a change of assumption: augmenting the actual work process. For this, a user-centered design approach is needed.
Scenario: German Technical Emergency Service (THW)
This case builds on information extracted from a previous user-centered study [Paelke et al. 2012]. The study worked closely with the "Management and Communication" unit of the Technical Emergency Service in Germany (THW – Technisches Hilfswerk), whose main task is to keep track of all emerging information during an operation.
Damage Accounts
In combination with paper maps, THW workers use so-called Damage Accounts, which contain information about an operation and are used to handle damages and tasks. They serve to inform the responsible units about these tasks and to keep track of all information and updates.
Augmenting the Current Process
Starting from an understanding of the benefits of paper maps and other analog tools, our system studies how traditional tools and digitally augmented data can support disaster managers' planning and decision-making.
It combines augmented paper maps, damage accounts, and sketch interaction with computer vision and interactive projection, and studies the role of tangibility and sketching in disaster management.
Our approach uses post-its to represent the units involved in a given operation; they are placed on the paper map at their assigned location. Once attached, post-its are detected, assigned to the corresponding damage account based on their color, and augmented with projected, up-to-date information. Disaster managers working at the control point can thus see the operation status in real time and get direct feedback from the units in situ.
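The color-based assignment step can be pictured as a simple lookup from the detected note color to its damage account. The following sketch uses hypothetical names and example accounts; the actual system handles this differently in detail:

```python
# Illustrative sketch (hypothetical names and example accounts): the dominant
# color of a detected sticky note selects the damage account it belongs to.
COLOR_TO_ACCOUNT = {
    "yellow": "DA-01",  # e.g. a flooded cellar (example only)
    "pink":   "DA-02",  # e.g. a blocked road (example only)
    "green":  "DA-03",  # e.g. a power outage (example only)
}

def assign_note(note_color, position):
    """Return the augmentation payload to project next to the note,
    or None if the color does not map to any known damage account."""
    account = COLOR_TO_ACCOUNT.get(note_color)
    if account is None:
        return None  # unknown color: leave the note un-augmented
    return {"account": account, "position": position, "status": "pending"}
```

Once a note is assigned, the projector can render the account's latest status at the note's tracked position, so moving the note moves its projected information with it.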
The augmentation of paper and sketches offered us a common framework to approach two cases with different goals and requirements: one aims to push screen-based content into the physical environment, the other to augment a traditionally analog process.
We implemented this by combining interactive projection with computer vision algorithms. With this non-intrusive solution, the technology remains discreetly in the background, and users can interact with everyday objects in a casual, natural way while getting immediate feedback on their actions.
A list of the explored interaction paradigms, applied to both use cases, follows.
For detailed information, please check out the Master Thesis book (see "Material" folder).
Interacting with Sticky Notes and Paper
Interacting with Sketches
Sticky Notes + Sketches = New Tools
Paper Computing: Paper as I/O Interface
Augmenting Multi-touch Devices
The current set-up includes:
For our prototypes we used the first version of the Microsoft Kinect. However, since in the current version most of the tracking is performed on the RGB channel, any camera can be used; this is transparent to the library.
Tests with a low-cost camera (3 €) yielded good results. The higher the camera resolution, the better the tracking works.
We have implemented Posting Bits as an open-source library for Processing. It handles the tracking of sticky notes, sketches, and other elements on a surface.
Since the library has a modular structure, it can be used in many contexts involving the same principles, as our prototypes (i.Ge and the augmented Damage Accounts) demonstrate.
The software side implements computer vision algorithms using the OpenCV library.
Elements on the stage (sticky notes, etc.) are detected using contour detection. Color detection is also performed to categorize each element by its color.
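As an illustration of the color-categorization idea, a note's average RGB color can be converted to HSV and classified by thresholding the hue. The bands below are illustrative values, not the library's actual thresholds:

```python
import colorsys

def classify_note_color(r, g, b):
    """Map an average RGB color (0-255 per channel) to a coarse sticky-note
    category. Hue bands are illustrative, not the library's actual values."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    if s < 0.25 or v < 0.25:
        return "unknown"  # too gray or too dark to classify reliably
    hue = h * 360.0
    if 40 <= hue < 75:
        return "yellow"
    if 75 <= hue < 170:
        return "green"
    if 170 <= hue < 260:
        return "blue"
    if 280 <= hue < 350:
        return "pink"
    return "other"
```

Working in HSV makes the classification robust against brightness changes from the projector, since the hue stays roughly constant while value and saturation vary.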
Currently, tracking is done directly on the source image. When working on non-uniform backgrounds, such as post-its on paper maps, background subtraction can be performed right before tracking.
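Background subtraction here can be as simple as a per-pixel difference between the live camera frame and a reference image of the empty map: pixels close to the background are zeroed, so only newly added elements remain. A minimal grayscale sketch, assuming images as nested lists of intensities:

```python
def subtract_background(frame, background, threshold=30):
    """Zero out pixels close to the reference background so that only
    newly added elements (post-its, sketches) remain for tracking."""
    return [
        [px if abs(px - bg) > threshold else 0
         for px, bg in zip(frame_row, bg_row)]
        for frame_row, bg_row in zip(frame, background)
    ]
```

In practice this would run on real camera images (e.g. with OpenCV's `absdiff` and `threshold` functions); the nested-list version only illustrates the principle.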