Touch Projector

Mobile Interaction through Video

Touch Projector allows users to manipulate content on distant displays that are unreachable, such as (a) displays outside a window, or (b) a tabletop system crowded with people. It also allows users to manipulate devices that are incapable of touch interaction, such as (c) a wall projection or (d) a laptop. Users point the device at the respective display and manipulate its content by touching and dragging objects in the live video. The device "projects" the touch input onto the target display, which responds as if the touch had occurred on it directly.

In 1992, Tani et al. envisioned how users could interact with a real-world device located at a distance through live video. Cameras observed industrial machinery and allowed users to manipulate mechanical switches and sliders over a distance by clicking and dragging within the live video image with a mouse. This was made possible by mapping portions of the video frame to the respective parts of the remote hardware. The system was revolutionary in that it established a particularly direct type of affordance – in many ways similar to the affordance of direct touch.

While the metaphor is still interesting, the environments and usage scenarios have changed since that time. (1) The proliferation of displays on machines and computer systems has turned many spaces into multi-display environments. (2) With portable computers such as laptops or tablet PCs present, the displays within these environments may be rearranged at any time. (3) In such flexible configurations, Tani's fixed camera setup is no longer necessarily appropriate.

We investigate how to apply "interaction through video" to these new scenarios and to what extent mobile devices can offer the required flexibility. We build on recent advances in mobile augmented reality (such as markerless tracking and camera-based pose estimation) combined with techniques for manipulating objects at a distance (such as distant pointing, input redirection and local portholes from remote displays).
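To illustrate the kind of camera-based registration such a system can build on, the sketch below estimates a homography between the phone's camera frame and the content shown on a target display using feature matching. It assumes OpenCV; the function name, the choice of ORB features, and all parameters are illustrative choices, not a description of Touch Projector's actual implementation.

```python
# Sketch: estimate a homography that maps points in the phone's camera
# frame to the coordinate system of the content shown on a target display.
# Uses OpenCV's ORB features; names and parameters are illustrative only.
import cv2
import numpy as np

def estimate_homography(display_content, camera_frame):
    """Return a 3x3 homography H mapping camera-frame pixels to
    target-display pixels, or None if too few matches are found."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp_disp, des_disp = orb.detectAndCompute(display_content, None)
    kp_cam, des_cam = orb.detectAndCompute(camera_frame, None)
    if des_disp is None or des_cam is None:
        return None

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_cam, des_disp), key=lambda m: m.distance)
    if len(matches) < 4:  # a homography needs at least 4 correspondences
        return None

    src = np.float32([kp_cam[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_disp[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, ransacReprojThreshold=5.0)
    return H
```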

Touch Projector allows users to manipulate content on displays at a distance, including those that would otherwise be unreachable. It further allows users to manipulate devices that are incapable of touch interaction, such as a wall projection or a laptop computer. Users aim the device with one hand and manipulate objects by touching and dragging them in the live video with the other hand. Touch input is "projected" onto the remote display, as if it had occurred there directly.
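Given such a homography, "projecting" a touch reduces to transforming the touched video pixel into the target display's coordinate system. The following is a minimal sketch, reusing the (assumed) homography H from the sketch above:

```python
# Sketch: project a touch point from the live video onto the target display
# using a homography H as estimated above. Names are illustrative only.
import cv2
import numpy as np

def project_touch(touch_xy, H):
    """Map a touch at (x, y) in camera/video coordinates to
    target-display coordinates via a perspective transform."""
    pt = np.float32([[touch_xy]])          # shape (1, 1, 2)
    projected = cv2.perspectiveTransform(pt, H)
    return float(projected[0, 0, 0]), float(projected[0, 0, 1])

# Example: a touch at video pixel (320, 240)
#   display_x, display_y = project_touch((320, 240), H)
```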

With Touch Projector, users manipulate targets using both hands in concert. The non-dominant hand holds the device and coarsely orients it, while the dominant hand interacts within the reference frame established by the non-dominant hand (cf. toolglass interaction). This combination allows interaction with large displays by moving the entire device (cf. peephole displays) as well as interaction with small displays using touch input. Touch Projector preserves immediate feedback: when content on the target display changes, users perceive these changes immediately through the live video. This creates a close connection between action and reaction, as both occur on the mobile device.
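To make the redirection half of this loop concrete, the sketch below forwards a projected touch event to the machine driving the target display, which would then apply it as if it were local input. The message format, port, and identifiers are hypothetical and not the protocol used by the published system.

```python
# Sketch: redirect a projected touch event to the host driving the target
# display, which injects it as local input. The JSON message format, port,
# and display identifier below are hypothetical placeholders.
import json
import socket

TARGET_HOST, TARGET_PORT = "192.0.2.10", 5005   # placeholder address
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_projected_touch(display_id, x, y, phase):
    """Forward one touch event; phase is 'down', 'move', or 'up'."""
    event = {"display": display_id, "x": x, "y": y, "phase": phase}
    sock.sendto(json.dumps(event).encode("utf-8"), (TARGET_HOST, TARGET_PORT))

# e.g. after projecting a drag with project_touch() from the sketches above:
#   send_projected_touch("wall-projection", 1024.0, 380.5, "move")
# The target display moves the dragged object, and the change is visible
# right away in the phone's live video, closing the action-reaction loop.
```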

Publications

Can You See Where I Point at?

Boring, S., and Baur, D.

In International Workshop on Security and Privacy in Spontaneous Interaction and Mobile Phone Use (in Conjunction with Pervasive 2010), Helsinki, Finland, 2 pages, May 17.

Touch Projector: Mobile Interaction Through Video

Boring, S., Baur, D., Butz, A., Gustafson, S., and Baudisch, P.

In ACM International Conference on Human Factors in Computing Systems - CHI 2010. Atlanta, GA, USA, ACM Press, 10 pages, Apr 10-15.

Videos