Overall, we received seven contributions to our workshop on visuo-haptic interaction at AVI 2022. You can find all of them on this page.

Tele-Haptics: Remote Collaboration on Physical Objects

Maximilian Letter, Katrin Wolf

The need to work together at any time and from any place in the world is constantly increasing. While remote collaboration on digital content has matured to the point of manifesting in services used daily, remote collaboration on physical objects has not left its early stage of prototypes and studies. The task is challenging for multiple, partly interrelated reasons: (1) physical content is static and cannot be duplicated or transported like digital content; (2) the possibilities for capturing and digitizing physical content are still limited and usually come with a loss of information; (3) interacting with representations of physical objects is unnatural and does not allow an experience similar to interacting with real-world objects. In my work, I aim to address the last of these challenges. The goal is to explore how working with remote physical objects can be digitally supported to enable collaboration independent of time and space. As this research field is comparatively young, numerous aspects can be investigated. I am most interested in asynchronous and mutual collaboration, as well as in how to transport and emulate the properties of physical objects that go beyond the visual.

For the complete contribution, click here:

HaptiX: Extending Cobot’s Motion Intention Visualization by Haptic Feedback

Max Pascher, Kirill Kronhardt, Til Franzen, Jens Gerken

Nowadays, robots are found in a growing number of areas where they collaborate closely with humans. Enabled by lightweight materials and safety sensors, these cobots are gaining popularity in domestic care, supporting people with physical impairments in their everyday lives. However, when cobots perform actions autonomously, it remains challenging for human collaborators to understand and predict their behavior, which is crucial for achieving trust and user acceptance. One significant aspect of predicting cobot behavior is understanding their motion intention and comprehending how they "think" about their actions. Moreover, the human visual and auditory modalities are often occupied by other information sources, rendering them frequently unsuitable for transmitting such information. To tackle this challenge, we are working on a solution that communicates cobot intention via haptic feedback. In our concept, we map planned motions of the cobot to different haptic patterns to extend the visual intention feedback.
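
As an illustration of this mapping idea, the following minimal sketch translates a planned motion direction and speed into intensities for a hypothetical four-tactor vibrotactile wristband. The tactor layout, the speed range, and the function names are assumptions for illustration only and do not describe the HaptiX prototype.

```python
# Hypothetical sketch: mapping a planned cobot motion to a vibrotactile pattern.
# The tactor layout and the motion_to_pattern() interface are assumptions,
# not the actual HaptiX implementation.
import math

# Four tactors around the wrist: front, right, back, left (unit vectors in the x/y plane).
TACTOR_DIRECTIONS = [(0, 1), (1, 0), (0, -1), (-1, 0)]

def motion_to_pattern(dx, dy, speed, max_speed=0.5):
    """Project the planned motion direction onto each tactor and scale by speed."""
    norm = math.hypot(dx, dy) or 1.0
    ux, uy = dx / norm, dy / norm
    gain = min(speed / max_speed, 1.0)           # stronger vibration for faster motions
    intensities = []
    for tx, ty in TACTOR_DIRECTIONS:
        alignment = max(0.0, ux * tx + uy * ty)  # only tactors facing the motion fire
        intensities.append(round(alignment * gain, 2))
    return intensities

# Example: the cobot plans to move to the user's right at 0.3 m/s.
print(motion_to_pattern(dx=1.0, dy=0.2, speed=0.3))  # -> [0.12, 0.59, 0.0, 0.0]
```

The dot product ensures that only tactors facing the planned motion vibrate, so direction and speed can be conveyed without occupying the visual or auditory channel.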

For the complete contribution, click here:

Haptic Tools: Enhancing Tool Capabilities by Tactile-kinesthetic Feedback

Juan F. Olaya-Figueroa, Katrin Wolf

Adding tactile-kinesthetic feedback to tools is a novel concept that aims at creating physically computable tools, either augmented ones or virtual ones that act as controllers, which makes tool usage feel realistic and enables a rich user experience (UX). We extend existing approaches to haptics in digital object manipulation, such as vibrotactile and pseudo-haptic feedback, by integrating tactile-kinesthetics. When using traditional tools, such as a saw, a hammer, or a drill, we feel rich haptic feedback, such as weight change, force feedback, and tactile-kinesthetic feedback. Such feedback not only provides information about the position of our tools and the progress of the work we do with them, it also constrains or even stops the movements of the tools we hold in hand. In this position paper, we point to limitations in the realism and UX of haptic feedback during object manipulation and tool usage. We discuss how the capabilities of future tools could be extended, with a focus on adding haptic feedback that makes digital tool use feel "real". We finally highlight challenges we expect to face during our project.
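
To give a concrete flavor of the pseudo-haptic feedback mentioned above, the sketch below conveys the "heaviness" of a virtual tool by scaling the control-display (C/D) ratio, so that heavier tools lag behind the real hand. The function names, masses, and parameter values are illustrative assumptions rather than the approach described in the paper.

```python
# Minimal pseudo-haptic sketch: heavier virtual tools follow the hand more sluggishly.
# Functions and values are hypothetical placeholders for illustration.
def cd_ratio(virtual_mass_kg, reference_mass_kg=1.0):
    """Heavier virtual tools get a smaller control-display ratio."""
    return min(1.0, reference_mass_kg / max(virtual_mass_kg, reference_mass_kg))

def update_virtual_tool(tool_pos, hand_delta, virtual_mass_kg):
    """Move the rendered tool by only a fraction of the real hand displacement."""
    ratio = cd_ratio(virtual_mass_kg)
    return tuple(p + d * ratio for p, d in zip(tool_pos, hand_delta))

# Example: the hand moves 10 cm upwards while holding a 4 kg virtual sledgehammer.
print(update_virtual_tool(tool_pos=(0.0, 1.0, 0.0),
                          hand_delta=(0.0, 0.1, 0.0),
                          virtual_mass_kg=4.0))
# -> the tool rises only 2.5 cm, which is perceived as resistance or weight.
```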

For the complete contribution, click here:

Towards a Design Space of Proprioceptive Interaction

Takashi Goto, Katrin Wolf

Proprioceptive interfaces both convey information to users by actuating their bodies and receive input through captured body movements. While humans, as well as most user interfaces, mainly rely on vision to receive information, proprioceptive interfaces extend established interface modalities and thus not only quantitatively increase the information bandwidth but also enrich the information quality. This paper discusses the general concept of providing information through proprioception and gives an overview of previous research. It moreover describes our exemplary implementation of the concept and proposes how future work could reduce research gaps in this area, aiming to build a design space of proprioceptive interaction.

For the complete contribution, click here:

Generating a Haptic Feedback Tool for Craftspeople

Ece Dinçer

Due to both increasing digitalisation and the measures taken during the pandemic, people have been confronted with new remote communication and collaboration technologies and techniques over the last two years. Although some employees can work remotely with solutions such as video conferencing, virtual workspaces, and collaborative file editing, these technologies currently limit users to their visual and auditory perceptions, which affects the nature and complexity of their ways of working. Especially when workers use traditionally physical tools, as in handcrafts, learning new skills or introducing the usage of tools to newcomers is restricted. Starting from this point, this master's thesis aims to use haptic feedback to create a tool that bridges craftspersonship and remote work. For this purpose, related work has been reviewed under the headings of "Haptic Handheld Controllers", "Wearables", "Texture", and "Touchless Feedback". Moreover, a Research through Design approach will be followed to reach the expected outcome: a high-fidelity prototype of a haptically tangible manipulative tool that helps craftspeople bring their handcraft into remote working spaces in a natural way.

For the complete contribution, click here:

Tangible Objects in Virtual Reality for Visuo-Haptic Feedback: A Marker-Based Approach

Ana Rita Rebelo, Rui Nóbrega

Including tangible objects in Virtual Reality (VR) experiences enhances interaction and immersion in virtual experiences. Users' hands are freed to interact directly with physical objects and thus receive haptic feedback that complements the primarily visual information offered by the Head-Mounted Display (HMD). The challenge in this area lies in tracking and mapping physical objects into the Virtual Environment (VE). Several approaches have been proposed to integrate tangible objects into the virtual world. Most methods require attaching sensors to the physical objects, usually resulting in object-specific solutions that make the system inflexible when tracking different objects. This position paper discusses different methods to include tangible objects in VEs. As a flexible solution that is suitable for different objects and requires few additional hardware resources, we present our marker-based approach, which uses computer vision to track the physical objects.
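
As a rough idea of how such marker-based tracking can work, the sketch below detects ArUco markers with OpenCV (assuming OpenCV 4.7 or newer with the contrib modules installed) and estimates each marker's pose relative to the camera. The camera index, marker size, and calibration values are placeholders, and the code is not the authors' actual pipeline.

```python
# Illustrative marker-based tracking sketch using OpenCV's ArUco module.
# Camera index, marker size, and intrinsics below are placeholder assumptions.
import cv2
import numpy as np

MARKER_SIZE = 0.05  # marker edge length in metres (assumed)
# Placeholder intrinsics; a real setup would use values from camera calibration.
CAMERA_MATRIX = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
DIST_COEFFS = np.zeros(5)

# 3D corners of a square marker centred at the origin, lying in the z = 0 plane.
half = MARKER_SIZE / 2
OBJECT_POINTS = np.array([[-half,  half, 0], [ half,  half, 0],
                          [ half, -half, 0], [-half, -half, 0]], dtype=np.float32)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    corners, ids, _ = detector.detectMarkers(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    if ids is not None:
        for marker_id, marker_corners in zip(ids.flatten(), corners):
            # rvec/tvec give the marker's orientation and position relative to the
            # camera, which can then be mapped onto a virtual object in the VE.
            ok, rvec, tvec = cv2.solvePnP(OBJECT_POINTS, marker_corners.reshape(4, 2),
                                          CAMERA_MATRIX, DIST_COEFFS)
            if ok:
                print(f"marker {marker_id}: position {tvec.ravel()}")
    cv2.imshow("camera", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```

Because only printed markers are attached to the objects, the same setup can track many different physical objects without per-object sensors.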

For the complete contribution, click here:

Towards Inducing Weight Perception in Virtual Reality Through a Liquid-based Haptic Controller

Alexander Kalus, Martin Kocur, Niels Henze

Rendering haptic sensations while interacting with virtual objects, such as experiencing their weight, is essential for creating natural virtual reality (VR) experiences. However, accurately providing the forces required to sense an object's weight poses a demanding challenge. Hence, standard VR setups do not allow users to experience different weight sensations. In this paper, we propose a haptic VR controller design that renders the weight of virtual objects by regulating the mass of the controller through liquid transfer. Our planned system consists of two tracked controllers that each contain a water bag. They are connected to a liquid reservoir in the back, to or from which water is transferred to change the weight of the controller. A microcontroller adjusts the weight of each controller via a bi-directional water pump and a set of solenoid valves. To evaluate the prototype, we plan to investigate in a study whether our system can enhance the VR experience and sense of presence while lifting and swinging virtual objects. Furthermore, we plan to examine whether the device can be used to amplify avatar embodiment.
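
A minimal control-loop sketch of the liquid-transfer idea is given below. The pump, valve, and load-cell interfaces as well as all numeric values are hypothetical and do not describe the authors' planned firmware or hardware.

```python
# Hypothetical control-loop sketch for adjusting controller weight by liquid transfer.
# The Pump, Valve, and load-cell interfaces are assumptions for illustration only.
import time

EMPTY_CONTROLLER_G = 250.0   # assumed dry weight of the controller in grams
TOLERANCE_G = 5.0            # stop transferring once within this margin

class LiquidWeightController:
    def __init__(self, pump, valve, load_cell):
        self.pump = pump            # bi-directional pump: pump.run(direction), pump.stop()
        self.valve = valve          # solenoid valve: valve.open(), valve.close()
        self.load_cell = load_cell  # load_cell.read_grams() -> current controller weight

    def set_target_weight(self, target_g):
        """Transfer water to or from the reservoir until the target weight is reached."""
        self.valve.open()
        try:
            while True:
                error = target_g - self.load_cell.read_grams()
                if abs(error) <= TOLERANCE_G:
                    break
                # Positive error: controller too light, pump water in; negative: pump out.
                self.pump.run(direction="in" if error > 0 else "out")
                time.sleep(0.05)  # let some water move before re-reading the sensor
        finally:
            self.pump.stop()
            self.valve.close()

# Example: a virtual object weighing 400 g is picked up.
# controller.set_target_weight(EMPTY_CONTROLLER_G + 400.0)
```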

For the complete contribution, click here: