
by Gianpaolo Palma and Paolo Cignoni (CNR-ISTI)

We introduce a system designed to enhance engagement in VR experiences using sensorised, 3D-printed replicas of real objects. The system lets users interact with a physical replica within the virtual environment while visualising the original object's appearance. It also supports augmented experiences in which the replica's virtual appearance can be manipulated through personalisation actions, such as painting over the object's surface or attaching additional virtual objects, all while taking advantage of the replica's tactile feedback.

Virtual reality (VR) technologies have become increasingly affordable and popular in recent years, thanks to advances in hardware and software. A critical challenge for these technologies is establishing interaction paradigms that come as close as possible to real-world interaction, thereby bringing physicality into the experience.

To address this challenge, we explore integrating consumer VR technologies with the direct manipulation of 3D-printed objects to create an interactive, tangible user interface within the virtual environment. The primary objective is to develop a VR system that provides users with a blended virtual/real experience perceived as more accurate and engaging. When the user interacts with a low-cost physical 3D-printed replica, the head-mounted display (HMD) enhances their visual experience, even though the replica's appearance differs from the original object's. The tactile feedback experienced when touching the replica, combined with its interactivity and the high-quality visuals provided by the HMD, significantly enhances immersion and the experience's emotional impact on the user. Following the reality-virtuality continuum taxonomy [1], we propose an augmented virtuality experience centred on interactive, touch-sensitive 3D-printed objects.

Figure 1: (Left) Photos of the system hardware. (Centre) Example of an interactive session in the virtual environment. (Right) Photos of the user during the interactive session shown in the centre.

The proposed system meets three requirements. The first is to enhance the visual appearance of a low-cost physical replica of an artefact by using a VR device, specifically an HMD, to virtually overlay the original object's faithful appearance onto it. The second is to strengthen the experience's emotional impact by enabling users to physically manipulate the replica within the virtual environment, leveraging touch feedback. The final requirement focuses on improving immersion and engagement by allowing personalisation of the replica, changing its virtual appearance when touched, via a physical personalisation palette. To fulfil these requirements, we have designed a system comprising custom hardware components and a software library [2].

The proposed hardware setup integrates several devices. The primary device is the HMD, which provides the visual VR experience and tracks the 3D-printed replica in the VR environment. We utilised the HTC Vive and its extension, the Vive Tracker, for real-object tracking. Additionally, we integrated the HMD with a Leap Motion sensor to enable robust active hand tracking. We then developed a reusable 3D-printed support to mount physical replicas of different objects onto the Vive Tracker, allowing them to be tracked within the HMD's working area. A physical palette equipped with customisable buttons is attached to the 3D-printed support, enabling users to select the type of personalisation to apply to the virtual object's surface. Finally, an electronic controller with capacitive touch sensing detects when the user touches the replica or the personalisation palette.

The software library, distributed between the replica hardware and the PC running the experience, collects all the data generated by the hardware devices, computes the surface position when the user touches the replica, and presents this information visually to the user. Developed within the Unity game engine, the software employs a custom script to manage Wi-Fi communication with the capacitive touch-sensing controller.
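The article does not disclose the wire protocol, so the following Unity-side client is only a minimal sketch under assumed conventions: the controller acts as a TCP server over Wi-Fi and sends one text line per touch-state change (the address, port, and message format below are illustrative, not the published implementation).

```csharp
using System.IO;
using System.Net.Sockets;
using System.Threading;
using UnityEngine;

public class TouchControllerClient : MonoBehaviour
{
    // Hypothetical endpoint of the touch-sensing controller; the article
    // does not state the real address, port, or message format.
    public string controllerAddress = "192.168.4.1";
    public int controllerPort = 8266;

    public volatile bool replicaTouched;   // latest touch state of the replica

    private TcpClient client;
    private Thread readerThread;

    void Start()
    {
        client = new TcpClient(controllerAddress, controllerPort);
        readerThread = new Thread(ReadLoop) { IsBackground = true };
        readerThread.Start();
    }

    // Blocking read loop on a background thread: one text line per
    // touch-state change, e.g. "TOUCH replica 1" (assumed format).
    void ReadLoop()
    {
        try
        {
            using (var reader = new StreamReader(client.GetStream()))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    var parts = line.Split(' ');
                    if (parts.Length == 3 && parts[0] == "TOUCH" && parts[1] == "replica")
                        replicaTouched = parts[2] == "1";
                }
            }
        }
        catch (System.Exception) { /* socket closed on shutdown */ }
    }

    void OnDestroy()
    {
        client?.Close();   // also unblocks the reader thread
    }
}
```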

Our system utilises a client-server architecture, with the server running on the controller and the client embedded in the Unity application. Once the connection is established, the server sends a message whenever the touch status of the replica or the palette buttons changes. The action associated with each palette button is configurable within the Unity application. For each touch event on the replica, the script determines the surface position at which to change the appearance. This is done with a simple ray-casting procedure against the 3D model of the replica, where each ray originates from the position of the index fingertip detected by the Leap Motion sensor.
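The surface lookup itself maps directly onto Unity's built-in physics ray cast. A minimal sketch, assuming the replica's 3D model carries a MeshCollider on a dedicated layer (the helper name and distance values are illustrative assumptions):

```csharp
using UnityEngine;

public static class TouchLocator
{
    // Casts a ray from the detected fingertip against the replica's collider
    // and reports the touched surface point, if any.
    public static bool TryFindTouchPoint(Vector3 fingertip, Vector3 rayDirection,
                                         LayerMask replicaLayer, out RaycastHit hit)
    {
        // Start slightly behind the fingertip so contacts are not missed
        // when the finger already penetrates the virtual surface.
        Vector3 origin = fingertip - rayDirection.normalized * 0.01f;
        return Physics.Raycast(origin, rayDirection, out hit, 0.05f, replicaLayer);
    }
}
```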

We have identified three primary directions for casting these rays, based on common finger-movement patterns. The first direction is defined by the index distal phalanx; it improves the detection of contact points when the user touches the surface with the index finger nearly aligned with the surface normal. The second is the line of view connecting the index fingertip to the user's head; it gives more robust detection when the user touches the surface with the palm orthogonal to the view direction. The third is defined by the hand palm; it improves robustness when the user touches the replica along its silhouette.

Through the personalisation palette, users can virtually paint over the replica's surface, using their index finger as a brush, or attach additional virtual objects to it. The palette buttons let users select paint colours, choose objects to attach, or undo previous actions. After selecting an action, when the user touches the replica, the system applies the chosen colour to the virtual object's surface or places the selected object at the touched point, using the surface normal for a coherent orientation.
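A hedged sketch of how these pieces could fit together in Unity follows. The class, field, and method names are illustrative assumptions rather than the published implementation, and reading the touched UV coordinate via RaycastHit.textureCoord presumes the replica uses a MeshCollider.

```csharp
using UnityEngine;

public class ReplicaPersonaliser : MonoBehaviour
{
    public LayerMask replicaLayer;
    public Texture2D paintTexture;        // writable copy of the replica's albedo map
    public Color paintColour = Color.red;
    public GameObject attachmentPrefab;   // virtual object selected on the palette

    // The three candidate ray directions described above.
    Vector3[] CandidateDirections(Vector3 fingertip, Vector3 distalPhalanxDir,
                                  Vector3 headPos, Vector3 palmNormal)
    {
        return new[]
        {
            distalPhalanxDir.normalized,        // 1. along the index distal phalanx
            (fingertip - headPos).normalized,   // 2. along the head-to-fingertip line of view
            palmNormal.normalized               // 3. along the palm normal (silhouette touches)
        };
    }

    public void OnReplicaTouched(Vector3 fingertip, Vector3 distalPhalanxDir,
                                 Vector3 headPos, Vector3 palmNormal, bool painting)
    {
        foreach (var dir in CandidateDirections(fingertip, distalPhalanxDir,
                                                headPos, palmNormal))
        {
            if (!Physics.Raycast(fingertip - dir * 0.01f, dir, out var hit,
                                 0.05f, replicaLayer))
                continue;

            if (painting)
            {
                // Paint one texel at the touched UV coordinate.
                var uv = hit.textureCoord;
                paintTexture.SetPixel((int)(uv.x * paintTexture.width),
                                      (int)(uv.y * paintTexture.height), paintColour);
                paintTexture.Apply();
            }
            else
            {
                // Attach a virtual object, oriented along the surface normal.
                Instantiate(attachmentPrefab, hit.point,
                            Quaternion.FromToRotation(Vector3.up, hit.normal));
            }
            break;   // first direction that hits the replica wins
        }
    }
}
```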

Feedback from end-users indicates that the virtual experience was exciting and engaging, thanks to the combination of tactile feedback from the physical replica and visual feedback from the virtual replica during personalisation actions. Users found the interaction natural and fascinating, especially as they grew accustomed to the system. In particular, the tracking and visualisation of the user's hands in VR were noted for enabling a level of interaction and accuracy during personalisation actions that would otherwise have been difficult to achieve.

References: 
[1] P. Milgram and F. Kishino, “A taxonomy of mixed reality visual displays,” IEICE Transactions on Information and Systems, vol. 77, no. 12, pp. 1321–1329, 1994.
[2] G. Palma, S. Perry, and P. Cignoni, “Augmented virtuality using touch-sensitive 3D-printed objects,” Remote Sensing, vol. 13, no. 11, 2021.

Please contact: 
Gianpaolo Palma, CNR-ISTI, Italy