Summary of A Vibrotactile prototyping toolkit for virtual reality and videogames

Jonatan Martínez Muñoz

Haptics, which in general refers to the sense of touch, plays an essential role not only in our perceptual construction of spatial environmental layout, but also in the human ability to manipulate objects with one's hands [1]. As Lederman and Klatzky stated [2], vision and audition are recognized for providing highly precise spatial and temporal information, respectively, whereas the haptic system is especially effective at processing the material characteristics of surfaces and objects. Moreover, this sense reinforces other channels and, as Sallnäs et al. [3] indicated, increases perceived virtual presence.

The change from traditional user interfaces, such as the mouse and keyboard, to more modern ones, like the touch screens present in mobile phones and tablets, has resulted in the loss of the perceptual cues that allowed their efficient use without visual guidance. On a keyboard, not only are the shape and layout of the keys easily perceived through the user's sense of touch, but so is the acknowledgement of their activation. On a touch screen, however, the visual channel needs to actively support the user interaction. The industry has partially compensated for this absence with vibrotactile feedback, which is present in almost every mobile device. Game controllers also benefit from this feedback, although with the different objective of enriching the user experience. Recently, interaction has jumped beyond the screen, with depth sensors like Kinect, Xtion and Leap Motion. In this case, the lack of haptic feedback is even more noticeable, since the gestures occur in the air.

This problem is not new. In fact, it has been present in virtual reality (VR) for many years, where the challenge is to provide realistic sensations to the users. Many VR environments have stunning visual displays and high-fidelity sound equipment, whereas haptic technology is clearly behind. However, being able to touch, feel, and manipulate objects, in addition to seeing and hearing them, is essential to fulfil the objective of VR. Developing haptic interaction schemes inherently requires a physical device to transmit this kind of information to the users' senses. Many researchers have studied different ways to provide realistic sensations, and several companies have created complex devices such as the Phantom or the CyberGrasp. However, the problem with these systems is twofold. On the one hand, these systems have serious limitations, like their reduced workspace or high cost, which make them suitable only for a narrow field of application. On the other hand, commercial vibrotactile devices like the CyberTouch have a fixed distribution of actuators (in this case, on the top of the fingers of a glove) and cannot be adapted to suit other applications or interaction metaphors. Moreover, solutions from the game industry like the Nintendo Wiimote or the Logitech Rumblepad provide very limited haptic sensations and are not adequate for general haptic feedback. This lack of general-purpose tactile solutions has led many haptic researchers to build their own devices [4], [5], which is time consuming and far from optimal. As the sense of touch is distributed all over the skin, these devices are not focused on a single part of the body: it is possible to find haptic devices for the hands [6], shoulders [7], torso [8], waist [9], and even integrated in objects like seats [10].
Therefore, there is a need for a system that allows an easy connection and placement of tactile actuators to form an adaptable haptic display, so that it can be used in different scenarios. Such a system would be useful not only for any general-purpose virtual environment (VE), but also for prototyping specialized haptic devices.

Research contents

This thesis aims to improve the haptic feedback in VEs, including videogames and interactions beyond the screen. To this end, several objectives are proposed. First, it is important to study the psychophysical aspects of the sense of touch, as well as the different technologies available to provide haptic feedback. Once a suitable technology has been chosen, a prototyping platform needs to be built that allows an easy development of different vibrotactile-enabled devices. This platform should be composed of an electronic controller and multiple actuators. On the software side, it is also important to design and implement a tactile authoring tool. This tool should allow the creation of tactile patterns associated with one or more actuators. Finally, different experiments should be conducted in order to assess the capabilities of the developed haptic platform to simulate different aspects of the sense of touch.
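The summary does not describe how the authoring tool actually represents these patterns. As a rough illustration only, the following Python sketch models a tactile pattern as a list of timed pulses addressed to actuator channels and samples it into per-actuator intensity envelopes; the names (TactilePulse, sample_pattern) and the event-list representation itself are assumptions made for this example, not the thesis' format or API.

```python
# Hypothetical sketch of a tactile pattern abstraction (not the thesis' actual API).
# A pattern is a list of timed pulses, each addressed to one actuator channel.
from dataclasses import dataclass

@dataclass
class TactilePulse:
    actuator: int      # channel index on the controller
    start: float       # seconds from pattern start
    duration: float    # seconds
    intensity: float   # 0.0 .. 1.0 (fraction of full drive)

def sample_pattern(pulses, n_actuators, rate=1000):
    """Sample a pulse list into per-actuator intensity envelopes at `rate` Hz."""
    length = max(p.start + p.duration for p in pulses)
    n = int(length * rate) + 1
    envelopes = [[0.0] * n for _ in range(n_actuators)]
    for p in pulses:
        i0 = int(p.start * rate)
        i1 = int((p.start + p.duration) * rate)
        for i in range(i0, min(i1, n)):
            # Overlapping pulses on the same channel keep the strongest value.
            envelopes[p.actuator][i] = max(envelopes[p.actuator][i], p.intensity)
    return envelopes

# Example: a short "tap" that travels across three actuators of a glove.
pattern = [TactilePulse(a, start=0.05 * a, duration=0.04, intensity=0.8) for a in range(3)]
envelopes = sample_pattern(pattern, n_actuators=3)
```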
Conclusion

This thesis has investigated the use of vibrotactile technology, and more specifically ERM actuators, to provide tactile feedback in virtual environments. As a result, the main contributions of the research are:

• An overview of the perception of the human sense of touch, and a deep review of the tactile technologies available to provide tactile feedback.
• A hardware prototyping platform for vibrotactile feedback. It includes novel features like a scalable architecture, support for a wide range of ERM actuators, two advanced driving techniques (overdrive and active braking; see the sketch after this list), and a volume adjustment knob to suit the user's preferences.
• A software authoring tool that, together with the hardware platform, can be used to easily design and test complex tactile patterns distributed across several actuators. It can be used to define overdrive and active braking pulses to exploit the hardware capabilities, and it provides an API library to reuse the designed stimuli from an external application. Other interesting features are a graphic representation of the actuator distribution, and the possibility to compensate for the different sensitivities of the skin by adjusting the gain of each channel.
• A vibrotactile rendering method for rendering 2D shapes and textures through an ERM actuator, as well as an experiment to evaluate its performance against a force feedback device and the bare finger.
• Two experiments to identify regular 3D shapes without visual guidance. The first one was carried out with a multi-point force feedback device, while the second one used a vibrotactile glove built with the prototyping platform. To this end, an adapted haptic rendering algorithm has been developed, using a virtual hand model and multiple collision points per actuator. In addition, an optimized collision algorithm for regular 3D shapes has been implemented to work in constant time. The results of these two experiments were compared with previous results found in the literature for single-point force feedback devices.
• An experiment to assess the feasibility of transmitting an object's weight and size information to the user in a VE through vibrations. Three different methods for rendering the weight properties of the object were tested in a user evaluation.
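Overdrive and active braking compensate for the mechanical inertia of ERM motors: a brief pulse above the target level spins the eccentric mass up faster, and a brief reverse-polarity pulse stops it faster. The thesis' platform provides these techniques in its controller; the Python sketch below only illustrates the idea by shaping a drive envelope, with the timing constants and the signed-drive convention chosen arbitrarily for the example.

```python
# Illustrative sketch of overdrive / active-braking envelope shaping for an ERM motor.
# Timing constants and the signed-drive convention are assumptions for this example,
# not values taken from the thesis' hardware platform.

def erm_drive_envelope(target, duration, rate=1000, overdrive_ms=20, brake_ms=15):
    """Return a list of drive values in [-1, 1] sampled at `rate` Hz.

    The envelope starts with a full-power overdrive pulse to overcome the
    motor's inertia, holds the target level, and ends with a reverse-polarity
    active-braking pulse to stop the rotating mass quickly.
    """
    n_total = int(duration * rate)
    n_over = min(int(overdrive_ms / 1000 * rate), n_total)
    n_brake = int(brake_ms / 1000 * rate)

    envelope = [1.0] * n_over                  # overdrive: 100% drive
    envelope += [target] * (n_total - n_over)  # steady state at the target level
    envelope += [-1.0] * n_brake               # active braking: reverse drive
    envelope += [0.0]                          # rest
    return envelope

# Example: a 120 ms pulse at 60% intensity with overdrive and active braking.
env = erm_drive_envelope(target=0.6, duration=0.12)
```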

