Proceedings Volume 7525

The Engineering Reality of Virtual Reality 2010


View the digital version of this volume at SPIE Digital Library.

Volume Details

Date Published: 27 January 2010
Contents: 5 Sessions, 16 Papers, 0 Presentations
Conference: IS&T/SPIE Electronic Imaging 2010
Volume Number: 7525

Table of Contents

All links to SPIE Proceedings will open in the SPIE Digital Library.
  • Front Matter: Volume 7525
  • Color Me Beautiful: Putting Your Best Image Forward
  • Work It Out: Using Interfaces for Facilitating Ableness
  • Artful Realities and Engineering Experience
  • Virtual Environments in Action
Front Matter: Volume 7525
Front Matter: Volume 7525
This PDF file contains the front matter associated with SPIE Proceedings Volume 7525, including the Title Page, Copyright information, Table of Contents, Introduction, and Conference Committee listing.
Color Me Beautiful: Putting Your Best Image Forward
Hotspot mitigation in the StarCAVE
Jordan Rhee, Jurgen Schulze, Thomas A. DeFanti
Rear-projected screens, such as those in Digital Light Projection (DLP) televisions, suffer from an image quality problem called hot spotting, where the image is brightest at a point that depends on the viewing angle. In rear-projected multi-screen configurations such as the StarCAVE at Calit2, this causes discontinuities in brightness at the edges where screens meet, and thus in the 3D image perceived by the user. In the StarCAVE we know the viewer's position in 3D space and we have programmable graphics hardware, so we can mitigate this effect by post-processing the image with the inverse of the hotspot pattern, yielding a homogeneous output image. Our implementation improves brightness homogeneity by a factor of 4 while decreasing the frame rate by only 1-3 fps.
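The abstract does not give the correction formula. As an illustrative sketch, a per-pixel inverse gain map can be computed from a tracked hotspot position, here assuming a hypothetical Gaussian falloff model (the `sigma` and `floor` parameters are assumptions, not values from the paper):

```python
import numpy as np

def inverse_hotspot_gain(width, height, hot_x, hot_y, sigma=0.5, floor=0.6):
    """Per-pixel gain map inverting a Gaussian hotspot model.

    (hot_x, hot_y) is the hotspot centre in normalized [0, 1] screen
    coordinates, derived from the tracked viewer position. The falloff
    model and its parameters are illustrative, not those of the paper.
    """
    ys, xs = np.mgrid[0.0:1.0:complex(0, height), 0.0:1.0:complex(0, width)]
    # Modeled brightness: brightest at the hotspot, falling off radially.
    brightness = floor + (1.0 - floor) * np.exp(
        -((xs - hot_x) ** 2 + (ys - hot_y) ** 2) / (2.0 * sigma ** 2)
    )
    # Inverse gain flattens the field; normalized so gain <= 1 everywhere.
    gain = brightness.min() / brightness
    return gain

gain = inverse_hotspot_gain(64, 64, hot_x=0.5, hot_y=0.5)
# Multiplying the rendered image by `gain` makes the modeled output
# brightness * gain constant across the screen.
```

In practice such a gain map would be applied in a fragment shader as a final post-processing pass, recomputed as the tracked viewer moves.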
A turning cabin simulator to reduce simulator sickness
Ronald R. Mourant, Zhishuai Yin
A long-standing problem associated with driving simulators is simulator sickness. A possible cause is that the optical flow experienced in driving simulators differs greatly from that experienced in real-world driving. To potentially reduce simulator sickness, a turning-cabin driving simulator, whose cabin rotates around the yaw axis, was built. In the multi-projector display system, algorithms were implemented to correct both geometric and photometric distortions in software, producing a seamless high-resolution display on a cylindrical screen. An automotive seat was mounted on an AC servo actuator at the center of the cylindrical screen. The force-feedback steering wheel and the gas and brake pedals were connected to the simulator's computer. Experiments were conducted to study the effect of optical flow patterns on simulator sickness. Results suggested that the optical flow perceived by drivers in the fixed-base simulator was greater than that in the turning-cabin simulator, and drivers reported a higher degree of simulator sickness in the fixed-base simulator. The lower amount of optical flow perceived in the turning-cabin simulator is believed to be a positive factor in reducing simulator sickness.
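The photometric part of such a calibration can be illustrated with a minimal sketch of edge blending between two overlapping projectors, assuming a simple linear intensity ramp (the paper's actual calibration procedure is not specified in the abstract):

```python
import numpy as np

def overlap_blend_weights(width, overlap):
    """1-D photometric blend ramps for two side-by-side projectors.

    `overlap` is the number of shared pixel columns. Each projector's
    intensity is ramped linearly across the overlap so that the summed
    brightness in the shared region stays constant. A linear ramp is a
    common baseline; real systems often use gamma-corrected curves.
    """
    left = np.ones(width)
    right = np.ones(width)
    ramp = np.linspace(1.0, 0.0, overlap)
    left[-overlap:] = ramp          # left projector fades out
    right[:overlap] = ramp[::-1]    # right projector fades in
    return left, right

left, right = overlap_blend_weights(100, 20)
# In the shared columns, left + right == 1, so the seam is invisible.
```

Geometric calibration (warping each projector's image onto the cylindrical screen) would be handled separately, typically with a per-projector mesh warp.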
Work It Out: Using Interfaces for Facilitating Ableness
Virtual reality welder training
Steven A. White, Dirk Reiners, Mores Prachyabrued, et al.
This document describes the Virtual Reality Simulated MIG Lab (sMIG), a system for virtual reality welder training. It is designed to reproduce the experience of metal inert gas (MIG) welding faithfully enough to be used as a teaching tool for beginning welding students. To make the experience as realistic as possible, it employs physically accurate, tracked input devices, a real-time welding simulation, real-time sound generation, and a 3D display for output. Because it is a fully digital system, it can go beyond providing a realistic welding experience by giving the student interactive, immediate feedback, helping to prevent wrong movements from being learned from day one.
Exploring the simulation requirements for virtual regional anesthesia training
V. Charissis, C. R. Zimmer, S. Sakellariou, et al.
This paper presents an investigation into the simulation requirements for virtual regional anaesthesia training. To this end we have developed a prototype human-computer interface designed to facilitate Virtual Reality (VR)-augmented educational tactics for regional anaesthesia training. The proposed interface system aims to complement nerve-blocking techniques. The system operates in a real-time 3D environment, presenting anatomical information and enabling the user to explore the spatial relations of different parts of the human body without any physical constraints. Furthermore, the proposed system aims to assist trainee anaesthetists in building a mental, three-dimensional map of the anatomical elements and their relationship to the ultrasound imaging used to navigate the anaesthetic needle. Opting for a sophisticated approach to interaction, the interface elements are based on simplified visual representations of real objects, and can be operated through haptic devices and surround auditory cues. This paper discusses the challenges involved in the HCI design, introduces the visual components of the interface, and presents a tentative plan of future work, which involves the development of realistic haptic feedback and various regional anaesthesia training scenarios.
Using commodity accelerometers and gyroscopes to improve speed and accuracy of JanusVF
Malcolm Hutson, Dirk Reiners
Several critical limitations exist in the currently available commercial tracking technologies for fully-enclosed virtual reality (VR) systems. While several 6DOF solutions can be adapted to work in fully-enclosed spaces, they still include elements of hardware that can interfere with the user's visual experience. JanusVF introduced a tracking solution for fully-enclosed VR displays that achieves comparable performance to available commercial solutions but without artifacts that can obscure the user's view. JanusVF employs a small, high-resolution camera that is worn on the user's head, but faces backwards. The VR rendering software draws specific fiducial markers with known size and absolute position inside the VR scene behind the user but in view of the camera. These fiducials are tracked by ARToolkitPlus and integrated by a single-constraint-at-a-time (SCAAT) filter to update the head pose. In this paper we investigate the addition of low-cost accelerometers and gyroscopes such as those in Nintendo Wii remotes, the Wii Motion Plus, and the Sony Sixaxis controller to improve the precision and accuracy of JanusVF. Several enthusiast projects have implemented these units as basic trackers or for gesture recognition, but none so far have created true 6DOF trackers using only the accelerometers and gyroscopes. Our original experiments were repeated after adding the low-cost inertial sensors, showing considerable improvements and noise reduction.
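The paper fuses the inertial data with camera-based fiducial tracking via a SCAAT filter. As a simpler stand-in, a generic complementary filter illustrates how a drifting gyro and a drift-free but noisier absolute measurement can be combined (all parameters and the constant-bias scenario below are illustrative, not from the paper):

```python
def complementary_filter(gyro_rates, camera_yaws, dt, alpha=0.98):
    """Fuse gyro angular rate (rad/s) with absolute camera yaw (rad).

    The gyro integrates smoothly but drifts; the fiducial-camera yaw
    is drift-free but noisy and low-rate. `alpha` weights the gyro
    path. This is a generic complementary filter, not the SCAAT
    filter the paper actually uses.
    """
    yaw = camera_yaws[0]
    out = []
    for rate, cam in zip(gyro_rates, camera_yaws):
        predicted = yaw + rate * dt                   # dead-reckon with gyro
        yaw = alpha * predicted + (1 - alpha) * cam   # correct toward camera
        out.append(yaw)
    return out

# Constant true rate of 0.1 rad/s; gyro biased by +0.02 rad/s.
dt = 0.01
true = [0.1 * dt * i for i in range(1000)]
est = complementary_filter([0.12] * 1000, true, dt)
# Open-loop integration of the biased gyro would drift by 0.2 rad over
# 10 s; the camera correction term keeps the error bounded well below that.
```

The appeal of this structure is that the camera term continuously cancels the gyro's accumulated bias, while the gyro supplies smooth high-rate motion between camera updates.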
A case study of collaborative facilities use in engineering design
Laura Monroe, David Pugmire
In this paper we describe the use of visualization tools and facilities in the collaborative design of a replacement weapons system, the Reliable Replacement Warhead (RRW). We used not only standard collaboration methods but also a range of visualization software and facilities to bring together domain specialists from laboratories across the country to collaborate on the design and integrate this disparate input early in the design. This was the first time in U.S. weapons history that a weapon had been designed in this collaborative manner. Benefits included projected cost savings, design improvements and increased understanding across the project.
Artful Realities and Engineering Experience
Dissociation in virtual reality: depersonalization and derealization
This paper looks at virtual worlds such as Second Life (SL) as possible incubators of dissociation disorders as classified by the Diagnostic and Statistical Manual of Mental Disorders, 4th Edition (also known as the DSM-IV). Depersonalization is where "a person feels that he or she has changed in some way or is somehow unreal." Derealization is when "the same beliefs are held about one's surroundings." Dissociative Identity Disorder (DID), previously known as multiple personality disorder, fits users of Second Life who adopt "in-world" avatars and, in effect, enact multiple distinct identities or personalities (known as alter egos or alters). Select questions from the Structured Clinical Interview for Depersonalization (SCI-DER) will be discussed as they might apply to the user's experience in Second Life. Finally, I would like to consider the hypothesis that, rather than a pathological disorder, dissociation is a normal response to the "artificial reality" of Second Life.
The interplays among technology and content, immersant and VE
Meehae Song, Diane Gromala, Chris Shaw, et al.
The research program aims to explore and examine the fine balance necessary for maintaining the interplays between technology and the immersant, including identifying qualities that contribute to creating and maintaining a sense of "presence" and "immersion" in an immersive virtual reality (IVR) experience. Building upon and extending previous work, we compare sitting meditation with walking meditation in a virtual environment (VE). The Virtual Meditative Walk, a new work-in-progress, integrates VR and biofeedback technologies with a self-directed, uni-directional treadmill. As immersants learn how to meditate while walking, robust, real-time biofeedback technology continuously measures breathing, skin conductance and heart rate. The physiological states of the immersant will in turn affect the audio and stereoscopic visual media through shutter glasses. We plan to test the potential benefits and limitations of this physically active form of meditation with data from a sitting form of meditation. A mixed-methods approach to testing user outcomes parallels the knowledge bases of the collaborative team: a physician, computer scientists and artists.
Ambient clumsiness in virtual environments
Silvia Ruzanka, Katherine Behar
A fundamental pursuit of Virtual Reality is the experience of a seamless connection between the user's body and actions within the simulation. Virtual worlds often mediate the relationship between the physical and virtual body through creating an idealized representation of the self in an idealized space. This paper argues that the very ubiquity of the medium of virtual environments, such as the massively popular Second Life, has now made them mundane, and that idealized representations are no longer appropriate. In our artwork we introduce the attribute of clumsiness to Second Life by creating and distributing scripts that cause users' avatars to exhibit unpredictable stumbling, tripping, and momentary poor coordination, thus subtly and unexpectedly intervening with, rather than amplifying, a user's intent. These behaviors are publicly distributed, and manifest only occasionally - rather than intentional, conscious actions, they are involuntary and ambient. We suggest that the physical human body is itself an imperfect interface, and that the continued blurring of distinctions between the physical body and virtual representations calls for the introduction of these mundane, clumsy elements.
Body parts
In this project, the artist wishes to examine corporeality in the virtual realm through the use of the (non-)physical body of the avatar. An art installation created in the virtual world of Second Life, meant to be accessed with site-specific avatars, provides the creative platform for this investigation. Thus, "body parts" seeks to challenge the residents of virtual environments to connect with the virtual manifestations, i.e., avatars, of others in an emotionally expressive and intimate manner.
Virtual Environments in Action
The social computing room: a multi-purpose collaborative visualization environment
David Borland, Michael Conway, Jason Coposky, et al.
The Social Computing Room (SCR) is a novel collaborative visualization environment for viewing and interacting with large amounts of visual data. The SCR consists of a square room with 12 projectors (3 per wall) used to display a single 360-degree desktop environment that provides a large physical real estate for arranging visual information. The SCR was designed to be cost-effective, collaborative, configurable, widely applicable, and approachable for naive users. Because the SCR displays a single desktop, a wide range of applications is easily supported, making it possible for a variety of disciplines to take advantage of the room. We provide a technical overview of the room and highlight its application to scientific visualization, arts and humanities projects, research group meetings, and virtual worlds, among other uses.
Experiments in mixed reality
David M. Krum, Ramy Sadek, Luv Kohli, et al.
As part of the Institute for Creative Technologies and the School of Cinematic Arts at the University of Southern California, the Mixed Reality lab develops technologies and techniques for presenting realistic immersive training experiences. Such experiences typically place users within a complex ecology of social actors, physical objects, and collections of intents, motivations, relationships, and other psychological constructs. Currently, it remains infeasible to completely synthesize the interactivity and sensory signatures of such ecologies. For this reason, the lab advocates mixed reality methods for training and conducts experiments exploring such methods. Currently, the lab focuses on understanding and exploiting the elasticity of human perception with respect to representational differences between real and virtual environments. This paper presents an overview of three projects: techniques for redirected walking, displays for the representation of virtual humans, and audio processing to increase stress.
Interactive augmented reality system for product design review
Giandomenico Caruso, Guido Maria Re
The development process of industrial products includes a design review phase, a crucial phase in which various experts cooperate in selecting the optimal product shape. Although computer graphics allows us to create very realistic virtual representations of products, it is not uncommon for designers to build physical mock-ups of their newly conceived products, because they need to physically interact with the prototype and to evaluate the product within a variety of real contexts. This paper describes the hardware and software development of our Augmented Reality design review system, which overcomes some issues related to 3D visualization and to interaction with virtual objects. Our system is composed of a video see-through head-mounted display, which improves 3D visualization by automatically controlling the convergence of the video cameras, and a wireless control system, which allows us to create metaphors for interacting with the virtual objects. During the development of the system, in order to define and tune the algorithms, we performed several testing sessions. We then performed further tests to verify the effectiveness of the system and to collect additional data and comments about usability and ergonomics.
Art in virtual reality 2010
For decades, virtual reality artwork has existed in a small but highly influential niche in the world of electronic and new media art. Since the early 1990s, virtual reality installations have come to define an extreme boundary point of both aesthetic experience and technological sophistication. Classic virtual reality artworks have an almost mythological stature - powerful, exotic, and often rarely exhibited. Today, art in virtual environments continues to evolve and mature, encompassing everything from fully immersive CAVE experiences to performance art in Second Life to the use of augmented and mixed reality in public space. Art in Virtual Reality 2010 is a public exhibition of new artwork that showcases the diverse ways contemporary artists use virtual environments to explore new aesthetic ground and investigate the continually evolving relationship between our selves and our virtual worlds.
Dream Home: a multiview stereoscopic interior design system
Fu-Jen Hsiao, Chih-Jen Teng, Chung-Wei Lin, et al.
In this paper, a novel multi-view stereoscopic interior design system, "Dream Home", is presented to give users a new interior design experience. Unlike previous interior design systems, we put emphasis on intuitive manipulation and real-time multi-view stereoscopic visualization. Users can carry out their own interior design using just their hands and eyes, without any difficulty. They manipulate furniture cards directly to set up their living room in the model-house task space, get multi-view 3D visual feedback instantly, and re-adjust the cards until they are satisfied. No special skills are required, and users can explore their design ideas freely. We hope that "Dream Home" will make interior design more user-friendly, more intuitive, and more vivid.