
SI: MANUFACTURING AND CONSTRUCTION

Editorial

James Ritchie • Judy Vance • Satyandra Gupta

Received: 15 February 2011 / Accepted: 17 February 2011 / Published online: 1 March 2011

© Springer-Verlag London Limited 2011

This is the second volume of the special issue on virtual manufacturing and construction.

To date, most virtual assembly simulations concentrate on interactions of rigid bodies; however, many objects to be assembled undergo some amount of deformation during the assembly process. Mishra and Suresh propose a method to simulate the virtual assembly of CAD models of thin, deformable beam- and plate-like objects. These geometries are often found in snap-fit assemblies of two or more parts. They propose a dual-representation technique that relies on mapping a physics-based model to the 3D geometric model. For the physics model, a simplified stiffness matrix based on a projected surface is computed. A finite element analysis is then performed on this simplified model, yielding the deformation of the 2D projected surface. The full deflection of the 3D surface is recovered through a unique mapping of the 2D deflections to the triangular representation of the 3D surface. The final deformed surface can then be displayed using OpenGL. Several sample problems are presented to test the algorithm for accuracy, generality, and speed. The paper concludes with a case study of an electrical connector assembly.
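To picture the final mapping step, the following is a minimal sketch (not the authors' implementation) that assumes a one-to-one correspondence between the 3D surface vertices and the nodes of the projected 2D plate model, so that each vertex is simply offset along the projection normal by its computed deflection; all names are illustrative.

import numpy as np

def recover_3d_deflection(verts3d, w2d, plate_normal):
    """Offset each 3D surface vertex by the out-of-plane deflection
    computed for its corresponding node of the projected 2D plate model.

    verts3d      : (N, 3) undeformed 3D vertex positions
    w2d          : (N,)   nodal deflections from the 2D finite element solve
    plate_normal : (3,)   projection direction used to build the plate model
    """
    n = np.asarray(plate_normal, dtype=float)
    n /= np.linalg.norm(n)                # unit normal of the projection plane
    return verts3d + np.outer(w2d, n)     # deformed vertices, ready for display

The resulting vertex array can then be handed directly to an OpenGL vertex buffer for rendering.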

Dorozhkin et al. present research that couples a factory simulation of product movement and operator utilization with immersive virtual reality. Discrete event simulation is a well-developed tool used to simulate part flow, operator utilization, and factory layout for complex assembly processes. The authors describe an immersive virtual reality application for assembly planning that integrates data from discrete event simulation, 3D factory layouts, and 3D CAD geometry. The application supports navigation throughout the simulation and the ability to query parts and operators as to their status within the simulation. Bottlenecks and underutilized operators are easily identified within the full-scale immersive environment. The application requires little preprocessing and facilitates easy modification to support evaluation of multiple factory configurations and their effect on part production.
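The kind of status query the application supports can be thought of as a lookup into the discrete event simulation's event log for a given part or operator; the record layout and function below are hypothetical and meant only to illustrate the idea.

from dataclasses import dataclass

@dataclass
class Event:
    time: float     # simulation time of the state change
    entity: str     # part or operator identifier
    state: str      # e.g. "queued", "machining", "idle"
    station: str    # factory station where the entity is located

def status_at(events, entity, query_time):
    """Return the most recent recorded state of an entity at a given time."""
    history = [e for e in events if e.entity == entity and e.time <= query_time]
    return max(history, key=lambda e: e.time, default=None)

# Example: what is part "P42" doing at t = 120 s?
log = [Event(0.0, "P42", "queued", "buffer1"), Event(95.0, "P42", "machining", "cnc3")]
print(status_at(log, "P42", 120.0))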

Cai and Lin present an interesting approach to extending the tracking range of single-camera, non-wearable eye trackers for use in wide field-of-view virtual scenarios. Because eye and head movement are loosely coupled, they propose to augment eye tracking with head pose estimation. The focus of the paper is the development and testing of head pose estimation using image recognition, principal component analysis (PCA), and neural networks. To generate the head pose estimate, a training session is completed in which captured images are analyzed using PCA. A neural network then operates on the PCA data to output head pitch and yaw based on associative mapping. Based on the results of three experiments, the authors conclude that head pose tracking is more suitable for identifying areas of interest than for tracking points of interest. Several improvements and directions for future work are discussed.
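As a rough illustration of such a PCA-plus-neural-network pipeline (a generic sketch, not the authors' architecture or data), flattened head images could be compressed with PCA and regressed to pitch and yaw, for example with scikit-learn; the array shapes and placeholder data below are assumptions.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline

# X: flattened grayscale head images; y: ground-truth pitch and yaw (degrees)
rng = np.random.default_rng(0)
X = rng.random((200, 64 * 64))            # placeholder training images
y = rng.uniform(-30, 30, size=(200, 2))   # placeholder pitch/yaw labels

model = make_pipeline(
    PCA(n_components=20),                  # project images to a low-dimensional pose space
    MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000),
)
model.fit(X, y)
pitch, yaw = model.predict(X[:1])[0]       # pose estimate for a new frame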

The research by Yoganandan et al. results in a novel method for simulating both the behavioral and functional operation of touch-screen mobile devices using virtual reality and haptics. There are two primary contributions in this work: simulation of a deformable visual interface activated through haptic simulation of a stylus, and functional prototyping of user logic that is directly transferable to a real device. Three-dimensional CAD models define the basic device geometry. The VR system uses embedded web pages to capture display layouts and a Windows CE OS emulator to simulate functionality. The paper describes in detail the methods developed to perform visual deformation of the texture maps, computed on the GPU, and haptic feedback, calculated on the CPU. Using a 3DOF haptic device to simulate a stylus, users can interact with the virtual prototype and perform a wide variety of mouse-like actions, such as dragging, selection, and button presses, while viewing the deformed UI in real time. Because of the use of embedded web pages and the Windows CE OS emulator, designers can quickly mock up multiple design options and perform functional evaluations.
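The division of labor described above (visual deformation on the GPU, forces on the CPU) can be pictured with a simple penalty-style contact force for the stylus tip; this is a generic sketch of such a force law, not the authors' haptic model, and the parameter values are arbitrary.

import numpy as np

def stylus_force(tip_pos, screen_point, screen_normal, stiffness=800.0):
    """Penalty contact force for a stylus pressing on a flat, deformable
    screen: push back along the surface normal in proportion to the
    penetration depth, and return zero force when there is no contact."""
    n = screen_normal / np.linalg.norm(screen_normal)
    depth = np.dot(screen_point - tip_pos, n)   # > 0 when the tip is below the surface
    if depth <= 0.0:
        return np.zeros(3)
    return stiffness * depth * n                # force sent to the 3DOF haptic device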

Chambers et al. describe methods to simulate weld bead formation to support a real-time, interactive, virtual Metal Inert Gas (MIG) welding training simulator. The methods are implemented as part of a low-cost welding training simulator, sMIG, previously developed by some of the authors. A finite difference approach is used to determine the temperature distribution in the plate and to calculate the weld bead shape and depth of penetration fast enough to support user interaction during training. Partial convergence of the differential equation representing the temperature distribution is used to achieve fast computation. Weld bead height, width, and depth were calculated at over 92% accuracy when compared with actual weld results. Limitations in simulating faster weld gun speeds are discussed as future work. This approach advances welding training simulation by allowing arbitrarily shaped weld paths and non-constant weld gun speeds.
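A generic illustration of a finite-difference temperature update run to only partial convergence each frame is sketched below; it shows the general technique rather than the authors' discretization, boundary conditions, or bead-geometry calculation, and all parameters are assumptions.

import numpy as np

def step_temperature(T, source, alpha, dx, dt, iters_per_frame=4):
    """Advance the plate temperature field T (a 2D grid) by a few explicit
    finite-difference steps of the heat equation per rendered frame; capping
    the iteration count trades accuracy for interactive speed."""
    # The explicit scheme is stable only for dt <= dx**2 / (4 * alpha).
    for _ in range(iters_per_frame):
        lap = np.zeros_like(T)
        lap[1:-1, 1:-1] = (T[2:, 1:-1] + T[:-2, 1:-1] +
                           T[1:-1, 2:] + T[1:-1, :-2] -
                           4.0 * T[1:-1, 1:-1]) / dx**2
        T = T + dt * (alpha * lap + source)     # source: heat input under the arc
    return T

The weld bead height, width, and penetration would then be estimated from the resulting temperature field, which is outside the scope of this sketch.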

The research by Alvarez and Su explores the use of immersive virtual reality for the conceptual design of planar and spatial mechanisms. The Virtual Reality Mechanism Design Studio (VRMDS) provides an intuitive user interface for the conceptual design of linkages with arbitrary topology, supported by robust dynamic simulation to prototype the functionality of a concept. The application is designed to run in either a desktop configuration or an immersive configuration, with support for a wide variety of input devices, including a mouse, a graphics tablet, a haptic device, or a 6DOF wand. A Python interface to MATLAB supports interactive dynamic analysis and functional prototyping. Because MATLAB simulation of mechanism dynamics is quite complicated, this virtual reality interface provides the user with an intuitive design tool that presents the design process as a visual and interactive experience, freeing novice designers from the burden of extensive training in MATLAB programming. Details of the data structure and user interface supporting the conceptual design of these mechanisms are included in the paper. A case study designing a spatial four-bar linkage is presented.
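The paper does not state which Python-to-MATLAB bridge is used; purely as a hypothetical sketch, a front end could hand a linkage description to MATLAB through the MATLAB Engine API for Python and read back the simulated trajectories. Here simulate_fourbar is an invented placeholder for a user-written MATLAB routine.

import matlab.engine   # MATLAB Engine API for Python (requires a local MATLAB installation)

# Hypothetical four-bar linkage description: link lengths in metres.
link_lengths = matlab.double([2.0, 5.0, 6.0, 4.0])

eng = matlab.engine.start_matlab()
# simulate_fourbar is assumed to integrate the mechanism dynamics in MATLAB
# and return time samples and joint angles for the requested duration.
t, theta = eng.simulate_fourbar(link_lengths, 5.0, nargout=2)
eng.quit()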

The papers provided in these two issues give an idea of how virtual manufacturing has progressed since the first special issue on the same topic was published in this journal in 2007 (Volume 11, Number 4). Many of the same issues in terms of technology development and adoption, as well as applications research, still apply. However, the number of groups participating in virtual manufacturing (VM) research is growing rapidly, and it is only a matter of time before the natural, intuitive, human-centric interfaces associated with interactive VR design and manufacturing systems become common throughout industry.

We feel that these contributions amply demonstrate the potential benefits that VM will bring to bear on industry when it is fully adopted.

Finally, we would like to thank the editors and publishers of the Virtual Reality journal itself for giving us the opportunity to create this Special Issue, highlighting research work in this important area of VR applications.

J. Ritchie (✉)
Heriot-Watt IMRC, Heriot-Watt University, Edinburgh, UK
e-mail: [email protected]

J. Vance
Virtual Reality Applications Centre, Iowa State University, Ames, IA, USA

S. Gupta
Institute for Systems Research, University of Maryland, College Park, MD, USA

Virtual Reality (2012) 16:1–2
DOI 10.1007/s10055-011-0188-8