Natural and ecological human-agent interaction: case studies in virtual and augmented reality environments
Manuela Chessa, PhD
University of Genoa - Department of Informatics, Bioengineering, Robotics and System Engineering (DIBRIS)
Tutorial: Active Vision and Human Robot Collaboration
My research: Visual perception
• Human-computer interfaces (based on VR/AR/MR): the experimental evidence guides the design of visual technologies that reduce visual fatigue.
• Bio-inspired computer vision (based on neural paradigms): the experimental evidence guides the design of artificial vision systems with real-world performance.
Today’s tutorial
Perceptual aspects of Human-Agent Interaction and techniques to achieve a natural and ecological interaction
(case studies in Virtual and Augmented Reality)
Outline of this tutorial
• Brief overview of Human-Agent Interaction
• New devices and open (research) issues
• Case studies: natural and ecological interaction in virtual and augmented reality environments
• Visual feedback for proprioception in VR with HMD
• Immersivity and cybersickness in VR
• Natural perception of 3D virtual objects
• Natural perception of VR stimuli
• Natural interaction in AR
Manuela Chessa, PhD, University of Genoa (manuela.chessa@unige.it)
Several modalities of Human-Agent Interaction
Virtual, Augmented and Mixed Reality
Virtual Reality: new devices
Virtual Reality: software
Videogames/entertainment/media
Virtual Reality: other applications
Leap Motion and VR for rehabilitation
LeapMotionRehabVRToolkit – University of Southern California
https://www.youtube.com/watch?v=KEeZdQaK0Pw
• HMDs typically consist of goggles with small monitors mounted in front of each eye and a motion tracking system that computes the position and movements of the user's head.
• In the past, good quality meant high prices (e.g. military applications)
• Now, several low-cost solutions exist (e.g. Oculus Rift, HTC Vive, Samsung Gear VR).
• Open questions: cybersickness, visual fatigue, vestibulo-ocular reflex, latency, level of immersion, proprioceptive feedback.
• All these drawbacks still prevent a truly NATURAL and ECOLOGICAL interaction in immersive VR.
Virtual Reality: open issues
Virtual Reality: going beyond the wow effect
• Recently: Oculus Call for Research
• Oculus Research is offering competitive funding to advance basic research into a number of areas of perception science that impact the development of virtual reality platforms. The areas of research include self-motion, binocular eye movements, multisensory perception, and biological motion in social interaction.
• We are particularly interested in research that utilizes a combination of computational and psychophysical approaches. One of our goals is to stimulate engagement from the vision science, cognitive science, and related fields on these important topics for virtual reality.
Oculus Call for Research, 2016
Augmented Reality: devices and applications
APPLICATIONS: videogames, museums, education, rehabilitation, serious games, training
Augmented Reality: new devices (besides smartphones)
Augmented Reality: open issues
Besides the standard applications of Augmented Reality, several questions should be investigated:
1. How do people perceive the digital contents added to the real world? Is this perception coherent and stable? What are the consequences of a non-coherent overlap between virtual and real contents?
2. How do people interact with Augmented Reality? Is Natural Interaction with such systems possible?
3. What happens if we try to use Augmented Reality not in the peripersonal space but in a wider area? Is it feasible to walk in Augmented Reality (maintaining a coherent perception)? Which Computer Vision aspects become crucial to address this?
4. Does immersive Augmented Reality cause the fatigue and cybersickness issues typical of VR headsets?
Natural Human-Computer Interaction
Experimental evidence:
• Imperfections in binocular image pairs [Kooi and Toet, 2004]
• Perceptual conflicts [Shibata et al., 2011]
• Reaching task [Bruder et al., 2013]
• New off-the-shelf devices and virtual/augmented reality technologies are available.
• They simplify the interface between the systems and the users, who can interact with the virtual environment through different modalities (movements of the body).
• The aim of current HCIs through mixed reality is:
• To create environments and situations reasonably similar to those of the real world (ecological validity)
• To make both qualitative and quantitative improvements in the daily activities of healthy and impaired people
Natural Human-Computer Interaction
Case studies
5 case studies, to show different aspects to be considered to achieve a Natural Human-Computer Interaction in actual VR and AR systems.
Case study 1
Add a VISUAL feedback of our own body inside immersive VR to improve proprioception
Case study 1: Proprioception in VR with HMD
An issue that negatively affects Natural Interaction in Immersive Virtual Reality is that your body's proprioception is missing.
REAL WORLD vs. IMMERSIVE VR WITH HMD
Case study 1: Proprioception in VR with HMD
• Virtual environments may be successfully used in contexts like VRET (virtual reality exposure therapy) or edutainment.
• VR technology must be sufficiently immersive and must correctly stimulate the senses and the emotions of the users (Krijn et al., 2004).
• VR environments should induce in users the sensation of presence (i.e. the sensation of physically being in the virtual environment).
• In the literature, the topic of presence is often addressed together with the topic of immersion (Cummings & Bailenson, 2015).
Case study 1: Proprioception in VR with HMD
In actual systems, there are some attempts to address the issue.
Solution proposed by my group: acquire the 3D position of the user with a Kinect and a Leap Motion, and insert in real time a 3D avatar inside the VR.
Interaction with HTC Vive: effective, but not ecological/natural proprioception.
Case study 1: Proprioception in VR with HMD
• The aim of the work is to create a virtual environment in which the user can visually perceive his/her own virtual body, and can interact with the virtual objects by using a body that reproduces his/her movements.
• In particular, the focus is on having a realistic, ecological and natural interaction through the whole body and, in addition, a fine interaction with the hands.
M. Chessa, L. Caroggio, H. Huang and F. Solari (2016) Insert your own body in the Oculus Rift to improve proprioception. International Conference on Computer Vision Theory and Applications, VISAPP 2016, 27th-29th February 2016, Rome.
Case study 1: Proprioception in VR with HMD
Case study 1: Proprioception in VR with HMD
Body tracking
• The data stream for tracking the user is acquired by a Microsoft Kinect;
• Synchronization of depth and color images with a resolution of 640x480 pixels at a frame rate of 30 Hz;
• The images are analyzed by the Kinect MS-SDK Assets, available in the Unity Asset Store, which use the Kinect Runtime provided by Microsoft to make the tracking information suitable to move an avatar;
• The asset gives information on the tracking of 20 joints of the user's body, which are then aligned with those acquired by the Leap Motion.
Case study 1: Proprioception in VR with HMD
Hand tracking
• The accurate tracking of the user's hands and fingers is performed by the Leap Motion;
• It is composed of two wide-angle infrared CCD cameras with an acquisition frequency of about 120 Hz; the detection field is approximately a hemisphere of 0.5 m radius above the sensor;
• The accuracy of the fingertip position detection is approximately 0.01 mm.
Case study 1: Proprioception in VR with HMD
Calibration and registration
• To align the data of all the sensors, the calibration phase is performed in two steps:
1. Rigid transformation between common points acquired by the Kinect and the Leap Motion, computed just once.
2. Live corrections to overcome the residual offset between the Kinect and the Leap Motion tracking, and management of the head position over time, computed every frame.
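Step 1 amounts to a least-squares rigid registration between the two point sets. A minimal Kabsch-style sketch in Python/NumPy (the actual system runs in Unity, so this illustrates the math, not the deployed code):

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid motion (rotation R, translation t) mapping
    src points onto dst points, via the SVD-based Kabsch method."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    c_src = src.mean(axis=0)          # centroids of the two point sets
    c_dst = dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)   # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t
```

Note the reflection guard: with nearly coplanar joints (a failure mode mentioned below), a plain SVD solution can return an improper rotation, so the determinant check matters in exactly the conditions this setup encounters.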
Case study 1: Proprioception in VR with HMD
Uncalibrated data from the sensors
To compute the rigid transformation, we use the least-squares rigid motion via SVD technique.
Own avatar inside the VR environment, after the calibration and the rigid transformation.
Case study 1: Proprioception in VR with HMD
• The result of the rigid transformation, performed on a single set of samples, often leads to a visually incorrect alignment; this is due to multiple factors:
• possible coplanar structure of the points: hands, wrists and elbows almost on the same plane;
• noise that affects the Kinect joints, in particular hands and wrists;
• frequent mismatch between the centers of the hands, due to the noise and to the worse accuracy of the Kinect with respect to the Leap Motion.
Case study 1: Proprioception in VR with HMD
Own avatar inside the VR environment: final result after live corrections
Case study 1: Proprioception in VR with HMD
https://www.youtube.com/watch?v=2UaxkyZbeLQ
Case study 1: Proprioception in VR with HMD
QUALITY OF THE VR SCENARIO
TRUTHFULNESS OF THE BODY
SENSE OF IMMERSION
Case study 1: Proprioception in VR with HMD
• We have successfully added our own body inside the Oculus Rift.
• Main contribution: integration of heterogeneous sensors in a working prototype, following a simple but robust technique.
• Future work:
• improve the prototype, especially from the graphical point of view;
• test other devices;
• add other senses (e.g. forces).
• Current work:
• perceptual calibration.
But see (Lugrin et al., 2015)
Case study 2
To act in a Natural way inside HMDs, VR should be sufficiently immersive and should not cause cybersickness
Case study 2: Immersivity and cybersickness in VR
• Given the widespread diffusion of new and inexpensive devices originally designed for games and entertainment, it is of general interest to test whether these devices can be effectively used in contexts different from the ones for which they were designed.
• The aim of this work is to investigate whether the Oculus Rift HMD can generate a perceptual experience similar to that experienced in the real world.
MOTIVATION OF THE RESEARCH
M. Chessa, G. Maiello, A. Borsari, PJ Bex (2016) The Perceptual Quality of the Oculus Rift for Immersive
Virtual Reality. Human Computer Interaction, pp. 1-32
Case study 2: Immersivity and cybersickness in VR
• Immersivity. Is the Oculus Rift able to make the user feel as if he or she is in a real-world scenario? Is it possible to elicit in the user the sensation of presence via the virtual stimuli rendered by the device?
• Cybersickness. Does VR experienced through the Oculus Rift induce physical discomfort in the user?
RESEARCH QUESTIONS
Case study 2: Immersivity and cybersickness in VR
Case study 2: Immersivity and cybersickness in VR
Experiment 1: the subjects tried the 4 scenarios:
• Heart Rate Measurement
• Immersivity Questionnaire

Experiment 2: obstacle scenario only:
• Head Movement Measurement
• The Simulator Sickness Questionnaire [Kennedy et al. 1993]

Experiment 3: comparison among HMD, 3D TV, and Google Cardboard with a rollercoaster:
• Heart Rate Measurement
• Head Movement Measurement
• The Simulator Sickness Questionnaire [Kennedy et al. 1993]
• Immersivity Questionnaire
Case study 2: Immersivity and cybersickness in VR
Case study 2: Immersivity and cybersickness in VR
Case study 2: Immersivity and cybersickness in VR
Case study 2: Immersivity and cybersickness in VR
• The heart rate of human observers increased during the exposure to virtual scenarios experienced via the Oculus Rift. This result is consistent with previous literature that links physiological activation with levels of immersion [Gorini et al., 2011].
• The self-reported answers to a specifically devised immersivity questionnaire show that the majority of participants felt the experience was immersive and realistic.
• We observed a significant correlation between self-reported fear of heights and the sensation of vertigo experienced in one of two virtual scenarios involving heights.
• Observers were reactive to virtual objects placed in their path.
Case study 2: Immersivity and cybersickness in VR
• The Oculus Rift did not induce simulator sickness symptoms when observers were viewing the train scenario.
• Compared to other VR systems, namely a wide-screen 3D TV and a Google Cardboard, the Oculus Rift elicited a greater sensation of immersion and similar levels of physiological activation.
• The preliminary investigation of the Oculus Rift HMD described in this article suggests that the Oculus has great potential for employment in an array of basic research [Kim et al., 2015] and clinical applications [Hoffman et al., 2014].
Case study 3
Natural perception of 3D virtual objects: how to observe a virtual object in a mixed environment as if it were real
Visual comfort of display systems can be
seriously reduced by many factors including:
• Jitter
• Flickering
• Image motion
• Poor resolution
• Binocular asymmetry: a difference
between the left and right images of a
stereo pair.
Some specific geometrical parameters:
• the image planes have to be parallel;
• the optical centers should be offset relative to the center of the image;
• the distance between the two optical centers must equal the interpupillary distance;
• the field of view of the cameras must be equal to the angle subtended by the screen;
• the ratio focal length / viewing distance should equal the ratio image-plane width / screen width.
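These constraints amount to rendering with an off-axis (asymmetric) frustum anchored to the physical screen. A minimal sketch of the standard geometry, assuming the screen lies in the z=0 plane with the eye at positive z (function name and conventions are illustrative, not the cited system's code):

```python
def offaxis_frustum(eye, screen_w, screen_h, near):
    """Asymmetric (off-axis) frustum bounds for an eye at
    eye = (ex, ey, ez), given relative to the screen centre, with the
    screen in the z=0 plane and the viewer at ez > 0.
    Returns (left, right, bottom, top) at the near clipping plane,
    in the style of glFrustum."""
    ex, ey, ez = eye
    scale = near / ez              # project screen edges onto the near plane
    left   = (-screen_w / 2.0 - ex) * scale
    right  = ( screen_w / 2.0 - ex) * scale
    bottom = (-screen_h / 2.0 - ey) * scale
    top    = ( screen_h / 2.0 - ey) * scale
    return left, right, bottom, top
```

With the eye centred the frustum is symmetric; as the tracked head moves laterally the bounds become asymmetric, which is what keeps the virtual object glued to its real-world position.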
Case study 3: Natural perception of 3D virtual objects
3D Stereo – some (known) issues
Case study 3: Natural perception of 3D virtual objects
Vergence-accommodation conflict: the eyes of the observer have to maintain accommodation on the display screen, thus losing the natural relationship between accommodation and vergence eye movements (the eyes converge at the depth of the virtual object, while accommodation stays at the screen distance).
NO EASY SOLUTION, OUT OF THE SCOPE

Misperception of 3D objects' shape and distance: it occurs when the observer moves in front of the display (changes the head pose).
OUR TD3D SOLUTION
3D Stereo – some OPEN issues
But see [Padmanaban et al., Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays, PNAS 2017]
Case study 3: Natural perception of 3D virtual objects
Misperception of a single 3D point vs. natural perception of a single 3D point
F. Solari, M. Chessa, M. Garibotti, S.P. Sabatini (2013) Natural perception in dynamic stereoscopic augmented reality environments. Displays 34(2), pp. 142-152.
Case study 3: Natural perception of 3D virtual objects
Generalized asymmetric frustums: not a roto-translation of the standard off-axis frustums
NATURAL PERCEPTION OF 3D OBJECT: the proposed system (simulated perception of the 3D stereoscopic object)
MISPERCEPTION OF 3D OBJECT: the standard system (simulated perception of the 3D stereoscopic object)
International Patent Application WO2013088390. M. Chessa, F. Solari, M. Garibotti, S.P. Sabatini. Improved three-dimensional stereoscopic rendering of virtual objects for a moving observer. Assignee: University of Genoa, 2012.
Case study 3: Natural perception of 3D virtual objects
Case study 3: Natural perception of 3D virtual objects
Our solution, based on standard 3D monitors
M. Chessa, M. Garibotti, V. Rossi, A. Novellino, F. Solari (2015) A virtual holographic display case for museum installations. 7th
International Conference on Intelligent Technologies for Interactive Entertainment (INTETAIN), pp. 69-73. Torino, 10-12 June 2015.
Case study 3: Natural perception of 3D virtual objects
2 pairs of generalized asymmetric frustums (one for each monitor) for a given position of the user's eyes.
Different left/right projections for different positions of the user's eyes, for the 3 monitors.
Case study 3: Natural perception of 3D virtual objects
Calibration and registration: points are expressed in the Kinect reference frame, the world reference frame (coincident with M2), the M2 reference frame, and the external reference frame.
Case study 3: Natural perception of 3D virtual objects
Case study 4
How to improve the natural perception of virtual images
Case study 4: Natural perception of VR stimuli
• In natural viewing, humans continuously vary accommodation and eye position to bring into focus on the fovea an image of what is currently being fixated.
• All objects in the visual field that are at the accommodative distance will, to a first approximation, form sharp images on the retinae.
• Images of objects that are closer or farther than the accommodative distance will instead be out of focus.
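The relation between accommodative distance and retinal defocus can be sketched with a thin-lens approximation (the helper names and the default pupil diameter are illustrative assumptions, not parameters of the cited study):

```python
import math

def defocus_blur_diopters(z_object_m, z_fixation_m):
    """Defocus, in diopters, of an object at z_object_m when the eye
    accommodates at z_fixation_m (both distances in metres)."""
    return abs(1.0 / z_object_m - 1.0 / z_fixation_m)

def blur_circle_arcmin(z_object_m, z_fixation_m, pupil_mm=4.0):
    """Approximate angular diameter of the retinal blur circle, in
    arcminutes: pupil diameter (m) times the defocus (D) gives the
    blur angle in radians (thin-lens approximation)."""
    blur_rad = (pupil_mm / 1000.0) * defocus_blur_diopters(z_object_m, z_fixation_m)
    return math.degrees(blur_rad) * 60.0
```

A gaze-contingent display of the kind described below can drive a per-pixel blur kernel from exactly this quantity: zero blur at the tracked fixation depth, and blur growing with the dioptric distance from it.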
G. Maiello, M. Chessa, F. Solari, P. Bex. Simulated disparity and peripheral blur interact during binocular fusion. Journal of Vision, 14(8):13, July 17, 2014.
G. Maiello, M. Chessa, F. Solari, P. Bex. Stereoscopic fusion with gaze-contingent blur. ECVP, Bremen, Germany, August 2013.
Case study 4: Natural perception of VR stimuli
• We have developed a low-cost, practical gaze-contingent display in which natural images are presented to the observer with dioptric blur and stereoscopic disparity that are dependent on the three-dimensional structure of natural scenes.
• Our system simulates a distribution of retinal blur and depth similar to that experienced in real-world viewing conditions by emmetropic observers.
• Natural scenes contain multiple sources of depth information, e.g. binocular disparity, perspective, and blur (i.e. when the eyes are focused at a given distance, objects at other distances will be blurred on the retina):
• We examine depth perception in real images (light field camera, www.lytro.com) with natural variation in perspective, blur and binocular disparity.
• We examine how the time-course of binocular fusion depends on depth cues.
Case study 4: Natural perception of VR stimuli
Case study 4: Natural perception of VR stimuli
Case study 4: Natural perception of VR stimuli
Informative retinal blur
Gaze-contingent blur
Case study 4: Natural perception of VR stimuli
Case study 4: Natural perception of VR stimuli
FOVE (eye-tracking inside HMD) – foveated rendering
Case study 5
How to achieve Natural Interaction in AR
Case study 5: Natural interaction in AR
• We would like to find an answer to the following questions:
• Is acting in virtual environments (though without wearing invasive devices) as natural as acting in the real world?
• Are we able and do we like collaborating with other people in such environments?
• Which kind of perceptual feedback helps us achieve a natural interaction?
And then:
• What is missing? What are we doing well or wrong?
• How could Technology and Computer Vision help us?
M. Chessa, G. Matafu’, S. Susini, F. Solari (2016) An experimental setup for natural interaction in a collaborative virtual
environment. 13th European Conference on Visual Media Production (CVMP16), 12-13th December 2016, London.
Case study 5: Natural interaction in AR
NON-INVASIVE TRACKING: LEAP MOTION INTEGRATED INTO UNITY 3D
THE OVERALL SYSTEM ARCHITECTURE
A LIBRARY OF HAND GESTURES FROM LEAP MOTION RAW DATA:
GRASP, CLAMP, PINCH
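A gesture library of this kind can be sketched as a rule-based classifier over fingertip positions. Thresholds, the reduced gesture set, and the function names below are illustrative assumptions; the cited system's actual rules may differ:

```python
import math

def classify_gesture(thumb_tip, index_tip, palm_center, other_tips,
                     pinch_thresh=0.03, grasp_thresh=0.05):
    """Toy classifier over Leap-Motion-style raw 3D positions (metres).
    PINCH: thumb and index tips close together.
    GRASP: all fingertips curled in towards the palm.
    Thresholds are illustrative, not the cited system's values."""
    if math.dist(thumb_tip, index_tip) < pinch_thresh:
        return "PINCH"
    tips = [thumb_tip, index_tip] + list(other_tips)
    if all(math.dist(t, palm_center) < grasp_thresh for t in tips):
        return "GRASP"
    return "NONE"
```

Rule order matters: pinch is tested first because a pinching hand can also satisfy the looser grasp condition.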
Case study 5: Natural interaction in AR
https://www.youtube.com/watch?v=mokxW9hQKjQ
Case study 5: Natural interaction in AR
• We performed six experimental sessions, divided into preliminary tests and final tests.
• A total of twenty-seven untrained volunteers (six females and twenty-one males) participated in the experimental sessions.
• All participants were between 20 and 28 years old; they had never tried our system before, and they had never used any application controlled by the Leap Motion controller.
Case study 5: Natural interaction in AR
• Preliminary tests:
1. Interaction between the user and the environment.
2. Interaction among users.
• Final tests:
3. Comparison between mouse interaction and Leap Motion interaction.
4. Users' performances with different feedbacks.
5. Comparison between stereoscopic and non-stereoscopic visualization in depth perception.
6. Acting in a shared workspace.
Case study 5: Natural interaction in AR
PINCH BY USING LEAP MOTION vs. PINCH BY USING MOUSE
Comparison between mouse interaction and Leap Motion interaction (time to complete a task)
Case study 5: Natural interaction in AR
Users' performances with different feedbacks (ANOVA)
ANOVA with different feedbacks: time to complete the task, under the conditions NONE, LIGHT, STEREO, SOUND, TOP VIEW, HAPTIC
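Such an analysis computes a one-way ANOVA F statistic over the completion-time groups, one group per feedback condition. A minimal NumPy sketch with an illustrative group structure (not the study's data):

```python
import numpy as np

def one_way_anova_F(*groups):
    """F statistic for a one-way ANOVA: ratio of between-group to
    within-group mean squares, over one array of completion times
    per feedback condition."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    k = len(groups)                       # number of conditions
    n_total = sum(len(g) for g in groups)
    grand = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n_total - k))
```

The resulting F value is then compared against the F distribution with (k-1, n-k) degrees of freedom to decide whether the feedback condition has a significant effect on completion time.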
Case study 5: Natural interaction in AR
• The Leap Motion has (some) problems:
• improve its stability;
• explore other non-invasive and low-cost solutions (based on Computer Vision) to track the hand of the user;
• comparison with other (non-natural?) solutions: which is better, natural interaction or stability?
• Improve the analysis:
• more complex tasks;
• more users;
• analyse the effects of feedback not only by looking at completion times and reaching points, but by analysing the overall behaviour of users (see tomorrow's poster at the NIVAR17 workshop).
HCI in VR/AR: the future in 1 slide
From people who learn to interact with technology…
…to technology that learns how to interact with us!