VENISE

Interaction models for V&AR

This topic covers several types of problems, from the design of interactive paradigms (control of virtual navigation, sensorimotor rendering, etc.) to the development of intelligent systems for the multimodal supervision of immersive interactions, for individual or collaborative experiments in Virtual or Mixed Reality systems.

In this respect, the question of the task is ubiquitous, because we need to justify the added value of V&AR with respect to already prevalent HCI techniques. Cognitive issues such as Cybersickness, Immersion, Realism, Presence, User implication, and scene Affordance are also recurrent focuses of our studies. Last but not least, perceptual incoherencies in immersive situations are especially studied, and the EVE system, the multi-sensorimotor and multi-user CAVE-like system that we designed and have managed since 2010, remains a unique environment worldwide for this purpose. This topic is structured as follows:

  • Interactive paradigms for Virtual Navigation
  • Sensorimotor channels for 3D interactions
  • Multimodal and Collaborative Immersion
  • Mixed Reality and Teleoperation

Interactive paradigms for Virtual Navigation


This subtopic addresses a basic problem for any V&AR application, i.e. how to travel within immersive environments, and the physiological and cognitive issues of such a task. We especially study several variations of a very powerful interactive paradigm, known under the acronym HCNav (Hand/Head Controlled Navigation system), which can exploit any 6DoF tracker attached anywhere on the user's body. Studies have demonstrated that applying HCNav to the motion of the head rather than the hand is more suitable: it provides concrete vestibular stimulation to users during their virtual navigation tasks. Moreover, a head-centred HCNav approach also adds value in terms of presence and cybersickness, because of the bodily involvement this technique induces in the user.
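
To illustrate the principle, here is a minimal sketch of the kind of mapping HCNav relies on, assuming the 6DoF tracker pose is available as a position and yaw offset from a calibrated neutral pose; the class name, dead-zone and gain values below are hypothetical and only meant to convey the idea, not the actual HCNav implementation.

    import numpy as np

    class HCNavSketch:
        """Illustrative sketch: the offset of a 6DoF tracker (head or hand)
        from a calibrated neutral pose drives the translation and rotation
        speed of the virtual viewpoint."""

        def __init__(self, dead_zone=0.03, gain_t=2.0, gain_r=1.0):
            self.dead_zone = dead_zone   # metres of offset ignored (postural sway)
            self.gain_t = gain_t         # translation gain ((m/s) per metre of offset)
            self.gain_r = gain_r         # rotation gain ((rad/s) per radian of offset)
            self.neutral_pos = np.zeros(3)
            self.neutral_yaw = 0.0

        def calibrate(self, tracker_pos, tracker_yaw):
            """Record the current tracker pose as the neutral (rest) pose."""
            self.neutral_pos = np.asarray(tracker_pos, dtype=float)
            self.neutral_yaw = float(tracker_yaw)

        def update(self, tracker_pos, tracker_yaw, dt):
            """Return the (translation, yaw) increment of the virtual viewpoint."""
            offset = np.asarray(tracker_pos, dtype=float) - self.neutral_pos
            norm = np.linalg.norm(offset)
            if norm < self.dead_zone:
                velocity = np.zeros(3)   # inside the rest zone: no motion
            else:
                # speed grows with the leaning/reaching amplitude beyond the dead zone
                velocity = self.gain_t * (norm - self.dead_zone) * offset / norm
            yaw_rate = self.gain_r * (tracker_yaw - self.neutral_yaw)
            return velocity * dt, yaw_rate * dt

Attached to the head, such a mapping forces users to lean and turn physically, which is what produces the vestibular stimulation and bodily involvement mentioned above.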

Sensorimotor channels for Immersion

This subtopic aims at developing new metaphors and interactive paradigms based on the combination of visual stereoscopy, 3D audio, and haptics, the three main sensorimotor channels we have gathered in our EVE system. Our main focus here is not the realistic rendering of scenes but rather the study of the contribution of these 3D feedback modalities to facilitating user interactions and collaborations during immersive tasks. This research was initiated within the former ANR "CoRSAIRe" project, and we continue it through several PhD works.
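
As a purely illustrative sketch of what such redundant multimodal feedback can look like (the mapping, ranges, and function name below are assumptions, not the CoRSAIRe design), a single quantity of interest, here the distance to a manipulation target, can be rendered simultaneously on the visual, audio, and haptic channels:

    def multimodal_cues(distance, d_max=0.5):
        """Illustrative mapping of the distance to a target (metres) onto
        redundant visual, audio and haptic cues, so that the three
        sensorimotor channels reinforce each other during manipulation."""
        closeness = max(0.0, min(1.0, 1.0 - distance / d_max))  # 0 = far, 1 = on target
        visual_highlight = closeness                # e.g. glow intensity in [0, 1]
        audio_pitch_hz = 220.0 + 660.0 * closeness  # pitch rises as the target is approached
        haptic_force_n = 3.0 * closeness            # attractive force magnitude (newtons)
        return visual_highlight, audio_pitch_hz, haptic_force_n

    # Example: half-way to the target
    print(multimodal_cues(0.25))   # -> (0.5, 550.0, 1.5)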


Supervision of Multimodal and Collaborative Immersion

Our team is well known for its research on the management of multimodal interactions in immersive situations (e.g. combining speech and gesture commands with haptic and audio feedback to achieve real-time and natural interactions on complex 3D data). Thanks to the multistereoscopic facilities of the EVE system, we have extended our work on supervision since 2010 to co-located immersive collaborations and, today, to immersive Collaborative Virtual Environments (CVE) in the context of our partnership in the Equipex DIGISCOPE project.
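
To make the idea of multimodal supervision more concrete, here is a minimal, purely illustrative sketch of late fusion between a spoken command and a pointing gesture; the event types, time window, and selection rule are assumptions, not the actual supervision engine running in EVE.

    from dataclasses import dataclass

    FUSION_WINDOW = 0.8  # seconds tolerated between speech and gesture (assumed value)

    @dataclass
    class SpeechEvent:
        t: float        # timestamp (s)
        action: str     # e.g. "select", "move"

    @dataclass
    class PointingEvent:
        t: float        # timestamp (s)
        target_id: str  # object hit by the pointing ray

    def fuse(speech_events, pointing_events):
        """Late fusion: pair each spoken action with the temporally closest
        pointing gesture, provided both fall inside the fusion window."""
        commands = []
        for s in speech_events:
            nearby = [p for p in pointing_events if abs(p.t - s.t) <= FUSION_WINDOW]
            if nearby:
                p = min(nearby, key=lambda q: abs(q.t - s.t))
                commands.append((s.action, p.target_id))
        return commands

    # Example: "select" uttered at t = 1.2 s while pointing at object "mol_42" at t = 1.0 s
    print(fuse([SpeechEvent(1.2, "select")], [PointingEvent(1.0, "mol_42")]))
    # -> [('select', 'mol_42')]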


Mixed Reality

On the one hand, this subtopic addresses research in Augmented Virtuality (AV) in the context of the SACARI project (Supervision of an Autonomous Car with Augmented viRtuality Interface), which aims to develop the concepts and techniques dedicated to the immersive teleoperation of a semi-autonomous vehicle. This work is currently shifting to the psycho-ergonomic aspects of telepresence, i.e. evaluating the conditions under which an effective immersive telepresence control is created. We have crafted a more complete definition of telepresence, which includes the user's implication in the task and the impact of scene affordance, and we are developing several experiments to evaluate this new model. These aspects are of increasing interest, for instance in drone control.

On the other hand, we are working on new tangible interfaces combined with Augmented Reality (AR) immersion, especially to address scientific and learning applications for biologists in molecular docking.
