Real Virtuality

Immersive platform

Project Info

Start date:
April 2015

End date:

Funding:

Coordinator:
Artanim

Summary

Artanim invented, and has continuously developed since 2015, the VR technology driving Dreamscape Immersive. This multi-user immersive platform combines a 3D environment, seen and heard through a VR headset, with a real-life stage set incorporating haptic feedback elements.

The user’s movements are sparsely captured in real time and translated into a full-body animation thanks to a deep understanding of body mechanics. Unlike static-position VR systems, the platform allows up to eight users to feel truly immersed in a VR scene. They are rendered as characters (avatars) inside a computer-generated world where they can move physically, interact with objects and other players, and experience worlds previously accessible only in their imagination. The users’ bodies thus become the interface between the real and virtual worlds.
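Artanim’s full-body solver is proprietary, but the kind of geometric reasoning involved can be illustrated with a standard analytic two-bone IK step: recovering an elbow flexion angle from a sparsely tracked wrist position via the law of cosines. The function name and the limb lengths below are hypothetical, a minimal sketch rather than the actual solver.

```python
import math

def elbow_flexion(shoulder, wrist, upper_len=0.3, fore_len=0.3):
    """Elbow flexion angle (radians) that places the wrist at the tracked
    position: 0 = arm fully extended, pi/2 = right-angle bend."""
    d = math.dist(shoulder, wrist)
    # clamp to the reachable range so noisy tracking never breaks acos()
    d = max(abs(upper_len - fore_len), min(d, upper_len + fore_len))
    cos_interior = (upper_len**2 + fore_len**2 - d**2) / (2 * upper_len * fore_len)
    return math.pi - math.acos(max(-1.0, min(1.0, cos_interior)))
```

A production solver additionally resolves the remaining degrees of freedom (elbow swivel, spine, legs) from the other tracked points and from learned body-mechanics priors.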

The platform combines the following features:

  • Wireless: complete freedom of movement across large spaces.
  • Social: interactive multi-user experiences within a shared environment or across connected pods.
  • Accurate: full-body and physical-object tracking with less than a millimeter of deviation.
  • Real-time: no discernible lag, eliminating most concerns of motion sickness.
  • Flexible: SDK allowing content creators to easily create experiences for this particular platform.

This platform is leveraged by Dreamscape Immersive through their worldwide location-based VR entertainment centers, as well as through educational and training programs and other enterprise solutions.

The platform was also used to produce VR_I, the first-ever choreographic work in immersive virtual reality, as well as Geneva 1850, a time-travel experience and historical reconstruction of the Geneva of 1850.

Related Publications

Chagué S, Charbonnier C. Real Virtuality: A Multi-User Immersive Platform Connecting Real and Virtual Worlds, VRIC 2016 Virtual Reality International Conference – Laval Virtual, Laval, France, ACM New York, NY, USA, March 2016.
Chagué S, Charbonnier C. Digital Cloning for an Increased Feeling of Presence in Collaborative Virtual Reality Environments, Proc. of 6th Int. Conf. on 3D Body Scanning Technologies, Lugano, Switzerland, October 2015.

Charbonnier C, Trouche V. Real Virtuality: Perspective offered by the Combination of Virtual Reality Headsets and Motion Capture, White Paper, August 2015.

Awards and Recognitions

Presence

A toolset for hyper-realistic and XR-based human-human and human-machine interactions

Project Info

Start date:
January 2024

End date:
December 2027

Funding:
EU Commission (GAP-101135025) and SERI (REF-1131-52104)

Coordinator:
Fundació i2CAT

Summary

The concept of presence can be understood as a synthesis of interrelated psychophysical ingredients in which multiple perceptual dimensions intervene. A better understanding of how to influence specific aspects, such as plausibility (the illusion that virtual events are really happening), co-presence (the illusion of being with others), or place illusion (the feeling of being there), is key to improving the quality of XR experiences. The availability and performance of advanced technologies will let us reach high levels of presence in XR, essential to getting closer than ever to the old VR dream: to be anywhere, doing anything, together with others, from any place. The PRESENCE project will impact multiple dimensions of presence in physical-digital worlds, addressing three main challenges:

  • creating realistic visual interactions among remote humans, delivering high-end holoportation paradigms based on live volumetric video capturing, compression and optimization techniques under heterogeneous computation and network conditions;
  • providing realistic touch among remote users and synthetic objects, developing novel haptic systems and enabling spatial multi-device synchronization in multi-user scenarios;
  • producing realistic social interactions among avatars and agents, generating AI virtual humans that represent real users or AI agents.

The contribution of Artanim will focus on smart autonomous characters able to generate physically plausible behavior by integrating cues from the avatars of human users or from other autonomous characters in specific scenarios. Specific parameters will be studied so that movement responds plausibly to social cues, such as interpersonal differences, or to collaboration tasks. For example, an interactive virtual agent may take the position of others into account, respecting implicit proxemics rules by generating small step rotations, or, if a user needs an object, grab and offer it.
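The proxemics behavior described above can be sketched as a per-frame decision: turn toward the user with a small, bounded step rotation, and step back if the user is inside the agent’s personal space. The function name, the 0.8 m threshold and the 30° turn limit are hypothetical illustration values, not project parameters.

```python
import math

PERSONAL_SPACE = 0.8  # metres; hypothetical proxemics threshold

def agent_adjustment(agent_pos, agent_yaw, user_pos, max_turn=math.radians(30)):
    """Return (yaw_delta, step_back): a clamped step rotation toward the
    user, and whether the user is inside the agent's personal space."""
    dx, dz = user_pos[0] - agent_pos[0], user_pos[1] - agent_pos[1]
    desired_yaw = math.atan2(dx, dz)
    # wrap the yaw error to [-pi, pi], then clamp to one small step rotation
    err = (desired_yaw - agent_yaw + math.pi) % (2 * math.pi) - math.pi
    yaw_delta = max(-max_turn, min(max_turn, err))
    step_back = math.hypot(dx, dz) < PERSONAL_SPACE
    return yaw_delta, step_back
```

Running this each frame makes the agent re-orient in small, human-looking increments instead of snapping instantly to face the user.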

Partners

Fundació i2CAT (Spain)

Actronika (France)

Universitaet Hamburg (Germany)

Ethniko Kentro Erevnas kai Technologikis Anaptyxis (Greece)

Raytrix GmbH (Germany)

SenseGlove B.V. (Netherlands)

Go Touch VR SAS (France)

Didimo, S.A. (Portugal)

Vection Italy Srl (Italy)

Universitat de Barcelona (Spain)

Unity Technologies (Denmark)

Sound Holding B.V. (Netherlands)

Interuniversitair Micro-Electronica Centrum (Belgium)

Joanneum Research Forschungsgesellschaft mbH (Austria)

SyncVR Medical B.V. (Netherlands)

Zaubar UG (haftungsbeschränkt) (Germany)

Artanim

Char4VR

Interactive characters for VR experiences

Project Info

Start date:
September 2020

End date:
December 2023

Funding:

Coordinator:
Artanim

Summary

Virtual Reality (VR) opens up the possibility of developing a new kind of content format which, on the one hand, is experienced as a rich narrative, just as movies or theater plays are, and, on the other, allows interaction with autonomous virtual characters. The experience would feel like having different autonomous characters unfold a consistent story plot, just as a movie or a theater play does, within an interactive VR simulation. Players could choose to participate in the events and affect some parts of the story, or simply watch the plot unfold around them.

The main challenge in realizing this vision is that current techniques for interactive character animation are designed for video games, and do not offer the subtle multi-modal interaction that VR users spontaneously expect. The main goal of this project is to explore different techniques for interactive character animation that help create characters able to engage with players in a more compelling way. To achieve this goal, we use a combination of research techniques derived from computer graphics, machine learning and cognitive psychology.
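The mirror game studied in one of the publications below gives a flavor of the interaction loop: each frame, the character moves toward the mirrored position of the user’s hand, with exponential smoothing so its motion reads as deliberate co-improvisation rather than instant copying. The update rule and gain below are a hypothetical sketch, not the published controller.

```python
def mirror_step(char_x, user_x, alpha=0.2):
    """One frame of a character mirroring the user's lateral hand position.

    The target is the user's position reflected across the shared mid-plane;
    alpha < 1 smooths the motion so the character lags naturally behind."""
    target = -user_x
    return char_x + alpha * (target - char_x)

# usage: the character converges toward the mirrored position over time
x = 0.0
for _ in range(50):
    x = mirror_step(x, 0.5)  # user holds their hand at +0.5 m
```

Varying the gain (or replacing the filter with a physics-based controller) changes how much the character appears to lead, follow, or co-improvise with the player.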

Related Publications

Llobera J, Charbonnier C. Physics-based character animation and human motor control, Phys Life Rev, 46:190-219, 2023.

Llobera J, Jacquat V, Calabrese C, Charbonnier C. Playing the mirror game in virtual reality with an autonomous character, Sci Rep, 12:21329, 2022.

Llobera J, Charbonnier C. Physics-based character animation for Virtual Reality, Open Access Tools and Libraries for Virtual Reality, IEEE VR Workshop, 2022 Best Open Source tool Award, March 2022.

Llobera J, Booth J, Charbonnier C. Physics-based character animation controllers for videogame and virtual reality production, 14th ACM SIGGRAPH Conference on Motion, Interaction and Games, November 2021.

Llobera J, Booth J, Charbonnier C. New Techniques on Interactive Character Animation, SIGGRAPH ’21: short course, ACM, New York, NY, USA, August 2021.

Llobera J, Charbonnier C. Interactive Characters for Virtual Reality Stories, ACM International Conference on Interactive Media Experiences (IMX ’21), ACM, New York, NY, USA, June 2021.

VR-Together

Social VR experiences

Project Info

Start date:
October 2017

End date:
December 2020

Funding:
EU H2020

Coordinator:
Fundació i2CAT

Website:
http://vrtogether.eu/

Summary

VR-Together offered new ground-breaking virtual reality (VR) experiences based on social photorealistic immersive content. For this purpose, the project developed and assembled an end-to-end pipeline integrating state-of-the-art technologies and off-the-shelf components. The challenge of VR-Together was to create truly social, photorealistic VR experiences in a cost-effective manner, advancing immersive media production and delivery through innovative capture, encoding, delivery and rendering technologies. The project demonstrated the scalability of its approach to the production and delivery of immersive content across three pilots. It introduced new methods for social VR evaluation and quantitative platform benchmarking for both live and interactive content production, thus providing production and delivery solutions with significant commercial value.

In this project, Artanim worked on content production for the three pilots, combining VR with offline and real-time motion capture, and developed new tools for immersive media production. We also participated in the evaluation of the developed social VR experiences.

Partners

Fundació i2CAT (Spain)

Netherlands Organisation for Applied Scientific Research (The Netherlands)

Centrum Wiskunde & Informatica (The Netherlands)

Centre for Research & Technology Hellas (Greece)

Viaccess-Orca (France)

Entropy Studio (Spain)

Motion Spell (France)

Artanim

Related Publications

Galvan Debarba H, Montagud M, Chagué S, Lajara J, Lacosta I, Fernandez Langa S, Charbonnier C. Content Format and Quality of Experience in Virtual Reality, Multimed Tools Appl, 2022.

Revilla A, Zamarvide S, Lacosta I, Perez F, Lajara J, Kevelham B, Juillard V, Rochat B, Drocco M, Devaud N, Barbeau O, Charbonnier C, de Lange P, Li J, Mei Y, Lawicka K, Jansen J, Reimat N, Subramanyam S, Cesar P. A Collaborative VR Murder Mystery using Photorealistic User Representations, Proc. IEEE Conf. on Virtual Reality and 3D User Interfaces (VRW 2021), IEEE, pp. 766, March 2021.

Chatzitofis A, Saroglou L, Boutis P, Drakoulis P, Zioulis N, Subramanyam S, Kevelham B, Charbonnier C, Cesar P, Zarpalas D, Kollias S, Daras P. HUMAN4D: A Human-Centric Multimodal Dataset for Motions & Immersive Media, IEEE Access, 8:176241-176262, 2020.

Galvan Debarba H, Chagué S, Charbonnier C. On the Plausibility of Virtual Body Animation Features in Virtual Reality, IEEE Trans Vis Comput Graph, In Press, 2020.

De Simone F, Li J, Galvan Debarba H, El Ali A, Gunkel S, Cesar P. Watching videos together in social Virtual Reality: an experimental study on user’s QoE, Proc. 2019 IEEE Virtual Reality, IEEE Comput. Soc, March 2019.

HoloMed

AR tools for medicine

Project Info

Start date:
September 2016

End date:
February 2018

Funding:
La Tour Hospital / Hirslanden Clinique La Colline

Coordinator:
Artanim

Summary

In medicine, augmented reality (AR) technology can offer added value over traditional education, assessment and interventional approaches. By combining our expertise in motion capture, 3D anatomical modelling and AR technology, new solutions can be proposed to improve medical treatment.

In this project, we developed an anatomical see-through tool to visualize and analyze a patient’s anatomy in real time and in motion, for applications in sports medicine and rehabilitation. The tool allows healthcare professionals to visualize joint kinematics: the bones are accurately rendered as a holographic overlay on the subject (like X-ray vision) and in real time as the subject performs the movement.
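Conceptually, the overlay reduces to chaining rigid transforms: the motion capture system tracks both the bone and the HMD in one world frame, and a hand-eye calibration (the subject of several publications below) relates the HMD’s tracked marker body to its camera. A minimal pure-Python sketch with 4×4 matrices; the helper names are our own, and translation-only transforms stand in for full poses.

```python
def translation(x, y, z):
    """A 4x4 rigid transform with identity rotation and the given offset."""
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def inv_rigid(t):
    """Invert a rigid transform: transpose the rotation, rotate-negate the offset."""
    r = [[t[j][i] for j in range(3)] for i in range(3)]
    p = [-sum(r[i][k] * t[k][3] for k in range(3)) for i in range(3)]
    return [r[0] + [p[0]], r[1] + [p[1]], r[2] + [p[2]], [0, 0, 0, 1]]

def apply(t, point):
    x, y, z = point
    return tuple(t[i][0] * x + t[i][1] * y + t[i][2] * z + t[i][3] for i in range(3))

def bone_in_camera(T_world_bone, T_world_hmd, T_hmd_cam, p_bone):
    """Express a bone-local point in the HMD camera frame: mocap tracks bone
    and HMD in one world; hand-eye calibration supplies T_hmd_cam."""
    T_world_cam = mat_mul(T_world_hmd, T_hmd_cam)
    return apply(mat_mul(inv_rigid(T_world_cam), T_world_bone), p_bone)
```

In the real system each transform updates every frame from the mocap stream, and the resulting camera-space bone meshes are rendered as the holographic overlay.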

Partners

Artanim
Development of AR solutions

La Tour Hospital – Division of Orthopedic and Trauma Surgery
Clinical tests

Hirslanden Clinique La Colline – Division of Orthopedic and Trauma Surgery
Clinical tests

Related Publications

Elias de Oliveira M, Galvan Debarba H, Lädermann A, Chagué S, Charbonnier C. A Hand-Eye Calibration Method for Augmented Reality Applied to Computer-Assisted Orthopedic Surgery, Int J Med Robot Comput Assist Surg, 15(2):e1969, 2019.

Elias de Oliveira M, Galvan Debarba H, Lädermann A, Chagué S, Charbonnier C. Réalité augmentée en chirurgie orthopédique. Mythe ou réalité ?, Congrès de la Société Francophone d’Arthroscopie 2018, Paris, France, Revue de chirurgie orthopédique et traumatologique, 104(8):S110, November 2018.

Galvan Debarba H, Elias de Oliveira M, Lädermann A, Chagué S, Charbonnier C. Augmented Reality Visualization of Joint Movements for Rehabilitation and Sports Medicine, 20th Symposium on Virtual and Augmented Reality (SVR), Foz Do Iguaçu, Brazil, October 2018.

Galvan Debarba H, Elias de Oliveira M, Lädermann A, Chagué S, Charbonnier C. Augmented Reality Visualization of Joint Movements for Physical Examination and Rehabilitation, Proc. 2018 IEEE Virtual Reality, IEEE Comput. Soc, March 2018.

Galvan Debarba H, Elias de Oliveira M, Lädermann A, Chagué S, Charbonnier C. Tracking a Consumer HMD with a Third Party Motion Capture System, Proc. 2018 IEEE Virtual Reality, IEEE Comput. Soc, March 2018.

Elias de Oliveira M, Galvan Debarba H, Lädermann A, Chagué S, Charbonnier C. Development of a Hand-Eye Calibration Method for Augmented Reality Applied to Computer-Assisted Orthopedic Surgery, 32nd International Congress of Computer Assisted Radiology and Surgery, Heidelberg, Germany, Int J CARS, June 2018.