VR-Together

Social VR experiences

Project Info

Start date:
October 2017

End date:
September 2020

Funding:
EU H2020

Coordinator:
Fundació i2CAT

Website:
http://vrtogether.eu/

Summary

VR-Together offers new ground-breaking virtual reality (VR) experiences based on social, photorealistic immersive content. For this purpose, the project develops and assembles an end-to-end pipeline integrating state-of-the-art technologies and off-the-shelf components. The challenge of VR-Together is to create photorealistic, truly social VR experiences in a cost-effective manner, covering immersive media production and delivery through innovative capture, encoding, delivery and rendering technologies. The project will demonstrate the scalability of its approach to producing and delivering immersive content across three pilots. It will also introduce new methods for social VR evaluation and quantitative platform benchmarking for both live and interactive content production, thus providing production and delivery solutions with significant commercial value.

In this project, Artanim works on content production for the three pilots, combining VR with offline and real-time motion capture, and develops new tools for immersive media production. We also participate in the evaluation of the developed social VR experiences.

Partners

Fundació i2CAT (Spain)

Nederlandse Organisatie voor toegepast-natuurwetenschappelijk Onderzoek (The Netherlands)

Centrum Wiskunde & Informatica (The Netherlands)

Centre for Research & Technology Hellas (Greece)

Viaccess-Orca (France)

Entropy (Spain)

Motion Spell (France)

Artanim

Real Virtuality

Immersive platform

Project Info

Start date:
April 2015

End date:
August 2019

Funding:

Coordinator:
Artanim

Summary

The objective of this project is to develop a multi-user immersive platform combining a 3D environment – which can be seen and heard through a virtual reality (VR) headset – with a real-life stage set. Users are tracked by a motion capture system, allowing them to see their own bodies, move physically in the virtual environment and interact with physical objects.

The solution developed tackles the following technical challenges: 1) it generates a full-body animation using inverse kinematics from a minimal number of markers, keeping the user’s setup time as short as possible while ensuring good tracking accuracy; 2) the platform is multi-user, and the VR headset interfaces wirelessly with the motion capture system; 3) the interaction with physical objects is seamless and correctly retargeted; 4) the position and orientation of the user’s head are handled so as to minimize latency (which would otherwise cause discomfort with the VR headset) and to maximize positioning accuracy.
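The project's full-body solver itself is not published, but the core idea behind challenge 1 – computing joint angles from sparse positional constraints – can be illustrated with a minimal analytic two-bone inverse kinematics solver in 2D. This is a simplified, hypothetical sketch for illustration only, not the platform's actual implementation, which solves a full-body chain from motion capture markers:

```python
import math

def two_bone_ik(root, target, l1, l2):
    """Analytic two-bone IK in 2D using the law of cosines.

    Given the root joint position, a target position for the end
    effector and the two bone lengths, return (shoulder_angle,
    elbow_angle) in radians. If the target is out of reach, it is
    clamped to the chain's reachable range.
    """
    dx, dy = target[0] - root[0], target[1] - root[1]
    dist = math.hypot(dx, dy)
    # Clamp the target distance to what the two bones can span.
    dist = max(abs(l1 - l2), min(l1 + l2, dist))
    # Angle between the first bone and the root-to-target line.
    cos_a = (l1 ** 2 + dist ** 2 - l2 ** 2) / (2 * l1 * dist)
    a = math.acos(max(-1.0, min(1.0, cos_a)))
    # Interior angle at the elbow joint.
    cos_b = (l1 ** 2 + l2 ** 2 - dist ** 2) / (2 * l1 * l2)
    b = math.acos(max(-1.0, min(1.0, cos_b)))
    base = math.atan2(dy, dx)
    # Return one of the two mirror-symmetric solutions.
    return base - a, math.pi - b
```

With a root at the origin and unit-length bones, solving for the target (1, 1) and running the angles back through forward kinematics places the end effector on the target. A real solver additionally handles 3D joint limits, marker noise and the remaining body segments, which is where the setup-time/accuracy trade-off mentioned above comes in.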

This platform can be used in numerous applications, such as virtual visits (e.g., long-lost historical sites, architectural spaces, telepresence), entertainment (e.g., games, theme-park attractions, storytelling experiences) or medical projects (e.g., phobia treatment, rehabilitation). Our solution offers a “matrix-like” degree of immersion over a large area, up to hundreds of square meters.

Download the white paper here.

This platform is commercialized by Dreamscape Immersive. More information here.

Related Publications

Chagué S, Charbonnier C. Real Virtuality: A Multi-User Immersive Platform Connecting Real and Virtual Worlds, VRIC 2016 Virtual Reality International Conference – Laval Virtual, Laval, France, ACM New York, NY, USA, March 2016.
PDF

Chagué S, Charbonnier C. Digital Cloning for an Increased Feeling of Presence in Collaborative Virtual Reality Environments, Proc. of 6th Int. Conf. on 3D Body Scanning Technologies, Lugano, Switzerland, October 2015.
PDF