Char4VR

Interactive characters for VR experiences

Project Info

Start date:
September 2020

End date:
August 2023

Funding:

Coordinator:
Artanim

Summary

Virtual Reality (VR) opens the possibility of developing a new kind of content format which, on the one hand, is experienced as a rich narrative, just as movies or theater plays are, while on the other hand allows interaction with autonomous virtual characters. The experience would feel like watching autonomous characters unfold a consistent story plot within an interactive VR simulation. Players could choose to participate in the events and affect parts of the story, or simply watch the plot unfold around them.

The main challenge in realizing this vision is that current techniques for interactive character animation are designed for video games and do not offer the subtle multi-modal interaction that VR users spontaneously expect. The main goal of this project is to explore interactive character animation techniques that help create characters able to engage with players in a more compelling way. To achieve this goal, we will use a combination of research techniques derived from computer graphics, machine learning and cognitive psychology.

Related Publications

Llobera J, Charbonnier C. Interactive Characters for Virtual Reality Stories, ACM International Conference on Interactive Media Experiences (IMX '21), ACM, New York, NY, USA, June 2021.
PDF

MoRehab XR

Motion tracked rehabilitation through extended realities

Project Info

Start date:
August 2020

End date:
July 2023

Funding:

Coordinator:
Artanim

Summary

After stabilization surgery or a traumatic accident, physical rehabilitation is often required to restore a joint's normal function. This process requires regular, repetitive training sessions. Since the patient's regularity and involvement are critical to the success of the procedure, keeping them interested and motivated while repeating similar exercises over and over is one of the physical therapist's main challenges. During these sessions, it is also difficult to objectively monitor small progress, or the lack of it. At the other end of the spectrum, some video games have proven very effective at motivating people to perform highly repetitive tasks on their controllers in order to progress through a story or a challenge.

The goal of this project is to use the captivating power of video games to entertain and motivate patients throughout their rehabilitation, by designing a set of “exercise games” (exergames) that are won by performing the correct routines required for proper physical rehabilitation. Through modern motion tracking technology, the exergames will also track the evolution of the patient’s physical recovery from session to session, adapt the challenges patients face to their needs and actual capacities, and, if need be, alert medical personnel early when problems persist or no progress is observed.
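The per-session adaptation and alerting logic described above could take many forms; as a minimal sketch (all names, thresholds and metrics here are illustrative assumptions, not the project's actual design), a session-adaptive difficulty rule with a stagnation alert might look like:

```python
from dataclasses import dataclass, field

@dataclass
class SessionResult:
    """Outcome of one rehabilitation session (metrics are illustrative)."""
    range_of_motion_deg: float  # measured joint range of motion
    completion_rate: float      # fraction of repetitions performed correctly

@dataclass
class ExergameAdapter:
    """Adjusts exergame difficulty per session and flags stagnation."""
    history: list = field(default_factory=list)

    def next_difficulty(self, result: SessionResult) -> float:
        """Return a difficulty scale factor based on the last session:
        ease off when the patient struggles, push gently when they succeed."""
        self.history.append(result)
        if result.completion_rate < 0.6:
            return 0.5
        if result.completion_rate > 0.9:
            return 1.5
        return 1.0

    def needs_clinician_alert(self, window: int = 3,
                              min_gain_deg: float = 2.0) -> bool:
        """Alert if range of motion has not improved enough over the
        last `window` sessions."""
        if len(self.history) < window:
            return False
        roms = [s.range_of_motion_deg for s in self.history[-window:]]
        return (roms[-1] - roms[0]) < min_gain_deg
```

The same structure extends naturally to richer mocap-derived metrics (joint velocities, compensation patterns) once the tracking data is available.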

In this project, Artanim is in charge of building a platform suitable for these exergames, designing and developing the exergames themselves, and implementing tools for a standardized assessment of the patient’s performance. The medical usability and validity of the setup and exergames will in turn be tested and assessed by the clinical team of La Tour Hospital with a large set of volunteer patients. Both teams will also work together to develop metrics to evaluate and adapt the patient’s performance during the sessions, exploiting the extended capabilities of the motion tracking system compared to the data available in conventional physical therapy.

Partners

Artanim
Conception of the exergames and of a platform adapted to perform them, development of a set of medical scores adapted to the extended capabilities of the motion tracking system

La Tour Hospital – Division of Orthopedic and Trauma Surgery
Clinical tests, co-design of new physical evaluation metrics adapted to the motion tracking capabilities

Real Virtuality

Immersive platform

Project Info

Start date:
April 2015

End date:
December 2022

Funding:

Coordinator:
Artanim

Summary

Artanim invented and has been developing since 2015 the VR technology driving Dreamscape Immersive. This multi-user immersive platform combines a 3D environment – seen and heard through a VR headset – with a real-life stage set incorporating haptic feedback elements.

The user’s movements are sparsely captured in real time and translated into full-body animation thanks to a deep understanding of body mechanics. Unlike static-position VR systems, the platform allows up to eight users to feel truly immersed in a VR scene. They are rendered as characters (avatars) inside a computer-generated world where they can move physically, interact with objects and other players, and truly experience worlds previously accessible only in their imagination. The users’ bodies thus become the interface between the real and virtual worlds.
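The platform's actual full-body solving is not described here; purely as an illustration of the kind of computation involved in reconstructing limb poses from sparse tracking targets, a textbook analytic two-bone IK solver (the standard building block for elbows and knees, in 2D for brevity) could look like:

```python
import math

def two_bone_ik(l1, l2, tx, ty):
    """Analytic two-bone IK in the plane: given segment lengths l1, l2 and
    a target (tx, ty) relative to the root joint, return (shoulder, elbow)
    angles in radians placing the end effector on the target (elbow-down)."""
    d = math.hypot(tx, ty)
    # Clamp to the reachable annulus so unreachable targets stretch the chain.
    d = max(abs(l1 - l2), min(l1 + l2, d))
    # Law of cosines gives the elbow bend angle.
    cos_elbow = (d * d - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder: aim at the target, minus the offset from the bent elbow.
    shoulder = math.atan2(ty, tx) - math.atan2(l2 * math.sin(elbow),
                                               l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def forward(l1, l2, shoulder, elbow):
    """Forward kinematics: end-effector position for the given angles."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y
```

A production solver adds 3D swivel angles, joint limits and filtering over time, but the core geometry is the same.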

The platform combines the following features:

  • Wireless: complete freedom of movement across large spaces.
  • Social: interactive multi-user experiences within a shared environment or across connected pods.
  • Accurate: full-body and physical-object tracking with less than 1 millimeter of deviation.
  • Real-time: zero discernible lag, eliminating most concerns of motion sickness.
  • Flexible: an SDK allowing content creators to easily build experiences for the platform.

This platform is leveraged by Dreamscape Immersive through their worldwide location-based VR entertainment centers, as well as through educational and training programs and other enterprise solutions.

Related Publications

Chagué S, Charbonnier C. Real Virtuality: A Multi-User Immersive Platform Connecting Real and Virtual Worlds, VRIC 2016 Virtual Reality International Conference – Laval Virtual, Laval, France, ACM New York, NY, USA, March 2016.
PDF
 
Chagué S, Charbonnier C. Digital Cloning for an Increased Feeling of Presence in Collaborative Virtual Reality Environments, Proc. of 6th Int. Conf. on 3D Body Scanning Technologies, Lugano, Switzerland, October 2015.
PDF

Charbonnier C, Trouche V. Real Virtuality: Perspective offered by the Combination of Virtual Reality Headsets and Motion Capture, White Paper, August 2015.
PDF

Awards and Recognitions

VR+4CAD

Closing the VR-loop around the human in CAD

Project Info

Start date:
June 2020

End date:
November 2021

Funding:
Innosuisse

Coordinator:
Artanim

Website:
https://vrplus4cad.artanim.ch/

Summary

VR+4CAD aims to tackle the issues limiting the use of VR in manufacturing and design, as highlighted by industry: the lack of full-circle interoperability between VR and CAD, the friction experienced when entering a VR world, and the limited feedback provided about the experience for later analysis.

In this project, we aim for CAD-authored designs to be automatically converted and adapted for (virtual) human interaction within a virtual environment. Interaction is made more immediate by an experimental markerless motion capture system that relieves the user from wearing specific devices. The data acquired by motion capture during each experience is analyzed via activity recognition techniques and transformed into implicit feedback. Explicit and implicit feedback are then merged and sent back to the CAD operator for the next design iteration.
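One simple way to merge the two feedback streams is to attach, to each explicit user comment, the activities that were implicitly recognized around the same moment. The sketch below assumes a hypothetical schema (`FeedbackItem`, the labels, the time window are all illustrative, not the project's actual data model):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FeedbackItem:
    """One annotation destined for the CAD operator (schema is illustrative)."""
    timestamp_s: float  # time within the VR session
    source: str         # "explicit" (user comment) or "implicit" (recognized activity)
    label: str          # e.g. "hard to reach" or "reach_overhead"

def contextualize(explicit, implicit, window_s=5.0):
    """Pair each explicit comment with the implicit activity labels
    recognized within window_s seconds of it, so the CAD operator sees
    what the user was doing when the comment was made."""
    out = []
    for e in sorted(explicit, key=lambda f: f.timestamp_s):
        ctx = [i.label for i in implicit
               if abs(i.timestamp_s - e.timestamp_s) <= window_s]
        out.append((e, ctx))
    return out
```

For example, a "hard to reach" comment paired with a recognized overhead-reach activity tells the operator which part of the design caused the difficulty.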

Partners

Artanim
Markerless motion capture, VR activity annotation

Scuola universitaria professionale della Svizzera italiana (SUPSI)
CAD/VR interoperability, human activity recognition

VR-Together

Social VR experiences

Project Info

Start date:
October 2017

End date:
December 2020

Funding:
EU H2020

Coordinator:
Fundació i2CAT

Website:
http://vrtogether.eu/

Summary

VR-Together offered ground-breaking virtual reality (VR) experiences based on social, photorealistic immersive content. For this purpose, the project developed and assembled an end-to-end pipeline integrating state-of-the-art technologies and off-the-shelf components. The challenge of VR-Together was to create photorealistic, truly social VR experiences in a cost-effective manner, covering immersive media production and delivery through innovative capture, encoding, delivery and rendering technologies. The project demonstrated the scalability of its approach to the production and delivery of immersive content across three pilots. It introduced new methods for social VR evaluation and quantitative platform benchmarking for both live and interactive content production, thus providing production and delivery solutions with significant commercial value.

In this project, Artanim worked on content production for the three pilots, combining VR with offline/real-time motion capture, and developed new tools for immersive media production. We also participated in the evaluation of the developed social VR experiences.

Partners

Fundació i2CAT (Spain)

Netherlands Organisation for Applied Scientific Research (The Netherlands)

Centrum Wiskunde & Informatica (The Netherlands)

Centre for Research & Technology Hellas (Greece)

Viaccess-Orca (France)

Entropy Studio (Spain)

Motion Spell (France)

Artanim

Related Publications

Chatzitofis A, Saroglou L, Boutis P, Drakoulis P, Zioulis N, Subramanyam S, Kevelham B, Charbonnier C, Cesar P, Zarpalas D, Kollias S, Daras P. HUMAN4D: A Human-Centric Multimodal Dataset for Motions & Immersive Media, IEEE Access, 8:176241-176262, 2020.
PDF

Galvan Debarba H, Chagué S, Charbonnier C. On the Plausibility of Virtual Body Animation Features in Virtual Reality, IEEE Trans Vis Comput Graph, In Press, 2020.
PDF

De Simone F, Li J, Galvan Debarba H, El Ali A, Gunkel S, Cesar P. Watching videos together in social Virtual Reality: an experimental study on user’s QoE, Proc. 2019 IEEE Virtual Reality, IEEE Comput. Soc, March 2019.
PDF