Curate XR

XR experience guidance using real-time LLM-driven conversational agents

Project Info

Start date:
January 2025

End date:
December 2026

Funding:

Coordinator:
Artanim

Summary

Imagine yourself inside a museum, whether real or entirely virtual. The elegant space is filled with fascinating works of art: some you immediately recognize, but who was the artist again? Others grab your attention but are entirely unfamiliar. If only someone were around to tell you about the artists behind the works of art or provide you with details on the works themselves. Thankfully, a virtual museum guide is in the room with you.

One of the drawbacks of more traditional approaches to such a museum guide scenario is the need to establish all possible questions and responses up front, which quickly leads to intractable production constraints and an end result that, while informative, may feel unnatural or robotic because it is entirely predetermined. The goal of our CurateXR project is to study how a modern generative AI chatbot backed by a Large Language Model (LLM) can provide a real-time and entirely natural means of interacting with a virtual agent in such virtual reality (VR) or augmented reality (AR) scenarios. Powered by an animation system that uses Motion Matching and pathfinding to navigate the space naturally, the guide will happily stroll along with you.
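
At its core, Motion Matching repeatedly searches a database of motion-capture features (for example, future trajectory points and foot positions) for the frame that best matches the character's current state and goal. The Python sketch below shows that core search step; the feature layout, weights and database are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np

def motion_matching_step(db_features, feature_weights, query):
    """Pick the animation frame whose feature vector best matches the query;
    playback would then continue from that frame until the next search.

    db_features: (num_frames, num_features) array precomputed from mocap.
    feature_weights: per-feature importance (illustrative values).
    query: features describing the character's current pose and desired path.
    """
    diff = (db_features - query) * feature_weights
    cost = np.einsum("ij,ij->i", diff, diff)  # weighted squared distance
    return int(np.argmin(cost))

# Hypothetical usage with a random stand-in database.
rng = np.random.default_rng(0)
db = rng.normal(size=(1000, 12))   # 1000 frames, 12 features each
best = motion_matching_step(db, np.ones(12), query=rng.normal(size=12))
```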

All interactions happen through natural speech. Just ask any question you may have, and the guide will reply without delay, taking into account both the context of the space it was given up front and real-time contextual clues such as your location and the artwork you are looking at. Not only will the guide engage in a natural conversation with you, but if you feel more comfortable speaking another language, just ask it to do so. The use of a generative AI chatbot removes many of the barriers to entry users may feel when interacting with extended reality applications for the first time, and it provides a level of accessibility that is hard to replicate otherwise.
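
To illustrate how such real-time contextual clues might reach the agent, here is a minimal Python sketch. The `build_guide_context` helper, the artwork metadata, and the gaze and position inputs are hypothetical stand-ins for the project's actual tracking and prompting pipeline.

```python
import json

def build_guide_context(user_position, gazed_artwork, artworks_db):
    """Assemble the guide's system prompt from live contextual clues.

    All inputs are hypothetical stand-ins: `user_position` and
    `gazed_artwork` would come from the headset's tracking, and
    `artworks_db` from curated museum metadata.
    """
    entry = artworks_db.get(gazed_artwork, {})
    return (
        "You are a friendly museum guide in a virtual gallery. Answer "
        "visitor questions naturally and switch languages when asked.\n"
        f"Visitor position (meters): {user_position}\n"
        f"Artwork in the visitor's gaze: {gazed_artwork}\n"
        f"Curated metadata: {json.dumps(entry)}"
    )

# Hypothetical usage: refresh the context each turn, then send it with the
# transcribed speech to any chat-style LLM endpoint.
artworks_db = {"water_lilies": {"artist": "Claude Monet", "year": 1906}}
messages = [
    {"role": "system",
     "content": build_guide_context((2.4, 0.0, 5.1), "water_lilies", artworks_db)},
    {"role": "user", "content": "Who painted this one again?"},
]
# reply = chat_llm(messages)  # placeholder for the actual LLM call
```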

Generative AI chatbots are not limited to conversation: they can also turn your words into concrete actions. To demonstrate this, at one end of the museum space you will find a virtual sculpture experience controlled by another agent. Through natural speech you can not only summon three-dimensional shapes in your desired color, but also have them move as you wish. And if you are not satisfied with the result, there is no need to start from scratch: just tell your virtual helper what changes you would like to make, and you will see the updates happen before your eyes.
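
A common way to let a chatbot act on a scene is LLM tool (function) calling: the model emits a structured call that the application dispatches. The Python sketch below is hypothetical; the `spawn_shape` tool schema and the `scene.spawn`/`set_motion` engine API are illustrative assumptions, not the project's actual interface.

```python
# Hypothetical tool schema advertised to the LLM with every request. The
# model answers utterances like "make me a red sphere that spins" with a
# structured call instead of (or alongside) plain text.
SCULPT_TOOLS = [{
    "name": "spawn_shape",
    "description": "Create a primitive shape in the sculpture space.",
    "parameters": {
        "type": "object",
        "properties": {
            "shape": {"type": "string", "enum": ["cube", "sphere", "cone"]},
            "color": {"type": "string"},
            "motion": {"type": "string", "enum": ["static", "spin", "orbit"]},
        },
        "required": ["shape", "color"],
    },
}]

def apply_tool_call(scene, call):
    """Dispatch a structured call emitted by the LLM onto the scene.

    `scene.spawn` and `set_motion` stand in for the engine's scripting layer.
    """
    if call["name"] == "spawn_shape":
        args = call["arguments"]
        obj = scene.spawn(args["shape"], color=args["color"])
        obj.set_motion(args.get("motion", "static"))

# Example of the structured call the LLM might emit for the utterance above:
example_call = {"name": "spawn_shape",
                "arguments": {"shape": "sphere", "color": "red", "motion": "spin"}}
```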

Integrating a generative AI chatbot into your VR scenarios can level up the experience by providing an entirely natural means of engagement: just put your headset on and go. And while the museum scenario was chosen as the use case for this project, the range of scenarios that can be supported – from training and education to rehabilitation, entertainment and many more – is virtually endless.

ABC-Space

Attention, Behavior and Curiosity in Space

Project Info

Start date:
September 2024

End date:
August 2028

Funding:
Swiss National Science Foundation (SNSF)

Coordinator:
Artanim

Summary

The aim of ABC-Space is to better understand how Attention and Curiosity are deployed in Space, and their impact on educational Virtual Reality (VR) experiences that use embodied social virtual characters. ABC-Space will also bring foundational insights into how social signals from interactive automated characters influence the cognitive and motivational processes that guide attention and memory in VR.

Our results will provide new insights into VR science, educational psychology, and cognitive science, achieved through a combination of novel experimental paradigms probing attention, motivation, and memory in VR. By the end of the project, ABC-Space will have contributed to the understanding of the relations between attention and curiosity, memory and learning, and 3D space representations, as well as their interplay in fully immersive VR. This knowledge will have important implications for VR science as well as for cognitive science and psychology. ABC-Space will thus pave the way to the design of a new kind of educational VR agent, expand current theoretical models of attention and 3D space representation in humans, and demonstrate their educational impact for applied purposes.

In this project, Artanim develops physics-based controllers driven by deep reinforcement learning to control virtual reality characters, produces the VR environments, and contributes to the design of the behavioral experiments that validate their impact in terms of attention and learning.
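
As a rough illustration, deep-RL character controllers typically optimize a reward that mixes imitation of reference motion with task objectives. The Python sketch below shows such a reward; the weights, scales, and terms are illustrative assumptions, not ABC-Space's actual design.

```python
import numpy as np

def controller_reward(sim_pose, ref_pose, sim_vel, ref_vel, goal_err,
                      w_pose=0.5, w_vel=0.2, w_goal=0.3):
    """Sketch of a reward for a physics-based character controller.

    Imitation terms keep the simulated pose and joint velocities close to a
    reference motion-capture clip; a task term (e.g. pointing or signaling
    toward a target) rewards goal completion. Weights are illustrative.
    """
    r_pose = np.exp(-2.0 * np.sum((sim_pose - ref_pose) ** 2))
    r_vel = np.exp(-0.1 * np.sum((sim_vel - ref_vel) ** 2))
    r_goal = np.exp(-1.0 * goal_err ** 2)
    return w_pose * r_pose + w_vel * r_vel + w_goal * r_goal

# Hypothetical rollout step: the policy outputs joint targets, the physics
# engine advances, and the reward compares the result to the reference clip.
pose, ref = np.zeros(30), np.full(30, 0.05)
print(controller_reward(pose, ref, np.zeros(30), np.zeros(30), goal_err=0.2))
```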

Partners

Artanim
Physics-based interactive Virtual Reality characters capable of signaling and social cueing

University of Geneva – Laboratory for Behavioral Neurology and Imaging of Cognition (LABNIC)
Cognitive and affective processes governing human attention in space

University of Geneva – Educational Technologies (TECFA)
Models of space representation used to form cognitive maps, and their impact in educational VR experiences

Presence

A toolset for hyper-realistic and XR-based human-human and human-machine interactions

Project Info

Start date:
January 2024

End date:
December 2027

Funding:
EU Commission (GAP-101135025) and SERI (REF-1131-52104)

Coordinator:
Fundació i2CAT

Website:
https://presence-xr.eu/

Summary

The concept of presence can be understood as a synthesis of interrelated psychophysical ingredients in which multiple perceptual dimensions intervene. A better understanding of how to influence specific aspects – plausibility, the illusion that virtual events are really happening; co-presence, the illusion of being with others; or place illusion, the feeling of being there – is key to improving the quality of XR experiences. The availability and performance of advanced technologies will let us reach high levels of presence in XR, essential to getting closer than ever to the old VR dream: to be anywhere, doing anything, together with others, from any place. The PRESENCE project will impact multiple dimensions of presence in physical-digital worlds, addressing three main challenges:

  • how to create realistic visual interactions among remote humans, delivering high-end holoportation paradigms based on live volumetric video capturing, compression and optimization techniques under heterogeneous computation and network conditions;
  • how to provide realistic touch among remote users and synthetic objects, developing novel haptic systems and enabling spatial multi-device synchronization in multi-user scenarios;
  • how to produce realistic social interactions among avatars and agents, generating AI virtual humans that represent real users or AI agents.

The contribution of Artanim will focus on smart autonomous characters able to generate physically plausible behavior by integrating cues from the avatars of human users or from other autonomous characters in specific scenarios. Specific parameters will be studied so that movement responds plausibly to social cues such as interpersonal differences or to collaboration tasks. For example, interactive virtual agents may take the position of others into account to respect implicit proxemics rules by generating small step rotations, or grab and offer an object when a user needs it.
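
As an illustration of such an implicit proxemics rule, the Python sketch below backs an agent away when a user enters a hypothetical comfort radius and yields a small per-tick yaw rotation toward the user. The radius, step sizes, and 2D ground-plane simplification are illustrative assumptions, not the project's actual parameters.

```python
import numpy as np

PERSONAL_SPACE_M = 1.2  # hypothetical comfort radius; real thresholds vary

def proxemics_adjustment(agent_pos, agent_yaw, user_pos):
    """Compute a small retreat step and yaw correction for an agent.

    Positions are 2D ground-plane coordinates; yaw is in radians. If the
    user is inside the agent's personal space, step away just enough to
    restore it, while rotating in small increments to keep facing the user.
    """
    offset = agent_pos - user_pos
    dist = np.linalg.norm(offset)
    step = np.zeros(2)
    if 1e-6 < dist < PERSONAL_SPACE_M:
        step = (offset / dist) * (PERSONAL_SPACE_M - dist)  # back away gently
    to_user = user_pos - agent_pos
    desired_yaw = np.arctan2(to_user[1], to_user[0])
    yaw_delta = (desired_yaw - agent_yaw + np.pi) % (2 * np.pi) - np.pi
    return step, np.clip(yaw_delta, -0.3, 0.3)  # small step rotations per tick

step, yaw = proxemics_adjustment(np.array([0.0, 0.0]), 0.0, np.array([0.8, 0.2]))
```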

Partners

Fundació i2CAT (Spain)

Actronika (France)

Universitaet Hamburg (Germany)

Ethniko Kentro Erevnas Kai Technologikis Anaptyxis (Greece)

Raytrix GmbH (Germany)

SenseGlove B.V. (Netherlands)

Go Touch VR SAS (France)

Didimo, S.A. (Portugal)

Vection Italy Srl (Italy)

Universitat de Barcelona (Spain)

Unity Technologies (Denmark)

Sound Holding B.V. (Netherlands)

Interuniversitair Micro-Electronica Centrum (Belgium)

Joanneum Research Forschungsgesellschaft MBH (Austria)

SyncVR Medical B.V. (Netherlands)

Zaubar UG (Haftungsbeschraenkt) (Germany)

Artanim

MoRehab XR

Motion-tracked rehabilitation through extended realities

Project Info

Start date:
August 2020

End date:
December 2026

Funding:

Coordinator:
Artanim

Summary

After stabilization surgery or a traumatic accident, physical rehabilitation is often required to restore a joint's normal function. This process requires very regular and repetitive training sessions. Because the patient's regularity and involvement are critical to the success of the procedure, keeping patients interested and motivated while they repeat similar exercises over and over is one of the physical therapist's main challenges. During these sessions, it is also difficult to objectively monitor small improvements, or the lack thereof. At the other end of the spectrum, some video games have proved very effective at motivating people to perform highly repetitive tasks on their controllers in order to progress through a story or a challenge.

The goal of this project is to use the captivating capabilities of video games to entertain and motivate patients throughout their rehabilitation, by designing a set of exercise games (exergames) that are won by performing the correct routines required for proper physical rehabilitation. Through modern motion tracking technology, the exergames will also be able to track the evolution of patients' physical recovery on a session-by-session basis, adapt the challenges patients face to their needs and current capabilities, and, if need be, alert medical personnel early when problems persist or no progress is observed.
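
To make the adaptation idea concrete, here is a minimal Python sketch of session-by-session difficulty scaling and a stall alert driven by a tracked range-of-motion metric. The metric, thresholds, and function names are illustrative assumptions, not the project's actual clinical scores.

```python
from statistics import mean

def update_session(history, session_rom, target_rom=120.0, stall_sessions=3):
    """Update exergame difficulty and check for stalled recovery.

    session_rom: joint range of motion (degrees) measured by the tracking
    system this session. Difficulty scales with recent performance toward
    a target ROM; an alert fires if the last few sessions barely changed.
    All thresholds are illustrative.
    """
    history.append(session_rom)
    difficulty = min(1.0, mean(history[-3:]) / target_rom)
    recent = history[-stall_sessions:]
    stalled = (len(recent) == stall_sessions
               and max(recent) - min(recent) < 2.0)  # < 2 deg change
    return difficulty, stalled  # `stalled` would flag the medical staff

history = []
for rom in [78.0, 79.5, 79.0, 79.6]:
    difficulty, alert = update_session(history, rom)
    print(f"difficulty={difficulty:.2f} alert={alert}")
```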

In this project, Artanim is in charge of building a platform suitable for these exergames, designing and developing the exergames themselves, and implementing the tools for a standardized assessment of patient performance. The medical usability and validity of the setup and exergames will in turn be tested and assessed by the clinical team of La Tour Hospital on a large set of volunteer patients. Both teams will also work together to develop metrics to evaluate and adapt patient performance during the sessions, leveraging the extended capabilities of the motion tracking system with respect to the data available in conventional physical therapy.

Partners

Artanim
Conception of the exergames and of a platform adapted to perform them, development of a set of medical scores adapted to the extended capabilities of the motion tracking system

La Tour Hospital – Division of Orthopedic and Trauma Surgery
Clinical tests, co-design of new physical evaluation metrics adapted to the motion tracking capabilities

Related Publications

Mancuso M, Charbonnier C. Technical evaluation of the fidelity of the HTC Vive for upper limb tracking, 42nd International Conference on Biomechanics in Sports, Salzburg, Austria, July 2024.
PDF

Real Virtuality

Immersive platform

Project Info

Start date:
April 2015

End date:
December 2025

Funding:

Coordinator:
Artanim

Summary

Artanim invented, and has continuously developed since 2015, the VR technology driving Dreamscape Immersive. This multi-user immersive platform combines a 3D environment – which can be seen and heard through a VR headset – with a real-life stage set incorporating haptic feedback elements.

The user's movements are sparsely captured in real time and translated into a full-body animation thanks to a deep understanding of body mechanics. Unlike static-position VR systems, the platform allows up to eight users to feel truly immersed in a VR scene. They are rendered as characters (avatars) inside a computer-generated world where they can move physically, interact with objects and other players, and truly experience worlds previously accessible only in their imagination. The bodies of the users thus become the interface between the real and virtual worlds.
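
One classic building block for turning sparse tracking points into limb poses is analytic two-bone inverse kinematics: given a shoulder position and a tracked hand target, place the elbow so the arm reaches the target. The Python sketch below is a textbook illustration under simplifying assumptions (fixed elbow swivel, illustrative bone lengths), not Artanim's actual solver.

```python
import numpy as np

def two_bone_ik(shoulder, target, upper_len=0.30, fore_len=0.28):
    """Place an elbow so the wrist can reach a tracked hand target.

    Uses the law of cosines for the shoulder bend angle and a fixed swivel
    plane chosen from the world up axis. Bone lengths are illustrative.
    """
    to_target = target - shoulder
    dist = np.clip(np.linalg.norm(to_target), 1e-6, upper_len + fore_len)
    dir_t = to_target / dist
    # Law of cosines: angle at the shoulder between upper arm and target line.
    cos_a = np.clip((upper_len**2 + dist**2 - fore_len**2)
                    / (2 * upper_len * dist), -1.0, 1.0)
    angle = np.arccos(cos_a)
    # Swivel axis perpendicular to the arm, derived from the world up vector.
    axis = np.cross(to_target, np.array([0.0, 1.0, 0.0]))
    n = np.linalg.norm(axis)
    axis = axis / n if n > 1e-6 else np.array([1.0, 0.0, 0.0])
    # Rotate the stretched-arm direction by `angle` around `axis`
    # (Rodrigues formula; the axis is perpendicular to dir_t, so the
    # parallel term vanishes).
    elbow_dir = dir_t * np.cos(angle) + np.cross(axis, dir_t) * np.sin(angle)
    return shoulder + upper_len * elbow_dir

elbow = two_bone_ik(np.array([0.0, 1.5, 0.0]), np.array([0.3, 1.2, 0.3]))
```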

The platform combines the following features:

  • Wireless: complete freedom of movement across large spaces.
  • Social: interactive multi-user experiences within a shared environment or across connected pods.
  • Accurate: full-body and physical-object tracking with less than 1 millimeter of deviation.
  • Real-time: zero discernible lag, eliminating most concerns of motion sickness.
  • Flexible: an SDK allowing content creators to easily build experiences for the platform.

This platform is leveraged by Dreamscape Immersive through its worldwide location-based VR entertainment centers, as well as through educational and training programs and other enterprise solutions.

The platform was also used to produce VR_I, the first ever choreographic work in immersive virtual reality, as well as Geneva 1850, a time-travel experience and historical reconstruction of the Geneva of 1850.

Related Publications

Chagué S, Charbonnier C. Real Virtuality: A Multi-User Immersive Platform Connecting Real and Virtual Worlds, VRIC 2016 Virtual Reality International Conference – Laval Virtual, Laval, France, ACM New York, NY, USA, March 2016.
PDF

Chagué S, Charbonnier C. Digital Cloning for an Increased Feeling of Presence in Collaborative Virtual Reality Environments, Proc. of 6th Int. Conf. on 3D Body Scanning Technologies, Lugano, Switzerland, October 2015.
PDF

Charbonnier C, Trouche V. Real Virtuality: Perspective offered by the Combination of Virtual Reality Headsets and Motion Capture, White Paper, August 2015.
PDF

Awards and Recognitions