Debug Mode!

Error 404: Experience Not Found

Project Info

Producer:
Dreamscape Immersive / Artanim

Year:
2025

Summary

Your Dreamscape experience is riddled with bugs! These dreadful digital pests might very well eat through everything they see, and you’re the only one standing in their way. Hang on to your speedster as you launch through the rifts at full speed, and explore many corrupted worlds! Set a course for the asteroid field, join forces in a post-apocalyptic desert… and above all, stay alert: a well-deserved break could be the perfect grounds for an ambush. Keep your lasers hot, aim for the highest score, and DEBUG. THIS. MESS!

In this new opus, Artanim collaborated with the Dreamscape Immersive team to create a free-flying immersive VR experience, combining a subtly balanced mix of storytelling and gaming with a splash of humor.

Permanently exhibited at the Dreamscape center in Geneva, on level -1 of the Confederation Center. Visit https://dreamscapegeneva.com/ for more information and tickets.

More information on the VR technology here.

Credits

Production, project direction, scenario, 3D content creation, gameplay and VR platform
Dreamscape Immersive / Artanim

Music
Alain Renaud

Voice actor
Adrien Buensod

Being Laser – Restless Limbs

Experimental video

Project Info

Client:
Alan Bogona

Year:
2024

Summary

The experimental video project Being Laser – Restless Limbs explores the interactions between light, the human body, and technology. Synthesizing speculative research that combines contemporary utopias and ancient archetypes related to artificial lighting, it highlights the cross-influences between Western and East Asian cultures. The project combines computer-generated (CGI) animated sequences with live-action footage shot in China, Switzerland, and Italy. The animated sequences feature ethereal, luminous, ever-changing avatars, brought to life using motion capture techniques.

Credits

Indoor skydiving performers
Benjamin Guex, Olivier Longchamp (RealFly Sion)

Butoh dancer
Flavia Ghisalberti

Motion capture
Artanim

Support by
Pro Helvetia

ABC-Space

Attention, Behavior and Curiosity in Space

Project Info

Start date:
September 2024

End date:
August 2028

Funding:
Swiss National Science Foundation (SNSF)

Coordinator:
Artanim

Summary

The aim of ABC-Space is to better understand how Attention and Curiosity deploy in Space, and how they impact educational Virtual Reality (VR) experiences that use embodied social virtual characters. ABC-Space will also bring foundational insights into how social signals from interactive automated characters influence the cognitive and motivational processes that guide attention and memory in VR.

Our results will provide new insights into VR science, educational psychology, and cognitive science. These results will be achieved through a combination of novel experimental paradigms probing attention, motivation, and memory in VR. By the end of the project, ABC-Space will have contributed to the understanding of the relation between attention and curiosity, memory and learning, and 3D space representations, as well as their interplay in fully immersive VR. This knowledge will have important implications both for VR science and for cognitive science and psychology. ABC-Space will thus pave the way to designing a new kind of educational VR agent, expand current theoretical models of attention and 3D space representation in humans, and demonstrate their educational impact for applied purposes.

In this project, Artanim develops physics-based controllers driven by Deep Reinforcement Learning to control virtual reality characters, produces the VR environments, and contributes to the design of the behavioural experiments that validate their impact on attention and learning.
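As a rough illustration of this kind of controller (a generic sketch, not Artanim's implementation), physics-based character control with deep reinforcement learning typically has a learned policy output target joint angles, which a low-level PD controller turns into torques applied in the physics simulation. The toy Python example below assumes a single hypothetical hinge joint, placeholder gains, and a stand-in policy function:

```python
import numpy as np

# Toy sketch of a physics-based character controller in the style of deep-RL
# approaches: a policy outputs PD targets, a PD controller converts them to
# torques. The single hinge joint, gains and stand-in "policy" below are
# hypothetical placeholders, not Artanim's implementation.

DT = 1.0 / 120.0          # physics time step (s)
KP, KD = 40.0, 2.0        # PD gains
INERTIA = 0.5             # joint inertia (kg*m^2)

def policy(observation: np.ndarray) -> float:
    """Stand-in for a trained neural-network policy: maps the observed
    joint state to a target joint angle (radians)."""
    # A real controller would evaluate a network trained with deep RL here.
    return 0.5 * np.tanh(observation @ np.array([0.3, -0.1]))

def pd_torque(q: float, qdot: float, q_target: float) -> float:
    """PD controller turning the policy's target angle into a joint torque."""
    return KP * (q_target - q) - KD * qdot

def step(q: float, qdot: float, torque: float) -> tuple[float, float]:
    """One explicit Euler step of the toy joint dynamics."""
    qddot = torque / INERTIA
    qdot += qddot * DT
    q += qdot * DT
    return q, qdot

if __name__ == "__main__":
    q, qdot = 0.0, 0.0
    for _ in range(240):                      # simulate 2 seconds
        obs = np.array([q, qdot])
        target = policy(obs)                  # policy picks a PD target
        tau = pd_torque(q, qdot, target)      # PD converts target to torque
        q, qdot = step(q, qdot, tau)          # physics integrates the torque
    print(f"final joint angle: {q:.3f} rad")
```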

Partners

Artanim
Physics-based interactive Virtual Reality characters capable of signaling and social cueing

University of Geneva – Laboratory for Behavioral Neurology and Imaging of Cognition (LABNIC)
Cognitive and affective processes governing human attention in space

University of Geneva – Educational Technologies (TECFA)
Models of space representation used to form cognitive maps, and their impact in educational VR experiences

Clinical Translation of MRgHIFU

Minimally invasive ablation of liver tumors

Project Info

Start date:
September 2024

End date:
August 2028

Funding:
Swiss National Science Foundation (SNSF)

Coordinator:
HUG – Division of Radiology

Summary

High intensity focused ultrasound (HIFU) is a precise method to thermally ablate deep-seated tumors in a non-invasive manner. A prerequisite for safe and effective application of HIFU is image guidance to plan and control the ablation process. The most suitable imaging modality is MRI, with its high soft-tissue contrast and its ability to monitor tissue temperature changes (MR-guided HIFU, MRgHIFU). The therapy of abdominal organs such as the liver still poses several problems: a moving target location caused by breathing, motion-related MR-thermometry artefacts, and near-field obstacles such as the thoracic cage or bowel.
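For intuition on the MR-thermometry mentioned above, the widely used proton resonance frequency (PRF) shift method estimates the temperature change from the phase difference between successive gradient-echo images. The short Python example below only illustrates that arithmetic with typical assumed values (3 T field, 10 ms echo time); it is not part of the project's processing pipeline:

```python
import math

# PRF-shift thermometry: delta_T = delta_phi / (2*pi * gamma * alpha * B0 * TE)
# The values below are typical textbook assumptions, not project parameters.
GAMMA = 42.58e6        # gyromagnetic ratio of 1H (Hz/T)
ALPHA = -0.01e-6       # PRF thermal coefficient (-0.01 ppm/degC, as a fraction)
B0 = 3.0               # main magnetic field strength (T)
TE = 10e-3             # echo time (s)

def temperature_change(delta_phi_rad: float) -> float:
    """Temperature change (degC) from the phase difference between two images."""
    return delta_phi_rad / (2 * math.pi * GAMMA * ALPHA * B0 * TE)

if __name__ == "__main__":
    # At these settings, a phase difference of -0.8 rad is roughly +10 degC.
    print(f"{temperature_change(-0.8):.1f} degC")
```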

The primary objective of this project is to perform clinical trials with MRgHIFU applied to liver neoplastic nodules. The project relies on a new MRgHIFU ultrasound concept developed in a previous SNSF project, which will be tested on patients for the first time. The clinical studies will address incrementally demanding requirements in the liver, from basic targeting to complete tumour ablation.

In this project, Artanim provides the computer science expertise, covering the design, implementation, and quality assurance of the real-time software integrating “self-scanning” sonication with closed-loop temperature feedback control.
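As a generic sketch of what closed-loop temperature feedback means (hypothetical values and thermal model, not the project's real-time software), a controller of this kind repeatedly compares the temperature measured by MR thermometry against a target and adjusts the acoustic power, for example with a simple proportional-integral rule:

```python
# Generic closed-loop temperature feedback sketch: a PI controller adjusts
# acoustic power from MR-thermometry readings. The thermal model, gains and
# update rate are hypothetical and unrelated to the project's software.

TARGET_TEMP = 57.0      # target focal temperature (degC)
BASELINE = 37.0         # body temperature (degC)
KP, KI = 5.0, 0.8       # controller gains (W per degC, W per degC*s)
DT = 1.0                # thermometry update interval (s)
MAX_POWER = 120.0       # safety cap on acoustic power (W)

def simulate_sonication(steps: int = 60) -> None:
    temp = BASELINE
    integral = 0.0
    for t in range(steps):
        error = TARGET_TEMP - temp
        integral += error * DT
        power = min(max(KP * error + KI * integral, 0.0), MAX_POWER)
        # Toy first-order thermal response: heating from power, perfusion loss.
        temp += DT * (0.05 * power - 0.1 * (temp - BASELINE))
        print(f"t={t:2d}s  power={power:6.1f} W  temp={temp:5.1f} degC")

if __name__ == "__main__":
    simulate_sonication()
```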

Partners

University Hospitals of Geneva – Division of Radiology
Imaging, data analysis, transducer development, clinical tests and project coordination

University Hospitals of Geneva – Division of Oncology
Clinical trials

University Hospitals of Geneva – Department of Surgery
Liver surgery

Artanim
Design, implementation and quality assurance of real-time software for integration of “self-scanning” sonication

Presence

A toolset for hyper-realistic and XR-based human-human and human-machine interactions

Project Info

Start date:
January 2024

End date:
December 2027

Funding:
EU Commission (GAP-101135025) and SERI (REF-1131-52104)

Coordinator:
Fundació i2CAT

Website:
https://presence-xr.eu/

Summary

The concept of presence can be understood as a synthesis of interrelated psychophysical ingredients in which multiple perceptual dimensions intervene. A better understanding of how to impact specific aspects such as plausibility (the illusion that virtual events are really happening), co-presence (the illusion of being with others), or place illusion (the feeling of being there) is key to improving the quality of XR experiences. The availability and performance of advanced technologies will let us reach high levels of presence in XR, essential to get closer than ever to the old VR dream: to be anywhere, doing anything, together with others from any place. The PRESENCE project will impact multiple dimensions of presence in physical-digital worlds, addressing three main challenges: i) how to create realistic visual interactions among remote humans, delivering high-end holoportation paradigms based on live volumetric video capturing, compression and optimization techniques under heterogeneous computation and network conditions; ii) how to provide realistic touch among remote users and synthetic objects, developing novel haptic systems and enabling spatial multi-device synchronization in multi-user scenarios; and iii) how to produce realistic social interactions among avatars and agents, generating AI virtual humans representing real users or AI agents.

Artanim's contribution will focus on smart autonomous characters able to generate physically plausible behavior by integrating cues from the avatars of human users or from other autonomous characters in specific scenarios. Specific parameters will be studied so that movement responds plausibly to social cues such as interpersonal differences or to collaboration tasks. For example, interactive virtual agents may take the position of others into account to respect implicit proxemics rules by generating small step rotations, or grab and offer an object that a user needs.
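To make the proxemics example concrete, the hypothetical sketch below checks the distance and relative bearing of a user's avatar and decides whether the agent should step back to restore personal space or take a small step rotation to keep facing the user; the thresholds and action names are illustrative only:

```python
import math

# Hypothetical illustration of simple proxemics rules for an interactive
# virtual agent: step back if the user is too close, take small step
# rotations to keep facing a nearby user. Thresholds are illustrative only.

PERSONAL_SPACE = 1.2     # comfort distance (m)
FACING_TOLERANCE = 20.0  # acceptable facing error (degrees)

def decide_action(agent_pos, agent_heading_deg, user_pos):
    """Return a high-level locomotion action for the agent."""
    dx, dy = user_pos[0] - agent_pos[0], user_pos[1] - agent_pos[1]
    distance = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dy, dx))
    # Smallest signed angle between the agent's heading and the user direction.
    facing_error = (bearing - agent_heading_deg + 180.0) % 360.0 - 180.0

    if distance < PERSONAL_SPACE:
        return "step_back"                      # restore personal space
    if abs(facing_error) > FACING_TOLERANCE:
        return "step_rotate_left" if facing_error > 0 else "step_rotate_right"
    return "idle"

if __name__ == "__main__":
    print(decide_action((0.0, 0.0), 0.0, (0.8, 0.2)))   # too close -> step_back
    print(decide_action((0.0, 0.0), 0.0, (1.0, 2.0)))   # off to the side -> rotate
    print(decide_action((0.0, 0.0), 0.0, (2.0, 0.1)))   # comfortable -> idle
```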

Partners

Fundació i2Cat (Spain)

Actronika (France)

Universitaet Hamburg (Germany)

Ethniko Kentro Erevnas Kai Technologikis Anaptyxis (Greece)

Raytrix GmbH (Germany)

SenseGlove B.V. (Netherlands)

Go Touch VR SAS (France)

Didimo, S.A. (Portugal)

Vection Italy Srl (Italy)

Universitat de Barcelona (Spain)

Unity Technologies (Denmark)

Sound Holding B.V. (Netherlands)

Interuniversitair Micro-Electronica Centrum (Belgium)

Joanneum Research Forschungsgesellschaft MBH (Austria)

SyncVR Medical B.V. (Netherlands)

Zaubar UG (Haftungsbeschraenkt) (Germany)

Artanim