Real Virtuality was presented in Los Angeles at SIGGRAPH 2015, the leading conference on computer graphics and interactive techniques. Real Virtuality is a multi-user immersive platform combining a 3D environment – which can be seen through a VR headset – with a real-life stage set. Users are tracked by a motion capture system, allowing them to see their own bodies and move physically in the virtual environment.
The experience offered by Real Virtuality is one of a kind. Unlike static-position VR systems, it allows users to become immersed in a VR scene by walking, running, interacting with physical objects and meeting other people. Because users’ movements exactly match their avatars’ movements in the 3D environment and are streamed to the headsets with very low latency, there is no discomfort and no additional interface is required: the users’ bodies become the interface. The platform offers a “matrix-like” degree of immersion over a large area of up to hundreds of square meters:
“Humanity has essentially recreated, on the basest level, The Matrix.” – Clay Whittaker, Journalist, Popular Science
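To give a rough idea of the data flow behind this (tracked joints streamed to the headsets with minimal delay), here is a minimal sketch. The joint list, packet format, port and the send_pose helper are illustrative assumptions, not Artanim's actual implementation.

```python
# Minimal sketch (illustrative only): broadcasting one frame of tracked joint
# positions over UDP so a VR client can update the user's avatar with low latency.
import json
import socket
import time

# Hypothetical joint set; a real skeleton would contain many more joints.
JOINTS = ["head", "spine", "hand_left", "hand_right", "foot_left", "foot_right"]

def send_pose(sock, address, pose):
    """Serialize one frame of joint positions and send it as a single UDP datagram."""
    packet = json.dumps({"t": time.time(), "pose": pose}).encode("utf-8")
    sock.sendto(packet, address)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    address = ("127.0.0.1", 9000)  # VR client endpoint (assumed for this example)
    # Placeholder pose: in a real setup these values would come from the motion
    # capture system at every frame (e.g. 120 Hz).
    pose = {joint: [0.0, 0.0, 0.0] for joint in JOINTS}
    send_pose(sock, address, pose)
```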
The demo “Walking through a pharaoh’s tomb”, created in collaboration with Kenzan, was selected as one of the three finalists of the Immersive Realities (AR/VR) contest. Artanim’s team was in Los Angeles from August 9 to 13 to present the project with the support of Pro Helvetia.
Are you dreaming of cloning yourself in 3D? This is now possible at the Coop Bassin center in Conthey (VS) thanks to the osmo-lab initiative. The photogrammetric scanner developed by Artanim is installed there until December 6 to scan your body and print your 3D figurine. The local press has covered the initiative.
And you, what have you planned as a Christmas gift?
Artanim has just added a new tool to its motion capture equipment: the OptiTrack Insight VCS, a professional virtual camera system. From now on, everyone doing motion capture at Artanim will be able to step into the virtual set, preview or record real camera movements and find the best angles on the current scene.
The motion capture data recorded by our Vicon system is processed in real time in MotionBuilder and displayed on the camera monitor. The position of MotionBuilder’s virtual camera is driven by the reflective markers attached to the camera rig. In addition, the camera operator can control several parameters such as zoom, horizontal panning, etc. The rig itself is very flexible and can be modified to accommodate different shooting styles (shoulder-mounted, hand-held, etc.).
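For readers curious how a camera pose can be derived from rig markers, here is a minimal sketch. It is illustrative only: the marker layout, the three-marker assumption and the numpy-based math are ours for this example, not the actual Vicon/MotionBuilder pipeline, which handles this internally.

```python
# Minimal sketch: deriving a virtual camera position and orientation from three
# reflective markers mounted on the camera rig (marker layout is an assumption).
import numpy as np

def camera_pose_from_markers(origin, forward_marker, up_marker):
    """Build a camera position and 3x3 rotation matrix from three rig markers."""
    position = origin
    forward = forward_marker - origin
    forward /= np.linalg.norm(forward)
    up_hint = up_marker - origin
    right = np.cross(up_hint, forward)
    right /= np.linalg.norm(right)
    up = np.cross(forward, right)          # re-orthogonalized up vector
    rotation = np.column_stack((right, up, forward))  # camera basis as columns
    return position, rotation

# Example frame: marker positions in meters, as they might come from the tracker.
pos, rot = camera_pose_from_markers(
    np.array([0.0, 1.5, 0.0]),
    np.array([0.2, 1.5, 0.0]),
    np.array([0.0, 1.7, 0.0]),
)
```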
We can’t wait to use it in future motion capture sessions and show you some results. Meanwhile, you can have a look at our first tests in the above video.
3D In Motion (3DIM) – our experimental setup for capturing, visualizing and sonifying movements in real time – was presented at the Montreux Jazz Festival on 12 July 2014 during a one-hour workshop. For this occasion, a special sound design was developed by Alain Renaud from MINTLab, mapping audio tracks that different artists had recorded during their visits to the festival. The tracks were generously provided by the Montreux Jazz Lab of EPFL.
Moreover, several improvements were made to the 3DIM application. For instance, an iPad app was implemented to switch easily between the sound design scenarios, a prompter was added to provide instructions to the user, and an OSC connection between the sonification and graphical applications was programmed so that sound events can trigger visual feedback.
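As an illustration of such an OSC link, here is a minimal sketch using the python-osc library; the address "/sound/event", the port and the payload are assumptions for this example and not the actual 3DIM message format or tooling.

```python
# Minimal sketch of an OSC link between a sonification app and a graphics app.
from pythonosc.udp_client import SimpleUDPClient
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

# Sonification side: notify the graphics app that a sound event occurred.
def send_sound_event(intensity: float) -> None:
    client = SimpleUDPClient("127.0.0.1", 9001)  # graphics app host/port (assumed)
    client.send_message("/sound/event", intensity)

# Graphics side: react to incoming sound events with visual feedback.
def on_sound_event(address: str, intensity: float) -> None:
    print(f"{address}: trigger visual feedback with intensity {intensity}")

if __name__ == "__main__":
    dispatcher = Dispatcher()
    dispatcher.map("/sound/event", on_sound_event)
    server = BlockingOSCUDPServer(("127.0.0.1", 9001), dispatcher)
    server.serve_forever()  # blocks; run the sender from another process
```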
Again, the feedback from the public was positive. We look forward to developing a first live performance using this system.
We're proud to announce the launch of our new immersive VR experience "Escalade - The Darkest Night", created in partnership with the Compagnie de 1602 and offered exclusively at the Dreamscape Geneva center.