SIGGRAPH 2015 – Real-Time Live!

Real Virtuality was presented in Los Angeles at SIGGRAPH 2015, the leading conference on computer graphics and interactive techniques. Real Virtuality is a multi-user immersive platform combining a 3D environment – viewed through a VR headset – with a real-life stage set. Users are tracked by a motion capture system, allowing them to see their own bodies and move physically within the virtual environment.

Real Virtuality offers a once-in-a-lifetime experience. Unlike static-position VR systems, it allows users to become immersed in a VR scene by walking, running, interacting with physical objects and meeting other people. Because users’ movements exactly match their avatars’ movements in the 3D environment and are streamed back to them with very low latency, there is no discomfort and no controller is required: the users’ bodies become the interface. The platform offers a “matrix-like” degree of immersion over a large area, up to hundreds of square meters:

“Humanity has essentially recreated, on the basest level, The Matrix.” – G. Clay Whittaker, Journalist, Popular Science
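To make the description above a little more concrete, here is a minimal sketch of the kind of low-latency pose-broadcast loop such a system relies on. The joint list, client addresses, packet layout and update rate are assumptions for illustration only; this is not the actual Real Virtuality implementation.

```python
# Minimal sketch: broadcast tracked joint positions to VR clients over UDP.
# Joint names, addresses, ports and the packet format are illustrative
# assumptions, not the actual Real Virtuality protocol.
import socket
import struct
import time

JOINTS = ["head", "hand_l", "hand_r", "foot_l", "foot_r"]  # assumed subset
CLIENTS = [("192.168.1.10", 9000)]                         # assumed headset PCs
RATE_HZ = 90                                               # assumed update rate

def get_tracked_pose():
    """Placeholder for the motion capture feed: joint -> (x, y, z) in meters."""
    return {name: (0.0, 1.7, 0.0) for name in JOINTS}

def pack_pose(pose):
    """Pack the joint count, then (x, y, z) floats in a fixed joint order."""
    data = struct.pack("<I", len(JOINTS))
    for name in JOINTS:
        data += struct.pack("<3f", *pose[name])
    return data

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
while True:
    packet = pack_pose(get_tracked_pose())
    for addr in CLIENTS:
        sock.sendto(packet, addr)   # fire-and-forget keeps latency low
    time.sleep(1.0 / RATE_HZ)
```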

The demo “Walking through a pharaoh’s tomb”, created in collaboration with Kenzan, was selected as one of the three finalists of the Immersive Realities (AR/VR) contest. Artanim’s team was in Los Angeles from August 9th to 13th to present the project with the support of Pro Helvetia.

SIGGRAPH 2015 – Walking through a pharaoh’s tomb

A step into virtual cinematography

Artanim has just added a new tool to its motion capture equipment: the OptiTrack Insight VCS, a professional virtual camera system. From now on, everyone doing motion capture at Artanim will be able to step into the virtual set, preview or record real camera movements and find the best angles on the current scene.

The OptiTrack Insight VCS, shoulder-mounted

The motion data captured by our Vicon system is processed in real time in MotionBuilder and displayed on the camera monitor. The position of MotionBuilder’s virtual camera is driven by the reflective markers mounted on the camera rig. In addition, the camera operator can control several parameters such as the camera zoom, horizontal panning, etc. The rig itself is very flexible and can be reconfigured to accommodate different shooting styles (shoulder-mounted, hand-held, etc.).
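As a rough illustration of this loop – a tracked rigid-body pose driving a virtual camera while an operator control adjusts the zoom – here is a conceptual sketch. The data structures and the get_rig_pose()/read_zoom_dial() helpers are hypothetical; this is not the MotionBuilder or OptiTrack API.

```python
# Conceptual sketch: copy the tracked pose of the physical camera rig onto a
# virtual camera each frame and map an operator zoom dial to a focal length.
# Helpers and value ranges are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    position: tuple            # (x, y, z) in scene units
    rotation: tuple            # quaternion (w, x, y, z)
    focal_length_mm: float = 35.0

def get_rig_pose():
    """Hypothetical: pose of the marker cluster on the rig, from the tracker."""
    return (0.0, 1.6, 2.0), (1.0, 0.0, 0.0, 0.0)

def read_zoom_dial():
    """Hypothetical: normalized 0..1 value from the operator's zoom control."""
    return 0.5

def update_camera(cam, zoom_min=18.0, zoom_max=120.0):
    # Copy the tracked rig pose onto the virtual camera for this frame...
    cam.position, cam.rotation = get_rig_pose()
    # ...and map the zoom dial linearly onto a focal-length range.
    t = read_zoom_dial()
    cam.focal_length_mm = zoom_min + t * (zoom_max - zoom_min)
    return cam

cam = VirtualCamera(position=(0, 0, 0), rotation=(1, 0, 0, 0))
print(update_camera(cam))
```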

We can’t wait to use it in future motion capture sessions and show you some results. Meanwhile, you can have a look at our first tests in the above video.

Virtual cinematography

3DIM @ Montreux Jazz Festival

3D In Motion (3DIM) – our experimental setup for real-time capture, visualization and sonification of movements – was presented at the Montreux Jazz Festival on the 12th of July 2014 during a one-hour workshop. For this occasion, Alain Renaud from MINTLab developed a special sound design, mapping audio tracks from different artists recorded during their visits to the festival. The tracks were generously provided by the Montreux Jazz Lab of EPFL.

Moreover, several improvements were made to the 3DIM application. For instance, an iPad app was implemented to easily switch between the sound design scenarios, a prompter was added to provide instructions to the user, and an OSC connection between the sonification and graphical applications was programmed so that sound events can drive visual feedback.
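As an illustration of how such an OSC link between the sonification engine and the graphical application might look, here is a minimal sketch using the python-osc library. The /sound/event address pattern, the port number and the message payload are assumptions for illustration, not the actual 3DIM implementation.

```python
# Minimal OSC link sketch (python-osc): the sonification application sends
# sound events, the graphical application listens and triggers visual feedback.
# The /sound/event address, port 5005 and message payload are illustrative
# assumptions, not the actual 3DIM protocol.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

def on_sound_event(address, event_name, intensity):
    # Graphics side: react to an incoming sound event.
    print(f"{address}: trigger visual for '{event_name}' at intensity {intensity}")

if __name__ == "__main__":
    # Graphics side: bind a listener for sound events.
    dispatcher = Dispatcher()
    dispatcher.map("/sound/event", on_sound_event)
    server = BlockingOSCUDPServer(("127.0.0.1", 5005), dispatcher)

    # Sonification side: notify the graphics app that a sound event occurred.
    # (In practice the two sides run in separate applications.)
    client = SimpleUDPClient("127.0.0.1", 5005)
    client.send_message("/sound/event", ["kick", 0.8])

    server.handle_request()  # process one queued OSC message, then return
```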

Again, the feedback from the public was positive. We look forward to developing a first live performance using this system.

More information about the 3DIM project here.

A demo video of the setup, including the sonification part, will be available soon. Stay tuned!

Montreux Jazz Festival demo