Your body feels good

Do you feel in control of the body that you see? This is an important question in virtual reality (VR), as it strongly affects the user's sense of presence and of embodiment of an avatar while immersed in a virtual environment. To better understand this aspect, we performed an experiment in the framework of the VR-Together project to assess the relative impact of different levels of body animation fidelity on presence.

In this experiment, users are equipped with a motion capture suit and reflective markers so that their movements can be tracked in real time with a Vicon optical motion capture system. They also wear Manus VR gloves for finger tracking and an Oculus HMD. In each trial, the avatar's face (eye gaze and mouth), fingers, upper body and lower body are animated with different degrees of fidelity: no animation, procedural animation or motion capture. Each time, users have to execute a number of tasks (walk, grab an object, speak in front of a mirror) and evaluate whether they feel in control of their body. Users start with the simplest setting and, according to their judged priority, improve features of the avatar animation until they are satisfied with the experience of control.
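As an illustration only, here is a minimal sketch of such a trial loop, assuming a questionnaire step between trials. The feature names, fidelity levels and helper functions (ask_satisfied, ask_next_feature) are hypothetical and do not come from the actual VR-Together experiment software.

```python
# Hypothetical sketch of the trial loop: the participant starts from the
# lowest-fidelity configuration and upgrades one animation feature per trial
# until they report feeling in control of their avatar.

FEATURES = ["eye_gaze", "mouth", "fingers", "upper_body", "lower_body"]
LEVELS = ["none", "procedural", "motion_capture"]  # increasing fidelity


def run_experiment(ask_satisfied, ask_next_feature):
    """ask_satisfied(config) and ask_next_feature(candidates) stand in for the
    questions asked to the participant after each set of tasks."""
    config = {feature: LEVELS[0] for feature in FEATURES}  # simplest setting
    upgrade_order = []                                     # data analysed afterwards

    while not ask_satisfied(config):
        # Features that can still be improved by one fidelity level.
        candidates = [f for f in FEATURES if config[f] != LEVELS[-1]]
        if not candidates:
            break                                          # everything already at max fidelity
        chosen = ask_next_feature(candidates)              # participant's judged priority
        config[chosen] = LEVELS[LEVELS.index(config[chosen]) + 1]
        upgrade_order.append((chosen, config[chosen]))
        # ... the participant then repeats the tasks (walk, grab, speak at the mirror) ...

    return upgrade_order
```

The returned upgrade order is what reveals which animation features participants consider most important.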

From the order in which users choose to improve the animation features, we can infer which features are most valuable to them. With this experiment, we want to weigh the relative importance of animation features against their costs of adoption (money and effort) in order to provide software and usage guidelines for live animation of rigged 3D character meshes based on affordable hardware. This outcome will help us better define what makes a compelling social VR experience.

New venues for capturing facial performance

We will soon start shooting cinematic content to showcase the technology developed by the VR-Together consortium. In this post, we present some of the production work under way at Artanim, which is currently exploring the use of Apple's iPhone X face tracking technology in its 3D animation production pipeline.

Facial mocap rig

The photos below show the iPhone X holding rig and an early version of the face tracking recording tool developed by Artanim. The tool integrates with Vicon's full-body and hand motion capture technology to allow the simultaneous recording of body, hand and face performances from multiple actors.
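To give an idea of how such an integration can work (an assumption for illustration, not a description of Artanim's actual tool): ARKit on the phone provides per-frame face blend-shape coefficients, and one simple approach is to stream them to the mocap PC over the network and timestamp them on arrival, so they can later be aligned with the body and hand take. The port and the JSON packet format below are made up for the example.

```python
# Minimal sketch: receive face blend-shape weights streamed from the phone over
# UDP and timestamp them on arrival, so they can be aligned with the body and
# hand capture afterwards. Address, port and payload format are assumptions.
import json
import socket
import time

HOST, PORT = "0.0.0.0", 9000  # assumed listening address of the recording PC


def record_face_stream(duration_s=10.0):
    """Collect (arrival_time, blendshape_weights) pairs for duration_s seconds."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((HOST, PORT))
    sock.settimeout(0.5)
    samples = []
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        try:
            packet, _addr = sock.recvfrom(4096)
        except socket.timeout:
            continue
        # Expected payload (assumption): {"jawOpen": 0.42, "eyeBlinkLeft": 0.10, ...}
        weights = json.loads(packet.decode("utf-8"))
        samples.append((time.monotonic(), weights))
    sock.close()
    return samples
```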

With the recent surge of consumer virtual reality, interest in motion capture has dramatically increased. The iPhone X and Apple's ARKit SDK, which integrate depth sensing and facial animation technologies, are a good example of this trend. Apple's effort to bring advanced face tracking to its mobile devices may be related to its acquisitions of PrimeSense and FaceShift: the former was involved in developing the technology powering the first Kinect back in 2010, while the latter is recognized for its face tracking technology, briefly showcased in the making-of trailer for Star Wars: The Force Awakens. These are exciting times, when advanced motion tracking technologies are becoming ubiquitous in our lives.

Image from the iPhone X keynote presentation from Apple

A step into virtual cinematography

Artanim just added a new tool to its motion capture equipment: the Optitrack Insight VCS, a professional virtual camera system. From now on, everyone doing motion capture at Artanim will be able to step into the virtual set, preview or record real camera movement and find the best angles to view the current scene.

The Optitrack Insight VCS shoulder mounted

The data captured by our Vicon system is processed in real time in MotionBuilder and displayed on the camera monitor. The position of MotionBuilder's virtual camera is driven by the reflective markers mounted on the camera rig. In addition, the camera operator can control several parameters such as camera zoom and horizontal panning. The rig itself is very flexible and can be reconfigured to accommodate different shooting styles (shoulder-mounted, hand-held, etc.).
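As a rough illustration (not the actual MotionBuilder setup), the mapping can be thought of as copying the tracked pose of the rig's markers onto the virtual camera each frame, while the operator's zoom control is translated into a field-of-view value. The helper below and its parameters are hypothetical.

```python
# Illustrative sketch: drive a virtual camera from the tracked pose of the
# camera rig and map a normalized zoom control to the field of view.
from dataclasses import dataclass


@dataclass
class CameraPose:
    position: tuple   # (x, y, z) of the rig in studio/world coordinates
    rotation: tuple   # rig orientation as a quaternion (x, y, z, w)
    fov_deg: float    # vertical field of view in degrees


def update_virtual_camera(rig_position, rig_rotation, zoom_normalized,
                          fov_wide=60.0, fov_tele=15.0):
    """zoom_normalized in [0, 1]: 0 = widest shot, 1 = longest lens."""
    zoom = min(max(zoom_normalized, 0.0), 1.0)
    fov = fov_wide + (fov_tele - fov_wide) * zoom  # simple linear zoom-to-FOV mapping
    return CameraPose(position=rig_position, rotation=rig_rotation, fov_deg=fov)
```

In practice the zoom curve would be calibrated to match the lens behaviour the operator expects, but a linear mapping is enough to convey the idea.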

We can’t wait to use it in future motion capture sessions and show you some results. Meanwhile, you can have a look at our first tests in the above video.

Virtual cinematography

Winter sports motion capture

Sochi 2014, the next Winter Olympic Games, is coming soon… For us, it was an opportunity to motion capture several winter sports! We were contacted by Kenzan Technologies for the making of a 3D animation requiring several short captures of professional athletes, and we spent a weekend in Zermatt to acquire the necessary data. The weather conditions were not on our side: -15°C at the top, snowfall and zero visibility, not really the best conditions to go skiing with our Xsens system and a computer on the slopes! After dealing with a lot of issues, especially because of the cold, we were finally able to obtain good animation data during the weekend. We captured four athletes in the following disciplines: Nordic skiing, alpine skiing, freestyle skiing and snowboarding.

Back in Geneva, we captured three additional people to complete our sample of winter sports: ice hockey, speed skating and bobsleigh. Now you are wondering… is there a bobsleigh track in Geneva? Of course there isn't! But with a few tricks, a good carpenter and a little practice, it is possible to create the illusion… Note that for this last capture, it was not necessary to have a professional athlete!

Hockey, snowboard and ski mocap

Bread mocap – A tribute to Charlie Chaplin

We were recently contacted to perform the motion capture for an upcoming short movie entitled “The Great Imitator”, created by Boris Beer. This short animated movie will be a tribute to Charlie Chaplin. Without giving away too many details, the goal of the shoot was to capture some iconic scenes from Chaplin's most famous movies.

For example, the first scene we captured was the one from The Great Dictator where Chaplin plays with an inflatable globe. We also had to capture the famous nut screwing scene from Modern Times as well as some scenes from The Kid. Fabrice Bessire (the actor) did a great job reinterpreting Chaplin in those scenes.

Finally, among the selected scenes was the famous “bread roll dance” from The Gold Rush. In this scene, Charlie Chaplin creates a small ballet by giving life to two forks and two bread rolls in order to entertain his friends. As you can see in the pictures, this capture required a very specific and unique bread motion capture setup (patent pending!).

We will talk about this short film again when it is finished. Stay tuned!

The famous bread roll dance from The Gold Rush

Upcoming workshop on motion capture

Are you interested in learning more about motion capture, or simply curious to discover the magic of mocap? On May 7th and 8th 2012, we will organize a workshop at Artanim in collaboration with Focal.

The goal of this two-day workshop will be to get hands-on experience with our two motion capture systems: our 24-camera MXT40s Vicon system and our Xsens MVN motion capture suit. From system calibration to final rendering, the whole pipeline will be covered.

Motion capture with Vicon, mocap with Xsens, and the motion capture studio

Program:

  • The first day will be dedicated to capturing data. After a brief theoretical introduction to motion capture, participants will be able to try each system and assess its advantages and drawbacks.
  • The second day will focus on data post-processing and its integration into 3D authoring software. This will give participants a clearer view of what is involved when using motion capture for 3D animation.

Registration to this workshop is open until April 4th on Focal’s website.