
Exploring Motion Capture in Remote Work: Insights from CEDEC2023


CEDEC2023, a large-scale conference for game developers, was held on-site at Pacifico Yokohama North and online. This article presents a report on a session about “FINAL FANTASY XVI.”

This session featured Itsuto Sato, an animator in Square Enix’s Third Business Development Department, and Eiji Takada, a technical artist.

Mr. Itsuto Sato on the left, Mr. Eiji Takada on the right

Can motion capture be used for remote work?

Since the start of the coronavirus pandemic in 2020, the team has needed a remote work environment even for motion capture. Humanoid characters are usually recorded in a large motion capture studio using special capture suits, and professional actors are brought in to perform the powerful movements used in the game. The studio uses multiple cameras and an optical motion capture system.

To take advantage of the unique benefits of remote work, the team began building an environment in which members who do not live near Tokyo, where the studio is located, can record and direct. This session covered their trial of a new recording method that works both in a remote environment and alongside the motion capture studio.

The presented footage showed a capture session at home. The camera angle was not ideal because the filming space was small, and cushions were put up and a futon laid out to muffle noise in the apartment while shooting the sword-sheathing animation.

There are several ways to build a motion capture system at home, but this time two were tried: “Perception Neuron Studio,” which uses inertial sensors, and “MediaPipe,” which creates capture data from video. The full-body tracker “mocopi,” which attaches six sensors to the body, was released in early 2023, when development was almost finished, so it had not yet been verified.

“Perception Neuron Studio” performs motion capture with sensors that each contain a gyroscope, an accelerometer, and a magnetometer. This device was chosen because it can be set up by one person.

The sensors are rechargeable, so no cables need to be run, but calibration takes about six and a half minutes because of the large number of sensors. It is completed by performing four poses: standing straight, T-pose, a hands-together pose, and an object-pinching pose. Setup is carried out in Axis Studio.

The flow for feeding the captured data into the rig has four steps: output the animation from Axis Studio, which receives the capture results → retarget it with Maya HumanIK → run it through the Maya HumanIK rig input tool (checking the result on the 3D model) → transfer it into the internal rig system.
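The session did not show code, but to make the retargeting step more concrete, here is a minimal Maya Python sketch (not Square Enix’s internal tool) of the bake that typically follows a HumanIK retarget, so the motion becomes plain keyframes on the target skeleton before it is handed to a rig system. The joint name “Hips” and the frame range are assumptions for the example, not details from the talk.

```python
# Minimal sketch: after a HumanIK source/target pairing has been configured
# in Maya's Character Controls, bake the retargeted motion onto the target
# skeleton as ordinary keyframes.
import maya.cmds as cmds

def bake_retarget(root_joint, start, end):
    """Bake retargeted animation onto root_joint and all child joints."""
    joints = cmds.listRelatives(root_joint, allDescendents=True,
                                type="joint") or []
    joints.append(root_joint)
    cmds.bakeResults(joints,
                     time=(start, end),
                     simulation=True,           # evaluate the scene per frame
                     sampleBy=1,                # one key per frame
                     disableImplicitControl=True,
                     preserveOutsideKeys=True)

# Hypothetical usage: bake frames 0-300 onto a skeleton rooted at "Hips".
bake_retarget("Hips", 0, 300)
```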

Mr. Sato then showed the motion-captured sword-sheathing animation; from the temporary model to the model used in the game, the animation played back correctly. The approach also proved its worth remotely, because the capture could be shared and displayed on screen: the shooting status could be checked in real time during a meeting, feedback given on the spot, the captured data shared immediately afterward, and the take re-recorded right away. He described it as a practical and fun workflow that gave him peace of mind.

It is also possible to capture animation from the hands alone, but that was not used this time. He said he would like to try capturing individual body parts in the future, since he foresees the need for close-up shots in some scenes.
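The talk did not say which tool would handle hand-only capture; as one possible approach, MediaPipe’s Hands solution can extract per-finger landmarks from ordinary video. The file name and the choice to print the index fingertip below are assumptions for illustration.

```python
# Illustrative sketch of hand-only capture from video with MediaPipe Hands
# (one possible tool; the session did not specify one).
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
cap = cv2.VideoCapture("hands.mp4")  # placeholder clip of a performer's hands

with mp_hands.Hands(max_num_hands=2) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV decodes frames as BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        for hand in results.multi_hand_landmarks or []:
            tip = hand.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
            print(tip.x, tip.y, tip.z)  # normalized image coordinates

cap.release()
```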

The session also introduced “MediaPipe,” which can create capture data from video, and “mocopi,” which was still in the testing stage at the time.
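To show what the video-based option looks like in practice, below is a minimal sketch using MediaPipe’s Python Pose solution to pull 3D body landmarks out of a video file. The path and the choice to print the right wrist are assumptions for the example; the session did not show its MediaPipe setup.

```python
# Minimal sketch: extract 3D pose landmarks per frame from a video file.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
cap = cv2.VideoCapture("input.mp4")  # placeholder path to a performance clip

with mp_pose.Pose(model_complexity=2) as pose:
    frame_idx = 0
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV decodes frames as BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_world_landmarks:
            # World landmarks are in meters with the origin at the hips --
            # a convenient starting point for retargeting onto a skeleton.
            wrist = results.pose_world_landmarks.landmark[
                mp_pose.PoseLandmark.RIGHT_WRIST]
            print(frame_idx, wrist.x, wrist.y, wrist.z)
        frame_idx += 1

cap.release()
```

Overall, the session showcased the advances and possibilities of motion capture in remote work environments, giving game developers valuable insights for their future projects.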


