iPi Soft

Motion Capture for the Masses


Archive for the ‘News’ Category

A Conversation With Blood & Clay Film Director

Posted on: August 20th, 2023

A Q&A With Martin Rahmlow, Director of Blood & Clay Short Film

Blood & Clay (https://bloodandclay.com/bnc.php) is an animated short film (~20 min) co-directed by Martin Rahmlow and Albert Radl. The film tells the story of the orphan Lizbeth, who tries to escape her nemesis, Director Kosswick, and his Golem. It is produced in a hybrid stop-motion/CGI technique by a three-member team: Martin Rahmlow, Albert Radl and Onni Pohl.

We spoke with film director Martin Rahmlow to learn how the team uses iPi Soft (https://www.ipisoft.com/) technology in their production pipeline.

Q. Please tell us a few words about yourself and the team.

All three of us met 20 years ago at film school, the Filmakademie Baden-Württemberg. Albert supported me during the production of my first stop-motion film, ‘Jimmy’ (https://vimeo.com/manage/videos/66549889), and I did the VFX for Albert’s ‘Prinz Ratte’ (https://www.amazon.de/Prinz-Ratte-Albert-Radl/dp/B07HM1WMGD). Blood & Clay (https://bloodandclay.com/bnc.php) is the first film on which all three of us are collaborating. Albert and I direct the film, and Onni focuses on the technical side.

Q. How did the team first hear about iPiSoft Mocap? And, how long has the team been using it?

In May 2015, Onni was assigned to set up lifelike 3D characters for a Mercedes-Benz Sprinter advertisement (production: Infected). It featured a freeze moment at a festival: the shoot itself was live, but all the background people were posed using tracking data captured with iPi. A colleague of Onni’s, Fabian Schaper, had proposed this approach, as he had a Kinect camera and a suitable notebook.

A year ago we discussed this possibility with our team, and after trying the software we decided that it could help with Blood & Clay, since we have human-like characters and a lot of animation. We ran a few tests early on but held our first serious recording session (with an actress and our two directors) just this May. We still have to process and clean the data, and we will hold another session to capture some missing details.

Q. What is the storyline for the film?

The movie tells the story of the orphan Lizbeth, who tries to escape her nemesis Director Kosswick and his Golem.

Q. Please describe briefly your production pipeline, and the role of iPi Mocap in the pipeline.

  • We use Prism as our production pipeline and have scripted small add-ons for our project
  • Texturing and detailing are done with Substance and ZBrush
  • Modelling and rigging are done in Maya
  • Characters get an individual (simplified) mocap rig
  • The sets are scratch-built and painted miniatures (scale 1:4), then scanned and converted into 3D assets
  • The layout version of the film (animatic) may contain raw parts of the first mocap session
  • During the animation phase, we plan to use our recorded animation (and loops created from it) as a basis for all possible body movements
  • We use Ragdoll to simulate cloth and hair and YETI for fur and hair FX
  • For final rendering we’re considering several options: either the classic path-tracing route with Arnold, or rendering with a game engine like Unity or Unreal

Q. How long has the team been working on the film? What were the overall technical challenges the team faced, and how does iPi Mocap help to meet them?

We started production in 2020. Since it will be a 20-minute short, we have a lot of animation involving three very different, but (partly) human, characters. iPi’s ability to load an individual rig for export lets us fit the recording to the character perfectly at export time. Eventually, all scenes with intensive human body movement that take place on the ground (not hanging or dangling) are planned to get a base body animation with head tracking from iPi; sometimes we use separate hand tracking. Fingers and faces will be animated separately.

We ran a first technical test in January. It took some hours until we found the right way to calibrate our two cameras. We did our recording session in May, using two Kinect v2 cameras and two Joy-Con controllers. We also have an Azure Kinect, but unfortunately it can’t be used together with the Kinect v2. Nevertheless, our calibration was good, and the tests we did with our recordings turned out very promising.

At the moment we are experimenting with iPi to assist with a roto scene. In this shot, hands are shaping clay, and we want to replace the hands and fingers with 3D hands. We filmed a close-up of hands shaping clay and recorded with iPi from another perspective, using special trackers for the hands. We hope this will give us a base for the roto and ease the finger roto-animation, as hands are quite difficult to track when the fingers move a lot. Because we had limited space, we modified the T-pose: the arms were stretched out to the front.

In our recording session we captured movements like crawling and someone pushing himself forward with his arms while sitting on a rolling board. We have not processed these recordings yet and are very curious whether iPi will be able to help us with this task.

Renderosity 2022 Animation Halloween Contest

Posted on: October 27th, 2022

A great opportunity to win a perpetual license of iPi Motion Capture – just send your animation to Renderosity Animation Halloween Contest. Submission deadline is Oct 31st.

The 1st prize is a Pro edition, the 2nd prize a Basic edition, and the 3rd prize an Express edition. Many other sponsors are giving away software licenses as well.

Good luck in the competition!

https://www.renderosity.com/contests/1594/2022-animation-halloween-contest

iPi Mocap Live Link Plugin for Unreal Engine 5.0 Released

Posted on: July 22nd, 2022

iPi Mocap Live Link plugin for Unreal Engine 5.0 released.

Use it in the same way as the UE4 plugins. The Live Link menu item is now in the Window > Virtual Production group in UE5.

Plugin download link: https://files.ipisoft.com/iPiMocap-Unreal-5-0.zip

Docs on how to use the plugin: https://docs.ipisoft.com/Animation_Streaming#Usage_in_Unreal

Important Updates and Enhancements in Motion Transfer Profile Editing

Posted on: June 27th, 2022

We have released updates and fixes for editing of motion transfer profiles.
What’s new:

  • Improvements in the UI for editing motion transfer profiles:
    • Check that no duplicate targets are used when updating a symmetric bone mapping.
    • Update the symmetric bone when adding/removing a target bone.
    • Enabled the “(Unused)” item in the combo list.
  • Fixes in motion transfer profile editing:
    • Fixed a crash when trying to select an already used target bone in the viewport.
    • Fixed the inability to map a symmetric bone when it can’t be assigned automatically.
    • Fixed an error when reading swing/twist weights from an XML file.

See details in the release notes:

http://docs.ipisoft.com/iPi_Mocap_Studio_Release_Notes#ver._4.5.7.258

The New Version Includes Unreal Engine’s MetaHuman Character Support

Posted on: April 18th, 2022

The recent release of iPi Mocap Studio 4.5.6 includes support for Unreal Engine’s MetaHuman character and other motion transfer enhancements:

  • Support for Unreal Engine’s MetaHuman character in animation export and streaming.
  • Motion transfer enhancements (see details).
    • Allow for multiple target bones in character mapping.
    • Allow for separate swing and twist rotation channels.
  • Added built-in motion transfer profile for Daz Genesis 8 character.
  • Separate swing and twist rotation channels are now used in the UE4 Mannequin and Endorphin built-in motion transfer profiles.

See details in the release notes:

http://docs.ipisoft.com/iPi_Mocap_Studio_Release_Notes#ver._4.5.6.256

CGW explores how our iPi Mocap is being used to create safe industrial warehouse settings

Posted on: March 18th, 2022

Computer Graphics World explores how our iPi Motion Capture technology is being used to create safe industrial warehouse settings and assembly areas for humans working alongside cobots (collaborative robots). Read the Q&A with Marco Capuzzimati, a research fellow at the University of Bologna, who is streaming mocap data from iPi Mocap Studio to Unity to analyze biomechanical movements and produce ergonomic designs that maximize operator satisfaction and system performance. https://bit.ly/3CH2XNS

Dirty Mechanics Competitive High School Robotics/Animation Team Uses iPi Motion Capture To Create Stand-Out Short Film

Posted on: February 15th, 2022

While most high school students are pushing parental boundaries, Keanu Brayman, a high school student and team co-captain and animation lead of the Dirty Mechanics Competition Team 3923, is more interested in pushing his motion capture creativity. Keanu and fellow members of the Dirty Mechanics team compete in FIRST® (For Inspiration and Recognition of Science and Technology), a non-profit international youth organization that aims “to inspire young people to be science and technology leaders and innovators, by engaging them in exciting mentor-based programs that build science, engineering, and technology skills, that inspire innovation.”

For the past three years, iPi Soft has been a team sponsor providing licenses of our markerless motion capture solution, iPi Mocap, as well as technical support.

Founded by Dean Kamen, the inventor and advocate for science and technology, FIRST has over 97,000 students and 29,000 volunteers. The Dirty Mechanics team comprises students aged 13 to 18 from 12 different schools, and places a heavy emphasis on teaching digital skills to its younger members to ensure the sustainability of the team’s future projects.

While the Dirty Mechanics’ key focus is building 60kg robots for competition, the team also competes in FIRST animation competitions. In 2021, Dirty Mechanics submitted a sophisticated animated short film, “Be Safe to Each Other,” to the FIRST Robotics Safety Animation Award. The project fully utilized iPi Soft 3D motion capture technology and allowed the team to experiment with real-time virtual production by streaming mocap data from iPi Mocap Studio to Unity 3D and using an HTC Vive VR system for camera tracking. (behind the scenes video link).

“We have used iPi Mocap to compete in FIRST competitions three years in a row and appreciate iPi Soft’s continued support of our animation projects,” Brayman says. “Learning as we go, my team has grown strong as has our knowledge and passion for animation.”

We spoke to Keanu about his team’s animated short films and how our technology helped bring them to fruition.

Q. How did the Dirty Mechanics team first hear about iPi Mocap?

KB: This will be the team’s third year using the software. While preparing to work on our submission for the FIRST Robotics Competition 2020 Digital Animation Award, I started searching for a motion capture solution that we could use without prohibitively expensive equipment, and quickly found iPi Soft’s website. iPi Mocap was exactly what we needed to perform achievable motion capture with amazing results.

Q. What was the creative brief for the film?

KB: The FIRST competition is sponsored by Underwriters Laboratories (UL). They were asking teams to “create an environment where everyone feels safe to do their best work. We know it is critically important that teams are physically safe by making sure work environments are clean, organized and hazard-free, but safety is also about caring for each other’s feelings and mental well-being – their psychological safety.” Film submissions were restricted to 40 seconds, including opening titles and credits.

Q. What is the storyline for the film?

KB: Our animated film “Be Safe to Each Other” shows a robotic astronaut walking through a derelict spaceship corridor while being stalked by a shadowy figure. Thankfully, the shadow turns out to be just another friendly astronaut. After this reveal, the ship’s astronauts work together to repair the corridor.

Q. Tell us about how the team fully utilized 3D motion capture and built a working real-time capture system and virtual production pipeline.

KB: Our real-time capture system used iPi Recorder running two Microsoft Kinect V2 sensors with distributed recording; real-time motion tracking in iPi Mocap Studio; live streaming of mocap data to Unity; an HTC Vive VR system tracking a motion controller as a virtual camera; and a phone attached to the controller streaming the virtual camera’s perspective. This system was developed and tested in conjunction with our Safety Animation film project, although we opted for a more standard iPi Mocap workflow for our final submission due to lower framerates in the real-time stream.

Q. How long did the team work on the Safety Animation piece? What were the overall motion capture technical challenges that the team faced?

KB: Our team spent about three months writing a script, developing our mocap system, creating 3D assets, acting out scenes, and rendering the final product. Our greatest challenge was finishing all the complex character animations in our script before the competition deadline. Thanks to iPi Mocap, we were able to create refined, highly accurate animations using motion capture, a task that took only a few hours. Without iPi Mocap, such an undertaking would undoubtedly have involved weeks of tedious keyframe animation.

Q. How did the team improve on its mocap skillset since your previous films?

KB: We learned a lot in the time between the 2020 “Infinite Reef” film and our 2021 “Safety Animation.” First, our overall knowledge of animation and rendering improved immensely. For the 2021 film, we utilized ray-tracing to create a much more realistic look. Additionally, we gained a much better sense of scope, deciding to focus on a single scene with a small number of detailed assets rather than modeling as many things as possible. We also became much more experienced with iPi Mocap, allowing us to incorporate more complex motions in our script. The primary goal of these projects is for everyone on the team to learn as much as possible, and that goal is certainly being achieved.

Q. You mentioned Dirty Mechanics is planning to improve its real-time virtual production system even further by utilizing Unreal Engine 4 and switching to PlayStation Eye cameras for tracking. How will iPi Mocap be incorporated into this pipeline? Is there a project underway?

KB: We are excited to say that there is a project underway. Currently, we are in production for our FRC 2022 Digital Animation Award submission. Our newest pipeline utilizes iPi Mocap with 8 PlayStation Eye cameras arranged in a circle. This has facilitated a smoother workflow by allowing us to operate the entire mocap system with a single PC and avoid some annoyances present in the Microsoft Kinect Sensors. Once we have captured high quality motion capture data with iPi Mocap, we import the resulting files into Unreal Engine 4, then record the final render in real-time using our HTC Vive VR system to track a virtual camera.

Film Mocap in Miami

Posted on: December 13th, 2021

A Conversation with George Carvalho, Animator /Director of Animaza Studios

Miami-based Animaza Studios, led by animator George Carvalho, has been in business for nearly four years creating a mix of original and client-based animation work. More than just an animation studio, Animaza sees itself as an entertainment company, with several of their original short films serving as trailers for books published by the company’s publishing arm. The studio recently invested in iPi Soft markerless motion capture technology and has put it to use, particularly in two recent short films, “The Goblin Pact” and “Termite Samurai.” Both films were entered into international film festivals and received accolades.

We spoke with Carvalho about Animaza Studios, its work and how markerless motion capture is helping the studio achieve its creative goals.

iPi Soft: Tell us a little about your background in animation and what drew you to it? How long have you been working in the field? Prior to Animaza did you work for other studios?

GC: I’ve always liked animation, especially Japanese animation. Many years ago, I was working with software called Infini-D (now DAZ) to create titles for corporate video work. At the time, my brother was learning Autodesk 3ds Max and told me it was the software Hollywood studios use and that I should try it out.

I had recently finished a live-action independent feature film where I was also the cinematographer and thought the lighting could have been better. To improve the quality, I needed access to lights, and at the time only tungsten lights were available but very expensive to rent. With 3ds Max I was able to simulate lights that behaved like real world lights and create photo real results from my home computer. To better learn the software, I started making short films. When I started posting them on YouTube the response was better than I expected. That’s when I decided to start the company.

iPi Soft: How did you initially hear about iPi Mocap? Do the two Animaza award-winning short films – “Goblin Pact” and “Termite Samurai” – both use iPi Mocap?

GC: I watch an enormous number of behind-the-scenes and making-of videos, and mocap occasionally would come up as a new way of animating characters. Both of our short films make extensive use of iPi Mocap and received either best animation short or honorable mentions. I definitely hope to enter future film projects in festivals.

iPi Soft: Tell us a bit about your motion capture process and the major benefits it brings to production? Were you using other animation processes prior to using mocap that prompted you to switch gears towards using a motion capture solution?

GC: When I started on this journey, I was using key frame animation and doing everything by hand. This took an enormous amount of time. After much research, I came across iPi Soft motion capture software and presently use it with the Microsoft Azure Kinect camera.

One of the main benefits of using iPi Mocap in the character animation production process is that it drastically cuts down on hand animating characters. I capture all the performances in my living room conveniently and efficiently. I find iPi Mocap is very simple and intuitive to use and doesn’t require a whole team to get excellent results.

In addition, the quality of the software’s captures, including small nuances of movement like picking up objects or turning, helps me capture the individual movements that bring a character’s unique personality to life.

iPi Soft: What other software and hardware tools were used in conjunction with iPi Mocap in the Animaza production pipeline on the above films?

GC: The main tool I use for animation is Autodesk 3ds Max with the Arnold renderer. Adobe Photoshop manages the textures, Adobe After Effects composites the images rendered out of 3ds Max, and Adobe Premiere handles film editing. Adobe Mixamo is used for some of the animation, with DAZ Studio used to acquire 3D resources such as characters and settings.

iPi Soft: Can you walk us through a couple of key scenes in Goblin Pact and Termite Samurai where iPi Mocap helped meet a particular creative challenge? Is there something in particular about a markerless mocap solution, like iPi Mocap, that makes the content creation process easier to use?

GC: A challenging scene in “The Goblin Pact” was in the beginning of the film when the fairies are admiring their wings. I had my wife act out the motions and capture them with iPi Mocap because females have a different center of gravity and move differently than males.

In “Termite Samurai”, we faced a particular challenge when the Termite Samurai must draw out his sword as the spider is attacking, and then fight the spider. The samurai motion was captured all in iPi Mocap and the spider was hand animated – all of it had to be synced together to make it believable.

The advantage of markerless mocap is that it makes it fast to capture the movements needed. So fast, in fact, that it allows me to experiment and refine the animation so that it’s the best it can be. Other solutions may be more accurate, such as those using camera rigs or capture suits, but they are much more expensive and require a team and extensive preparation to use.

iPi Soft: Are you working on any other animation projects currently that rely on iPi Mocap?

GC: We just released the trailer for the novel “The Proto Sapien Protocol” on November 1st. The next project is based on Aesop’s fable “The Hare and the Tortoise”. It is currently in production, and both the film and the book are on track to be released early next year.

iPi Soft: Is there anything else you’d like to tell us about your mocap workflow?

GC: The first time I used mocap was on a “Silver Surfer” fan film where an alien character is holding a baby. The moment I saw the character move in such a convincing, fluid way, I knew the software would become an integral part of my future content creation workflow, especially since all the other characters were hand animated and you can see the difference. At the time, I was using the old Xbox Kinect camera, which had limited motion accuracy. But when Microsoft released their Microsoft Azure camera, and iPi Soft updated its support, I jumped right in and haven’t looked back since. I will always be grateful for iPi software, because it has truly allowed me to create high quality animation at a reduced cost and build a thriving entertainment company.

Using iPi Soft Tools To Design A Safer Workplace

Posted on: August 31st, 2021

A Q&A With Marco Capuzzimati, Research Fellow at University of Bologna

To hear many future-of-work theorists tell it, the manufacturing workspace of the future will be increasingly filled with humans and robots working together. These collaborative robots, or cobots, work alongside humans without any separation. Moreover, they are easily reprogrammed, making them ideal for repetitive or dangerous tasks humans would prefer not to do.

But how do you know whether a cobot is safe to work around humans? How far should its arm move? How fast? Figuring out those details is the job of researchers like Marco Capuzzimati at the University of Bologna, who use markerless motion capture tools such as iPi Motion Capture to work out range-of-motion details in software before trying them out in the actual workplace.

In fact, scientists and medical researchers discovering innovative ways of incorporating motion capture technology led iPi Soft to create the Biomech Add-On, designed to enable visualization of tracking data for gait analysis and rehabilitation, sports motion analysis, and research in 3D human kinematics.

We spoke with Marco about his work and how he’s using iPi Soft technology.

Q. Tell us more about your work?

MC: The project aims to create a management and feeding system for a collaborative cell, consisting of a human operator and a cobot, or collaborative robot. We are developing a system capable of handling the feeding phase autonomously: the selection, picking and storage of parts and components involved in the production process.

Q. How are you using iPi Soft to analyze movements? Is this being done in the assembly cell area of a warehouse?

MC: We stream mocap data from iPi Mocap Studio 4 to Unity 3D where we can do a biomechanical analysis of the movements both in the warehouse and in the assembly cell.

Q. Can you provide additional detail on the biomechanical analysis?

MC: We have developed C# code to handle the incoming mocap data. The data is then sent to a Microsoft HoloLens, where we can perform biomechanical analysis. We are particularly focused on the angles between certain joints so we can identify critical configurations.
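The team’s C# and HoloLens code isn’t shown here, but the core joint-angle computation behind this kind of analysis is straightforward. The following sketch (in Python rather than C#, with hypothetical joint positions and a made-up threshold) illustrates how the angle at a joint can be derived from the positions of the joint and its two neighboring joints:

```python
import math

def joint_angle(parent, joint, child):
    """Angle in degrees at `joint` between the segments joint->parent and joint->child."""
    u = tuple(p - j for p, j in zip(parent, joint))
    v = tuple(c - j for c, j in zip(child, joint))
    dot = sum(a * b for a, b in zip(u, v))
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    cos_a = max(-1.0, min(1.0, dot / (math.hypot(*u) * math.hypot(*v))))
    return math.degrees(math.acos(cos_a))

# Hypothetical elbow check: shoulder, elbow and wrist positions in meters.
angle = joint_angle((0.0, 1.4, 0.0), (0.3, 1.2, 0.0), (0.5, 1.4, 0.1))
if angle < 60.0:  # illustrative threshold, not from the interview
    print(f"critical elbow configuration: {angle:.1f} degrees")
```

In a real pipeline, the three positions would be updated every frame from the incoming mocap stream, and flagged configurations would feed into the ergonomic risk assessment.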

Q. How does motion capture technology help in your research?

MC: Motion capture is helping us in warehouse settings, where cobots and people work together to load components and parts onto warehouse shelving, and in the assembly phase, where people work with support from cobots on tasks cobots can’t do. The ergonomic design of the assembly cell is very important for maximizing operator satisfaction and system performance, so the main objective of motion capture in this project is ergonomic optimization and risk assessment during certain assembly or load-handling activities. In this way we can optimize system performance and reduce the risk of injury.

Q. Can you describe the assembly cell in greater detail? Is the assembly cell a specific area of a warehouse where the robot is assembled? 

MC: The entire area of the ‘Pilot Line,’ as we call it, consists of a warehouse setting with inclined shelves. An assembly cell, formed by a fixed cobot and a flexible working table, is the main area of interest. These elements allow cobots to carry and move materials and components from the warehouse to the assembly cell.

The main objectives are real-time biomechanical analysis of human operators and application of the ergonomic index for risk assessment during work activities. We have to analyze every single movement during assembly and consider the weights of the parts and components handled, the frequency of handling and the efforts that the operator has to make.

Q. Finally, how long have you been conducting the motion capture research at the University of Bologna? Is there an end date?

MC: My research in the industrial engineering department began in September 2020 and is scheduled for completion in July 2022. I’m looking forward to future projects and opportunities to deploy this motion capture technology.

Bi-Rex-1

Picture 1. Marco Capuzzimati, University of Bologna researcher, pictured here at the Bi-Rex: Big Data Innovation & Research Excellence competence center, is conducting biomechanical analysis using iPi Soft motion capture software on workplace safety between humans and collaborative robots (cobots).

Bi-Rex-2

Picture 2. Marco Capuzzimati, University of Bologna researcher, pictured here at the Bi-Rex: Big Data Innovation & Research Excellence competence center,  is using iPi Soft motion capture software to analyze workspaces between humans and collaborative robots (cobots), to ensure workplace safety.

iPiMocap_and_Hololens_in_biomech_research

Picture 3. Marco Capuzzimati, University of Bologna researcher, is using iPi Soft motion capture data in conjunction with the Microsoft HoloLens to conduct biomechanical analysis in warehouse settings to ensure workplace safety where humans and collaborative robots (cobots) will be working together.

Digital Media World Reports on Plugin Announcement for iPi Mocap and Unreal Engine

Posted on: April 6th, 2021

Thank you Digital Media World for reporting on our exciting plugin announcement for iPi Mocap and Unreal Engine. This integration is an ideal real-time workflow solution for game developers, virtual production content creators, and more. Take a look at the news here.

https://digitalmediaworld.tv/animation/3534-ipi-soft-develops-real-time-integration-for-ipi-mocap-and-unreal-engine

#iPiSoft #iPiMocap #UnrealEngine #gamedev #virtualproduction #motioncapture #DigitalMediaWorld
