
Inochi2D - Free Open Source 2D VTuber Avatar Rigging and Puppeteering Software (Part 2 - Inochi2D Session)

My Cartoon Animator TET Avatar in Inochi2D Session.
In part one of my deep dive into the free VTuber software Inochi2D, I focused mainly on Inochi2D Creator, which is used to rig your character avatar and save it in the correct file format for Inochi2D Session, the puppeteering half of the software.

The two sides of the software are still very much in development, and the documentation, particularly for Session, is thin on the ground, to the point where I don't think I could write a comprehensive tutorial: I'm not sure I'm even doing things right, and the software could change significantly in a single update.

As a result, in this part of my Inochi2D deep dive I'm changing tack: rather than presenting my finished Cartoon Animator TET Avatar, I'll be summarizing my experience of getting Session up and running with OpenSeeFace, the recommended webcam motion capture software.

To do this I will be using the TET avatar I created in my review of Mannequin, since that can be exported as a full, ready-to-go rig for use with Session, bypassing Inochi2D Creator altogether. If you want to give this a try, the free version of Mannequin is all you need.


Before You Start, Review Your Mannequin Avatar

In the course of writing this article I came to realize that if you want your Mannequin avatar to be fully motion-capture rigged for Inochi2D Session, you need to make sure you use assets that are pre-rigged for Inochi2D. This means, when putting your avatar together, you should specifically filter the various asset galleries by the Inochi2D logo. For example, there is only one mouth in Mannequin you can use if you want your character to lip sync in Session.

Selecting an Inochi2D Session compatible mouth in Mannequin.
There is only one mouth asset in Mannequin that is pre-rigged for lip sync in Inochi2D Session.
If any of your avatar's facial features are not being tracked in Session, chances are it's because you've selected an asset that isn't pre-rigged. Simply go back into Mannequin, change it, then re-import your avatar into Session.

Setting Up My Mannequin TET Avatar in Session

To get my character working in Session I followed the video tutorial (below) for Mannequin on how to set up your character in Inochi2D Session using a webcam.

Note that this tutorial is a continuation of a previous Mannequin video tutorial; watch the end of that earlier video for instructions on how to connect Session to OBS for livestreaming.

Exporting my TET avatar from Mannequin as an INX file for OpenSeeFace.
On the Export tab of Mannequin these are the settings you need to pay attention to.
Export Your Character

The first step is to export your finished character from Mannequin. Set the file type to .INX for Inochi2D. You will then be asked which motion capture method you want to use.

We'll be using OpenSeeFace, which works with a webcam. OpenSeeFace only tracks your face and head movement, though with the female Mannequin avatar it does appear to drive something called 'body roll', which gives some upper body movement.

Since the male body avatar is still a work in progress I presume body roll will be added eventually.


Download and Run OpenSeeFace

Once you've downloaded OpenSeeFace, extract the zip file into a folder. You don't actually install it. Just run it from the folder. Windows may try to stop you but I can assure you it's perfectly safe to let it run.

The tutorial will tell you to go into OpenSeeFace's Binary folder and run the file 'facetracker.exe'. This is supposed to immediately start tracking your movements through your webcam; however, it did not work for me, as I could tell the software wasn't tracking anything.

This is because my webcam isn't the default camera on my system; I have several virtual cameras installed by applications that process my webcam's feed. You might run into the same problem if you have any application installed that applies a filter to your webcam before sending out an image.

If you want to choose your camera and, as an added bonus, see a preview window of your webcam with tracking dots, run the file 'run.bat' from the same Binary folder instead. Follow the prompts to select your camera and frame rate (entering -1 for the default camera settings, when prompted, should work fine; I selected 30 frames per second).

OpenSeeFace Motion tracking and Preview Window.
If you run OpenSeeFace with the 'run.bat' file, once you've answered the prompt questions to choose and set up your webcam you'll get two preview windows like this showing you how the motion tracking is working.

Open Inochi2D Session

Once you have motion capture running, open Inochi2D Session. The only way to add your avatar is to drag it from its folder onto Session's window. If you want to remove your avatar from the Session stage, click and drag it to the bottom left corner of Session's window, over the trashcan icon (the icon may vanish, but you can still remove avatars by dragging them to where it should be).

To position your avatar in the window, click and drag it with the mouse pointer. To resize it, the only way I could find was to hold down the left mouse button over my avatar and spin the scroll wheel on my mouse.


Set Up Inochi2D Session's Environment

When Session starts, a number of settings windows are already open (stacked on top of each other); if not, you can turn them on under the View menu. To dock the tracking window, drag it by its title bar over one of the side tabs that appear, and it will attach to the side of the window. You'll be using this window a lot to fine-tune your model.

Through the Scene settings window you can add a background to your scene, or set it to one of four colors you may want to chroma key out in OBS, so you can have your avatar stand in front of other content.

Inochi2D Session's Virtual Space Window.
Inochi2D Session's Virtual Space window set up for OpenSeeFace.
Finally you'll want to go into the View menu and select 'Virtual Space' under Configuration. This is where you link Session to OpenSeeFace.

If this is your first time using Session, you'll need to set up a space by typing any name into the box and hitting the '+' button. Now select the space you just created; it will appear alongside with a plus button for you to click.

Click it and select OpenSeeFace from the drop-down box. Then enter 11573 into the osf_bind_port box and 0.0.0.0 into the osf_bind_ip box. Click Save Changes, then click Save on the whole Virtual Space window.

If you're standing in front of your camera, you should see your avatar start to react to your movements. If you're not seeing any action in the tracking box, click on your avatar to select it, and you should see the tracking box (and blend shapes window) light up with info.
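The osf_bind_port and osf_bind_ip values tell Session where to listen for OpenSeeFace's UDP tracking packets. If your avatar isn't reacting at all, one way to narrow the problem down is to check whether packets are actually arriving on that port. The little Python sketch below is my own diagnostic, not part of either tool: close Session first (so the port is free), run the tracker, and call the function. If it reports bytes arriving, OpenSeeFace is fine and the issue is on Session's side.

```python
# Diagnostic sketch (assumption: OpenSeeFace streams tracking data as UDP
# datagrams to the address/port entered in Session's Virtual Space window).
import socket

def wait_for_packet(port: int = 11573, timeout: float = 5.0) -> int:
    """Listen on 0.0.0.0:port and return the size in bytes of the first
    UDP packet received, or -1 if nothing arrives within the timeout."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.bind(("0.0.0.0", port))
        try:
            data, addr = sock.recvfrom(65535)
            print(f"Received {len(data)} bytes from {addr} - the tracker is sending.")
            return len(data)
        except socket.timeout:
            print("No packets received - check the tracker is running and the port matches.")
            return -1

# Usage (with Session closed and facetracker running):
# wait_for_packet()
```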


Fine Tune Your Tracking Settings

The final step is to fine-tune all the tracking settings to minimize the amount of jitter you may be seeing. This is done by increasing the dampen setting for each attribute. Once you're happy, click the Save to File button at the top of the tracking window to have all the tracking info saved with your avatar (so you won't have to adjust the tracking each time).

TET and Mia Mannequin Avatars in Inochi2D Session with tracking information.
Once OpenSeeFace is connected you should see tracking information for the currently selected avatar appear. Use the tracking window to fine tune the Dampen settings to remove jitter from the motion capture.
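Session doesn't document what its Dampen setting does mathematically, but jitter reduction in tracking pipelines is commonly some form of exponential smoothing, where a higher dampen value means each new tracked sample only moves the displayed value a small step. The sketch below is an illustration of that general technique, an assumption on my part rather than Session's actual code, but it shows the trade-off you're tuning: more dampening means less jitter but more lag.

```python
# Illustration only: a common exponential-smoothing approach to dampening
# jittery tracking values (assumed, not taken from Session's source).
def dampen(samples, factor):
    """Smooth a stream of tracked values.

    factor = 1 passes samples through unchanged; larger factors move the
    output a smaller step toward each new sample, trading jitter for lag.
    """
    value = samples[0]
    smoothed = []
    for s in samples:
        value += (s - value) / factor  # step part-way toward the new sample
        smoothed.append(value)
    return smoothed

# A jittery signal flattens out as the factor rises:
# dampen([0, 10, 0, 10], 2) -> [0.0, 5.0, 2.5, 6.25]
```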

From here you're good to go. Follow the first Mannequin tutorial I mentioned above for how to connect Session up to OBS.

Note that Session will allow you to open more than one avatar at a time, and they will all move in unison from the same motion capture feed. I presume it must be possible to have multiple motion capture sources so that, potentially, two or more people could operate separate avatars on one live stream.

My Cartoon Animator TET Avatar

Just to finish up this series on VTuber software and Inochi2D, I'll give you a quick update on my own TET Cartoon Animator avatar that I was trying to rig in this software.

While I never rigged the full character for Inochi2D, I was able to test the mouth sprite switching rig I was working on in part 1 of this series in Session.

Initially it worked the way it did in Inochi2D Creator, with the mouth shapes dissolving between sprites. This obviously wasn't usable; even though the dissolve was quick, it was very noticeable.

I then started playing around with the 'tracking out' settings in Session for the Mouth Open tracking and fixed it simply by setting the second tracking out number to one or higher. The mouth sprites then switched exactly as expected and looked great.

Comparison of the effect of increasing the tracking out mouth open value in Inochi2D Session.
On the left is my TET Avatar in mid mouth sprite change with the crossfade effect. Increasing the Tracking out maximum to one or more resolved this for a clean, instant sprite switch.
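My reading of why this works, which is an assumption rather than anything confirmed from Session's documentation: the tracking in/out values appear to linearly remap the raw tracked range onto the parameter's range, and the sprite crossfade happens while the mapped value sits between two keyframes. Pushing the out maximum to one or more means even a partly open tracked mouth maps past the fully open keyframe (where the parameter presumably clamps), so the sprite switches cleanly instead of lingering mid-dissolve. A sketch of that remapping:

```python
# Assumed behaviour of a 'tracking in/out' mapping, for illustration only.
def remap(value, in_min, in_max, out_min, out_max):
    """Linearly map a tracked value from the input range onto the
    parameter's output range (no clamping applied here)."""
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

# With out max = 1, a 60%-open tracked mouth lands mid-crossfade:
# remap(0.6, 0.0, 1.0, 0.0, 1.0) -> 0.6
# With out max = 2, the same input overshoots the fully open keyframe
# (and would be clamped there), giving an instant sprite switch:
# remap(0.6, 0.0, 1.0, 0.0, 2.0) -> 1.2
```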

I didn't take my own Cartoon Animator TET Avatar any further for this blog post series because I need a lot of time just to tinker with how to actually rig it, and to learn how the various options correlate to the motion capture (not to mention adding in auto motions); time I just don't have in the space of a couple of weeks.

If you want to see what's possible with an Inochi2D Avatar try one of Inochi2D's demo models in Session. Hook all the tracking up to the various parameters. Or even try a female Mannequin avatar in Session (since it will already be rigged and ready to go).

The thing to take away is that if you want to use character templates from Cartoon Animator as a source of sprites for an Inochi2D avatar, it is certainly possible, and you can sprite switch the mouth rather than creating a new mouth in the more typical VTuber style of deforming the lips in front of an interior mouth sprite.

If I get my avatar up and running I will no doubt add a third post to this blog series. For now, there is enough here to make up for the lack of documentation for Session, and the documentation for Creator should be more than enough to help you rig your characters.

I think I've focused on VTuber software more than enough and it's time to move on to other animation and video related topics. 
