
Inochi2D - Free Open Source 2D VTuber Avatar Rigging and Puppeteering Software (Part 2 - Inochi2D Session)

My Cartoon Animator TET Avatar in Inochi2D Session.
In part one of my deep dive into the free VTuber software, Inochi2D, I focused mainly on Inochi2D Creator, which is used to rig your character avatar and save it in the correct file format for use with Inochi2D Session, the puppeteering half of the software.

Both sides of the software are still very much in development, and the documentation, particularly for Session, is very thin on the ground; so much so that I don't think I could even do a comprehensive tutorial, because I'm not sure I'm even doing things right, and the software could change significantly in a single update.

As a result, in this part of my Inochi2D deep dive I'm changing tack: rather than presenting my finished Cartoon Animator TET Avatar, I'll be summarizing my experience of getting Session up and running using OpenSeeFace as the recommended webcam motion capture software.

To do this I will be using the TET avatar I created in my review of Mannequin, since that can be exported as a full, ready-to-go rig for use with Session, bypassing Inochi2D Creator altogether. If you want to give this a try, the free version of Mannequin is all you need.


Before You Start, Review Your Mannequin Avatar

In the course of writing this article I came to realize that, if you want your Mannequin avatar to be fully motion capture rigged for Inochi2D Session, you need to make sure you use assets that are pre-rigged for Inochi2D. This means, when putting your avatar together, you should specifically filter the various asset galleries by the Inochi2D logo. For example, there is only one mouth in Mannequin you can use if you want your character to lip sync in Session.

Selecting an Inochi2D Session compatible mouth in Mannequin.
There is only one mouth asset in Mannequin that is pre-rigged for lip sync in Inochi2D Session. If any of your avatar's facial features are not being tracked in Session, chances are it's because you've selected an asset that isn't pre-rigged. Simply go back into Mannequin, change it, then re-import your avatar into Session.

Setting Up My Mannequin TET Avatar in Session

To get my character working in Session I followed the video tutorial (below) for Mannequin on how to set up your character in Inochi2D Session using a webcam.

Note that this tutorial is a continuation of a previous Mannequin video tutorial; watch the end of that earlier tutorial for instructions on how to connect Session to OBS for livestreaming.

Exporting my TET avatar from Mannequin as an INX file for OpenSeeFace.
On the Export tab of Mannequin these are the settings you need to pay attention to.
Export Your Character

The first step is to export your finished character from Mannequin. Set the file type to .INX for Inochi2D. You'll then have a choice of which motion capture method you want to use.

We'll be using OpenSeeFace, which is for webcam tracking. OpenSeeFace only tracks your face and head movement, though with the female Mannequin avatar it does appear to track something called 'body roll', which gives some upper body movement.

Since the male body avatar is still a work in progress I presume body roll will be added eventually.


Download and Run OpenSeeFace

Once you've downloaded OpenSeeFace, extract the zip file into a folder. You don't actually install it; you just run it from the folder. Windows may try to stop you, but I can assure you it's perfectly safe to let it run.

The tutorial will tell you to go into OpenSeeFace's Binary folder and run the file 'facetracker.exe'. Presumably this is supposed to immediately start tracking your movements through your webcam; however, it did not work for me. I could tell the software wasn't tracking anything when I ran the file.

This is because my webcam isn't the default camera on my system. I have several virtual cameras installed that take data from various applications that use my webcam. You might typically run into this if you have any application installed that applies a filter to your webcam before sending out an image.

If you want to choose your camera and, as an added bonus, see a preview window of your webcam with tracking dots, run the file 'run.bat' from the same Binary folder instead. Follow the prompts to select your camera and frames per second (entering -1 when prompted should use the default camera settings; I selected 30 frames per second).

OpenSeeFace Motion tracking and Preview Window.
If you run OpenSeeFace with the 'run.bat' file, once you've answered the prompt questions to choose and set up your webcam, you'll get two preview windows like this showing you how the motion tracking is working.
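If you're not sure which camera number corresponds to your physical webcam when 'run.bat' asks, one quick way to check outside of OpenSeeFace is a short Python script using OpenCV (this assumes you have Python and the opencv-python package installed; the index numbering may not match OpenSeeFace's own camera list exactly, but it will at least show how many cameras your system exposes and which ones actually deliver frames):

```python
# Hypothetical helper, not part of OpenSeeFace: probe the first few camera
# indices with OpenCV (pip install opencv-python) to see which ones work.
import cv2

def list_cameras(max_index=5):
    """Report which camera indices can be opened and return frames."""
    for index in range(max_index):
        cap = cv2.VideoCapture(index)
        if cap.isOpened():
            ok, _frame = cap.read()
            status = "working" if ok else "opens but returns no frames"
            print(f"Camera {index}: {status}")
        else:
            print(f"Camera {index}: not available")
        cap.release()

if __name__ == "__main__":
    list_cameras()
```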

Open Inochi2D Session

Once you have motion capture running, open Inochi2D Session. The only way to add your avatar is to drag it from its folder onto Session's window. If you want to remove your avatar from the Session stage, click and drag it to the bottom left corner of Session's window, over the trashcan icon. (The icon may vanish, but you can still remove avatars by dragging them to where the trashcan should be.)

To position your avatar in the window, click and drag it with the mouse pointer. To resize your avatar, the only way I could find was to hold down the left mouse button over the avatar and spin the mouse's scroll wheel.


Set Up Inochi2D Session's Environment

When Session starts, a bunch of settings windows are already open (stacked on top of each other). If not, you can turn them on under the View menu. To dock the tracking window, click on its title bar and drag it over one of the side tabs that appear, attaching it to the side of the window. You'll be using this window a lot to fine tune your model.

Through the Scene settings window you can add a background to your scene, or make it one of four solid colors you may want to chroma key out in OBS, so you can have your avatar stand in front of things.

Inochi2D Session's Virtual Space Window.
Inochi2D Session's Virtual Space window set up for OpenSeeFace.
Finally you'll want to go into the View menu and select 'Virtual Space' under Configuration. This is where you link Session to OpenSeeFace.

If this is your first time using Session, you'll need to set up a space by typing any name into the box and hitting the '+' button. Now select the space you just created; you'll see it appear alongside a plus button for you to click.

Click it and select OpenSeeFace from the drop-down box. Then enter 11573 into the osf_bind_port box and 0.0.0.0 into the osf_bind_ip box. Click Save Changes, then click Save on the whole Virtual Space window.

If you're standing in front of your camera you should see your avatar starting to react to your movements. If you're not seeing any action in the tracking box, click on your avatar to select it, and you should see the tracking box (and blend shapes window) light up with info.
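If nothing lights up at all, it's worth confirming that OpenSeeFace is actually sending data before fiddling with Session. As far as I can tell from these settings, OpenSeeFace sends its tracking data as UDP packets to port 11573, so a small Python sketch (my own debugging aid, nothing official) can listen on that port and report whether packets are arriving. Close Session first, since only one program can bind the port at a time:

```python
# Minimal sketch: listen on the UDP port OpenSeeFace sends tracking data to
# (11573 by default) and confirm that packets are arriving at all.
# Close Inochi2D Session before running this, or the port will be in use.
import socket

OSF_BIND_IP = "0.0.0.0"   # listen on all interfaces, like Session's osf_bind_ip
OSF_BIND_PORT = 11573     # OpenSeeFace's default port, as entered in osf_bind_port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind((OSF_BIND_IP, OSF_BIND_PORT))
sock.settimeout(5.0)

print(f"Waiting for OpenSeeFace packets on {OSF_BIND_IP}:{OSF_BIND_PORT} ...")
try:
    for _ in range(10):
        data, addr = sock.recvfrom(65535)
        # Each packet is a binary blob of tracking data; we only check arrival.
        print(f"Received {len(data)} bytes from {addr}")
except socket.timeout:
    print("No packets received - check that facetracker/run.bat is running.")
finally:
    sock.close()
```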


Fine Tune Your Tracking Settings

The final step is to fine tune all the tracking settings to minimize the amount of jitter you may be seeing. This is done by increasing the Dampen setting for each attribute. Once you're happy, click the Save to File button at the top of the tracking window to save all the tracking info with your avatar (so you won't have to adjust the tracking each time).

TET and Mia Mannequin Avatars in Inochi2D Session with tracking information.
Once OpenSeeFace is connected you should see tracking information for the currently selected avatar appear. Use the tracking window to fine tune the Dampen settings to remove jitter from the motion capture.
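As an aside, Session doesn't document exactly how the Dampen value is applied, but jitter reduction in tracking pipelines is usually some form of smoothing, so the trade-off you're tuning is less jitter versus a little more lag. Purely as a hypothetical illustration of that idea (this is not Session's actual code):

```python
# Hypothetical illustration only - not Session's actual Dampen implementation.
# Exponential smoothing: a higher "dampen" value follows the raw tracking
# signal more slowly, which removes jitter but adds a little lag.
def smooth(raw_values, dampen):
    """dampen ranges from 0 (no smoothing) to just under 1 (very heavy)."""
    smoothed = []
    current = raw_values[0]
    for value in raw_values:
        current = dampen * current + (1.0 - dampen) * value
        smoothed.append(current)
    return smoothed

# A jittery "head yaw" signal hovering around 10 degrees:
raw = [10.0, 10.4, 9.7, 10.3, 9.8, 10.2, 9.9]
print(smooth(raw, dampen=0.8))  # noticeably flatter than the raw input
```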

From here you're good to go. Follow the first Mannequin tutorial I mentioned above for how to connect Session up to OBS.

Note that Session will allow you to open more than one avatar at a time, and they will both move in unison from the same motion capture feed. I presume it must be possible to have multiple motion capture sources so that you can, potentially, have two or more people operating separate avatars on one live stream.

My Cartoon Animator TET Avatar

Just to finish up this series on VTuber software and Inochi2D, I'll give you a quick update on my own Cartoon Animator TET Avatar that I was trying to rig in this software.

While I never rigged the full character for Inochi2D, I was able to test in Session the mouth sprite switching rig I was working on in part 1 of this series.

Initially it worked the way it did in Inochi2D Creator, with the mouth shapes dissolving between sprites. That obviously wasn't usable; even though the dissolve was quick, it was very noticeable.

I then started playing around with the 'tracking out' settings in Session for the Mouth Open Tracking and fixed it just by switching the second tracking out number to one or higher. The mouth sprites switched exactly as expected and looked great.

Comparison of the effect of increasing the tracking out mouth open value in Inochi2D Session.
On the left is my TET Avatar in mid mouth sprite change with the crossfade effect. Increasing the Tracking out maximum to one or more resolved this for a clean, instant sprite switch.
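I haven't found documentation for exactly how Session applies the tracking in/out ranges, but from the behaviour it looks like the incoming tracking value gets remapped onto the parameter's output range. As a hedged sketch of that idea (the function and numbers below are my guess at what's happening, not Session's code), raising the second tracking out value lets the mouth parameter actually reach the fully open sprite instead of stalling mid blend:

```python
# Hypothetical sketch of a "tracking in" -> "tracking out" remap.
# The real Session implementation may differ; this only illustrates why
# raising the second tracking out value to 1 or more completes the switch.
def remap(value, in_min, in_max, out_min, out_max):
    """Clamp value to [in_min, in_max], then map it onto [out_min, out_max]."""
    value = max(in_min, min(in_max, value))
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

# Mouth open tracking arrives roughly in the 0 to 1 range.
# With a low out_max the parameter never reaches the "open" sprite,
# so the crossfade between sprites stays visible:
print(remap(0.9, 0.0, 1.0, 0.0, 0.5))   # 0.45 - stuck mid blend
# With out_max at 1 (or higher) the sprite switch completes cleanly:
print(remap(0.9, 0.0, 1.0, 0.0, 1.0))   # 0.9 - reaches the open sprite
```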

I didn't take my own Cartoon Animator TET Avatar any further for this blog post series because I need a lot of time to just tinker around with how to actually rig it, and to learn how the various options correlate to the motion capture (not to mention adding in auto motions). Time I just don't have in the space of a couple of weeks.

If you want to see what's possible with an Inochi2D avatar, try one of Inochi2D's demo models in Session and hook all the tracking up to the various parameters. Or even try a female Mannequin avatar in Session (since it will already be rigged and ready to go).

The thing to take away is that, if you want to use character templates from Cartoon Animator as a source of sprites for an Inochi2D avatar, it is certainly possible, and you can sprite switch the mouth rather than creating a new mouth in the more typical VTuber style of deforming the lips in front of an interior mouth sprite.

If I get my avatar up and running I will no doubt add a third post to this blog series. For now, there is enough here to make up for the lack of documentation for Session, and the documentation for Creator should be more than enough to help you rig your characters.

I think I've focused on VTuber software more than enough and it's time to move on to other animation and video related topics. 
