
Inochi2D - Free Open Source 2D VTuber Avatar Rigging and Puppeteering Software (Part 2 - Inochi2D Session)

My Cartoon Animator TET Avatar in Inochi2D Session.
In part one of my deep dive into the free VTuber software, Inochi2D, I focused mainly on Inochi2D Creator, which is used to rig your character avatar and export it in the correct file format for use with Inochi2D Session, the puppeteering half of the software.

The two sides of the software are still very much in development, and the documentation, particularly for Session, is very thin on the ground. It's to the point where I don't think I could even do a comprehensive tutorial: I'm not sure I'm even doing things right, and the software could change significantly in a single update.

As a result, in this part of my Inochi2D deep dive I'm changing tack: rather than presenting my finished Cartoon Animator TET Avatar, I'll be summarizing my experience of getting Session up and running with OpenSeeFace, the recommended webcam motion capture software.

To do this I will be using the TET avatar I created in my review of Mannequin, since that can be exported as a full, ready-to-go rig for use with Session, bypassing Inochi2D Creator altogether. If you want to give this a try, the free version of Mannequin is all you need.

Before You Start, Review Your Mannequin Avatar

In the course of writing this article I came to realize that if you want your Mannequin avatar to be fully motion capture rigged for Inochi2D Session, you need to make sure you use assets that are pre-rigged for Inochi2D. This means, when putting your avatar together, you should specifically filter the various asset galleries using the Inochi2D logo. For example, there is only one mouth in Mannequin you can use if you want your character to lip sync in Session.

Selecting an Inochi2D Session compatible mouth in Mannequin.
There is only one mouth asset in Mannequin that is pre-rigged for lip sync in Inochi2D Session.
If any of your avatar's facial features are not being tracked in Session, chances are it's because you've selected an asset that isn't pre-rigged. Simply go back into Mannequin, change it, then re-import your avatar into Session.

Setting Up My Mannequin TET Avatar in Session

To get my character working in Session I followed the video tutorial (below) for Mannequin on how to set up your character in Inochi2D Session using a webcam.

Note that this tutorial is a continuation of a previous Mannequin video tutorial that you should watch the end of for instructions on how to connect Session to OBS for livestreaming.

Exporting my TET avatar from Mannequin as an INX file for OpenSeeFace.
On the Export tab of Mannequin these are the settings you need to pay attention to.
Export Your Character

The first step is to export your finished character from Mannequin. Set the file type to .INX for Inochi2D. You'll then need to choose which motion capture method you want to use.

We'll be using OpenSeeFace, which works with a webcam. OpenSeeFace only tracks your face and head movement, though with the female Mannequin avatar it does appear to track something called 'body roll', which gives some upper body movement.

Since the male body avatar is still a work in progress I presume body roll will be added eventually.

Download and Run OpenSeeFace

Once you've downloaded OpenSeeFace, extract the zip file into a folder. You don't actually install it; just run it from the folder. Windows may try to stop you, but I can assure you it's perfectly safe to let it run.

The tutorial will tell you to go into OpenSeeFace's Binary folder and run the file 'facetracker.exe'. Presumably this is supposed to immediately start tracking your movements through your webcam; however, it did not work for me. I could tell the software wasn't tracking anything when I ran the file.

This is because my webcam isn't the default camera on my system. I have several virtual cameras installed that take their feed from applications that use my webcam. Typically you might run into this if you have any application installed that applies a filter to your webcam before sending out an image.

If you want to choose your camera and, as an added bonus, see a preview window of your webcam with tracking dots, run the file 'run.bat' from the same Binary folder instead. Follow the prompts to select your camera and frames per second (entering -1 for the default camera settings, when prompted, should work fine; I selected 30 frames per second).

OpenSeeFace Motion tracking and Preview Window.
If you run OpenSeeFace with the 'run.bat' file, once you've answered the prompt questions to choose and set up your webcam, you'll get two preview windows like this showing you how the motion tracking is working.

Open Inochi2D Session

Once you have motion capture running, open Inochi2D Session. The only way to add your avatar is to drag its file from its folder onto Session's window. If you want to remove your avatar from the Session stage, click and drag it to the bottom left corner of Session's window, over the trashcan icon (the icon may vanish, but you can still remove avatars by dragging them to where it should be).

To position your avatar in the window, click and drag it with the mouse pointer. To resize your avatar, the only way I could find was to hold down the left mouse button over the avatar and spin my mouse's scroll wheel.

Set Up Inochi2D Session's Environment

When Session starts, a bunch of settings windows are already open (stacked on top of each other). If not, you can turn them on under the View menu. To dock the tracking window, click on its title bar and drag it over one of the side tabs that appear, and it will attach to the side of the window. You'll be using this window a lot to fine tune your model.

Through the Scene settings window you can add a background to your scene, or make it one of four colors you may want to chroma key out in OBS so you can have your avatar stand in front of things, etc.

Inochi2D Session's Virtual Space Window.
Inochi2D Session's Virtual Space window set up for OpenSeeFace.
Finally you'll want to go into the View menu and select 'Virtual Space' under Configuration. This is where you link Session to OpenSeeFace.

If this is your first time using Session you'll need to set up a space by typing any name into the box and hitting the '+' button. Now select the space you just created; you'll see it appear with a plus button next to it for you to click.

Click it and select OpenSeeFace from the drop down box. Then enter 11573 into the osf_bind_port box and, into the osf_bind_ip box, the address Session should listen on for tracking data (0.0.0.0, which listens on all interfaces, should work). Click Save Changes, then click Save on the whole Virtual Space window.
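If Session doesn't seem to be receiving anything, it can help to confirm that OpenSeeFace is actually broadcasting. OpenSeeFace streams its tracking data as UDP packets to the port you enter above (11573). The minimal Python listener below is my own debugging aid, not part of either program: it binds a port and reports whether any data arrives. Run it instead of Session (close Session first, since only one program can bind the port at a time):

```python
import socket

# OpenSeeFace streams tracking data as UDP datagrams; Session binds the
# port you entered as osf_bind_port (11573 in the setup above).
OSF_PORT = 11573

def wait_for_packet(port: int, timeout: float = 5.0) -> bool:
    """Return True if any UDP datagram arrives on `port` within `timeout` seconds."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        # bind on all interfaces, like Session does with osf_bind_ip 0.0.0.0
        sock.bind(("0.0.0.0", port))
        try:
            data, addr = sock.recvfrom(65535)
            print(f"Got {len(data)} bytes from {addr}")
            return True
        except socket.timeout:
            return False
```

If it prints a byte count while facetracker is running, OpenSeeFace is fine and the problem is on the Session side; if it times out, revisit your camera selection in run.bat.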

If you're standing in front of your camera you should see your avatar start to react to your movements. If you're not seeing any action in the tracking box, click on your avatar to select it, and you should see the tracking box (and blend shapes window) light up with info.

Fine Tune Your Tracking Settings

The final step is to fine tune all the tracking settings to minimize the amount of jitter you may be seeing. This is done by increasing the Dampen setting for each attribute. Once you're happy, click the Save to File button at the top of the tracking window to have all the tracking info saved with your avatar (so you won't have to adjust the tracking again each time).

TET and Mia Mannequin Avatars in Inochi2D Session with tracking information.
Once OpenSeeFace is connected you should see tracking information for the currently selected avatar appear. Use the tracking window to fine tune the Dampen settings to remove jitter from the motion capture.
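If you're curious what the Dampen setting is actually doing: I don't know Session's internal implementation, but the behaviour it produces (higher values mean smoother but laggier motion) is what you'd get from simple exponential smoothing. A rough sketch of the idea, purely for illustration and not Inochi2D's actual code:

```python
def dampen(samples, strength):
    """Smooth a noisy stream of tracking values.

    `strength` is in [0, 1): 0 passes values through unchanged; values
    closer to 1 smooth harder but make the avatar lag behind you.
    A sketch of the concept only -- not Inochi2D Session's real code.
    """
    smoothed = []
    current = None
    for value in samples:
        if current is None:
            current = value
        else:
            # keep `strength` of the old value, blend in the rest of the new one
            current = strength * current + (1.0 - strength) * value
        smoothed.append(current)
    return smoothed
```

At strength 0 the raw jitter passes straight through; at higher strengths a single-frame spike is mostly absorbed, at the cost of the avatar trailing your real movement slightly.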

From here you're good to go. Follow the first Mannequin tutorial I mentioned above for how to connect Session up to OBS.

Note that Session will let you open more than one avatar at a time, and they will all move in unison from the same motion capture feed. I presume it must be possible to have multiple motion capture sources so that, potentially, two or more people can operate separate avatars on one live stream.

My Cartoon Animator TET Avatar

Just to finish up this series on VTuber software and Inochi2D I will give you a quick update on my own TET, Cartoon Animator Avatar, that I was trying to rig in this software.

While I never rigged the full character for Inochi2D, I was able to test the mouth sprite switching rig I was working on in part 1 of this series in Session.

Initially it worked the way it did in Inochi2D Creator, with the mouth shapes dissolving between sprites. That obviously wasn't usable; even though the dissolve was quick, it was very noticeable.

I then started playing around with the 'tracking out' settings in Session for the Mouth Open tracking and fixed it just by setting the second Tracking Out number to one or higher. The mouth sprites then switched exactly as expected and looked great.

Comparison of the effect of increasing the tracking out mouth open value in Inochi2D Session.
On the left is my TET Avatar in mid mouth sprite change with the crossfade effect. Increasing the Tracking Out maximum to one or more resolved this for a clean, instant sprite switch.
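My understanding of the Tracking In/Out numbers is that they remap the raw tracking range onto the parameter's output range. If that's right, pushing the out maximum to one or more means the parameter saturates well before the tracked value peaks, so the sprite switch threshold is crossed in a single step rather than mid-crossfade. A hypothetical sketch of that remapping (my guess, not Session's actual code, and the function name is my own):

```python
def remap(value, in_min, in_max, out_min, out_max):
    """Map a raw tracking value from [in_min, in_max] to [out_min, out_max],
    clamping the result to a parameter's valid 0..1 range.

    Illustrates one plausible reading of Session's Tracking In/Out settings.
    """
    if in_max == in_min:
        t = 0.0  # degenerate input range; avoid dividing by zero
    else:
        t = (value - in_min) / (in_max - in_min)
    out = out_min + t * (out_max - out_min)
    # assume parameters like Mouth Open are clamped to 0..1
    return max(0.0, min(1.0, out))
```

Under this model, with an out maximum of 2.0 the parameter hits 1.0 when the mouth is only half open, which would explain why raising the second number made the sprite switch feel instant.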

I didn't take my own Cartoon Animator TET Avatar any further for this blog post series because I need a lot of time to just tinker around with how to actually rig it, and to learn how the various options correlate to the motion capture (not to mention adding in auto motions). Time I just don't have in the space of a couple of weeks.

If you want to see what's possible with an Inochi2D Avatar try one of Inochi2D's demo models in Session. Hook all the tracking up to the various parameters. Or even try a female Mannequin avatar in Session (since it will already be rigged and ready to go).

The thing to take away is that if you want to use character templates from Cartoon Animator as a source of sprites for an Inochi2D avatar, it's certainly possible, and you can sprite switch the mouth rather than creating a new mouth in the more typical VTuber style of deforming the lips in front of an interior mouth sprite.

If I get my avatar up and running I will no doubt add a third post to this blog series. For now, there is enough here to make up for the lack of documentation for Session, and the documentation for Creator should be more than enough to help you rig your characters.

I think I've focused on VTuber software more than enough and it's time to move on to other animation and video related topics. 

