Hewwo
#vtuber
#ENVtuber
My Android precision face tracker, MeowFace, is finally out on the Play Store.
Here's a tutorial on how to use it to get ARKit-like blendshape tracking in VSeeFace.
Thanks to the addition of Spout2 to VTube Studio it's now very easy to get a 2D avatar in
#VNyan
through a Spout2 receiver prop and use most of the redeems just fine~
Are you a 2D VTuber wanting to add some interactive redeems to your streams?
Did you know that
#VNyan
has supported VTube Studio, Inochi2D, and face camera streams for almost 2 years already?
It's super easy to bring any 2D image into VNyan through the built-in Spout2 props
Hewwo
#Vtuber
#ENVtuber
I just released VNyan, the Node Graph-interface 3D VTuber application to take your streams to the next level~
Check the release & tutorial video here
To celebrate the big UI/UX redesign milestone the
#VNyan
community helped me put together a showcase reel of some of the awesome things you can do with the app~
Enjoy~ 💕
#VTuber
#VTuberUprising
While waiting for the Play Store to approve the Android tracker, here's a side-by-side comparison between it on the left and iFacialMocap (iPhone) on the right.
The phones used were a Galaxy S20 and an iPhone XR
Sneak peek of the new VTuber App that I've been working on recently~ 💕
Supports VRM and vsfavatar files mostly~
Might be ready for Beta testing in a couple of weeks
#VNyan
1.08 introduces the Pendulum Physics-system for creating eye wobbles.
The release also fixes several bugs and contains improvements to anti-aliasing and bloom.
Download:
Side by side comparison of the ARKit tracking in
#VNyan
using Web Camera and iPhone.
While it doesn't quite reach the same quality, it's still way better than no ARKit at all
With some tweaking of the expression settings, the quality could be improved even further
#VNyan
1.3.1 adds one of the popular community-created Node Graphs as a built-in feature to allow more expressive, Live2D-style body movement.
It also adds a new post-processing effect and a lot of quality-of-life features, such as an object picker
Enjoy~ 💕
Just tested LeapMotion 2 in
#VNyan
and I must say the tracking is way, way better and has a much wider range than the first version.
The device was used with a neck holder
#VNyan
1.1.5 introduces experimental Web Camera tracking with ARKit and simple blendshape support
This finally makes VNyan a full-fledged VTuber App with its own tracking. Enjoy cuties~ 💕
As shown on stream today, VNyan's parameter system allows adding eye jiggle even to normal VRoid VRM files, as long as the necessary highlight movement blendshape is added.
Testing 3D world loading with VNyan. You'll be able to create worlds in Unity and load them as a background for your avatar.
Also testing a microphone volume monitoring node, which allows effects to trigger based on the microphone volume level.
#VNyan
1.3.2 adds the popular arm sway and breathing motion community graphs as easy built-in features. You can even connect your heart rate to the breathing movement speed.
The update also improves web camera detection.
Enjoy~ 💕
#VTuber
#VRM
Working on a new hand tracking app for artists etc~
It sends cursor movements through the VMC protocol to, for example, VSeeFace. Still needs a little bit more work before it's ready tho~ 💕
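A rough sketch of the idea behind mapping the cursor to a hand position before it goes out over the VMC protocol. This is not the actual app code; the function name, base position, and reach values are all made up for illustration:

```python
def cursor_to_hand_position(x, y, screen_w, screen_h,
                            reach=0.4, base=(0.0, 1.2, 0.3)):
    """Map a 2D cursor position (pixels) to a 3D hand-bone position
    floating in front of the avatar. All numbers are assumptions."""
    nx = (x / screen_w) * 2.0 - 1.0   # -1..1, left to right
    ny = 1.0 - (y / screen_h) * 2.0   # -1..1, bottom to top
    bx, by, bz = base
    return (bx + nx * reach, by + ny * reach, bz)

# Cursor in the middle of an 800x600 screen -> the base hand position
pos = cursor_to_hand_position(400, 300, 800, 600)
```

The resulting position would then be sent to the receiving app each frame as a bone transform.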
This upcoming
#VNyan
feature may be of interest to all you 3d modeller types~
Camera Angle Blendshapes
The app tracks the angle between the avatar's face and the camera, and applies it to 4 camera blendshapes, in case you want to change how your avatar's mouth etc. looks based on the angle
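A minimal sketch of how angles could map onto the 4 weights. Not the actual VNyan code; the blendshape names and the 45° range are assumptions:

```python
def camera_angle_weights(yaw_deg, pitch_deg, max_angle=45.0):
    """Turn the face-to-camera yaw/pitch angles into four 0..1
    blendshape weights (left/right/up/down). Names are hypothetical."""
    def clamp01(v):
        return max(0.0, min(1.0, v))
    return {
        "CameraLeft":  clamp01(-yaw_deg   / max_angle),
        "CameraRight": clamp01( yaw_deg   / max_angle),
        "CameraUp":    clamp01( pitch_deg / max_angle),
        "CameraDown":  clamp01(-pitch_deg / max_angle),
    }

# Facing 22.5 degrees to the right of the camera, level pitch:
w = camera_angle_weights(22.5, 0.0)
```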
Prototyping ARKit face tracking using an Android phone. Not quite as good as FaceID but pretty decent so far.
Image is from Face2VMC (was iFacialMocap2VMC)
NyartTracker 0.1.2, the cursor-to-VMC-protocol tracker, adds a small update: the non-tracked hand now stays in the default pose instead of the pen-holding pose~
The Android precision face tracker, MeowFace, is now waiting for review on the Play Store, so hopefully I'll have it up soonish~
Will make a tutorial once it's up there~
My Twitch Integration for
#VRChat
and
#VSeeFace
is almost ready for the first test release~ Hopefully getting it done by the end of the week.
The first release will support channel points, bits, and chat commands; subs, follows, etc. will come later~
Valentine's Day is coming soon so here's some wiggly heart blob throwables and droppables for
#VNyan
💕5 Colors
💕Sticky and non-sticky versions
Get them for free from my ko-fi:
#freeVTuberAssets
#VTuberAssets
The VR Tracking support in
#VNyan
is coming along nicely~ 💕
Still a lot of work and fine tuning of the tracking result left before public release though~
#VNyan
1.0.19 adds support for reading virtual and physical trackers from the VMC protocol. The trackers can then be linked to props.
It also adds several Leap Motion updates, including Screentop mode
Having worked on
#VNyan
for nearly 2 years, I keep forgetting we have support for certain features until someone requests them again.
For example, this leaning-forward feature has been in the app for over a year. I've never used it myself, but now it's definitely going into the ASMR streams
The upcoming version of
#VNyan
will increase the shadow precision by quite a bit, to the point of making contact shadows unneeded in some cases.
This is a comparison video of the current mode with contact shadows vs the new one without 💕
Some new stuff coming to
#VNyan
soon~
Expression Mapper for toggling blendshapes based on combinations of other blendshape values.
Also various rain effects~
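The core idea of the Expression Mapper can be sketched like this. This is not the app's implementation; the function shape, blendshape names, and thresholds are made up for illustration:

```python
def expression_mapper(values, conditions, output):
    """Return {output: 1.0} when every listed input blendshape meets
    its threshold, else {output: 0.0}. A toy model of the idea."""
    active = all(values.get(name, 0.0) >= threshold
                 for name, threshold in conditions.items())
    return {output: 1.0 if active else 0.0}

# e.g. toggle a hypothetical "AngryFace" shape on when both brows
# are lowered past 0.6 at the same time
state = expression_mapper(
    {"browDownLeft": 0.8, "browDownRight": 0.7},  # current tracking values
    {"browDownLeft": 0.6, "browDownRight": 0.6},  # required thresholds
    "AngryFace",
)
```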
My VTuber precision face tracker, MeowFace, has reached 7.5k concurrent installs. Thank you so much for trying out the app, everyone~ I can't believe it's reaching such a wide audience. 💕
I promise to work on an update once VNyan's release is out of the way~
Lots of you have recently been talking about the impact you've made on VTubing.
Personally, I hope I've managed to make 3D VTubing a little bit more accessible for everyone, whether that's with tutorials or software. And it's not like I'm done yet~ 💕
I get a lot of questions asking what VNyan is and what it does. It's difficult to answer, as it all depends on your imagination.
However, here's a clip showing how quick and easy it is to create, for example, a Water Blast redeem
#VNyan
1.2.0 adds the long-awaited YouTube support to the application.
It also introduces a multitude of new features, such as a built-in Look At camera and various new Post Processing effects.
The update and more detailed changes are available on the itch-page as always~ 💕
Another sneak peek of VNyan, the new VTuber App 💕
Showcasing an idle loop with 2 timers to wiggle ears and do a random redeem every couple of seconds.
As you can see, pretty complicated effects are also doable if you want to
The app is also almost ready for beta testing~ 💕
Testing the Vive Ultimate Tracker PCVR Beta. Combining a single Ultimate Tracker for head tracking with 2 Index controllers for hands and sending the data to VNyan through VMC.
The Index headset was inactive on the side table.
Works surprisingly well, to be honest
#VNyan
1.2.6 brings several updates to the Bubble Shooter-node including stickiness and GIF-image support.
Other features include web camera hand mirroring and virtual camera background selector.
Enjoy~ 💕
#VNyan
1.2.2 introduces a lot of community requested features such as Virtual Camera-support and VMC-smoothing.
It also adds and updates Worlds that come with the app.
The update is up on itch as usual~ 💕
#VNyan
1.2.1 introduces a redone User Interface and Startup Wizard to make the application easier for new users.
It also adds Modding/Plugin support to allow creation of new features and custom Unity scripts 💕
The upcoming version of
#VNyan
is going to add a resource browser for easy discoverability of assets for the app
If you want your asset links added to the browser, you can already submit them in the Discord~ 💕
#VNyan
1.0.9 introduces experimental ARKit support through VTube Studio (iPhone), iFacialMocap and MeowFace.
The tracking can be combined with VMC-protocol as well.
Download:
#VNyan
1.0.16 introduces the new Expression Mapper that allows you to map, for example, ARKit expressions to blendshapes.
The new version also adds some IK fixes and a new rain effect
Tutorial link is in the comments~ 💕
Prototyping VNyan's Pulsoid integration~ You'll be able to build Node Graphs that react to your Heart Rate.
In this test graph the higher the heart rate the more wind affects the avatar. You could make it adjust blendshape values or do whatever you want~
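The heart-rate-to-wind mapping in a test graph like this boils down to a clamped linear remap. A minimal sketch, where the BPM range and wind scale are assumptions, not values from the app:

```python
def heart_rate_to_wind(bpm, bpm_min=60.0, bpm_max=140.0,
                       wind_min=0.0, wind_max=1.0):
    """Linearly remap heart rate (BPM) to a wind-strength value,
    clamped to the output range. All ranges are assumptions."""
    t = (bpm - bpm_min) / (bpm_max - bpm_min)
    t = max(0.0, min(1.0, t))
    return wind_min + t * (wind_max - wind_min)
```

The same remap could just as easily drive a blendshape value or any other node input instead of wind.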
The LeapMotion-support in
#VNyan
will allow node graphs to react when the tracking status of either hand changes.
This allows, for example, automatically swapping to some arm sway or similar when hand tracking is lost
Hey
#VTuber
#ENVtuber
💕
The first test version of my Twitch Integration for
#VRChat
and VSeeFace has been released with a tutorial on how to use it with both VSF SDK and VRChat SDK~
Enjoy~ 💕
The tutorial and app can be found at:
#VNyan
1.0.13 finally introduces the experimental MMD motion-support for both humanoid and camera motions.
Motions that use IK are sadly not supported.
Hewwo
#vtuber
#ENVtuber
💕 Many of you have been asking how to get the new VRoid working with Blender.
In today's tutorial I'll show an updated workflow on how to take your model from the new Stable VRoid to Blender for editing and then back to Unity.
#VNyan
1.2.8 adds support for NSFW content creation with a Chaturbate integration, as well as support for Lovense toys
It also adds a couple of new post-processing effects, including Motion Blur
Enjoy~ 💕
Since the video quality on this platform seems to be stuck at 270p, here's a screenshot showing the precision of the shadows in the next version of
#VNyan
Each finger now casts a shadow correctly, even without contact shadows on
What if one were to make a 3D model that was just a flat plane with a Spout Receiver, then project a Live2D avatar from VTube Studio onto it?
2D VTubers could become cardboard cutouts in vsfavatar-compatible VTuber Apps 🤣
Not sure if this needs to be said but
#VNyan
is not impacted by the new Unity price changes. The thresholds are very, very far away.
Of course, this kind of change is a huge loss of trust towards the engine and definitely encourages one to pick a different one for future projects
Audio Reactive shaders are coming to
#VNyan
soon through AudioLink-support.
This means that you'll be able to utilize AudioLink-capable shaders in Avatars, Worlds and props~
Hewwo cuties~ 💕
I got to test this relatively new Full Body Tracking solution from Uni-motion that just might be the answer 3D VTubers have been waiting for~
Link to the review video is in comments~
#vtuber
#vrchat
#VNyan
1.1.6 adds ARKit Adjustment settings to adjust blendshapes for both phone and web camera tracking
It also adds optimizations to VRM 1.0 performance in the app
I just released a small flag asset pack for
#VNyan
All the flags come with physics, and there's also a Unity package for making your own flag assets.
You can get it for free from my ko-fi:
Enjoy~ 💕
iFacialMocap2VMC 0.1.6 has been released~ 💕
It fixes a decimal conversion bug that occurred when eye bones were active. The bug may have prevented tracking data from being sent to VSeeFace in certain regions
The latest version is available, as always, at
#VNyan
1.1.0 introduces completely redone tracking layers, 4 VMC receivers, animation injection, camera angle blendshapes, etc.
The update is big so I urge you to reserve enough time to reconfigure your settings before your next stream~ Please do not try to update mid stream~ 💕
Some of you have been asking if it's possible to move your avatar around the 3D world in VNyan.
Not by default, but you can build a simple movement system using the Node Graph interface.
To take this further, you could also add a walking animation to your vsfavatar
Hewwo
#Vtuber
#ENVtuber
In today's tutorial, we get into the more advanced animation mechanics of VSeeFace to create animations that can be started through the VMC protocol. This works perfectly with Twitch integrations etc.
#VTuberUprising
💕VNyan Launch & Partner Anniversary Subathon💕
NOV 12 AT 11AM UTC
Let's have comfy and fun times together~ and then release VNyan - the game-changing 3D VTuber App~
Hope to see you there~💕💕
#ENVtuber
#Vtubers
#Vtuber
#VtuberUprising
#VNyan
1.3.5 introduces a new Resource Browser-feature allowing much easier discovery of various kinds of assets for the application.
It also adds a Cooldown-node that can help simplify a lot of the existing Node Graphs.
Enjoy~~ 💕
It's time~
#VTuber
#ENVtuber
Now 3D VTubers can also get those wobbly eye physics that only Live2D avatars have had so far.
In this tutorial I'll showcase how to use the new Pendulum Physics-system in
#VNyan
Check the tutorial at:
Cuties a reminder~
#VNyan
is a free app and available only on my itch-page. If you see someone selling it on some random website then it's a scam. Please do not download it. Stay safe!
I'm starting a new VNyan Basics tutorial series that'll cover some of the most asked questions about the app.
In this first tutorial we look into the new Wait-node that was just added to
#VNyan
Enjoy~ 💕
#VTuber
Don't blink or things explode~
Showcasing the VNyan blendshape-filter node's ability to trigger actions when specific blendshapes hit, or don't hit, specific values.
This node graph and an ARKit-compatible version of it will be included in VNyan's Examples folder~ 💕
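The blink-triggered explosion boils down to rising-edge detection on a blendshape value. A toy sketch of that logic, not the actual node implementation; the class name and 0.8 threshold are made up:

```python
class BlendshapeFilter:
    """Fire a callback once each time a blendshape value rises past
    a threshold (rising edge only), so a held blink fires just once."""
    def __init__(self, threshold, on_trigger):
        self.threshold = threshold
        self.on_trigger = on_trigger
        self._above = False  # were we above the threshold last frame?

    def update(self, value):
        above = value >= self.threshold
        if above and not self._above:
            self.on_trigger()
        self._above = above

# Feed in a stream of eyeBlink-style values: two blinks, two triggers
events = []
f = BlendshapeFilter(0.8, lambda: events.append("explode"))
for v in [0.1, 0.9, 0.95, 0.2, 0.85]:
    f.update(v)
```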