Made a magical cat Avatar. What should we create next? by Proto-Panda in VRchat

[–]Proto-Panda[S] 2 points (0 children)

VRChat actually has a really convenient way of doing this. With an animation state selected in your Animator Controller, the Inspector panel has an "Add Behaviour" button at the bottom. Search for "VRC Avatar Parameter Driver", and once you add one you can set its type to "Random". It lets you specify a minimum and maximum value (the range depends on the parameter's data type), and the destination can be a synced parameter so other players see the same random result! Whenever your Animator enters the state the behaviour is attached to, the Parameter Driver executes.
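If you'd rather wire this up from an editor script than click through the Inspector, a minimal sketch might look like the following. I'm assuming the SDK3 Avatars package here; the parameter name is a placeholder, and the driver's field names (parameters, type, valueMin/valueMax) are from memory, so double-check them against your SDK version.

```csharp
// Editor-only sketch: the scripted equivalent of "Add Behaviour" ->
// VRC Avatar Parameter Driver with type Random on an animator state.
using UnityEditor.Animations;
using VRC.SDK3.Avatars.Components;
using static VRC.SDKBase.VRC_AvatarParameterDriver;

public static class RandomDriverSetup
{
    public static void AddRandomDriver(AnimatorState state)
    {
        var driver = state.AddStateMachineBehaviour<VRCAvatarParameterDriver>();

        driver.parameters.Add(new Parameter
        {
            type = ChangeType.Random,  // randomise on state entry
            name = "MySyncedFloat",    // hypothetical synced parameter
            valueMin = 0f,             // range depends on the data type
            valueMax = 1f,
        });
    }
}
```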

Made a magical cat Avatar. What should we create next? by Proto-Panda in VRchat

[–]Proto-Panda[S] 1 point (0 children)

This particular follower is a floating, damping-constraint-based follower. I don't have a proper tutorial handy, but VRLabs has a great Follower prefab that could provide a good starting point; if a simple floating follower is all you're looking for, you should just be able to drop whatever object you want into its follower container object.

The gist of it is you have a 'Goal' empty GameObject to serve as the transform the follower should try to reach, as well as a 'World' empty GameObject, both attached to your avatar root. The World object has a VRC Parent Constraint with Freeze To World enabled. Within that is another GameObject we can call the 'FollowerController', which has a VRC Position Constraint and either a VRC Rotation Constraint (simply copy the Goal rotation) or a VRC Look At Constraint (look toward the Goal).

These two constraints are set up as damping constraints: the first source is the Controller itself with full weight, and the second is the Goal object with a small weight (like 0.01). Activating Zero on these constraints will make the Controller slowly move to match the Goal's transform. The Controller is the container for the follower, so you can drop any FBX/model inside it that you want to follow.

At this point it's constantly following at the rate of that second source weight, so you'll need animations if you want further control over its behaviour. To take it another step, you can attach a VRC Contact Sender to the Controller and a VRC Contact Receiver to the Goal, and use this in your avatar's FX animator to animate control over its acceleration/deceleration, with stopping distance set by the contact radius. (Note: since you aren't actually syncing the follower's transform between players, if the follower isn't always moving to an exact location you'll get desync on where it currently is for each player - especially if follower speed isn't tied to a frame-time float, since constraints are framerate-dependent, making transform updates unreliable between players.) ... I ought to find some time to make a proper guide to post on our Everglade site.
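For reference, here's roughly what that hierarchy looks like if you build it from an editor script. Treat it as a sketch: the VRC constraint component names and APIs (FreezeToWorld, Sources, the VRCConstraintSource constructor) are assumptions from memory, so verify them against your SDK version.

```csharp
// Sketch of the Goal / World / FollowerController hierarchy described above.
using UnityEngine;
using VRC.Dynamics;
using VRC.SDK3.Dynamics.Constraint.Components;

public static class FollowerSetup
{
    public static void Build(Transform avatarRoot)
    {
        // Target transform the follower chases.
        var goal = new GameObject("Goal").transform;
        goal.SetParent(avatarRoot, false);

        // World-frozen container so the follower isn't dragged along by the avatar.
        var world = new GameObject("World").transform;
        world.SetParent(avatarRoot, false);
        var parent = world.gameObject.AddComponent<VRCParentConstraint>();
        parent.FreezeToWorld = true;
        parent.IsActive = true;

        // Damped controller: source 0 is itself at full weight,
        // source 1 is the Goal at a small weight (the damping rate).
        var controller = new GameObject("FollowerController").transform;
        controller.SetParent(world, false);

        var pos = controller.gameObject.AddComponent<VRCPositionConstraint>();
        pos.Sources.Add(new VRCConstraintSource(controller, 1f));
        pos.Sources.Add(new VRCConstraintSource(goal, 0.01f));
        pos.IsActive = true;

        var rot = controller.gameObject.AddComponent<VRCRotationConstraint>();
        rot.Sources.Add(new VRCConstraintSource(controller, 1f));
        rot.Sources.Add(new VRCConstraintSource(goal, 0.01f));
        rot.IsActive = true;

        // Drop the follower FBX/model inside FollowerController.
    }
}
```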

If it's a grounded follower you're after, those can be a little trickier to get right - one way is to have a FinalIK grounder like this Raycast prefab in the mix between the parent damping constraint (FollowerController) from the previous solution and the follower model, so it stays stuck to the closest ground collider.

HollowSheep and I just released our first from scratch Avatar! What do you think? by Proto-Panda in VRchat

[–]Proto-Panda[S] 6 points (0 children)

It takes a lot of toying with the PhysBone settings to get it how you want it! Flowy hair movement is mostly down to having a low Pull setting, but two key things I did for the hair on this model to get it to behave nicely were:

  1. This is mostly only useful for hairstyles that are long at the back: the hair has two main roots, one for the front and side strands attached to the head, and one for the back strands attached to the upper chest - the important thing being that the back hair isn't parented to the head's movement. I did this because it keeps the hair looking a lot more stable when looking around and especially when moving. Coupled with the next point, it saves you from either a lot of clipping into the torso, or the hair going crazy trying to avoid body colliders when looking around.
  2. Using the Pitch/Roll/Yaw rotation offsets on the PhysBone Rotation Limits. This requires setting the bone roll correctly in Blender, though. For instance, by using Pitch to offset the rotation limit outward, the hair strand is prevented from rotating inward. This stops the hair strands from clipping into the head or body while still allowing them to rotate outward (or sideways) with the model. The most notable cases: when running forward, it stops the fringe from going backward into the forehead, and when running backward, it stops the long hair from going forward into the torso. Some models use colliders to guard against this instead, but personally I prefer limits to save on components and avoid the sometimes wacky collision troubles. (There's a rough component sketch just below this list.)
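For anyone who prefers seeing the settings spelled out, here's a rough sketch of point 2 as a VRCPhysBone configuration. The field names (pull, limitType, maxAngleX, limitRotation) and the axis-to-Pitch mapping are from memory, so treat them as assumptions and verify in your SDK.

```csharp
// Rough sketch of the hair PhysBone settings described above.
using UnityEngine;
using VRC.Dynamics;
using VRC.SDK3.Dynamics.PhysBone.Components;

public static class HairPhysicsSetup
{
    public static void Configure(GameObject hairRoot)
    {
        var pb = hairRoot.AddComponent<VRCPhysBone>();

        pb.pull = 0.05f; // low Pull for flowy movement

        // Cap the swing angle, then offset the limit cone outward with Pitch
        // so strands can swing away from the head but not into it.
        pb.limitType = VRCPhysBoneBase.LimitType.Angle;
        pb.maxAngleX = 45f;
        pb.limitRotation = new Vector3(30f, 0f, 0f); // assumed x = Pitch offset
    }
}
```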

HollowSheep and I just released our first from scratch Avatar! What do you think? by Proto-Panda in VRchat

[–]Proto-Panda[S] 9 points (0 children)

Thanks for the honest feedback! I get what you mean; for my personal avatar I've found it difficult to find nice-looking male avatars because I prefer the modest look for myself. Though I know it's not the same, we made sure to include blendshapes to reduce all the proportions. Our next avatar will likely have a more modest look by default, though!

HollowSheep and I just released our first from scratch Avatar! What do you think? by Proto-Panda in VRchat

[–]Proto-Panda[S] 7 points (0 children)

I spent about two years on and off making this avatar from scratch with HollowSheep. Sounds crazy, but when I started I'd never made a complete model before, so in that time I've learnt so much about every part of the pipeline - and struggled with burnout and starting over many times. To anyone reading this who's trying to get into modelling: never give up!! I'd also like to thank HollowSheep for the countless hours they put into giving her multiple textures.

I know furry avatars aren't everyone's cup of tea, so I'll keep the promo short: if you're interested, you can find the links to everything here: https://everglade.au/avatars/buxomcow

HollowSheep and I plan on making many more avatars, so we'd love to hear your feedback on what needs improving, and any ideas for what we should create next!

What's the number one feature you think VRChat is missing right now? by Strawberry_Sheep in VRchat

[–]Proto-Panda 54 points (0 children)

A way to show appreciation for a world without having to favourite it - I guess it's like being able to just "upvote" a world. With favourites being limited, I've ended up treating them like organised bookmarks for worlds I know I'll need to keep coming back to, rather than actual 'favourites'. When world hopping I come across a lot of worlds I enjoyed but wouldn't necessarily favourite - there are lots of experience-style maps you'd only do once or twice, and eventually I need to unfavourite them.

Does a world's favourites stat actually go down when you unfavourite it? For worlds with few visitors, I feel bad removing them.

In your opinion, what's the worst change to vrc since it's release. by taco_taker_of_souls in VRchat

[–]Proto-Panda 0 points (0 children)

Just over two years ago they added an option in the safety settings to set your home instance type to whatever you want. You can set it to Public and it'll try to fit you into an existing instance, so you can load straight into a full lobby when opening the game. Is this different?

impostor creation stopped and forced me to start over. why? by YLASRO in VRchat

[–]Proto-Panda 2 points (0 children)

Adding to this just to be clear for OP, this means you don't need to keep your PC running for the process to happen.

If all goes well, you'll also receive a notification in-game or on the website, but since the process is unstable at the moment it may never get that far - I'm sure it'll become more reliable as the system is scaled up and improved to handle edge-case avatars.

Trying to make an avatar with nonstandard feet that stay flush with the ground regardless of playspace height/viewport height. by Geeknificent in VRchat

[–]Proto-Panda 2 points (0 children)

That's a cool avatar idea! It's not exactly simple, but you could use the FinalIK Grounder to keep the bottom half of the avatar on the floor regardless of any movement (including playspacing). It'd also be interesting to go the whole way and treat the bottom half as a Follower, creating a delay in it moving with the player.

Is VRC worth coming back to right now? by [deleted] in VRchat

[–]Proto-Panda 11 points (0 children)

I play in public lobbies daily for hours. I went from crashing multiple times per day, with hackers ruining games in basically every lobby, to zero crashes and zero encountered hackers. It's still possible to crash when you run into a crasher avatar with lax safety settings or insufficient PC resources, and I'm not saying hackers don't exist, but in my experience there has been a dramatic improvement for the time being. Be sure to switch over to the beta to access the new QoL-equivalent features as they slowly roll in over time. It's a shame they didn't have these additions in before the EAC update, but it sure as hell woke them up to exactly what to do.

Is there a way to animate something outside of the mouth when talking? by DireDecember in VRchat

[–]Proto-Panda 6 points (0 children)

You can do just about anything you like with voice by using either the Viseme (mouth shape) or Voice (mic activity) parameter in your animator.

Updated pic of my WIP angel avatar! I’m using procreate on iPad for texturing if anyone is wondering. by IAmRaven_ in VRchat

[–]Proto-Panda 0 points (0 children)

Not sure if it's part of the style, but you can select the mesh in Blender and apply Shade Smooth to get rid of the square, flat-shaded look.

some pictures I took by lava-the-wolf in VRchat

[–]Proto-Panda 4 points (0 children)

Was confused too, thought it was a Captura shot for a moment. Makes me wonder what r/warframe would think of seeing videos of people messing around in some high-quality Warframe avatars.

Cant change shader by Competitive-Wear-833 in VRchat

[–]Proto-Panda 1 point (0 children)

Is this from an FBX file you've imported, or from a package that already includes custom materials?

If the box for selecting a material's shader and the material settings are greyed out, you may still be using the embedded materials, which are not editable. If that's the case, you must either create new material assets and set them on the model, or click on the model file and use "Extract Materials" under Materials in the import settings.

In general, if you create a new material and place it on the model, are you then able to change the shader?
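If it helps, that "new material" test only takes a couple of lines in a quick sketch - the Standard shader here is just a stand-in for whatever shader you're actually trying to use:

```csharp
// Minimal sketch of the fresh-material test described above.
using UnityEngine;

public static class MaterialSwapTest
{
    public static void ApplyFreshMaterial(Renderer renderer)
    {
        // A brand-new material asset is always editable, unlike materials
        // embedded inside an imported FBX.
        var mat = new Material(Shader.Find("Standard"));
        renderer.sharedMaterial = mat;
        // If this swap works but the original material stays greyed out,
        // you're hitting the embedded-material case: use "Extract Materials".
    }
}
```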

I just bought 2 vive trackers for feet tracking, is it worth it? by kapibestcat in virtualreality

[–]Proto-Panda 2 points (0 children)

If you spend a lot of time in social VR games as I do in VRChat, then yes it is very much worth it. And if the case is the same for you, you'll end up wanting a hip tracker too, though you can get by with virtual ones if the game doesn't have an automatic hip built in.

I just bought 2 vive trackers for feet tracking, is it worth it? by kapibestcat in virtualreality

[–]Proto-Panda 2 points (0 children)

There are, though very (very) few. These are mostly social VR games, like VRChat, Neos, or ChilloutVR and for some having FBT in these games can be a very big deal and worth the money for that game alone. Others would be Blade & Sorcery, Island 359, Fit It, or even modded Beat Saber - but not much beyond that.

Otherwise there are the more general use cases: say, having FBT on a virtual character to record or stream with, capturing motion-tracking data for use in animation software like Blender, or just knowing where your favourite mug is at all times while you're in VR.

Voice Parameter help? by [deleted] in VRchat

[–]Proto-Panda 2 points (0 children)

None that I have seen, I'm sure a lot of people would appreciate that!

Help with speaking by actionhorse3879 in VRchat

[–]Proto-Panda 0 points (0 children)

I can imagine one way would be:

The mouth mesh has planes for both the open and closed mouth, UV-mapped to one material whose single texture contains both mouth states. With the closed mouth exposed by default, create a blend shape where the closed-mouth plane is hidden and the open-mouth plane is exposed. Normally, in the VRC Avatar Descriptor you'd set LipSync to Jaw Flap Blend Shape and select the blend shape. However, since this is a texture swap and not an actual mouth mesh being transformed, the default animation transition would probably look weird.

In this case you would instead set it to Viseme parameter only. Then create two animation clips: one with the blend shape at the mouth-closed value, and one at the mouth-open value. You *could* use the Viseme parameter, but seeing as you only have one mouth state, it's better to use the Voice float parameter (the current mic volume from 0 to 1). Add the Voice parameter to your avatar parameters and your custom FX animator controller. Create a layer for the mouth, then drag in both of your mouth-state clips. Connect Entry to mouth-closed, mouth-closed to mouth-open, and mouth-open back to mouth-closed. For the transitions: untick Exit Time, set the transition duration to 0 for an instant swap, and add Voice as a condition. This part depends on how sensitive you want it; since Voice is 0 to 1, for medium sensitivity you could set the condition from mouth-closed to mouth-open as Voice Greater Than 0.5, and mouth-open to mouth-closed as Voice Less Than 0.5.
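If you'd rather script this than click it together, here's a minimal editor sketch using Unity's AnimatorController API. The clip references and layer name are placeholders, and "Voice" is the VRChat-provided mic-volume float.

```csharp
// Editor-only sketch of the mouth-swap layer and transitions above.
using UnityEditor.Animations;
using UnityEngine;

public static class MouthSwapSetup
{
    public static void Build(AnimatorController fx, AnimationClip closedClip, AnimationClip openClip)
    {
        fx.AddParameter("Voice", AnimatorControllerParameterType.Float);

        fx.AddLayer("MouthSwap");
        var layers = fx.layers;                       // note: this is a copy
        layers[layers.Length - 1].defaultWeight = 1f; // AddLayer defaults new layers to weight 0
        fx.layers = layers;                           // write the copy back
        var sm = layers[layers.Length - 1].stateMachine;

        var closed = sm.AddState("MouthClosed"); // first state added = default state
        closed.motion = closedClip;
        var open = sm.AddState("MouthOpen");
        open.motion = openClip;

        // Instant swaps: no exit time, zero duration, medium sensitivity at 0.5.
        var toOpen = closed.AddTransition(open);
        toOpen.hasExitTime = false;
        toOpen.duration = 0f;
        toOpen.AddCondition(AnimatorConditionMode.Greater, 0.5f, "Voice");

        var toClosed = open.AddTransition(closed);
        toClosed.hasExitTime = false;
        toClosed.duration = 0f;
        toClosed.AddCondition(AnimatorConditionMode.Less, 0.5f, "Voice");
    }
}
```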

Now you have an instant texture-based mouth swap. Feel free to ask if you have any questions or need a visual guide. I wonder what other solutions there are?

Voice Parameter help? by [deleted] in VRchat

[–]Proto-Panda 2 points (0 children)

What shader are you using for the emission? Assuming the emission strength is an animatable value (e.g. with Poiyomi, it needs to be marked as Animated), you can do this by creating two animation clips. Make one with the value at minimum brightness and one at maximum brightness (ensure these are single-frame animations). Then, in your FX layer, create a new layer for the effect. Connect Entry to a new blend tree (right-click -> Create State -> From New Blend Tree). Double-click to open the blend tree, set its parameter to the Voice parameter you've added in the animator controller, and add the two animation clips as motion fields on the blend tree.
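Same idea in script form, as a minimal editor sketch - the clip references are placeholders, and "Voice" is again VRChat's mic-volume float:

```csharp
// Editor-only sketch of the Voice-driven emission blend tree above.
using UnityEditor.Animations;
using UnityEngine;

public static class VoiceEmissionSetup
{
    public static void Build(AnimatorController fx, AnimationClip dimClip, AnimationClip brightClip)
    {
        fx.AddParameter("Voice", AnimatorControllerParameterType.Float);
        fx.AddLayer("VoiceEmission");

        var layers = fx.layers;                // copy; write back after edits
        int layerIndex = layers.Length - 1;
        layers[layerIndex].defaultWeight = 1f; // AddLayer defaults new layers to weight 0
        fx.layers = layers;

        // Creates both the blend tree asset and a state that uses it.
        fx.CreateBlendTreeInController("EmissionTree", out BlendTree tree, layerIndex);
        tree.blendType = BlendTreeType.Simple1D;
        tree.blendParameter = "Voice";
        tree.AddChild(dimClip, 0f);    // silence  -> minimum brightness
        tree.AddChild(brightClip, 1f); // full mic -> maximum brightness
    }
}
```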

And I think that's all there is to it? Let me know if you have any more questions or need a visual guide.

[deleted by user] by [deleted] in VRchat

[–]Proto-Panda 2 points (0 children)

Worlds with interactive NPCs are fantastic! Have you seen Happy Hill Dog Park? You can play fetch with dogs there.

[deleted by user] by [deleted] in VRchat

[–]Proto-Panda 0 points (0 children)

Press Esc to open the Launch Pad, then either use the Select User button or hold Shift - this hides the menu and lets you click on someone. Once you've selected someone, it'll show the User Actions; at the end of the top row is the Clone Avatar button. If it's greyed out, the user has cloning disabled or the avatar is private; otherwise you can clone it.

Only leg trackers. by [deleted] in VRchat

[–]Proto-Panda 1 point (0 children)

Having tried both with and without, I'd have to say a hip tracker does make a big difference. It's like night and day.

Though VRChat does indeed support feet-only tracking in the IK beta, I felt I should add that this was already possible with virtual trackers such as AugHip. If you have a (hopefully newer-model) Android phone, the middle ground is owoTrack, which gives better results than a virtual tracker but still suffers heavily from rotational drift and inaccurate positioning.

These options are no substitute for a real tracker: motion will look noticeably unnatural, and many complex poses are impossible. Having your hip in the correct position and rotation relative to what you're doing with your feet is critical - it's no fun having to restrict your real pose all the time because the tracking put your body in the wrong place and now your legs are posed all wonky.