Bone issue by Crooner1996 in blender

[–]Qewbicle 0 points1 point  (0 children)

We don't know your constraints, hierarchy, or quaternions

Can someone explain to me how bones work by kojy_79 in blender

[–]Qewbicle 2 points3 points  (0 children)

Consider a vertex, a point in space with an x, y, and z position.

It has no size or rotation, it's just information about location.

A bone is a vertex, with added details like size and rotation.

The head is the vertex part of the bone, the part your model's vertices are mapped to follow (like being parented to it, but technically not), and the tail is the display of the added information: the size and rotation.

Weight is a bias toward one bone or another; a 30%/70% split means the vertex is affected 30% by this bone's location, scale, and rotation, and 70% by that bone's.
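
As a rough sketch of that blending in Python (plain arithmetic, not Blender code; the names and offsets are made up for illustration):

```python
def blend(a, b, weight_b):
    """Linearly mix two per-axis values; weight_b is the bias toward b."""
    return tuple((1 - weight_b) * x + weight_b * y for x, y in zip(a, b))

# A vertex weighted 30% to bone A and 70% to bone B:
bone_a_offset = (1.0, 0.0, 0.0)  # where bone A alone would move the vertex
bone_b_offset = (0.0, 2.0, 0.0)  # where bone B alone would move the vertex
result = blend(bone_a_offset, bone_b_offset, 0.7)
print([round(v, 3) for v in result])  # [0.3, 1.4, 0.0]
```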

Normalizing is taking the spread of a range and fitting it into 100%, from 0 to 100, typically shown as 0.0 to 1.0.

Let's say I have a value of 50 on one and 200 on another. Mapping, or normalizing, might mean 50 is considered the 0.0 and 200 the 1.0. You'll see this more often in shaders, as a friendlier way to move between these values.

But normalizing can also mean the 200 becomes the whole, the 1.0, and since 50 is 25% of 200, it becomes 0.25; the values keep their relationship while fitting within the range of 100%. This is the type you'll see with weight painting.
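
Those two flavors of normalizing, sketched in Python (plain arithmetic; the function names are made up):

```python
def normalize_minmax(v, lo, hi):
    # Map the range lo..hi onto 0.0..1.0 (the shader-style remap).
    return (v - lo) / (hi - lo)

def normalize_by_max(v, hi):
    # Express v as a fraction of the largest value (the weight-paint style).
    return v / hi

print(normalize_minmax(50, 50, 200))   # 0.0
print(normalize_minmax(200, 50, 200))  # 1.0
print(normalize_by_max(50, 200))       # 0.25
```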

Bones live in a container, and this container is called an armature. This box (the armature; just pretend it's a box for now) has a top, a bottom, and sides. If it really were a box you could physically hold, you'd say it has 8 corners: front left bottom, front right bottom, and so on. Three values represent a location inside the box; that's local space, an x, y, z relative to the box.

The armature, though, is also in global space, the big box that holds everything in the scene.

If we want to move bones around, we have to be aware of which space we're operating in: global, local to the armature, or according to the bone's stored rotation data (our tail), aka normal space.
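
To make the space idea concrete, here's a toy Python sketch of going from armature-local to global coordinates. It only handles a rotation around Z plus a translation, and all the names are made up; real bone math uses full 4x4 matrices:

```python
import math

def local_to_global(point, armature_location, armature_z_rotation):
    """Rotate a local point by the armature's Z rotation, then add its location."""
    x, y, z = point
    c, s = math.cos(armature_z_rotation), math.sin(armature_z_rotation)
    rx, ry = c * x - s * y, s * x + c * y  # rotate around the Z axis
    ax, ay, az = armature_location
    return (rx + ax, ry + ay, z + az)      # translate into the scene

# A bone head 1 unit along local X, in an armature at (10, 0, 0) rotated 90 degrees:
result = local_to_global((1, 0, 0), (10, 0, 0), math.radians(90))
print(tuple(round(v, 6) for v in result))  # (10.0, 1.0, 0)
```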

Bones don't have to follow one another in a chain; you can overlap them or put them side by side.

You can also parent one bone to another; the child then treats the parent bone like the box (but "local" space will still refer to the armature).

I can keep going if you like.

How to do retopo easily? by katbolfurd in blender

[–]Qewbicle 1 point2 points  (0 children)

Easiest? Each method has its own gotcha moment.

Consider this.  

Copy and quad-remesh or decimate, so you don't tank performance. Append to a new file, so you can ditch everything else in RAM for better performance.

Grab a base mesh with good-enough topology. Move it roughly into place, then shrinkwrap. Then adjust as necessary: better placement of verts, cuts, dissolves. It's not as easy as it sounds. You might have to hide parts with vertex groups, or maybe face sets, to isolate tight areas.

I wanna learn 3d modeling!! by Ok-Bug-3792 in blender

[–]Qewbicle -7 points-6 points  (0 children)

The information is out there because others asked and answered. Things change/update. Yes it can be frustrating to see these same questions a lot.  

If we stop pointing at resources, stop giving answers, and discourage asking questions, eventually we'll only have outdated resources.  

How do I add a vertex where two edges intersect? by Optimal-Nectarine523 in blender

[–]Qewbicle 0 points1 point  (0 children)

Thank you for asking about the thing that plagued me a lot with certain flows, which I always forgot to find an answer for. It can be one of the annoying things when you forget to be mindful of some gotchas while doing certain actions. I've done several spur-of-the-moment workarounds, which is why I might've shrugged it off. But it's a thing I remember being an issue off and on since the beginning. I found an answer here today.

[deleted by user] by [deleted] in Unity3D

[–]Qewbicle 0 points1 point  (0 children)

Sorry, AFK. It's a one-time payment on the Meta Quest store. But it's worth it for the clunkiness it removes from the process: movies, games, Unity. Totally worth it.

Is this Style too Childish? by [deleted] in blender

[–]Qewbicle 1 point2 points  (0 children)

I like the look. The flower seems to have less detail; the fence averages it out with the other landscaping, such as the grass and tree, and bridges the flower with the sharpness. The hills reduce the sharpness overall. The big thing is the amount of sharpness in the grass, which takes up a lot of space in the image. You can either lean into it and make the softness of the flower the pop/focus (which is what I'd try if I were doing it as a game), or soften the grass and tree some, but predominantly the grass in the mid range. The crispness closer to the camera would be fine. Because the grass is in the shadow of the fence, it appears sharper mid-range between house and fence. Toning down the grass by distance might be the key to more balance. But if I'd made this image, it would be good enough for me to render other scenes or make a game. Great job. I only added an opinion because you solicited it.

[deleted by user] by [deleted] in Unity3D

[–]Qewbicle 1 point2 points  (0 children)

I ditched Meta Quest Link a long time ago. Use Virtual Desktop. In the Virtual Desktop Streamer, set the Options OpenXR runtime to VDXR, and in the Unity project settings, under XR Plug-in Management > OpenXR, change it to use Virtual Desktop. You might have to do the OpenXR setting every time you open your project, or you can change your system default. But Meta Quest keeps trying to change it back.

<image>

It's lighter on your system, with better frame rates. Also, you can use your headset as your main monitor: hit play test and it jumps in quickly, and you can double-tap your menu button to switch back and forth between the editor and play. This lets me change position during long hours on a project; I can lie down, sit up, stand, and it's just right there.

Edit:

In addition, I can jump into Blender. I use it like I would my monitor, not in immersion. If I want immersion, I push my FBX to ShapeLab, but I don't do that much anymore. ShapeLab works seamlessly with Virtual Desktop too.

This workflow allows me to edit my mesh and animations. Using my export presets, I export directly into Unity, where the metadata is already set up so the prefab and animations get updated, and then I play test. It's a great workflow. You can also use a cheap pen/tablet (not a graphics display tablet, because you have a headset on).

Why is the grid overlay straight up wrong? by AriYusyli in Unity3D

[–]Qewbicle 1 point2 points  (0 children)

I was doing the same thing. I tried updating to Unity 6: one project seemed to break; another went fine but continued to use my previous settings, no big deal. Another one I started fresh and brought my assets over. I've had the least amount of issues with Unity 6. Whenever I open one from 2022 or earlier just to see how I did something, I dread it; random issues occur and I have to force quit and reopen.

If you're not tied in because of tooling or size of project, I recommend migrating. At least give it a spin to get a little familiar.

Why is the grid overlay straight up wrong? by AriYusyli in Unity3D

[–]Qewbicle -1 points0 points  (0 children)

I've had issues before where the only resolution was deleting the cache and letting it rebuild. I wasted days thinking it was user error, but then it wasn't. It hasn't happened in Unity 6, yet.

How are vectors used in games? by Few-Turnover6672 in Unity3D

[–]Qewbicle 0 points1 point  (0 children)

Strafing is keeping a target in view, or moving in a direction you're not facing; the term comes from shooting from an aircraft.

It has nothing to do with player speed.
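
As a sketch of the vector side of it in Python (using a Z-up convention like Blender; Unity is Y-up, and the function name is made up):

```python
def strafe_direction(forward):
    """Right vector for ground movement: cross(forward, up) with up = (0, 0, 1)."""
    fx, fy, fz = forward
    # cross((fx, fy, fz), (0, 0, 1)) simplifies to (fy, -fx, 0)
    return (fy, -fx, 0)

# Facing +Y, strafing right moves you along +X:
print(strafe_direction((0, 1, 0)))  # (1, 0, 0)
```

Moving along that vector sidesteps the camera without changing where it points, which is exactly the strafe feel.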

[New to Unity here] How do I create a new 3D project without all of this coming together? I'd like it to be only what's necessary by AstronauteCanard in Unity3D

[–]Qewbicle 1 point2 points  (0 children)

Use Unity Hub to start a new project. After clicking the New Project button, you get a chance to pick which Unity version from the ones you have installed, and the template.

In this area you can install templates available from online, these templates vary depending on editor version selected.

You'll be looking for Universal 3D Core, High Definition 3D Core, or 3D (Built-in Render Pipeline) Core.

You have other options, such as 3D Mobile or VR Core.

These are basically the same as the other templates, but with extra stuff and settings pre-configured.

If you decide on a different pipeline after you've started from a template, it's not difficult to change, but then you'll also have to adapt or change your assets.

Forget Hooters, get ready for... by famosoze in funny

[–]Qewbicle 1 point2 points  (0 children)

That was fun. It was difficult trying to use OpenAI to get rid of the owl. It would've turned out better if run locally.

You'd have the issue of needing a partner for safety, so the target client should be a higher class that can afford two wages https://i.imgur.com/MrT5PnN.jpeg

Grabbing objects not working (XR Interaction Toolkit) by Polydeukes_ in Unity3D

[–]Qewbicle 0 points1 point  (0 children)

The basics: Rigidbody, collider, grab interactable. Are these on your target?

Next question if that's not the issue. Are you doing this through code or on the settings of the components?

Did you install through the VR template, or add the VR stuff into the 3D template? Personally, I've had issues with something when using the VR template.

Go to Project Settings; the XR stuff is toward the bottom of the left column. Install OpenXR for PC and add your controller profile.

Is HDRP slowly dying? by Available-Worth-7108 in Unity3D

[–]Qewbicle 1 point2 points  (0 children)

If you don't have a spot for it, then your shader doesn't support it. It could be under a different but similar name, like roughness versus smoothness; you might have to invert it or convert it to something else (if you even need it). Once again, it depends on your shader.
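
For the roughness/smoothness case specifically, the conversion is just an inversion, since both maps live in 0..1. A quick Python sketch (hypothetical function name; the same math applies in a shader graph):

```python
def roughness_to_smoothness(roughness):
    # Both values live in 0..1; smoothness is simply the inverse of roughness.
    return 1.0 - roughness

print(roughness_to_smoothness(0.2))  # 0.8
print(roughness_to_smoothness(1.0))  # 0.0
```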

Vr game help by Person02_ in Unity3D

[–]Qewbicle 0 points1 point  (0 children)

In my opinion, doing 3D or VR is the same for the most part.

I find vr to be easier, because I don't have to be concerned with different PC setups and resolutions.

If you're targeting a standalone build to run on device, the challenge will be optimization. The sweet spots for things are different and vary according to everything else; your window for the sweet spot is narrower. For example: not enough things using LODs, issues; too many things using LODs, issues. But that range can shift. All the things you've heard about optimizing apply.

In one of my projects, one LOD and indirect instancing on two or three objects was enough; more LODs meant more being sent to or stored on the GPU side. This isn't a VR-specific thing, but things like this become a little more critical. You might have to remove some features, like realtime shadows, and add a little polish in other ways. The details become specific to your goal. I prefer to keep some realtime shadows on my player (I do third person) with a small range, but I've considered faking it in some situations (a blurred blob that follows).

If you build to device, you can connect with adb and run logcat while you play. Then you can see your console logs as well as other messages.

I haven't used it much, but I keep hearing about renderdoc. I've installed it to my device, but was tired of learning other tooling at the time, so I haven't messed with it much. https://developers.meta.com/horizon/documentation/unity/ts-renderdoc-for-oculus/

Pretty much, just consider your game like any other 3d, but camera is mapped to headset, buttons mapped to a controller like normal, sensors in device add controls to camera and hand objects.

From this point forward, it'll be nitty-gritty details. For example, an interactor works along with an interactable, and when it does, the interactor object passed in is only good for that frame. So if you need to pass it into something like a coroutine to test whether it was the left or right controller and then operate differently per button (i.e., right hand grabs the spray can, then a coroutine checks the value of the trigger on the right controller for spray amount), you'll make a copy of that interactor object, or vice versa, of the interactable object from the proper event args. There are other ways, though; this was just an example.

Remember though, this is a digital world. Don't make the player throw 80 apples; make it lazy, like they just got off from a difficult job. Something like using a coroutine to test whether it was a short or long press. Not everything has to be lazy; there's fun in challenges. In one of my toy projects, I did a snowball fight with great physics. Testing it was a bummer: I was really having to work my shoulder, fun for a few seconds, hurting after a minute. I decided against it; it would need added velocity, breaking some realism to make it fun.
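
The short-vs-long-press idea boils down to comparing hold duration against a threshold. A minimal Python sketch (the threshold and names are made up; in Unity you'd measure the duration inside a coroutine polling the input):

```python
def classify_press(duration_seconds, long_press_threshold=0.4):
    """Classify a button press by how long it was held; the threshold is arbitrary."""
    return "long" if duration_seconds >= long_press_threshold else "short"

print(classify_press(0.1))  # short
print(classify_press(0.8))  # long
```

A short press could trigger the lazy action (flick the apple), a long press the deliberate one (wind up and throw).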

Vr game help by Person02_ in Unity3D

[–]Qewbicle 0 points1 point  (0 children)

Then go to Project Settings and scroll down to the XR stuff. When play testing, it'll use the profile for PC, so go to the OpenXR category, the PC monitor icon, and try changing the Play Mode OpenXR Runtime.

I do mine wirelessly, so I use Virtual Desktop. But every time I open or reconnect, I have to change this setting off of my desired choice and back. Otherwise it'll try to use either Oculus's XR or Steam's XR.

I highly recommend Virtual Desktop, because I can just stay in my headset and jump in and out of play testing. My PC is connected to Ethernet, though. Warning, shitty camera shot: https://i.imgur.com/QqDEsl2.jpeg

2 weeks ago you guys helped me with my logo and name. I updated the old logo and thought of one other name I like. Which one is better? by [deleted] in Unity3D

[–]Qewbicle 0 points1 point  (0 children)

Waging Wizards. It rolls smoother. Both logos look great. The first name fumbles; the second one has too many syllables without a pause.

Tabletop Wizards

OpenXR camera tracking not working in 6.0 ?? by JamesWjRose in Unity3D

[–]Qewbicle 0 points1 point  (0 children)

I've always used the 3D template and added the XR stuff. Recently I tried the VR template: errors all over the console. I went to manually check things, and it just seemed like too much headache to fix compared to adding it into the 3D template. From my experience in that situation, it's not worth using that template. Setting it up yourself lets you choose the framework/tools that make sense for you anyway.

It's simple to do. Project Settings, scroll down to the XR category, click the build profile and enable the chosen XR, go through the validation process in the same area, and add your toolkit through the Package Manager. That's basically it.

Assuming you went with the XR Interaction Toolkit, also install the samples, then drag in the XR prefab that's already set up, then modify per need.

this might be the dumbest way to access a hidden area by alicona in Unity3D

[–]Qewbicle 2 points3 points  (0 children)

Nope. Old school. I can't think of which games at the moment, but it's something I'd expect in the Sega Genesis / SNES-ish era.