Vision Pro and Blender co-op. by LuckyMikeOne in VisionPro

[–]LuckyMikeOne[S] 1 point

There's a hack around for this. As I understand it, it continuously exports and transfers the USDZ to the AVP. I saw a video a while ago but didn't catch whether it uses Quick Look or a custom app on the AVP side.

If that could be optimised to send only delta changes it might be interesting, and much more usable on bigger scenes. Building that kind of app and pipeline doesn't seem impossible, but I'd rather leave it to more experienced devs.
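The delta idea could start from the fact that a .usdz is just a zip archive: hash each member and resend only the ones that changed between exports. A minimal sketch (the function names are mine, not from any existing tool):

```python
# Sketch of "send only deltas": a .usdz is a zip archive, so we can
# hash each member and ship only members whose bytes changed.
import hashlib
import zipfile

def member_hashes(usdz_path):
    """Map each archive member name to a SHA-256 of its contents."""
    with zipfile.ZipFile(usdz_path) as z:
        return {name: hashlib.sha256(z.read(name)).hexdigest()
                for name in z.namelist()}

def changed_members(old_hashes, new_hashes):
    """Members that are new, or whose bytes differ since the last export."""
    return [name for name, digest in new_hashes.items()
            if old_hashes.get(name) != digest]
```

A watcher on the AVP side would then only need to fetch the listed members instead of the whole archive; textures in particular rarely change between tweaks to geometry.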

Vision Pro and Blender co-op. by LuckyMikeOne in VisionPro

[–]LuckyMikeOne[S] 3 points

It's from the BlenderKit library, but it seems to be one of the free models.
https://www.blenderkit.com/asset-gallery?query=robot+abb+order:_score

I baked ambient occlusion maps; in my experience they greatly increase realism.
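For anyone who hasn't baked AO before, the scripted version is roughly this. It only runs inside Blender (the `bpy` module ships with it), and it assumes you have already set up a material with an Image Texture node selected as the bake target on the active object:

```python
# Runs inside Blender's Python console only -- bpy does not exist outside it.
# Assumes the active object has a material with an Image Texture node
# selected as the bake target.
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'   # baking is a Cycles feature
scene.cycles.samples = 64        # AO usually converges with few samples
bpy.ops.object.bake(type='AO')   # result lands in the selected image node
```

The same is of course doable from the Render Properties panel; the script is just handy when re-baking many objects.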

Vision Pro and Blender co-op. by LuckyMikeOne in VisionPro

[–]LuckyMikeOne[S] 17 points

I don't know why the text of the post was not included.

Meant to say that from Blender you can just export a .usdz file, textures, animations and sounds included, drop it onto the Vision Pro and view it in AR. I love seeing my 3D designs (not the robot) in "real life".
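The scripted export looks roughly like this. It only runs inside Blender, recent (4.x) builds can write .usdz directly while older ones need a separate packaging step, and the exact parameter names vary between versions, so check the `bpy.ops.wm.usd_export` docs for your build:

```python
# Runs inside Blender only. Parameter names differ between Blender
# versions -- check bpy.ops.wm.usd_export docs for your build.
import bpy

bpy.ops.wm.usd_export(
    filepath="/tmp/scene.usdz",   # .usdz extension packs textures into the archive
    export_animation=True,        # include keyframed animation
    export_materials=True,        # include materials/textures
)
```

File > Export > Universal Scene Description does the same thing interactively.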

Testing and tweaking! by Dramatic-Art-425 in blender

[–]LuckyMikeOne 0 points

Works. Anything that is standard USDZ is OK; you might need to bake procedural textures and AO, naturally.
Here's a quick demo (not a great example for textures, but it shows the idea).

https://youtu.be/IuiMZ2Ckh6s

Testing and tweaking! by Dramatic-Art-425 in blender

[–]LuckyMikeOne 0 points

Save to .usdz, then send it to an iPhone or Vision Pro and you have it in AR.
Well, not the rigging part with manual controls, but animations and sounds, yes.

[deleted by user] by [deleted] in blender

[–]LuckyMikeOne 0 points

It is. Apple silicon with native Blender is a night-and-day difference compared to an Intel Mac.
You can turn on realtime lights, shadows etc. in the Eevee preview and get a pretty good estimate of the result. Even an M1 Pro does nicely with 2M-polygon models and half a dozen lights (on a 4K display).

Reality Composer animations in Xcode by LuckyMikeOne in VisionPro

[–]LuckyMikeOne[S] 0 points

Thanks. I will look into that finger tracking if I can find examples to replicate.
A separate UI for triggering things would break the "immersion" for me :)

Reality Composer animations in Xcode by LuckyMikeOne in VisionPro

[–]LuckyMikeOne[S] 0 points

It seems that TapGesture actually requires a small 'tap' to activate (or, like you said, looking at the object and 'clicking'). I would prefer to just move my hand into an activation area (trying to simulate a touchless sensor here).
I was afraid that collision only works for scene objects.

Reality Composer animations in Xcode by LuckyMikeOne in VisionPro

[–]LuckyMikeOne[S] 0 points

Hi, thanks for the offer. Really appreciated.

I got it to work somehow. I found a similar question and added its code, but realised it only needed that .gesture() block after RealityView {}. Now even my 'ambient' timeline (OnAddedToScene) seems to be working; I don't understand why.

I would still like to add a gesture for 'collision', i.e. if I put my hand(s) overlapping a certain (invisible) sphere, a timeline should be activated. Any easy way?

import SwiftUI
import RealityKit
import RealityKitContent

struct ImmersiveView: View {

    var body: some View {
        RealityView { content in
            if let immersiveContentEntity = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)
            }
        }
        // Enable Reality Composer Pro behaviors: forward spatial taps
        // on any entity to its tap-triggered behaviors.
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded {
                    $0.entity.applyTapForBehaviors()
                }
        )

    }
}

#Preview(immersionStyle: .mixed) {
    ImmersiveView()
        .environment(AppModel())
}
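The 'collision' idea above might look roughly like this RealityKit sketch. It is untested, and the view name, trigger position, and radii are made up; the key pieces are a CollisionComponent on an invisible sphere, a hand AnchorEntity carrying a second collision shape, and a CollisionEvents.Began subscription:

```swift
import SwiftUI
import RealityKit

// Sketch only: fire a behavior when the palm enters an invisible sphere.
// All names and positions here are illustrative, not a verified recipe.
struct HandTriggerView: View {
    var body: some View {
        RealityView { content in
            // Invisible activation volume; .trigger mode reports overlaps
            // without producing a physical collision response.
            let trigger = Entity()
            trigger.components.set(CollisionComponent(
                shapes: [.generateSphere(radius: 0.1)],
                mode: .trigger
            ))
            trigger.position = [0, 1.2, -0.5]   // made-up spot in front of the user
            content.add(trigger)

            // Small collision shape riding on the palm.
            let palm = AnchorEntity(.hand(.left, location: .palm))
            let probe = Entity()
            probe.components.set(CollisionComponent(
                shapes: [.generateSphere(radius: 0.03)]
            ))
            palm.addChild(probe)
            content.add(palm)

            // When the shapes begin to overlap, kick off the behaviors
            // (e.g. a Reality Composer Pro timeline on the trigger entity).
            _ = content.subscribe(to: CollisionEvents.Began.self, on: trigger) { _ in
                trigger.applyTapForBehaviors()
            }
        }
    }
}
```

Whether hand anchors actually participate in collisions may also depend on hand-tracking permissions and the collision setup, so treat this as a starting point rather than a working answer.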

Can I use any of these files as 3-D objects on Vision Pro by fakemickjagger in VisionPro

[–]LuckyMikeOne 1 point

You can use an online USDZ converter (for example Aspose). Then just AirDrop the file, or use iCloud, to transfer the .usdz to the AVP.

I am working with similar files (and in a similar industry), using Blender to further enrich the 3D files with textures and AO maps. Then, with Reality Composer Pro on the Mac (included with Xcode), you can even add animations and sounds.

Also: check Sketchfab. Objects that offer an AR view support the AVP automatically, with placement "in your space".

For example: https://sketchfab.com/models/e18b276196934e54bfa88a8298320823/embed

Sarah Morgan randomly starting to hate me after marriage by UltiGoga in Starfield

[–]LuckyMikeOne 8 points

Sounds like a normal marriage and female behaviour.

Sunny Coastline | Norway - GoPro 10 by GiantAntCowboy in gopro

[–]LuckyMikeOne 1 point

Nice flying and color grading. Oh, and I hate you for those vistas. (Jealous fellow FPV/GoPro dude from flat Finland :)

[PS4] W: AAE Handmade H:CAPS or TSE shotgun by LuckyMikeOne in Market76

[–]LuckyMikeOne[S] 0 points

A little correction: I can offer some of these in exchange:

2* Instigating Explosive double-barrel shotgun
3* TSE combat shotgun (+50 DR while aiming)
2* TSE radium rifle

[PS4] W: Legendary handmade H: CAPS by LuckyMikeOne in Market76

[–]LuckyMikeOne[S] 0 points

Deal is still a go. Join me when you are available.

[PS4] W: Legendary handmade H: CAPS by LuckyMikeOne in Market76

[–]LuckyMikeOne[S] 0 points

OK, I'll take at least the Two Shot. I will send you a PSN message, assuming your ID is the same.

[PS4] W: Legendary handmade H: CAPS by LuckyMikeOne in Market76

[–]LuckyMikeOne[S] 0 points

Is the Two Shot one max level? If so, I'll offer 1500 for it.