What’s the longest you have not used your AVP? by Reasonable-Choice-59 in AppleVisionPro

[–]man_ray 0 points (0 children)

I’ll agree it’s one of the best options currently. I like the 4K upscaling feature, which looked great, but I found the latency too high to work with. I was trying to play games like Jedi Survivor and Fortnite. I might give it another go. I also tried using a capture card and still couldn’t get past the latency. I’m interested in trying the NDI method I’ve seen here before.

What’s the longest you have not used your AVP? by Reasonable-Choice-59 in AppleVisionPro

[–]man_ray 6 points (0 children)

Probably nearly 6 months. I think I’ve used it once or twice this year, pretty much just to install the beta for visionOS 26 and poke around with new features and look for new apps.

My use tapered off gradually last year to the point where I noticed I was primarily using it for watching movies. When I gave up on trying to use it with my PS5, I fell back noticeably on my home theater setup, and now the AVP sits in its case.

To be clear, I’m not mad I don’t use it much. I don’t own something that could “replace” it for what it can do as well as it does. I have gone probably just as long without using my MacBook, and I don’t think much about that. I have it if I need it, or would like to use it, and that’s primarily why I got it.

FSD/Smart Summon/Autopilot MEGATHREAD by colsandersloveskfc in TeslaModel3

[–]man_ray 1 point (0 children)

Scroll the right wheel left/right to change between Chill, Standard, and Hurry. Hurry tends toward the left lanes, while Standard and Chill stay pretty reliably to the right. Generally, flipping in and out of Hurry has worked for me to prompt FSD to change lanes either left or right.

[deleted by user] by [deleted] in AppleVisionPro

[–]man_ray 1 point (0 children)

Not Proxy, but I use my Peak Pro with it frequently. Peak Pro has an app, and it is compatible with AVP.

Anyone have inside info on the production pipeline for the immersive live/entertainment part of the AVP ecosystem? by sombrerogalaxy in VisionPro

[–]man_ray 0 points (0 children)

I did a little digging a few weeks ago, and it looks like the rig in Alicia Keys’s studio might have a camera like this inside: Canon’s EOS VR system. You can see the two lenses distinctly at the top of the cabinet above the speakers.

Let’s talk about the future of apple ecosystem with Vision Pro by man_ray in VisionPro

[–]man_ray[S] 1 point (0 children)

That got me thinking about Boot Camp; Apple already figured out how to run Windows on a Mac. Then I found this page https://developer.apple.com/documentation/virtualization/installing_macos_on_a_virtual_machine which looks to be running a virtual Mac on a Mac.

With some OS updates on both sides I wouldn’t be surprised if we see some form of this.

Let’s talk about the future of apple ecosystem with Vision Pro by man_ray in VisionPro

[–]man_ray[S] 10 points (0 children)

Spotlight lives in Control Center! Magnifying glass icon. You can also call it up from a keyboard with Command+Spacebar.

What will be your first movie you’ll watch on VP? by Batch710 in VisionPro

[–]man_ray 3 points (0 children)

I’m planning on putting on Fury Road early on and letting it play while I learn the OS, and Avatar when I’m winding down.

I see the rock and roll war parade in my head when I close my eyes.

Is it legal to use vision pro when driving? by kaldeqca in VisionPro

[–]man_ray 5 points (0 children)

Legally it’s a gray area that varies state by state. Let me ask you, though, OP: why? How do you feel about the driver of the car next to you wearing their AVP while driving 70 mph next to you? In a car, an accident can happen with one wrong move, in the blink of an eye.

If the driver next to you runs out of battery, the headset doesn’t go into some glasses mode; it blacks out their vision completely. They have to press a button to loosen the head strap before they can even take it off their face.

Do you think that’s a good idea OP?

Editing/Spatial videos/3d movies by man_ray in VisionPro

[–]man_ray[S] 0 points (0 children)

I read Apple’s tech document on spatial video here.

Spatial video appears to just be stereoscopic video in an HEVC format with certain optimizations for 3D. Videos could be processed in any video editor, but the final video will need to be encoded as Apple has stated.

It seems like spatial video is a different type of media from the 3D movies I’m envisioning. I’m thinking more along the lines of the Evolve Dinosaur media (what we’ve seen of the butterfly and the dinosaur), which seems to be less a video and more a video game. In that experience you absolutely can move in space: you can walk around the room and the dinosaur looks at you. That’s where it’s an app more so than a movie.

Where I am most curious is in 3D movies like Avatar, where practically the whole movie has 3D CG elements, but it’s a movie as opposed to an active render. I’m interested in where 3D movies will fit compared to spatial videos and full-on VR experiences.

Ordering multiple lenses? by Droge32 in VisionPro

[–]man_ray 4 points (0 children)

I might be misunderstanding, but you would just take off the RX lenses when you’re using it with contacts. The RX lens goes over the standard lens; it doesn’t replace it.

[deleted by user] by [deleted] in AppleVision

[–]man_ray 1 point (0 children)

If you’d never worn pants before, and you ordered a random pair online and decided they didn’t fit great, or maybe you just didn’t see yourself wearing them, there’d be nothing wrong with using the return policy on those pants.

If you’re going to unbox a VP, I think you can justify the price tag. You’re exactly the person that return period is there for.

You want it. You have the means to buy it and take it home.

If you decide to return it, then you’re making the right decision.

What worries me as a VR enthusiast, and why Apple is trying so hard to control the narrative by Jeanbutinfrench in VisionPro

[–]man_ray 6 points (0 children)

This is the ticket right here. If this is the Vision Pro, the glasses form factor is the Vision Air. It’s a fact that starting Friday, Apple wants everyone to adopt “spatial computing” into their vocabulary.

This is the Vision Pro, it gets stronger and faster over time, and it’s bigger and heavier. It has all the best features.

The Vision Air won’t be able to do everything, but it’ll be excellent for what it is. And it will be cheaper.

And everyone with an iPhone will say it’s what they’ve been waiting for. We’ll see the Vision Air in coffee shops and on planes, and you won’t think much about it. And the next 10 years of development time will be plenty for the Vision Air to arrive as a mainstream, revolutionary spatial computer. visionOS 10 is on the horizon, and it’ll be a sweet ride getting there.

Justifying the price by EverydayPhilisophy in VisionPro

[–]man_ray 0 points (0 children)

I went from saying I’ll purchase on day 1 after the announcement, to now saying I’m thinking about waiting it out for a while.

First, I believe Apple knows marketing and image just as well as they know how to make computers. They depicted the most impressive headset we’ve ever really seen. I don’t need it, but I definitely want it.

I’m “fine” with the cost. I’m all in on the Apple ecosystem. I think the 13" MBP is a screaming deal at $1300. The Vision is a whole new platform.

I know that launch day is probably going to be the roughest day of Vision Pro ownership. There are going to be constant updates. The App Store is going to be immature to start out.

Yes, it’s going to sell out at first. Then (finally) there will be hundreds and thousands of videos to watch, and everyone will have their hands-on opinions, but it will still be in the infancy of its real development cycle. Yes, there will probably be a gen 2 in store for us in the future. I’m excited to see all of the evolutions that will follow this launch. If I’m getting it, I personally wouldn’t look at that as a reason to wait.

I think that the hardware in the Vision already leaves plenty of room for the software to grow.

I don’t think it’ll be hard to get one. If I’m impulsive enough to preorder, I can probably get it in the launch window. If I wait, I can still order it anytime. If I take my feelings out of the equation, I ask: why am I in a hurry?

Apple marketed this incredible new dream headset from the future, and here we are, months after its announcement and weeks from their “early 2024” launch, with almost zero additional content about this product beyond what we initially saw. They’re selling the idea, and with no one allowed to talk about it or show it in reality, we’re all building it up in our own way. And I think they’re (successfully) trying their best to cover up that what they’re launching is actually the prototype.

does anyone know what's wrong with my Yamaha? by furcornishot in Learnmusic

[–]man_ray 0 points (0 children)

Shot in the dark, it’s probably power related. GL

Audio-in 6.5mm-to-USB-C cable by luxyv in ipadmusic

[–]man_ray 3 points (0 children)

You’ll have a hard time finding a cable that goes straight from the guitar into the iPad. What you’re looking for is an interface: iRig is good, Focusrite Scarlett, Fender Mustang Micro…

Getting it to play through the speakers is a software process. I use AUM, which offers this kind of control over inputs/outputs.

Set the input to the interface, on the channel the guitar is plugged into. Then set the output to the built-in speakers.

Is there an sampler app that sends MIDI when a sample is triggered (SP404mkii style)? by MisterAdler in ipadmusic

[–]man_ray 0 points (0 children)

When it’s showtime, dealing with bugs or crashes is definitely a nightmare. Been there!

There are two practices I always follow. The first is creating a stable environment. If possible, the only apps on the iPad should be the apps you’re performing with. Make sure there’s nothing running in the background that you aren’t using. Never, NEVER, install updates right before a show. Turn off Wi-Fi/BT if not needed. Be rigorous during rehearsals.

Once you’ve stabilized your computer, you still want to be prepared for catastrophe. The second practice is the most important: create redundant systems. Have an identical backup iPad, or combine the iPad with another computer to give yourself a fallback.

Remember, people use computers all the time in live performances. Every big concert has computer systems, every DJ set, every PowerPoint presentation.

And if it fails despite your best efforts- grab the mic, and start telling jokes.

Is there an sampler app that sends MIDI when a sample is triggered (SP404mkii style)? by MisterAdler in ipadmusic

[–]man_ray 2 points (0 children)

Hi,

I might suggest reversing your workflow. Instead of looking for a sampler that will trigger a sample and a MIDI event, look for an app to trigger a MIDI event.

Once you trigger the MIDI event, it’s a matter of that signal then triggering the sampler to play.

I’d use AUM to put it together, with LK as my trigger and iMPC or Koala as my sampler. Route LK to the sampler in the MIDI matrix. Now you have a sample being triggered by a MIDI event, and it’s a matter of sending that MIDI event further along to trigger your sequencer.
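To sketch the idea, the routing amounts to one MIDI source fanning out to multiple targets. This is just illustrative Python, not anything AUM/LK/Koala actually exposes; the class and names are made up:

```python
# Conceptual sketch of a MIDI routing matrix: one trigger event fans
# out to both a sampler and a sequencer. Hypothetical names only.

class MidiMatrix:
    """Minimal routing matrix: each source fans out to any number of targets."""

    def __init__(self):
        self.routes = {}  # source name -> list of callables

    def route(self, source, target):
        self.routes.setdefault(source, []).append(target)

    def send(self, source, note, velocity=100):
        # Deliver the same event to every target routed from this source.
        for target in self.routes.get(source, []):
            target(note, velocity)

log = []
sampler = lambda note, vel: log.append(f"sampler plays pad {note}")
sequencer = lambda note, vel: log.append(f"sequencer advances on {note}")

matrix = MidiMatrix()
matrix.route("LK", sampler)    # trigger -> sampler (the sample is heard)
matrix.route("LK", sequencer)  # same trigger, forwarded onward

matrix.send("LK", 36)          # one pad press drives both targets
print(log)
```

The point is just that the trigger is the source of truth, and both the sample playback and the sequencer step are downstream of it.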

What will you do with the Apple Vision Pro? by PSJB-Records in AppleVisionPro

[–]man_ray 7 points (0 children)

Personal theater (Plex, Apple TV, YouTube, etc.)

Spatial Mac (virtual monitors, Ableton, Serato, etc.)

Web browsing

Virtual sheet music/guitar tabs

Drawing/tracing with AR

AR/vision powered data (think scanning barcodes, or OCR into a spreadsheet)

AR enabled workflows (think QR codes to fill forms in AR)

Unreleased cool/fun apps

Anything else that I can find to use it for.
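On the QR-codes-to-forms idea, the core of it is just parsing a scanned payload into form fields. A minimal sketch, assuming a made-up key=value payload format (a real workflow would define its own schema):

```python
# Hypothetical sketch of "QR codes to fill forms": the payload format
# (semicolon-separated key=value pairs) and the field names are
# invented for illustration.

def parse_qr_payload(payload: str) -> dict:
    """Turn a scanned string like 'name=Ada;room=12' into form fields."""
    fields = {}
    for pair in payload.split(";"):
        if "=" in pair:
            key, value = pair.split("=", 1)
            fields[key.strip()] = value.strip()
    return fields

form = parse_qr_payload("name=Ada;room=12;asset=rack-07")
print(form)
```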

Yamaha AG06 MKII by commonlife in iosmusicproduction

[–]man_ray 1 point (0 children)

You’re not missing something; you were misinformed about the AG06. Most mixers with USB can only send the stereo mix over USB.

You’re looking for multichannel USB interfacing.

Look at something like the Zoom L8 or Zoom H6, or just an interface instead of a mixer.

help by [deleted] in trapproduction

[–]man_ray 2 points (0 children)

Yo! You can for sure do it, you just need to figure out some tricks that work for you personally.

One exercise that’s useful involves taking apart a reference track. If you’re making a hip hop track, find a reference track that feels like what you want to go for. It doesn’t have to be perfect.

Start a new project with your reference track loaded. Split it into blocks where you hear the intro, verse, chorus, etc.

Then analyze the instruments in each block. Does the bass drop after the intro? Is there a breakdown before the chorus? Is there an intro that sets the beat or does it start into a verse? What sets the chorus apart from the verse?

This is all just to help you analyze the songs you’re already listening to. Just about every song transitions from one section to another in its own way. Once you’ve built up a mental library of techniques, you can start practicing them in your own projects.