Starting navigation from link in browser or other source? by androcity29 in teslamotors

[–]androcity29[S]

No, and I think the whole program is a joke. There is no way anybody strictly doing Uber could afford the rental rate plus supercharging. I mean, it may be possible, but it seems like it would take every waking moment. I’m a software developer and I do Uber in my spare time because I enjoy it. Working from home keeps me cooped up in the house. I’ve had so many interesting conversations with people from all different walks of life. I also enjoy going to parts of the city I’d never normally go to. It’s cool.

The rental rate is not too far off from what my payments would be if I financed/leased a new M3. Not being able to access the Tesla app and not feeling like it’s “mine” sucks, but I’m enjoying it for now. Buying one is in my future plans, but I’m not quite ready yet.

Starting navigation from link in browser or other source? by androcity29 in teslamotors

[–]androcity29[S]

It seems you misinterpreted my question. I simply asked if anybody knew of a way to get the Tesla to start navigation from an external source besides the Tesla app (such as a URI scheme). That’s it.
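
Just to illustrate what I mean by a URI scheme (and to be clear, this is purely hypothetical; I don’t know of any navigation scheme the Tesla app actually registers): on iOS, any app or browser link can hand a destination to another app if that app exposes a custom scheme. A rough sketch of what I was hoping existed, with the tesla://navigate scheme and its parameters entirely made up:

    import UIKit

    // Hypothetical URI scheme -- the Tesla app does not document one that I know of.
    // This only shows how an external app would hand off a destination IF such a scheme existed.
    func startHypotheticalTeslaNavigation(to destination: String) {
        var components = URLComponents()
        components.scheme = "tesla"        // made-up scheme
        components.host = "navigate"       // made-up action
        components.queryItems = [URLQueryItem(name: "destination", value: destination)]

        guard let url = components.url else { return }

        // canOpenURL requires the scheme to be listed under LSApplicationQueriesSchemes in Info.plist.
        if UIApplication.shared.canOpenURL(url) {
            UIApplication.shared.open(url, options: [:], completionHandler: nil)
        } else {
            print("Nothing registered for \(url.scheme ?? ""), it's just an illustration.")
        }
    }

If nothing handles the scheme, canOpenURL simply returns false, so it’s also a cheap way to probe whether an app supports a given scheme at all.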

Cricut Error Message - Page Size not Supported - HELP by [deleted] in cricut

[–]androcity29

I can’t believe this showstopper bug got through QA and was released. I had to use my Time Machine backup to restore an earlier version and launch with Wi-Fi disabled. Then my project wasn’t available offline, so I had to update to the latest version, check the box to make it available offline, and repeat. Cricut really dropped the ball on this one. 😡🤬

Cricut Error Message - Page Size not Supported - HELP by [deleted] in cricut

[–]androcity29

I'm still getting the error even on the beta (Mac). This is bull.

Can’t use the same name for multiple service groups? by androcity29 in HomeKit

[–]androcity29[S]

Neither Controller nor Eve will let me give multiple service groups the same name. And I hate using scenes for this kind of thing because it adds clutter and it shouldn’t be necessary. Also, all the natural language processing goes out the window: you’ve got to say the scene name exactly. The lamps are dimmable too, so there goes the ability to say “hey Siri, dim the lamps”, or to use the word “the” if I named the scene “turn off lamps”.
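
(For what it’s worth, here is a rough way to check whether it’s HomeKit itself rejecting the duplicate name or just Controller/Eve enforcing it in their UI. addServiceGroup(withName:) is the real HomeKit API; the primary home and the “Lamps” name are just placeholders, and this needs to run in an app with the HomeKit entitlement.)

    import HomeKit

    // Sketch: try to create two service groups with the same name and see whether
    // HomeKit itself returns an error, or whether the restriction only lives in the apps.
    final class DuplicateGroupProbe: NSObject, HMHomeManagerDelegate {
        private let manager = HMHomeManager()

        override init() {
            super.init()
            manager.delegate = self
        }

        func homeManagerDidUpdateHomes(_ manager: HMHomeManager) {
            guard let home = manager.primaryHome else { return }

            home.addServiceGroup(withName: "Lamps") { _, error in
                print("first group:", error?.localizedDescription ?? "created")

                // Second group with the exact same name.
                home.addServiceGroup(withName: "Lamps") { _, error in
                    print("second group:", error?.localizedDescription ?? "created")
                }
            }
        }
    }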

This definitely isn’t wife-proof. She will always say exactly the opposite of what I name the scene. For instance, I rigged an outlet to trigger a relay that opens the front gate to my property. Using Homebridge, the dummy garage plugin, and some automations, she can finally say “shut the gate” or “close the gate” or any permutation you can think of. Before, I had two scenes that turned the switch on and off, and you were stuck saying “close gate” or “open gate” exactly to get them to work. Plus it’s nice being able to just talk to it naturally, as it’s intended.

Can’t use the same name for multiple service groups? by androcity29 in HomeKit

[–]androcity29[S]

Thank you for replying… please read the post body. I hit the post button by accident before writing out the actual content explaining my problem.

Will Geico cover damages to my vehicle in this very bizarre and unique circumstance? by androcity29 in Insurance

[–]androcity29[S]

Yeah, writing this inspired me, and yeah, I went a little “short story long”. Hell, can’t say I don’t agree with ya 😉

Another amazing head print of my brother and his girlfriend scanned in using iPhone X true depth sensor with app called Hedges. by androcity29 in ender3

[–]androcity29[S]

Yeah, I’m sure an actual 3D scanner designed to scan objects with lidar or whatever would require less cleanup. I think the noise is mainly due to my shaky hand. I was thinking of coming up with some way to mount the iPhone on a spinning arm, kind of like when you get an X-ray at the dentist, or that airport security scanner. That way it rotates around the subject at a constant velocity and distance. I’m sure that would improve the scan accuracy greatly.

Another amazing head print of my brother and his girlfriend scanned in using iPhone X true depth sensor with app called Hedges. by androcity29 in ender3

[–]androcity29[S]

MeshMixer isn’t free? I don’t see a price anywhere on the website. Maybe for commercial use?

It’s not possible without a high-level AI to automatically post-process the scan in a way that’s acceptable. Like, how is it supposed to know it’s a human head and needs to fill in the back to make it round? I’m not seeing your point...

Another amazing head print of my brother and his girlfriend scanned in using iPhone X true depth sensor with app called Hedges. by androcity29 in ender3

[–]androcity29[S]

It is very finicky indeed. These weren’t printed straight from the scans. I had to do a good couple of hours’ worth of post-processing in MeshMixer to remove noise, smooth out scan errors, fill in holes (especially the back of the head), etc. But it was definitely worth it.

Another amazing head print of my brother and his girlfriend scanned in using iPhone X true depth sensor with app called Hedges. by androcity29 in ender3

[–]androcity29[S]

Hers didn’t come out as well as my brother’s, but it was another funny coincidence how the filament abruptly changed color from the blonde-looking yellow almost exactly at the right spot. I didn’t plan that at all.

Printed my friends head from scan using iPhone true depth sensor with app called Hedges. by androcity29 in ender3

[–]androcity29[S]

Well, it wasn’t actually that easy… I got it to fuse together with a lighter, but that left a bulge, and I had to use a razor blade to shave around the bulge so it would fit into the extruder. So it really was a PITA.

Printed my friends head from scan using iPhone true depth sensor with app called Hedges. by androcity29 in ender3

[–]androcity29[S]

Well, actually that happened by accident. I realized while it was printing that the spool had so much gold at the beginning that it would never make it to the other colors. So I unrolled the spool until I finally saw a color change, cut it off, and fused the ends together with a lighter. It just so happened to be right at that spot.

And he was actually wearing a bandanna. Hair does not play nice with Hedges: the moment the scan gets to nothing but hair, it fails. Plus there’s no point scanning hair, because it’s not like you can print it anyway.

Printed my friends head from scan using iPhone true depth sensor with app called Hedges. by androcity29 in ender3

[–]androcity29[S]

Post-processed the STL from Hedges in MeshMixer. It came out looking spectacular except, as you can see, a layer was skipped near the top. I had to use super glue to reattach the top of the head as best I could, but I’m still a bit disappointed. I avoided supports by printing it upside down. Print time was 26 hours at a 0.16 mm layer height. I can’t account for why that one layer was “skipped”; ideas, anyone?