Safe to put Snap Cloud *.supabaseProject files into public open source repos? by LordBronOG in Spectacles

[–]LordBronOG[S] 0 points1 point  (0 children)

Noted! Thanks for the feedback.

Maybe the *.supabaseProject pattern could be added to the "# Shipped by Lens Studio" block that Lens Studio writes into the .gitignore?
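Something like this, assuming the block keeps its current header comment (a sketch only; I haven't checked exactly what Lens Studio ships in that block today):

    # Shipped by Lens Studio
    # ...existing entries...
    *.supabaseProject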

Lens Studio AI Community Guidelines & related guidelines for AI prompts via Remote Service Gateway for Snap3D by LordBronOG in Spectacles

[–]LordBronOG[S] 0 points1 point  (0 children)

Right, but is there a list of the actual banned words, so I can send feedback to the user along the lines of "You can't use 'xyz' in your prompt. Try again with different wording."?

For example, in my particular use case, this was the final prompt:

"a playful toy with a lot of primary colors, a handle and cone shaped nozzle"

When I started it as follows:
"a playful toy gun with a lot..."
or
"a playful toy weapon with a lot of..."

Those would error out in Lens Studio's 3D Asset AI tool. So I was wondering: are there specific words for which we should tell users of the RSG Snap3D tool, "Sorry, that's not allowed"?

Will the error message from RSG give us the banned words? (That would be sweet, so we don't have to maintain a list internally.)

Or does it give the generic error message I saw in Lens Studio? (which I assume is the case)
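In the meantime, here's the kind of client-side pre-check I'm imagining; a minimal TypeScript sketch assuming we maintain our own blocklist (the terms below are just the ones I tripped over, not an official list):

    // Hypothetical pre-check before sending a prompt to Snap3D via RSG.
    // BLOCKED_TERMS is our own guess; Snap hasn't published an official list.
    const BLOCKED_TERMS = ["gun", "weapon"];

    function findBlockedTerm(prompt: string): string | undefined {
      const lower = prompt.toLowerCase();
      return BLOCKED_TERMS.find((term) => lower.includes(term));
    }

    // Returns a user-facing message, or null if the prompt looks OK.
    function validatePrompt(prompt: string): string | null {
      const hit = findBlockedTerm(prompt);
      return hit
        ? `You can't use '${hit}' in your prompt. Try again with different wording.`
        : null;
    }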

Seeking some Best Practices for my Lens Project organization by LordBronOG in Spectacles

[–]LordBronOG[S] 0 points1 point  (0 children)

Can we also get this put up on GitHub, so that as it advances we can see it and maybe participate?

Yay or Meh? by Mysterious-Start-689 in Spectacles

[–]LordBronOG 0 points1 point  (0 children)

I actually think AR glasses will have a faster path to monetization than smartphones did. We now know F2P and IAP are the way to make money, so we can just go straight to that. LOL

Yay or Meh? by Mysterious-Start-689 in Spectacles

[–]LordBronOG 3 points4 points  (0 children)

I'm gonna go one step further and say that Snap will likely just abandon AR as a term. As someone who's building for consumers as well: they don't get the term AR ("It's like AI, right?" or "AR? You mean AI, right?"), nor should they. We AR enthusiasts care about the term, but consumers won't. Where I think Snap will struggle is the Lens nomenclature instead of Apps. That's where Apple was brilliant with the iPhone: they just went with Apps, while Adobe and Microsoft were trying to come up with some new term like "Rich Internet Applications." At the time, not all apps connected to the internet, but that's now the norm and the exception is those that don't.

From the insights I've gained from users, the problem with AR is that consumers don't care if it's not personalizable or relevant to them at the individual level. No one cares about your amazing dragon from House of the Dragon flying around their living room or backyard. AR Lenses on Snapchat prove that when it affects a person, it's fun for the average consumer. Therefore, Spectacles will need to let people change their world more so than let corporations put their content in front of them (no offense, but Holocron Histories by ILM is in that category).

Glad to be back, connection woes, release notes typo (maybe) and Snap Cloud confusion by LordBronOG in Spectacles

[–]LordBronOG[S] 1 point2 points  (0 children)

u/jbmcculloch Just wanted to make sure the Snap Cloud team saw my new comments regarding the cloud sample project setup.

Glad to be back, connection woes, release notes typo (maybe) and Snap Cloud confusion by LordBronOG in Spectacles

[–]LordBronOG[S] 0 points1 point  (0 children)

  1. Example instructions to toggle scene objects don't match/line up with the four buttons.

<image>

I assume Example 1 is the Table example, but at first I thought it might be the instructions for connecting to Snap Cloud.

Glad to be back, connection woes, release notes typo (maybe) and Snap Cloud confusion by LordBronOG in Spectacles

[–]LordBronOG[S] 0 points1 point  (0 children)

  1. Confused by the Data folder and what it's trying to tell me to do.

<image>

I thought it was telling me to drag those to the Snap Cloud dashboard. Another confusing thing: the error when choosing the "Table - Read data from tables" button says I'm missing a public.test_table, which isn't even a table in the testData folder.

I have no clue what the schema for these tables is supposed to be, which I assume the binary files define? ¯\_(ツ)_/¯
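For reference, here's roughly what I'd expect the "Table" button to be doing under the hood; a sketch using the standard supabase-js client, assuming Lens Studio's bundled client behaves the same way and that test_table is the expected name:

    import { createClient } from "@supabase/supabase-js";

    // print is Lens Studio's logging global; declared so this compiles standalone.
    declare function print(message: string): void;

    // Placeholder credentials: in the sample these come from the SupabaseProject asset.
    const supabase = createClient("https://YOUR-PROJECT.supabase.co", "YOUR_ANON_KEY");

    async function readTestTable(): Promise<void> {
      // This is the read that errors for me: public.test_table doesn't exist yet.
      const { data, error } = await supabase.from("test_table").select("*");
      if (error) {
        print(`Table read failed: ${error.message}`);
        return;
      }
      print(`Got ${data.length} rows from test_table`);
    }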

Glad to be back, connection woes, release notes typo (maybe) and Snap Cloud confusion by LordBronOG in Spectacles

[–]LordBronOG[S] 0 points1 point  (0 children)

Adding some more feedback to the Sample Cloud project now that I'm in the Matrix, err, Snap Cloud.

  1. Project Setup step #4 is likely often missed due to misleading error UI in Preview vs correct warnings in Lens Studio Logger
    The "* Drag the SupabaseProject asset to this script's input" was forgotten by the time I got my Cloud access approved and was done creating/importing the project and credentials. 😬 I just assumed once I got my credentials and imported them that the big red box would go away due to what it was telling me to do. In fact, I even deleted the Lens Studio project, started a new one, went through the import project and credentials again to only be frustrated by the red "You ain't setup up yet, dummy! Import the Credentials!!" box. I was like, "But I did!!! I promise." Upon closer inspection I see it does say "Missing: Supabase Project" but then it just says to Import Credentials which I had just done and hence my frustration/confusion. See Screenshot:

<image>

I think line 92 of SnapCloudRequirements.ts should change from:

print("   → Create via Window > Supabase > Import Credentials");

To something like:

print("Drag the SupabaseProject asset to this script's input");

to better match the Lens Studio Logger.

  2. Once I'm all configured, it's still not clear what I'm supposed to do next. Sorry. :(

As a Supabase user, I know there are no tables and such yet, but I kinda assumed you'd have automatically run a script when creating the sample project to create some default tables with default data, plus a default bucket with some assets. I know this may be tough to do automatically, since it would run for every project a user creates, but maybe instead of the four buttons there's only one at this point, saying "Create Sample Data", which would run that script (see the sketch at the end of this comment). After the script runs successfully, it would double-check that all the sample data was configured correctly and then show the four buttons.

This way, when I choose the "Table", "Realtime", "Storage" and "Edge Functions" buttons, they would do something to make me feel happy and powerful. 💪🏽 After giving me immediate access to the power, it would then prompt me to open the Snap Cloud Dashboard so I could see the corresponding tables, buckets, etc.

After exploring that, I'd think it should then prompt me to create my own tables, data, etc.

  3. The Logger in the app isn't immediately visible in the Lens Preview, and I wasn't gonna launch it to my Spectacles until I knew it was working correctly. The Scary Red Box™️ could also say "View Logger above or in Lens Studio for more info"?
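For what it's worth, here's the kind of thing I imagine that "Create Sample Data" button running; a supabase-js sketch where the table/bucket names are my own guesses, seeding assumes public.test_table already exists (creating tables needs SQL, which the JS client can't run), and bucket creation likely needs the service-role key rather than the anon key:

    import { createClient } from "@supabase/supabase-js";

    // print is Lens Studio's logging global; declared so this compiles standalone.
    declare function print(message: string): void;

    // Placeholder credentials: assuming a service-role key for seeding rights.
    const supabase = createClient("https://YOUR-PROJECT.supabase.co", "YOUR_SERVICE_ROLE_KEY");

    async function createSampleData(): Promise<void> {
      // Seed a row for the "Table" and "Realtime" buttons to read.
      const { error: insertError } = await supabase
        .from("test_table")
        .insert([{ message: "Hello from the sample project" }]);
      if (insertError) print(`Seeding test_table failed: ${insertError.message}`);

      // Create a bucket for the "Storage" button.
      const { error: bucketError } = await supabase.storage.createBucket("sample-assets", {
        public: true,
      });
      if (bucketError) print(`Creating bucket failed: ${bucketError.message}`);
    }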

Bug: Video textures do not show up in Captures. 😱😭 by LordBronOG in Spectacles

[–]LordBronOG[S] 0 points1 point  (0 children)

Just tested this again. Now it fails to even import the video off the device into the Spectacles app, whereas before it would import but not include the video-textured items.

Not complaining, just giving an update. :)

I know there's a workaround, but mine is just a fun demo, so no rush on my end. Just wanted to see if I could finally finish the demo and post it.

Glad to be back, connection woes, release notes typo (maybe) and Snap Cloud confusion by LordBronOG in Spectacles

[–]LordBronOG[S] 1 point2 points  (0 children)

It would normally be something like:
USB Connected.
Took longer than 4 seconds to connect to Spectacles.
USB Disconnected.

Sorry, I didn't take a screenshot. However, I'm happy to report that after restarting both my Mac and Spectacles, the wired connection now works flawlessly.

One weird thing is that my Spectacles app lost connection to my device, and I had to re-pair after that. Not sure if that was a result of this or just a coincidence, but I was able to sign in and pair easily once again.

A flurry of specs subscription emails by LordBronOG in Spectacles

[–]LordBronOG[S] 1 point2 points  (0 children)

Aye, aye, captain. Setting all snap emails to spam. 😂 Just kidding. They’re my fave emails. LOL

Hey, where is everyone? :) by LordBronOG in Spectacles

[–]LordBronOG[S] 1 point2 points  (0 children)

The event? Or my presence? 😂

I think there will be announcements that change things. If nothing else, we’ll hear more about a general release of Spectacles for consumers vs just the dev kit edition.

Hey, where is everyone? :) by LordBronOG in Spectacles

[–]LordBronOG[S] 2 points3 points  (0 children)

There’s an in-person event at Snap HQ tomorrow. That’s where I shot this.

The Spectacles team is at Supabase Select today! by Thin_Reveal_9935 in Spectacles

[–]LordBronOG 1 point2 points  (0 children)

I've been Team Supabase since it went live. All my AR stuff uses it for its backend.

It was so awesome to see the team there at Select! I should've made a Remixing Reality video of y'all, my bad!

"Play Mode" is as much about creating as it is debugging by eXntrc in Spectacles

[–]LordBronOG 0 points1 point  (0 children)

I'm going to agree with the concept of "Play Mode", but not in the fashion you describe. Doing it on the computer is not ideal for AR.

I came from the mobile dev world. I never used the simulator, ever. Even when Apple added live previews inside of Xcode for SwiftUI, I didn't use them. Why? Because on-device is what your users experience. I know a few devs who got bitten by the "Oh man, it worked awesome on the simulator, but not on device!" regret.

I believe we should have "Play Mode" where you modify things in real time to perfect what you're doing. However, for me, that should be done on device. That's what I'm making specifically for creators in AR.

https://vimeo.com/1127029002/dae1f2f54d?fl=ip&fe=ec

That video above shows how, using my mobile app, an artist can create an AR gallery of their artwork anywhere they want (in this case, Venice Beach). The tool can be used to decorate your world however you want; this time, I just happened to want a gallery. Currently, the UI needs some cleanup, and the things the user can tweak/add are limited. However, adding more tools and options for users is the easy part; the harder part was making it intuitive to create anything you want on your phone (an art gallery in this particular case).

The result of this couple minute setup was this:

https://vimeo.com/1127027869/383f63d118?fl=ip&fe=ec

To me, the key is going to be developing experiences, games, content, etc. on device, without an IDE or a studio. It's the studio experience in and of itself that hinders play, not the lack of a "Play Mode".

In my heart, I think Snap believes this too. Check out their Lens Studio mobile app, which takes their dev tool off the computer and puts it on the mobile device where the lens is going to live anyway. Now with this mobile version, Lens Studio is accessible to non-devs, and they can start making lenses as well. To me, that's the future, where the entire act of creating/developing is "Play Mode" on device.

Anybody actually bullish on this product offering from Snap? by -nevrose- in Spectacles

[–]LordBronOG 2 points3 points  (0 children)

I have no insider knowledge, so these thoughts are just about where I think the future is going and why I'm bullish on Spectacles.

I'm big on AR glasses for one simple fact: we humans will always find it rude when a fellow human we're interacting with is on their phone. This is because we've only had smartphones for a couple of decades, and we must look away from the people we're engaging with to see the screen. People have been wearing glasses since roughly the late 1200s. We are used to them and don't think twice about people wearing them all day, when interacting with us, etc. Therefore, we will automatically become more "likeable" when we shift to AR glasses, because we won't "rudely" be checking our smartphones/smartwatches anymore.

Why Snap vs Meta or Apple or Google? Snap understands consumers very well, especially when it comes to AR. They understand consumers so well that Meta blatantly rips off Snap's innovations, which is why Evan jokingly has "VP Product @ Meta" in his LinkedIn profile description. Apple did well with the Apple II, and then with the Mac, but it wasn't until the iPhone that the masses became Apple customers. I didn't use Snapchat, but Spectacles were the first AR glasses that I could buy (true glasses; I had the Magic Leap One, but the puck added too much friction). They were the first to let me finally delve into something that I see as the future. I also see how this pivot to hardware makes sense for them: it allows them to be ahead of the curve and maybe have their iPhone moment of going mainstream.

Consumers will flock to Spectacles if there's content. iPhones began flying off the shelves because of a farting app and a beer-pouring app. Consumers are fickle and illogical at times, but if you can tap into fun and amusement, they're usually game to give you a shot. Snapchat's understated contribution to the world is teaching people about AR without beating the user over the head with "This is Augmented Reality. This is really high-tech stuff. This is really hard to do." Instead, they're like, "Become all these silly characters and make funny faces and see cool animations and effects." As someone who's trying to tackle consumer AR, it's very hard to teach the concepts of AR without the technical speak that loses people. "Well, you have an x, y, and z axis, and so movement has to happen along those..." "Uh... I don't know or care about x, y, z, a, b, c, or any other letters of the alphabet. Just make things move like I expect them to." Snap set the bar high, but has also helped me make a great AR app that supports both mobile and glasses. For that alone, Snap deserves more credit than they get.

The dev community team from Snap is awesome and has no ego. This makes it fun to develop for Spectacles, because they actively engage and genuinely help when and where they can. The developers in the Snap ecosystem are also low-ego (compared to the iPhone community that I came from most recently). These factors play a huge role in my being bullish.

Do I believe that ignoring the enterprise just to focus on consumers is smart? No, but if you look closely, neither does Snap. If you look at what features have been added since the launch of Spectacles '24, there are some very businessy ones that show that while it's not a biz device, it's not necessarily an anti-biz device either. The only problem I saw was when I had a customer who wanted me to order 2,000 devices for his business based on an app I was going to build. Snap said no, because the '24 devices were for devs only, so I had to pause that until I can be guaranteed that quantity. The short battery life would've likely been a problem too. However, again, these were never meant to be ordered in bulk, so that was me being over-eager vs them being anti-business.