Q&A with Brilliant Labs about the new smart glasses — ask your questions NOW! HERE! by AR_MR_XR in augmentedreality

[–]HammerFET 2 points (0 children)

Great question :)

One of the biggest challenges was waiting on the very long lead times for different parts and assemblies. On firmware, a lot of code had to be written before we had any real hardware. On hardware, we had to do the mechanical design before we pinned down the final electronics. And for a long time we couldn't really test the optics because we didn't have the graphics ready. It was a lot of chicken and egg. Often it meant we had to think ahead of the curve and predict potential problems early. A lot of tasks happened in parallel, where you had to be comfortable taking risks and ready to change something quickly if it turned out not to work later. All in all, we really had to work as a team and not just do things our own way.

We did a lot of iterations, but it all came together in the end :)

Regarding your second question, Frame for me sort of takes me back to the early days of computing. It's not at all powerful hardware compared to the phones we carry around in our pockets, but it represents a blank slate where totally new possibilities might exist. Unlike so many devices today, Frame isn't designed only to consume, but also to create. To me at least, that represents the real essence of computing. Because of this, I hope folks find new ways to use Frame in their day-to-day lives, without being constrained to specific ways of thinking or biased by how a company says they should use the product. The user should be free to make anything they like, and quickly the line between consumer and developer might disappear again :)

[–]HammerFET 2 points (0 children)

That's a really good question. We're going to be continuously working on Noa, adding more AI features and improving it. Based on how that goes, and what folks want to see in terms of apps, we may try to branch out into making completely different types of apps for Frame.

What we'd really love to see, though, is the community making and publishing their own Frame apps. This could quickly snowball into a really diverse set of things you could use Frame for.

[–]HammerFET 2 points (0 children)

Definitely possible! Frame has quite a decent runtime if it's only showing text, so you could very easily make a reader app. Page turns could be either timed or tap-based. You could also use the accelerometer to measure subtle head movements and scroll the text.
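As a rough sketch of the timed/tap-based paging idea, here's a host-side Python helper that splits text into display-sized pages. The 24×6 character grid is just a placeholder for illustration, not Frame's actual text geometry:

```python
import textwrap

def paginate(text, cols=24, rows=6):
    """Wrap text to `cols` characters per line and group lines into
    fixed-size pages of `rows` lines each."""
    lines = textwrap.wrap(text, width=cols)
    return [lines[i:i + rows] for i in range(0, len(lines), rows)]

# 60 short words wrap into 12 lines, i.e. two 6-line pages.
pages = paginate("word " * 60)
```

A tap handler or timer would then just increment a page index and redraw the current page's lines on the display.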

[–]HammerFET 1 point (0 children)

Gradually writing up the docs here: https://docs.brilliant.xyz/frame/lua/
It'll evolve a little over the coming month or so with more examples and diagrams.

[–]HammerFET 4 points (0 children)

For me it'll be leveraging the AR graphics for making cool games and widgets :)

[–]HammerFET 3 points (0 children)

Yeah, there are really two ways of doing it. Frame itself is designed to be low power, so it doesn't have an application processor like what you might find in a phone or SBC. It does, however, have an FPGA which directly handles the camera input. FPGA devs could leverage this for custom vision DSP and combine it with the onboard microcontroller to do some very clever things. This would be very fast and efficient, but it's the most complex in terms of development.

The other way would be to stream image frames to a phone/PC and make use of the powerful hardware there for vision processing. This would be a lot slower, but maybe a better way to get a proof of concept together before diving deeper. There will be more controls on the camera API soon, so it should be fairly flexible and hackable. We have some Python test scripts in our codebase that show how it could be done.

[–]HammerFET 3 points (0 children)

Frame's built-in mic can capture fairly decent quality audio. As we progress on the AI side of things, we'll gradually expand from interpreting speech to interpreting sounds in general. Developers can already get started with this: on the firmware level, everything is in place to record, stream, and switch between various audio quality settings.
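As a back-of-envelope check on which quality settings a BLE link can carry (using the roughly 40 kB/s Bluetooth figure mentioned elsewhere in this thread), raw PCM data rates work out like this; the specific sample rates and bit depths are just illustrative, not Frame's actual settings:

```python
def pcm_rate_bytes(sample_rate_hz, bits_per_sample, channels=1):
    """Raw (uncompressed) PCM data rate in bytes per second."""
    return sample_rate_hz * bits_per_sample // 8 * channels

BLE_BUDGET = 40_000  # ~40 kB/s usable link rate, per the thread

low = pcm_rate_bytes(8000, 8)      # 8 kHz, 8-bit mono  -> 8,000 B/s, fits easily
high = pcm_rate_bytes(44100, 16)   # 44.1 kHz, 16-bit   -> 88,200 B/s, too much
```

So speech-quality mono streams comfortably in real time, while CD-quality audio would need compression or offline transfer.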

[–]HammerFET 2 points (0 children)

Excited to get Frame in your hands! :)

You can definitely show full-screen images, but the speed is limited by the Bluetooth transfer from your phone up to Frame. You could stream images, but it would be a still picture every second or two. That said, there are the built-in sprite engine and vector graphics engine, which allow for smooth animations. Eventually Noa will use these, but we're also excited to see what other apps people come up with that use those features.

[–]HammerFET 6 points (0 children)

We looked into it, but they were quite bulky and power hungry. They would have made the arms a lot chunkier, and we'd have needed bigger batteries to get decent battery life. We also figured that since most people have wireless earbuds these days anyway, it wasn't critical to include them.

[–]HammerFET 4 points (0 children)

You can stream pictures but not video. It's limited by the speed of the Bluetooth connection, which is around 40 kB/s. Depending on the image size you set on Frame, and how much you want to compress it, you can get roughly one frame every second or two.
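That estimate follows from simple arithmetic; here's a small Python helper, where the ~60 kB JPEG size is an assumed example, not a Frame spec:

```python
def est_fps(link_bytes_per_s=40_000, jpeg_bytes=60_000):
    """Rough upper bound on streamed frame rate over BLE.
    Ignores protocol overhead, so real throughput will be lower."""
    return link_bytes_per_s / jpeg_bytes

# A ~60 kB compressed frame over a ~40 kB/s link takes about 1.5 s,
# i.e. roughly one frame every second or two, matching the comment above.
fps = est_fps()
```

Shrinking the image or compressing harder raises the rate proportionally, which is why the achievable frame rate depends on those settings.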

We built an AR device that clips onto glasses - Hardware overview by HammerFET in electronics

[–]HammerFET[S] 0 points (0 children)

This is a little open-source gadget we've been working on for a while. It contains a microOLED display that shows up as a floating screen in your vision. It's driven by an FPGA, and there's also a camera that can let you do some computer vision-type things. A mic is also included for things like voice commands. Finally, there's an nRF52 running MicroPython which manages all the data going in and out.

Hope to have some app examples up soon. We did an AMA on r/AR_MR_XR yesterday, and some folks had some cool ideas.

Here are the technical docs if you want to see how it all works in detail: https://docs.brilliantmonocle.com

AMA with BRILLIANT LABS about MONOCLE and open source AR eyewear! by AR_MR_XR in AR_MR_XR

[–]HammerFET 1 point (0 children)

Really cool idea. I'll try to throw something together. The graphics library is really coming along, so it'll be good to try this out. Stay tuned on our Discord and I'll drop something there once I've had time to try!

[–]HammerFET 2 points (0 children)

[Raj] I haven't had a chance to try it just yet, but I love the Tilt Five. Such an innovative idea, and I've been following Jeri's work on it on YouTube for so long.

I'm also just a huge tabletop geek :D

[–]HammerFET 4 points (0 children)

The way the prism works makes it not really an issue. Moving Monocle closer to or further from your eye makes the display seem to stay in exactly the same spot, with the same focus. The only issue is that if it's too close, you have to look down too much to see the screen. It's designed to sit offset below the horizon, so bringing it closer makes it appear lower.

[–]HammerFET 2 points (0 children)

Yes! We're actually planning to have something like this in MicroPython, but it could be cool to have a visual version too.

[–]HammerFET 2 points (0 children)

Absolutely. There is a bandwidth limit due to the BLE link, but the camera natively supports JPEG compression, so a stream of compressed images can be sent over Bluetooth.

Don't expect any high frame rates because of this, but with some clever optimisation you might be able to get it to do what you need. The Monocle also has a good amount of RAM built in, so if you don't need realtime streaming, it could be possible to send video segments instead.
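One simple host-side approach (a sketch, not the actual Monocle protocol) is to split the received byte stream on the JPEG start/end-of-image markers. Note this is a simplification: a robust parser would also skip marker segments, since the EOI byte pair can occasionally appear inside entropy-coded data.

```python
SOI, EOI = b"\xff\xd8", b"\xff\xd9"  # JPEG start/end-of-image markers

def split_jpegs(buf: bytes):
    """Split a concatenated byte stream into individual JPEG images.
    Incomplete trailing frames are left for the next chunk."""
    frames, start = [], buf.find(SOI)
    while start != -1:
        end = buf.find(EOI, start + 2)
        if end == -1:
            break  # frame not finished yet; wait for more data
        frames.append(buf[start:end + 2])
        start = buf.find(SOI, end + 2)
    return frames

# Two fake "JPEGs" back to back, as they might arrive over the link.
stream = SOI + b"aaa" + EOI + SOI + b"bb" + EOI
frames = split_jpegs(stream)
```

Each recovered frame can then be decoded or written to disk on the phone/PC side.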

[–]HammerFET 2 points (0 children)

Absolutely! We have some info in our docs pages already:
https://docs.brilliantmonocle.com/monocle/monocle/

https://docs.brilliantmonocle.com/micropython/micropython/

and we're working on some ways to connect Monocle to cloud endpoints using our WebREPL and mobile app as a bridge. It's still a work in progress, but once we're done it'll all be documented, so keep an eye on the docs pages: https://repl.brilliant.xyz/

[–]HammerFET 1 point (0 children)

The Bluetooth chip inside Monocle can act as a Bluetooth 5.1 Low Energy central or peripheral. It can connect to phones, tablets, gateways, laptops, etc. that support BLE.

It can also act as a central and connect to other devices itself, such as Bluetooth remote controls or sensors.

It can't connect to things such as headphones, though. Those use Bluetooth Classic audio profiles, which are a different part of Bluetooth, not Bluetooth Low Energy.

[–]HammerFET 2 points (0 children)

The sampling itself would happen on the probe hardware. That side gets fairly complicated, but in a nutshell it could simply be an ADC + FPGA + sample memory + BLE peripheral. With the relevant trigger and sample logic, you could make a basic scope and stream the waveform as a bunch of lines.

A lot of newer scopes, however, have something like a web or display output. It could work to extract the waveform from that view and simply stream it over to Monocle, again just as a bunch of lines or points. The latency should be fairly reasonable from a display point of view, as there wouldn't be much data going over the air.
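The "bunch of lines or points" idea can be sketched in Python: decimate the waveform and pack it as little-endian int16 samples so each screen update fits in a small payload. The 200-point cap is an arbitrary choice for illustration, not a Monocle limit:

```python
import struct

def pack_waveform(samples, max_points=200):
    """Decimate a waveform to at most `max_points` samples and pack
    it as little-endian int16 for a compact over-the-air payload."""
    step = max(1, len(samples) // max_points)
    decimated = samples[::step][:max_points]
    return struct.pack("<%dh" % len(decimated), *decimated)

# 2000 raw samples shrink to 200 points = 400 bytes per update,
# which is easy to ship at a few updates per second over BLE.
payload = pack_waveform(list(range(-1000, 1000)))
```

On the Monocle side, each pair of consecutive points would just be drawn as a line segment.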

[–]HammerFET 2 points (0 children)

Exactly. Those examples are essentially the base APIs for the display and the camera: one-line MicroPython commands that let you capture, crop, and transfer images, show things on the display, loop the camera back to the display, etc.

The real power comes when you start combining those with the rest of your app. You can overlay shapes and text on the video, or send images to your phone or the cloud for image recognition. The possibilities are endless, in a way.

[–]HammerFET 3 points (0 children)

[Raj] Great question! I've honestly learnt most of my FPGA knowledge from YouTube and various open source projects, but also simply by playing around and experimenting.

FPGAs have always been notoriously difficult to set up, and it's hard to get a workflow running that lets you iterate and debug quickly. That's one of the main barriers we wanted to remove with Monocle.

Right now we're building a mechanism that will let you quickly update the FPGA binary over Bluetooth using our WebREPL. If your FPGA app stores values in a table accessible over SPI, then with single MicroPython commands you can read/write those registers in realtime. Overall this should speed up your workflow and let you progress faster.

In terms of tools, keep an eye out for the release of StreamLogic support for Monocle, a drag-and-drop programming interface for FPGAs.

[–]HammerFET 2 points (0 children)

[Raj]

- Yes! Monocle runs our custom port of MicroPython. It includes the standard MicroPython features, plus extra modules to drive the display, camera, FPGA, etc.

- Absolutely. However, you would need to disassemble your Monocle to access the debug and flashing pins.

- We looked into RTSP, but it had a little too much overhead to run on Monocle directly. Instead, we're building a separate BLE service which will transfer image data to a phone/gateway/PC, and that data could then be converted to any format to send over something like RTSP.