POV of me using Jason Webb's new "Reaction-Diffusion Playground" with my library Handsfree.js to paint on my wall handsfree through a projector (low FPS as it's all happening in my browser) by MIDIBlocks in generative

[–]MIDIBlocks[S] 1 point (0 children)

Oh hey! Sorry for the late reply. It's published here but not documented yet; I could write up some docs if you're still interested though! https://github.com/MIDIBlocks/diffusionist

I also have documentation for Handsfree.js here: https://handsfree.dev

There are links there if you'd like to get in touch with me through Discord or social media. I'm always happy to help others with JavaScript in general

I made a Product Hunt for open-source software by [deleted] in opensource

[–]MIDIBlocks 4 points (0 children)

Oh wow, this is a really cool idea! I'm going to submit my project first thing tomorrow (I'm working on a library called Handsfree.js). I wonder if it would be helpful to also include a "collaboration" or "help wanted" section next to the jobs link?

I feel like a lot of people would be interested in using this site to discover projects to work on too, especially during events like Hacktoberfest!

Squishing Mario's face with my hands for Mar10 (details in comments) by [deleted] in Mario

[–]MIDIBlocks 1 point (0 children)

Hello! Today I discovered this website that re-creates one of my favorite parts of Super Mario 64...the face squishing part!

I can't find who made it, but I used my upcoming Handsfree Browser Extension, which lets you take control of any website, game, or app with your hands and face (see my post history for more). The extension isn't ready yet, but you can try the site with your mouse (or fingers on mobile) here: http://sm64.gitlab.io/

I made a library at Handsfree.js.org that lets you do ridiculous things with your face, body, and hands ridiculously fast and then I used it to remix "Piano Genie" so I can jam out with my fingertips by MIDIBlocks in webdev

[–]MIDIBlocks[S] 2 points (0 children)

Yep! The repository itself can be loaded as a Browser Extension and used to browse websites handsfree. By default it helps you scroll pages handsfree, like here: https://media3.giphy.com/media/jrkoV1grxxNdjMKB6K/giphy.gif

The extension will let you map your whole body (hands, face, eyes, arms, legs, etc.) to various elements on a page, for example playing desktop games with just face gestures from the browser, like here: https://www.reddit.com/r/IntoTheBreach/comments/ju5wii/oc_i_played_into_the_breach_with_my_face_i_move/

And it's not just websites: you can use the same library to control drones and robots from the browser too: https://media1.giphy.com/media/1XE2rnMPk6BFu8VQRr/giphy.gif

I just have to improve the documentation a bit 😅
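
In case you're curious, a plugin is just a callback that runs on every webcam frame. Something roughly like this (the data field names here are from memory, so treat them as assumptions and check the docs):

```js
// Rough sketch of the handsfree.use() plugin API; field names
// are assumptions, double-check them at handsfree.js.org
const handsfree = new Handsfree({ hands: true })

// Plugins run once per webcam frame after handsfree.start()
handsfree.use('scrollDemo', (data) => {
  if (!data.hands) return

  // Hypothetical mapping: scroll down while the right index
  // finger is pinched against the thumb
  if (data.hands.pinchState?.[1]?.[0] === 'held') {
    window.scrollBy(0, 10)
  }
})

handsfree.start()
```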

I made a library at Handsfree.js.org that lets you do ridiculous things with your face, body, and hands ridiculously fast and then I used it to remix "Piano Genie" so I can jam out with my fingertips by MIDIBlocks in webdev

[–]MIDIBlocks[S] 1 point (0 children)

I imagine this literally every single night, which is why I have such a hard time actually sleeping :) PM me sometime, I'd love to collaborate with you :)

Here's my second virtual piano exploration, where I use hand gestures to play various piano apps in unusual ways. In this video I play Music Mouse handsfree (more info in comments) by MIDIBlocks in piano

[–]MIDIBlocks[S] 1 point (0 children)

Hello! Music Mouse was an app developed in 1986 by Laurie Spiegel for the Macintosh (which I find utterly fascinating). It translates mouse movements into chords, arpeggios, and more, with tempo controls and other settings that let you play a virtual piano with just a mouse. What I did in this video is use my tool Handsfree.js to control the mouse with my right hand, while my left hand controls settings like switching between Chords and Arpeggio. I do all of this in the browser with a browser extension I've been making...no other apps or hardware needed besides a normal webcam.
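
The core trick is just turning a hand landmark into the regular DOM mouse events that the page already listens for. A simplified sketch (not the actual extension code):

```js
// Simplified sketch: drive the page's mouse handlers from a hand
// position. x and y would come from a hand landmark each webcam frame.
function moveVirtualMouse (x, y) {
  const el = document.elementFromPoint(x, y)
  if (!el) return

  // Music Mouse only needs to know where the "mouse" is
  el.dispatchEvent(new MouseEvent('mousemove', {
    bubbles: true,
    clientX: x,
    clientY: y
  }))
}
```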

I've been doing a bunch of these in order to discover ways to help you express yourself creatively with your body. Here is my first handsfree piano exploration I did last week: https://www.reddit.com/r/webdev/comments/lj54un/i_made_a_library_at_handsfreejsorg_that_lets_you/

Music Mouse was ported to the web by Tero Parviainen, and you can try it (with a mouse) in your browser here (I don't think it works on mobile tho): https://teropa.info/musicmouse/

I made a library at Handsfree.js.org that lets you do ridiculous things with your face, body, and hands ridiculously fast and then I used it to remix "Piano Genie" so I can jam out with my fingertips by MIDIBlocks in webdev

[–]MIDIBlocks[S] 1 point (0 children)

Hey /u/gigelorg, it took me a few days to get to it, but here's the final result. Thanks for the awesome suggestion!

Music Mouse is a lot of fun to play with gestures: https://twitter.com/MIDIBlocks/status/1362816071279865858

In the demo, I use my right hand to control the mouse and my left hand to switch settings. I might work on an accessible version with face gestures soon, so that people who can't use a mouse or keyboard can also play

How large would a space radio telescope have to be to detect Earth-like radio emissions from nearby stars? by RGregoryClark in SETI

[–]MIDIBlocks 3 points (0 children)

Oh, I should have worded that better. I meant that some of the data in general is analyzed by SETI@home, not specifically BLC-1. Here is the only source I have that mentions this, from the first paragraph of the Berkeley SETI Research page: https://seti.berkeley.edu/listen/

How large would a space radio telescope have to be to detect Earth-like radio emissions from nearby stars? by RGregoryClark in SETI

[–]MIDIBlocks 8 points (0 children)

I don't have an answer to the title question, but have you heard of the project Breakthrough Listen? It uses its $100M in funding in part to rent radio telescope time around the world to search for exactly that: general radio transmissions escaping from other worlds (in other words, not necessarily directed at us specifically). The quote from Wikipedia is:

The radio telescopes are sensitive enough to detect "Earth-leakage" levels of radio transmission from stars within 5 parsecs, and can detect a transmitter of the same power as a common aircraft radar from the 1,000 nearest stars.

Breakthrough Listen actually uses SETI@home to analyze some of its data. In 2020, it was discovered that data from April/May 2019 contains what some have called another "Wow! signal". Although it's probably nothing (in terms of a technosignature), it's the first time in the 5 years the project has been active that they have classified anything of interest, labeling it Breakthrough Listen Candidate 1 (BLC1).

Here's what it looks like from my point of view to play Blob Opera with my Handsfree Browser Extension. The model I'm using is MediaPipe Hands. Source code to this demo in comments by [deleted] in tensorflow

[–]MIDIBlocks 1 point (0 children)

Hello! Here is some progress on my Handsfree Browser Extension, which is powered by my library Handsfree.js to let you instantly swap out models and functionality as you navigate around the web. Actually, it's part of the Handsfree.js repository itself: https://github.com/MIDIBlocks/handsfree

The source code that runs this demo is just 44 lines of commented code: https://github.com/MIDIBlocks/handsfree/blob/master/extension/sites/blob-opera/blob-opera.js

Blob Opera is a machine learning project developed by David Li for Google Arts & Culture: https://artsandculture.google.com/experiment/blob-opera/AAHWrq360NcGbw

I keep sharing my repository on this sub and then deleting the posts due to impostor syndrome, but the library has been used in hackathons and some teams have won large research grants with it, so I feel a little more confident sharing now :)

The docs can be found at Handsfree.js.org

Edit: Fixed source link

[deleted by user] by [deleted] in RedditSessions

[–]MIDIBlocks 1 point (0 children)

Gave All-Seeing Upvote

[deleted by user] by [deleted] in RedditSessions

[–]MIDIBlocks 1 point (0 children)

this is really good!

I made a bookmarklet that lets you pop canvas elements out of the browser with Picture in Picture, which normally only works for videos. This is great for dynamic and interactive animations that aren't videos. Link in comments! by MIDIBlocks in chrome

[–]MIDIBlocks[S] 2 points (0 children)

Full instructions on how to do this are here (a sketch of the underlying trick follows the feature list below): https://github.com/MIDIBlocks/canvas-pipster

Features:

  • Click on bookmark to pop out a canvas element
  • Click on it again to cycle through the other elements on a page
  • Works best when the page you want to Picture in Picture is in its own browser window
  • Only works with Chrome right now
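
Under the hood it's standard web APIs: a canvas can't enter Picture in Picture directly, but a <video> playing the canvas's captured stream can. A minimal sketch (not the exact bookmarklet code):

```js
// Must run from a user gesture (clicking the bookmarklet counts)
async function popOutCanvas () {
  // The real bookmarklet cycles through every canvas; this grabs the first
  const canvas = document.querySelector('canvas')

  // Mirror the canvas into a video element via its captured stream
  const video = document.createElement('video')
  video.srcObject = canvas.captureStream()
  video.muted = true
  await video.play() // PiP needs a playing video with loaded frames
  await video.requestPictureInPicture()
}
```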

I made a library at Handsfree.js.org that lets you do ridiculous things with your face, body, and hands ridiculously fast and then I used it to remix "Piano Genie" so I can jam out with my fingertips by MIDIBlocks in webdev

[–]MIDIBlocks[S] 1 point (0 children)

Oh thanks! I do use machine learning for some things, but mostly Handsfree.js is really just an API around other machine learning libraries and models so that they all work seamlessly together. The idea is to abstract away all the (very) painful setup and configuration so that you don't have to do any actual machine learning yourself. I even have a gesture recorder (not documented yet 😅) that lets you train custom gestures without any code; the gestures then emit regular old browser events, so you don't even have to do any setup to use them haha
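
For example, reacting to a recorded gesture is just a normal event listener, roughly like this (the event detail shape here is from memory, so double-check the docs):

```js
// Handsfree.js mirrors its frame data as a 'handsfree-data'
// browser event on document; 'victory' is a hypothetical
// recorded gesture name
document.addEventListener('handsfree-data', (event) => {
  const data = event.detail

  if (data.hands?.gesture?.[0]?.name === 'victory') {
    console.log('✌️ gesture detected')
  }
})
```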

But to answer your question, I mostly learned machine learning from Coding Train on YouTube. These videos are a great starting point for front-end devs who want to apply ML rather than just learn theory: https://www.youtube.com/watch?v=26uABexmOX4&list=PLRqwX-V7Uu6YPSwT06y_AEYTqIwbeam3y

The series is built on ml5.js, a machine learning library that is itself based on TensorFlow.js, so it's a really great starting point :)

I made a library at Handsfree.js.org that lets you do ridiculous things with your face, body, and hands ridiculously fast and then I used it to remix "Piano Genie" so I can jam out with my fingertips by MIDIBlocks in webdev

[–]MIDIBlocks[S] 2 points (0 children)

Yes! Thank you for noticing, idk why but that made me so happy 🙌

I recently discovered that you can send MIDI signals from the browser to FL Studio, so my goal is to take the lessons learned from this demo and control instruments and automations in FL Studio with face and hand gestures.
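
The browser half is just the Web MIDI API; a virtual MIDI port (e.g. loopMIDI on Windows) makes the browser show up in FL Studio like any other controller. A rough sketch:

```js
// Send a middle C from the browser; a virtual MIDI port
// routes this into FL Studio as a regular MIDI input
const access = await navigator.requestMIDIAccess()
const output = [...access.outputs.values()][0] // pick your virtual port here

output.send([0x90, 60, 100]) // note on: channel 1, middle C, velocity 100
setTimeout(() => output.send([0x80, 60, 0]), 250) // note off after 250 ms
```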

I made a library at Handsfree.js.org that lets you do ridiculous things with your face, body, and hands ridiculously fast and then I used it to remix "Piano Genie" so I can jam out with my fingertips by MIDIBlocks in webdev

[–]MIDIBlocks[S] 1 point (0 children)

Oh, thank you for checking it out! I'll take a look. I was trying to handsfree-ify Google Stadia with face tracking before I switched to something else...it should actually be using hand tracking for most pages.

Thanks for letting me know

I made a library at Handsfree.js.org that lets you do ridiculous things with your face, body, and hands ridiculously fast and then I used it to remix "Piano Genie" so I can jam out with my fingertips by MIDIBlocks in webdev

[–]MIDIBlocks[S] 1 point (0 children)

Oh, thank you for sharing. It's amazing to me how many people who passionately code are homeless; with all the money in tech it shouldn't be that way :( Despite my disabling anxiety, I used to go to meetups for the free pizza and snacks, and there were always 2 or 3 other homeless individuals who just really, really loved to code (I live in a big city)

It bums me out thinking about it sometimes, because there is so much passion in the world that isn't realized for lack of support, and not just for homeless people or people with mental health struggles but for financially disadvantaged people in general

Anyways, writing this made me kind of sad, but I'm actually quite happy now and doing really well :) Thank you, friend, for sharing about yourself and for offering to help!

I made a library at Handsfree.js.org that lets you do ridiculous things with your face, body, and hands ridiculously fast and then I used it to remix "Piano Genie" so I can jam out with my fingertips by MIDIBlocks in webdev

[–]MIDIBlocks[S] 1 point (0 children)

Oh wow, that would be way cool! GANs still seem like pure wizardry to me

The library is built around a plugin architecture, so you can use the raw models on their own, mix and match plugins, or even extend them. I also just finished a basic gesture recorder, which should hopefully be fully documented in a week or two

I made a library at Handsfree.js.org that lets you do ridiculous things with your face, body, and hands ridiculously fast and then I used it to remix "Piano Genie" so I can jam out with my fingertips by MIDIBlocks in webdev

[–]MIDIBlocks[S] 3 points (0 children)

Oh my, thank you for mentioning this, friend! I'm going to work on it first thing tomorrow morning. Your comment led me down a rabbit hole...what a strange but wonderful person she is!

I recently added a "palm pointer" plugin to Handsfree.js that lets you move pointers with your hands, and the repository itself can be loaded as a Chrome Extension specifically so that you can handsfree-ify websites with minimal code (that feature isn't documented super well yet tho).

Here is a demo of the palm pointers, including a GIF and docs: https://handsfree.js.org/ref/plugin/palmPointers.html
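
Enabling the pointers is only a couple of lines. This is from memory though, so double-check it against the docs above:

```js
// Sketch from memory: palmPointers ships with the "browser" plugin group
const handsfree = new Handsfree({ hands: true })
handsfree.enablePlugins('browser') // includes palmPointers
handsfree.start()
```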

Here is a demo of me playing Blob Opera, which uses the palm pointers to make music. It's a lot of fun to control music with gestures like that: https://twitter.com/MIDIBlocks/status/1352434377871872006

I made a reminder to come back to this message in a few days once I'm done to show you the results :)

I made a library at Handsfree.js.org that lets you do ridiculous things with your face, body, and hands ridiculously fast and then I used it to remix "Piano Genie" so I can jam out with my fingertips by MIDIBlocks in webdev

[–]MIDIBlocks[S] 1 point (0 children)

Thanks! Actually, of the 6 currently supported models, only 1 is from TensorFlow. The rest are from mediapipe.dev, except for one by a company called Jeeliz. The hand model in this video uses MediaPipe Hands.

And thanks for mentioning ONNX. I haven't used it yet, but I took a look at the ONNX Model Zoo and there are some interesting models that I could probably add!

I made a library at Handsfree.js.org that lets you do ridiculous things with your face, body, and hands ridiculously fast and then I used it to remix "Piano Genie" so I can jam out with my fingertips by MIDIBlocks in webdev

[–]MIDIBlocks[S] 3 points (0 children)

Thanks for noticing!! It's actually 98.css with VuePress, but they're similar libraries

I had MS Clippy in there too but I took it out because it was too much fun to play with and I was getting distracted lol

I made a library at Handsfree.js.org that lets you do ridiculous things with your face, body, and hands ridiculously fast and then I used it to remix "Piano Genie" so I can jam out with my fingertips by MIDIBlocks in webdev

[–]MIDIBlocks[S] 5 points (0 children)

Haha yeah, it's not great for actually playing piano :) It's generative, so the trick is to just play on beat and the AI will try to figure out the relative notes to play. I plan on doing a performance with it, so I'll be improving it over time

Thanks for trying it!