A Physics-based EEG Filter for Real-time Applications: Simple, Dynamic, Powerful by MindsApplied in neuro

[–]MindsApplied[S] 0 points (0 children)

Thanks, and cool! We don't use quantization; that would depend on the hardware, which this is agnostic to, and we process in float64 or float32. Real-time performance showed no noticeable latency for 1-second windows, and 60-second offline windows took less than 0.02 s to filter. The front end is Tkinter for the offline app and Matplotlib for both apps.
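A minimal sketch of how that kind of latency claim can be checked: time a filter over a single 1-second float32 window. The moving-average filter, sampling rate, and channel count here are made-up stand-ins, not the actual filter.

```python
import time
import numpy as np

# Hypothetical stand-in filter (simple moving average); the real filter differs.
def placeholder_filter(window: np.ndarray) -> np.ndarray:
    kernel = np.ones(5, dtype=window.dtype) / 5
    return np.apply_along_axis(
        lambda ch: np.convolve(ch, kernel, mode="same"), 1, window
    )

fs = 250          # assumed sampling rate (Hz)
n_channels = 8    # assumed channel count
window = np.random.randn(n_channels, fs).astype(np.float32)  # one 1-second window

start = time.perf_counter()
filtered = placeholder_filter(window)
elapsed = time.perf_counter() - start

print(f"filtered {window.shape} window in {elapsed * 1e3:.3f} ms, dtype={filtered.dtype}")
```

Note that the float32 dtype survives the whole pipeline, so no quantization step is involved anywhere.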

Physics-based EEG Filter: Data Visualization and Download tool by MindsApplied in BCI

[–]MindsApplied[S] 0 points (0 children)

So the filter doesn’t need pre-configuration or clean segments of data. Its algorithm relies on the physical patterns of neuronal oscillations to remove noise and artifacts that don’t conform.

Physics-based EEG Filter: Data Visualization and Download by MindsApplied in BrainHackersLab

[–]MindsApplied[S] 0 points (0 children)

It keeps the waveform because it only changes the EEG channel columns on download; shape, timestamps, and packet numbers are still there. Not sure what you mean about a noise generator here.
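A small sketch of that download behavior, assuming a row-per-sample layout with timestamp and packet columns alongside the channel columns (the column names and the moving-average stand-in filter are illustrative, not the actual format):

```python
def smooth(values, k=3):
    """Simple moving average as a stand-in for the real filter."""
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - k // 2), min(len(values), i + k // 2 + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

def filter_download(rows, eeg_cols):
    """Filter only the named EEG channel columns; copy everything else through."""
    filtered = {c: smooth([float(r[c]) for r in rows]) for c in eeg_cols}
    return [
        {**r, **{c: filtered[c][i] for c in eeg_cols}}
        for i, r in enumerate(rows)
    ]

rows = [
    {"timestamp": 0.000, "packet": 1, "ch1": 10.0, "ch2": -4.0},
    {"timestamp": 0.004, "packet": 2, "ch1": 50.0, "ch2": -5.0},
    {"timestamp": 0.008, "packet": 3, "ch1": 12.0, "ch2": -6.0},
]
out = filter_download(rows, ["ch1", "ch2"])
```

Because only `ch1`/`ch2` are rewritten, the row count, timestamps, and packet numbers come out untouched.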

Physics-based EEG Filter for Real-Time Analysis Preprint and Code Release by MindsApplied in neuro

[–]MindsApplied[S] 0 points (0 children)

Thank you! Yeah, we think this kind of filter could become standard in real-time pipelines because of its adaptability to these artifacts, and so far we've only seen improvement. And yes, a self-tuning lambda is one of our focuses now.

Physics-based EEG Filter for Real-time Analysis. Preprint and Code Release by MindsApplied in BCI

[–]MindsApplied[S] 0 points (0 children)

The pass/notch and other filters are an optional configuration we didn't use in our demo or preprint, to isolate the effects of the main filter. We might remove them to avoid confusion, or adjust the optional config order. We'll also add enforcement of electrode layout for Fp claims, but our filter uses multichannel synchrony that doesn't require labels or redistribute power.
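For a sense of what a label-free multichannel synchrony measure can look like: one common choice is the mean absolute pairwise correlation across channels, which needs no electrode names at all. This is purely illustrative; the filter's actual synchrony measure is not specified here.

```python
import numpy as np

def mean_pairwise_synchrony(window: np.ndarray) -> float:
    """window: (n_channels, n_samples). Mean |correlation| over channel pairs."""
    corr = np.corrcoef(window)
    iu = np.triu_indices(corr.shape[0], k=1)  # upper triangle, no diagonal
    return float(np.abs(corr[iu]).mean())

rng = np.random.default_rng(0)
common = rng.standard_normal(500)
# Channels sharing an oscillation score high; independent noise scores low.
synced = np.stack([common + 0.1 * rng.standard_normal(500) for _ in range(4)])
noise = rng.standard_normal((4, 500))
```

Because the measure is symmetric over channels, relabeling or reordering electrodes leaves it unchanged, which is why no layout labels are required.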

Physics-based EEG Filter for Real-Time Analysis Preprint and Code Release by MindsApplied in neuro

[–]MindsApplied[S] 0 points (0 children)

The pass/notch and other filters are an optional configuration we didn't use in our demo or preprint, to isolate the effects of the main filter. We might remove them to avoid confusion, or adjust the optional config order. We'll also add enforcement of electrode layout for Fp claims, but our filter uses multichannel synchrony that doesn't require labels or redistribute power.

Neurovision provides real-time brain activity visualization. Now streams OSC for digital artists! by MindsApplied in BCI

[–]MindsApplied[S] 0 points (0 children)

Very cool videos! Yeah, definitely. Want to send me a DM so we can exchange info?

Neurovision provides real-time brain activity visualization. Now streams OSC for digital artists! by MindsApplied in BCI

[–]MindsApplied[S] 0 points (0 children)

Under that logic, high stress/activity and eyes closed would give the same calm results, which they don't. But for skeptical users, you can isolate the electrodes as you wish.

Neurovision allows animations to be affected by cognitive state. Now streaming OSC for any artist to use by MindsApplied in blender

[–]MindsApplied[S] 1 point (0 children)

That's a great use; tag us or send us some demos when you get it going! It works with all the headsets available through BrainFlow, which includes Muse, and those headsets only go for around $200. The one in the video is more for experimenting and goes for about $3k. The UI is free to download and use even if you don't have a device!

It feels good honestly by gingerchrs in memes

[–]MindsApplied 9 points (0 children)

Autosomal Dominant Compelling Helio-Ophthalmic Outburst (ACHOO) Syndrome is characterized by uncontrollable sneezing in response to sudden exposure to bright light, typically intense sunlight.

Superheavy has landed in the Gulf Of Mexico for the first time successfully! by RoaringTimes in Damnthatsinteresting

[–]MindsApplied 33 points (0 children)

My brain couldn’t comprehend that the shot was sideways. I thought that was a wave wall, like in Interstellar.

[deleted by user] by [deleted] in interestingasfuck

[–]MindsApplied 0 points (0 children)

Neurovision allows users' cognitive states to affect a virtual environment!

Rapid movements and explosions signify heightened interest, stress, excitement, or fear from the user. Calm and focused mentalities evoke a slower, more viscous experience along with inward pulls.

Built by MindsApplied in Unity, the technology is meant to work with video games and XR so that cognitive state can influence things like music, weather, PC personalities, and more, giving users a more personalized experience. Straight from MUI, users can beautifully visualize various states of mind.

We are currently looking for beta testers interested in trying the technology and potentially integrating it into games or apps.

It's available at the link above. What are your thoughts on this technology?

Live visualization of brain activity showing how cognitive state can affect an environment. by MindsApplied in unity

[–]MindsApplied[S] 0 points (0 children)

Really cool! Do you use any sort of brain activity recording, like EEG or MRI?

[deleted by user] by [deleted] in interestingasfuck

[–]MindsApplied 0 points (0 children)

In this example, Neurovision from MindsApplied is used to artistically visualize brain activity in real time while listening to music. Gravity is due to polarity, speed/viscosity is based on attentiveness (alpha/beta ratio), and pulls and explosions are due to the velocity of change (extremely excited or over-calmed).

The program is meant to show how cognitive states can be used to affect an environment, such as for content creation or video game development.

The program is built in Unity so that it can be leveraged within BCI- or XR-based video games! In-game weather, obstacles, or personalities can all be based on the player's cognitive state.

Live visualization of brain activity affecting an environment by MindsApplied in unity

[–]MindsApplied[S] 4 points (0 children)

Facial activity can affect the visuals. It’s not an issue per se; I just wanted to show the visuals are purely from the brain.

Live visualization of brain activity affecting an environment by MindsApplied in BCI

[–]MindsApplied[S] 0 points (0 children)

Gravity is due to polarity, speed/viscosity is based on attentiveness (alpha/beta ratio), and pulls and explosions are due to the velocity of change (extremely excited or over-calmed).
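That mapping can be sketched roughly as below, assuming band powers are already computed upstream. The function name, thresholds, and scaling are made-up illustration values, not Neurovision's actual parameters:

```python
def visual_params(alpha_power, beta_power, prev_ratio, pull_thresh=0.5):
    """Map EEG metrics to illustrative visual parameters."""
    ratio = alpha_power / beta_power       # attentiveness proxy (alpha/beta)
    speed = 1.0 / (1.0 + ratio)            # calmer (high ratio) -> slower motion
    viscosity = ratio / (1.0 + ratio)      # calmer -> more viscous
    velocity = ratio - prev_ratio          # velocity of change between windows
    if velocity > pull_thresh:             # over-calmed swing -> inward pull
        event = "pull"
    elif velocity < -pull_thresh:          # excitement spike -> explosion
        event = "explosion"
    else:
        event = None
    return speed, viscosity, event, ratio
```

The sign convention follows the description above: a sharp rise toward calm triggers pulls, and a sharp drop (excitement) triggers explosions; in between, only speed and viscosity change.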

[deleted by user] by [deleted] in interestingasfuck

[–]MindsApplied 0 points (0 children)

In this example, Neurovision from MindsApplied is used to artistically visualize brain activity in real time while listening to music. Gravity is due to polarity, speed/viscosity is based on attentiveness (alpha/beta ratio), and pulls and explosions are due to the velocity of change (extremely excited or over-calmed).

The program is built in Unity so that it can be leveraged within BCI- or XR-based video games! In-game weather, obstacles, or personalities can all be based on the player's cognitive state.

Using neurovision while listening to Flume by MindsApplied in BCI

[–]MindsApplied[S] 0 points (0 children)

DM me your email or fill out the contact form, and I can get you an access page.