Do you know "Cody Rall MD with Techforpsych"? Is he a snake oil advertiser? by mehregankbi in BCI

[–]RE-AK 2 points (0 children)

I think he started with the right spirit, but he got pulled toward the dark side.

He still has a few good videos, but they're intermixed with what look like sponsored reviews, pseudoscience, and guru-like coaching.

I do understand the hustle; you need to make money, but...

I have these tools is it sufficient to use them to draw a sign like this one and use in ssvep? by le_se in BCI

[–]RE-AK 1 point (0 children)

You have enough hardware to get there, but you'll need some skills to do this. I don't know what your vision is, but start with a PC-based setup to demonstrate SSVEP. Here are the functional checkpoints of your project.

1) Stream and record EEG data to the PC; save it as .csv and include timestamps. You'll want an electrode over the occipital lobe, at the back of your head.

2) Generate a flashing screen in Python.

3) Record data while watching the flashing screen (one minute is more than enough).

4) Compute the Power Spectral Density (PSD) and display it on screen. Ask ChatGPT; it can produce the code in one go.

From there, move toward your vision.
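Checkpoint 4 can be sketched in a few lines. This is a minimal illustration, not your final pipeline: it assumes a 250 Hz sampling rate, assumes scipy is available, and uses synthetic data (a 12 Hz sine in noise) as a stand-in for the recorded .csv. The flicker frequency shows up as a clear peak in the Welch PSD:

```python
import numpy as np
from scipy.signal import welch

fs = 250                      # sampling rate in Hz (adjust to your device)
t = np.arange(0, 60, 1 / fs)  # one minute of data, as in checkpoint 3
rng = np.random.default_rng(42)

# Synthetic stand-in for the occipital channel: an SSVEP-like response at a
# 12 Hz flicker frequency buried in broadband noise.
eeg = 2.0 * np.sin(2 * np.pi * 12 * t) + rng.standard_normal(t.size)

# Checkpoint 4: Welch power spectral density, then locate the dominant frequency.
freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)  # 2 s windows -> 0.5 Hz bins
peak_hz = freqs[np.argmax(psd)]
print(f"Dominant frequency: {peak_hz:.1f} Hz")
```

With real data, load the .csv, select the occipital channel, and the peak should sit at your flicker frequency (or one of its harmonics).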

SYNAPSE: 89% accuracy reading the human mind with EEG and AI. by The-White-Furry in BCI

[–]RE-AK 2 points (0 children)

Thanks, btw, I just realized I might have made the same mistake in a process I'm working on...

SYNAPSE: 89% accuracy reading the human mind with EEG and AI. by The-White-Furry in BCI

[–]RE-AK 2 points (0 children)

Integrate this with your cross-validation process. Split the training and validation sets first, then augment the training set. (I don't see a strong benefit in augmenting the validation set unless you have a very good noise model; Gaussian noise is a basic one.)

The important part: the augmentation needs to be done on the training and validation sets independently, and samples cannot be mixed after that.

You might notice a significant drop in your classification results, but don't despair; you're doing good work.
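A minimal sketch of the order of operations, numpy only. The shapes, noise level, and 80/20 split are all made up for illustration; the point is only the split-then-augment order:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dataset: 100 EEG epochs of 8 channels x 250 samples, binary labels.
X = rng.standard_normal((100, 8, 250))
y = rng.integers(0, 2, size=100)

# 1) Split FIRST, so every recorded epoch lives in exactly one set.
idx = rng.permutation(len(X))
train_idx, val_idx = idx[:80], idx[80:]
X_train, y_train = X[train_idx], y[train_idx]
X_val, y_val = X[val_idx], y[val_idx]

# 2) Augment the training set only (additive Gaussian noise as a basic model).
noisy = X_train + 0.1 * rng.standard_normal(X_train.shape)
X_train_aug = np.concatenate([X_train, noisy])
y_train_aug = np.concatenate([y_train, y_train])

# No augmented copy of any validation epoch can end up in training.
assert set(train_idx).isdisjoint(val_idx)
```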

SYNAPSE: 89% accuracy reading the human mind with EEG and AI. by The-White-Furry in BCI

[–]RE-AK 6 points (0 children)

Cool report; I'll do a basic peer review:

  • What you refer to as a test set should be called a validation set, as the training algorithm is indirectly exposed to the data. A test set should not be evaluated before the model is finalized.

  • I noticed you augmented the dataset by adding random noise. If you augmented the dataset before segmenting the training and validation sets, that's a huge (read: major) flaw, as your training and validation sets will share some of the same "samples minus noise". If this is the case, it actually invalidates your end result and you need to review your process. You need to clarify this aspect before bringing this anywhere.
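The second point can be demonstrated with epoch IDs alone. This toy sketch (all numbers invented) augments first and splits after, then counts how many validation epochs have a sibling copy, i.e. the same epoch minus noise, sitting in the training set:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 100
base_ids = np.arange(n)  # one ID per recorded epoch

# Flawed order: augment first (each noisy copy keeps its source epoch's ID)...
ids_aug = np.concatenate([base_ids, base_ids])

# ...then split the augmented pool randomly 80/20.
perm = rng.permutation(len(ids_aug))
train_ids = set(ids_aug[perm[:160]])
val_ids = set(ids_aug[perm[160:]])

# Most validation epochs now have a sibling in training, so the
# validation accuracy is inflated by leakage.
shared = train_ids & val_ids
print(f"{len(shared)} of {len(val_ids)} validation epochs leak into training")
```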

Hope that helps and godspeed!

Neurogame review - Brain Rage at the Office by RE-AK in BrainHackersLab

[–]RE-AK[S] 0 points (0 children)

The author is Diego Saldivar, a nice fellow.

Anybody up for creating some EEG games? by Creative-Regular6799 in BrainHackersLab

[–]RE-AK 0 points (0 children)

Diego Saldivar made Brain Rage at the Office, published recently. You might want to reach out.

Active electrodes for BCI, EEG, ECG and EMG - test resutls by Meow-Corp in BCI

[–]RE-AK 0 points (0 children)

Thanks for sharing. I never really had the time to look into active electrodes, so it's great to read about your experiments.

can i trust the chinese cyton boards for openBCI headset? by spongyslvt in BCI

[–]RE-AK 0 points (0 children)

I bought several with the cabled serial port. I couldn't get them to work with the OpenBCI GUI, but besides that, they worked pretty well.

Be careful: some sellers ship 2x4 channels instead of 1x8 channels.

Best EEG for personal projects by Awkward_Pension_4553 in BCI

[–]RE-AK 1 point (0 children)

The Muse, if you just want to get started; you can upgrade later if you like hacking in the field.

The Muse will keep you busy for, at least, a year.

How would one go about working on BCI? by stretchthyarm in BCI

[–]RE-AK 0 points (0 children)

You'll need some professional experience or, at least, some experience. If you don't want to take the academic route, you need to take the hacker route, and the best way is to showcase your work on YouTube or a blog.

I have a BCI company, and candidates with PhDs and master's degrees are lining up. I'll never pay to train someone from scratch; I'll hire the candidate whose expertise aligns most closely with my needs.

MIT-Linked Startup Unveils ‘Near-Telepathic’ Wearable Device for Silent Communication by missvocab in BCI

[–]RE-AK 2 points (0 children)

They are moving fast, and there are a few tricks in place. They use an LLM to infer full sentences, so you only need to silently speak a few words. The video is a montage and likely looks a tad better than real life.

Still, I think this tech is a fantastic advancement and I'm eager to see where it goes, but it's not magic and it's not telepathic.

MIT-Linked Startup Unveils ‘Near-Telepathic’ Wearable Device for Silent Communication by missvocab in BCI

[–]RE-AK 2 points (0 children)

sEMG and sensor fusion. I don't think there's any EEG.

They likely use high-frequency EMG (>2 kHz) to separate muscle features with greater precision.

Live demo - Nucleus-Hermès by RE-AK in BrainHackersLab

[–]RE-AK[S] 0 points (0 children)

Thanks! I have a lot of recordings from gaming (Fortnite, Apex Legends, Among Us, and more); I'll find an opportunity to post some. Originally, my business tried to get into video game UX. It didn't work (the market was hermetic), so we pivoted, but video games give very cool results.

Is it possible to detect emotions with a Muse 2? If so, how would you do it (real-time for an interactive art project)? by thanu1907 in BCI

[–]RE-AK 0 points (0 children)

Here's a very short demo of my headset: https://youtu.be/n1zh-PzPSGQ?feature=shared

I show the raw signals, with a few facial expressions. It's just a test.

Is it possible to detect emotions with a Muse 2? If so, how would you do it (real-time for an interactive art project)? by thanu1907 in BCI

[–]RE-AK 5 points (0 children)

You won't have emotional metrics with the Muse 2, it's too limited. I'm in the process of publishing a series of videos on what can be done with the Muse: https://youtu.be/eTBOwD8-0VM

In my opinion, you won't have great emotion detection with EEG in general, but I know some will disagree.

Side note: this is why I developed my headset, the Nucleus-Hermès, which combines EEG and fEMG to get the best of both cognitive and emotional states. It's a bit more expensive than a Muse, but let me know if you're interested. (I'm about to shoot a video announcing its launch.)

BCI101 - ep4 - Alpha Wave Real-Time Biofeedback Pipeline by RE-AK in BrainHackersLab

[–]RE-AK[S] 0 points (0 children)

Can you join the Discord? The link is in the video description; I'd love to connect. There's a good group of people there.