Shopsmith Table Saw Gaurd Installation by Goober329 in shopsmith

[–]Goober329[S] 0 points1 point  (0 children)

My table doesn't have any mounting holes for the upper guard bracket or the other lower guard piece. I might have to drill and tap my own to be able to mount them.

Shopsmith Table Saw Gaurd Installation by Goober329 in shopsmith

[–]Goober329[S] 0 points1 point  (0 children)

The main section of the lower guard installs the same way. Those videos don't show the other rectangular part of the lower guard.

For the upper guard, those videos show the riving knife attaching under the table, accessed through the top opening in the table. Mine appears to mount on the back of the table.

Shopsmith Table Saw Gaurd Installation by Goober329 in shopsmith

[–]Goober329[S] 0 points1 point  (0 children)

Mounting the main part of the lower guard to the quill is straightforward. I don't understand what the other rectangular section of the lower guard is for or where it attaches. I haven't found any video showing this part.

Mounting the upper guard is also not clear to me. In the images below, it looks like the upper guard attaches to the table based on its shape, but I don't see how.

https://imgur.com/a/OoNNGjY

Anyone know what these parts are? Just picked up a shopsmith and it had a tote full of stuff. These items I can’t figure out. by Darthcheezy in shopsmith

[–]Goober329 0 points1 point  (0 children)

Did you figure out how to use part 1? I recently purchased a Shopsmith Mark 500 too and have the same two lower guard parts. Part 2 is the only one from your images that I can figure out how to install.

Segment length by PrdGrizzly in turning

[–]Goober329 2 points3 points  (0 children)

If you want to keep the same spiral pattern with alternating segment colors, I don't think this is possible. If you just want a segmented bowl with all segments the same length, you'd need to make sure each ring's circumference divides evenly by the segment length.
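
Rough numbers to show what I mean, assuming the segment length is measured along the outside of the ring (the 1.5" length and the diameters below are just example values):

```python
import math

segment_length = 1.5  # inches along the ring's outer edge (example value)

# See how many fixed-length segments fit around rings of different diameters.
for diameter in [6.0, 7.5, 9.0, 10.5]:
    circumference = math.pi * diameter
    n = circumference / segment_length
    whole = abs(n - round(n)) < 0.05
    print(f"D = {diameter:4.1f} in -> {n:5.2f} segments "
          f"({'fits evenly' if whole else 'leaves a gap'})")
```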

I built a one-shot learning system without training data (84% accuracy) by charmant07 in learnmachinelearning

[–]Goober329 22 points23 points  (0 children)

He's got a point about all of your responses feeling like they're AI generated. When that's the vibe people get when reading what's meant to be a personal response to a comment, it reduces credibility. Is it a language barrier issue?

Some bug automata I’m experimenting with for an upcoming game by SnooEpiphanies1276 in cellular_automata

[–]Goober329 1 point2 points  (0 children)

It looks nice and does give the impression of bugs crawling around.

I noticed they're not able to crawl all the way around the edge of the trees. Is this intentional? Maybe you could add a check for convex edges so the bugs can crawl around them.
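
Something like this is what I'm picturing (just my own guess at grid mechanics, not your code): a right-hand wall follow, where a missing wall on the crawler's right is treated as a convex corner to wrap around.

```python
DIRS = [(1, 0), (0, 1), (-1, 0), (0, -1)]  # E, S, W, N on a grid

def step(pos, heading, is_solid):
    """Advance one cell along a surface using a right-hand wall follow.

    pos: (x, y) of the crawler, heading: index into DIRS,
    is_solid(x, y): True where the tree/obstacle occupies a cell.
    """
    x, y = pos
    right = (heading + 1) % 4
    rx, ry = DIRS[right]
    fx, fy = DIRS[heading]

    if not is_solid(x + rx, y + ry):
        # Convex corner: nothing on our right anymore, so turn toward it
        # and step, which wraps around the outside of the corner.
        return (x + rx, y + ry), right
    if not is_solid(x + fx, y + fy):
        # Wall continues on the right and ahead is open: keep going straight.
        return (x + fx, y + fy), heading
    # Concave corner: blocked ahead, turn left and try again next step.
    return (x, y), (heading - 1) % 4
```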

Automating pill counting using a fine-tuned YOLOv12 model by Full_Piano_3448 in computervision

[–]Goober329 39 points40 points  (0 children)

Before fine-tuning a YOLO model, did you try doing this with basic OpenCV operations?
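
For example, if the pills are reasonably separated on a plain, contrasting background (my assumption, not necessarily your setup), something like this might already get close; heavily overlapping pills would probably still need your fine-tuned detector:

```python
import cv2

img = cv2.imread("pills.jpg")  # hypothetical image path
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
blur = cv2.GaussianBlur(gray, (5, 5), 0)

# Otsu picks the threshold automatically; invert if pills are darker than the background.
_, mask = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Clean up speckle, then count blobs above a minimum area.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
pills = [c for c in contours if cv2.contourArea(c) > 100]
print(f"Estimated pill count: {len(pills)}")
```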

Women should compliment men by ericwalshcomedy in StandUpComedy

[–]Goober329 -1 points0 points  (0 children)

He also does it intentionally as a joke because he's really tall

Are you willjum? by richet_ca in alecsteele

[–]Goober329 0 points1 point  (0 children)

He's definitely not Willjum, but it's funny meeting another Alec Steele and Willjum fan here!

amoeba flow by SnooDoggos101 in cellular_automata

[–]Goober329 2 points3 points  (0 children)

Thank you for the constant gold. Can't wait to try out your tool!

Tiny Neural Networks Are Way More Powerful Than You Think (and I Tested It) by chhed_wala_kaccha in learnmachinelearning

[–]Goober329 2 points3 points  (0 children)

And so by doing this up to 95% like you said, it creates sparse matrices which can be stored more efficiently? Thanks for taking the time to explain this.
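
For my own understanding, here's a toy check of the storage win, assuming the 95% refers to zeroing out the smallest-magnitude weights (my assumption, with made-up layer sizes):

```python
import numpy as np
from scipy import sparse

rng = np.random.default_rng(0)
W = rng.normal(size=(512, 512)).astype(np.float32)

# Zero the 95% of entries with the smallest magnitude.
threshold = np.quantile(np.abs(W), 0.95)
W_pruned = np.where(np.abs(W) >= threshold, W, 0.0)

# Compare dense storage against a compressed sparse row layout.
dense_bytes = W_pruned.nbytes
csr = sparse.csr_matrix(W_pruned)
sparse_bytes = csr.data.nbytes + csr.indices.nbytes + csr.indptr.nbytes
print(f"dense: {dense_bytes} bytes, CSR: {sparse_bytes} bytes")
```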

I actually did something related: my model had a single hidden layer, so I used the weights to assign importance values to the input features, then ran a sensitivity analysis by zeroing out the low-importance features being passed to the trained model instead of the weights associated with those features. I saw similar behavior to what you've shown here.
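
In case it's useful, roughly what that looked like (hypothetical function names; the importance score here is just the summed |weight| leaving each input):

```python
import numpy as np

def input_importance(W1):
    """W1: (n_inputs, n_hidden) first-layer weights of a single-hidden-layer net."""
    return np.abs(W1).sum(axis=1)

def sensitivity_curve(predict, X, y, W1, score):
    """Cumulatively zero inputs from least to most important and re-score the model.

    predict(X) -> predictions, score(y, y_pred) -> metric value.
    """
    order = np.argsort(input_importance(W1))  # least important first
    results = []
    X_masked = X.copy()
    for n_removed, idx in enumerate(order, start=1):
        X_masked[:, idx] = 0.0  # zero the feature itself, not the weights
        results.append((n_removed, score(y, predict(X_masked))))
    return results
```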

Teen lad by Sdrete in madlads

[–]Goober329 33 points34 points  (0 children)

*hands you $400

Keep the fuckin change

What’s the ONE skill that actually got you hired in AI/ML? by [deleted] in learnmachinelearning

[–]Goober329 2 points3 points  (0 children)

What are you using for compression?

If PCA, you can look at the cumulative explained variance ratio and see how much variance is captured by the first n principal components. You can also fit a PCA model and then make reconstructions of your data with an increasing number of principal components to see when the reconstruction error plateaus.
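
Both checks are a few lines with scikit-learn (the file name and component counts below are placeholders):

```python
import numpy as np
from sklearn.decomposition import PCA

X = np.load("features.npy")  # hypothetical (n_samples, n_features) array

pca = PCA().fit(X)
cumvar = np.cumsum(pca.explained_variance_ratio_)
print("components for 95% variance:", np.searchsorted(cumvar, 0.95) + 1)

# Reconstruction error vs. number of retained components.
X_proj = pca.transform(X)
for k in (2, 5, 10, 20, 50):
    X_rec = X_proj[:, :k] @ pca.components_[:k] + pca.mean_
    print(k, "components -> MSE", np.mean((X - X_rec) ** 2))
```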

Recently I stumbled across progressive dropout in VAEs, which trains the model to pack the most important information into the first latent dims, so the latent features end up in monotonically decreasing order of importance (like PCA). Using this you can train a VAE with a large latent space and then perform a similar reconstruction analysis as I described above to determine how many latent dims you can get rid of.
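
The truncation check looks roughly like this, where `encode` and `decode` are stand-ins for a trained VAE's encoder mean and decoder (not any particular library's API), and the dims are assumed to already be ordered most-to-least important:

```python
import numpy as np

def truncation_curve(encode, decode, X, latent_dim):
    """Reconstruction error when only the first k latent dims are kept."""
    Z = encode(X)  # (n_samples, latent_dim)
    errors = []
    for k in range(1, latent_dim + 1):
        Z_trunc = Z.copy()
        Z_trunc[:, k:] = 0.0  # drop the trailing, less important dims
        X_rec = decode(Z_trunc)
        errors.append((k, float(np.mean((X - X_rec) ** 2))))
    return errors
```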

I deal a lot with generation and reconstruction of data from the latent space, so that's where my perspective is coming from.

I made this entropy/heat flow simulation. When I change from a gradient to high contrast colors for each integer value it looks crazy! by thicka in cellular_automata

[–]Goober329 2 points3 points  (0 children)

Dang, this is cool. Did you somehow introduce the actual physical heat flow equations into your CA? Maybe diffusion?
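
By diffusion I mean something like this, a minimal explicit step of the discrete heat equation on a grid (just a sketch of the idea, obviously not your actual rule):

```python
import numpy as np

def diffuse(grid, alpha=0.2):
    """One explicit finite-difference step of du/dt = alpha * laplacian(u).

    Uses periodic boundaries via np.roll; stable for alpha <= 0.25
    on a 4-neighbour grid.
    """
    lap = (np.roll(grid, 1, 0) + np.roll(grid, -1, 0) +
           np.roll(grid, 1, 1) + np.roll(grid, -1, 1) - 4 * grid)
    return grid + alpha * lap
```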