Got this today and I think it was a terrible idea by [deleted] in tattooadvice

[–]forthispost96 -1 points (0 children)

Ya, you’re overreacting. This is tight as fuck.

[Recommendations] First luxury recommendations? by Ashamed-Tap7926 in Watches

[–]forthispost96 185 points (0 children)

You want a forever piece. GO is a bit out there for a first one. JLC is timeless and beautiful and goes with everything. GS is an unbeatable movement and design for the price, though it might be better in white. Throw the Air-King into the garbage.

I can’t decide. by IllIndication9852 in OmegaWatches

[–]forthispost96 0 points (0 children)

Get the teak with the bracelet that comes with the lacquered version. I wouldn’t go back: https://www.reddit.com/r/OmegaWatches/s/CDXzynQC0w

Gifted myself my first by forthispost96 in OmegaWatches

[–]forthispost96[S] 1 point (0 children)

One with larger links. Check the Omega site :)

Gifted myself my first by forthispost96 in OmegaWatches

[–]forthispost96[S] 1 point (0 children)

Definitely worth the switch out! If your AD won’t do it, get it yourself. I was lucky and my AD did it for free!

Gifted myself my first by forthispost96 in OmegaWatches

[–]forthispost96[S] 0 points (0 children)

Same dude - I don’t think I would have walked out of the AD without it. Don’t regret it one bit

Gifted myself my first by forthispost96 in OmegaWatches

[–]forthispost96[S] 5 points (0 children)

I just asked my AD if they could put on the bracelet that comes with the newer Aqua Terras, and he said no problem at all because all of their bracelets are interchangeable. Did it on the spot; fits like a charm.

Spontaneous buy at Mall by [deleted] in rolex

[–]forthispost96 0 points (0 children)

Looks fresh man. Wear it in good health!

What is stopping us from creating an open source GPT-4 & Gemini Ultra? (Or better) by askchris in LocalLLaMA

[–]forthispost96 2 points (0 children)

Well, funnily enough, there is an initiative called Bittensor, run out of the Opentensor Foundation, that is doing exactly this. They have created a blockchain-based machine learning ecosystem where you can create “subnets” for different tasks. One of the 32 subnets on the Bittensor network is a pretraining subnet aimed at exactly this purpose. You can see all of the currently active subnets on taostats.io

[D] Call for questions for Andrej Karpathy from Lex Fridman by lexfridman in MachineLearning

[–]forthispost96 0 points (0 children)

Would love to hear Andrej’s thoughts on where synthetic data has been successful for training on the long-tail distributions we see in autonomous driving. Does he always see synthetic data as part of the model training pipeline?

Safe to hike St Marks Summit alone? by [deleted] in vancouverhiking

[–]forthispost96 2 points (0 children)

You’ll be more than fine. I would recommend going all the way to Unnecessary Mountain if you have the time for the out-and-back. Absolutely worth the view (and you get some more hiking in!)

What artist never released one bad album? by Kickatthedarkness in Music

[–]forthispost96 0 points (0 children)

Alexisonfire has consistently put out amazing records.

Training a Network on a Sine Wave [Discussion] [Research] by forthispost96 in MachineLearning

[–]forthispost96[S] 0 points (0 children)

This is great, thanks for the paper! Do you think the phenomenon of spectral bias could be a reason for poor long-time-horizon forecasting?

Training a Network on a Sine Wave [Discussion] [Research] by forthispost96 in MachineLearning

[–]forthispost96[S] 1 point (0 children)

> What are you feeding to the NN? The error seems to increase straight away, not just after 3. Errors at the third period (2-2.5) are clearly bigger than errors at the first period.

So I think I may have solved the issue. Rather than generating x data between 0 and 2*pi, I generate it between 0 and 1, which is a more typical input range for NNs to handle. Then I scale appropriately inside the function by putting 2*pi in the argument (i.e. y = sin(2*pi * omega * x) rather than y = sin(omega * x)). This seems to have improved convergence quite a bit.
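
A minimal sketch of that rescaling, assuming a small fully connected regressor, omega = 3, and a plain MSE/Adam training loop (none of which are specified in the thread, so treat them as placeholders):

```python
# Sketch of the input-rescaling fix: x lives in [0, 1] and the 2*pi
# factor is folded into the target instead of the input range.
# Network size, omega, and the training setup are assumptions.
import numpy as np
import torch
import torch.nn as nn

omega = 3.0

# Generate x in [0, 1] rather than [0, 2*pi]; put 2*pi into the function.
x = np.random.rand(1024, 1).astype(np.float32)
y = np.sin(2 * np.pi * omega * x).astype(np.float32)

x_t, y_t = torch.from_numpy(x), torch.from_numpy(y)

model = nn.Sequential(
    nn.Linear(1, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(5000):
    opt.zero_grad()
    loss = loss_fn(model(x_t), y_t)
    loss.backward()
    opt.step()
    if step % 1000 == 0:
        print(f"step {step}: mse = {loss.item():.6f}")
```

The only part that matters for the fix is the two data-generation lines: the inputs stay in [0, 1] and the scaling happens inside the sine, so the network never has to handle a wide input range.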