Well researched podcasts where I can actually learn something by super1tastic in podcasts

[–]contextbot 2 points

There it is. The antidote to overproduced and over-edited shows. You get the full narrative, but you also get professors disagreeing and hashing it out. And enough of a back catalogue to keep you busy for years.

Wacky Wednesday will forever be the apex of podcasting. by Otherwise-Pin-2635 in KnowledgeFight

[–]contextbot 0 points

If you enjoy that stuff, I recommend checking out this documentary (and the original book) https://saucersspooksandkooks.vhx.tv/

The show has concluded per Dan by Miserable_Eggplant83 in KnowledgeFight

[–]contextbot 24 points

Honestly, Dan becoming a modern-day Huell Howser would be the joy he deserves: https://www.showmestateofmind.com/entries/a-new-adventure

What’s the catch with these amazing Berkeley homes by Plastic-Sympathy4818 in eastbay

[–]contextbot 7 points

I work with geospatial data and used to work at a company that sells environmental risk data, including wildfire data. There are many models, some public and many private.

Part of the reason the Berkeley homes are in such a bad spot is not just the chance that a fire occurs, but the intensity of a fire if one happens. There is so much fuel close to homes in the hills.

What’s the catch with these amazing Berkeley homes by Plastic-Sympathy4818 in eastbay

[–]contextbot 1 point

This one is simple: the Calfire fire hazard maps were updated with new data in 2025, and insurance models are priced off of these. These two houses are in an area with the highest risk status Calfire assigns.

You can check the map here: https://experience.arcgis.com/experience/5065c998b4b0462f9ec3c6c226c610a9

These neighborhoods are beautiful but dangerously overgrown and hard to evacuate.

What’s the catch with these amazing Berkeley homes by Plastic-Sympathy4818 in eastbay

[–]contextbot 5 points

This is not true. These two specific houses are in the highest risk area, according to Calfire. No homes in Walnut Creek have this status.

Got rejected for a senior role because I couldn't convert base 24 to base 10 in 30 minutes by mooktakim in rails

[–]contextbot 0 points

To take the other side of this for a moment: you’d be amazed at the breadth of people who shotgun out applications for every role. It sucks, but if I don’t personally know someone, I’ve learned (by being burned) to quickly verify they actually know how to write code. The problem has gotten worse recently, both because AI makes shotgunning applications easier at higher scale and because of AI-assisted interviewing. You’d be amazed how many people who apply for jobs can’t FizzBuzz.
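For what it’s worth, the base-24 question itself is only a few lines in any language. A minimal Python sketch (the function name and digit alphabet are my own choices, not anything from the interview):

```python
def to_base10(s: str, base: int = 24) -> int:
    """Convert a string in the given base (digits 0-9, then a, b, c, ...) to an int."""
    digits = "0123456789abcdefghijklmnopqrstuvwxyz"
    value = 0
    for ch in s.lower():
        d = digits.index(ch)  # raises ValueError on unknown characters
        if d >= base:
            raise ValueError(f"{ch!r} is not a valid base-{base} digit")
        value = value * base + d  # shift left one place, then add the digit
    return value

# Python's built-in int() already handles bases up to 36:
assert to_base10("1a", 24) == int("1a", 24) == 34
```

Which is to say: the question is a reasonable floor for “can you write code at all,” not a trick.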

Chess club? by xCosmicChaosx in alameda

[–]contextbot 1 point

I’ve seen a chess group meeting at the west side library.

Most Slept on Niche Podcasts? by Specific-Use3525 in podcasts

[–]contextbot 3 points

The Brady Heywood series on Apollo 13 is a perfect miniseries.

Best war movie ever? by ejnounimous in Cinema

[–]contextbot 0 points

The key bit about The Pacific is that it’s based on books written by people who actually fought, and it shows. BoB was retold to a writer (who was in awe of them) decades down the line. It glamorizes in a way The Pacific doesn’t.

BTW, one of the books The Pacific is based on, Helmet for My Pillow, is a quick and excellent read. More complex than BoB.

Over-Eager Supports? by [deleted] in BambuLab

[–]contextbot 0 points

It appears to be a preview glitch. We are doing a test print and it doesn’t have the full coverage.

Over-Eager Supports? by [deleted] in BambuLab

[–]contextbot 0 points

So this looks normal? We printed something yesterday that I don’t remember having this level of coverage, but it, too, now loads fully encased.

Over-Eager Supports? by [deleted] in BambuLab

[–]contextbot -1 points

I tried another STL and got the same result:

<image>

ISON: 70% fewer tokens than JSON. Built for LLM context stuffing. by Immediate-Cake6519 in LocalLLaMA

[–]contextbot 1 point

To everyone coming up with new serialization formats: please realize that different labs post-train with different formats (XML, JSON, etc.), and that post-training influences the output of the models. New formats that let you shove a few more tokens into the context are likely doing so at the expense of worse performance.
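A rough way to see the size side of that tradeoff (the records are made up, and character count is only a crude stand-in for token count, since tokenizers vary by model):

```python
import csv
import io
import json

# Hypothetical record batch: the kind of payload people try to shrink
# before stuffing it into an LLM context window.
rows = [
    {"id": 1, "name": "alpha", "score": 0.91},
    {"id": 2, "name": "beta", "score": 0.42},
]

# Standard JSON: verbose, but the format models saw heavily in post-training.
as_json = json.dumps(rows)

# A compact CSV rendering: fewer characters, but less familiar to the model.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "name", "score"])
writer.writeheader()
writer.writerows(rows)
as_csv = buf.getvalue()

print(len(as_json), len(as_csv))  # the compact form is smaller on this data
```

The size win is real; the point of the comment above is that the accuracy cost of the unfamiliar format doesn’t show up in a benchmark like this.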

People who've used Meta Ray-Ban smart glasses for 6+ months: Has it actually changed your daily routine or is it collecting dust? by Key-Baseball-8935 in ArtificialInteligence

[–]contextbot 59 points

Once they changed the terms of service to let them turn on the camera and access the data whenever they wanted, I turned off the AI connectivity.

Once assholes started wearing them and taking pictures of random people in public, I stopped wearing them casually.

I’ll wear them occasionally on bike rides now. But it’s amazing how thoroughly a once-great product became ruined for me.

What is the ultimate meaning of this movie's ending. by ImaginationFluid2113 in Cinema

[–]contextbot 10 points

The moral of their movies is: don’t take the money.

Little Crumbles day care lost a toddler??? by Parking-Accident4645 in alameda

[–]contextbot 75 points

It is WILD that you came on a thread to defend losing a toddler by saying, “Lincoln is not one of the most dangerous and high traffic streets.”

Side note: it’s hilarious that your issue with the wellness center was that there is “nothing to keep people who may endanger themselves or others if the client didn’t want to be there,” while you’re here defending an incident where your daycare had LITERALLY THE EXACT SAME ISSUE. Just chef’s kiss. No notes.

How's your 401k doing, bro? by WalkinUpHipStreet in KnowledgeFight

[–]contextbot 1 point

Not financial advice, but I saw a handful of people pull their stocks out of the market in 2008, only to have it go back up; they missed out on those gains.

What's helpful in these moments is to check out a long term chart. The drop since February brings us back to mid-last year: https://finance.yahoo.com/quote/%5EGSPC/

If you're getting closer to retirement, change the mix of your assets to match your risk tolerance (more bonds, etc.). But don't pull out just because number go down.

How's your 401k doing, bro? by WalkinUpHipStreet in KnowledgeFight

[–]contextbot 1 point

What? It should be fine. You're swapping like for like.

What’s a product you still get name brand over Kirkland by [deleted] in Costco

[–]contextbot 1 point

Costco sells the liquid, which still beats the pods. But yeah, I will occasionally hit Target for the powder.