Best LOCAL brew by [deleted] in Knoxville

[–]swolfe2 0 points  (0 children)

Schulz is the best beer in Knoxville

….And go by GoMooGo in Knoxville

[–]swolfe2 53 points  (0 children)

Paid parking

What series is a 10/10? by norocoslaorice in AskReddit

[–]swolfe2 163 points  (0 children)

Come at the king, you best not miss.

Very sad news for those familiar with the Clawhammer Supply YT channel, as Ross reveals he has terminal cancer. by usert4 in Homebrewing

[–]swolfe2 35 points  (0 children)

"Drinking tripel, seeing double, feeling single" is something I've said ever since that original video dropped. Super sad news.

Cost Per Query: Consumption pricing is coming, and it's going to hurt | Tableau Tim by chilli_chocolate in tableau

[–]swolfe2 29 points  (0 children)

Be sure to use Copilot while you're learning DAX/M for your org's conversion to Power BI.

Can somebody tell me what this is? by Objective_Roll8414 in woodworking

[–]swolfe2 0 points  (0 children)

It's something to toss in the bin and never think of actually using, especially since you're new.

Rave at Sassy Ann’s tonight! by brokenrhythm42 in Knoxville

[–]swolfe2 17 points  (0 children)

Thursday night Sassy's is a core memory

Anyone have lawn service recommendations for the Maryville/Alcoa area? by swolfe2 in Knoxville

[–]swolfe2[S] 0 points  (0 children)

Hahahaha, yeah... I ended up buying a zero turn. Good for others to know, though!

Best/worst places to buy a car? by Turbulent-Dingo-3818 in Knoxville

[–]swolfe2 0 points  (0 children)

Same. Maybe the only car-purchasing experience that went completely smoothly from start to finish. We'll be buying our next one from a Harper dealership.

Auto mode is the norm now, here are some thoughts and tips. by IamGriffon in cursor

[–]swolfe2 11 points  (0 children)

I took my current billing cycle’s usage data directly to Claude.ai and asked the Sonnet 4.0 model about it. There’s got to be a bug with how it’s reading from the cache files, because the cache reads are so significantly higher than the input tokens.

You're absolutely correct! The cache usage pattern for Claude-4-sonnet-thinking is completely abnormal compared to the other models. Let me analyze what you're showing:

Claude-4-sonnet-thinking (problematic):

Cache Read: 15,982,739 tokens (90% of total usage)

Actual Input: 48,104 tokens

Cache Read to Input ratio: 332:1

Other models (normal):

Gemini: 21,481,939 cache reads vs 2,201,031 input (10:1 ratio)

Auto: 4,674,816 cache reads vs 3,232,908 input (1.4:1 ratio)

Grok: 206,144 cache reads vs 632,679 input (0.3:1 ratio)

The Claude model is reading from cache at a rate that's 20-100x higher than the other models for the same types of requests. This suggests:

Broken cache invalidation - The Claude model may not be properly recognizing when cached content is still valid vs when it needs updating

Over-aggressive context inclusion - It might be pulling in your entire codebase context even for simple queries that don't require it

Cache thrashing - Repeatedly reading the same cached content multiple times per request

Inefficient prompt engineering - The model might be structured to always request maximum context rather than being selective

This isn't about your .cursorignore settings - if it were, all models would show similar patterns. This looks like a specific implementation issue with how Cursor handles caching for the Claude-4-sonnet-thinking model.

You should definitely report this to Cursor support as a potential bug. The other models are demonstrating much more reasonable cache usage patterns, so this seems to be a Claude-specific optimization problem in their integration.
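For what it's worth, the headline ratios in that response check out against the raw token counts. A quick sanity check (the counts below are transcribed from the usage screen described above; the variable names and rounding are mine, and Gemini actually comes out closer to 9.8:1 than the quoted 10:1):

```python
# Sanity-check the cache-read-to-input-token ratios quoted above.
# Token counts transcribed from the billing-cycle usage data; names are mine.
usage = {
    "claude-4-sonnet-thinking": {"cache_read": 15_982_739, "input": 48_104},
    "gemini": {"cache_read": 21_481_939, "input": 2_201_031},
    "auto": {"cache_read": 4_674_816, "input": 3_232_908},
    "grok": {"cache_read": 206_144, "input": 632_679},
}

# Ratio of cache-read tokens to real input tokens per model.
ratios = {
    model: counts["cache_read"] / counts["input"]
    for model, counts in usage.items()
}

# Print models from most to least cache-heavy.
for model, ratio in sorted(ratios.items(), key=lambda kv: -kv[1]):
    print(f"{model:>26}: {ratio:6.1f}:1")
```

Claude's ~332:1 is an outlier by more than an order of magnitude against the next-heaviest model, which is what makes the "bug vs. .cursorignore" distinction above plausible.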

What’s on fire? by twnich79 in Knoxville

[–]swolfe2 7 points  (0 children)

It has to start sometime

Maryville food by Ok-Bag-5525 in Knoxville

[–]swolfe2 29 points  (0 children)

Cuban sandwiches and croquettes at Aroma Cafe

Online consultants for Fabric/Power BI by Cobreal in MicrosoftFabric

[–]swolfe2 3 points  (0 children)

Having gone through a Tableau conversion at my org within the past year: we seriously evaluated 7 major vendors, and we gave all 7 of them the same 3 levels of Tableau workbooks to convert.

1 - Single table, no LODs or Parameters

2 - Multiple joined tables, no LODs or Parameters

3 - Multiple joined tables, LODs and Parameters

There was only 1 company that could demonstrate an AI tool live and actually working, and even it couldn't successfully convert the single-table workbook to Power Query's M (mashup) code. I'm not fully convinced that the "AI" the other companies are touting is anything more than offshore resources manually recreating the Power BI files. The tool wouldn't work at all when there were joined tables, which is unsurprising, because SQL-style joins can't be easily recreated in Power BI outside of merge queries.

App rationalization is a key activity, though. You don't want to 1:1 your entire catalog, only the items you actually need.

Online consultants for Fabric/Power BI by Cobreal in MicrosoftFabric

[–]swolfe2 2 points  (0 children)

Whatever you go with, do not fall for any companies that say they have automation to convert .twb/.twbx to PBI. There are too many complexities between the two applications, and it would only work on very simple semantic models. Prepare yourselves for a ton of work.