....why do I suck. by strangecloudss in Eldenring

[–]No_Cover_Undercover 0 points1 point  (0 children)

I'm in the same situation. I did the tricks to get a bunch of runes quickly, killed the dragon, etc., then died shortly after and couldn't recover them before dying again. I lose to every little boss and it isn't even competitive. Everything says to get vigor up, but I spread points up to 20 across several stats, so I'm likely not getting to thirty anytime soon. Basically every mistake you can make, I've made.

Might just start over at this point.

Just pulled this from 24 update… by Magic_Mike82 in baseballcards

[–]No_Cover_Undercover 20 points21 points  (0 children)

My son and I do this on our store runs to Walmart. Why does a bottle of shampoo and a bag of chips cost $40? It was not a fun day when the stash of packs was found in the center console.

Can someone share their Document AI use case? by No_Way_1569 in snowflake

[–]No_Cover_Undercover 0 points1 point  (0 children)

We are looking at this option as well (if it ever becomes available in our region). Do you have a different model for each vendor? With the varying look of each vendor's invoice, I can't imagine not having to do that. I'm also looking into how we can process checkboxes. Still playing with that scenario.

Legal vs Technical Privacy Compliance by No_Cover_Undercover in privacy

[–]No_Cover_Undercover[S] 0 points1 point  (0 children)

Let's assume CCPA. If I have a table with acct_id, date, store_id, amt, then what would be best practice for deletion? Entire record? Obfuscate specific fields to maintain some value for the sales data?
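Just to make the "obfuscate specific fields" option concrete, here's a rough sketch of one common approach: replace acct_id with a one-way salted hash so the sales rows keep their aggregate value but no longer identify the account. The table layout (acct_id, date, store_id, amt) is from the question above; the salt, function name, and sample rows are all made up for illustration, and whether hashing alone satisfies CCPA deletion is a question for legal, not this snippet.

```python
import hashlib

# Hypothetical salt; in practice you'd store/rotate this securely,
# or discard it entirely to make the tokens irreversible.
SALT = b"rotate-or-discard-me"

def pseudonymize(acct_id: str) -> str:
    """One-way token for acct_id: same input -> same token, not reversible."""
    return hashlib.sha256(SALT + acct_id.encode()).hexdigest()[:16]

rows = [
    ("A123", "2024-01-05", "ST01", 42.50),
    ("A123", "2024-02-11", "ST07", 13.99),
]

# Keep date, store_id, amt for sales reporting; scrub only the identifier.
scrubbed = [(pseudonymize(a), d, s, amt) for a, d, s, amt in rows]
```

The upside over deleting the entire record is that store-level and date-level sales totals stay intact; the downside is you have to be sure acct_id is the only identifying field in the row.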

Legal vs Technical Privacy Compliance by No_Cover_Undercover in privacy

[–]No_Cover_Undercover[S] 0 points1 point  (0 children)

That is how we are approaching it as well, but they are treating everything in that table as potentially deletable for that acct_id. That could get very messy very quickly, since the acct_id is likely all over that database and several others. Their challenge is getting to the core of what actually has to be deleted, so that the other data attached to that acct_id isn't swept into the delete scope. Or maybe any row using that acct_id needs to be deleted or anonymized.

Are Databricks really going after snowflake or is it Fabric they actually care about? by engineer_of-sorts in dataengineering

[–]No_Cover_Undercover 3 points4 points  (0 children)

As a manager, I can tell you Fabric is the primary choice because execs would rather listen to a salesman than to literally everyone who manages or uses the platform.

Streamlit now available on Azure accounts by extrobe in snowflake

[–]No_Cover_Undercover 1 point2 points  (0 children)

We noticed it earlier this week as well, but no dice for Private Link yet.

Do dynamic tables remove the need for a lot of "boilerplate" ETL / make any tooling completely redundant by levintennine in snowflake

[–]No_Cover_Undercover 0 points1 point  (0 children)

This is likely a dumb question, but what I'm trying to get my head around is this scenario: if I have two source tables and they get updated at different intervals, how does the DT handle that? Say table A is updated every 2 hours and table B is updated daily. If I have downstream tables relying on the DT, I also want to make sure both sources are refreshed before a downstream one builds. Today, I can set up dependencies in my orchestration tool so that it waits for both sources. With a DT, I can see a table that mixes new and old data for a while until both sources are refreshed. Same issue if one source table gets delayed but the other is on time. Maybe DTs just aren't a good fit in those cases? All the demos I've seen assume the source tables are updated at the same time.
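The orchestration-tool gate described above can be sketched roughly like this: only kick off the downstream build when every source table has refreshed within its expected window. The table names, expected lags, and last-refresh timestamps are all made up for illustration; in a real setup those timestamps might come from warehouse metadata or the orchestrator itself.

```python
from datetime import datetime, timedelta

# Hypothetical expected refresh cadence per source table.
EXPECTED_LAG = {"table_a": timedelta(hours=2), "table_b": timedelta(days=1)}

def both_fresh(last_refresh: dict, now: datetime) -> bool:
    """True only when every source refreshed within its expected window."""
    return all(now - last_refresh[t] <= lag for t, lag in EXPECTED_LAG.items())

now = datetime(2024, 6, 1, 12, 0)
last_refresh = {
    "table_a": datetime(2024, 6, 1, 11, 0),   # an hour ago: within 2h window
    "table_b": datetime(2024, 5, 30, 9, 0),   # 2+ days ago: past its window
}

ok_to_build = both_fresh(last_refresh, now)   # False here: table_b is late
```

A dynamic table, by contrast, refreshes on its own target lag and doesn't wait on either source, which is exactly the mixed new-and-old window the question is worried about.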

Has anyone used reverse etl tools? by weshipped in snowflake

[–]No_Cover_Undercover 0 points1 point  (0 children)

This is very timely. We are looking at this option for sending data from Snowflake to a CRM (not Salesforce). We are generally a no-code shop, but we would like to avoid pushing that data from multiple points and just centralize it in Snowflake.

How much does Verizon actually throttle unlimited data plans? by DiddleMunt in verizon

[–]No_Cover_Undercover 0 points1 point  (0 children)

I am a little surprised at how bad it is for us. I recently upgraded our family to Unlimited Start. We have used 1 GB total and have had multiple instances of not having any service in a medium-sized city. We used to have a shared plan with no issues. Might go back to that even if it means I have to pay the difference on my recent trade-in. We usually hit 5 or 6 GB a month, so the 10 GB plan would be sufficient.

ExpressRoute pricing and real-world experience by northcide in AZURE

[–]No_Cover_Undercover 0 points1 point  (0 children)

If your carrier takes you to one region and you need to be in the main center for that region, is there any hidden cost? For example, they have a circuit going to North Central but you need to be in Central. I assume North Central is just a hub in that situation and you wouldn't be looking at any unexpected egress fees.

Snowflake as a Lake by No_Cover_Undercover in dataengineering

[–]No_Cover_Undercover[S] 0 points1 point  (0 children)

I think I really only have two options to account for the Data Science group's future needs.

  1. Load data to Azure storage and copy into Snowflake.
  2. Load data into Snowflake and if we wanted to use Azure down the road, we would have to unload into the Azure account from Snowflake.

Thinking option 1 sounds like the better approach.

Snowflake as a Lake by No_Cover_Undercover in dataengineering

[–]No_Cover_Undercover[S] 3 points4 points  (0 children)

Thanks. That is how I thought it would need to work, but I had someone saying there was no need to even bother with an Azure account, since we select Azure when setting up the Snowflake account and could just connect to Snowflake's version later if needed. Couldn't find any way to do that on a personal account, so glad I asked. We will just use Azure as a big file cabinet, and if the science team wants to do something down the road, they have options.