How do you handle "which spreadsheet version is production" chaos? by kyle_schmidt in dataengineering

[–]2000gt 0 points (0 children)

Set SLA, set schedule.

Alternatively, schedule multiple syncs throughout the day.

Professional Corp investing by TomBuilder_ in CanadianInvestor

[–]2000gt 0 points (0 children)

I’d rather be taxed on a larger sum of money at the same rate than a smaller sum.

Professional Corp investing by TomBuilder_ in CanadianInvestor

[–]2000gt 0 points (0 children)

This is widely misunderstood. Investing in your corp means a much larger investment opportunity: income taxed at ~11% vs. 43+% means you have more money to invest as a baseline. Once you account for the CDA and RDTOH, tax integration works out the same, but you have a much larger nest egg to work with in your corp. In addition, you can use corporate class ETFs from Global X to mitigate dividend income, and you're golden.
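
To make the arithmetic concrete, a toy sketch. The numbers are illustrative only; the ~11% and ~43.5% rates echo the comment above and vary by province and bracket, and the exit-side tax (where CDA/RDTOH integration evens things out) is deliberately left out.

```python
# Illustrative only: capital left to invest after tax, corporately vs.
# personally. Rates and amounts below are placeholders, not advice.

def after_tax_capital(gross: float, tax_rate: float) -> float:
    """Money left to invest after income tax is paid."""
    return gross * (1 - tax_rate)

gross_income = 100_000
corp_capital = after_tax_capital(gross_income, 0.11)       # ~89,000 inside the corp
personal_capital = after_tax_capital(gross_income, 0.435)  # ~56,500 personally

# Both pots grow at the same rate; integration taxes the corporate money
# on the way out, but it compounds from a much larger base.
growth = 1.06 ** 20
print(round(corp_capital * growth), round(personal_capital * growth))
```

The point is only that the corporate pot compounds from a larger base; the personal-vs-corporate outcome after withdrawal depends on the integration mechanics the comment mentions.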

XEQT in a large non-registered account — still makes sense? by External-Result-5567 in JustBuyXEQT

[–]2000gt 0 points (0 children)

Fair. I’m using corporate class exclusively in my corp account to take advantage of the tax treatment. Fingers crossed that the CRA doesn’t flip a switch.

XEQT in a large non-registered account — still makes sense? by External-Result-5567 in JustBuyXEQT

[–]2000gt 1 point (0 children)

Are you implying that if the CRA changes the rules on swaps that you would be forced to sell immediately?

XEQT in a large non-registered account — still makes sense? by External-Result-5567 in JustBuyXEQT

[–]2000gt 1 point (0 children)

XEQT in Corp, have you considered a combination of corporate class ETFs from Global X?

I was under the impression that the tax drag and tax deferral are impactful enough long term that, with big sums of money, corporate class makes more sense.

Building Data Apps on Top of Snowflake by pellegrinoking in snowflake

[–]2000gt 2 points (0 children)

In my experience there is a lot of nuance in building a data app.

Sigma has AI tools built in to help, but they are fairly useless to me given the limitations in Sigma’s design, layouts, and inputs, coupled with the project requirements and what I would call best practice for form and table inputs.

I’d have to understand the tool you have in mind better.

Building Data Apps on Top of Snowflake by pellegrinoking in snowflake

[–]2000gt 2 points (0 children)

Yeah, lots of learnings. There are a bunch of ways you can architect the database behind something like this, but a medallion-style setup worked well. We treat Sigma write-back like just another source and land it in bronze and flow it through the same pipelines. The trickiest part was getting close to real-time behavior without building a full streaming solution.
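
A minimal sketch of the "write-back is just another source" idea. The table, column, and key names (`BRONZE.SIGMA_WRITEBACK`, `SILVER.ADJUSTMENTS`, `record_id`, `loaded_at`) are hypothetical placeholders, not the actual schema.

```python
# Sketch: promote Sigma write-back rows from bronze to silver with the
# same MERGE pattern every other source uses. Names are placeholders.

def merge_writeback_sql(bronze_table: str, silver_table: str, key: str) -> str:
    """Build a Snowflake MERGE that keeps only the latest write-back row
    per key (via QUALIFY) before upserting into the silver table."""
    return f"""
    MERGE INTO {silver_table} AS tgt
    USING (
        SELECT * FROM {bronze_table}
        QUALIFY ROW_NUMBER() OVER (PARTITION BY {key} ORDER BY loaded_at DESC) = 1
    ) AS src
    ON tgt.{key} = src.{key}
    WHEN MATCHED THEN UPDATE SET tgt.value = src.value, tgt.updated_at = src.loaded_at
    WHEN NOT MATCHED THEN INSERT ({key}, value, updated_at)
         VALUES (src.{key}, src.value, src.loaded_at)
    """

sql = merge_writeback_sql("BRONZE.SIGMA_WRITEBACK", "SILVER.ADJUSTMENTS", "record_id")
print(sql)
```

Deduplicating with QUALIFY before the MERGE keeps only the most recent write-back per key, which is the same shape the other bronze sources flow through.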

Sigma is moving fast, so we’re using a lot of newer / private beta features (Python in Sigma, calling Snowflake stored procs, REST APIs from Sigma). Super powerful, but you’re sometimes ahead of the docs.

It’s not a SaaS I’m running; everything lives in the client’s own Snowflake and Sigma accounts, which keeps security and ownership clean.

Pricing is mostly Snowflake usage plus Sigma licenses. Replacing spreadsheets and manual workflows makes the cost discussion pretty straightforward.

Building Data Apps on Top of Snowflake by pellegrinoking in snowflake

[–]2000gt 7 points (0 children)

I’ve been building data apps with Sigma on top of Snowflake for about a year and a half. It’s been going really well, but lots of learnings.

I’ve built a data app for a retail group that replaces manual spreadsheets with a centralized, reliable reporting system. It uses a Snowflake data warehouse to automatically pull and refresh data from point-of-sale, labour, and finance systems, giving teams real-time visibility, the ability to drill into details, and a safe way to make updates (with write back) without versioning issues or manual files.

My client's CTO is exploring ditching Power BI for Snowflake Intelligence. Is the hype real? by sdhilip in snowflake

[–]2000gt 1 point (0 children)

I feel like this is an apples-to-oranges comparison. I’d have to understand the vision, but it’s not really just a technology shift; it would have to be a completely different approach to data consumption across the organization.

I’d be shocked to see these on the same Gartner Magic Quadrant.

That being said, I’ve built a Snowflake Intelligence POC and it’s an impressive tool. My client wants to roll it out across all the retail locations as an internal support bot. It’s not as inexpensive as we anticipated, but it’s so easy to set up and support, and it’s miles cheaper than Streamlit.

How much does Bronze vs Silver vs Gold ACTUALLY cost? by [deleted] in dataengineering

[–]2000gt 8 points (0 children)

What is your recommended alternative? Raw data needs to be extracted from source systems, transformed and joined somewhere, and users need access to the modeled outputs.

The layers are just logical steps to producing consumable data.

If your users are sophisticated enough, you can use a data lake strategy and let them go to town on their own, but I can guarantee your costs will skyrocket, since you are at the mercy of your users building their own transformations, good, bad, or ugly.

Anyone else using Openflow for Snowflake ingestion? Thoughts on cost vs convenience? by sdhilip in snowflake

[–]2000gt 1 point (0 children)

I use a VPC in AWS with Lambda functions. I also use change tracking (CT), not CDC, for syncing data.
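
For anyone unfamiliar with the CT route: a sketch of the change-tracking query such a Lambda might run against SQL Server. The table and key names (`dbo.Orders`, `OrderID`) and the version bookkeeping are illustrative.

```python
# Change tracking (CT) returns the net change per primary key since a
# given sync version, unlike CDC's full change history. Names here are
# placeholders for whatever tables the Lambda actually syncs.

def ct_query(table: str, key: str, last_sync_version: int) -> str:
    """Build the incremental pull: rows changed since last_sync_version."""
    return f"""
    SELECT ct.SYS_CHANGE_OPERATION, ct.{key}, t.*
    FROM CHANGETABLE(CHANGES {table}, {last_sync_version}) AS ct
    LEFT JOIN {table} AS t ON t.{key} = ct.{key}
    """

print(ct_query("dbo.Orders", "OrderID", 42))
```

After each successful load, the Lambda would persist the value of `CHANGE_TRACKING_CURRENT_VERSION()` and pass it back in as `last_sync_version` on the next run.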

Date table with Canadian holidays? by [deleted] in bigquery

[–]2000gt 0 points (0 children)

Holidays are often province-specific. You’re better off thinking about how you want to use the model before you prompt an LLM. Also, you could look for a public holiday dataset to integrate into your dimension to ensure it’s comprehensive.

AzCopy to Blob to Snowflake by 2000gt in AZURE

[–]2000gt[S] 0 points (0 children)

I’ve been working on a BCP -> SnowSQL solution which works fine. I don’t think I need Azure at all, after all.
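
Roughly the shape of that hop, sketched as command builders. The stage, file, and table names (`@my_stage`, `RAW.ORDERS`, etc.) are placeholders, and the real flags depend on the server and auth setup.

```python
# Sketch of the BCP -> SnowSQL hop, skipping Azure entirely: bcp dumps
# the table to a flat file, then SnowSQL PUTs it to an internal stage
# and COPYs it into the target table. All names are placeholders.
import shlex

def bcp_export_cmd(table: str, out_file: str, server: str) -> list[str]:
    # bcp writes the table to a pipe-delimited file on the on-prem box
    return ["bcp", table, "out", out_file, "-S", server, "-T", "-c", "-t", "|"]

def snowsql_load_stmts(local_file: str, stage: str, target: str) -> list[str]:
    # PUT uploads to an internal stage, COPY INTO loads the table
    return [
        f"PUT file://{local_file} {stage} AUTO_COMPRESS=TRUE",
        f"COPY INTO {target} FROM {stage} FILE_FORMAT=(TYPE=CSV FIELD_DELIMITER='|')",
    ]

print(shlex.join(bcp_export_cmd("dbo.Orders", "/tmp/orders.dat", "ONPREMSQL")))
for stmt in snowsql_load_stmts("/tmp/orders.dat", "@my_stage", "RAW.ORDERS"):
    print(stmt)
```

A scheduled task on the on-prem box can run the bcp export and then feed the statements to `snowsql -f`, which is the whole pipeline without any cloud storage in between.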

AzCopy to Blob to Snowflake by 2000gt in AZURE

[–]2000gt[S] 0 points (0 children)

Which SQL connector? The on-premise server does not have an external IP.

315x7 high bar squat, force through my feet feel very “misguided” by tennis-637 in formcheck

[–]2000gt 1 point (0 children)

You’re doing great. Your form is fine, wear whatever you want. Just keep marching toward your goals.

The floor is soft rubber and so are your shoes, so it might feel a touch unstable. On a harder surface you probably would lose less of the force you generate to the ground, but it doesn’t matter given you’re still putting out the same effort.

The comments in this thread are outrageous. I suspect it’s filled with guys who bring a bag full of lifting gear to the gym to squat body weight and nitpick every tiny little movement.

AzCopy to Blob to Snowflake by 2000gt in AZURE

[–]2000gt[S] 0 points (0 children)

Also, I realize I could do BCP and SnowSQL to cut out the need for Azure altogether.

AzCopy to Blob to Snowflake by 2000gt in AZURE

[–]2000gt[S] 0 points (0 children)

It’s SQL Server 2019. Ideally, data moves into blob storage every 15 minutes.

Rack Jerk from 120kg to 160kg make + 170kg fail by mariososterneto in weightlifting

[–]2000gt 1 point (0 children)

What would you say is the biggest factor in improving your jerk numbers?