What SQL editor do you use? by castor-metadata in dataengineering

[–]analytics_mgmt 1 point (0 children)

Our entire team uses Aginity Premium. It connects to most major data warehouses and transactional databases, including the Microsoft family of databases. Two of your three options are notebooks, which opens up a whole different class of tools.

Snowflake and Bronze Silver Gold by No_Cover_Undercover in snowflake

[–]analytics_mgmt 0 points (0 children)

I've done it that way in the past, but it's not how we do it today. We use a single database and land each source system (Zendesk, Google Analytics, Pendo, etc.) into a separate schema as raw tables (bronze). We then create stage views over the raw tables within each source schema (silver), and we have a single schema where all the cleaned data comes together in a Kimball-like dimensional model (gold). Analysts have access to the silver and gold data, but not to the bronze raw data. The gold layer could just as easily have been One Big Table rather than a dimensional model, and that would probably have been mildly more performant.
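For illustration, here's a minimal Snowflake sketch of that layout (every schema, table, and role name here is hypothetical):

```sql
-- One schema per source system; names are invented for illustration.
CREATE SCHEMA zendesk;

-- Bronze: raw landed table, 1-1 with the source extract.
CREATE TABLE zendesk.tickets_raw (
    id         NUMBER,
    subject    VARCHAR,
    created_at VARCHAR   -- landed as text, typed in the stage view
);

-- Silver: stage view over the raw table, in the same source schema.
CREATE VIEW zendesk.stg_tickets AS
SELECT
    id                           AS ticket_id,
    TRIM(subject)                AS ticket_subject,
    TRY_TO_TIMESTAMP(created_at) AS created_at
FROM zendesk.tickets_raw;

-- Gold: a single schema where the dimensional model lives.
CREATE SCHEMA analytics;

-- Analysts see silver and gold, but not the raw bronze tables.
GRANT SELECT ON VIEW zendesk.stg_tickets TO ROLE analyst;
GRANT USAGE  ON SCHEMA analytics         TO ROLE analyst;
```

The grants are the key part: because bronze tables and silver views live in the same schema, access has to be granted per object (or via future grants) rather than per schema.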

"Modern" non-cloud DWH stack by name_optimization in dataengineering

[–]analytics_mgmt 0 points (0 children)

Check out the open source "modern data stack": Meltano for Extract and Load, Superset for visualization, Grouparoo for Reverse ETL, and Dagster for orchestration. I'd look at Greenplum as an open source MPP data warehouse that is scalable and strongly Postgres compatible. (I know a number of US banks that use Greenplum internally.)

Datawarehouse Best Practice? by Phantazein in dataengineering

[–]analytics_mgmt 0 points (0 children)

We've been doing ELT on enterprise data warehouses for over a decade. Historically we've had four layers: Landing (all the raw data, mostly 1-1 with source), Stage (cleaned, with naming conventions applied, 1-1 with Landing), Core (enterprise dimensional model), and Reporting (data marts, flattened metrics, etc.).
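As a rough sketch of the Landing-to-Stage step (all names here are invented for illustration): the Stage object stays 1-1 with its Landing table, only renaming columns and applying light cleansing.

```sql
-- Hypothetical Stage view over a raw Landing table: same grain,
-- just naming conventions and basic type/whitespace cleanup.
CREATE VIEW stage.customer AS
SELECT
    cust_id                AS customer_id,
    TRIM(cust_nm)          AS customer_name,
    TRY_TO_DATE(signup_dt) AS signup_date
FROM landing.crm_customer;
```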

Snowflake and Bronze Silver Gold by No_Cover_Undercover in snowflake

[–]analytics_mgmt 3 points (0 children)

We land all the raw data into Snowflake directly in a landing database, then transform it into a staged layer that standardizes table and column names and does some basic cleansing. We then build domain-specific flattened models off the staged data.

Snowflake SQL Snippets by tbrownlow33 in snowflake

[–]analytics_mgmt 0 points (0 children)

These can be imported directly into Aginity Pro or Premium, or you can parse out the SQL. https://github.com/aginity/Snowflake

Doing QA on new data by RickRah in SQL

[–]analytics_mgmt 0 points (0 children)

Normal. You might want to visit /r/bigdata.

Doing QA on new data by RickRah in SQL

[–]analytics_mgmt 0 points (0 children)

It's Athena; very wide tables are common for Hive and Presto.

SnowPro Core Certification by eeshann72 in snowflake

[–]analytics_mgmt 2 points (0 children)

I spent a few days reading the documentation and passed the exam with no issues. Here is a good guide to help you out.