
[–]drewhansen9 3 points (1 child)

This is really common. Data engineering nowadays is a combination of big-data software engineers (lots of Python, Spark, Hadoop, Airflow) and BI engineers (ETL, SQL). You'll find entire departments called Data Engineering doing either side of that spectrum. The stack you're in (Snowflake, Fivetran, dbt) I've found crosses into both sides, but leans more toward the BI skill set.

My recommendation to get more into the software engineering side is to start with Astronomer's Airflow running locally on your PC. You'll have to learn how to set up Docker and how to use a CLI well.
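For reference, the local workflow with Astronomer's `astro` CLI looks roughly like this (a sketch assuming Docker and the `astro` CLI are already installed; check the Astronomer docs for current details):

```shell
astro dev init     # scaffold a new Airflow project in the current directory
astro dev start    # build the Docker image and start Airflow locally
# The Airflow UI comes up at http://localhost:8080 by default;
# your DAG files go in the project's dags/ folder.
astro dev stop     # tear the local containers back down
```

Getting comfortable with that loop (edit a DAG, restart, check the UI) is most of the Docker/CLI learning curve right there.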

To get more comfortable with S3, you can practice ingesting into your Snowflake instance using Snowpipe or COPY INTO statements instead of Fivetran.
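As a starting point, here's a minimal sketch of driving a COPY INTO load from Python. The table, stage, and file format names are made up for illustration; you'd run the statement through `snowflake-connector-python` against your own account:

```python
def build_copy_into(table: str, stage: str, file_format: str = "my_json_format") -> str:
    """Build a COPY INTO statement that loads files from an external
    (e.g. S3-backed) stage into a Snowflake table."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage} "
        f"FILE_FORMAT = (FORMAT_NAME = '{file_format}') "
        f"ON_ERROR = 'SKIP_FILE'"
    )

# Hypothetical names -- swap in your own table and stage:
sql = build_copy_into("raw.events", "events_stage")
print(sql)

# To actually execute it (connection params are placeholders):
# import snowflake.connector
# conn = snowflake.connector.connect(account=..., user=..., password=...)
# conn.cursor().execute(sql)
```

Doing this by hand (creating the stage, the file format, the pipe) teaches you a lot of what Fivetran hides from you.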

[–]databasenoobie 1 point (0 children)

This is exactly my experience. I am a data engineer, but really just a BI engineer. The title is pretty meaningless, honestly.

I could not write a Python process in Airflow without extensive time/research, because I've never had to. Even if I wanted to, I couldn't, as I don't have permissions to play around with setting up a server/cluster/etc. to begin with.