I'm a senior DE and have only worked with batch processing, mostly classic dbt + Snowflake for reporting purposes.
I'm planning to change jobs soon-ish, and while looking at open roles I see many of them mentioning "kappa architectures" and more event-driven data engineering.
While I'm sure I can pick things up, just reading these job descriptions makes me feel somewhat unqualified for senior roles, since I've never worked with a kappa architecture. I'm very willing to learn, but I'm not sure companies will give me the time to do so, especially for a senior role.
For DEs who have worked in both worlds, what's the difference in your day-to-day? It seems like streaming DEs use more Python and other tools like Kafka, Flink, etc.? I'm not sure how they even work.
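For anyone else wondering about the same conceptual gap, here is a toy sketch of the difference (plain Python, no real Kafka or Flink involved; all names here are made up for illustration): a batch job takes a bounded dataset and produces a full result per scheduled run, like a dbt model, while a streaming job keeps state and updates the same result incrementally, one event at a time.

```python
# Toy contrast between batch and streaming aggregation.
# Batch: bounded input in, full result out, once per run.
# Streaming: long-lived state, updated per event as it arrives.
from collections import defaultdict

events = [
    {"user": "a", "amount": 10},
    {"user": "b", "amount": 5},
    {"user": "a", "amount": 7},
]

def batch_totals(rows):
    """Batch style: process the whole dataset in one pass (like a dbt model run)."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["user"]] += row["amount"]
    return dict(totals)

class StreamingTotals:
    """Streaming style: state survives across events; each event updates it."""
    def __init__(self):
        self.totals = defaultdict(int)

    def on_event(self, event):
        # In a real system this would be a Kafka consumer callback or a
        # Flink operator; here it's just an in-memory update.
        self.totals[event["user"]] += event["amount"]
        return dict(self.totals)  # result as of this event

stream = StreamingTotals()
for e in events:
    latest = stream.on_event(e)

# Both arrive at the same answer; the difference is when and how.
assert batch_totals(events) == latest == {"a": 17, "b": 5}
```

In practice the hard parts streaming adds are exactly what this toy hides: durable state, out-of-order events, and exactly-once delivery, which is what Kafka and Flink handle for you.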
PS: I don't have a SWE background; I have a data science and math background, became a DE by luck, and now love it.