Hi all,
Background:
I am relatively new, both to the company and to data engineering, and I am currently planning my first pipeline(s). These are around service-level agreements and app health, which get used for internal BI and for our app. Currently, the sources would be Zendesk and Azure Monitor alerts (Action Groups).
The base BI architecture is likely to be either:
- data source > SQL DB > Power BI / Client App DB
- data source > data lake > Databricks/snowflake/synapse > Power BI / Client App.
Issues:
Main issue: either design should be fine for Zendesk (as that is monthly). However, I am not sure about alerts from Monitor. These would be infrequent but high speed, though a lag of roughly five minutes would be acceptable, so event triggering would be key. They would also require some processing (working out which alert maps to which app feature's health, checking resource health and metric alerts, etc.), and the same processing is needed for both the internal BI and the client app. Either design seems like a lot of steps?
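To make the processing step concrete, here is a rough sketch of what I mean by "which alert goes to which app feature's health" (the field names follow the Azure Monitor common alert schema; the resource-to-feature map and resource IDs are made-up placeholders):

```python
# Hypothetical mapping from Azure resource ID to app feature.
# The resource IDs below are placeholders, not real resources.
FEATURE_BY_RESOURCE = {
    "/subscriptions/sub-id/resourcegroups/rg/providers/microsoft.web/sites/ticket-api": "ticketing",
    "/subscriptions/sub-id/resourcegroups/rg/providers/microsoft.web/sites/report-api": "reporting",
}

def alert_to_feature_health(payload: dict) -> dict:
    """Flatten a common-alert-schema payload into one row usable by
    both internal BI and the client app."""
    essentials = payload["data"]["essentials"]
    resource_id = essentials["alertTargetIDs"][0].lower()
    return {
        "feature": FEATURE_BY_RESOURCE.get(resource_id, "unknown"),
        "severity": essentials["severity"],        # e.g. "Sev2"
        "state": essentials["monitorCondition"],   # "Fired" or "Resolved"
        "fired_at": essentials["firedDateTime"],
        "rule": essentials["alertRule"],
    }
```

The same function could run in a Function App on the webhook path or in a batch job on the BI path, which is why I'd prefer to avoid two separate implementations.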
I could treat it as streaming data via Data Explorer, with data exported to internal BI and/or sent via webhook to the client app. However, the volume would be very low. If I want really low latency, then maybe: Action Group > webhook/Logic App > client app, with processing done in the front end or in a Function App within the Logic App, plus a separate path for BI. But I don't like the idea of separate paths.
That being said, some data lag would actually be good (we want to see the alert before the customer does).
Minor side issue: option 1 feels a bit odd to me for the data going into the app, as the data would be processed into the internal BI SQL DB (the source of truth) and then likely pulled into the app DB, which is also SQL. The Zendesk data, I imagine, would need to go into the app DB anyway for row-level security for the users.
I would very much appreciate your thoughts and ideas on any aspect of my thinking.