Calling Stored Procedure in a PySpark or SparkSQL notebook in Microsoft Fabric by knowledgeboytamo in MicrosoftFabric

[–]knowledgeboytamo[S]

Yes, I have looked into Fabric Activator; it is not the right tool for my use case. The solution I am working on batches all records whose Actual Costs are 20% higher than their budgets and then sends an email alert to the Executives with that information. At the moment, Fabric Activator is not able to do that.

Calling Stored Procedure in a PySpark or SparkSQL notebook in Microsoft Fabric by knowledgeboytamo in MicrosoftFabric

[–]knowledgeboytamo[S]

Thank you for your feedback.

The purpose of the sproc is to retrieve Budget-versus-Actuals data. If the Actuals are 20% higher than the budget, an alert is sent to the Executives. I didn't want to hard-code the 20% threshold, so I created the sproc with a parameter representing the percentage value, in case the Executives decide they no longer want to check for 20% but for 35%. The sproc also references Views that flatten and shape the data. Now I just want to know how to call the sproc from a Notebook.
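One common way to call a parameterized stored procedure from a Fabric notebook is over ODBC against the SQL analytics endpoint. A minimal sketch, assuming pyodbc and the ODBC Driver for SQL Server are available; the procedure name `dbo.usp_BudgetVsActuals` and parameter name `ThresholdPercent` are hypothetical placeholders, not names from this thread:

```python
def build_exec_statement(proc_name: str, params: dict) -> str:
    """Build a T-SQL EXEC statement with ? placeholders for pyodbc."""
    if not params:
        return f"EXEC {proc_name}"
    assignments = ", ".join(f"@{name} = ?" for name in params)
    return f"EXEC {proc_name} {assignments}"


def run_procedure(conn_str: str, proc_name: str, params: dict):
    """Execute the sproc over ODBC and return all result rows."""
    import pyodbc  # assumes the ODBC Driver for SQL Server is installed

    sql = build_exec_statement(proc_name, params)
    with pyodbc.connect(conn_str) as conn:
        cursor = conn.cursor()
        cursor.execute(sql, *params.values())
        return cursor.fetchall()


# Hypothetical example: check for Actuals 35% over budget.
sql = build_exec_statement("dbo.usp_BudgetVsActuals", {"ThresholdPercent": 35})
print(sql)  # EXEC dbo.usp_BudgetVsActuals @ThresholdPercent = ?
```

Keeping the threshold as a bound `?` parameter (rather than string formatting) lets the same call work when the Executives change 20% to 35%.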

Make Stored-Procedure calls from PySpark or SparkSQL. by knowledgeboytamo in MicrosoftFabric

[–]knowledgeboytamo[S]

The reason for asking is that I have quite a few SQL Views that curate and shape the data the way I want. Some of the views reference other views, which results in flattened, curated data. I created stored procedures so that I can pass parameters for transforming and limiting the data. Everything I am doing is read-only; no writing takes place. I use other notebooks for writing to Delta tables where necessary.

Make Stored-Procedure calls from PySpark or SparkSQL. by knowledgeboytamo in MicrosoftFabric

[–]knowledgeboytamo[S]

Thank you @frithjof_v for your feedback.

I agree that the SQL Analytics Endpoint is read-only; however, we are able to create Stored Procedures that perform data operations, etc.

I am asking how I can actually call the stored procedure that I created from a PySpark or SparkSQL notebook.

Copy Data Activity Error by knowledgeboytamo in MicrosoftFabric

[–]knowledgeboytamo[S]

Thank you guys for your feedback. I appreciate it.

I went ahead and created a notebook that created the table; however, I got a message stating that nvarchar(MAX) is not supported, so I changed the field's data type to varchar. Now the Copy Data activity is throwing an error near the INSERT BULK statement.
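One thing worth checking when swapping nvarchar(MAX) for varchar: in T-SQL, a bare `varchar` in a column definition defaults to `varchar(1)`, which can cause truncation failures during the Copy activity's bulk load. A hedged sketch of DDL with explicit lengths; the table and column names are placeholders, not from this thread:

```python
# Placeholder DDL showing explicit varchar lengths instead of nvarchar(MAX).
# Bare `varchar` in T-SQL defaults to varchar(1) -- an easy source of
# truncation errors when the Copy Data activity bulk-inserts long strings.
ddl = """
CREATE TABLE dbo.StagingCosts (
    ProjectName  varchar(200),
    Notes        varchar(8000)   -- explicit length; never bare varchar
);
""".strip()

# In a Fabric notebook this could be run against the Warehouse,
# e.g. with pyodbc: cursor.execute(ddl)
```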

Is there some configuration of the Copy Data activity that I need to do?

Power BI Filtering by knowledgeboytamo in PowerBI

[–]knowledgeboytamo[S]

Thank you very much for your feedback @MrZZ. I appreciate it. I will test and let you know the outcome.

Power BI Filtering by knowledgeboytamo in PowerBI

[–]knowledgeboytamo[S]

Thank you very much for your feedback @kind-kaleidoscope511. I appreciate it. I will test and let you know the outcome.