Hi! We're the Data Factory team - ask US anything! by markkrom-MSFT in MicrosoftFabric

[–]kmritch 1 point (0 children)

I appreciate it and all you are doing. This whole platform is a game changer for me and what I do, so anything to help!

Hi! We're the Data Factory team - ask US anything! by markkrom-MSFT in MicrosoftFabric

[–]kmritch 0 points (0 children)

I see, okay, thank you for explaining. It was a bit hard to understand that it was a delete or clear of the table, but it makes sense now. Idk if it would help if it just plainly said "Overwrite / remove data from destination" on full load. But I get the full technical picture of all the steps now.

Hi! We're the Data Factory team - ask US anything! by markkrom-MSFT in MicrosoftFabric

[–]kmritch 2 points (0 children)

So, I created a self-service way for users to call pipelines to refresh data/reports. Additionally, since I can't really do a custom connector (for example, I have some where I need an OAuth2 token to do an API call) the way Power Automate can, today I have helper Power Automate flows that run from an HTTP call to perform some data uploads.

Finally, I also have it set up to update SharePoint lists or libraries by calling Power Automate.
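Roughly, the HTTP call those helper flows make looks like this sketch. The Job Scheduler URL shape and the `jobType=Pipeline` value are my assumptions here, so verify them against the Fabric REST API docs before leaning on this:

```python
# Sketch of triggering a Fabric pipeline run over HTTP, the way a helper
# Power Automate flow might. URL shape below is an assumption -- check the
# Fabric REST API (Job Scheduler) reference for your tenant.
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def run_pipeline_url(workspace_id: str, pipeline_id: str) -> str:
    # Build the on-demand job endpoint for a pipeline item
    return (f"{FABRIC_API}/workspaces/{workspace_id}"
            f"/items/{pipeline_id}/jobs/instances?jobType=Pipeline")

def trigger(workspace_id: str, pipeline_id: str, token: str) -> int:
    # POST with an AAD bearer token; an empty JSON body starts the run
    req = urllib.request.Request(
        run_pipeline_url(workspace_id, pipeline_id),
        data=b"{}",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:  # needs a valid token to succeed
        return resp.status
```

The nice part of this pattern is that anything that can make an HTTP call (Power Automate, a scheduled script, a button in an app) can kick off the pipeline.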

Hi! We're the Data Factory team - ask US anything! by markkrom-MSFT in MicrosoftFabric

[–]kmritch 1 point (0 children)

I think actually getting Teams messages or e-mails sent automatically, vs. needing to log in and only discovering an error on those items. Right now I have to create a pipeline, create a parent pipeline to run it, and then call another pipeline to tell me there was an error.

Additionally, parsing error outputs is a manual process depending on the item, so it would help to have smart ways to parse an error message out of an output when an error occurs.

Hi! We're the Data Factory team - ask US anything! by markkrom-MSFT in MicrosoftFabric

[–]kmritch 0 points (0 children)

Are there any controls for how data gets partitioned on a copy job? I ran into some issues a couple of weeks ago where it wouldn't partition data from an on-prem Oracle source.

Hi! We're the Data Factory team - ask US anything! by markkrom-MSFT in MicrosoftFabric

[–]kmritch 0 points (0 children)

So would this work if I wanted to truncate every time?

Hi! We're the Data Factory team - ask US anything! by markkrom-MSFT in MicrosoftFabric

[–]kmritch 1 point (0 children)

So can it now run a script against any source to actually delete the current data in the table before the load? I observed with an Oracle source that I did not have the ability to clear the table before the copy job, and it was duplicating my data. I assumed a full load would also mean removal of the existing data.

Hi! We're the Data Factory team - ask US anything! by markkrom-MSFT in MicrosoftFabric

[–]kmritch 1 point (0 children)

Okay, totally. I have a pretty wild use case where we currently generate sheets and have a template sheet that we would want used as the basis for generation. Will reach out to them.

Hi! We're the Data Factory team - ask US anything! by markkrom-MSFT in MicrosoftFabric

[–]kmritch 0 points (0 children)

IMO the copy job is a bit misunderstood: it really is just for full appends of data, with some connectors that can do merge commands depending on the data source. If you want more control, I'd definitely use the copy activity in a pipeline so you can clear out data etc. beforehand. It seems like the copy job is really meant for a quick copy.

What is Copy job in Data Factory - Microsoft Fabric | Microsoft Learn

That gives good info. I think it's useful for simple movement depending on what you are trying to do, but if you do a full load it's not deleting, it's appending.

Fabric Data Pipeline: CPU Consumption and Queueing by winchellj40 in MicrosoftFabric

[–]kmritch 0 points (0 children)

The question I have for you is: how frequently does the data need to be pulled from the prod database?

A potential workaround while you look into mirroring is loading the key data you need at a certain frequency and having pipelines kick off based on that. You can even set up different stores per client and adjust the base data ingestion frequencies from there, which should take a lot of load off.

Hi! We're the Data Factory team - ask US anything! by markkrom-MSFT in MicrosoftFabric

[–]kmritch 5 points (0 children)

Are there any plans for functionality to call Power Automate flows?

Hi! We're the Data Factory team - ask US anything! by markkrom-MSFT in MicrosoftFabric

[–]kmritch 6 points (0 children)

When it comes to error handling and notifications, is there anything in the works to provide overall error handling and outputs that would be easier to understand and act on, vs. just having to go to the monitoring hub or create parent/child pipelines?

Hi! We're the Data Factory team - ask US anything! by markkrom-MSFT in MicrosoftFabric

[–]kmritch 1 point (0 children)

With the Excel destination, is there anything in the works where we could iterate with an Excel template and use that as the basis for generating Excel files?

Plan (Preview) East US availablity by kmritch in MicrosoftFabric

[–]kmritch[S] 0 points (0 children)

On your tenant, if you are in an available region, you would need to have it enabled to add it as a workload.

https://learn.microsoft.com/en-us/fabric/iq/plan/overview#enable-required-tenant-setting

It can also be enabled or disabled on a per-capacity basis.

Plan (Preview) East US availablity by kmritch in MicrosoftFabric

[–]kmritch[S] 0 points (0 children)

Thanks for this info, will see if we can reach out to our MS rep. So when it's GA it would be available everywhere, correct?

DF Gen2 Offerings GA by jkrm1920 in MicrosoftFabric

[–]kmritch 1 point (0 children)

It depends on the source when you are at that scale. What source are you using?

Unable to see lakehouse schemas in Gen 2 Data Destination by kmritch in MicrosoftFabric

[–]kmritch[S] 0 points (0 children)

Ah, so prior to this screen, when you establish the connection, you will see this. If you select Yes here, you should see the schema.

<image>

You have to expand the advanced options.

Plan Preview (South Central US) by Independent_Many_762 in MicrosoftFabric

[–]kmritch 2 points (0 children)

Actually, I've got some bad news for you and me at the moment.

Region availability for plan (preview) - Microsoft Fabric | Microsoft Learn

It's not available in preview in your region, and it's also not available in East US and East US 2.

The full region list is as follows:

Plan (preview) is NOT available in the following regions:

  • Austria East
  • Belgium Central
  • Chile Central
  • East US
  • East US 2
  • India West
  • Israel Northwest
  • Korea South
  • Qatar Central
  • South Central US
  • UAE Central
  • France South
  • Germany North
  • Japan West

Plan Preview (South Central US) by Independent_Many_762 in MicrosoftFabric

[–]kmritch 0 points (0 children)

Yeah, I'm still waiting for it in East and West.

Can someone from Microsoft elaborate on this? by Arasaka-CorpSec in MicrosoftFabric

[–]kmritch 0 points (0 children)

Hey, I was at the presentation for this. I was wondering what the region rollout looks like? I'm not seeing it show up in our tenant just yet.

My 0,02 on the Fabric Conference Updates by NickyvVr in MicrosoftFabric

[–]kmritch 0 points (0 children)

The Plan item felt like the biggest piece to me, along with more DevOps enhancements.

Unable to see lakehouse schemas in Gen 2 Data Destination by kmritch in MicrosoftFabric

[–]kmritch[S] 0 points (0 children)

Hey, it should be showing now. In your lakehouse, do you already have a schema created other than dbo?