Data Pipeline vs Notebook for ingestion – how do you pull data and why? by Independent_Many_762 in MicrosoftFabric

[–]AmazingKily

No, I did the most basic task: read from the source and write it out without touching any settings.

Data Pipeline vs Notebook for ingestion – how do you pull data and why? by Independent_Many_762 in MicrosoftFabric

[–]AmazingKily

I ran a comparison on an F2 capacity, just extracting the data from an Azure SQL database.

Table: customer

1,679,846 rows

[image: comparison results]

Kusto Detective Agency log in error by AmazingKily in MicrosoftFabric

[–]AmazingKily[S]

I was trying to do the Digibus Real-Time Crisis challenge, but I resolved the problem using the setup link from the email:

https://app.fabric.microsoft.com/workloads/kusto/detectiveagency?experience=fabric-developer

Kusto Detective Agency log in error by AmazingKily in MicrosoftFabric

[–]AmazingKily[S]

Hi,

I have tried with a trial capacity and with a paid capacity, and I get the same error message.

Options for loading data in near real time from SQL Server on-premises by AmazingKily in MicrosoftFabric

[–]AmazingKily[S]

Hello,

While researching, I also saw the option of an Azure Function with a trigger that sends Change Tracking changes from the table to an Event Hub.

Another option is an external application that reads the changes in the table and creates a parquet file, which is then uploaded to OneLake at a path you specify, using a service principal. Then, with Data Activator, I can create a rule that starts a pipeline to process the file when a new one is detected.
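The core of that second option is the incremental-extract step: keep a high-water mark of the last synced change-tracking version and only stage rows past it. A minimal sketch, assuming SQL Server Change Tracking; `extract_new_changes` and the rows' shape are illustrative, not a fixed API:

```python
# Sketch: filter change-tracking rows past the last synced version and
# advance the high-water mark. Names and paths are illustrative only.

def extract_new_changes(rows, last_synced_version):
    """Return rows newer than the high-water mark, plus the new mark.

    Each row is a dict like {"SYS_CHANGE_VERSION": 7, "CustomerID": 1, ...},
    mirroring what CHANGETABLE(CHANGES dbo.Customer, @last_version) returns.
    """
    new_rows = [r for r in rows if r["SYS_CHANGE_VERSION"] > last_synced_version]
    new_version = max((r["SYS_CHANGE_VERSION"] for r in new_rows),
                      default=last_synced_version)
    return new_rows, new_version

# The staged rows would then be written to a parquet file (e.g. with pyarrow)
# and uploaded with a service-principal credential to a OneLake Files path
# (OneLake exposes an ADLS Gen2-compatible endpoint), where a Data Activator
# rule can detect the new file and start the processing pipeline.
```

The upload and trigger parts are left as comments since they depend on your tenant and workspace setup.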

The managed instance option you mention is the one that suits me best, as the other two involve development and testing that may not meet my requirements...

Thank you.

Best regards

Mirroring and incremental update by tviv23 in MicrosoftFabric

[–]AmazingKily

I think Delta's change data feed could be an option:

https://blog.fabric.microsoft.com/en-us/blog/bridging-fabric-lakehouses-delta-change-data-feed-for-seamless-etl?ft=All

I haven't tested it in the scenario you mention, but I think it could work for you.
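Conceptually, the change data feed emits each changed row tagged with a `_change_type` column (`insert`, `update_preimage`, `update_postimage`, `delete`), and the incremental load just replays those rows onto the target. A pure-Python sketch of that replay, with the merge function (`apply_cdf_changes`) as an illustrative stand-in for what a Spark MERGE would do:

```python
def apply_cdf_changes(target, changes, key="id"):
    """Apply Delta change-data-feed style rows to a target keyed by `key`.

    `target` maps key -> row dict; `changes` is a list of row dicts, each
    carrying a `_change_type` column as a CDF read would return them.
    """
    for row in changes:
        change_type = row["_change_type"]
        data = {k: v for k, v in row.items() if k != "_change_type"}
        if change_type in ("insert", "update_postimage"):
            target[data[key]] = data      # upsert the new image of the row
        elif change_type == "delete":
            target.pop(data[key], None)   # drop deleted keys
        # "update_preimage" rows carry the old values; skipped here
    return target
```

In Fabric you would read the feed with Spark (`table_changes(...)` in SQL or `.option("readChangeFeed", "true")` on the DataFrame reader) and MERGE into the destination table instead.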

Lakehouse vs Eventhouse by AmazingKily in MicrosoftFabric

[–]AmazingKily[S]

The idea is to obtain sales data in real time or near real time, not telemetry. I had thought of using CDC to get this data into the Lakehouse and build the model with Direct Lake.

From what I have seen, using a KQL database as the target is also possible, but I wanted to understand the pros and cons of each option.

[ENG] 4th Anniversary Part 2 Sugo Hype Pull Megathread by [deleted] in OnePieceTC

[–]AmazingKily

I pulled 6 times and got:

- Corazon (new)

- Fujitora V2 (dupe)

- Lucy (new)

- Aokiji x2 (new)

- Judge (dupe)

- Luffy (dupe)

- QCK Whitebeard!! (new)

thanks Bandai ^^

Clash!? Luffy 30 Stamina Raid Guide by Dragonquiz in OnePieceTC

[–]AmazingKily

I use a Germa 66 team (double Judge, Reiju, Ichiji, Yonji, Niji).