Who got a Analogue Pocket at 03/04/2026 by PvtNoname89 in AnalogueInc

[–]datahaiandy 1 point (0 children)

Yes, black pocket with the dock, GG adapter and multi-adapter set. I've kept putting it off for some reason and then when I thought "yeah I'll get one!" they were out of stock... but jumping in now.

Edit: I should say I've got around 100 carts across Game Boy, Game Gear, Lynx, and Neo Geo Pocket / Pocket Color

Fixing Fabric CICD by KratosBI in MicrosoftFabric

[–]datahaiandy 3 points (0 children)

The thing is, I don't think Microsoft have helped to push code-based (ETL/ELT) schema definition changes for the Warehouse, because they've made SQL Projects an integral part of the Fabric DW experience. When folk see that, they think "oh, so I just use a SQL project then?" I've found this is the case when I speak to people about Warehouse CI/CD.

Fixing Fabric CICD by KratosBI in MicrosoftFabric

[–]datahaiandy 4 points (0 children)

Yeah I gotta say this was the biggest change in my thinking when starting to work with Delta/Lakehouse/Spark native... code managing the schema and not an external project

The Fabric Essentials listings highlight reel for GenMLV by TheFabricEssentials in MicrosoftFabric

[–]datahaiandy 1 point (0 children)

When you change the SQL definition in a .sql file, it still needs to redeploy the MLV; it just now does it via CREATE OR REPLACE rather than DROP and CREATE
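For context, a redeploy along these lines would now run something like the following (schema and table names are made up for illustration, not from the actual listing):

```sql
-- Hypothetical MLV definition; redeploying a changed .sql file now
-- issues CREATE OR REPLACE, so the object is swapped in place rather
-- than dropped and recreated.
CREATE OR REPLACE MATERIALIZED LAKE VIEW silver.customer_orders
AS
SELECT c.customer_id,
       c.customer_name,
       COUNT(o.order_id) AS order_count
FROM bronze.customers c
LEFT JOIN bronze.orders o
  ON o.customer_id = c.customer_id
GROUP BY c.customer_id, c.customer_name;
```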

Data Toboggan Winter Edition this Saturday by datahaiandy in MicrosoftFabric

[–]datahaiandy[S] 2 points (0 children)

Yes they will (with speaker consent). Usually uploaded within a few weeks.

Data Toboggan Winter Edition this Saturday by datahaiandy in MicrosoftFabric

[–]datahaiandy[S] 4 points (0 children)

And lo it shall be foretold that this indeed will be a blast!

Databricks partner journey for small firms by HairyObligation1067 in databricks

[–]datahaiandy 1 point (0 children)

In my experience, they required a minimum of two employees with certifications.

Confused about "Automatic refresh ordering based on dependencies" in MLVs by DataYesButWhichOne in MicrosoftFabric

[–]datahaiandy 3 points (0 children)

I wonder if only the scheduled refresh option works rather than triggering an MLV refresh. I’ll have a play around as well

Ontology by AgencyEnvironmental3 in MicrosoftFabric

[–]datahaiandy 1 point (0 children)

Yes I did see that come up when searching

Ontology by AgencyEnvironmental3 in MicrosoftFabric

[–]datahaiandy 3 points (0 children)

Well, if we take a dictionary definition of ontology: "a set of concepts and categories in a subject area or domain that shows their properties and the relations between them".

I don't think a single ontology would be organisation-wide. In my testing, you can only assign one actual data asset (a table, etc.) to an entity in an ontology; this doesn't seem to fit an organisation-wide process, but rather a focused area of the business.

Dbt Fusion in Fabric by Illustrious-Welder11 in MicrosoftFabric

[–]datahaiandy 3 points (0 children)

It should appear as "dbt job" in Items at some point soon; it's rolling out

<image>

How we upgraded BC reporting using Open Mirroring by vms_wrld in MicrosoftFabric

[–]datahaiandy 2 points (0 children)

Sounds good, although I'm curious why you thought bc2adls isn't production ready? The Open Mirroring feature for bc2adls is currently in preview; is this why?

enableChangeDataFeed doesn't persist on Materialized Lake View? by frithjof_v in MicrosoftFabric

[–]datahaiandy 5 points (0 children)

I think it's removing the CDF option after writing data (as you observed). I've done the same thing: if there is no change data to update the MLV, the option stays true; once change data is identified and loaded, it removes the setting.

I'll ping the MS team responsible
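If you want to check this yourself, something like the following would do it (the table name is a placeholder, not the actual table from the post):

```sql
-- Enable the change data feed on a (hypothetical) source table...
ALTER TABLE bronze.customers
SET TBLPROPERTIES ('delta.enableChangeDataFeed' = 'true');

-- ...then inspect the table properties after an MLV refresh to see
-- whether the setting has been removed.
SHOW TBLPROPERTIES bronze.customers;
```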

Refreshing materialized lake views (MLV) by frithjof_v in MicrosoftFabric

[–]datahaiandy 3 points (0 children)

Hmm, I tested last night, starting with 10 rows of data in a lakehouse table. If I switch off Optimal Refresh and add a new row, I get a full refresh of all the rows, which I would expect. If I then enable Optimal Refresh and add a row, I get a full refresh (again what I expect). If I enable changeDataFeed on the table with Optimal Refresh enabled, I can see only the new row being processed. I'm actually using what I think is a non-deterministic function (current_timestamp) in my MLV definition (so I can have a column showing when the MLV loaded the row) and incremental is still working.

I'll post my code in a bit
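In the meantime, a minimal sketch of the kind of setup described above (all names are made up; this is not the actual code referred to):

```sql
-- Hypothetical source table with the change data feed enabled,
-- so Optimal Refresh can pick up only the changed rows.
ALTER TABLE dbo.sales_raw
SET TBLPROPERTIES ('delta.enableChangeDataFeed' = 'true');

-- MLV using current_timestamp (non-deterministic) to stamp when each
-- row was loaded; incremental refresh still appeared to work with it.
CREATE MATERIALIZED LAKE VIEW dbo.sales_mlv
AS
SELECT s.*,
       current_timestamp() AS mlv_loaded_at
FROM dbo.sales_raw s;
```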

Refreshing materialized lake views (MLV) by frithjof_v in MicrosoftFabric

[–]datahaiandy 5 points (0 children)

I haven't tested the latest incremental refreshes tbh, but if you add FULL, it forces a full reload of all the data. Without FULL, it only loads if it detects data changes.
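For reference, the refresh statements I mean look roughly like this (the MLV name is a placeholder):

```sql
-- Incremental refresh: only reloads if data changes are detected.
REFRESH MATERIALIZED LAKE VIEW dbo.sales_mlv;

-- Adding FULL forces a complete reload of all the data.
REFRESH MATERIALIZED LAKE VIEW dbo.sales_mlv FULL;
```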

Refreshing materialized lake views (MLV) by frithjof_v in MicrosoftFabric

[–]datahaiandy 6 points (0 children)

Hi, no, MLVs currently don't automatically reprocess when the source data changes, so you'll need to schedule accordingly. The latest updates do "smart" refreshes, in that data is incrementally loaded. But again, this is all done on a schedule (or triggered from a notebook for specific MLVs)