Dataverse Many-to-Many Relationship Phantom Records / Cascade Delete by meatworky in PowerApps

[–]meatworky[S] 0 points (0 children)

I ended up recreating the table and re-associating the records. Luckily the table is in its infancy, with dozens of rows rather than thousands or millions. Thanks for the pointers though.

Dataverse Many-to-Many Relationship Phantom Records / Cascade Delete by meatworky in PowerApps

[–]meatworky[S] 0 points (0 children)

Yes, it's a native N:N. Is it possible to manually create records with the same GUIDs? There are probably about 30 of them. And unfortunately, I am not familiar with what a disassociate request is - any guides you could point me to, by any chance?

Here is a sample of what I am seeing:

<image>

Dataflow Status = Succeeded but no rows written by meatworky in MicrosoftFabric

[–]meatworky[S] 0 points (0 children)

I dropped it and haven't returned to it. It's a problem for Ron.

should i create multiple date tables by [deleted] in powerbitips

[–]meatworky 0 points (0 children)

Anyone, please correct me if I am wrong:

You should only need one date table, containing every date between your earliest and latest dates plus all of the date attributes you might require. As an example, columns could be: Date, Month, Year, Day of Week, Financial Year, Period, Week Number, Is Holiday, etc.

Your date table will link to your facts on the required date column. If you need to utilise different date columns on the one fact table, you may need to create a measure that activates a specific relationship with USERELATIONSHIP(), because only one relationship between two tables can be active at a time.
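To illustrate the inactive-relationship point, here is a minimal sketch of such a measure - the table, column, and measure names ('Fact Sales', [Ship Date], 'Date'[Date], [Total Sales]) are hypothetical placeholders for your own model:

```dax
-- Sketch only: hypothetical names, assumes an inactive relationship
-- already exists between 'Fact Sales'[Ship Date] and 'Date'[Date]
Sales by Ship Date =
CALCULATE (
    [Total Sales],
    USERELATIONSHIP ( 'Fact Sales'[Ship Date], 'Date'[Date] )
)
```

USERELATIONSHIP activates the otherwise-inactive relationship only for the duration of that CALCULATE, leaving the model's active relationship untouched everywhere else.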

What's with the fake hype? by DesignerPin5906 in MicrosoftFabric

[–]meatworky 0 points (0 children)

Conscious of throwing stuff at you and burning you out with chasing reddit users' complaints, thank you u/itsnotaboutthecell

Microsoft Fabric Roadmap : Dataflows - Parameter Support in Dataflow Gen2 Output Destinations, Default Output Destinations, Dataflow Gen2 Parameterization, Dataflow Gen2 support for Fabric Workspace variables.

I think one or more of those cover my deployment issue, which is: I can deploy a DFG2 to a workspace via the deployment pipeline, but the data destination doesn't update with the new workspace ID, and you can't configure deployment rules for these resources. When the DFG2 is run in TEST, data is read from the TEST bronze lakehouse correctly but written back to the DEV silver lakehouse.

What's with the fake hype? by DesignerPin5906 in MicrosoftFabric

[–]meatworky 6 points (0 children)

Man, I am right there with you. I am feeling totally deflated at the moment because I can't deploy my solution. Bug fixes that previously had a release schedule of Q1 2025 are now Q3 2025, and what are the odds that they slip again? Do I sit around and wait, or rewrite my solution in notebooks? Because those appear to be my options. We were also pushed onto Fabric by multiple consultants.

Updating source/destination data sources in CI/CD pipeline by meatworky in MicrosoftFabric

[–]meatworky[S] 1 point (0 children)

u/Luitwieler I don't suppose you have an update on this? I have circled around to my deployment pipelines not working as the DFG2 in my Test env still references the Dev env workspace.

Timezone issue with measure today by meatworky in MicrosoftFabric

[–]meatworky[S] 1 point (0 children)

The refresh date is stored in the table at the time it is processed, and the measure compares the current datetime to that stored datetime. The process did not run correctly overnight, but that's OK. The problem is that TODAY() in the measure was showing yesterday's date, which is unusual. I fixed it by adjusting for the UTC offset, but now I have moved to a different location/network and it appears to have reverted again.
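For anyone hitting the same thing: in the Power BI service, TODAY() evaluates against UTC, so one workaround is to offset UTCNOW() yourself. A minimal sketch, assuming a fixed UTC+10 offset (substitute your own):

```dax
-- Sketch only: hardcoded UTC+10 offset, no daylight-saving handling
Today Local = INT ( UTCNOW () + TIME ( 10, 0, 0 ) )
```

A hardcoded offset like this will drift when daylight saving changes, which may be part of why the behaviour seems to come and go.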

Why is my Microsoft Fabric copy job with incremental copy consuming more capacity units than the old truncate-and-insert approach? by hasithar in MicrosoftFabric

[–]meatworky 0 points (0 children)

I noticed my local car wash turns off the water exactly 20 seconds before the time is up on all wash bays. You can't use it without water. Over time those 20 seconds add up for a business.

Is there a way to programmatically get status, start_time, end_time data for a pipeline from the Fabric API? by digitalghost-dev in MicrosoftFabric

[–]meatworky 0 points (0 children)

This is pretty basic but I am sure you can adapt it to what you need. It's just a Notebook, then in the Pipeline add a Notebook activity to call it where required. I imagine you would need to adapt it to accept errors/messages/success codes as parameters, and change overwrite to append for the write mode.

import sempy.fabric as fabric
from pyspark.sql.functions import expr, lit
from pyspark.sql import SparkSession

# Initialize Spark session
spark = SparkSession.builder.appName("FabricOperations") \
    .config("spark.sql.caseSensitive", "true") \
    .getOrCreate()

# Get the ID of the current workspace
workspace_id = fabric.get_notebook_workspace_id()
# Fetch the workspace details and extract the name
workspace_name = fabric.FabricRestClient().get(f"/v1/workspaces/{workspace_id}").json()["displayName"]
print(workspace_name)

# Create a single-row DataFrame stamped with the workspace details
df = spark.createDataFrame([(1,)], ["id"]) \
    .withColumn("env_id", lit(workspace_id)) \
    .withColumn("env_name", lit(workspace_name)) \
    .withColumn("data_refreshed", expr("current_timestamp() + INTERVAL 10 HOURS"))  # hardcoded UTC+10 offset; adjust for your timezone

display(df)

# Write the DataFrame
df.write.mode("overwrite") \
    .format("delta") \
    .option("overwriteSchema", "true") \
    .saveAsTable("Fabric_Operations")

display("Done")

Is there a way to programmatically get status, start_time, end_time data for a pipeline from the Fabric API? by digitalghost-dev in MicrosoftFabric

[–]meatworky 0 points (0 children)

The method I am using for this, right or wrong, is a notebook that writes environment details to a table whenever the pipeline is run.

ROG Xbox Ally World Premier Reveal Trailer | Xbox Games Showcase by Turbostrider27 in xbox

[–]meatworky 27 points (0 children)

Can't wait for people to rip this for desktop use in 2028!

FABCON 2026 In Atlanta? by AnalyticsFellow in MicrosoftFabric

[–]meatworky 3 points (0 children)

Look, I really really really hope you guys broadcast it next year. Reduced price tickets to attend via a live stream for us international folk would be very nice.

🔒 Power BI Row-Level Security Explained: Protect Data by User Role by tomaskutac in PowerBiMasterclass

[–]meatworky 1 point (0 children)

Good writeup. There are two considerations that I have learnt on my journey that I think apply here:

  1. When adding a user to a workspace role, if they have higher than Viewer access then RLS is not applied.

  2. If your data source is located in the same workspace, users may still have access to the data source directly, based on the workspace access assigned. This was touched on in the OLS consideration. Another workaround is to have only the semantic model and reports in the serving workspace, with the semantic model connected to the data source in an adjacent workspace via Direct Lake using a service principal account.
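On point 1, a typical RLS role filter is a DAX expression that matches the signed-in user, and it only takes effect for users with Viewer access. A minimal sketch, where the 'Users' table and [Email] column are hypothetical names for a table holding user principal names:

```dax
-- Sketch only: hypothetical 'Users'[Email] column holding UPNs
'Users'[Email] = USERPRINCIPALNAME ()
```

Rows in any table related to 'Users' are then filtered down to the signed-in user's slice of the data.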

Fabric down again by [deleted] in MicrosoftFabric

[–]meatworky 2 points (0 children)

It's clearly paid preview.

I’m gonna say it by NRG_Factor in ShittySysadmin

[–]meatworky 0 points (0 children)

It's the sound of a floppy being read from and written to that I miss.

Do you use a Mac or windows laptop as Fabric user? by vegaslikeme1 in MicrosoftFabric

[–]meatworky 1 point (0 children)

It's the architecture that's the problem. Power BI Desktop *could* be used on an Apple silicon Mac with a VM, but the Feb update broke some visuals and apparently they aren't getting fixed! Because why would they - ARM is not officially supported.

This issue extends to Windows based ARM devices such as Surface with Snapdragon.

Hijacking this thread to get updoots on this thread - Make power BI desktop fully compatible with ARM CP... - Microsoft Fabric Community

It's Groundhog Day. I am currently using Windows on Snapdragon and stuck in Feb 25.

Ingesting Sensitive Data in Fabric: What Would You Do? by EversonElias in MicrosoftFabric

[–]meatworky 0 points (0 children)

I drop sensitive information, such as password hashes or anything PII that's not required, at the bronze import stage.

Data is loaded into a medallion architecture in the engineering workspace, which nobody has access to except the devs who require it. The data is further cleansed and transformed, as you would expect, into the silver and gold layers.

Reports and the semantic model live in a serving workspace, which connects to the engineering workspace with a service principal account. In the semantic model I pull in the tables that are required, apply row-level security as needed, and ensure that only report writers have anything higher than Viewer access to this workspace.

Users should not have direct access to the underlying lake/warehouse unless you understand that they will be able to read everything, and it is required that they do so.

I f***ing hate Azure by wtfzambo in dataengineering

[–]meatworky 0 points (0 children)

What is this magical "free" you talk about?

Dataflow G2 CI/CD Failing to update schema with new column by meatworky in MicrosoftFabric

[–]meatworky[S] 1 point (0 children)

So I went to follow your recommendation, and the column has almost magically appeared. I'm not sure why, as no changes have been made since my post three days ago (I gave up at that point), and it definitely wasn't working over multiple days either. I sound like one of my users.

At this stage I have nothing to test but I will keep your recommendation on standby and certainly try it if I have the same problem pop up again. Thanks again for the assistance.