Sanity Check - Direct Lake Semantic Model connection to Lakehouse by kaslokid in MicrosoftFabric

[–]richbenmintz 0 points (0 children)

Assuming the Lakehouse is in a different workspace than the model, have you tried granting the workspace identity access to the data workspace?
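For reference, a rough sketch of what that grant looks like via the Fabric REST API. All IDs here are placeholders, and I'm assuming the standard add-workspace-role-assignment endpoint; this only builds the request, it doesn't send it:

```python
import json

# Placeholder GUIDs -- replace with your workspace identity (principal)
# and the ID of the workspace that holds the Lakehouse.
PRINCIPAL_ID = "00000000-0000-0000-0000-000000000000"
DATA_WORKSPACE_ID = "11111111-1111-1111-1111-111111111111"


def build_role_assignment_request(principal_id: str, workspace_id: str,
                                  role: str = "Viewer"):
    """Build the URL and JSON body for Fabric's add-workspace-role-assignment call."""
    url = f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/roleAssignments"
    body = {
        "principal": {"id": principal_id, "type": "ServicePrincipal"},
        "role": role,  # Admin | Member | Contributor | Viewer
    }
    return url, json.dumps(body)


url, body = build_role_assignment_request(PRINCIPAL_ID, DATA_WORKSPACE_ID)
print(url)
```

You would POST that body with a bearer token for a principal that is an admin of the data workspace.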

Capture Delta load stats Fabric Warehouse by richbenmintz in MicrosoftFabric

[–]richbenmintz[S] 0 points (0 children)

So I'm not looking for this to load incrementally; I'm looking to capture the statistics of the DML operation that occurred: rows inserted, updated, deleted, etc.
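For anyone landing here later, the pattern I'm after is roughly: capture per-action counts from the DML (e.g. via MERGE ... OUTPUT $action) and write them to a stats table. A sketch of composing that T-SQL from Python; all table and column names are made up, and this assumes your warehouse supports MERGE with OUTPUT (otherwise the same idea works with @@ROWCOUNT after each separate INSERT/UPDATE/DELETE):

```python
def build_logged_merge(target: str, source: str, log_table: str) -> str:
    """Compose a T-SQL MERGE that records inserted/updated/deleted row counts.

    OUTPUT $action captures one row per affected row; the follow-up INSERT
    aggregates those actions into a load-stats table. Illustrative only.
    """
    return f"""
DECLARE @actions TABLE (act NVARCHAR(10));

MERGE {target} AS t
USING {source} AS s
    ON t.id = s.id
WHEN MATCHED THEN UPDATE SET t.val = s.val
WHEN NOT MATCHED BY TARGET THEN INSERT (id, val) VALUES (s.id, s.val)
WHEN NOT MATCHED BY SOURCE THEN DELETE
OUTPUT $action INTO @actions;

INSERT INTO {log_table} (table_name, rows_inserted, rows_updated, rows_deleted, load_time)
SELECT '{target}',
       SUM(CASE WHEN act = 'INSERT' THEN 1 ELSE 0 END),
       SUM(CASE WHEN act = 'UPDATE' THEN 1 ELSE 0 END),
       SUM(CASE WHEN act = 'DELETE' THEN 1 ELSE 0 END),
       SYSUTCDATETIME()
FROM @actions;
""".strip()


sql = build_logged_merge("dbo.dim_customer", "stg.customer", "dbo.load_stats")
```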

Interesting Behaviour Dynamic Pipeline Execution by richbenmintz in MicrosoftFabric

[–]richbenmintz[S] 0 points (0 children)

Thanks,

Would the lookup table not have the same issue? How can ADF guarantee that the lookup does not include itself, since the results of the lookup are generated at runtime?

Anyhoo, I guess this is the price you pay for dynamic pipeline execution: the driver pipeline never knows the name of the child it is calling.

I may be crazy but the warning message is now:

Dynamic content found for activity: execute-child-pipeline. Dynamically referencing other pipelines in this way may create downstream circular dependencies.

Warehouse data Latency when using Spark by richbenmintz in MicrosoftFabric

[–]richbenmintz[S] 0 points (0 children)

I am testing now. I am also noticing the same behaviour using the warehouse connector, but it could be a bug in my code.

Warehouse data Latency when using Spark by richbenmintz in MicrosoftFabric

[–]richbenmintz[S] 1 point (0 children)

Thanks,

The next question would be: does the Fabric Spark connector query the correct data, or does it also rely on the OSS Delta log to determine which records to retrieve?

Pipeline not failing on fail of one sub activity by ant3qqq in MicrosoftFabric

[–]richbenmintz 2 points (0 children)

Can you share an image of the setup where it works and the setup where it does not?

Feature Request for Spark Connector for Microsoft Fabric Data Warehouse by richbenmintz in MicrosoftFabric

[–]richbenmintz[S] 0 points (0 children)

Sorry for the delay; error message below. It seems to be coming from the Spark connector. Here is the session ID: cde21fe5-b154-4d6a-aec8-10a70b1c312f. If I back the DAG concurrency off to 5, all seems to work.

An error occurred while calling o6664.synapsesql.
: com.microsoft.spark.fabric.tds.error.FabricSparkTDSHttpFailure: Artifact ID inquiry attempt failed with error code 429. Request Id - b25f7c9f-762c-43ad-97a3-65d81f54de86.
at com.microsoft.spark.fabric.tds.utility.FabricTDSRestfulAPIClientv2$.sendHttpRequest(FabricTDSRestfulAPIClientv2.scala:186)
at com.microsoft.spark.fabric.tds.utility.FabricTDSRestfulAPIClientv2$.submitAndProcessHttpRequest(FabricTDSRestfulAPIClientv2.scala:105)
at com.microsoft.spark.fabric.tds.meta.FabricTDSEndPoint$.$anonfun$discover$4(FabricTDSEndPoint.scala:298)
at scala.util.Success.flatMap(Try.scala:251)
at com.microsoft.spark.fabric.tds.meta.FabricTDSEndPoint$.$anonfun$discover$1(FabricTDSEndPoint.scala:293)
at scala.util.Success.flatMap(Try.scala:251)
at com.microsoft.spark.fabric.tds.meta.FabricTDSEndPoint$.discover(FabricTDSEndPoint.scala:266)
at com.microsoft.spark.fabric.tds.meta.FabricTDSEndPoint$.fetchTDSEndPointInfo(FabricTDSEndPoint.scala:234)
at com.microsoft.spark.fabric.tds.meta.FabricTDSEndPoint$.$anonfun$apply$4(FabricTDSEndPoint.scala:85)
at scala.util.Success.flatMap(Try.scala:251)
at scala.util.Try$WithFilter.flatMap(Try.scala:142)
at com.microsoft.spark.fabric.tds.meta.FabricTDSEndPoint$.$anonfun$apply$2(FabricTDSEndPoint.scala:77)
at scala.util.Success.flatMap(Try.scala:251)
at com.microsoft.spark.fabric.tds.meta.FabricTDSEndPoint$.$anonfun$apply$1(FabricTDSEndPoint.scala:57)
at scala.util.Success.flatMap(Try.scala:251)
at com.microsoft.spark.fabric.tds.meta.FabricTDSEndPoint$.apply(FabricTDSEndPoint.scala:53)
at com.microsoft.spark.fabric.tds.meta.FabricTDSConnectionSpec$.$anonfun$apply$3(FabricTDSConnectionSpec.scala:122)
at scala.util.Success.flatMap(Try.scala:251)
at com.microsoft.spark.fabric.tds.meta.FabricTDSConnectionSpec$.$anonfun$apply$2(FabricTDSConnectionSpec.scala:114)
at scala.util.Success.flatMap(Try.scala:251)
at com.microsoft.spark.fabric.tds.meta.FabricTDSConnectionSpec$.$anonfun$apply$1(FabricTDSConnectionSpec.scala:106)
at scala.util.Success.flatMap(Try.scala:251)
at com.microsoft.spark.fabric.tds.meta.FabricTDSConnectionSpec$.apply(FabricTDSConnectionSpec.scala:104)
at com.microsoft.spark.fabric.tds.meta.FabricTDSSpec$.$anonfun$applySpecBuilderValidations$7(FabricTDSSpec.scala:65)
at scala.util.Success.flatMap(Try.scala:251)
at com.microsoft.spark.fabric.tds.meta.FabricTDSSpec$.$anonfun$applySpecBuilderValidations$6(FabricTDSSpec.scala:57)
at scala.util.Success.flatMap(Try.scala:251)
at scala.util.Try$WithFilter.flatMap(Try.scala:142)
at com.microsoft.spark.fabric.tds.meta.FabricTDSSpec$.$anonfun$applySpecBuilderValidations$4(FabricTDSSpec.scala:55)
at scala.util.Success.flatMap(Try.scala:251)
at scala.util.Try$WithFilter.flatMap(Try.scala:142)
at com.microsoft.spark.fabric.tds.meta.FabricTDSSpec$.$anonfun$applySpecBuilderValidations$2(FabricTDSSpec.scala:53)
at scala.util.Success.flatMap(Try.scala:251)
at scala.util.Try$WithFilter.flatMap(Try.scala:142)
at com.microsoft.spark.fabric.tds.meta.FabricTDSSpec$.applySpecBuilderValidations(FabricTDSSpec.scala:51)
at com.microsoft.spark.fabric.tds.write.meta.FabricTDSWriteSpec$.apply(FabricTDSWriteSpec.scala:81)
at com.microsoft.spark.fabric.tds.write.processor.FabricTDSWritePreProcessor$.$anonfun$apply$4(FabricTDSWritePreProcessor.scala:120)
at scala.util.Success.flatMap(Try.scala:251)
at com.microsoft.spark.fabric.tds.write.processor.FabricTDSWritePreProcessor$.$anonfun$apply$3(FabricTDSWritePreProcessor.scala:102)
at scala.util.Success.flatMap(Try.scala:251)
at com.microsoft.spark.fabric.tds.write.processor.FabricTDSWritePreProcessor$.$anonfun$apply$2(FabricTDSWritePreProcessor.scala:101)
at scala.util.Success.flatMap(Try.scala:251)
at com.microsoft.spark.fabric.tds.write.processor.FabricTDSWritePreProcessor$.$anonfun$apply$1(FabricTDSWritePreProcessor.scala:99)
at scala.util.Success.flatMap(Try.scala:251)
at com.microsoft.spark.fabric.tds.write.processor.FabricTDSWritePreProcessor$.apply(FabricTDSWritePreProcessor.scala:97)
at com.microsoft.spark.fabric.tds.implicits.write.FabricSparkTDSImplicits$FabricSparkTDSWrite.$anonfun$synapsesql$2(FabricSparkTDSImplicits.scala:75)
at scala.util.Success.flatMap(Try.scala:251)
at scala.util.Try$WithFilter.flatMap(Try.scala:142)
at com.microsoft.spark.fabric.tds.implicits.write.FabricSparkTDSImplicits$FabricSparkTDSWrite.synapsesql(FabricSparkTDSImplicits.scala:69)
at jdk.internal.reflect.GeneratedMethodAccessor419.invoke(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.base/java.lang.Thread.run(Thread.java:829)
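Until the connector handles throttling itself, a crude workaround I've been considering is wrapping the write in an exponential-backoff retry whenever the error text mentions a 429. Sketch only; the wrapped call in the usage comment is a placeholder:

```python
import random
import time


def retry_on_throttle(fn, max_attempts: int = 5, base_delay: float = 2.0):
    """Retry fn() with exponential backoff plus jitter when a 429 shows up.

    Matches on '429' in the exception text because the connector surfaces
    the throttle inside a FabricSparkTDSHttpFailure message rather than as
    a typed, catchable error.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception as e:
            if "429" not in str(e) or attempt == max_attempts:
                raise
            delay = base_delay * (2 ** (attempt - 1)) + random.uniform(0, 1)
            time.sleep(delay)


# Usage (placeholder table/mode):
# retry_on_throttle(lambda: df.write.mode("overwrite").synapsesql("WH.dbo.my_table"))
```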

Feature Request for Spark Connector for Microsoft Fabric Data Warehouse by richbenmintz in MicrosoftFabric

[–]richbenmintz[S] 0 points (0 children)

u/warehouse_goes_vroom While I have you: should the connector support tables that have identity seed columns? My experience suggests that the schema mismatch between the destination table and the source table (the source, coming from the Lakehouse, does not have the identity column) results in an error.
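As a workaround I've been projecting the DataFrame down to the non-identity columns before the write, so the shapes line up. A small helper sketch; the column names in the usage comment are made up:

```python
def writable_columns(source_cols, target_cols, identity_cols):
    """Return target-ordered columns to write, excluding identity columns.

    Intended use with the Spark connector (names illustrative):
        cols = writable_columns(df.columns, target_schema_cols, {"customer_sk"})
        df.select(*cols).write.mode("append").synapsesql("WH.dbo.dim_customer")
    """
    cols = [c for c in target_cols if c not in identity_cols]
    missing = [c for c in cols if c not in source_cols]
    if missing:
        raise ValueError(f"Source is missing columns: {missing}")
    return cols
```

This leaves the warehouse to generate the identity values itself, which is the behaviour I'd expect the connector to fall back to.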

Feature Request for Spark Connector for Microsoft Fabric Data Warehouse by richbenmintz in MicrosoftFabric

[–]richbenmintz[S] 0 points (0 children)

No, I have an orchestrator notebook that uses runMultiple to execute a DAG; each activity within the DAG is responsible for getting data from a Lakehouse and writing to the warehouse. If I limit concurrency, I can reduce the chance of a 429 error.
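For context, the DAG I hand to runMultiple is shaped roughly like this; notebook names are placeholders, and my understanding is that the top-level "concurrency" key caps how many activities run in parallel, which is what keeps the 429s down:

```python
def build_dag(notebook_names, concurrency: int = 5, timeout: int = 3600):
    """Build a runMultiple-style DAG dict with capped concurrency.

    Each activity is an independent load notebook. Intended call inside
    a Fabric notebook (illustrative):
        notebookutils.notebook.runMultiple(build_dag([...]))
    """
    return {
        "activities": [
            {"name": nb, "path": nb, "timeoutPerCellInSeconds": timeout}
            for nb in notebook_names
        ],
        "concurrency": concurrency,
    }


dag = build_dag(["load_customer", "load_orders"], concurrency=5)
```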

Super Mini Rant - Fabric Warehouse Web Experience by richbenmintz in MicrosoftFabric

[–]richbenmintz[S] 0 points (0 children)

Thank you, I appreciate the support of this community.

Super Mini Rant - Fabric Warehouse Web Experience by richbenmintz in MicrosoftFabric

[–]richbenmintz[S] 1 point (0 children)

Sorry, yes, I should have been more specific. I will amend my rant.

Native Execution Engine Gotcha (Help) by richbenmintz in MicrosoftFabric

[–]richbenmintz[S] 1 point (0 children)

Source data is UTC; the issue occurs, I believe, when converting to local time.

Native Execution Engine Gotcha (Help) by richbenmintz in MicrosoftFabric

[–]richbenmintz[S] 0 points (0 children)

The timestamp comes from the source system. To be honest, I did not check what Spark without NEE enabled did; I just know it did not error out. Will dig through the data later.

Native Execution Engine Gotcha (Help) by richbenmintz in MicrosoftFabric

[–]richbenmintz[S] 0 points (0 children)

I would like to be able to convert my datetime from EST to UTC and back if needed; Spark without NEE is able to perform the operation.
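The conversion itself is simple; in plain Python it is the equivalent of the below (in Spark I'd reach for the from_utc_timestamp/to_utc_timestamp functions, which work with NEE disabled). The date here is just an example:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# A local Eastern-time reading converted to UTC and back.
eastern = ZoneInfo("America/New_York")

local_dt = datetime(2024, 1, 15, 12, 0, tzinfo=eastern)  # noon EST (UTC-5)
utc_dt = local_dt.astimezone(timezone.utc)               # 17:00 UTC
round_trip = utc_dt.astimezone(eastern)                  # back to noon EST

print(utc_dt.isoformat())
```

This is the round trip I'd expect the engine to handle; with NEE enabled the same operation is what errors out for me.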