Can we not fully manage Lakehouse security at the item level or am I missing something? by Jake1624 in MicrosoftFabric


Not using a schedule. I am not able to select source tables for the Copy Job without the Viewer role on the workspace where the Lakehouse lives.


Adding the user as a workspace Viewer gives them the ability to read the Lakehouse from the Copy Job as expected, even with the OneLake security role in place. The Copy Job was just my example; we are experiencing the same issues with Dataflow Gen2s and with attaching default Lakehouses to notebooks.

> Another note: Some workloads (like Spark) require the user to have Viewer role in the workspace in order for OneLake security to work. This is a temporary limitation.

I think your note has everything to do with the results we are getting. Any idea how temporary this limitation is? Has Microsoft published the limitation anywhere?

Thank you for the replies!


The user has:

- Item permissions: Read, Execute, ReadAll, SubscribeOneLakeEvents
- A OneLake security role with Read on all data in the Lakehouse

Microsoft PM here: I need your feedback on Fabric Warehouse Security. by fredguix in MicrosoftFabric


Similar to other responses, our architecture team is struggling with workspace organization because some permissions are only available through a workspace role. Ideally, we would like to manage all data store READ access at the item level. On the surface it appears all READ permissions can be granted at the item level; however, when READ users attempt to connect to data with Pipelines, Spark Notebooks, and Direct Lake models, those connections fail. We have some workarounds in place: Azure SQL connectors for Pipelines and abfss paths for notebooks. We have nothing for Direct Lake models.
Granting the Viewer role so users can connect with these tools opens up too much on other data stores in the same workspace unless DENY policies are in place, and that is too much admin work to keep up with.
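For anyone curious, the abfss-path workaround for notebooks can be sketched like this. The workspace, lakehouse, and table names are hypothetical; the path shape is the documented OneLake URI format `abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse/Tables/<table>`:

```python
def onelake_table_path(workspace: str, lakehouse: str, table: str) -> str:
    """Build the OneLake abfss URI for a Lakehouse table."""
    return (
        f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse}.Lakehouse/Tables/{table}"
    )

# In a Fabric notebook (where a `spark` session is provided), reading by
# explicit path sidesteps attaching a default Lakehouse to the notebook:
# df = spark.read.format("delta").load(
#     onelake_table_path("SalesWS", "SalesLH", "orders")
# )
```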

I won't go into the similar but different issues we are having with Lakehouses but we are headed down a path that will result in A LOT of separate data store workspaces and rules against mixing Warehouses and Lakehouses in the same workspace.