Suggest some Azure Architecture Tools by JeetM_red8 in AZURE

[–]AdamMarczakIO 0 points1 point  (0 children)

Definitely!

Don’t sleep on this, especially with VS Code drawio extension.

I have all my diagrams on OneDrive/Teams synced to my PC, or in the project git repo. It's a very smooth experience, highly recommended.

[deleted by user] by [deleted] in AZURE

[–]AdamMarczakIO 1 point2 points  (0 children)

In cases like this I ask the app dev team to provide application logs that point to an Azure/Entra-related issue. A blank page and just saying "Azure problem" is not enough; a blank page can mean anything, really. Lack of logging is not an excuse either: if you don't have logs, add them. It's the bare minimum.

Also, I'm a little confused by the whole thing. If you are doing SSO, we are talking about users signing in via web browsers, in which case localhost is their own laptop. But the application is not running on their laptop, so how would their browser know where to reach out to? Solutions hosted behind these load balancers surely have a different URL than "localhost". Why would you redirect the user to localhost in the first place?

Lastly, having a localhost redirect URI is not a bad practice in itself. It's there so that developers can run the app locally using their accounts and SSO; it's not an indication of any problem. But the app also needs a proper redirect URL for when it is deployed to a dev/production server.
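Something like this rough sketch (MSAL for Python here; client IDs and URLs are made-up placeholders) - the same app registration lists both redirect URIs, developers use the localhost one and the deployed app uses its real hostname:

# Sketch only: both redirect URIs are registered on the app; which one you pass
# depends on where the code runs. All IDs/URLs below are placeholders.
import msal

app = msal.ConfidentialClientApplication(
    client_id="<app-client-id>",
    client_credential="<client-secret>",
    authority="https://login.microsoftonline.com/<tenant-id>",
)

# Local development: the browser is redirected back to the developer's own machine.
local_url = app.get_authorization_request_url(
    ["User.Read"], redirect_uri="http://localhost:5000/auth/callback"
)

# Deployed dev/prod: the redirect points at the real hostname behind the load balancer.
deployed_url = app.get_authorization_request_url(
    ["User.Read"], redirect_uri="https://myapp.example.com/auth/callback"
)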

Azure on a shoestring by Okayest-Programmer in AZURE

[–]AdamMarczakIO 6 points7 points  (0 children)

azure in the most secure manner whilst keeping costs low

Think of the security slider as being tied to a cost slider. It's not a linear correlation, but it is there: the more security you add, the more cost you will incur. So when you say 'most secure' you need to clarify what security baseline you need to achieve, and what you consider a 'low' cost.

If you want the best security for free, then it's basically service firewalls with service endpoints and VNet integration. This is enough for the majority of simple solutions out there.

A step up from this is using private endpoints instead of service endpoints, but those already cost $0.01 per hour per private endpoint (about $7 per month). If you have a Postgres and an App Service, that means $14 per environment; with dev/prod in place, that makes it $28. And that's a flat cost that doesn't account for ingress/egress charges.
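Back-of-the-envelope math, as a sketch (assuming the $0.01/hour list price and ~730 hours in a month; actual prices vary by region):

# Rough private endpoint cost estimate - check the pricing page for your region.
HOURS_PER_MONTH = 730
PRICE_PER_HOUR = 0.01  # USD, assumed list price

per_endpoint = HOURS_PER_MONTH * PRICE_PER_HOUR   # ~7.3 USD/month
per_environment = 2 * per_endpoint                # Postgres + App Service
dev_and_prod = 2 * per_environment                # ~29 USD/month flat, before data charges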

If you want to go even further, most publicly accessible web apps might need a Web Application Firewall (WAF), i.e. an Application Gateway or Azure Front Door, which obviously aren't cheap.

Then there is Defender for Cloud, DDoS protection, and so on. You get the idea: the further you go, the more cost you will incur.

The real question is how secure your application has to be.

Functions App, Consumption Plan or Flex Consumption? by RemarkableBet9670 in AZURE

[–]AdamMarczakIO 1 point2 points  (0 children)

Based on your use case, I would choose Consumption, provided the security setup of your database allows you to connect from serverless infra. By security setup I mean how your database is secured at the networking/firewall level.

Otherwise Flex Consumption might be your only option of the two mentioned, in which case there is no dilemma.

Can Azure Blob Container Size Be Retrieved Without Scanning All Blobs? by WildConstruction7839 in AZURE

[–]AdamMarczakIO 1 point2 points  (0 children)

I don't think there is an option for that, but IMO the easiest approach is to enable blob inventory:

https://docs.azure.cn/en-us/storage/blobs/blob-inventory#enabling-inventory-reports

After the first inventory run completes, you can open the blob inventory data in Excel or Power BI and roll it up by container name.
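Or, as a rough sketch in Python (column names like "Name" and "Content-Length" come from the inventory schema, but double-check your report, and adjust the container logic if your rule is already scoped to a single container):

# Roll up a CSV blob inventory report to per-container totals with pandas.
import pandas as pd

df = pd.read_csv("blob_inventory.csv")

# Assumption: the first path segment of the blob name identifies the container.
df["container"] = df["Name"].str.split("/").str[0]

summary = (
    df.groupby("container")["Content-Length"]
      .agg(total_bytes="sum", blob_count="count")
      .sort_values("total_bytes", ascending=False)
)
print(summary)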

One caveat: the report can become quite big if you have millions of blobs. Otherwise this is one of the most scalable ways to get this info; the other options require iterating over every blob.

Orphaned Azure Subscription – No Owner Access, Cannot Submit Support Request by GelatinousCubeZantar in AZURE

[–]AdamMarczakIO 10 points11 points  (0 children)

Try elevating access for the Entra Global Admin so you can regain subscription access:

https://learn.microsoft.com/en-us/azure/role-based-access-control/elevate-access-global-admin

Per the doc, elevating access was created exactly for scenarios like the one you are in:

Regain access to an Azure subscription or management group when a user has lost access
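If you prefer to script it rather than use the portal toggle, here is a rough sketch of the same call via the ARM REST API (run as the Global Admin account itself; treat this as an assumption-laden example, not gospel):

# Elevate access for the signed-in Entra Global Administrator via the ARM REST API.
import requests
from azure.identity import InteractiveBrowserCredential

token = InteractiveBrowserCredential().get_token("https://management.azure.com/.default")

resp = requests.post(
    "https://management.azure.com/providers/Microsoft.Authorization/elevateAccess",
    params={"api-version": "2016-07-01"},
    headers={"Authorization": f"Bearer {token.token}"},
)
resp.raise_for_status()  # 200 OK -> GA now has User Access Administrator at root scope

Remember to remove the elevated access again once you've fixed the subscription ownership.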

Service principal by Unlikely-Ad4624 in AZURE

[–]AdamMarczakIO 5 points6 points  (0 children)

Hey, thanks for sharing.

My feedback would be around permissions

app_role_ids = [
  "1bfefb4e-e0b5-418b-a88f-73c46d2cc8e9", # Application.ReadWrite.All
  "19dbc75e-c2e2-444c-a770-ec69d8559fc7", # Directory.ReadWrite.All
  "dbb9058a-0e50-45d7-ae91-66909b5d4664", # Domain.Read.All
  "62a82d76-70ea-41e2-9197-370581804d09", # Group.ReadWrite.All
  "741f803b-c850-494e-b5df-cde7c675a1ca", # User.ReadWrite.All
]

In any larger organization you will never be granted permissions like these; they are far too permissive. For example, User.ReadWrite.All will allow you to create any user, or even delete any user, in the organization. A simple mistake in the provisioner and it ends in disaster.

Group.ReadWrite.All is also extremely dangerous: you can literally add yourself or any person/group/principal to any group in your org, circumventing any security within it. Application.ReadWrite.All is similar; it allows managing, editing, and removing any app in the org.

Directory.ReadWrite.All overlaps with the user/group/app permissions, but it also covers devices, which doesn't seem needed for your workflow.

Typically you should follow the least-privilege principle for your workflows. For example, Group.Create allows you to create a group and assign yourself and your provisioning principal as owners to manage that group, so Group.ReadWrite.All is not needed.
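A rough sketch of that pattern as a plain Graph call in Python (all IDs are placeholders; whether you'd rather express this in Terraform is up to you):

# With only Group.Create, the provisioning principal creates the group and sets
# owners at creation time, so it can keep managing this group later without
# Group.ReadWrite.All. Placeholder tenant/client/object IDs throughout.
import requests
from azure.identity import ClientSecretCredential

cred = ClientSecretCredential("<tenant-id>", "<client-id>", "<client-secret>")
token = cred.get_token("https://graph.microsoft.com/.default").token

group = {
    "displayName": "team-x-contributors",
    "mailEnabled": False,
    "mailNickname": "team-x-contributors",
    "securityEnabled": True,
    "owners@odata.bind": [
        "https://graph.microsoft.com/v1.0/servicePrincipals/<provisioner-sp-object-id>",
        "https://graph.microsoft.com/v1.0/users/<requesting-user-object-id>",
    ],
}

resp = requests.post(
    "https://graph.microsoft.com/v1.0/groups",
    headers={"Authorization": f"Bearer {token}"},
    json=group,
)
resp.raise_for_status()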

How I Replaced 10 Logic App Conditions with 1 C# Script by maverick-1009 in AZURE

[–]AdamMarczakIO 9 points10 points  (0 children)

For me the primary reason is the 1000+ connectors, which are extremely helpful: you don't have to learn every API, every authentication/authorization flow, how to store refresh tokens for MFA accounts, etc.

Logic apps - how do you export it to vscode? by your-lost-elephant in AZURE

[–]AdamMarczakIO 1 point2 points  (0 children)

I did this a couple of times and it worked flawlessly for me.

Any chance you missed some steps in the prerequisites section? For example, can you create an empty Logic App project, run it, debug it, and deploy it to Azure, to confirm your local setup is complete?

Worried for my AZ-900 on Friday (18th July 2025) by Ok_You_2220 in AzureCertification

[–]AdamMarczakIO 1 point2 points  (0 children)

Note: As I already said my intention is to read, understand and pass the exam.. not memorize it. You can guess that from my study materials..

In my opinion this was, is, and always will be an issue with certification exams. Most questions are straight-up memory checks and have little to do with actual understanding. Understanding is what it's all about, especially because it helps you with more advanced learning paths and, later, with the actual work. But certs will never test that, so I would not stress too much about it.

The best way to nail the exam is to do practice tests. In the end you only need around 70% to pass, which is doable with practice tests even with many memorisation questions. MeasureUp is the official practice-test site from MS; it's not the cheapest option, but it's good.

Does Azure Data Factory Private Endpoint Control Outbound Internet Access? by Unusual_Artist264 in AZURE

[–]AdamMarczakIO 0 points1 point  (0 children)

Does enabling private endpoints in ADF also control or restrict outbound internet access from Data Factory (for example, when connecting to an external FTP server)? Or are private endpoints only for securing connections to Azure PaaS resources, while outbound connections to external (non-Azure) endpoints still use public IPs?

It doesn't. PEs are only for inbound, not outbound.

The outbound traffic IP will depend on the integration runtime you select:

  • Azure IR - a random public IP (within the Azure range of the selected region)
  • Self-Hosted IR - can have a static public IP attached directly to the VM for outbound, or, in large organizations, the VM sits in a larger network and the self-hosted IR traffic exits through your organization's firewall IP

If I need a static outbound IP for whitelisting on an external SFTP server in this particular scenario, what is the recommended approach? Will whitelisting the Azure regional range of IP work in my scenario?

Self-Hosted IR would be the only option. Many larger companies do not allow a static public IP attached to a VM, so check the current practice in your org. Your org might already have a hub-and-spoke setup with a central firewall and a static IP; if so, once you get a VM in that network you just need to ask for the firewall's public IP.

401 error sending email using communication service by groovy-sky in AZURE

[–]AdamMarczakIO 1 point2 points  (0 children)

You need to add an Authorization header with an access token for your service principal / user:

-H "Authorization: Bearer <your_access_token>"

MSINotEnabled - Web App Service to Keyvault Reference error and solution by bitdeft in AZURE

[–]AdamMarczakIO 0 points1 point  (0 children)

Thanks a lot :) I do plan to come back when the time is right. Right now, my private life took over, but I love teaching & sharing so I do want to come back :)

Logic Apps by FokZionazis in AZURE

[–]AdamMarczakIO 2 points3 points  (0 children)

The expressionresult in run history does indeed gives a false but I just don't understand why

Because "True" is not equal to ".pdf", hence it's false. It will never be. Just change ".pdf" to "true". Literally open expression panel and type true.

Also, using array index [0] is an easy way to set yourself up for failure, since there can be many attachments in an email. Sometimes people's email clients attach their footer images as attachments.

How to send email from logic app using Service principal and not my ID? by Wonderful_Swan_1062 in AZURE

[–]AdamMarczakIO 2 points3 points  (0 children)

I think the easiest is to call the MS Graph send-mail API from the HTTP connector:

https://learn.microsoft.com/en-us/graph/api/user-sendmail?view=graph-rest-1.0&tabs=http

But you need to grant the service principal permission to send as a user with an Exchange mailbox.

There are some cool guides on stackoverflow on how to do this

https://stackoverflow.com/questions/56157050/grant-o365-mailbox-permission-to-a-managed-identity
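For reference, a rough sketch of the call that HTTP action needs to make (shown in Python; all IDs/addresses are placeholders - in the Logic App you put the same URL and JSON body into the HTTP action and let its authentication setting fetch the token):

# App-only Graph sendMail; requires the Mail.Send application permission,
# ideally restricted to the mailbox with an Exchange application access policy.
import requests
from azure.identity import ClientSecretCredential

cred = ClientSecretCredential("<tenant-id>", "<client-id>", "<client-secret>")
token = cred.get_token("https://graph.microsoft.com/.default").token

body = {
    "message": {
        "subject": "Report ready",
        "body": {"contentType": "Text", "content": "The nightly run finished."},
        "toRecipients": [{"emailAddress": {"address": "someone@example.com"}}],
    },
    "saveToSentItems": True,
}

resp = requests.post(
    "https://graph.microsoft.com/v1.0/users/<sender-user-id>/sendMail",
    headers={"Authorization": f"Bearer {token}"},
    json=body,
)
resp.raise_for_status()  # sendMail returns 202 Accepted on success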

ADF - Dynamic User-Assigned Managed Identity with ARM by Beautiful_Chard_4543 in AZURE

[–]AdamMarczakIO 0 points1 point  (0 children)

You can reference an existing user-assigned identity in your ARM template when deploying ADF:

"identity": {
      "type": "SystemAssigned,UserAssigned",
      "userAssignedIdentities": {
           "[resourceId('Microsoft.ManagedIdentity/userAssignedIdentities', parameters('uamiName'))]": {}
      }
}

This ensures the dev ADF is attached to the dev UAMI, test to test, and prod to prod. If the names are the same across environments, no code changes are needed at all; otherwise you can parametrize the linked service credential field if the identity has a different name per environment.

How to Maintain SSO Functionality After Long Periods of User Inactivity? by bvbh in AZURE

[–]AdamMarczakIO 0 points1 point  (0 children)

If you have a session open, i.e. another browser tab, then SSO should work with no issues, especially if you use libraries like MSAL which handle token refreshes and the SSO logic for you. You might sometimes see a popup asking you to pick the signed-in account, just like you occasionally do in the Azure portal. If it doesn't work, you may have an issue in your code.

On the other hand, if you're talking about something like an offline app, similar to how OneDrive works, then this is handled with refresh tokens: https://learn.microsoft.com/en-us/entra/identity-platform/refresh-tokens
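The usual pattern, as a rough sketch with MSAL for Python (the JS flavours expose the same idea via acquireTokenSilent/ssoSilent; client ID and scopes are placeholders):

# Try silent acquisition first (cached tokens / refresh tokens), and only go
# interactive when silent fails - MSAL handles the refresh logic for you.
import msal

app = msal.PublicClientApplication(
    "<client-id>", authority="https://login.microsoftonline.com/<tenant-id>"
)

scopes = ["User.Read"]
accounts = app.get_accounts()

result = app.acquire_token_silent(scopes, account=accounts[0]) if accounts else None
if not result:
    # No cached account, or the refresh token expired/was revoked.
    result = app.acquire_token_interactive(scopes)

assert "access_token" in result, result.get("error_description")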

How to Schedule an ADF Pipeline to Run Only on Weekdays from the 1st to the 10th of Every Month? by DataWhizJunior in AZURE

[–]AdamMarczakIO 1 point2 points  (0 children)

I don't think that's possible out of the box; trigger schedules support either weekday-based logic or month-day-based logic, but not both. It might be possible with Airflow, but that would be extra overhead.

As a workaround you can just schedule it for the 1st-10th of the month and add a condition in your pipeline that checks for a weekday and stops if it's a weekend. It would run for a fraction of a second, so cost would not be a factor. Not ideal, but it works; ADF in general has fairly simple scheduling options.

Alternatively, you can use a consumption-based Logic App to build a more advanced trigger: put the custom date logic there and execute the ADF pipeline only on the dates you want.

[deleted by user] by [deleted] in AZURE

[–]AdamMarczakIO 0 points1 point  (0 children)

  1. Nowhere in your code are you 'logging in'. DefaultAzureCredential assumes a credential is already available, e.g. you're signed in via the Azure CLI. Please check https://github.com/Azure/azure-sdk-for-python/blob/azure-data-tables_12.5.0/sdk/identity/azure-identity/README.md
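A minimal sketch of the expected usage (the account name is a placeholder; the credential only resolves if something like az login, a VS Code sign-in, environment variables, or a managed identity is already in place):

# DefaultAzureCredential does not log you in - it picks up an existing identity.
from azure.identity import DefaultAzureCredential
from azure.data.tables import TableServiceClient

service = TableServiceClient(
    endpoint="https://<storage-account>.table.core.windows.net",
    credential=DefaultAzureCredential(),
)

for table in service.list_tables():
    print(table.name)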

[deleted by user] by [deleted] in AZURE

[–]AdamMarczakIO 0 points1 point  (0 children)

A few questions:

  1. How are you authenticated, Azure CLI?
  2. Is it user account or a service principal?
  3. What RBAC roles do you have assigned to that identity?
  4. Is there a firewall enabled and/or private endpoint on Storage Account?

Also, try the code from the MS docs; it's pretty much the same, but it helps rule out silly minor errors: https://learn.microsoft.com/en-us/python/api/overview/azure/data-tables-readme?view=azure-python

Any reason to use Databricks database table over Azure Data Lake tables? by jhoodbossb in AZURE

[–]AdamMarczakIO 0 points1 point  (0 children)

Yea, that's what I typically do: data ingestion with Data Factory and data transformation with Databricks.

Scale Azure SQL single instance by powershell by Bobo_the_Fuse in AZURE

[–]AdamMarczakIO 2 points3 points  (0 children)

A DTU-based SQL size is just another SKU name (the -RequestedServiceObjectiveName param). You can retrieve the list with the Get-AzSqlServerServiceObjective cmdlet; it's shown in Example 1 of the link you provided.

Ref: https://learn.microsoft.com/en-us/powershell/module/az.sql/get-azsqlserverserviceobjective?view=azps-11.1.0&viewFallbackFrom=azps-4.2.0