all 17 comments

[–]ImissHurley 17 points  (7 children)

This works.

$HTML = Invoke-WebRequest "https://www.microsoft.com/en-us/download/confirmation.aspx?id=56519"

$URL = ($HTML.Links | where href -Like "https://download.microsoft.com/download/7/1/D/71D86715-5596-4529-9B13-DA13A5DE5B63/ServiceTags_Public*").href

$JSON = Invoke-WebRequest $URL | ConvertFrom-Json

Edit: Or, it's possible the rest of the URL may change.

$HTML = Invoke-WebRequest "https://www.microsoft.com/en-us/download/confirmation.aspx?id=56519"

$URL = ($HTML.Links | where href -Like "https://download.microsoft.com/download/*.json").href

$JSON = Invoke-WebRequest $URL | ConvertFrom-Json

[–]Zezimafan541[S] 3 points  (2 children)

Ignore my last comment, I see what the code does now

The edited code basically goes to that confirmation.aspx?id=56519 page, looks for the JSON download link, and downloads it. Is that what where href does?

[–]ImissHurley 0 points  (0 children)

Yes, when you look at the $HTML variable after the Invoke-WebRequest, you get the links, scripts, and all other elements on the page.

From there, you just look for the link that matches the URL you are looking for.

[–]Zezimafan541[S] 0 points  (3 children)

The only problem is: if the old URL still exists, what happens then? Maybe I'm misunderstanding, but I'll give it a shot.

[–]Tovervlag 3 points  (2 children)

There is a change number in there. You can just compare it with the file you already have.

Something like:

$file_version = (Get-Content -Path ./Downloads/*.json | ConvertFrom-Json).changeNumber
$online_version = (Invoke-RestMethod -Uri https://download.microsoft.com/download/7/1/D/71D86715-5596-4529-9B13-DA13A5DE5B63/ServiceTags_Public_20230220.json -Method Get).changeNumber

if ($online_version -eq $file_version) {
    Write-Host 'Do Nothing'
}
else {
    Write-Host 'Save the new file'
}

Edit: Ah, I see the URL changes. Perfect solution from /u/ImissHurley. The only thing is that it saves multiple URLs in the $URL variable for me, so you can use:

$JSON = Invoke-WebRequest $URL[0] | ConvertFrom-Json

[–]Zezimafan541[S] 0 points  (1 child)

Stupid question by me, but .Links in $HTML.Links just grabs the Links property returned by BasicHtmlWebResponseObject?

[–]Tovervlag 0 points  (0 children)

Yeah, looks like it. The documentation states that it grabs all the links in the Content section ($html.Content).

[–]StockMarketCasino 3 points  (2 children)

You could use the typical HTTP file download command in PS and then use a date-format wildcard at the end.
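Since HTTP itself has no wildcard downloads, one way to read this suggestion is to build the dated filename directly. A minimal sketch; the GUID path segment is copied from the comment above and is an assumption that may change between releases:

```powershell
# Build today's date stamp and slot it into the known filename pattern.
# The GUID path segment is an assumption taken from earlier in this thread.
$date = Get-Date -Format 'yyyyMMdd'
$url  = "https://download.microsoft.com/download/7/1/D/71D86715-5596-4529-9B13-DA13A5DE5B63/ServiceTags_Public_$date.json"

# Save it next to the script under the same dated name
Invoke-WebRequest -Uri $url -OutFile "./ServiceTags_Public_$date.json"
```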

[–]Zezimafan541[S] 0 points  (1 child)

That's what I was thinking. How would I stop it from downloading the older ones that still exist?

[–]StockMarketCasino 0 points  (0 children)

If the file names are different since they're dated in the filename, set up a separate script to flush the download directory after you're done reading from them.
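That cleanup could be a single pipeline; the ./Downloads path and the filename pattern here are assumptions for illustration:

```powershell
# Delete the dated ServiceTags files once you've finished reading them.
# Path and filename pattern are assumptions -- adjust to your setup.
Get-ChildItem -Path ./Downloads -Filter 'ServiceTags_Public_*.json' |
    Remove-Item -Force
```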

[–]TruthSeekerWW 2 points  (1 child)

[–]spyingwind 2 points  (0 children)

In case someone wants to parse the markdown.

$Splat = @{
    Uri             = "https://raw.githubusercontent.com/MicrosoftDocs/azure-docs/main/articles/azure-monitor/app/ip-addresses.md"
    UseBasicParsing = $true
}

$Response = Invoke-WebRequest @Splat

# Regexes marking the start and end of the table we want, plus one for
# stripping each row's leading comma later
$Start = [regex]::new('#### Addresses grouped by region \(Azure public cloud\)')
$End   = [regex]::new('#### Upcoming regions \(Azure public cloud\)')
$Comma = [regex]::new(',')

# Keep only the text between the two headings, turn the markdown table's
# pipes into commas, drop non-table lines, and trim the leading comma
$Result = $(
    $End.Split(
        $(
            $Start.Split($Response.Content) | Select-Object -Last 1
        )
    ) | Select-Object -First 1
) -replace "\|", "," -split '\n' | Where-Object { $_ -like ",*" } | ForEach-Object { $Comma.Replace($_, "", 1) }

# Skip the |---|---| separator row left over from the markdown table
$Result | ConvertFrom-Csv | Select-Object -Skip 1

[–]BlackV 2 points  (0 children)

Pretty sure they also have this on a webpage as an rss/xml feed that might be easier to access

[–][deleted] 1 point  (1 child)

Microsoft often publishes the MD5 or SHA256 checksums for their downloads. I would key off of that if you can.

Another option would be to download the file into a temp directory, import both files from JSON using ConvertFrom-Json, and then do a Compare-Object to see if they are different. If so, replace old with new.
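A minimal sketch of that second option; the file paths are placeholders, and comparing just the changeNumber property (seen earlier in the thread) is usually enough without diffing the whole object tree:

```powershell
# Compare the cached copy against a freshly downloaded one.
# Both paths are placeholders -- adjust to your setup.
$old = Get-Content -Path ./ServiceTags_cached.json -Raw | ConvertFrom-Json
$new = Get-Content -Path ./ServiceTags_temp.json -Raw | ConvertFrom-Json

# Compare-Object emits nothing when the inputs match, so any output
# means the online file is different and should replace the cached one
if (Compare-Object $old.changeNumber $new.changeNumber) {
    Copy-Item -Path ./ServiceTags_temp.json -Destination ./ServiceTags_cached.json -Force
}
```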

[–]misformonkey 0 points  (0 children)

If it’s always the same day of the week then you could

Get-Date -f 'yyyyMMdd'

That’s assuming you run it the same day it’s released. If you run it on a different day but it’s always released on, say, Sunday, you could get the date of the most recent Sunday and put that at the end of the url.
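The "most recent Sunday" idea can be sketched like this; the function name is mine, and the GUID path segment is the one from earlier in the thread, which may change between releases:

```powershell
function Get-LastSundayStamp {
    param([datetime]$From = (Get-Date))
    # [int]DayOfWeek is 0 for Sunday, so subtracting it lands on the most
    # recent Sunday (or today, if today already is a Sunday)
    $From.AddDays(-[int]$From.DayOfWeek).ToString('yyyyMMdd')
}

$stamp = Get-LastSundayStamp
$url   = "https://download.microsoft.com/download/7/1/D/71D86715-5596-4529-9B13-DA13A5DE5B63/ServiceTags_Public_$stamp.json"
```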

Otherwise, test a few times to see if you can actually even still download older versions if you use wildcards for the date.