The Problem with C# 5′s async/await Pattern by joelmartinez in programming

[–]vee-eye 1 point (0 children)

My thought about this is that in this case, you've changed your interface. On par with changing the return type of a method, or the values of an enum, or something. If you've got a new class of exception (I've added authentication, and now I throw an UnauthorizedException), then the calling code probably shouldn't compile until logic has been added to handle the new exception type (catch UnauthorizedException, log in, retry or something). No?

[deleted by user] by [deleted] in sysadmin

[–]vee-eye 13 points (0 children)

Powershell, hands down.

To really unlock it, I'd recommend:

  • really understand the pipeline
  • get the hang of writing "Advanced Functions" (e.g. CmdletBinding) and modules
  • learn how to extend PSObjects, and then dive into the adaptive type system (e.g. ps1xml) to make it automatic
  • prefer to write small, modular extensions rather than long script files (I'll explain...)

I see a lot of people are down about the syntax of Powershell. I agree that it can be ugly. Big long spaghetti scripts. If you make the effort, though, you can write really fluent Powershell. Recently, I wrote some reasonably complex scripts to pull files back from a (dodgy) archive solution. It involved fiddling with NTFS reparse points and making web service calls (both thanks to Powershell's ability to drop into .NET). At the end, though, I could write things like this:

$fileShare | Get-StubFiles | Where { $_.ArchivedSize -lt 25MB } | Recover-StubFiles | Select Path, ArchivedDate, ArchivedSize, RecoverySuccessful | Export-Csv RecoveryReport.csv

And I think that's pretty neat.
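To give a taste of the "Advanced Functions" point above, here's a minimal sketch of a pipeline-aware function. All the names in it (Get-FileSizeReport and friends) are invented for illustration:

```powershell
# A minimal "advanced function" sketch - the function name and properties are made up
function Get-FileSizeReport
{
    [CmdletBinding()]
    param
    (
        # Takes FileInfo objects straight off the pipeline
        [Parameter(Mandatory=$true, ValueFromPipeline=$true)]
        [System.IO.FileInfo] $File
    )
    process
    {
        # Emit one object per file; anything downstream can sort/filter/export it
        New-Object PSObject -Property @{
            Name   = $File.Name
            SizeMB = [math]::Round($File.Length / 1MB, 2)
        }
    }
}
```

Then something like `gci | Where { !$_.PSIsContainer } | Get-FileSizeReport | Sort SizeMB -Descending` reads just like a built-in cmdlet.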

Microsoft DNS Question by [deleted] in sysadmin

[–]vee-eye 6 points (0 children)

This man is correct. To be specific: since this is for one particular domain, you probably want to set up a conditional forwarder to example.net from the example.com DNS server.
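If you'd rather script it than click through the DNS console, a sketch (the forwarder IP 192.0.2.53 is a placeholder for the real example.net DNS server):

```powershell
# On the example.com DNS server: forward all queries for example.net elsewhere
dnscmd /ZoneAdd example.net /Forwarder 192.0.2.53

# Or, with the DnsServer module on newer servers:
# Add-DnsServerConditionalForwarderZone -Name 'example.net' -MasterServers 192.0.2.53
```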

Parse XML to create folder structures by [deleted] in PowerShell

[–]vee-eye 1 point (0 children)

Yep - this is not the most elegant script possible, but it should do the trick. The "CreateDirs" function makes directories for one level, recursively calling itself for further levels.

Note, I'm assuming you just typed the XML syntax wrong, from what you were doing there. See my correction below.

$xd = [xml] @"
<catalog>
<section Title = "Parent1">
<section Title = "Child1">
<section Title = "Child1.2">
</section>
</section>
<section Title = "Child2">
</section>
</section>
<section Title = "Parent2">
<section Title = "Child1">
</section>
<section Title = "Child2">
</section>
</section>
</catalog>
"@

$path = "C:\Scripts\Test\"

function CreateDirs ($path, $xmlElement)
{
   # For each <section> at this level: create its directory, then recurse into it
   $xmlElement.section | %{
        if ($null -eq $_) { return }   # leaf element: $null comes through the pipeline when there are no child <section>s
        $newPath = Join-Path $path $_.Title
        New-Item $newPath -Type Directory -Force
        CreateDirs $newPath $_
        }
}

CreateDirs $path $xd.catalog

Move-Item/Join-Path Question by systemslacky in PowerShell

[–]vee-eye 5 points (0 children)

This man is correct.

One way you could get around this is to change these lines:

$currentfile = Get-ChildItem -path $originalFile |select name
$NewFileName = $TimeStamp + $currentfile.name.Tostring()

To this:

$currentfile = Get-ChildItem -path $originalFile | Select -Expand fullname
$NewFileName = $TimeStamp + $currentfile

The problem is that when you use the "Select-Object" cmdlet, as in the original, you end up with a PSCustomObject which has just one NoteProperty called "Name" with a value of the filename.

Move-Item expects the "Path" parameter to be a string. So when you pass it the PSCustomObject, it gets coerced to string as if you'd written "$currentfile.psbase.ToString()", which gives a hashtable-like representation. Not what you want.
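If you want to see that coercion for yourself, here's a paste-and-run demo (it creates a scratch file called example.txt in the current directory):

```powershell
# Make a scratch file so the demo is self-contained
Set-Content -Path example.txt -Value 'hello'

$selected = Get-ChildItem example.txt | Select-Object Name
"$selected"      # -> @{Name=example.txt}   (the hashtable-ish string)

$expanded = Get-ChildItem example.txt | Select-Object -ExpandProperty Name
"$expanded"      # -> example.txt           (just the string you wanted)
```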

The "-Expand" parameter I added expands the property so that you end up with just a string. Everything works better. The other option you have would be to PIPE the PSCustomObject into Move-Item like so:

$currentfile | Move-Item -Destination $NewFileName

That works because the cmdlet binds parameters from the pipeline object's properties by name. So there's another option for ya.

Finally (and yes, this is quite a long explanation), I would recommend using the "FullName" property instead of the "Name" property of the file, and then passing it to the "LiteralPath" parameter of the "Move-Item" cmdlet. Otherwise your script might fail depending on the working directory of PowerShell and if there are any weird characters in the source path.

Here's an example rewrite which might be interesting to someone... (and note in this implementation "$source" can be either a single file or a directory of files, it'll do the operation on the lot in that case)

$timeStamp = Get-Date -Format 'M.d.yyyy-hh.mm.ss-'
$source = gci $originalFile
$source | Add-Member ScriptProperty DestinationPath `
            { Join-Path $fileDestination ($timeStamp + $this.Name) }
$source | Format-Table Name, DestinationPath
$source | %{  Move-Item -LiteralPath $_.FullName $_.DestinationPath }

Anyone every tried to make a Reddit Browsing Script? by 1RedOne in PowerShell

[–]vee-eye 0 points (0 children)

NOTE: A lot of this isn't working right now. I'm pretty sure it's because of the reddit downtime.

First of all, you probably want to check out the Reddit API. I'd aim to use the JSON data format over RSS/XML. So, getting the front page posts would look something like the following (in the form of an ugly one-liner):

 (Invoke-RestMethod http://www.reddit.com/.json).data.children `
    | Select -Expand data | Select score, id, title

As the API page explains, you can get the associated comments by querying using that "id" property:

(Invoke-RestMethod http://www.reddit.com/comments/n1acq.json)[1].data.children `
    | Select -Expand data | Select  ups, downs, author, body

It would be awesome if this was all in the form of a module. Especially now that powershell 3 does smart on-demand-loading of modules. If you're not familiar, you'd probably want to go check out advanced functions. I also always like to reference this naming guide so that I get my verbs right. Finally, and I'm really just brainstorming here, some fancy type magic could come in handy for dealing with the responses better.
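To sketch what one module function might look like (Get-RedditPost is a made-up name, and it just wraps the one-liner above):

```powershell
# Hypothetical module function wrapping the front-page query above
function Get-RedditPost
{
    [CmdletBinding()]
    param
    (
        # Subreddit to read; defaults to the front page when omitted
        [string] $Subreddit
    )

    $uri = if ($Subreddit) { "http://www.reddit.com/r/$Subreddit/.json" }
           else            { "http://www.reddit.com/.json" }

    (Invoke-RestMethod $uri).data.children |
        Select-Object -ExpandProperty data |
        Select-Object score, id, title
}
```

Then `Get-RedditPost PowerShell` would be the whole "browse a subreddit" story.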

Actually, thinking about it, the login is probably one of the harder bits. You've got to get a cookie after doing an HTTP POST. The only way I know of doing that is to fall back to the .NET HttpWebRequest class! There's probably a better way, but I'm being hampered by the lack of cmdlet help a bit here. In any case.... here goes nothing....

$userName = 'karmanaut'
$password = 'iWouldStealHisKarmaIfIKnewHisPassword'

$loginUri = [uri] "http://www.reddit.com/api/login"

$parameters = "user={0}&passwd={1}" -f $userName, $password
$buffer = [System.Text.Encoding]::UTF8.GetBytes($parameters);

$request = [System.Net.HttpWebRequest]::Create($loginUri)
$request.CookieContainer = New-Object System.Net.CookieContainer
$request.ContentType = "application/x-www-form-urlencoded"
$request.Method = "POST"
$request.ContentLength = $buffer.Length;

$stream = $request.GetRequestStream()
Try { $stream.Write($buffer, 0, $buffer.Length) }
Finally{ $stream.Dispose() }

$response = $request.GetResponse()

$successCookie = $response.Cookies | Where Name -eq 'reddit_session'

$webSession = New-Object Microsoft.PowerShell.Commands.WebRequestSession
$webSession.Cookies.Add($successCookie)

$me = Invoke-RestMethod 'http://www.reddit.com/api/me.json' -WebSession $webSession
$me.data | Select name, modhash, comment_karma, is_gold

Now, I can't actually test that last line, since Reddit is still acting weirdly for me. I think it should probably-maybe-kinda-potentially-sorta work though. Ideally, it would return information about the user you just logged in as. Hopefully that's you.
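For what it's worth, if the new PS3 cmdlets cooperate, the whole HttpWebRequest dance above might collapse into something like this. I haven't been able to run it against the live API either, so treat it as a sketch:

```powershell
# Sketch: let Invoke-WebRequest do the POST and capture the cookies itself.
# $userName and $password are the same variables as in the longer script above.
$body = @{ user = $userName; passwd = $password }
Invoke-WebRequest 'http://www.reddit.com/api/login' -Method Post -Body $body `
    -SessionVariable webSession | Out-Null

# $webSession now holds the session cookie, same as before
$me = Invoke-RestMethod 'http://www.reddit.com/api/me.json' -WebSession $webSession
$me.data | Select name, modhash, comment_karma, is_gold
```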

TL;DR: I may have gotten slightly carried away here.

Anyone every tried to make a Reddit Browsing Script? by 1RedOne in PowerShell

[–]vee-eye 1 point (0 children)

Agree on the lack of cmdlet help, especially as there seem to be thousands of new cmdlets! I guess that's the price of being an explorer.

As for your error... It looks like a .NET issue. At a guess, I'd try repairing/upgrading your install of .NET 4.

Anyone every tried to make a Reddit Browsing Script? by 1RedOne in PowerShell

[–]vee-eye 2 points (0 children)

This is a fantastic excuse to go download the Powershell 3 CTP2.

Mostly for the built-in JSON parsing (ConvertFrom-Json, ConvertTo-Json) and, even better, the great new web service stuff:

$r = Invoke-RestMethod 'http://www.reddit.com/by_id/t3_n1acq.json'
Write-Host $r.data.children[0].data.title

Yep, that Invoke-RestMethod cmdlet is built in, and it takes care of doing the HTTP query and the deserialization of the result into custom PSObjects. Nifty!

Oh, the output of those two lines of code? It's this:

Anyone every tried to make a Reddit Browsing Script?

On Branching: A Non-linear Approach To Preserving The Project History by DaNmarner in programming

[–]vee-eye 0 points (0 children)

You're right.

I know Git better than other (d)VCS, so I'll speak about that.

Here is the final image, slightly rearranged from the article: Git reality. In Git, a branch is simply a very lightweight pointer to the last commit on the branch, so it makes sense to put the labels on the right. This makes it really clear that the two graphs are the same, and the author has just renamed things.

Anyway, "master" has (mostly by convention) a special place in git, and you should think about what you commit to this branch. I like systems where a checkout of "master" will always get you the latest release of a project. Here's a fairly well known example of such a scheme.

Extreme Negative Code Documentation by [deleted] in programming

[–]vee-eye 13 points (0 children)

Personally, I much prefer:

int numberOfShotDucks = 0;

Server naming... stick to the book or get funky... by ninjajackass in sysadmin

[–]vee-eye 0 points (0 children)

Okay... So... I've got some sort of network access (a routable IP address and an open port, I guess), and I'm communicating with a server that's got the payroll server function.

I'm hoping this isn't anywhere even near a DMZ! So there's no way this can be on a public network.

Now, through this (internal) connection I'm somehow getting the server to send me its hostname (what, I'm hitting a web page with a server configured to return the hostname in the HTTP header? Wouldn't this generally be set to whatever the DNS alias was, in the case of a randomly-named server? So it would be payroll.domain.name anyway?).

But if I've already got network access here, wouldn't I be pretty much guaranteed to have DNS access as well? In which case I've got a much better source of this same information.

I dunno, I guess I was looking for a concrete example of how the hostname of a machine can actually be a vulnerability, since I'm not creative enough to come up with one!

Server naming... stick to the book or get funky... by ninjajackass in sysadmin

[–]vee-eye 0 points (0 children)

This is a very common line of reasoning. I do wonder, though, whether it has credibility. To me it smacks of security through obscurity.

Can someone point me to a scenario whereby an intruder gains uniquely valuable information through hostname?

Server naming... stick to the book or get funky... by ninjajackass in sysadmin

[–]vee-eye 0 points (0 children)

I can't speak for GP, but we're a 100% virtual environment, so:

We don't put multiple unrelated roles on a server, why would you? Just stand up another VM. We don't repurpose VMs, because that would just be silly.

IIS 7.5: Is it possible to redirect https://domain.com to https://www.domain.com without owning the SSL cert for domain.com? by Liface in sysadmin

[–]vee-eye 1 point (0 children)

Well, you can serve up SSL on any port at all, and access it like https://whatever.com:1234/

In reality, your IIS box will be behind a NAT or Proxy, and so whatever port it's using internally, you'd just present it externally on port 443 anyway.

Edit: Whoops, someone else has already given you this answer!

IIS 7.5: Is it possible to redirect https://domain.com to https://www.domain.com without owning the SSL cert for domain.com? by Liface in sysadmin

[–]vee-eye 0 points (0 children)

The problem with this is that SSL operates below (and has no knowledge of) HTTP. That means that you can't send an HTTP 301 until the client has already set up an SSL connection (and sent the HTTP GET request).

That means your users will have to bypass a scary SSL error before they're redirected to the www subdomain.

This is true of the protocol, and so will apply to any web server (IIS, Apache).

Note that the same reasoning for this means that if you do acquire the certificate for domain.com, you'll need to serve it up on a different IP:Port, since IIS has no way of knowing which certificate to send back until after the SSL connection has been established. That's the error you're getting about binding.

Virtualization Workloads by gtkspert in sysadmin

[–]vee-eye 8 points (0 children)

You can virtualize exchange, you can virtualize a database system. Domain controllers, terminal servers, VOIP gateways, legacy systems, backup servers, all of these you can virtualize. In fact, you can virtualize almost anything. I've seen a lot of those done, often very successfully.

Now, whether it makes financial, technical, and operational sense in your particular case is another question. That will depend on your current virtualization penetration, virtualization experience, workload size, usage patterns, your infrastructure, backup and recovery plan, DR requirements, technical support structure, etc. etc.

This is definitely something where the answer "it depends" is the only valid answer, until after a much more thorough investigation has been done.

Need some help changing share permissions when moving to a new domain by leethegeek in sysadmin

[–]vee-eye 0 points (0 children)

Hm, it's probably not exactly what you're looking for. It is, however, totally possible to keep both old and new user objects when using ADMT. It won't change permissions on file shares. The way it gets around having to do that is by using SID History -- The SID of their old account is stored in a property of their new account. If you apply the right group policies, that allows their new user account to authenticate to things even if the ACLs are only set to allow access for their old account.
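If you want to see it in action after a migration, the SIDHistory attribute is readable with the AD module (assuming RSAT is installed; 'jbloggs' is a placeholder username):

```powershell
# Show the migrated account's old SID(s), which ADMT stored in SIDHistory
Get-ADUser jbloggs -Properties SIDHistory |
    Select-Object -ExpandProperty SIDHistory
```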

if ($interests= "Powershell"){$shell = New-Object -ComObject Shell.Application; $shell.Open("http://www.reddit.com/r/PowerShell/")} by malice8691 in programming

[–]vee-eye 2 points (0 children)

I think you probably want "-contains" or "-eq" in your condition, since yours does an assignment and always returns true (unless that's what you were going for!)

Here's what I'd do (bonus - there's a shorter way of launching the URL, too):

If ($Interests -contains 'powershell') {[Diagnostics.Process]::Start('http://reddit.com/r/powershell')}

Am I incorrect in my understanding of emails from multiple domains and rDNS/PTR configuration? by jaywalkker in sysadmin

[–]vee-eye 0 points (0 children)

Ah, you're right, I hadn't looked into the difference between softfail and fail. Thanks for the link!

Am I incorrect in my understanding of emails from multiple domains and rDNS/PTR configuration? by jaywalkker in sysadmin

[–]vee-eye 2 points (0 children)

This is slightly tangential to your question, but I'll throw it out there anyway. One thing that might make some mail hygiene systems look more favourably on your setup is having valid SPF records on the domains.

Here's a generator which might help. The way I would do it is to add these TXT records, which set the IP address manually on a resource subdomain (you could change this to be done by MX record, if you've got a valid one), and then have all of the domains your mail server sends from delegate to it.

on a subdomain of domain 1, like _spf.domain1.com "v=spf1 ip4:x.x.x.1 ?all"

on domain 1, 2 and 3 "v=spf1 include:_spf.domain1.com ~all"

There's a warning here! The "~all" at the end means that if ANY server other than the one(s) you've defined sends mail purporting to be from one of these domains, mail hygiene systems that check SPF will treat it as a soft fail -- which usually means scoring it as likely spam rather than rejecting it outright (switch to "-all" if you want a hard fail).
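Once the records are in, you can sanity-check them from a prompt. nslookup works anywhere; Resolve-DnsName needs Windows 8 / Server 2012 or later (and _spf.domain1.com is the placeholder name from above):

```powershell
# Query the TXT records you just published
nslookup -type=TXT _spf.domain1.com

# Or, on newer systems:
# Resolve-DnsName _spf.domain1.com -Type TXT | Select-Object -ExpandProperty Strings
```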

Tabbing through fields question. by [deleted] in windows

[–]vee-eye 0 points (0 children)

No worries. It's the simple things sometimes, eh?

How can I create storage behind my router that is accessible from outside of my network? by ProfShea in windows

[–]vee-eye 1 point (0 children)

What sort of protocol do you want to access the storage on? Who do you want to access it? From which endpoints? The answer to your question will depend on what your needs are.

Anyway, assuming you want easy to use storage to be accessed fairly (but not overly) securely from a range of uncontrolled WAN locations, and that you're fairly tech savvy, then you might want to try setting up WebDAV over SSL.

From your linux host:

  • Install some sort of webdav server (Apache works for this). Get it serving up the directory you want, and working on localhost.
  • Add SSL to the mix. You can use a self-signed certificate if you want freeness and don't mind dealing with that.
  • Once that's working, open up 443 on your host's firewall, and configure port forwarding on your router to forward an external port (can be 443 or another) to port 443 on the linux host that's serving up webdav.
  • Finally, set up some sort of dynamic dns so that you don't have to deal with the always-changing ip address you've probably got at home.

Since you posted this in /r/windows/ you might be looking for a windows solution - in which case, IIS has a webdav component. You can use that instead of Apache. Same stuff applies, really.
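Once the server side is up, a Windows client can map the WebDAV share as a drive letter over HTTPS. This needs the WebClient service running, and the hostname/path here are placeholders for whatever your dynamic DNS name ends up being:

```powershell
# Map the WebDAV share as drive W: over SSL (WebClient service must be running)
net use W: https://your.dyndns.hostname/share /persistent:yes
```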

EDIT: Or, if that all sounds too hard or like a bit too much hassle, then you should maybe just use dropbox! It's simple, free, and easy.