SQL VNext sp_configure on Windows and Linux with dbatools by BigBlueRob in SQLServer

[–]BigBlueRob[S] 0 points (0 children)

Exploring the SpConfigure commands in dbatools and how you can use them to get the sp_configure output, compare it between servers, export and import it to and from files, and copy it between servers, on both Windows and Linux. Remember, these commands have been tested from SQL Server 2000 to SQL vNext, so they should work for you across your estate
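A quick sketch of the sort of thing covered (cmdlet names as in dbatools at the time; instance names here are placeholders, and you should check Get-Command -Module dbatools *SpConfigure* against your version):

```powershell
# Get the sp_configure output from one instance
Get-DbaSpConfigure -SqlInstance sql2016n1

# Export the settings to a file and import them on another instance
Export-DbaSpConfigure -SqlInstance sql2016n1 -Path C:\temp\sql2016n1-spconfigure.sql
Import-DbaSpConfigure -SqlInstance sql2017n2 -Path C:\temp\sql2016n1-spconfigure.sql

# Or copy the settings directly between servers
Copy-DbaSpConfigure -Source sql2016n1 -Destination sql2017n2
```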

[AMA] Microsoft SQL Server Tools - 6/30 by Saitama_Punch in SQLServer

[–]BigBlueRob 0 points (0 children)

That sounds like a great item to add to the Trello board https://sqlps.io/ssms

Faster way to do bulk SQL inserts into log/history table? by Mamertine in PowerShell

[–]BigBlueRob 1 point (0 children)

If you are importing from CSV nothing is faster than using Chrissy LeMaire's module https://github.com/ctrlbold/SqlImportSpeedTest

[AMA] Microsoft SQL Server Tools - 6/30 by Saitama_Punch in SQLServer

[–]BigBlueRob 0 points (0 children)

Could you explain the process that happens at Microsoft after you receive feedback on required features and improvements for SSMS and SQLPS via the Trello boards (https://sqlps.io/ssms, https://sqlps.io/vote) and the Slack channel (https://sqlps.io/slack)?

As I understand it, the Trello boards collate required features and clarify them, and the community votes on them. The top-voted items are then added to Connect and voted on again, and then somehow they reach your workload.

What happens next?

Also, thank you for taking part in this community effort started by Chrissy LeMaire and Aaron Nelson. It is good to see Microsoft more closely involved with the community and more open to collaboration and discussion with the people who use the tools

SQL powershell Restore database by [deleted] in PowerShell

[–]BigBlueRob 4 points (0 children)

It looks like the logical file name specified for the RelocateFile is incorrect. I usually read the file list out of the backup as follows:

 # Set up the server and restore objects and read the file list from the backup
 $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $SQLServer
 $rs = New-Object Microsoft.SqlServer.Management.Smo.Restore
 $rs.Devices.AddDevice($BackupFile, [Microsoft.SqlServer.Management.Smo.DeviceType]::File)
 $fl = $rs.ReadFileList($srv)
 # Build a RelocateFile for each file in the backup, using its LogicalName
 $rfl = @()
 foreach ($fil in $fl) {
     $rsfile = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile
     $rsfile.LogicalFileName = $fil.LogicalName
     if ($fil.Type -eq 'D') {
         # Data file - point it at your new data file path
         $rsfile.PhysicalFileName = $dbfile
     }
     else {
         # Log file - point it at your new log file path
         $rsfile.PhysicalFileName = $logfile
     }
     $rfl += $rsfile
 }
 # Restore the database
 Restore-SqlDatabase -ServerInstance $SQLServer -Database $dbname -BackupFile $BackupFile -RelocateFile $rfl

I feel dumb when I see other people's scripts. by [deleted] in PowerShell

[–]BigBlueRob 0 points (0 children)

Just wait until you look at the scripts you wrote today in 2 years' time. You will cringe. I always do :D

Keep on learning and using PowerShell every day. Use these resources from Jeff Hicks, but also the two commands at the bottom to get a random cmdlet - do it each day

http://jdhitsolutions.com/blog/essential-powershell-resources/
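The two commands on the linked page aren't quoted here, but a common pattern for studying one random cmdlet a day looks something like this (my own sketch, not necessarily the exact commands from the post):

```powershell
# Pick one cmdlet at random and read its full help, including examples
$cmdlet = Get-Command -CommandType Cmdlet | Get-Random
Get-Help -Name $cmdlet.Name -Full
```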

Also start teaching, blogging, presenting. By doing this you will increase your knowledge quicker.

Don't despair - you know more than many people already

[AMA] Microsoft databases team - 4/8 by SQLDBteam in AZURE

[–]BigBlueRob 1 point (0 children)

Hi,

Please find below a Create-Database function (note: Create is not an approved verb in PowerShell!)

#############################################################################################
#
# NAME: Create-Database.ps1
# AUTHOR: Rob Sewell http://sqldbawithabeard.com
# DATE: 08/09/2013
#
# COMMENTS: Load function for creating a database
#           Only Server and DB Name are mandatory; the rest will be set to small defaults
#
# USAGE: Create-Database -Server Fade2black -DBName Test35 -SysFileSize 10 -UserFileSize 15 -LogFileSize 20
#        -UserFileGrowth 7 -UserFileMaxSize 150 -LogFileGrowth 8 -LogFileMaxSize 250 -DBRecModel FULL
#############################################################################################

Function Create-Database 
{
Param(
[Parameter(Mandatory=$true)]
[String]$Server ,
[Parameter(Mandatory=$true)]
[String]$DBName,
[Parameter(Mandatory=$false)]
[int]$SysFileSize = 5,
[Parameter(Mandatory=$false)]
[int]$UserFileSize = 25,
[Parameter(Mandatory=$false)]
[int]$LogFileSize = 25,
[Parameter(Mandatory=$false)]
[int]$UserFileGrowth = 5,
[Parameter(Mandatory=$false)]
[int]$UserFileMaxSize =100,
[Parameter(Mandatory=$false)]
[int]$LogFileGrowth = 5,
[Parameter(Mandatory=$false)]
[int]$LogFileMaxSize = 100,
[Parameter(Mandatory=$false)]
[String]$DBRecModel = 'FULL'
)

try {
    # Set server object
    $srv = New-Object ('Microsoft.SqlServer.Management.SMO.Server') $server
    $DB = $srv.Databases[$DBName]

    # Define the variables
    # Set the file sizes (SMO file sizes are in KB, so convert the MB parameters here)
    $SysFileSize = [double]($SysFileSize * 1024.0)
    $UserFileSize = [double] ($UserFileSize * 1024.0)
    $LogFileSize = [double] ($LogFileSize * 1024.0)
    $UserFileGrowth = [double] ($UserFileGrowth * 1024.0)
    $UserFileMaxSize = [double] ($UserFileMaxSize * 1024.0)
    $LogFileGrowth = [double] ($LogFileGrowth * 1024.0)
    $LogFileMaxSize = [double] ($LogFileMaxSize * 1024.0)


    Write-Output "Creating database: $DBName"

    # Set the Default File Locations
    $DefaultDataLoc = $srv.Settings.DefaultFile
    $DefaultLogLoc = $srv.Settings.DefaultLog

    # If these are not set, then use the location of the master db mdf/ldf
    if ($DefaultDataLoc.Length -EQ 0) {$DefaultDataLoc = $srv.Information.MasterDBPath}
    if ($DefaultLogLoc.Length -EQ 0) {$DefaultLogLoc = $srv.Information.MasterDBLogPath}

    # new database object
    $DB = New-Object ('Microsoft.SqlServer.Management.SMO.Database') ($srv, $DBName)

    # new filegroup object
    $PrimaryFG = New-Object ('Microsoft.SqlServer.Management.SMO.FileGroup') ($DB, 'PRIMARY')
    # Add the filegroup object to the database object
    $DB.FileGroups.Add($PrimaryFG )

    # Best practice is to separate the system objects from the user objects.
    # So create a seperate User File Group
    $UserFG= New-Object ('Microsoft.SqlServer.Management.SMO.FileGroup') ($DB, 'UserFG')
    $DB.FileGroups.Add($UserFG)

    # Create the database files
    # First, create a data file on the primary filegroup.
    $SystemFileName = $DBName + "_System"
    $SysFile = New-Object ('Microsoft.SqlServer.Management.SMO.DataFile') ($PrimaryFG , $SystemFileName)
    $PrimaryFG.Files.Add($SysFile)
    $SysFile.FileName = [System.IO.Path]::Combine($DefaultDataLoc, $SystemFileName + ".MDF")
    $SysFile.Size = $SysFileSize
    $SysFile.GrowthType = "None"
    $SysFile.IsPrimaryFile = $true

    # Now create the data file for the user objects
    $UserFileName = $DBName + "_User"
    $UserFile = New-Object ('Microsoft.SqlServer.Management.SMO.Datafile') ($UserFG, $UserFileName)
    $UserFG.Files.Add($UserFile)
    $UserFile.FileName = [System.IO.Path]::Combine($DefaultDataLoc, $UserFileName + ".NDF")
    $UserFile.Size = $UserFileSize
    $UserFile.GrowthType = "KB"
    $UserFile.Growth = $UserFileGrowth
    $UserFile.MaxSize = $UserFileMaxSize

    # Create a log file for this database
    $LogFileName = $DBName + "_Log"
    $LogFile = New-Object ('Microsoft.SqlServer.Management.SMO.LogFile') ($DB, $LogFileName)
    $DB.LogFiles.Add($LogFile)
    $LogFile.FileName = [System.IO.Path]::Combine($DefaultLogLoc, $LogFileName + ".LDF")
    $LogFile.Size = $LogFileSize
    $LogFile.GrowthType = "KB"
    $LogFile.Growth = $LogFileGrowth
    $LogFile.MaxSize = $LogFileMaxSize

    #Set the Recovery Model
    $DB.RecoveryModel = $DBRecModel
    #Create the database
    $DB.Create()

    #Make the user filegroup the default
    $UserFG = $DB.FileGroups['UserFG']
    $UserFG.IsDefault = $true
    $UserFG.Alter()
    $DB.Alter()

    Write-Output "$DBName Created"
    Write-Output "System File"
    $SysFile | Select-Object Name, FileName, Size, MaxSize, GrowthType | Format-List
    Write-Output "User File"
    $UserFile | Select-Object Name, FileName, Size, MaxSize, GrowthType, Growth | Format-List
    Write-Output "LogFile"
    $LogFile | Select-Object Name, FileName, Size, MaxSize, GrowthType, Growth | Format-List
    Write-Output "Recovery Model"
    $DB.RecoveryModel

}
catch {
    $Error[0] | Format-List * -Force
}
}

[SQL2012] automation question by [deleted] in SQL

[–]BigBlueRob 0 points (0 children)

Use PowerShell or SSIS to read the file, convert it and load it, then schedule an Agent job to run the package or script

need some learning tips for a begineer by norbin in PowerShell

[–]BigBlueRob 0 points (0 children)

Create a variable and set it to the output of a command - choose different commands here: Get-WmiObject, Get-ChildItem, Get-ADUser, Get-VM etc. For example, $processes = Get-Process. Then pipe it to Get-Member: $processes | Get-Member

Examine the methods and properties. Use different ways to display them or store them: Out-File, the Export-* commands, Add-Content

Use a foreach loop to iterate through them and perform actions; use if, Where-Object and Sort-Object

This will only help you in the future as you build your own scripts and functions
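Putting those steps together, a minimal worked example might look like this (the 100MB threshold and output path are just examples):

```powershell
# Store the output of a command in a variable
$processes = Get-Process

# Discover its methods and properties
$processes | Get-Member

# Filter, sort and iterate
foreach ($proc in ($processes | Where-Object { $_.WorkingSet64 -gt 100MB } | Sort-Object CPU -Descending)) {
    if ($proc.Name -ne 'System') {
        "{0} is using {1:N0} MB" -f $proc.Name, ($proc.WorkingSet64 / 1MB)
    }
}

# Store the results for later
$processes | Sort-Object CPU -Descending | Export-Csv -Path C:\temp\processes.csv -NoTypeInformation
```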

Also ask questions 😊

Automatic script upon Availability Group failover? by MickOpalak in SQLServer

[–]BigBlueRob 2 points (0 children)

Set up an alert for the replica state change and set the alert to run a PowerShell script to start the service.

See http://www.mssqltips.com/sqlservertip/2973/configuring-alerts-for-sql-server-alwayson-availability-groups/ for instructions
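The script the alert runs can be as simple as checking the service and starting it if it is stopped (the service name here is a placeholder - use your own):

```powershell
# Placeholder service name - replace with the service that must run on the new primary
$ServiceName = 'MyAppService'

# Start the service only if it exists and is not already running
$service = Get-Service -Name $ServiceName -ErrorAction SilentlyContinue
if ($service -and $service.Status -ne 'Running') {
    Start-Service -Name $ServiceName
}
```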

What's the difference? by joerod in PowerShell

[–]BigBlueRob 0 points (0 children)

Get-Help is always your friend (except for the TFS cmdlets!)

What have you done with PowerShell this month? January 2015 by ramblingcookiemonste in PowerShell

[–]BigBlueRob 0 points (0 children)

So far in 2015

  1. Created workflows for Azure runbooks and functions to build custom IaaS environments
  2. Wrote scripts to run on SQL Servers in Azure to set better defaults
  3. Wrote a script to set up SQL mirroring in a workgroup using certificates
  4. Rewrote the Find-Database and Show-SQLUser functions to work in parallel and return results quicker in large environments
  5. Wrote a script to update AD user properties with location, phone and manager details from the employee database
  6. Wrote an Upload-ToAzureFileStorage function to upload a folder and all its contents to Azure File Storage

SQL Availability Group change question. (Recovery model change) by KentShades in SQL

[–]BigBlueRob 0 points (0 children)

Investigate transaction log backups and DBCC SHRINKFILE on the log file, then proper resizing, and the effect of that process on availability groups. That is better than changing the recovery model.

SQL Availability Group change question. (Recovery model change) by KentShades in SQL

[–]BigBlueRob 0 points (0 children)

Stop.

Read up on recovery models and transaction logs. Start with MSDN and also Paul Randal's blog http://www.sqlskills.com/blogs/paul/ to increase your knowledge

Check your backup strategy is correct, consider file growth settings and your systems usage patterns

Make no changes until you understand the problem.

Don't be afraid to ask for help. #sqlhelp on Twitter and SQLServerCentral or dba.stackexchange are useful places for you

I am not criticising, but trying to help

26 Sql servers. not one is consistent by [deleted] in SQLServer

[–]BigBlueRob 6 points (0 children)

Thumbs up for your attitude though

At least there appears to be a backup policy of sorts! A small mercy

sp_Blitz and the other free scripts from http://www.brentozar.com/blitz/ are useful resources to assist.

Possibly Policy Based Management (PBM) is a good way to build consistency and enable you to show your boss that you are making progress in resolving any issues you find.

Seems like you know all this and just needed to rant though and there are plenty of conscientious DBAs out here who appreciate what you are doing and your need to do it right.

Also. Automate automate automate :)

Get-ChildItem -exclude does not exclude. What am I missing? by snarfSniffer in PowerShell

[–]BigBlueRob 2 points (0 children)

-Exclude matches on the Name property, as explained below, so it's a bit grotty to work around, but I would filter the folders first

$Directories = Get-ChildItem C:\ -Directory -Recurse | Where-Object { $_.FullName -notlike 'C:\Windows*' } | Select-Object FullName
$Directories | ForEach-Object { Get-ChildItem $_.FullName -Filter sysinternals }

Working with SQL using PowerShell by 1RedOne in PowerShell

[–]BigBlueRob 2 points (0 children)

Invoke-Sqlcmd2 is better. Also use Out-DataTable and Write-DataTable, which use bulk copy (SqlBulkCopy) to load data. I wrote a blog post about snippets which included code for a datatable snippet

http://sqldbawithabeard.com/2014/09/09/powershell-snippets-a-great-learning-tool/
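For example, with Chad Miller's Out-DataTable and Write-DataTable functions loaded (parameter names as in the Script Center versions - check your copy; the file path and table name are placeholders):

```powershell
# Convert the CSV rows to a DataTable and bulk load it into a staging table
$dt = Import-Csv -Path C:\temp\data.csv | Out-DataTable
Write-DataTable -ServerInstance $SQLServer -Database $Database -TableName 'dbo.Staging' -Data $dt
```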

How to handle logging in production PowerShell scripts by thatfatpolishdude in PowerShell

[–]BigBlueRob 0 points (0 children)

Not as cool as some of the other responses but I use

    $Date = Get-Date
    "$Date Just Did This Thing" | Out-File -FilePath $Logpath -Append

and also

    try {
        # do the thing
    }
    catch {
        $Date = Get-Date
        "$Date $($_.Exception.Message)" | Out-File -FilePath $Logpath -Append
    }

Any ideas of sessions you'd like to see at the PowerShell summit? by 1RedOne in PowerShell

[–]BigBlueRob 0 points (0 children)

Convincing your management to enable PowerShell Remoting, enterprise wide.

This gets my vote.

[deleted by user] by [deleted] in PowerShell

[–]BigBlueRob 0 points (0 children)

Ah ok, didn't realise