I have this PowerShell script that checks whether SQL Server is installed on a host and, if so, collects the SQL Server version and other details.
It takes the list of hosts from a txt file and saves the information to a DataTable.
$data = New-Object ('System.Data.DataTable')
$data.Columns.Add('Host name') | Out-Null
$data.Columns.Add('Ip Address') | Out-Null
$data.Columns.Add('SQL Server Product Name') | Out-Null
$data.Columns.Add('SQL Server Edition') | Out-Null
$data.Columns.Add('SQL Server Version') | Out-Null
$data.Columns.Add('SQL Server Type') | Out-Null
$data.Columns.Add('SQL Server Status') | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | Out-Null
Get-Content .\servers.txt | ForEach-Object {
    $row = $data.NewRow()
    $row['Host name'] = $_
    try {
        $row['Ip Address'] = ([Net.Dns]::GetHostEntry($_).AddressList | Select-Object -First 1).IPAddressToString
    }
    catch [System.Net.Sockets.SocketException] {
        $row['Ip Address'] = 'Offline'
    }
    if ($row['Ip Address'] -eq 'Offline') {
        $row['SQL Server Product Name'] = 'N/A'
        $row['SQL Server Edition'] = 'N/A'
        $row['SQL Server Version'] = 'N/A'
        $row['SQL Server Type'] = 'N/A'
        $row['SQL Server Status'] = 'N/A'
    }
    else {
        $smo = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $_
        $row['SQL Server Product Name'] = $smo.Product + ' ' + $smo.ProductLevel
        $row['SQL Server Edition'] = $smo.Edition
        $row['SQL Server Version'] = $smo.VersionString
        $row['SQL Server Type'] = $smo.ServerType
        $row['SQL Server Status'] = $smo.Status
        $smo.ConnectionContext.Disconnect()
    }
    $data.Rows.Add($row)
}
$data | Format-Table -AutoSize
The problem with this script is that it takes a long time to run (more than an hour with a list of 113 servers).
Is there some way to speed up the process?
You could run your script asynchronously using background jobs (you will have to use three cmdlets: Start-Job, Get-Job, and Receive-Job).
Quoted from about_Remote_Jobs:
START A REMOTE JOB THAT RETURNS THE RESULTS TO THE LOCAL COMPUTER (ASJOB)
To start a background job on a remote computer that returns the command
results to the local computer, use the AsJob parameter of a cmdlet such
as the Invoke-Command cmdlet.
When you use the AsJob parameter, the job object is actually created on
the local computer even though the job runs on the remote computer. When
the job is completed, the results are returned to the local computer.
You can use the cmdlets that contain the Job noun (the Job cmdlets) to
manage any job created by any cmdlet. Many of the cmdlets that have
AsJob parameters do not use Windows PowerShell remoting, so
you can use them even on computers that are not configured for
remoting and that do not meet the requirements for remoting.
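Applied to the script above, a minimal sketch (assuming the same servers.txt layout) looks like this; the SMO lookups from the original loop would slot into the script block the same way:
# One background job per host; results are collected once all jobs finish.
$jobs = Get-Content .\servers.txt | ForEach-Object {
    Start-Job -ArgumentList $_ -ScriptBlock {
        param($server)
        try {
            $ip = ([Net.Dns]::GetHostEntry($server).AddressList | Select-Object -First 1).IPAddressToString
        }
        catch [System.Net.Sockets.SocketException] {
            $ip = 'Offline'
        }
        # Add the SMO queries here and emit them as extra properties.
        [pscustomobject]@{ 'Host name' = $server; 'Ip Address' = $ip }
    }
}
# Wait-Job is used here for brevity; Get-Job lets you poll progress instead.
$results = $jobs | Wait-Job | Receive-Job
$jobs | Remove-Job
$results | Format-Table -AutoSize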
I have two SQL Server 2019 instances running on Linux. These two instances both contain a single database which is synchronized using AlwaysOn Availability Group. Data in the database is synchronized, but the problem is that the SQL Agent jobs are not part of the database itself.
Therefore, when I create a SQL Server Agent job on the primary replica, that configuration does not get copied to the secondary replica. So, after creating each job, I also have to go to the secondary and create the job there, and I have to keep track of all the changes I make.
Is there a built-in way to automate this cross-replica synchronization of SQL Server Agent jobs on Linux when using availability groups? Job synchronization across AG replicas seems like something that should be natively supported by SQL Server/SQL Server Agent tooling, but I found nothing from Microsoft, only a third-party PowerShell module called dbatools that I can use to write my own automation scripts.
After some trial and error, I ended up with this script that works on Ubuntu Linux 18.04. Big thanks to Derik Hammer and his blog for the base of the script and also to David Söderlund for his reply.
For the script to work, you will need to install PowerShell for Linux and both the dbatools and Invoke-SqlCmd2 PowerShell modules. You will also have to store SQL credentials in a file somewhere; I chose /var/opt/mssql/secrets/creds.xml for mine and changed its access rights to root only. The script can sync logins, Database Mail settings, SQL Agent categories, jobs, operators, and schedules from the primary replica to all secondaries (uncomment what you need, but be careful: order matters and some things cannot be synced in one connection, e.g. operators and jobs), skipping configuration replicas if you have any.
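To create that credential file, something along these lines should work (a sketch; note that Export-CliXml on Linux cannot protect the secret with DPAPI the way it does on Windows, which is why the root-only file permissions matter):
# Run once as root: prompt for the SQL login and save it for Import-CliXml.
$creds = Get-Credential
$creds | Export-CliXml -Path /var/opt/mssql/secrets/creds.xml
chmod 600 /var/opt/mssql/secrets/creds.xml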
You can set up scheduled execution as root, with output logged to a file, using cron. To set this up, run:
sudo crontab -e
and add this line to the file:
*/5 * * * * pwsh /<PATH>/sync-sql-objects.ps1 >> /<PATH>/sync-sql-objects.log
Script:
<#
.DESCRIPTION
This script will detect your Availability Group replicas and copy all of its instance level objects from primary replica to secondary replicas within the Availability Group. It will skip any configuration replicas.
.EXAMPLE
sudo pwsh sync-sql-objects.ps1
.NOTES
One limitation of this script is that it assumes you only have one availability group. This script should run on your configuration replica server.
.LINK
https://www.sqlhammer.com/synchronizing-server-objects-for-availability-groups/
DEBUG
To see logs on Ubuntu Linux, install the Postfix mail transfer agent and then read the mail in /var/mail/<username>
#>
Write-Output ("Sync started: " + (Get-Date -Format G))
#Error handling
$ErrorActionPreference = "stop";
Trap
{
$err = $_.Exception
while ( $err.InnerException )
{
$err = $err.InnerException
Write-Output $err.Message
};
}
# Prerequisites
try
{
Write-Output "Valiating prerequisites."
# You need to have these modules installed in advance, otherwise the import will fail
if ((Get-Module -Name dbatools) -eq $null)
{
Import-Module dbatools | Out-Null
}
if ((Get-Module -Name Invoke-SqlCmd2) -eq $null)
{
Import-Module Invoke-SqlCmd2 | Out-Null
}
Write-Output "Prerequisites loaded."
}
catch
{
Write-Error $_.Exception.Message -EA Continue
Write-Error "One or more of the prerequisites did not load. Review previous errors for more details." -EA Continue
return
}
# Detect Availability Group replicas
Write-Output "Begin query for Availability Group replicas"
$ConfigurationMode = "CONFIGURATION_ONLY"
$Hostname = hostname
$Credentials = Import-CliXml -Path /var/opt/mssql/secrets/creds.xml
$ReplicasQuery = #"
SELECT replica_server_name,
availability_mode_desc,
primary_replica
FROM sys.availability_replicas AR
INNER JOIN sys.dm_hadr_availability_group_states HAGS
INNER JOIN sys.availability_groups AG ON AG.group_id = HAGS.group_id
ON HAGS.group_id = AR.group_id;
"#
$Replicas = Invoke-Sqlcmd2 -ServerInstance $Hostname -Query $ReplicasQuery -ConnectionTimeout 30 -Credential $Credentials
if(([DBNull]::Value).Equals($Replicas[0].primary_replica))
{
Write-Error "Availability Group query returned no results. Confirm that you connected to a SQL Server instance running an Availability Group. No work was accomplished."
return
}
Write-Output "Completed query of Availability Group replicas"
foreach($replica in $Replicas)
{
# Skip if destination replica is primary replica itself
if($replica.primary_replica.CompareTo($replica.replica_server_name) -eq 0)
{
continue
}
# Skip configuration replicas
if($replica.availability_mode_desc.CompareTo($ConfigurationMode) -eq 0)
{
continue
}
#Connect
$PrimaryReplica = Connect-DbaInstance $replica.primary_replica -ClientName 'ConfigurationReplica' -SqlCredential $Credentials
$SecondaryReplica = Connect-DbaInstance $replica.replica_server_name -ClientName 'ConfigurationReplica' -SqlCredential $Credentials
Write-Output "Copying instance objects from $sourceReplica to $replica"
# Copy objects
# Write-Output "Copying Logins."
# Copy-DbaLogin -Source $PrimaryReplica -Destination $SecondaryReplica
# Write-Output "Copying DBMail."
# Copy-DbaDbMail -Source $PrimaryReplica -Destination $SecondaryReplica -Force
# Write-Output "Copying Agent Categories."
# Copy-DbaAgentJobCategory -Source $PrimaryReplica -Destination $SecondaryReplica -Force
# Write-Output "Copying Agent Schedules."
# Copy-DbaAgentSchedule -Source $PrimaryReplica -Destination $SecondaryReplica -Force
# Write-Output "Copying Operators."
# Copy-DbaAgentOperator -Source $PrimaryReplica -Destination $SecondaryReplica -Force
Write-Output "Copying Jobs."
Copy-DbaAgentJob -Source $PrimaryReplica -Destination $SecondaryReplica -Force
Write-Output "Copy complete from $PrimaryReplica to $SecondaryReplica"
}
Write-Output "SQL Instance object sync complete."
Enjoy!
dbatools can sync them, but I haven't tried it on an AG running on Linux. Let me know if it works or not!
The first parameter is the name of your AG, the second is the virtual network name of your cluster.
param($AvailabilityGroup, $SqlInstance)
try {
$replicas = Get-DbaAgReplica -AvailabilityGroup $AvailabilityGroup -SqlInstance $SqlInstance
$primary = $replicas | Where-Object Role -EQ Primary | Select-Object -ExpandProperty Name
$secondaries = $replicas | Where-Object Role -EQ Secondary | Select-Object -ExpandProperty Name
$primaryInstanceConnection = Connect-DbaInstance $primary -ClientName 'ScriptBorrowedFromStackOverFlow'
$secondaries | ForEach-Object {
$secondaryInstanceConnection = Connect-DbaInstance $_ -ClientName 'ScriptBorrowedFromStackOverFlow'
Copy-DbaAgentJob -Source $primaryInstanceConnection -Destination $secondaryInstanceConnection -Force
}
}
catch {
$msg = $_.Exception.Message
Write-Error "Error while syncing jobs for Availability Group '$($AvailabilityGroup): $msg'"
}
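Saved as a script file (the name below is just an example), it would be invoked like this:
# Pass your AG name and the cluster's virtual network name.
./Sync-AgJobs.ps1 -AvailabilityGroup 'MyAg' -SqlInstance 'MyClusterVNN'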
What I'd like to do is execute two stored procedures from a PowerShell script and save the results into a log file.
Since I have two procedures, I thought it would be easier to create one "master" procedure that executes both.
So far I was able to execute the master procedure successfully (both child procedures ran), but only the output from the second is stored in the log file.
param([string]$datein="string")
$OutputFile = "C:\...\ScriptOutput.txt"
$handler = [System.Data.SqlClient.SqlInfoMessageEventHandler] {param($sender, $event) Out-File -filepath $OutputFile -inputobject $event.Message };
$SqlConnection = new-Object System.Data.SqlClient.SqlConnection("server=ServerName;database=DB;integrated security=true")
$SqlConnection.add_InfoMessage($handler);
$SqlConnection.FireInfoMessageEventOnUserErrors = $true;
$SqlConnection.Open() | Out-Null
# Execute procedures
$cmd = new-Object System.Data.SqlClient.SqlCommand("[dbo].[tst_master]", $SqlConnection)
$cmd.CommandType = [System.Data.CommandType]'StoredProcedure'
$cmd.Parameters.Add("#date_in",$datein) | Out-Null
$cmd.ExecuteNonQuery() | Out-Null
$SqlConnection.Close()
EDIT:
It's on a virtual machine running Windows Server 2016. PowerShell version: 5.1. I'm working with SQL Server 2014.
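One detail that may explain the symptom: Out-File without -Append overwrites the log file every time the handler fires, so each InfoMessage replaces the previous one and only the last message survives. A sketch of an appending handler:
# Append each message instead of overwriting the log file.
$handler = [System.Data.SqlClient.SqlInfoMessageEventHandler] {
    param($sender, $event)
    Out-File -FilePath $OutputFile -InputObject $event.Message -Append
}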
I'm using the PowerShell script below to read the version build from a dtsx package (read as XML). I'm able to read the file as long as it's in a local or shared path. However, one of the packages is on an Integration Services server path, and I'm not able to simply use Get-Content on that path. Below is the code I'm using for local/shared-path files.
$xml = [xml](get-content *filepath*)
$value = $xml.Executable.Property | Where-Object {$_.Name -like 'VersionBuild'}
$ver_dev = $value.'#text'
I read on the internet about net use, but I guess that's just for mapping shared paths; the Integration Services server path is not really a shared path, since (as per my understanding) we can only access it using SQL Server. The path as displayed in SQL Server is as follows: Server_path
I also came across some code for invoking packages in SQL Server using PowerShell, but I am a novice in PowerShell (I only started for this version-compare purpose) and could hardly understand any of it.
The code itself is simple, and frankly that's what motivated me to automate this version compare, but I'm stuck on reading from this server path. Any help would be greatly appreciated. Thanks
If your packages are stored in an Integration Services Catalog, the code in this link below will save an .ispac file (which is a zip file) to local storage and extract the contents. The code below, based on that, goes as far as downloading a specific .ispac project file. From there, you'll need to extract the dtsx package (as in the linked example).
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
$SsisServer = 'SERVERNAME'
$Folder = 'FOLDER'
$ProjectName = 'PROJECT'
# Local folder where the .ispac file will be saved.
$DownloadFolder = 'C:\temp'
$SsisNamespace = "Microsoft.SqlServer.Management.IntegrationServices"
[System.Reflection.Assembly]::LoadWithPartialName($SsisNamespace) | Out-Null
$SqlConnectionstring = "Data Source=$SsisServer;Initial Catalog=master;Integrated Security=SSPI;"
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection $SqlConnectionstring
# Create the Integration Services object
$IntegrationServices = New-Object "$SsisNamespace`.IntegrationServices" $SqlConnection
$Catalog = $IntegrationServices.Catalogs["SSISDB"]
$oFolder = $Catalog.Folders[$Folder]
$oProject = $oFolder.Projects[$ProjectName]
$ISPAC = $oProject.GetProjectBytes()
[System.IO.File]::WriteAllBytes(($DownloadFolder + "\" + $oProject.Name + ".ispac"),$ISPAC)
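From there, a possible extraction step (a sketch, not taken verbatim from the linked example): an .ispac is a zip archive, so copying it to a .zip name and expanding it exposes the .dtsx files, which can then be read with the XML snippet above.
# Unpack the downloaded .ispac (a zip archive) to get at the .dtsx files.
$ispacPath = Join-Path $DownloadFolder ($oProject.Name + '.ispac')
$zipPath = [System.IO.Path]::ChangeExtension($ispacPath, 'zip')
Copy-Item $ispacPath $zipPath -Force
Expand-Archive -Path $zipPath -DestinationPath (Join-Path $DownloadFolder $oProject.Name) -Force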
If your packages are stored in msdb, the code below, from Jamie Thomson, will download all packages to local storage.
Param($SQLInstance = "localhost")
#####Add all the SQL goodies (including Invoke-Sqlcmd)#####
add-pssnapin sqlserverprovidersnapin100 -ErrorAction SilentlyContinue
add-pssnapin sqlservercmdletsnapin100 -ErrorAction SilentlyContinue
cls
$ssisSQL = "WITH cte AS (
SELECT cast(foldername as varchar(max)) as folderpath, folderid
FROM msdb..sysssispackagefolders
WHERE parentfolderid = '00000000-0000-0000-0000-000000000000'
UNION ALL
SELECT cast(c.folderpath + '\' + f.foldername as varchar(max)), f.folderid
FROM msdb..sysssispackagefolders f
INNER JOIN cte c ON c.folderid = f.parentfolderid
)
SELECT c.folderpath,p.name,CAST(CAST(packagedata AS VARBINARY(MAX)) AS VARCHAR(MAX)) as pkg
FROM cte c
INNER JOIN msdb..sysssispackages p ON c.folderid = p.folderid
WHERE c.folderpath NOT LIKE 'Data Collector%'"
$Packages = Invoke-Sqlcmd -MaxCharLength 10000000 -ServerInstance $SQLInstance -Query $ssisSQL
Foreach ($pkg in $Packages)
{
$pkgName = $Pkg.name
$folderPath = $Pkg.folderpath
$fullfolderPath = "c:\temp\$folderPath\"
if(!(test-path -path $fullfolderPath))
{
mkdir $fullfolderPath | Out-Null
}
$pkg.pkg | Out-File -Force -encoding ascii -FilePath "$fullfolderPath\$pkgName.dtsx"
}
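Saved as a script file (the name below is hypothetical), it can be pointed at a specific instance; packages land under c:\temp\<folder path>:
.\Export-MsdbPackages.ps1 -SQLInstance 'SERVER\INSTANCE'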
Strictly speaking the question is: I use this solution and it works, but is there a better way?
With the following caveats:
1) I don't want to do a network-wide search for SQL instances; I am interrogating known SQL Servers, but I want to grab the instance names on each.
2) The code assumes Microsoft will never change the display name for the SQL Server Service.
function getSQLInstance ([string]$SERVER) {
$services = Get-Service -Computer $SERVER
# Filter for SQL services
$services = $services | ? DisplayName -like "SQL Server (*)"
# Remove MSSQL$ qualifier to get instance name
try {
$instances = $services.Name | ForEach-Object {($_).Replace("MSSQL`$","")}
}catch{
# Error if none found
return -1
}
return $instances
}
getSQLInstance "YOUR_SERVER"
Rather than re-invent the wheel, take a look at how SQL Power Doc discovers instances on a server, which, judging from NetworkScan.psm1, is very similar to your approach:
$ManagedComputer.Services | ForEach-Object {
    if (($_.Name).IndexOf('$') -gt 0) {
        $InstanceName = ($_.Name).Substring(($_.Name).IndexOf('$') + 1)
        $IsNamedInstance = $true
        $ManagedComputerServerInstanceName = $InstanceName
    } else {
        $InstanceName = $null
        $IsNamedInstance = $false
        $ManagedComputerServerInstanceName = $_.Name
    }
}
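($ManagedComputer there is an SMO WMI object; a sketch of how it is obtained, assuming the SMO WMI management assembly is installed:)
# Requires the SQL Server WMI management assembly that ships with SMO.
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SqlWmiManagement') | Out-Null
$ManagedComputer = New-Object Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer 'YOUR_SERVER'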
Or, just use SQL Power Doc and point it at specific server names to collect this and more data about the instances.
I have a PowerShell script that I am writing to extract all the jobs of a specific server, like:
$sqlserver = "Servername"
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo') | Out-Null
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $sqlserver
$jobs = $srv.JobServer.Jobs
ForEach ($job in $jobs)
{
$jobname =$Servernaam.replace("\","_") + '_'+ $job.Name.replace(" ","_").replace("\","_").replace("[","_").replace("]","_").replace(".","_").replace(":","_") + ".sql"
$job.Script() | Out-File C:\Users\Desktop\Jobs_from_Server\Orgineel\$jobname
}
The code I have now gets all the jobs from the server and stores them in separate files.
The problem is that I only want the jobs on the Microsoft SQL Server whose @command string contains .dtsx.
I tried, for example:
ForEach ($job in $jobs)
{
$jobname =$Servernaam.replace("\","_") + '_'+ $job.Name.replace(" ","_").replace("\","_").replace("[","_").replace("]","_").replace(".","_").replace(":","_") + ".sql"
$job.Script() | Out-File C:\Users\Desktop\Jobs_from_Server\Orgineel\$jobname
if ($jobs -like '*.dtsx*')
I have also tried -Contains, and putting the condition inside the foreach loop, like
ForEach ($job in $jobs |if ($jobs -like '*.dtsx*'))
Your code has a number of typos and other errors around which objects you're using where. Try this as a starter; it assumes that the first step of a job that uses SSIS is an SSIS step. Modify as needed.
The key here is checking the subsystem of the job step(s) to detect if the step is run with the SSIS subsystem.
$sqlserver = "servername"
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo') | Out-Null
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $sqlserver
$jobs = $srv.JobServer.Jobs
ForEach ($job in $jobs)
{
if ($Job.JobSteps[0].Subsystem -eq "ssis") {
# Do SSIS stuff
write-output "Found an SSIS job $($job.name)"
}
}
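If a job might run SSIS in a later step, a small variant (sketch) checks every step instead of only the first:
ForEach ($job in $jobs)
{
    # Flag the job if any of its steps uses the SSIS subsystem.
    if ($job.JobSteps | Where-Object { $_.SubSystem -eq 'Ssis' }) {
        write-output "Found an SSIS job $($job.name)"
    }
}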