PowerShell - Implement logging for failed connection attempts to servers? - sql-server

Requirement: The code below returns the instance name, database names, and database states for each of the instances or servers listed in SQL_Servers.txt under the path C:\PowerSQL\.
For a few servers it is not able to connect and throws an error. I would like to write the names of these failed servers to a separate text file, Log.txt.
ForEach ($instance in Get-Content "C:\PowerSQL\SQL_Servers.txt")
{
    Import-Module SQLPS -DisableNameChecking
    Invoke-Sqlcmd -ServerInstance $instance -Database master -Query 'select @@servername as InstanceName, name as DatabaseName, state_desc as DBStatus from sys.databases' | Format-Table
}
Help on this is highly appreciated.

The code below fulfilled my requirement.
Import-Module SQLPS -DisableNameChecking

# collect successful query results here; failed servers are appended to the error log
$results = @()
ForEach ($instance in Get-Content "C:\PowerSQL\SQL_Servers.txt")
{
    try {
        $results += Invoke-Sqlcmd -ServerInstance $instance -Database master -Query 'select @@servername as InstanceName, name as DatabaseName, state_desc as DBStatus from sys.databases'
    } catch {
        # the connection or query failed: record the server name
        Add-Content "C:\PowerSQL\ErrorLog.txt" $instance
    }
}
$results | Export-Csv -Path C:\PowerSQL\SQLServerInstance.csv -NoTypeInformation
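Depending on the module version, Invoke-Sqlcmd can report a failed connection as a non-terminating error that slips past the catch block. A variation of the same loop (a sketch, using the same file paths) adds -ErrorAction Stop so every failure lands in catch, and also logs the reason for the failure:
Import-Module SQLPS -DisableNameChecking

$results = @()
ForEach ($instance in Get-Content "C:\PowerSQL\SQL_Servers.txt")
{
    try {
        # -ErrorAction Stop turns a failed connection into a terminating error, so catch always fires
        $results += Invoke-Sqlcmd -ServerInstance $instance -Database master -ErrorAction Stop `
            -Query 'select @@servername as InstanceName, name as DatabaseName, state_desc as DBStatus from sys.databases'
    } catch {
        # record the server name together with the reason the attempt failed
        Add-Content "C:\PowerSQL\ErrorLog.txt" "$instance : $($_.Exception.Message)"
    }
}
$results | Export-Csv -Path C:\PowerSQL\SQLServerInstance.csv -NoTypeInformation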

Related

Script runs in PowerShell ISE but not in PowerShell or via CMD prompt

I have the following script that grabs an Excel file and loads it into a SQL Server database. It works perfectly fine when executed in PowerShell ISE:
Import-Module DBATools
Import-Module ImportExcel
$File = "S:\Shares\Reporting\File.xlsx"
$Instance = "10.24.10.100"
$Password = ConvertTo-SecureString "Password" -AsPlainText -Force
$Credential = New-Object System.Management.Automation.PSCredential ("User", $Password)
$Database = "ESS_TAB"
$Table = "DB_DATA_AUTO"
$Data = Import-Excel -Path $File | ConvertTo-DbaDataTable
Write-DbaDataTable -SqlInstance $Instance -SqlCredential $Credential -Database $Database -InputObject $Data -Table $Table
If I try to run it from anywhere else I just get a cursor. There is never an error; it just runs endlessly.

How to read SQL rows as PowerShell parameters

On Azure, I am running multiple .sql files from a container against hundreds of Azure SQL Databases via a PowerShell runbook.
I want PowerShell to read the server name and database name for running the scripts from a SQL Server table that looks like this:
Servername   Databasename   Status
Server-01    DB-01          Process
Server-01    DB-02          Skip
Server-02    DB-03          Process
My current version of the PowerShell script can read the files in the container and run them against a given server and database:
# Get the blob container
$blobs = Get-AzStorageContainer -Name $containerName -Context $ctx | Get-AzStorageBlob
# Download the blob content to localhost and execute each one
foreach ($blob in $blobs)
{
    $file = Get-AzStorageBlobContent -Container $containerName -Blob $blob.Name -Destination "." -Context $ctx
    Write-Output ("Processing file :" + $file.Name)
    $query = Get-Content -Path $file.Name
    Invoke-Sqlcmd -ServerInstance "Server-01.database.windows.net" -Database "DB-01" -Query $query -AccessToken $access_token
    Write-Output ("This file is executed :" + $file.Name)
}
I am looking for a method that will read the rows from the table and feed them into the -ServerInstance and -Database parameters of Invoke-Sqlcmd. Ideally it should also filter out the Skip rows.
One method is to load the database list into a DataTable and iterate over the list for each query. Change the $databaseListConnectionString in the example code below per your authentication method and set the connection AccessToken if/as needed.
# get database list
$databaseListConnectionString = "Data Source=YourServer;Initial Catalog=YourDatabase"
$databaseListQuery = "SELECT ServerName, DatabaseName FROM dbo.DatabaseList WHERE Status = 'Process';"
$dataAdapter = New-Object System.Data.SqlClient.SqlDataAdapter($databaseListQuery, $databaseListConnectionString)
$dataAdapter.SelectCommand.Connection.AccessToken = $access_token
$databaseList = New-Object System.Data.DataTable
[void]$dataAdapter.Fill($databaseList)
# Get the blob container
$blobs = Get-AzStorageContainer -Name $containerName -Context $ctx | Get-AzStorageBlob
# Download the blob content to localhost and execute each one
foreach ($blob in $blobs)
{
    $file = Get-AzStorageBlobContent -Container $containerName -Blob $blob.Name -Destination "." -Context $ctx
    Write-Output ("Processing file :" + $file.Name)
    $query = Get-Content -Path $file.Name
    foreach ($database in $databaseList.Rows) {
        Invoke-Sqlcmd -ServerInstance "$($database.ServerName)" -Database "$($database.DatabaseName)" -Query $query -AccessToken $access_token
        Write-Output ("This file is executed :" + $file.Name)
    }
}
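Alternatively (a sketch, reusing the $containerName, $ctx, and $access_token variables assumed above and placeholder server/database names), Invoke-Sqlcmd itself can fetch the filtered list as DataRow objects, which avoids the SqlDataAdapter plumbing:
# fetch only the rows flagged for processing; Invoke-Sqlcmd returns DataRow objects by default
$databaseList = Invoke-Sqlcmd -ServerInstance "YourServer.database.windows.net" -Database "YourDatabase" `
    -Query "SELECT ServerName, DatabaseName FROM dbo.DatabaseList WHERE Status = 'Process';" `
    -AccessToken $access_token

foreach ($blob in Get-AzStorageContainer -Name $containerName -Context $ctx | Get-AzStorageBlob)
{
    $file = Get-AzStorageBlobContent -Container $containerName -Blob $blob.Name -Destination "." -Context $ctx
    $query = Get-Content -Path $file.Name
    foreach ($database in $databaseList) {
        Invoke-Sqlcmd -ServerInstance $database.ServerName -Database $database.DatabaseName -Query $query -AccessToken $access_token
    }
}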

How can I export the output of a SQL query fetched through PowerShell to Excel?

I am using this PowerShell script to fetch the versions of all SQL Servers in a list.
How can I export the result columns (only the query output, not error messages) to Excel and email them after the script is run?
Can someone help me add the required script, please?
Import-Module SQLPS -DisableNameChecking
$ServerInstances = Get-Content "D:\DBA\All_Server_monitoring\ServerList.txt"
$SQLQuery = @"
Select @@Servername 'Server Name', @@version 'Version'
"@
$DBName = "master"
$ServerInstances |
ForEach-Object {
    $ServerObject = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList $_
    Invoke-Sqlcmd -ServerInstance $_ -Database $DBName -Query $SQLQuery
}
The easiest way to export data to a CSV file is with Export-Csv, which takes an input object (or array of objects) and a path and writes the CSV file from that object/array. For you, it would look like this:
$results = Invoke-Sqlcmd -ServerInstance $_ -Database $DBName -Query $SQLQuery
New-Item -Path "MYPATH.csv"
Export-CSV -Path "MYPATH.csv" -InputObject $results -Append
CSV files are versatile and can be opened with most lightweight text editors. They can also be easily emailed and opened with Excel.
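To send the file once it has been written, a minimal sketch using Send-MailMessage (the SMTP server and addresses below are placeholders for your environment):
# attach the exported CSV and mail it; adjust server, sender, and recipient as needed
Send-MailMessage -From "dba@yourcompany.com" -To "team@yourcompany.com" `
    -Subject "SQL Server version report" -Body "The version report is attached." `
    -Attachments "MYPATH.csv" -SmtpServer "smtp.yourcompany.com"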

How to load specific packages from a server with PowerShell

I have a PowerShell script that I am writing to extract all the jobs from a specific server, like:
$sqlserver = "Servername"
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo') | Out-Null
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $sqlserver
$jobs = $srv.JobServer.Jobs
ForEach ($job in $jobs)
{
$jobname =$Servernaam.replace("\","_") + '_'+ $job.Name.replace(" ","_").replace("\","_").replace("[","_").replace("]","_").replace(".","_").replace(":","_") + ".sql"
$job.Script() | Out-File C:\Users\Desktop\Jobs_from_Server\Orgineel\$jobname
if ($jobs -like '*.dtsx*')
}
The code I have now gets all the jobs from the server and stores them in separate files.
The problem is that I only want the jobs on the Microsoft SQL Server that contain .dtsx in the command string.
I tried, for example:
ForEach ($job in $jobs)
{
$jobname =$Servernaam.replace("\","_") + '_'+ $job.Name.replace(" ","_").replace("\","_").replace("[","_").replace("]","_").replace(".","_").replace(":","_") + ".sql"
$job.Script() | Out-File C:\Users\Desktop\Jobs_from_Server\Orgineel\$jobname
if ($jobs -like '*.dtsx*')
I have also tried -contains, and putting the condition in the foreach loop, like:
ForEach ($job in $jobs |if ($jobs -like '*.dtsx*'))
Your code has a number of typos and other errors around which objects are used where. Try this as a starter. It assumes that a job which uses SSIS has an SSIS step as its first step. Modify as needed.
The key here is checking the subsystem of the job step(s) to detect if the step is run with the SSIS subsystem.
$sqlserver = "servername"
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo') | Out-Null
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $sqlserver
$jobs = $srv.JobServer.Jobs
ForEach ($job in $jobs)
{
    if ($job.JobSteps[0].Subsystem -eq "ssis") {
        # Do SSIS stuff
        write-output "Found an SSIS job $($job.name)"
    }
}
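To tie this back to scripting only those jobs to files, the placeholder comment could be replaced along these lines (a sketch; the output folder is taken from the question and the file-name sanitising is simplified to a single regex):
$sqlserver = "servername"
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo') | Out-Null
$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $sqlserver
$outputFolder = "C:\Users\Desktop\Jobs_from_Server\Orgineel"

ForEach ($job in $srv.JobServer.Jobs)
{
    # only script jobs whose first step runs under the SSIS subsystem
    if ($job.JobSteps[0].Subsystem -eq "ssis") {
        # build a file-system-safe name from the server and job name
        $jobname = ($sqlserver + '_' + $job.Name) -replace '[\\\[\]\.: ]', '_'
        $job.Script() | Out-File (Join-Path $outputFolder ($jobname + ".sql"))
    }
}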

Join SQL query Results and Get-ChildItem Results

Background: I have a directory with a number of files that are imported to SQL Server.
Task: Create a PowerShell script that picks up the files in this directory and uses the filenames in a SQL query.
Ultimate objective: Display the SQL results beside the filenames, where the displayed result set also includes files that have no entries in SQL Server. Something like a RIGHT JOIN in SQL Server queries.
PowerShell code:
$files = Get-ChildItem -Recurse -Force $filePath -ErrorAction SilentlyContinue | Where-Object { ($_.PSIsContainer -eq $false) } | Select-Object Name
$Server = "Loadv1"
$DB = "LoadDB"
$dbResults = @()
ForEach ($file in $files)
{
$fileName = $file.name
write-host $fileName
if($fileName.Length -gt 1)
{
$Query = "
SELECT FileName,CurrentStatus
FROM LogStatus
WHERE FileName LIKE '$fileName%'
"
# Write-host $Query
}
$dbResults += Invoke-Sqlcmd -ServerInstance $Server -Database $DB -Query $Query
}
$dispResults = $dbResults,$file
$dispResults | Format-Table -autosize
Work done so far: I have been able to fetch the file names using Get-ChildItem and loop over them to get the query results. However, the result I am currently getting does not show the files that have no corresponding entry in the SQL Server table.
Current Result
OperationalLanding20150622061502.dat
OperationalLandingAudit20150622061502.dat
OperativeThird_Party_System20150616090701.dat
FileName CurrentStatus
OperationalLandingAudit20150622061502.dat SSIS Package Complete
OperativeThird_Party_System20150616090701.dat SSIS Package Complete
Expected Result
OperationalLanding20150622061502.dat
OperationalLandingAudit20150622061502.dat
OperativeThird_Party_System20150616090701.dat
FileName CurrentStatus
OperationalLanding20150622061502.dat NULL
OperationalLandingAudit20150622061502.dat SSIS Package Complete
OperativeThird_Party_System20150616090701.dat SSIS Package Complete
Hoping I was able to explain my requirement above.
OK, so if the SQL query has no results then NULL is returned and, in essence, nothing is added to the $dbResults array. Instead, let's append the results to a custom object. I don't know what PowerShell version you have, so I needed to do something that I know should work. I also don't use the SQL cmdlets much, so I had to guess for some of this.
$files = Get-ChildItem -Recurse -Force $filePath -ErrorAction SilentlyContinue |
Where-Object {$_.PSIsContainer -eq $false -and $_.Length -gt 1} |
Select-Object -ExpandProperty Name
$Server = "Loadv1"
$DB = "LoadDB"
$files | ForEach-Object{
write-host $_
$Query = "
SELECT FileName,CurrentStatus
FROM LogStatus
WHERE FileName LIKE '$_%'
"
$Results = Invoke-Sqlcmd -ServerInstance $Server -Database $DB -Query $Query
$props = @{Name = $_}
If($Results){
$props.CurrentStatus = $Results.CurrentStatus
} Else {
$props.CurrentStatus = "Null"
}
New-Object -TypeName PSCustomObject -Property $props
} | Format-Table -autosize
What this does is create a custom object that contains the results of the SQL query (which I did not change, for the reasons stated above). If there are no results returned, we use the string "Null" as a filler.
I cleaned up how you generated the $files variable by making it a simple string array with -ExpandProperty and moved the length condition there as well.
You should now have all the expected results. I say should since I am assuming what the return object looks like.
$Query = "
SELECT ISNULL(A.FileName, B.FileName) FileName, ISNULL(A.CurrentStatus, B.CurrentStatus) CurrentStatus
FROM LogStatus A
RIGHT JOIN (SELECT '$fileName' FileName, NULL CurrentStatus) B
    ON A.FileName LIKE '$fileName%'
"
This should pad out the filenames for you. It is a little tough to prototype since it's in PowerShell, but I might be able to come up with a SQL Fiddle to prove it.
EDIT
Answer edited, with a SQL Fiddle:
http://sqlfiddle.com/#!3/12b43/9
Obviously, since you're effectively in a cursor (one query per file), we can only prove one query at a time.
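Plugged back into the loop from the question (a sketch; $filePath is assumed to be set as in the original script), the padded query would look something like this:
$files = Get-ChildItem -Recurse -Force $filePath -ErrorAction SilentlyContinue |
    Where-Object { $_.PSIsContainer -eq $false } |
    Select-Object -ExpandProperty Name

$Server = "Loadv1"
$DB = "LoadDB"

$dbResults = foreach ($fileName in $files) {
    # the RIGHT JOIN guarantees one row per file, with a NULL status when no log entry exists
    $Query = "
    SELECT ISNULL(A.FileName, B.FileName) FileName, ISNULL(A.CurrentStatus, B.CurrentStatus) CurrentStatus
    FROM LogStatus A
    RIGHT JOIN (SELECT '$fileName' FileName, NULL CurrentStatus) B
        ON A.FileName LIKE '$fileName%'
    "
    Invoke-Sqlcmd -ServerInstance $Server -Database $DB -Query $Query
}
$dbResults | Format-Table -AutoSize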
